
Category Archives: VFX

Framestore creates variety of animation styles for Libresse/Bodyform spots

Partnering with creatives Nick & Nadja at agency AMV BBDO, Framestore provided animation and VFX for the latest campaign for Libresse and Bodyform. The campaign was directed by Golden Globe winner and Emmy-nominated Nisha Ganatra (Late Night). Framestore created six animated sequences, each featuring a different style of animation to show the inner worlds that act as reflections of the realities of the uterus. The film was created to dispel myths, encourage a positive conversation and address the life-changing moments in a woman’s life, from miscarriages and menopause to endometriosis.

Framestore creative director Sharon Lock worked with Nick & Nadja to select the styles of animation that would bring to life the emotions and unique perspectives of each story. Styles included 2D cel techniques and stop-frame animation, as well as hand-painted images created with oil paint on glass.

Lock worked with the team of artists to direct the animated sequences, acting as the central creative point of contact between the artists, the client and the agency. Talking about bringing those visually different elements together into a single cohesive film, she says, “It was important that the animations produced for this film not only looked as good as possible but also made an emotional impact on audiences because of the nature of the film.

“We worked with animators who had wonderful storytelling abilities and whose work was unique and handmade and could communicate a range of tone and emotion to audiences in a short amount of time on screen.”

The team at Framestore, which included producers Niamh O’Donohoe and Emma Cook, was a part of the film’s predominantly female cast and crew, which they felt made a big difference in creating something that was honest and powerful. “We were telling real stories about the experiences of being a woman, so having the team we did meant we had something of a shorthand,” explains O’Donohoe. “We could easily communicate what we needed because there was a mutual understanding of how these stories had to be presented, something that I feel beautifully reflects the messages that Libresse/Bodyform is always communicating.”

Framestore also delivered invisible VFX work for the film’s live-action portions and created a world of uteri, which represents the billions of women who are a part of the Libresse/Bodyform story. These visuals are featured in the opening and closing sequences that will become the brand’s main visual for this campaign. Framestore also provided the color grade.

“It was important that everyone worked really closely together to make sure every frame did its part in telling the stories and I think the final piece speaks for itself. It was amazing to be part of such an inspiring and creative campaign,” concludes Lock.

VFX supervisor Jay Worth talks Season 2 of Netflix’s Altered Carbon

By Barry Goch

Netflix’s Altered Carbon is now streaming Season 2, with new lead Anthony Mackie playing Takeshi Kovacs in a new sleeve. He’s the only surviving soldier of a group of elite interstellar warriors, continuing his centuries-old quest to find his lost love, Quellcrist Falconer (Renée Elise Goldsberry). After decades of planet-hopping and searching the galaxy, Kovacs finds himself recruited back to his home planet of Harlan’s World with the promise of finding Quell. In the world of Altered Carbon, lives can continue after death by taking on a new sleeve — a new body — with the person’s consciousness preserved in a device called a stack.

Jay Worth — Credit: Rob Flate

As you can imagine, there are a ton of visual effects used to tell Takeshi’s story. To find out more, we reached out to Jay Worth, an Emmy Award-winning VFX supervisor with 15 years of experience working in visual effects. His credits include Fringe, Person of Interest and Westworld, for which he won the Emmy for Outstanding Special Visual Effects in 2017.

How did you get involved in Altered Carbon?
I have a relationship with showrunner Alison Schapker. We go way back to the good old days of Fringe and a few other things. I had worked with the head of visual effects and post for Skydance, Dieter Ismagil, and then I had just come off of working on a Netflix show. It worked out for all three of those parties to come together and have me join the team. It was a fun bit of a reunion for us to get back together.

At what point did you come on board for Season 2?
I came in after it was shot in order to usher things through post and the final creative push through the final delivery. VFX producer Tony Meagher and I were able to keep the ball rolling and push it through to the final. The VFX team at Double Negative and the other vendors that we had were really able to carry it through from the beginning to the end as well.

Tell us about your review process. Where were you based?
We were in Los Angeles — the showrunners, Tony Meagher and I — but the rest of the team was in Toronto: our VFX coordinator, VFX editor, post team and DI facility (Deluxe Toronto). The VFX vendors were spread across Canada. The interesting thing for us was how to set up the review process while being in Los Angeles. We relied completely on ClearView and that amazing technology. We were able to do editorial reviews and full-range color UHD review sessions for final VFX shots. It was a beautiful process. We could review many things in the edit and make a checklist; then, when we needed to look at one in color, we could go downstairs, just flip a switch in our bay and have our beautifully calibrated setup, which was amazing. That afforded us the ability to work seamlessly even though we weren’t all in the same place.

This was the first time I had done a show that was so remote. I’ve done many shows where editorial is in one place and the VFX team is in another, but this was the first time I’d done something this ambitious. We did everything remotely, from editorial reviews to effects reviews to color and even the sound, and it was really an amazing, far more seamless process than I thought it would be when we started. The team at Skydance, the production team and the post team really had all the variables dialed in, and it was really painless considering we were spread out. The editorial team and the VFX team on the show side were just phenomenal in terms of how they were able to coordinate with everybody.

Before and After

This production predates the COVID-19 restrictions. Do you think that would have impacted your production?
It would have been a challenge, but not impossible. We would have probably ended up having more ClearView boxes for the team in order to work remotely. I’ve worked recently on other shows that have the colorists working from home, and they’re all tapping into the same box; it just happens to be a pipeline issue. It was doable before, but now there’s just a little bit more back and forth to set up the pipeline.

What was the hardest sequence on “Broken Angels,” the last episode of the season, and why?
One of the larger challenges in visual effects is how to convey something visually from a story perspective and still have it feel real and organic. A lot of times, it ends up being a more challenging hurdle to get over from a visual standpoint when the storytellers are trusting you to help convey these different story points. That’s really where visual effects shine: When you are willing to take on that risk and that narrative responsibility, that’s really where the fun lies.

For the finale, it was telling the story of Angelfire. People kind of understand the overarching idea of satellites and weapons from space, but we had to help people understand the communication between them. We also needed them to understand how it connects to the older technology and what that’s going to mean for our characters. That was by far the biggest challenge for that episode and for the season.

Tell us about the look development of the Angelfire.
It was definitely a journey, but it started with the page and trying to visualize it. Alison Schapker and EP James Middleton had written up what these moments were going to be: a communication tower and a force field around a planet they didn’t quite understand. That was part of the mystery for the viewers and the characters as they were going through the season.

Our goal, from a visual effects standpoint, was to show this ancient-yet-modern communication and to figure out how to visually tell the story of how these things are communicating … that they’re all kind of like-minded and they’re protective. We key that up when Danica fires off the rocket with the rebels attached to it so we can see firsthand what these orbitals can do. Then we see Angelfire come down on the soldiers in the forest.

We’re starting to understand more and more what this thing does so that we can understand what the sacrifice really means … to figure out what the orbitals are and how they could look and feel organic and threatening as well as benign and ultimately destructive. I feel like we ended at a point where it makes sense and it all works together, but at the beginning, when you have a blank canvas, it’s a rather daunting task to figure out what it all should look like.

We had so many conversations about how to depict Angelfire. Should it be more like glass breaking? Should it be like lightning? Should it be like a wave? Should it just crackle? Should it splash in? We had so many iterations of things that just didn’t feel or look quite right. It didn’t convey what we wanted it to convey. “It looks too digital; it looks fake.” To end up with something that felt integrated into the environment and the sky was a testament not only to the team’s perseverance but to Alison’s and James’ patience, leadership and ability to explain creatively what they were going for. I’m really happy with where we finally landed.

How did you lock in the final look?
We wanted it to feel organic and real for the audience. We had a lot of different meetings to talk about what perspective we were going to take — how high up we need to be, how close we need to be to understand that they were communicating with each other and still firing — and whether those different perspectives should be down on the ground or up in the sky. We figured it out with editorial while we were locking episodes, which is a fairly normal process when you’re dealing with full CG shots mixed with pieces that we shot on the day.

We obviously had numerous versions of animatics, and we had to figure out how it was going to work in the edit before we could lock down animation and timing. Honestly, for the final moments when Kovacs sacrificed himself and Angelfire was going off, we were tweaking those with editorial, and our editorial team did a phenomenal job of helping us realize the moment.

Any people or companies that you want to give a shout-out to?
Bob Munroe (a production-side VFX supervisor) and Tony Meagher. All the work they did was groundwork for everything that ended up on the screen. And all the vendors, like Double Negative, Mavericks, Spin, Switch and Krow. Also our VFX coordinating team and everybody up in Toronto. They were the backbone of everything we did this season. And it was just so much fun to work with Alison and James and the team.

Any advice for people wanting to work in visual effects?
From my standpoint, there are not enough people on the show side of things, and if you have a passion for it, there’s a lot of opportunity to get into that.

I would say try to find your lane. Is it on the artist side? Is it on the coordinating and producing side? There are so many resources out there now. And now that the technology is available to everybody, it’s an amazing opportunity for creatives to get together and collaborate and to make things that are compelling.

When I’m on a show or in the office, I can tell which PA or assistant has a fascination with VFX, and I always encourage them to come along. I have hired from within many times. It’s about trying to educate yourself and figure out what your passion is, and realizing there’s space for almost any role when it comes to visual effects. That’s the exciting thing about it.


Barry Goch is senior finishing artist at The Foundation and an instructor in post production at UCLA Extension.


VFX supervisors talk Amazing Stories and Stargirl

By Iain Blair

Even if you don’t know who Crafty Apes are, you’ve definitely seen their visual effects work in movies such as Deadpool 2, La La Land, Captain America: Civil War and Little Women, and in episodics like Star Trek: Picard and Westworld. The full-service VFX company was founded by Chris LeDoux, Jason Sanford and Tim LeDoux and has locations in Atlanta, Los Angeles, Baton Rouge, Vancouver, Albuquerque and New York. Its roster of creative and production supervisors offers a full suite of services, including set supervision, VFX consultation, 2D compositing and CG animation, digital cosmetics, previsualization and look development.

Aldo Ruggiero

Recently, Crafty Apes worked on two high-profile projects — the reboot of Steven Spielberg’s classic TV series Amazing Stories for Apple TV+ and Disney+’s Stargirl.

Let’s take a closer look at their work on both. First up is Amazing Stories and Crafty Apes VFX supervisor Aldo Ruggiero.

How many VFX did you have to create for the show?
The first season has five episodes, and we created VFX for two episodes — “The Heat” and “Dynoman and the Volt!” I was on set for the whole of those shoots, and we worked out all the challenges and problems we had to solve day by day. But it wasn’t like we got the plates and then figured out there was a problem. We were very well-prepared, and we were all based in Atlanta, where all the shooting took place, which was a big help. We worked very closely with Mark Stetson, who was the VFX supervisor for the whole show, and because they were shooting three shows at once, he couldn’t always be on set, so he wanted us there every day. Mark really inspired me to take charge and to solve any problems and challenges.

What were the main challenges?
Of the two episodes, “Dynoman and the Volt!” was definitely the more challenging, as we had this entire rooftop sequence, and it was quite complicated: half was done with bluescreen and half using a real roof. We had about 40 shots cutting back and forth between them, and we had to create a 360-degree environment that matched the real roof seamlessly. Doing scenes like that, with all the continuity involved, and making it totally photo-real is very challenging. A one-off shot is really easy compared with that, as it may take 20 man-days to do; this took about 300 man-days to get done — to match every detail exactly, all the color and so on. The work we did for the other episode, “The Heat,” was less challenging technically and more subtle. We did a lot of crowd replacement and a lot of cleanup, as Atlanta was doubling for other locations.

It’s been 35 years since the original Amazing Stories first aired. How involved was Spielberg, who also acts as EP on this?
He was more involved with the writing than the actual production, and I think the finale of “Dynoman and the Volt!” was completely his idea. He wasn’t on the set, but he gave us some notes, which were very specific, concise and pointed. And of course, visual effects and all the technology have advanced so much since then.

Gabriel Sanchez

What tools did you use?
We used Foundry Nuke for compositing and Autodesk Maya for 3D animation, plus a ton more. We finished all the work months ago, so I was happy to finally see the result on TV. It turned out really well, I think.

Stargirl
I spoke with VFX supervisor Gabriel Sanchez, a frequent collaborator with Wes Anderson. He talked about creating the VFX and the pipeline for Stargirl, the musical romantic drama about teenage angst and first love, based on the best-selling YA novel of the same name, and directed by Julia Hart (Fast Color).

How many VFX did you have to create for the film, and how closely did you work with Julia Hart?
While you usually meet the director in preproduction, I didn’t meet Julia until we got on set since I’d been so busy with other jobs. We did well over 200 shots at our offices in El Segundo, and we worked very closely together, especially in post. Originally, I was brought on board to be on the set to oversee all the crowd duplication for the football game, but once we got into post, it evolved into something much bigger and more complex.

Typically, during bidding and even during the script breakdown, we always know there’ll be invisible VFX, but we don’t know exactly what they’ll be until we get into post. So during preproduction on this, the big things we knew we’d have to do up front were the football and crowd scenes, maybe with some stunt work, and the CG pet rat.

What were the main challenges?
The football game was complex: they wanted not just the crowd duplication but also one long, seamless take, because it’s the halftime performance. So we blocked it and did it in sections, trying to create the 360 so we could go around the band and so on.

The big challenge was then stitching all those cuts together into a seamless take, but there were issues, like where the crowd would maybe creep in during the 360, or we’d have a shadow or we’d see the crane or a light. So that kind of set the tone, and we’d know what we had to clean up in post.

Another issue was a shot where it was raining and we had raindrops bouncing off a barn door onto the camera, which created this really weird long streak on the lens, and we had to remove that. We also had to change the façade of the school a bit, and we had to do a lot of continuity fixes. So once we began doing all that stuff, which is fairly normal in a movie, it all evolved in post into much more complex and creative work.

What did it entail?
Sometimes, in terms of performance, you might like how an actress delivers her lines in one take but prefer another take of how an actor replies or responds, so we had a lot of split screens to make the performance come together. We also had to readjust the timing of the actors’ lip movements sometimes to sync up with the audio, which they wanted to offset. And there were VFX shots we created in post where we had no coverage.

For instance, Julia needed a bike in front of a garage for a shot that was never filmed, so I had to scan through everything, find footage, then basically create a matte painting of the garage and find a bike from another take, but it still didn’t quite work. In the end, I had to take the bike frame from one take, the wheels from another and then assemble it all. When Julia saw it, she said, “Perfect!” That’s when she realized what was feasible with VFX, depending on the time and budget we had.
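For readers curious what one of those performance split screens looks like at the node level, here is a minimal Nuke Python sketch; the file paths, frame ranges and timing offset are hypothetical, and a real shot would also involve tracking, grading and edge cleanup:

```python
import nuke

# Two takes of the same locked-off setup (hypothetical paths/ranges).
take_a = nuke.nodes.Read(file="/shots/sc12/take_a.%04d.exr", first=1001, last=1100)
take_b = nuke.nodes.Read(file="/shots/sc12/take_b.%04d.exr", first=1001, last=1100)

# Slip take B a few frames so the reply lands on the right beat.
retime = nuke.nodes.TimeOffset(time_offset=-3)
retime.setInput(0, take_b)

# Roto shape isolating the half of frame we want from take B
# (the artist draws the shape; it is created empty here).
mask = nuke.nodes.Roto()

# Merge take B over take A, limited to the roto shape via the mask input.
split = nuke.nodes.Merge2(operation="over")
split.setInput(0, take_a)   # input 0: background (B pipe)
split.setInput(1, retime)   # input 1: foreground (A pipe)
split.setInput(2, mask)     # input 2: mask
```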

How many people were on your team?
I had about 10 artists and two teams. One worked on the big long seamless 360 shot, and then another team worked on all the other shots. I did most of the finishing of the long halftime show sequence on Autodesk Flame, with assistance from three other artists on Nuke, and I parceled out various bits to them — “take out this shadow,” “remove this lens flare” and so on — and did the complete assembly to make it feel seamless on Flame. I also did all the timing of the crowd plates on Flame. Ultimately, the whole job took us about two months to complete, and it was demanding but a lot of fun to work on.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Battlesuit: Creating a pilot remotely, in realtime with Unreal Engine

By Randi Altman

Filmmaker Hasraf “HaZ” Dulull, who began his career as a visual effects artist and VFX supervisor, enjoys working on sci-fi projects. His reel is full of them, including his own films, such as The Beyond and 2036: Origin Unknown. Even the Disney show he directed, Fast Layne, had an aspect of futurism to it thanks to a sophisticated talking car. He also offers a master class on science-fiction filmmaking.

Hasraf “HaZ” Dulull

So it’s not surprising that Dulull’s most recent project also focuses on this genre. Dulull recently completed Battlesuit, a proof-of-concept pilot episode for a sci-fi animated series called The Theory. It’s based on the graphic novel of the same name and is in development at comic book publisher TPub Comics. The story centers on an astro-archaeologist who travels the universe looking for answers to help save humanity. In this particular story, she discovers the remains of a mech robot whose last memory will reveal what happened to the planet’s civilization.

Dulull used Epic’s realtime Unreal Engine to produce the pilot cost-effectively and remotely — just before COVID-19 hit. The series is currently in development and being shopped around to different networks, with Dulull attached as director and executive producer. You can see the proof of concept here.

OK, let’s find out more from Dulull.

How early did you get involved in Battlesuit?
Neil Gibson, who is the creator of the graphic novel “The Theory,” reached out to me last October for a general coffee chat. He was a fan of my previous work and wanted to get some advice about moving TPub Comics’ IP into the world of film and TV. He gave me some graphic novels that he thought I would like, and “The Theory” was one of them. There was one story within the book that stood out for me — Battlesuit. We caught up once again, and Neil mentioned that they were looking at creating proof of concepts for some of their IP being developed for TV and asked if I was interested in directing one. I requested Battlesuit.

Was it always meant to be animated?
Originally, it was planned for a live-action proof of concept, but due to budget and schedule constraints we knew there was no way we would be able to do the vision justice. And I really wanted to stay true to the graphic novel’s story, so I went away to rethink how I would pull it off. That was around the time when Netflix was putting out a lot of animated projects like Love, Death & Robots and Castlevania, so there was a huge rise in that market. So I went back to them with a pitch to do it as a pilot episode for an animated series.

Naturally, their reaction was that animation is waaay too expensive — you’d need an animation house and tons of time, etc. But I had already come up with a way of executing it using a realtime animation approach. I did a quick test scene using existing free assets to get my point across on what this would look like, but also to really know for myself whether it could be done — this was the test scene I did. Their response was, whoa, if you can do that for the budget and in a 12-minute pilot episode duration, then go for it. It also helped that I put the test scene online and got great reactions from it. That gave TPub Comics, who financed Battlesuit, the confidence to move ahead.

What was your team size and duration of production and remote production?
The team on the actual animation was three, including myself. As the director, I handled all the camera, layout, lighting and shot creation, which was great, as doing realtime animation gave me so much freedom and control. There was also Ronen Eytan, the technical Unreal Engine artist, who put together a cool animation pipeline using a live link with his iPad to capture the facial performances of the actors. Lastly, there was Andrea Tedeschi, a CG artist/generalist responsible for assets and environments. He has collaborated with me on all my projects going back to 2015, when I did my short film Sync, and he has worked on my features and other projects since.

Outside of the animation team, I brought on music composer and sound designer Edward Patrick White, who had just finished delivering the score for the latest Xbox title, Gears Tactics. Our voice actors included Nigel Barber (Mission: Impossible – Rogue Nation, Spectre and my feature film The Beyond) and Kosha Engler, who has done performance work for video games such as Terminator: Resistance and Star Wars: Battlefront.

How did you find your way to Unreal Engine specifically?
I had been using Unreal Engine since last September, doing previz for my live-action feature film Lunar, which was in soft prep while casting (due to COVID-19, production on that project is on hold), and I realized that the quality of the previz I was creating was very high, with cinematic lighting. I thought that with a bit of love and raytracing, this could end up being an animated film … but Lunar remains a 100% live-action movie.

There are other realtime engines out there, like Unity, which is great, but I had already been using Unreal Engine, so it made sense to push further with it. I also got some great support from the team at Epic in London to assist me in pushing the angle I was going for.

The big thing with using Unreal Engine is the “what you see is what you get” approach to creating scenes and shots, and as a filmmaker who is very hands-on (a control freak, really!), I found it a pure joy to be able to create shots and then hand them over to Andrea and Ronen to build further.

But the other big point I want to make is that we removed all the various pipeline steps you usually get with CGI animation projects (rendering, compositing, etc.) because everything was being rendered in real time. So all I was doing was exporting ProRes 4444 QuickTimes (Rec 709 color space) and, in some cases, EXR frames directly out of Unreal Engine and into editorial; that was it.
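As a concrete sketch of that single-step delivery: in newer Unreal builds, the Movie Render Queue exposes this kind of export to Python. The snippet below is illustrative only; the sequence, map and output paths are hypothetical, the ProRes encoder ships as a separate plugin, and Battlesuit itself predates some of this API:

```python
import unreal

# Movie Render Queue sketch: render a level sequence straight to a ProRes
# QuickTime for editorial, with no external compositing step.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

# Hypothetical assets; substitute your own sequence and map paths.
job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Seq_Warzone.Seq_Warzone")
job.map = unreal.SoftObjectPath("/Game/Maps/TheLevel.TheLevel")

config = job.get_configuration()

# ProRes 4444 output (requires the Apple ProRes Media plugin; the enum
# name may vary between engine versions).
prores = config.find_or_add_setting_by_class(unreal.MoviePipelineAppleProResOutput)
prores.codec = unreal.AppleProResEncoderCodec.PRO_RES_4444

# Where the finished QuickTime lands, ready to drop into editorial.
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
out_dir = unreal.DirectoryPath()
out_dir.path = "D:/renders/battlesuit"
output.output_directory = out_dir

# Kick off a local render in a Play-In-Editor session.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```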

Any challenges or lessons learned from working in realtime?
The big challenge is adjusting the way to think about what shots and scenes are in a realtime environment. Traditionally in CGI you have each 3D file as a scene or shot, but in Unreal Engine you have one big world called “The Level,” which lives in the main project. Then inside each level are cinematic sequences you create that use all the assets that live in the content part of your project. Once you get your head around that, it’s so much fun and you realize it’s actually way faster working this way.

The other challenge was that everything was coming out of Unreal Engine with no compositing at all to cheat and fix things. This ensured our assets all worked well and we were being smart with shot constructions. One thing to note is the fact that all the shots were created entirely on a laptop — the Razer Studio. Andrea and Ronen used desktop PCs for their work and then sent their packaged Unreal Engine files and assets to me via Dropbox, which I then migrated into the project.

HaZ working on the Razer

The Razer laptop comes with an Nvidia Quadro RTX 5000, and it was literally like having a beast of a desktop machine in my laptop. This was super-helpful because back in early January I was traveling to various CG conferences giving talks and keynotes, and it allowed me to keep working on the project in a variety of hotel rooms.

Raytracing took the project’s visual look to another level, as we were getting reflections, shadows and lighting of such cinematic quality … all in realtime. It was kind of mind-blowing at times to be scrubbing back and forth in a sequence in Unreal Engine with explosions going off, spaceships flying and robots firing weapons as I was moving my camera around — again, all in realtime.

What other tools did you use in this workflow?
For the war zone sequence, I wanted a visceral and gritty tone to the camera moves. I also knew it would take a lot of keyframe animation to do this, so I used a virtual camera solution called DragonFly from Glassbox Technology. Phillipa Carrol, who I knew from The Foundry, reached out to me after seeing my early tests online and gave me a license to use, along with some great support from her team. I was able to shoot the action scenes using my iPad as a virtual production camera while the war zone action scenes played back in realtime.

Virtual camera

The exported shots from Unreal Engine were brought into Blackmagic DaVinci Resolve 16 for editorial and color grading.

Do you think this is the future of filmmaking, especially in the world of COVID-19? How do you see it helping getting production working again?
I think virtual production in general is going to play a big part in content being made for films and TV. And it’s going to be used more and more as the rendering quality of CG in real time is getting so photoreal (I have seen the recent Unreal Engine 5 demo, and wow!) and you can play that back on LED screens and capture actors all in camera.

From my end, it’s allowed me to develop and create big, bold ideas with animated series content without having a big studio or huge teams — with the entire production done remotely. Even the additional voice recording we needed during editorial was done remotely, with me directing Nigel Barber via iMessage on the iPhone. He would then email me the WAV files and, boom, we had our character voiced in the edit.

Realtime technology also removes the usual argument for having everyone under one roof — speed and efficiency of communication — because thanks to Zoom or Skype screen sharing, I can direct artists as they make changes instantly in Unreal Engine, without them needing to upload versions for me to review and annotate and send back. So those Zoom/Skype dailies sessions are actually production sessions, because by the end of the call, all the changes have been implemented.

What’s next for you?
Battlesuit actually opened the doors for me as a filmmaker to tell stories using animation and broke down the various barriers and obstacles I had before when trying to get animated projects off the ground.

I recently signed on to direct an animated feature film based on a video game IP with producers in Hollywood. I can’t say much about it yet, but it’s using the same approach I did with Battlesuit (all in Unreal Engine). The details will be announced later this year.

You can watch the episode and the making of here:


Weta opens animation wing led by Prem Akkaraju

Weta Digital will begin producing original content for the first time in its 25-year history. Under the banner Weta Animated, the company will develop original animated content for both cinema and streaming platforms. The company has named Prem Akkaraju, a co-founder of SR Labs, as CEO.

Weta Animated has been a long-held dream of majority owners Peter Jackson and Fran Walsh, who will write, produce and direct several animated projects for the company.

“We are huge fans of animated storytelling in all of its forms, but it can be a long, protracted and often costly way to make movies. That’s, in part, why we have created this company — to change the model and open the doors to filmmakers and storytellers who might not otherwise be given the chance to show what they can do,” says Jackson.

Academy Award winners Jackson and Walsh will play a key role in the development of Weta Animated. The new production company will work alongside Weta’s visual effects business for the film and television industry.

Jackson goes on to say, “We’re fortunate to have a strong, creative leadership team at Weta. Both [senior VFX supervisor] Joe Letteri and [executive VFX producer] David Conley have played a huge role in the success of the company. With the expansion of the company, adding someone of Prem’s caliber to this mix is essential. Prem’s energy and passion for film is inspiring; we cannot wait to work with him.”

Weta Digital is known for revolutionizing the VFX production pipeline for some of the biggest films of all time, including Avatar, The Lord of the Rings trilogy and Avengers: Endgame. Over the past 25 years, Weta Digital has developed more than 100 proprietary tool sets along with AI technology.

“Weta Digital began with one machine and just one artist, who created the digital effects in Heavenly Creatures,” says Walsh. “None of us knew what we were doing, but even in those early days, we could see the incredible potential of this new technology. Since then, VFX has become a huge industry, but our goal has remained the same — to bring stories to life through the power of imagination. If you can dream it, we can create it.”

This sentiment is shared by entrepreneur Sean Parker (Napster, Facebook), who invested in Weta Digital last year and who has joined the board as vice chairman. Akkaraju co-founded SR Labs with Parker, where he previously served as CEO and still serves as executive chairman. SR Labs solutions streamline traditional distribution to theaters and consumers. Akkaraju is an inventor on 11 domestic and 25 global utility technology patents related to SR Labs’ groundbreaking architecture. Prior to SR Labs, Akkaraju was the chief content officer of SFX Entertainment. Before joining SFX, he was a principal at JPMorgan Entertainment Partners, the largest entertainment-focused Wall Street investment fund at the time.


Foundry Katana 3.6 includes UI and workflow updates

Foundry has released Katana 3.6. The latest version features fundamental UI and workflow updates, with artist-focused snapping functionality that accelerates tasks such as light placement. Katana 3.6 also includes advancements within 3Delight NSI 2.0, which features a toon shading tool set, overhauled live rendering and powerful new texturing tools.

The new Network Material Edit node provides a new UX on top of Katana’s procedural shading workflows. Existing network materials can be edited with minor tweaks or whole new sections of node graph, allowing procedural shot-based edits. Edits are captured in a color-coded UI that clearly documents the changes made by any artist, facilitating collaboration and subsequent edits.
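As a rough illustration of how such a shot-level edit might be created procedurally, here is a small script-editor sketch using Katana’s NodegraphAPI; the node type string, node names and port names are assumptions for illustration, not taken from Foundry’s release notes:

```python
from Katana import NodegraphAPI

root = NodegraphAPI.GetRootNode()

# Create the new edit node (assumed to register as 'NetworkMaterialEdit')
# so a shot-specific tweak layers on top of the original network material
# without modifying it.
edit = NodegraphAPI.CreateNode('NetworkMaterialEdit', root)
edit.setName('NME_shotTweak_sh010')
NodegraphAPI.SetNodePosition(edit, (0, -200))

# Wire it downstream of an existing look-development node, if present
# (hypothetical node and port names).
lookdev = NodegraphAPI.GetNode('NetworkMaterial_heroAsset')
if lookdev:
    edit.getInputPort('in').connect(lookdev.getOutputPort('out'))
```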

Katana 3.6 highlights include:

• Snapping in the Hydra-powered viewer. Thanks to visual cues including wireframe, object outline, and face and edge highlighting, artists know exactly how they are manipulating objects.
• The new Network Material Edit node combines the basis of Katana 3.2’s UI work with the procedural functionality of the legacy Network Material Splice and Network Material Parameter Edit tools. It offers a natural and intuitive workflow to artists familiar with the Network Material node graph and a new UX that boosts procedural power across entire look development and lighting teams.
• Dockable widgets and a new modular system make it possible to use dockable areas on the top, bottom, left and right of the UI.
• 3Delight NSI 2.0 features a toon shading workflow, which can now leverage all the benefits of Katana’s bulk asset look development and sequence-based lighting, plus overhauled live rendering and new tools for texture-based look development.

“Katana 3.6 represents another release that brings us closer to Foundry’s vision of the digital cinematography platform of the future,” says Jordan Thistlewood, director of product — preproduction, look development and lighting. “The work on tools like snapping is more than just a tool in itself; it is the foundation of much more to come in the future.”



Lenovo intros next-gen ThinkPad mobile workstations

Lenovo has launched the next generation of its ThinkPad P Series: the ThinkPad P15, ThinkPad P17 and ThinkPad P1 Gen 3; the new ThinkPad P15v; and the ThinkPad X1 Extreme Gen 3. Equipped with high-performance 10th Gen Intel H series mobile processors, these new ThinkPads are available in a variety of configurations.

ThinkPad P1 Gen 3

The ThinkPad P Series and the ThinkPad X1 Extreme Gen 3 feature the new Ultra Performance Mode, exclusive to these systems, which allows users to take full control of their performance settings. Users can now dial up the system, ensuring peak performance when they need to complete a render as fast as possible or demo high-fidelity VR content while maintaining a stable frame rate.

Enabled by default as a setting in BIOS, Ultra Performance Mode relaxes restrictions on acoustics and temperature, allowing users to tap into the GPU and CPU and leverage an improved thermal design to maintain the integrity of the machine and deliver increased performance.

A complete reengineering of the thermal design optimizes performance on the ThinkPad P15 and P17 over their predecessors, resulting in what Lenovo says is 13% more airflow, a 30% larger CPU heat sink, larger vents and a new thermal mesh to dissipate heat faster.

Lenovo has also moved to a new daughter card design instead of relying on a soldered solution. The ThinkPad P15 and P17 will feature this modular design, offering four times as many GPU and CPU configurations as previous generations. With Nvidia Quadro RTX GPUs on board, the ThinkPad P15 and P17 support higher-wattage graphics than their predecessors, increasing from 80 watts to 90 watts and from 90 watts to 110 watts, respectively. This increase allows users to select the right configuration for their needs — optimizing performance for their workflow directly out of the box and enabling more complex graphics on a mobile workstation.

ThinkPad P15 and P17

The ThinkPad P15 and P17 boast additional shared features – including a new 94WHr battery, up to 4TB of storage, up to 128GB of DDR4 memory and UHD Dolby Vision HDR displays.

The ThinkPad P15 and P17 will be available in July starting at $1,979 and $2,119, respectively.

Lenovo’s thinnest and lightest 15-inch mobile workstation – the ThinkPad P1 Gen 3 – has been updated with additional usability features including a new anti-smudge coating, upgraded speakers and a new UHD LCD display option with a 600-nit panel. For mobile workstation users in areas without expansive Wi-Fi access, the ThinkPad P1 Gen 3 also offers optional LTE WWAN – the fastest internet option for remote workers – for increased mobility and performance.

The ThinkPad P1 Gen 3 will be available in July starting at $2,019.

ThinkPad X1 Extreme Gen 3

The latest ThinkPad X1 Extreme Gen 3 is designed for advanced users seeking a high-performance Windows 10 laptop with 10th Gen Intel H series vPro mobile processors up to Core i9 and optional Nvidia GeForce GTX 1650 Ti graphics. This combination of processing power and high-performance graphics, along with a 15.6-inch display with up to 600 nits of brightness, offers users advanced productivity and collaboration capabilities.

New Wi-Fi 6 and optional Cat 16 LTE-A wireless WAN provide reliable high-speed data transfers for a highly efficient remote working experience. Modern Standby helps ensure emails, messages and updates are received, even when the lid is closed, and allows rapid resume.

The ThinkPad X1 Extreme Gen 3 will be available in July. Price to be announced.

Rounding out the mobile workstation portfolio is the new ThinkPad P15v. Powered by 10th Gen Intel H series mobile processors, the 15-inch P15v offers a UHD 600-nit LCD display and the Nvidia Quadro P620 GPU.

The ThinkPad P15v will be available in July starting at $1,349.


Tom Kendall

Picture Shop VFX and Ghost merge, Tom Kendall named president

Ghost artists at work in Copenhagen studio.

Streamland Media (formerly Picture Head Holdings) has consolidated its visual effects offerings under the Ghost VFX brand. Picture Shop’s visual effects division will merge with Ghost VFX to service feature film, television and interactive media clients. LA-based Picture Shop, as part of the Streamland Media Group, acquired Denmark’s Ghost VFX in January.

Tom Kendall, who headed Picture Shop VFX, will move into the role of president of Ghost VFX, based out of the Los Angeles facility. Jeppe Nygaard Christensen, Ghost co-founder and EVP, and Phillip Prahl, Ghost SVP, will continue to operate out of the Copenhagen studio.

“I’m very excited about combining both teams,” says Kendall. “It strengthens our award-winning VFX services worldwide, while concentrating our growing team of talent and expertise under one global brand. With strategic focus on the customer experience, we are confident that Ghost VFX will continue to be a partner of choice for leading storytellers around the world.”

Over the years, Ghost has contributed to more than 70 feature films and titles. Some of Ghost’s work includes Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery. Recent Picture Shop VFX credits include Hawaii Five-0, Magnum P.I., The Walking Dead and Fear the Walking Dead.

The Streamland Media Group includes Picture Shop, Formosa Group, Picture Head, Ghost VFX, The Farm and Finalé, with locations in the US, Denmark, Canada and the UK.


Quick Chat: Compositor Jen Howard on her move from films to spots

By Randi Altman

Industry veteran Jen Howard started her career as a model maker before transitioning to compositing. After spending the last 20 years at ILM working on features — including Avatar, Pirates of the Caribbean: At World’s End, Transformers, Hulk and Jurassic World — she recently made the move to Carbon Chicago to work on commercials.

While Howard’s official title is Nuke compositor, she has been credited on films as digital artist, lead digital artist, sequence lead, compositing lead and sequence supervisor. We recently reached out to her to talk about her transition, her past and present. Enjoy!

While you specialize in Nuke, your official title is compositor. What does that title entail?
Regardless of what software package one uses, being a compositor entails marrying together many pieces of separately shot footage so that they appear to be part of a single image sequence captured at one time.

For realistic-style productions, these pieces of photography can include live-action plates, rendered creatures, rendered simulations (like smoke or water), actors shot against greenscreen, miniatures, explosions or other practical elements shot on a stage. For more stylistic productions, that list might also include hand-drawn, stop-motion or rendered animations.
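Under the hood, that marrying of elements comes down to the Porter-Duff “over” operation. As a toy illustration (a minimal NumPy sketch, not any studio’s pipeline code), layering a premultiplied foreground onto a background looks like this:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': premultiplied foreground layered onto a background.

    fg_rgb   -- (H, W, 3) premultiplied foreground colors
    fg_alpha -- (H, W, 1) foreground coverage/opacity in [0, 1]
    bg_rgb   -- (H, W, 3) background colors
    """
    # Premultiplied math keeps soft alpha edges clean; compositing
    # unpremultiplied color is a classic source of "matte lines."
    return fg_rgb + (1.0 - fg_alpha) * bg_rgb

# Toy example: a 50%-transparent mid-gray card over a black background.
h, w = 4, 4
fg = np.full((h, w, 3), 0.25)      # premultiplied: 0.5 color x 0.5 alpha
alpha = np.full((h, w, 1), 0.5)
bg = np.zeros((h, w, 3))
comp = over(fg, alpha, bg)         # every pixel lands at 0.25
```

Every merge in a compositing package is, at bottom, some variant of this math per pixel; the craft Howard describes lies in everything around it.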

Sounds fun as well as challenging.
Yes, compositing presents both technical and aesthetic challenges, and this is what I love about it. Each shot is both a math problem and an art problem.

Technically, you need to be able to process the image data in the gentlest way possible while achieving a seamless blend of all your elements. No matte lines, no layering mistakes, solid tracking, proper defocus and depth hazing. Whether or not you’ve done this correctly is easy to see by looking at the final image — there is largely a right and a wrong result. The tracked-in element is either sliding, or it isn’t. However, whether you’ve made the right aesthetic decisions is a trickier question.

The less quantifiable goal for all the artists on a shot is to manifest the director’s vision … to take the image in their head and put it on the screen. This requires a lot of verbal discussion about visuals, which is tricky. Sometimes there is production art, but often there isn’t. So what does it mean when the director says, “Make it more mysterious”? Or what if they don’t even know what they want? What if they do, but the people between the director and the artists can’t communicate that vision downstream clearly?

When you build an image from scratch, almost everything can be in play — composition, contrast, saturation, depth of field, the direction and falloff of lighting, the placement of elements to frame the action and direct the eye. It is a compositor’s job to interpret the verbal input they’ve received and know what changes to make to each of these parameters to deliver the visual look and feel the director is after and to tell their story.

What would surprise people the most about what falls under that title?
I think people are still surprised at how many aspects of an effects shot are in a compositor’s control, even today when folks are pretty tech-savvy. Between the person doing the lighting and rendering and the compositor, you can create any look. And they’re surprised at the amount of “hand work” it entails, as they imagine the process to be more automated than it is.

How long have you been working in visual effects?
During college, I became a production assistant for master model maker Greg Jein, and he taught me that craft. Interesting fact — the first lesson was how to get your fingers apart after you’ve glued them together. I worked building models until about 1997, then crossed over to the digital side. So that’s about 30 years, and it’s a good thing I’m sitting down as I say that.

Kong

How has the industry changed in the time you’ve been working? What’s been good? What’s been bad?
When I was a model maker, most of that work was happening in the LA area. The VFX houses with their own model shops and stages, and the stand-alone model shops, were there. There was also ILM in the Bay Area. These places drew on local talent. They had a regular pool of local freelancers who knew each other, and a lot of them fell into the field by accident.

I worked with welders, machinists and sci-fi geeks good at bashing model kits who ended up working at these places because someone there knew them, and the company needed their skill set. Then all of a sudden, they were in show business. There was a family feel to most shops, and it was always fun. Some shops were union, so the schedules for projects at those places mostly fit the scope of work, and late nights were rare. The digital world was the same for a long time.

Model shops mostly went away, and as everyone knows, most digital feature effects are now done overseas, with some tasks like roto and matchmoving entirely farmed out to separate smaller companies. Crews are from all over the globe, and I’d hazard a guess that those folks got into the industry on purpose because now it is a thing.

What we’ve gained with this new paradigm is a more diverse pool of new talent who can find their way into the industry pretty much no matter where they’re from. That makes me happy because I feel strongly that everyone who has a love for this kind of work should get a shot at trying it. They bring fresh vision and new ideas to the industry and an appetite for pushing the technology further.

What’s lost is the shorthand and efficiency you get from a crew that’s worked together for a long time. They’re older and have made a lot of the mistakes already and can cut to the chase quickly. They make great mentors for the younger artists when tapped for that job, but I don’t feel that there’s been the amount of knowledge transfer there could have been — in either direction. Sometimes an “us versus them” dynamic emerges, which is really unfortunate.

Another change is the increasingly compressed schedule of feature production, which creates long hours and weekend work. This is hard on everyone, both physically and emotionally. The stress can be intense and translates into work injuries and relationship tension and is extremely hard on families with children. Studios have been pushing for these shorter schedules and cheaper prices. VFX work has been moved to countries that offer tax breaks or a generally cheaper labor pool. So quality now takes a back seat two ways: There isn’t enough time, and sometimes there isn’t enough experience.

You recently made the move to Chicago and spot work after years at ILM working on features. Can you talk about the differences in workflows?
The powerful role of advertising agencies in commercial work really surprised me. In film, the director is king, and they’re there all the way through the project, making every creative decision. In advertising, it seems the director shoots and moves on, and the agency takes up the direction of the creative vision in post production.

The shorter timeline for spot work translates into less time for 3D artists to iterate and finesse their renders, which are time-intensive to run, and so the flexibility and faster turnaround of comp means more comp work on renders, sooner. In features, 3D artists ideally have the time to get their render to a place that they’re mostly happy with before comp steps in, and the comp touch can be pretty light. (Of course, feature timelines are becoming more compressed, so that’s not always true now.)

Did a particular film inspire you along this path?
Two words. Star Wars. (Not unusual, I know.) Also, when I was older, Japanese anime. Star Blazers (Yamato), specifically.

Growing up, I watched my mom struggle to make enough money to support us. She had to look for opportunity everywhere, taking whatever job was available. Mostly she didn’t particularly enjoy her jobs, and I noticed the price she paid for that – spending so many hours with people she didn’t enjoy, doing work that didn’t resonate for her. So it became very important for me to find work that I loved. It was a very conscious goal.

You mentioned school earlier. Was that film school?
Yes, I went to CalArts in Valencia, California, just outside of LA. I studied animation and motion graphics, but I discovered pretty quickly that I had no talent for animation. However, I became fascinated with the school’s optical printer and motion control camera, and I played a lot with those. The optical printer is the chemical way of compositing that was used before digital compositing was developed. Using those analog machines helped me understand digital compositing down the road.

Porsche’s The Heist

Can you name some recent projects you’ve worked on?
My last ILM project was the new Star Wars ride that opened recently in Disneyland, called Rise of the Resistance. Other recent projects include Solo: A Star Wars Story, Transformers: The Last Knight, Kong: Skull Island and Bumblebee.

While at Carbon, I worked on a spot for Porsche called The Heist and a Corona campaign.

What projects are you most proud of?
For model making, I’m proud of the work I did on Judge Dredd, which came out in 1995. I got to spend several months just detailing out a miniature city with little greebles — making up futuristic-looking antennae and spires to give the city more scale.

Batman

On the digital side I’m really proud of the look we developed for Rango, ILM’s one and only animated feature, directed by Gore Verbinski. We brought a lot of realistic cinematic zing to that world using some practical elements in combination with rendered layers, and we built comp into the process deliberately so we could dial to our hearts’ content.

I’m also extremely proud of the first three Pirates movies, in which we did something of the opposite — brought a fantasy world to reality. The pirate characters are extreme in their design, and it was especially rewarding to see them come to life.

Where do you find inspiration now?
Chicago is amazing. I’m a fan of architecture, and I have to say, this city knocks my socks off in that department. It is such a pleasure to live somewhere where so much thought has gone into the built environment. The Art Institute is constantly inspirational, and so is my backyard, which is full of bunnies and squirrels and my wife and our two kids.

What do you do to destress from it all, especially these days?
Well, we don’t really leave the house, so right now I mostly hide in the bathroom.

Any tips for folks just starting out?
– Do whatever you’re doing now to the best of your ability, even if it isn’t the job you ultimately want or even the field you want to be in. Relationships are key, and it can be surprising how someone you worked with 10 years ago can pop up suddenly in a position to help you out later on.

– Also, don’t be scared of software. Your most important asset is your ability to know what an image needs. You can learn any software.

– Start saving for retirement now.

As for me, I’m glad I didn’t know anything and that there was no internet or social media of significance until after I finished school. It meant I had to look inward to figure out what felt right, and that really worked for me. I wouldn’t want to spoil that.

VFX studio Cinesite adds three to global management team

Visual effects and animation studio Cinesite has added three to its management team: Melissa Taylor joins as general manager in London, Siobhan Bentley is head of production for VFX in London, and Tamara Boutcher is the company’s new global head of production for feature animation in Montreal.

“Melissa, Siobhan and Tamara are proven talented executives with deep knowledge of the visual effects and feature animation industries,” reports Cinesite CEO Anthony Hunt. “Working toward equal representation in an industry that is statistically male-dominated is very important to us all, and we’re working hard to improve the balance. We have a collective philosophy on diversity and inclusion, which is embodied in our wider approach of encouraging everyone on the team.”

Taylor will oversee Cinesite London’s visual effects studio while working closely with Montreal colleagues and group VFX brands Image Engine and Trixter. Taylor brings over 30 years’ industry knowledge and relationship-building experience. She joins Cinesite from visual effects studio Framestore, where she served as global head of business development for film and was involved with projects such as Spider-Man: Far From Home, Wonder Woman 1984, Lady and the Tramp, Tom and Jerry and Fast & Furious Presents: Hobbs & Shaw. Prior to Framestore, Taylor was EP at DNeg.

Bentley is tasked with leading, developing and motivating the London production teams throughout a show’s lifecycle. She joins Cinesite from MPC, where she oversaw the production teams on many acclaimed films, such as The Lion King, Roma, The Jungle Book, The Martian and Guardians of the Galaxy. She will work closely with crewing and producers to ensure projects are progressing through departments and deadlines are being met.

Based at Cinesite’s Montreal studio, Boutcher will be responsible for the production and day-to-day operations of the company’s feature animation service slate, developed and produced out of the Montreal and Vancouver studios. Her animation career began at The Walt Disney Company, and as the studio transitioned from 2D features to 3D features, Boutcher worked as the director of production, helping to guide the teams and develop practices and technologies for blending traditional animation and CGI. Her work is featured in The Addams Family, Dinosaur, The Angry Birds Movie and We Are Not Princesses.

Image Caption: L-R Siobhan Bentley, Melissa Taylor and Tamara Boutcher

VFX school for those with autism graduates class of 2020 via Zoom

Exceptional Minds, an LA-based school that teaches young adults with autism how to create visual effects, motion graphics and other digital arts, has announced that the Class of 2020 has graduated after finishing courses remotely.

The graduation was a Zoom event attended by parents and friends, among them guest speaker Rob Paulsen, best known as the voice of Yakko in Animaniacs and Pinky from Pinky and the Brain. “People will look at Exceptional Minds as pioneers and they will be inspired by it, and that inspiration will help them find their own gifts,” he said.

When COVID-19 hit, Exceptional Minds shut down its school and sent its students home. Overnight, its entire training model — the place where students met; the way instructors taught, socialized and interacted with this special population; and the lifeblood of what Exceptional Minds does and how it does it, right down to funding — was gone. And as anyone with a child, sibling or friend with autism knows, sudden change is a huge challenge.

No one was entirely sure if they could replicate this model in a remote, virtual environment, for a population that was already isolated and challenged in so many other ways. Remote learning carried Exceptional Minds 2020 graduates through the last critical months of their three-year training, not an easy feat for those on the autism spectrum.

According to Exceptional Minds’ Dee McVicker, what took the most time was figuring out how to format the classes. “We closed the school just as the rest of Los Angeles closed down. Our instructors spent a week working out the details of remote learning, and we were online with our students the following week. That was really quite amazing considering we are working with many in our population who do best with one-on-one learning. One thing our instructors decided right away is that instead of sending students home with homework and checking online from time to time, they would conduct their classes online in an all-day format. Our instructors also tried to instill a sense of community that is so important to any group of students, and especially students on the spectrum. We conducted Dungeons and Dragons games online, had movie nights (remotely) and did as much as we could online to keep those relationships engaged with us and each other.

“It was a wonderful way to develop an online program,” she says, “even if it was done quickly, and to learn about what works and doesn’t work online with our student population. As we move into summer, we are able to extend those learnings to our summer workshops. And going forward, we will be able to provide more of a hybrid approach to training our students, so we are no longer tied to the classroom as before.”

“From COVID-19 to Protest 2020, we have faced challenges the likes of which most of us have never seen before. But as we are learning, challenges test us in unexpected ways, and you all have demonstrated incredible flexibility and resilience in this new remote learning space that will serve you so well as you continue on an amazing journey that has just begun,” said Exceptional Minds executive director David Siegel to the class of 12 during Friday’s Zoom graduation.

For those who aren’t familiar with Exceptional Minds, the organization opened its doors in 2011 as a training school for individuals with autism and, in 2014, added a professional studio to bring in contract work for graduates of its three-year program.

Exceptional Minds 2020 graduates join alumni who have gone on to careers at Marvel Studios and Cartoon Network, worked on Oscar-nominated movies and produced animations for Sesame Street, among others. The academy and working studio’s mission is to create opportunities for individuals with autism, and a new online summer program will help ensure the learning continues.

Estudios GGM to resume production, open new soundstages

Estudios GGM in Mexico is unveiling three new soundstages as it prepares to resume production activity later this month. Ranging from 10,000 to 13,000 square feet, the new stages will be the studio’s largest and give it a total of nine shooting spaces. Construction of one stage is already complete, while work on the other two will be finished by November, when the studio expects to be supporting a full slate of television and feature productions.

Planned before the coronavirus outbreak, the new stages are meant to serve Mexico’s accelerating boom in television and film production. Launched in 2016, Estudios GGM was operating at capacity prior to the lockdown, providing stages, production offices, casting, editing, visual effects and other services to projects from Telemundo, Netflix, Amazon, Viacom, MGM and other producers. Enemigo Intimo, Falsa Identidad, El Club, Luis Miguel: The Series and Ingobernable are among the streaming series recently shot in whole or in part at the studio.

Francisco Bonilla

“We expect production activity to pick up rapidly beginning in June,” says Estudios GGM CTIO Francisco Bonilla. “We built these stages to increase capacity and meet the needs of producers from around the world who want to shoot in Mexico. They are large shooting spaces, have high ceilings and are supported by many other resources to accommodate a cinematic style of production.”

In addition to the social distancing guidelines mandated by the Mexican government, the studio will apply a variety of health and safety measures to protect cast and crew, including culture changes and hygiene training for work and everyday life; thermal CCTV monitoring; periodic chemical, ozone and UV sanitization; and restricted access to facilities, sets and offices. The new stages are complemented by modular, multi-purpose spaces that will allow directors, cinematographers, control room crew and other personnel to work in isolation. Other steps will include regular sanitizing of cameras, lighting, wardrobe and props; the use of masks and gloves; and modifications to craft and catering services. All the studio’s stages are equipped with HVAC systems that draw fresh air from outdoors to reduce the risk of spreading infection.

“We are working with local health officials and medical advisors to develop appropriate protocols,” notes Bonilla. “We are also monitoring the situations in Spain, Italy, Germany, Iceland, Australia and other countries where production has resumed. We are gathering as much information as possible to allow production to ramp up quickly but safely.”

While production has been curtailed during the lockdown, other work has continued. The studio has been using Bebop remote collaboration technology and Adobe tools to allow sound and picture editors, visual effects artists and others to carry on their work remotely. It has also been serving as a beta site for Avid On-Demand, a cloud-based editing platform. Similarly, post finishing has continued at Cinematic Media, the post facility located within the studio complex, with most staff working off site.

Estudios GGM is also expanding its visual effects department. It is hiring artists and adding new capabilities, including high-end motion capture and virtual set technology. Demand for visual effects services has risen dramatically along with the broader push to elevate production value. The studio expects the need for sophisticated visual effects to grow as productions look to limit travel and location production.

For producers eager to get back to production, Estudios GGM wants to make the process simple by providing one-stop solutions. “We provide everything necessary to produce premium television and cinema,” Bonilla says. “That includes experienced talent and crew to reduce the need to travel or bring people from outside the country.”

Behind the Title: Trollbäck ECD Elliott Chaffer

This artist’s biggest passion is live-action directing, “specifically in-camera VFX and CG integration.”

Name: Elliott Chaffer

Company: Trollbäck+Company

What does Trollbäck do?
We are a branding and design studio that builds strategy, multi-platform brands and moving experiences. Our founder, Jakob Trollbäck, started the company in 1999 with the goal of revolutionizing the way we communicate through motion graphics and emerging technologies. Since then, we’ve grown into a multidisciplinary design studio that offers brand design and content across industries and platforms.

What’s your job title?
Executive Creative Director

What does that entail?
As we are a small company with big ambitions, I wear many hats and really enjoy the broad range of projects we bring in.
Primarily, I am responsible for leading creative teams from pitch through production to delivery and amplification.

On any given day, I can be found ideating, in new business meetings, upselling to current clients, building decks, pitching creative, participating in strategic workshops, editing, directing animators and editors, directing live-action shoots and now with the lockdown, homeschooling my two kids at the same time.

Elliott Chaffer on set

What would surprise people about what falls under that title?
That I am not an “on the box” creative director, and you don’t have to be.

What’s your favorite part of the job?
I love it all. Mostly the team and our energy that we put into our work. My biggest passion is live-action directing, specifically in-camera VFX and CG integration; I try to apply that to projects where it best suits the client’s needs. Maybe because I’m old-school and ADD, I don’t like to sit still (hence why I am not on the box) but prefer to move fluidly between my different teams and have a more personal one-on-one connection than through Slack. Also, as I mentioned before, I love the variety of projects. It helps to keep it fresh and to learn new things from new people all the time.

What’s your least favorite?
The ones that got away. The jobs you were deeply invested in, pitched on hard and didn’t win, or that just disappear because of uncontrollable circumstances. Also, the jobs you are super-proud of but are not allowed to promote due to contractual agreements with clients. And finally, filling out time sheets and trying to account for the various minutes and hours spent on a whole range of projects.

What’s your most productive time of the day?
In the old days, it used to be after 8pm when the office went quiet, but more recently it was 8:30am after I dropped my daughter at school and had that hour of peace before the floodgates opened.

However, now that we are in COVID quarantine, I find that the whole day feels more productive because it is easier to be more focused when you are not all together in the studio. But I do miss the direct contact with the team and the energy that is created by being together. Zoom calls are just not the same.

If you didn’t have this job, what would you be doing instead?
Since childhood I always wanted to be an underwater cameraman exploring ocean caves, so maybe I would finally follow that burning passion. And then after work, I would float to the surface and go surfing, sit on the beach and watch the sun go down, sleep early, rinse repeat.

FX

How did you choose this profession?
I feel that it chose me to a certain extent, and it came about organically. In school I was only interested in art and languages, and everything else just seemed meaningless. (I was wrong, of course.) My dad had a photographic studio, and I used to spend a lot of my holiday time taking pictures and teaching myself how to develop and print them, which I found hugely satisfying.

I studied graphic design at art college and picked that course because I knew I wanted to have a broad approach and be able to work across different mediums. I got a chance to intern at MTV in London by pretending to be my brother, who actually had the internship but could not make it. Pretty soon I discovered the creative department, and, naturally, I wanted to work in graphics but was urged to be a director/producer, so I thought, OK, I’ll give it a try.

Very quickly I realized I loved combining my design and photography knowledge in ways I never thought about while at college. I learned to animate on the job, and the combination of these three fundamentals led me into branding on a larger level. It gave me so much pleasure to make stuff and see it go out on air to the whole of Europe that day. I was hooked and have never looked back. My career has been about continually creating my own luck and rolling into the next thing, from co-authoring the first-ever coffee table art book about sneaker collectors to starting a design company to going freelance to moving to the USA and working at two of the top creative shops in NYC with a great team for meaningful brands.

Can you name some recent projects you have worked on?
FX Networks’ masterbrand design system — We created a new visual identity, motion theory and custom-coded typeface that’s able to adapt to any mood, any series and any setting to maximize the brand’s attribution across platforms.

IRIS

We redrew a typeface and then deconstructed each letterform to create a custom animated typeface, designed to be built and manipulated in Adobe After Effects through the use of a custom script UI panel.

Fox Entertainment rebrand — Following the Disney merger, we relaunched Fox Entertainment as a bold new challenger brand and created robust systems for messaging, tone of voice, design and animation across every touchpoint, including on-air, streaming, digital, social, print and IRL applications.

Iris headphones — We developed a substantive, industry-disrupting brand identity for Iris, a new audio brand promising to change the way we see, live and experience sound in the world around us, defining Iris as an audio brand on a mission to reshape the culture of sound.

I am currently working from home, rebranding ABC networks and the BET Experience and working on a title sequence for a new series coming out on Amazon. As a studio, we are also getting involved in a large creative collective that will be responding to the current COVID-19 crisis, using our skills to help the world.

Fox

What is the project you are most proud of?
The Fox Entertainment internal brand film, because it combined all of our skills of brand strategy, writing, animation, in-camera VFX and CG integration, edit and sound design to create a really powerful piece that inspired a collective sea change throughout the brand.

Apple live wallpapers — I got to go to Thailand to shoot hundreds of beautiful tiny Siamese fighting fish on a Phantom camera. We had to smuggle a high-powered zero-heat LED light into the country so we could film the fish without boiling them in their tanks! We were capturing abstract shots of movement that could be activated by pressing the iPhone screen. When the job was done, it was a moment of pride to see something you have done in the hands of millions all around the world and used on video walls and interactive point of sale in Apple stores.

The Super Bowl halftime show graphics for The Who — The sheer scale of the audience for the halftime show was staggering, and the high-stakes stress of connecting a giant LED stage in 12 minutes and seeing everything sync up perfectly with the lighting cues was probably an all-time career high.

What social media channels do you follow?
I only do Instagram and LinkedIn and mostly follow friends, family, competitors and collaborators:
@rogiervanderzwaag — This Dutch guy makes some really inspiring optical illusions in camera that are so simple and graphic.
@fxwrx — My good friend and collaborator Christopher Webb has an amazing studio dedicated to shooting in-camera VFX.
@_xlmilk — This recently started channel is posting spreads from the sneaker book we made in the ‘90s and will promote the launch of the new book, currently in production 20 years later.

I also like to watch Houdini tutorials on Entagma.com.

Do you listen to music while you work?
Yes. When I eventually get to my desk, I like to listen to abstract ambient music with no lyrics so I can hear my own thoughts. Nils Frahm, Kiasmos, Olafur Arnalds — that kind of stuff. Also, whatever Spotify Discover Weekly wants to serve me up usually hits the mark.

Name three pieces of technology you can’t live without.
My QS6 synthesizer. It keeps me sane when I want to zone out and get away from the noise and make music. My cappuccino machine. Keeping me caffeinated, safe from going out, and saving me money during the COVID-19 lockdown. My laptop.

What do you do to destress from it all?
Now that we’re working from home, I like to get up and play along to background music on my keyboard when I need to refresh my mind or just untangle my thoughts.

Also, now that I have more time in the mornings and don’t have to do the school run, I like to make a routine of going to the park at 7:30am and 7pm to meditate, stretch and exercise. On the weekends I like to skateboard and snowboard and surf in the summer, and spend time with my kids, of course.

VFX studios Mr. X and Mill Film merge to target post-COVID world

Technicolor visual effects companies Mr. X and Mill Film have merged under the Mr. X name. Mr. X now becomes a VFX studio crossing four time zones, spanning Canada, the United States, Australia and India. This does not impact The Mill, which continues to operate as a separate entity.

The newly combined, expanded studio will service clients across both features and episodic. Laura Fitzpatrick, MD of Mill Film, will move into a managing director role at Mr. X, based in Montreal. Dennis Berardi, founder of Mr. X, assumes the role of creative director for the studio.

Technicolor acknowledges that COVID-19 is changing the entertainment industry, with the theatrical market being reimagined and many projects currently on hold indefinitely. The company says the merger is a direct and necessary response to the changing needs of the industry and its creative partners as productions begin again and the entertainment industry looks to move forward.

The expanded studio will have the flexibility to serve productions resuming at different times in different parts of the world, along with the capacity to handle an anticipated increase in VFX work in response to changes required for live-action filming.

“As the entertainment landscape has continued to evolve, both studios were naturally overlapping into each other’s spaces,” says Berardi. “Merging both brands allows us to build the perfect team for each and every client.”

With 20 years in the business, Mr. X has built collaborative partnerships with directors such as Guillermo del Toro and Paul W.S. Anderson. The studio has worked on The Shape of Water, Roma and Shazam!, to name a few.

Mill Film has delivered projects such as Gladiator, which won the Academy Award for Best Visual Effects in 2001, Harry Potter and the Philosopher’s Stone, plus more recent releases Maleficent: Mistress of Evil and Dora and the Lost City of Gold.

“Our aim is to partner with clients to realize their ideas and exceed visual expectations,” says Fitzpatrick. “With our merged brand we can pitch global expertise in all creative areas: original design and art direction, on-set supervision, environment creation, FX simulations, creature and character work.”

All facilities remain open in Toronto, Montreal, Los Angeles, Adelaide and Bangalore. The merger is effective immediately, with a period of transition for employees.

Main Image: Laura Fitzpatrick and Dennis Berardi.

Jellyfish Pictures uses cloud to grow global talent pool

Animation and VFX studio Jellyfish Pictures has expanded its operating model to access talent across the world. The move is the company’s next stage of development after opening a large virtual studio at the end of last year.

This new way of working allows Jellyfish Pictures to access talent anywhere in the world without having to invest in brick-and-mortar facilities or on-premises hardware. Artists can work from their own homes and have the same experience as teammates located 6,000 miles away, thanks to Teradici Cloud Access Software and Microsoft Azure. The new model has already been implemented, with artists joining the company from Israel, India, North America, Finland, Canada, Spain and Réunion.

With Jellyfish Pictures’ IT infrastructure already housed off site and completely virtual, the company uses Azure’s backbone to set up hubs all over the world, which connect back to Jellyfish Pictures’ tier-one data center in the UK.

Cristina Ortega working from home in the UK.

All content resides on PixStor, Pixit Media’s software-defined storage solution. Using Pixit Media’s dynamic data manager, Ngenea, integrated with pipeline tools and Azure, Jellyfish Pictures distributes files across creative hubs quickly and securely. Artists access their content from PixStor running in the cloud hub, which guarantees their performance requirements are always met. When completed, files automatically move back to the UK data center.
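
Ngenea handles that movement natively. Purely to illustrate the policy described above — this is not Ngenea’s API, which is proprietary, and every path, mount point and completion check below is hypothetical — the logic amounts to something like this sketch:

```python
# Illustrative only: a toy stand-in for the "move finished work home" policy.
# Mount points, directory layout and the completion check are all hypothetical.
import shutil
from pathlib import Path

CLOUD_HUB = Path("/mnt/pixstor-cloud-hub")   # hypothetical cloud hub mount
HOME_TIER = Path("/mnt/pixstor-uk")          # hypothetical UK data center mount

def is_complete(shot_dir: Path) -> bool:
    # Hypothetical convention: the pipeline drops a sentinel file when a shot is final.
    return (shot_dir / ".complete").exists()

for shot in CLOUD_HUB.iterdir():
    if shot.is_dir() and is_complete(shot):
        # Migrate the finished shot back to the home data center.
        shutil.move(str(shot), str(HOME_TIER / shot.name))
```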

Data never leaves the secure Azure hub, with pixels streamed to artists’ monitors via an encrypted streaming session over Teradici PCoIP technology. Data cannot be downloaded, shared or accessed externally, remaining fully compliant with TPN protocols and the stringent security measures upheld in the physical studios.

To further strengthen the global operation, Jellyfish Pictures’ review tool, which extends to the public cloud, allows clients to review content seamlessly in 4K. No matter where they are based in the world, both client and artist can share the same screen, updating and annotating in real time.

According to Jellyfish CEO Phil Dobree, “From the very beginning, when I first started looking at cloud and virtual technologies with Jellyfish CTO Jeremy Smith, it was always my vision to be able to go to where the artists are. We introduced cloud rendering and virtual desktops so we could break out of our four walls. Now in 2020, with events no one could have foreseen, we have over 280 artists working from home with no loss in productivity. Moving our staff to this environment was relatively simple; connecting to the data center from home is the same as if they were connecting from the studio.

“It was always our intention to roll out this way of working on a global scale. We have merely accelerated our plan due to current circumstances.”

Main Image: Art director Katri Valkamo working out of her home in Finland. 

Alkemy X: VFX supervisors share work from home process

By Bilali Mack and Erin Nash

On the heels of joining Alkemy X’s VFX team, what we expected of our first few weeks was quickly interrupted by a global crisis. After getting to know the company and settling in, we were tasked with responding to the COVID-19 pandemic and transitioning the staff to remote work as quickly and efficiently as possible. As a headcount, that would be 42 artists, three supervisors, three pipeline engineers, three in editorial and the I/O department, and eight production management personnel.

Erin Nash’s WFH setup

We were fortunate that Alkemy X already had systems and processes in place and ready for these virtual workflows. It was just a matter of making the decision to get ahead of state mandates and make the shift early to set ourselves up for success. Our pivot to a remote workflow was structured and executed the week prior to March 16. We began to build our plan starting Tuesday, March 10, and by that Friday, the engineering and pipeline team had built on its pre-existing security-compliant processes to roll out to the entire staff of artists and production.

The company uses HP RGS (Remote Graphics Software) to connect artists to a low-latency screen-sharing session on their work computers. Since the remote artists are working off the computer they normally use at work, they still have access to all of the software, licenses and tools they have when at the office. Agile and innovative responses have made our jobs easier, despite these circumstances.

Alkemy X built an OpenVPN server to allow secure, encrypted, multi-factor-authenticated remote access to our internal network. By working remotely, we are able to maintain security and keep assets contained within our secure network. Artists have access to their files via high-speed file servers, with no need for time-consuming file transfers.

Bilali Mack working from home

Alkemy X uses Shotgun to manage our shows and workflow, but we are leaning on it more heavily now as a first-line review tool before heading to high-resolution reviews through HP RGS. Our traditional dailies have been replaced by rolling spot checks in Shotgun followed by more exhaustive reviews of full-resolution media.
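
Shotgun exposes a Python API (shotgun_api3) that makes this kind of first-line triage scriptable. Below is a minimal sketch of pulling the Versions still awaiting review; it is not Alkemy X’s actual tooling, and the site URL, credentials, project id and status code are placeholders whose values vary per studio schema:

```python
# A sketch of a first-line review query using the shotgun_api3 Python client.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://example.shotgunstudio.com",  # placeholder site URL
    script_name="review_bot",             # placeholder script user
    api_key="SECRET",                     # placeholder API key
)

# Pull Versions still awaiting review on a show, newest first.
pending = sg.find(
    "Version",
    filters=[
        ["project", "is", {"type": "Project", "id": 123}],  # placeholder id
        ["sg_status_list", "is", "rev"],  # default "pending review" code
    ],
    fields=["code", "user", "created_at"],
    order=[{"field_name": "created_at", "direction": "desc"}],
)

for version in pending:
    print(version["code"], version["created_at"])
```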

We use Google Meet for meetings, screen sharing, video chat and telephone calls. We use Slack extensively on non-networked computers for team communication, keeping everyone connected and up to date and to quickly get assistance with any technical problems.

Priority is still placed on building and maintaining the company’s culture, alongside the quality of the creative work, but we’re now doing so from behind a dining room table or a bedroom desk, just steps from our kitchens.

Erin Nash

As we move from our former posts, here’s how we are individually navigating working from home:

Erin Nash: Although managing a team remotely is a new experience for me, I can’t say I have found it very difficult to transition. While the team as a whole is new to me, I have known many of the artists for years. Being able to guide their creative process and help them solve difficult technical problems from afar isn’t as different as I would have expected. Now instead of saying “Can I drive your box?” it has become “Let’s do a screen share.”

People by and large do all the same things from home that they would do in the office, with the main difference being that now nobody can tell if I’ve gone for a workout over lunch.

Bilali Mack: Starting out at any company takes time to get up to speed. Add something like a global pandemic, and you would think it would be nearly impossible not only to get up to speed, but also to manage teams, collaborate on creative and retain our company’s culture. We adapted by preparing artist and production remote on-boarding documents and deploying necessary hardware and software to any and all artists on our team.

On a cultural note, we’re still holding company happy hours and open Google Meet “office” hours, just because it’s nice to be able to jump on and chat with each other about how things are different now.

Bilali Mack

Main Image: Bilali Mack WFH.


VFX supervisor Bilali Mack comes to Alkemy X from MPC, where he supervised and executed VFX for brands including Adidas, Google and BMW. Erin Nash joined the team from FuseFX, where he was head of 2D and a VFX supervisor, and brings experience across television, film and commercial work.

Invisible VFX on Hulu’s Big Time Adolescence

By Randi Altman

Hulu’s original film Big Time Adolescence is a coming-of-age story that follows 16-year-old Mo, who is befriended by his sister’s older and sketchy ex-boyfriend, Zeke. This aimless college dropout happily introduces the innocent-but-curious Mo to drink, drugs and a poorly thought-out tattoo.

Big Time Adolescence stars Pete Davidson (Zeke), Griffin Gluck (Mo) and Machine Gun Kelly (Nick) and features Jon Cryer as Mo’s dad. This irony will not be lost on those who know Cryer from his own role as disenfranchised teen Duckie in Pretty in Pink.

Shaina Holmes

While this film doesn’t scream visual effects movie, they are there — 29 shots — and they are invisible, created by Syracuse, New York-based post house Flying Turtle. We recently reached out to Flying Turtle’s Shaina Holmes to find out about her work on the film and her process.

Holmes served as VFX supervisor, VFX producer and lead VFX artist on Big Time Adolescence, creating things like flying baseballs, adding smoke to a hotboxed car, removals, replacements and more. In addition to owning Flying Turtle Post, she is a teacher at Syracuse University, where she mentors students who often end up working at her post house.

She has over 200 film and television credits, including The Notebook, Tropic Thunder, Eternal Sunshine of the Spotless Mind, Men in Black 3, Swiss Army Man and True Detective.

Let’s find out more…

How early did you get involved on Big Time Adolescence?
This was our fifth project in a year with production company American High. With all projects overlapping in various stages of production, we were in constant contact with the client to help answer any questions that arose in early stages of pre-production and production.

Once the edit was picture-locked, we bid all the VFX shots in October/November 2018, VFX turnovers were received in November, and we had a few short weeks to complete all VFX in time for the premiere at the Sundance Film Festival in January 2019.

What direction were you given from your client?
Because this was our fifth feature with American High and each project has similar basic needs, we already had plans in place for how to shoot certain elements.

For example, most of the American High projects deal with high school, so cell phones and computer screens are a large part of how the characters communicate. Production has been really proactive about hiring an on-set graphics artist to design and create phone and computer screen graphics that can be used either during the shoot or provided to my team to add in VFX.

Having these graphics prebuilt has saved a lot of design time in post. While we still need to occasionally change times and dates, remove the carrier, change photos, replace text and other editorial changes, we end up only needing to do a handful of shots instead of all the screen replacements. We really encourage communication during the entire process to come up with alternatives and solutions that can be shot practically, and that usually makes our jobs more efficient later on.

Were you on set?
I was not physically needed on set for this film. However, after filming completed, we realized in post that we were missing some footage from the batting cages scene. The post supervisor and I, along with my VFX coordinator, rented a camera and braved the freezing Syracuse, New York, winter to go back to the same batting cages and shoot the missing elements. These plates became essential, as production had turned off the pitching machine during the filming.

Before and After: Digital baseballs

To recreate the baseball in CG, we needed more information for modeling, texture and animation within this space to create more realistic interaction with the characters and environment in VFX. After shoveling snow and ice, we were able to set the camera up at the batting cage and create the reference footage we needed to match our CG baseball animation. Luckily, since the film shot so close to where we all live and work, this was not a problem… besides our frozen fingers!

What other effects did you provide?
We aren’t reinventing the wheel in the work we do. We work on features where invisible VFX play a supporting role, helping create a seamless experience for the audience, free of distracting technical imperfections, and revising graphics so the story can unfold properly. I work with the production team to advise on ways to shoot that save on costs in post production, and I use creative problem solving to cut down VFX costs to satisfy their budget and achieve their intended vision.

That being said, we were able to do some fun sequences including CG baseballs, hotboxing a car, screen replacements, graphic animation and alterations, fluid morphs and artifact cleanup, intricate wipe transitions, split screens and removals (tattoos, equipment, out-of-season nature elements).

Can you talk about some of those more challenging scenes/effects?
Besides the CG baseball, the most difficult shots are the fluid morphs. These usually consist of split screens where one side of the split has a speed change effect to editorially cut out dialogue or revise action/reactions.

They seem simple, but to seamlessly morph two completely different actions together over a few frames and create all the in-betweens takes a lot of skill. These are often more advanced than our entry-level artists can handle, so they usually end up on my plate.

What was the review and approval process like?
All the work starts with me receiving plates from the clients and ends with me delivering final versions to the clients. As I am the compositing supervisor, we go through many internal reviews and versions before I approve shots to send to the client for feedback, which is a role I’ve done for the bulk of my career.

For most of the American High projects, the clients are spread out between Syracuse, LA and NYC. No reviews were done in person, although I could go to Syracuse Studios at any time to review dailies if there was any footage I thought could help with fix-it-in-post VFX requests.

All shots were sent online for review and final delivery. We worked closely with the executive producer, post supervisor, editor and assistant editor for feedback, notes, design and revisions. Most review sessions were collaborative as far as feedback and what’s possible.

What tools did you use on the film?
Blackmagic’s Fusion is the main compositing software. I trained the artists on Fusion when they were in college, so it’s an easy and affordable transition for them to do professional-quality work. Since everyone has their own personal computer setup at home, it’s been fairly easy for artists to send comp files back to me; I render on my end after relinking. That has made internal feedback and deliveries much quicker, as we’re working at UHD and 4K resolutions.
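
To make that relinking step concrete, here is an illustrative sketch (not Holmes’ actual pipeline), assuming the comp is Fusion’s usual plain-text .comp format and that the artist’s media root simply needs swapping for the supervisor’s local root; both roots and the filename are hypothetical:

```python
# Illustrative relink pass over a plain-text Fusion .comp file.
import re
from pathlib import Path

ARTIST_ROOT = "C:/Users/artist/media"     # hypothetical artist-side root
LOCAL_ROOT = "/Volumes/projects/media"    # hypothetical supervisor-side root

def relink_comp(comp_file: str) -> Path:
    text = Path(comp_file).read_text()
    # Rewrite every Filename = "..." entry that points at the artist's root.
    relinked = re.sub(
        r'(Filename = ")' + re.escape(ARTIST_ROOT),
        lambda m: m.group(1) + LOCAL_ROOT,
        text,
    )
    out = Path(comp_file).with_suffix(".relinked.comp")
    out.write_text(relinked)
    return out

relink_comp("shot_010_comp_v012.comp")  # hypothetical comp file
```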

For Big Time Adolescence specifically, we also needed to use Adobe After Effects for some of the fluid morph shots, plus some final clean-up in Fusion. For the CG baseball shots, we used Autodesk Maya and Substance Painter, rendered with Arnold and comped in Fusion.

You are female-owned and you are in Syracuse, New York. Not something you hear about every day.
Yes, we are definitely set up in a great up-and-coming area here in Ithaca and Syracuse. I went to film school at Ithaca College. From there, I worked in LA and NYC for 20 years as a VFX artist and producer. In 2016, I was offered the opportunity to teach VFX back at Ithaca College, so I came back to the Central New York area to see if teaching was the next chapter for me.

Timing worked out perfectly: some of my former co-workers were helping create American High, taking advantage of the Central New York tax incentives, and they were prepping to shoot feature films in Syracuse. They brought me on as the local VFX support, since we had already been working together off and on since 2010 in NYC. When I found myself both teaching and working on feature films, that gave me the idea to create a company to combine forces.

Teaching at Syracuse University and focusing on VFX and post for live-action film and TV, I am based at The Newhouse School, which is very closely connected with American High and Syracuse Studios. I was already integrated into their productions, so this was just a really good fit all around to bring our students into the growing Central New York film industry, aiming to create a sustainable local talent pool.

Our team is made up of artists who started with me in post mentorship groups I created at both Ithaca College (Park Post) and Syracuse University (SU Post). I teach them in class, they join these post group collaborative learning spaces for peer-to-peer mentorship, and then a select few continue to grow at Flying Turtle Post.

What haven’t I asked that’s important?
When most people hear visual effects, they think of huge blockbusters, but that was never my thing. I love working on invisible VFX and the fact that it blows people’s minds — how so much attention is paid to every single shot, let alone frame, to achieve complete immersion for the audience, so they’re not picking out the boom mic or dead pixels. So much work goes into creating this perfect illusion. It’s odd to say, but there is such satisfaction when no one notices the work you did. That’s the sign of doing your job right!

Every show relies on invisible VFX these days, even the smallest indie film with a tiny budget. These are the projects I really like to be involved in as that’s where creativity and innovation are at their best. It’s my hope that up-and-coming filmmakers who have amazing stories to tell will identify with my company’s mentorship-focused approach and feel they also are able to grow their vision with us. We support female and underrepresented filmmakers in their pursuit to make change in our industry.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Arch platform launches for cloud-based visual effects

Arch Platform Technologies, a provider of cloud-based infrastructure for content creation, has made its secure, scalable, cloud-based visual effects platform available commercially. The Arch platform is designed for movie studios, productions and VFX companies and enables them to leverage a VFX infrastructure in the cloud from anywhere in the world.

An earlier iteration of the Arch platform was available only to companies already working with Hollywood-based Vitality VFX, where the technology was created by Guy Botham. Now, Arch is making the next-generation version of its “rent vs. own” cloud-based VFX platform broadly available to movie studios, productions and VFX companies. This version was well along in its development when COVID-19 arrived, making it a very timely offering.

By moving VFX to the cloud, the platform lets VFX teams scale up and down quickly from anywhere and build and manage capacity with cloud-based workstations, renderfarms, storage and workflow management – all in a secure environment.

“We engineered a robust Infrastructure as a Service (IaaS), which now enables a group of VFX artists to collaborate on the same infrastructure as if they were using an on-premises system,” says Botham. “Networked workstations can be added in minutes nearly anywhere in the world, including at an artist’s home, to create a small to large VFX studio environment running all the industry-standard software and plugins.”

Recently, Solstice Studios, a Hollywood distribution and production studio, used the Arch platform for the VFX work on the studio’s upcoming first movie, Unhinged. The platform has also been used by VFX companies Track VFX and FatBelly VFX and is now commercially available to the industry.

Epic Games offers first look at Unreal Engine 5

Epic Games has offered a first look at Unreal Engine 5 — the next generation of its technology, designed to create photorealistic images on par with movie CG and real life. Designed for development teams of all sizes, it offers productive tools and content libraries.

Unreal Engine 5 will be available in preview in early 2021, and in full release late in 2021, supporting next-generation consoles, current-generation consoles, PC, Mac, iOS and Android.

The reveal was introduced with Lumen in the Land of Nanite, a realtime demo running live on PlayStation 5, to showcase Unreal Engine technologies that can allow creators to reach the highest level of realtime rendering detail in the next generation of games and beyond.

New core technologies in Unreal Engine 5
Nanite virtualized micropolygon geometry will allow artists to create as much geometric detail as the eye can see. Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine — anything from ZBrush sculpts to photogrammetry scans to CAD data. Nanite geometry is streamed and scaled in real time, so there are no more polygon count budgets, polygon memory budgets, or draw count budgets. Users won’t need to bake details to normal maps or manually author LODs, and, according to Epic, there is no loss in quality.

Lumen is a fully dynamic global illumination solution that reacts to scene and light changes. The system renders diffuse interreflection with infinite bounces and indirect specular reflections in detailed environments, at scales ranging from kilometers to millimeters. Artists can create more dynamic scenes using Lumen, for example, changing the sun angle for time of day, turning on a flashlight or blowing a hole in the ceiling; indirect lighting will adapt accordingly. Lumen erases the need to wait for lightmap bakes to finish and to author lightmap UVs — a big time savings: an artist can move a light inside the Unreal Editor, and lighting looks the same as when the game is run on console.

To build large scenes with Nanite geometry technology, Epic’s team made heavy use of the Quixel Megascans library, which provides film-quality objects up to hundreds of millions of polygons. To support vastly larger and more detailed scenes than previous generations, PlayStation 5 provides a dramatic increase in storage bandwidth.

The demo also showcases existing engine systems such as Chaos physics and destruction, Niagara VFX, convolution reverb and ambisonics rendering.

Unreal Engine 4 and 5 Timeline
Unreal Engine 4.25 already supports next-generation console platforms from Sony and Microsoft, and Epic is working closely with console manufacturers and dozens of game developers and publishers using Unreal Engine 4 to build next-gen games. Epic is designing for forward compatibility, so developers can get started with next-gen development now in UE4 and move projects to UE5 when ready.

Epic will release Fortnite, built with UE4, on next-gen consoles at launch and, in keeping with the team’s commitment to prove out industry-leading features through internal production, migrate the game to UE5 in mid-2021.

Waiving Unreal Engine Royalties: First $1 Million in Game Revenue
Game developers can still download and use Unreal Engine for free, but now royalties are waived on the first $1 million in gross revenue per title. The new Unreal Engine license terms are retroactive to January 1, 2020.
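
As a back-of-envelope illustration of the new terms: Epic’s published royalty rate is 5% of gross revenue, now owed only on revenue beyond the first $1 million per title. The figures below are made-up examples:

```python
# Royalty owed under the new Unreal Engine terms (5% published rate,
# first $1M in gross revenue per title waived).
ROYALTY_RATE = 0.05
WAIVED_REVENUE = 1_000_000

def unreal_royalty(gross_revenue: float) -> float:
    return max(0.0, gross_revenue - WAIVED_REVENUE) * ROYALTY_RATE

print(unreal_royalty(750_000))    # 0.0 -- under the waiver, nothing owed
print(unreal_royalty(3_000_000))  # 100000.0 -- 5% of the $2M above the waiver
```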

Epic Online Services
Friends, matchmaking, lobbies, achievements, leaderboards and accounts: Epic built these services for Fortnite, and launched them across seven major platforms — PlayStation, Xbox, Nintendo Switch, PC, Mac, iOS, and Android. Now Epic Online Services are opened up to all developers for free in a simple multiplatform SDK.

Developers can mix and match these services together with their own account services, platform accounts, or Epic Games accounts, which reach the world’s largest social graph with over 350 million players and their 2.2 billion friend connections across half a billion devices.

Review: Boris FX Continuum 2020.5 and Sapphire 2020

By Brady Betzel

The latest Boris FX 2020 plugin releases like Continuum, Sapphire and Mocha, as well as the addition of the Silhouette software (and paint plugin!), have really changed the landscape of effects and compositing.

Over the course of two reviews I will be covering all four of Boris FX’s 2020 offerings — Continuum 2020.5 and Sapphire 2020 now, and Mocha Pro 2020.5 and Silhouette 2020.5 to come soon — for NLE applications like Avid Media Composer, Adobe Premiere and Blackmagic Resolve. Silhouette is a bit different in that it comes as a stand-alone or a compatible plugin for Adobe Premiere or After Effects (just not Avid Symphony/Media Composer at the moment).

Because they are comparable, and editors tend to use both or choose between the two, Continuum 2020.5 and Sapphire 2020 are first. In an upcoming review, I will cover Mocha 2020.5 and Silhouette 2020.5; they have a similar feature set from the outside but work symbiotically on the inside.

While I was writing this review, Boris FX released the 2020.5 updates for everything but Sapphire; that update will come eventually, but the team is still dialing it in. You’ll see that I jump back and forth between 2020 and 2020.5 a little bit. Sorry if it’s confusing, but 2020 has some great updates, and 2020.5 has even more improvements.

All four Boris FX plugins could have a place in your editing tool kit, and I will point out the perks of each as well as how all of them can come together to make the ultimate Voltron-like plugin package for editors, content creators, VFX artists and more.

Boris FX has standardized the naming of each plugin and app with release 2020. Beyond that, Continuum and Sapphire 2020 continue to offer the same high-quality effects you know, continue to integrate Mocha tracking, and have added even more benefits to what I always thought was an endless supply of effects.

You have a few pricing options called Annual Subscription, Permanent, Renewals (upgrades), Units and Enterprise. While I have always been a fan of outright owning the products I use, I do like the yearly upgrades to the Boris FX products and think the Annual Subscription price (if you can afford it) is probably the sweet spot. Continuum alone ranges from $295 per year for Adobe-only to $695 per year for Avid, Adobe, Apple and OFX (Resolve). Sapphire alone ranges from $495 to $895 per year, Mocha Pro ranges from $295 to $595 per year, and Silhouette goes for $995 per year. You can bundle Continuum, Sapphire and Mocha Pro from $795 to $1,195 per year. If the entire suite of plugins is too expensive for your wallet, you can purchase individual categories of plugins called “units,” and you can find more pricing options here.

OK, let’s run through some updates…

Continuum 2020.5
Boris FX Continuum 2020.5 has a few updates that make the 2020 and 2020.5 releases very valuable. At its base level, I consider Continuum to be more of an image restoration, utility and online editor tool kit. In comparison, Sapphire is more of a motion graphics, unicorn poop, particle emitter sparkle-fest. I mean unicorn poop in the most rainbow-colored and magnanimous way possible. I use Continuum and Sapphire every day, and Continuum is the go-to for keying, tracking, roto, film grain and more. Sapphire can really spice up a scene, main title or motion-graphics masterpiece.

My go-to Continuum tools are Gaussian Blur, Primatte Keyer (which has an amazing new secondary spill suppressor update) and Film Grain — all of which use the built-in Mocha planar tracking. There are more new tools to look at in Continuum 2020, including the new BCC Corner Pin Studio, BCC Cast Shadow and BCC Reflection. BCC Corner Pin Studio is a phenomenal addition to the Continuum plugin suite, particularly inside of NLEs such as Media Composer, which don’t have great built-in corner pinning abilities.

As an online editor, I often have to jump out of the NLE I’m using to do title work. After Effects is my tool of choice because I’m familiar with it, but that involves exporting QuickTime files, doing the work and re-exporting either QuickTime files with alpha channels or QuickTime files with the effect baked into the footage. If possible, I like to stay as “un-baked” as possible (feel free to make your own joke about that).

BCC Corner Pin Studio is another step forward in keeping us inside of one application. Using Corner Pin Studio with Mocha planar tracking is surprisingly easy. Inside of Media Composer, place the background on v2 and foreground on v1 of the timeline, apply BCC Corner Pin Studio, step into Effects Mode, identify the foreground and background, use Mocha to track the shot, adjust compositing elements inside of Avid’s Effect window, and you’re done. I’ve over-simplified this process, but it works pretty quickly, and with a render, you will be playing back a rock-solid corner pin track inside of the same NLE you are editing in.

Avid has a few quirks when working with alpha channels, to say the least. When using BCC Corner Pin Studio along with the Avid title tool, you will have to “remove” the background when compositing the text. To do this, you click and drag (DO NOT Alt + Drag) a plugin like BCC Brightness and Contrast on top of the Avid title tool layer, enable “Apply to Title Matte” and set the background to “None.”

It’s a little cumbersome, but once you get the workflow down, it gets mechanical. The only problem with this method is that when you replace the matte key on the Avid title tool layer, you lose the ability to change, alter or reposition the title natively inside of the Avid title effect or title tool itself. Just make sure your title is “final,” whatever final means these days. But corner pinning with this amount of detail inside of Media Composer can save hours of time, which in my mind equals saving money (or making more money with all your newly found free time). You can find a great six-minute tutorial on this by Vin Morreale on Boris FX’s YouTube page.

Two more great new additions to Continuum in release 2020 are BCC Cast Shadow and Reflection. What’s interesting is that all three — Corner Pin Studio, Cast Shadow and Reflection — can be used simultaneously. Well, maybe not all three at once, but Corner Pin Studio with Shadow or Reflection can be used together when putting text into live-action footage.

Life Below Zero, a show I online edit for Nat Geo, uses this technique. Sometimes I composite text in the snow or in the middle of a field with a shadow. I don’t typically do this inside of Media Composer, but after seeing what Corner Pin Studio can do, I might try it. It would save a few exports and round trips.

To ramp up text inserted into live-action footage, I like to add shadows or reflections. The 2020 Continuum update with Cast Shadow and Reflection makes it easy to add these effects inside of my NLE instead of having to link layers with pick whips or having special setups. Throw the effect onto my text (pre-built graphic in After Effects with an alpha channel) and boom: immediate shadow and/or reflection. To sell the effect, just feather off the edge, enable a composite-mode overlay, or knock the opacity down and you are done. Go print your money.

In the Continuum 2020.5 update, one of my most prized online editing tools that has been updated is BCC Remover. I use BCC Remover daily to remove camera people, drones in the sky, stray people in the background of shots and more. In the 2020.5 update, BCC Remover added some great new features that make one of the most important tools even more useful.

From an ease-of-use standpoint, BCC Remover now has Clone Color and Clone Detail sliders. Clone Color can be used to clone only the color from the source, whereas Clone Detail can be used to take the actual image from the source. You can mix back and forth to get the perfect clone. Inside of Media Composer, the Paint Effect has always been a go-to tool for me, mainly for its blurring and cloning abilities. Unfortunately, it is not robust — you can’t brighten or darken a clone; you can only clone color or clone the detail. But you can do both in BCC Remover in Continuum 2020.5.

In addition, you can now apply Mocha Tracking data to the Clone Spot option and specify relative offset or absolute offset under the “Clone” dropdown menu when Clone Spot is selected. Relative offset allows you to set the destination (in the GUI or Effects panel), then set the source (where you want to clone from), and when you move the destination widget, the source widget will be locked at the same distance it was set at. Absolute offset allows both the source and destination to be moved independently and tracked independently inside of Mocha.
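
To make the difference concrete, here’s a toy sketch in coordinates (not Boris FX code): with a relative offset the clone source stays locked at a fixed delta from the destination, while with an absolute offset the two points live, and can be tracked, independently.

```python
# Toy illustration of the two clone offset modes described above.

# Relative offset: the source follows the destination at a constant spacing.
delta = (40, -25)                  # fixed source-to-destination spacing
dest = (300, 200)                  # destination position (e.g. from a track)
src = (dest[0] + delta[0], dest[1] + delta[1])
print("relative:", src)            # (340, 175); moves whenever dest moves

# Absolute offset: source and destination each carry their own position.
dest = (300, 200)                  # can be tracked on its own
src = (520, 140)                   # independent of dest, tracked separately
print("absolute:", src)
```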

There are a lot more Continuum 2020 updates that I couldn’t get into in this space, and even more for the 2020.5 update. More new transitions were added, like the trendy spin blur dissolve, the area brush in Mocha (which I now use all the time to make a quick garbage matte), huge Particle Illusion improvements (including additional shapes) and Title Studio improvements.

In 2020.5, Particle Illusion now has force and turbulence options, and Title Studio has the ability to cast shadows directly inside the plugin. Outside of Title Studio (and back inside of an NLE like Avid), you have direct access to Composite modes and Transformations, letting you easily adjust parameters directly inside of Media Composer instead of jumping back and forth.

Title Studio is really becoming a much more user-friendly plugin. But I like to cover what I actually use in my everyday editing work, and Corner Pin Studio, Cast Shadow/Reflection and Remover are what I use consistently.

And don’t forget there are hundreds of effects and presets including BCC Flicker Fixer, which is an easy fix to iris shifts in footage (I’m looking at you, drone footage)!

Sapphire 2020
I’ve worked in post long enough to remember when Boris FX merged with GenArts and acquired Sapphire. Even before the merger, every offline editor used Sapphire for its unmistakable S_Glow, Film Looks and more.

It’s safe to say that Sapphire is more of an artsy-look plugin. If you are wondering how it compares to Continuum, Sapphire takes over after you are done performing image restoration and technical improvements in Continuum, adding glows, blurs, dissolves, flutter cuts and more. Sapphire is more “video candy” than technical improvement, but it also has technical plugins like Math Ops, Z Depth and more, so each package has its own perks. Ideally, the two work together very well if you can afford both.

What’s new in Sapphire 2020? There are a few big ones that might not be considered sexy, but they are necessary. One is OCIO support and the ability to apply Mocha-based tracking to 10 parameter-driven effects: S_LensFlare, S_EdgeRays, S_Rays, S_Luna, S_Grunge, S_Spotlight, S_Aurora, S_Zap, S_MuzzleFlash and S_FreeLens.

In addition, there are some beauty updates, like the new S_FreeLens. And one of the biggest under-the-hood updates is the faster GPU rendering. A big hurdle with third-party effects apps like Continuum and Sapphire is the render times when using effects like motion blur and edge rays with Mocha tracking. In Sapphire 2020 there is a 3x speed and performance increase (depending on the host app you are using it on). Boris FX has a great benchmark comparison.

So up first I want to cover the new OCIO support inside of Sapphire 2020. OCIO is an acronym for “OpenColorIO,” which was created by Sony Pictures Imageworks. It’s essentially a way to use Sapphire effects, like lens flares, in high-end production workflows. For example, for final deliverables, Netflix asks the colorist to work in an ACES environment, but the footage may be HLG-based. The OCIO options can be configured in the effect editor: just choose the color space of the video/image you are working on and the viewing color space. That’s it.

If you are in an app without OpenColorIO, you can apply the effect S_OCIOTransform. This will allow you to use the OCIO workflow even inside apps that don’t have OCIO built in. If you aren’t worried about color space, this stuff can make your eyes glaze over, but it is very important when delivering a show or feature and definitely something to remember if you can.
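
For anyone curious what that transform looks like outside a plugin UI, here is a minimal sketch using OpenColorIO’s own v2 Python bindings (PyOpenColorIO is the real module); the color-space names are examples, and the names actually available depend on the OCIO config in use:

```python
# A minimal OCIO color-space conversion using the v2 Python bindings.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromEnv()  # reads the config named by $OCIO

# Build a processor from the footage's color space to the viewing space.
processor = config.getProcessor("ACEScg", "sRGB")  # names vary per config
cpu = processor.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]        # scene-linear mid-grey
print(cpu.applyRGB(pixel))        # the same pixel in the viewing space
```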

On top of the tried-and-true Sapphire beauty plugins like S_Glow or S_Luna (to add a moon), Boris FX has added S_FreeLens to its beauty arsenal. Out in the “real world,” free lensing or “lens whacking” is when you take your lens off of your camera, hold it close to where it would normally mount and move the lens around to create dream-like images. It can add a sort of blur-flare dreamy look; it’s actually pretty great when you need it, but you are locked into the look once you do it in-camera. That’s why S_FreeLens is so great; you can now adjust these looks after you shoot instead of baking in a look. There are a lot of parameters to adjust, but if you load a preset, you can get to a great starting point. From defocus to the light leak color, you can animate and dial in the exact look you are going for.

Parameter tracking has been the next logical step in tying Mocha, Continuum and Sapphire together. Finally, in Sapphire 2020, you can use Mocha to track individual parameters. Like in S_LensFlare, you can track the placement of the hotspot and separately track its pivot.

It’s really not too hard once you understand how it correlates inside the Mocha interface. Sapphire sets up two trackers inside of Mocha: 1) the hotspot search area and position of the actual flare, and 2) the pivot search area and position of the pivot point. The search area gathers the tracking data, while the position crosshair is the actual spot on which the parameter will be placed.

While I’m talking about Mocha, in the Sapphire 2020 update, Mocha has added the Area Brush tool. At first, I was skeptical of the Area Brush tool — it seemed a little too easy — but once I gave in, I realized the Area Brush tool is a great way to make a rough garbage matte. Think of a magnetic lasso but with less work. It’s something to check out when you are inside of Mocha.

Summing Up
Continuum and Sapphire continue to be staples of broadcast TV editors for a reason. You can even save presets between NLEs and swap them (for instance, Media Composer to Resolve).

Are the Boris FX plugins perfect? No, but they will get you a lot further faster in your Media Composer projects without having to jump into a bunch of different apps. One thing I would love to see Boris FX add to Continuum and Sapphire is the ability to individually adjust your Mocha shapes and tracks in the Avid Effects editor.

For instance, if I use Mocha inside of BCC Gaussian Blur to track and blur 20 license plates on one shot — I would love to be able to adjust each “shape’s” blur amount, feather, brightness, etc., without having to stack additional plugin instances on top.

But Boris FX has put in a lot of effort over the past few updates of Continuum and Sapphire. Without a doubt, I know Continuum and Sapphire have saved me time, which saves me and my clients money. With the lines between editor, VFX artist and colorist being more and more blurred, Continuum and Sapphire are necessary tools in your arsenal.

Check out the many tutorials Boris FX has put up, and go update Continuum and Sapphire to the latest versions.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

New ZBooks and Envy offerings from HP

A couple of weeks ago, HP introduced the HP ZBook Studio and HP ZBook Create mobile workstations as well as the HP Envy 15. All are the latest additions to the HP Create ecosystem, an initiative introduced during last year’s Adobe Max.

ZBook Studio

These Z by HP solutions are small-form-factor devices for resource-intensive tasks and target professional content creators working in design and modeling. The HP Envy portfolio, including the newest Envy 15, is built for editing video, stills, graphics and web design.

The systems are accelerated by Nvidia Quadro and GeForce RTX GPUs and backed by Nvidia Studio drivers. HP’s ZBook Studio, ZBook Create and Envy 15 laptops with RTX GPUs are members of Nvidia’s RTX Studio program, featuring acceleration for demanding raytraced rendering and AI creative workloads and Studio drivers for reliability.

HP says that the latest additions to the Z by HP portfolio are different from other thin and light mobile workstations and 15-inch notebooks in that they are built specifically for use in the most demanding creative workflows, which call for pro applications, graphics and color accuracy.

The ZBook Studio and ZBook Create, which target visual effects artists, animators and colorists, have all-day battery life. And HP's DreamColor display accurately represents content on screen thanks to a built-in colorimeter for automatic self-calibration and 100% coverage of the sRGB and Adobe RGB color spaces.

The Z Power Slider gives users control over the balance of performance and acoustics for specific workflows. At the same time, the Z Predictive Fan Algorithm intelligently manages fan behavior based on the kind of work and applications used by creatives.

HP Envy 15

The systems feature a vapor cooling chamber, liquid crystal polymer and gaming-class thermals. The custom advanced cooling system pushes air away from the CPU and GPU in two-dimensional paths, unlocking power density that the company says is 2.8 times higher gen-to-gen in a laptop design that is up to 22% smaller.

HP says the highly recyclable and lightweight aluminum exterior provides five times the abrasion resistance of painted carbon fiber and still complies with MIL-STD 810G testing.

The HP Envy offers a minimalist design with a sophisticated aluminum chassis and diamond-cut detailing, and it is the first Envy with a layer of glass on top of the touchpad for a smooth-touch experience. The HP Envy 15 features an all-aluminum chassis with an 82.6% screen-to-body ratio, up to a 4K OLED Vesa DisplayHDR True Black display with touch interface, 10th Gen Intel Core processors, an Nvidia GeForce RTX 2060 with Max-Q and gaming-class thermals for the ultimate in creator performance.

Framestore adds three to London film team

During these difficult times, it's great to hear that Framestore in London has further beefed up its staff. The three new hires join the VFX studio's workforce of approximately 2,500 people, all of whom are working from home at the moment.

Two-time VES Award-winner Graham Page joins the company as VFX supervisor after 14 years with Dneg, where he supervised the company's work on titles such as Avengers: Endgame, Captain Marvel and Avengers: Infinity War. He brings Framestore's tally of VFX supervisors to 24 — all of whom work with clients from preproduction and on-set supervision through to final delivery.

Mark Hodgkins, who rejoins the company after a 12-year stint with Dneg, will serve as Framestore’s global head of FX, film, and brings with him technical knowledge and extensive experience working on properties from Marvel, DC and J.K. Rowling.

Anna Ford joins Framestore as head of business development, film. Formerly sales and bidding manager at Cinesite, Ford brings knowledge of the global production industry and a passion for emerging technologies that will help identify and secure exciting projects that will push and challenge Framestore’s team of creative thinkers.

“While working in different areas of the company’s business, Graham, Anna and Mark all share the kind of outlook and attitude we’re always looking for at Framestore. They’re forward-thinking, creative in their approaches and never shy away from the kind of challenges that will bring out the best in themselves and those they work with,” says Fiona Walkinshaw, Framestore’s global managing director, film.

How VFX house Phosphene has been working remotely

By Randi Altman

In our ongoing coverage of how studios are working remotely, we reached out to New York City-based visual effects house Phosphene. Founded in 2010 by Vivian Connolly and John Bair, Phosphene specializes in photorealistic VFX for film and television and is particularly known for its detailed CG environments and set extensions.

This four-time Emmy-nominated (Mildred Pierce; Boardwalk Empire Seasons 3 and 5; Escape at Dannemora) studio's more recent work includes The Plot Against America, Hunters, A Beautiful Day in the Neighborhood and Motherless Brooklyn.

The Plot Against America

Like many others, Phosphene was tasked with developing secure remote workflows, so we reached out to director of IT Jimmy Marrero and head of operations and strategy Beck Dunn to find out more.

How is Phosphene weathering this storm? Do you have most of your folks working remotely?
Beck Dunn: We were fortunate to be able to switch to remote work very quickly and are extremely grateful to our team, who had been preparing for this major change. We are also glad to be in a position to support staff and productions that are able to continue working remotely.

Can you talk about what it took to get artists set up in their homes and walk us through that workflow?
Jimmy Marrero: Luckily, we've had experience using PCOIP technology in the past and were in a good place to transition smoothly to remote work. We had a good number of workstations already set up with PCOIP remote workstation cards. We also leveraged AWS to create cloud workstations that are connected to our office via a VPC (virtual private cloud). This gives us the capability to securely increase our capacity for work well beyond any physical hardware limitations.

What tools are you using to make sure these folks stay connected?
Marrero: We all communicate with each other via chat using an open-source tool called Rocket.Chat. Producers connect via BlueJeans video conference.

For anyone setting up a remote pipeline, I would also recommend taking advantage of cloud-based software like Slack for communication, Trello for organization, and AnyDesk to allow IT to help troubleshoot any issues that might occur during the setup process.

What about security and working remotely?
Marrero: Security was the driving force for us to investigate the advantages of PCOIP technology. Having remote workstation cards installed at the office allows us to stream encrypted screen information directly to the artists' monitors and eliminates the need for any data to be hosted outside of Phosphene's internal network.

By combining PCOIP with network access restricted to a VPN with two-factor authentication, we were able to address many of our clients' security concerns, which was a key factor in our being able to work remotely.

PCOIP technology also allows us to easily use all the tools on our internal network, with no change in setup or compromise to security. Once logged in, artists are able to access Nuke, Hiero, 3ds Max, Houdini and Deadline as though they were in the office.

What types of work are you guys doing at the moment?
Dunn: We can't talk about any of our current work, but one project we recently finished is HBO's The Plot Against America, created by Ed Burns and David Simon. The show is based on Philip Roth's 2004 novel depicting the lives of US citizens in an alternate history in which Franklin D. Roosevelt loses the 1940 presidential election to Charles Lindbergh.

Phosphene worked with show-side VFX supervisor Jim Rider on a wide range of visual effects for the show, including creating period-accurate aerial views of 1940s Manhattan, exteriors of Newark Airport and a British Navy base, and extensive crowd duplication shots inside Madison Square Garden. In total, Phosphene delivered 274 shots for the limited series.

The Plot Against America

Any tips for those companies who are just starting to get set up remotely or even those who are currently working remotely?
Marrero: Be nice to your IT department. (Smiles) Working remotely has many moving parts that all need to work perfectly for things to go smoothly. Expect delays in the beginning as all the kinks are worked out.

What has helped staffers get settled into working from home?
Dunn: I’ll let them speak for themselves.

VFX producer Matthew Griffin: I found it really helpful to set up a dedicated mini-office rather than just working on a laptop from the couch. When I sit down at my workspace, I feel like I am still “going into” the office. Holding team meetings via video chat and maintaining rituals like having my morning coffee at the same time also helps me to stay in a familiar rhythm. We also have a dog, so walking him at the end of the day makes the workday feel complete. I close the laptop, walk the dog, and once I’m home, it’s like my commute is over and it’s time to relax.

VFX producer Steven Weigle: Producers are used to working remotely for short stints, so this hasn’t been an entirely foreign experience. I did recently add a KVM switch to my home setup, to use my full-sized keyboard, mouse and monitor to control my work laptop but be able to switch back to my personal machine with the click of a button. It’s a small, basic upgrade but it helps me maximize my desk space while still separating my “work brain” from my “home brain.”


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Embassy opens in Culver City with EP Kenny Solomon leading charge

Vancouver-based visual effects and production studio The Embassy is opening an LA office in Culver City. EP Kenny Solomon will head up the operation. The move follows the studio's growth in film, advertising and streaming, and a successful 2019. The LA office will give The Embassy a direct connection and point of contact with its growing US client base and provide front-end project support and creative development, while Vancouver — offering pipeline and technology infrastructure — remains the heart of operations.

New studio head Solomon has worked in the film, TV and streaming industries for the past 20 years, launching and operating a number of companies, the most recent of which was Big Block Media Holdings — an Emmy-, Cannes-, Webby-, Promax- and Clio-winning integrated media company he founded nine years ago.

“We have a beautiful studio in Culver City with infrastructure to quickly staff up to 15 artists across 2D, 3D and design, a screening room, conference room, edit bay and wonderful outdoor space for a late-night ping-pong match and a local Golden Road beer or two,” says Solomon. “Obviously, everyone is WFH right now, but at a moment’s notice we are able to scale accordingly. And Vancouver will always be our heartbeat and main production hub.”

“We have happily been here in Vancouver for more than 17 years,” says The Embassy president Winston Helgason. “I’ve seen the global industry go through its ups and downs, and yet we continue to thrive. The last few months have been a difficult period of uncertainty and business interruption and, while we are operating successfully under current WFH restrictions, I can’t wait to open up to our full potential once the world is a little more back to normal.”

In 2020, The Embassy reunited with Area 23/FCB and RSA director Robert Stromberg (Maleficent) to craft a series of fantastical VFX environments for Emgality's new campaign. The team has also been in full production for the past 16 months on all VFX work for Warrior Nun, an upcoming 10-episode series for Netflix. The Embassy was responsible for providing everything from concept art to preproduction, on-set supervision and almost 700 VFX shots for the show. The team in Vancouver is working both remotely and in the studio to deliver the full 10 episodes.

Solomon is excited to get to work, saying that he always respected The Embassy’s work, even while he was competing with them when he was at CafeFX/The Syndicate and Big Block.

As part of the expansion, The Embassy has also added a number of new reps to the team — Sarah Gitersonke joins for Midwest representation, and Kelly Flint and Sarah Lange join for East Coast.

Vegas Post upgrades for VFX, compositing and stills

Vegas Creative Software, in partnership with FXhome, has added new versions of Vegas Effects and Vegas Image to the Vegas Post suite of editing, VFX, compositing and imaging tools for video professionals, editors and VFX artists.

The Vegas Post workflow centers on Vegas Pro for editing and adds Vegas Effects and Vegas Image for VFX, compositing and still-image editing.

Vegas Effects is a full-featured visual effects and compositing tool that provides a variety of high-quality effects, presets and correction tools. With over 800 effects and filters to tweak, combine, pull apart and put back together, Vegas Effects provides users with a powerful library of effects including:
• Particle generators
• Text and titling
• Behavior effects
• 3D model rendering
• A unified 3D space
• Fire and lightning generators
• Greenscreen removal
• Muzzle flash generators
• Picture in picture
• Vertical video integration

Vegas Image is a non-destructive raw image compositor that enables video editors to work with still-image and graphical content and incorporate it directly into their final productions — all directly integrated with Vegas Post. This new version of Vegas Image contains feature updates including:
• Brush masks: A new mask type that allows the user to brush in/out effects or layers and includes basic brush settings like radius, opacity, softness, spacing and smoothing
• Multiple layer transform: Gives the ability to move, rotate and scale a selection of layers
• Multi-point gradient effect: An effect that enables users to create colored gradients using an unlimited amount of colored points
• Light rays effect: An effect that uses bright spots to cast light rays in scenes, e.g., light rays streaming through trees
• Raw denoise: Bespoke denoise step for raw images, which can remove defective pixels and large noise patterns
• Lens distortion effect: Can be used to perform lens-based adjustments, such as barrel/pincushion distortion or chromatic aberration
• Halftone effect: Produces a halftone look, like a newspaper print or pop art
• Configurable mask overlay color: Users can now pick what color is overlaid when the mask overlay render option is enabled

Vegas Post is available now for $999 or as a subscription starting at $21 per month.

Dolores McGinley heads Goldcrest London’s VFX division

London’s Goldcrest Post, a picture and audio post studio, has launched a visual effects division at its Lexington Street location. It will be led by VFX vet Dolores McGinley, whose first task is to assemble a team of artists that will provide services for both new and existing clients.

During the COVID-19 crisis, all Goldcrest staff is working from home except the colorists, who are coming in as needed and working alone in the grading suites. McGinley and her team will move into the Goldcrest facility when lockdown has ended.

“Having been immersed in such a diverse range of projects over the past five years, we identified the need to expand into VFX some time ago,” explains Goldcrest MD Patrick Malone. “We know how essential an integrated VFX service is to our continued success as a leading supplier of creative post solutions to the film and broadcast community.

“As a successful VFX artist in her own right, Dolores is positioned to interpret the client’s brief and offer constructive creative input throughout the production process. She will also draw upon her considerable experience working with colorists to streamline the inclusion of VFX into the grade and guarantee we are able to meet the specific creative requirements of our clients.”

With over two decades of creative experience, McGinley joins Goldcrest having held various senior roles within the London VFX community. Recent examples of her work include The Crown, Giri/Haji and Good Omens.

Working From Home: VFX house The Molecule

By Randi Altman

With the COVID-19 crisis affecting all aspects of our industry, we’ve been talking to companies that have set up remote workflows to meet their clients’ needs. One of those studios is The Molecule, which is based in New York and has a location in LA as well. The Molecule has focused on creating visual effects for episodics and films since its inception in 2005.

Blaine Cone 

The Molecule artists are currently working on series such as Dickinson and Little Voice (AppleTV+), Billions (Showtime), Genius: Aretha (NatGeo), Schooled and For Life (ABC) and The Stranger (Quibi). And on the feature side, there is Stillwater (Focus Features) and Bliss (Amazon). Other notable projects include The Plot Against America (HBO), Fosse/Verdon (FX) and The Sinner (USA).

In order to keep these high-profile projects flowing, head of production Blaine Cone and IT manager Kevin Hopper worked together to create the studio’s work-from-home setup.

Let’s find out more…

In the weeks leading up to the shutdown, what were you doing to prepare?
Blaine Cone: We had already been investigating and testing various remote workflows in an attempt to find a secure solution we could extend to artists who weren’t readily available to join us in house. Once we realized this would be a necessity for everyone in the company, we accelerated our plans. In the weeks before the lockdown, we had increasingly larger groups of artists work from home to gradually stress-test the system.

How difficult was it to get that set up?
Cone: We were fortunate to have a head start on our remote secure platform. Because we decided to tie into AWS, as well as into our own servers and farm (custom software running on a custom-built hypervisor server on Dell machines), it took a little while, but once we saw the need to fast-track it we were able to refine our solution pretty quickly. We’re still optimizing and improving behind the scenes, but the artists have been able to work uninterrupted since the beginning.

Kevin Hopper

What was your process in choosing the right tools to make this work?
Kevin Hopper: We have been dedicated to nailing down TPN-compliant remote work practices for the better part of a year now. We knew that there was a larger market of artists available for us to tap into if we could get a remote work solution configured properly from a security standpoint. We looked through a few companies offering full remote working suites via Teradici PCOIP setups and ultimately decided to configure our own images and administer them to our users ourselves. This route gives us the most flexibility and allows us to accurately and effectively mirror our required security standards.

Did employees bring home their workstations/monitors? How is that working?
Cone: In the majority of cases, employees are using their home workstations and monitors to tap into their dedicated AWS instance. In fact, the home setup can be relatively modest because it is tapping into a very powerful machine in the cloud. In a few cases, we sent home 4K monitors with individuals so they could better evaluate their work.

Can you describe your setup and what tools you are using?
Cone: We are using Teradici to give artists access to dedicated, powerful and secure AWS machines to work off of files on our server. This is set up for Nuke, Maya, Houdini, Mocha, Syntheyes, Krita, Resolve, Mari and Substance Painter. We spin up the AWS instances in the morning and then down again after the workday is over. It allows us to scale as necessary, and it limits the amount of technical troubleshooting and support we might have to do otherwise. Our own internal tools are built into the workflow just as they were when artists were at our office. It's been relatively seamless.
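For anyone scripting a similar morning/evening cycle, the EC2 side can be automated with boto3. This is a minimal sketch under stated assumptions: the region, instance IDs and function names are placeholders, not The Molecule's actual tooling:

    # Minimal boto3 sketch: start artist workstations at the beginning of
    # the workday and stop them at the end, so instances only bill while
    # artists are actually working.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

    # Placeholder instance IDs -- in practice you might look these up by tag.
    ARTIST_INSTANCES = ["i-0abc123def4567890", "i-0fed987cba6543210"]

    def start_workday():
        ec2.start_instances(InstanceIds=ARTIST_INSTANCES)
        # Block until every workstation reports running.
        ec2.get_waiter("instance_running").wait(InstanceIds=ARTIST_INSTANCES)

    def end_workday():
        ec2.stop_instances(InstanceIds=ARTIST_INSTANCES)

Run from a scheduler (cron or similar) at the start and end of the day, and the fleet scales to zero overnight.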

Fosse/Verdon

How are you dealing with the issues of security while artists are working remotely?
Cone: Teradici gives us the security we need to ensure that the data exists only on our servers. It limits the artists from web traffic as well.

How is this allowing you to continue creating visual effects for shows?
Cone: It’s really not dissimilar to how we normally work. The most challenging change has been the lack of in-person interaction. Shotgun, which we use to manage our shots, still serves as our creative hub, but Slack has become an even more integral aspect of our communication workflow as we’ve gone remote. We’ve also set up regular team calls, video chats and more to make up for the lack of interpersonal interaction inherent in a remote scenario.

Can you talk about review and approval on shots?
Cone: Our supervisors are all set up with Teradici to review shots securely, and they have 4K monitors. In some cases, artists are using Region of Interest to review their work. We've continued our regular methods of delivery to our clients so that they can review and approve as necessary.

How many artists do you have working remotely right now?
Cone: Between supervisors, producers, artists and support staff in NY and LA, we have about 50 remote users working on a daily basis. Our Zoom chats are a lot of fun. In a strange way, this has brought us all closer together than ever before.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

Maxon releases first subscription-only Cinema 4D

Maxon is now offering Cinema 4D Subscription Release 22 (S22), the next generation of its 3D app and its first subscription-only release. Cinema 4D S22 offers a number of performance and interactivity improvements, including UV unwrapping and editing tools, improved selection and modeling tool functionality, organizational licensing for volume customers and updated viewport technology with support for Metal on macOS.

In addition, Maxon has boosted Cinema 4D's pipeline compatibility with GLTF export, improved GoZ integration with ZBrush and support for node-based materials in FBX and Cineware. Cinema 4D S22 is immediately available for subscription customers. For perpetual license holders of Cinema 4D, a release is scheduled for later this year that will incorporate the features of S22 along with additional enhancements.

“In September last year, we introduced subscription-based options so we could offer professional 3D software at a significantly lower price. This also allows us to deliver more frequent improvements and enhancements to our subscription customers,” says Maxon CEO Dave McGavran. “S22 offers subscription users early access to solutions like the much-requested UV tools improvements and organizational license management for our volume customers. And yes, we will roll all these features and more into an upgrade later this year for our perpetual customers.”

S22 feature highlights
• New UV workflow enhancements, improved packing and automatic UVs
Improved selection tools, visualization tools and a progressive unwrapping workflow make it much simpler to define a UV map, while new packing algorithms optimize texture resolution. A new automatic UV unwrapping option based on the Ministry of Flat licensed technology developed by Eskil Steenberg of Quel Solaar makes it easy to create a basic unwrap with minimal distortion and overlaps for baking and texture painting.
• Enhanced viewport
Cinema 4D’s new viewport core provides a framework to make the best use of graphics technology in the coming years, with full support for Apple Metal. Users enjoy a more accurate view of the 3D scene, improved filtering and multi-instance performance.
• Pipeline – GLTF export, GoZ integration and more
GLTF export offers users a flexible and efficient format for sharing 3D animations on the web and within AR applications, while GoZ integration offers a smooth workflow with Pixologic ZBrush for advanced sculpting. Support for nodal materials within FBX and Cineware expands the pipeline for advanced materials.
• Modeling tools improvements
In addition to many small usability enhancements, modeling tools are faster and more robust and better preserve mesh attributes like UV and vertex maps, thanks to a new core architecture.
• Organizational licensing options
Volume License customers can leverage organizational accounts within the MyMaxon ecosystem to assign licenses to individual users or groups, coupling the flexibility of floating licenses with the accessibility and reliability of Maxon’s servers.

Cinema 4D S22 can be downloaded immediately and is available for both macOS and Windows.

Autodesk’s Flame 2021 adds Dolby Vision, expands AI and ML offerings

Autodesk has released Flame 2021 with new features aimed at innovating and accelerating creative workflows for VFX, color grading, look development and editorial finishing. Flame 2021 increases workflow flexibility for artists, expands AI capabilities with new machine learning-powered human face segmentation and simplifies finishing for streaming services with new functionality for Dolby Vision HDR authoring and display. In response to user requests, the release also adds a new GPU-accelerated Physical Defocus effect and finishing enhancements that make it easier to adjust looks across many shots, share updates with clients and work faster.

Useful for compositing, color grading and cosmetic beauty work, the AI-based face segmentation tool automates all tracking and identifies and isolates facial features — including nose, eyes, mouth, laugh lines and cheekbones — for further manipulation. Face matching algorithms are also capable of specialized tasks, including specific mole or scar isolation, through custom layout workflows. Built-in machine learning analysis algorithms isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.

To meet the increasing demand for HDR content mastering driven by OTT streaming services, Flame 2021 introduces a new Dolby Vision HDR authoring and display workflow. This enables Flame to import, author, display and export Dolby Vision shot-by-shot animatable metadata, streamlining creation and delivery of the high-dynamic-range imagery required by leading OTT streaming services. The update also expands collaboration with Autodesk Lustre and other Dolby-certified color grading tools by enabling XML metadata import/export.

Other new features in the Flame 2021 family include:
● Save and recall color grading and VFX: Quickly save and recall color grading and VFX work in the new Explorer, a dedicated “grade bin” and reference comparison area to support artist workflows.
● Viewing area: A new video preview mode shares artist viewports — including storyboard, manager and schematic — to SDI or HDMI preview monitors. In broadcast mode, Gmasks can now be observed in the view area during editing along with any other tools that get directly manipulated.
● Gmask premade shapes: New Gmask premade shapes with softness are available for colorists, compositors and finishing VFX artists in the image and action nodes.

Flame, Flare and Flame Assist 2021 are available at no additional cost to Flame Family 2020 subscribers.

VFX turn Minnesota into Alabama for indie film Tuscaloosa

By Randi Altman

Director Philip Harder’s Tuscaloosa is a 1970s coming-of-age story that follows recent college graduate Billy Mitchell as he falls in love with a psychiatric patient from his dad’s mental hospital. As you can imagine, the elder Mitchell is not okay with the relationship or the interest his son is taking in the racial tensions that are heating up in Alabama.

As a period piece, Tuscaloosa required a good amount of visual effects work, and Minneapolis-based Splice served as the picture's main post and VFX house. Splice called on newly launched local boutique Nocturnal Robot for overspill work and to help turn current-day Minnesota, where the film was shot, into 1970s Tuscaloosa, Alabama.

Jeremy Wanek

Nocturnal Robot owner, editor and VFX artist Jeremy Wanek and artist Conrad Flemming provided a variety of effects, from removing foliage to adding store signs and period cars to rebuilding a Tuscaloosa street. Let's find out more.

How early did you get involved?
Nocturnal Robot got involved as the edit was nearing picture lock. Splice was bidding on the project’s VFX at the time and it became apparent that they were going to need some support due to the volume of complex shots and constrained budget.

Splice was the main VFX house on the film, and they provided editing as well?
Yes, Splice handled the edit and was the main hub for the VFX work. Clayton Condit edited the movie, along with Kyle Walczak as additional editor. The VFX team was led by Ben Watne. Splice handled around 50 shots, while my team handled around 20, and then Rude Cock Productions (led by the now LA-based Jon Julsrud) jumped in toward the end to finish up some miscellaneous shots, and finally, The Harbor Picture Company tackled some last-minute effects during finishing — so lots of support on the VFX front!

What direction were you given from the client?
Philip Harder and I met at Splice and went through the shots that concerned him most. Primarily, these discussions centered on details that would lend themselves well to the transformation of modern-day Minnesota, where the movie was shot, to 1970s Alabama, where the movie takes place.

Were you on set?
We were brought in well after the movie had been shot, which is usually the case on a lot of the indie films we work on.

      
Before and After: Period car addition

Can you talk about prepro? Did you do any and if so in what tool?
No prepro, just the discussion I had with the director before we started working. As far as tools, he loved using his laser pointer to point out details (laughs).

Speaking of tools, what did you use on the show, and can you talk about review and approvals?
Our team was very small for this project, since my company had just officially launched. It was just me as VFX supervisor/VFX artist, along with VFX artist Conrad Flemming. We did our compositing in Adobe After Effects, sometimes using Red Giant tools as well. Digital cleanup was via Adobe Photoshop, and planar tracking was done using Boris FX Mocha Pro. We did 3D work in Maxon Cinema 4D, as well as Video Copilot's Element 3D plugin for Adobe After Effects.

I would stop by Splice, where Kyle Walczak (who was doing some additional editing at the time) would transfer footage to a hard drive for me. Then it was a simple workflow between me and Conrad. I worked on my Mac Pro trash can, while Conrad worked on his PC. I sent him shots via my Google Drive. For review and final delivery, we used Splice's own FTP site.

   
Before and After: Foliage removal

A lot of the review process was sending emails back and forth with Phil. This worked out okay because we were able to get most shots approved quickly. Right after this project we started using Frame.io, and I wish I had been using it on this one — it's much cleaner and more efficient.

Can you talk about what types of VFX you provided, and did they pick Minnesota because the buildings more closely resembled Tuscaloosa of the ‘70s?
Phil picked Minnesota because it’s where he lives, and he realized that present-day Alabama doesn’t look much like it did in the 70s. In Minnesota he could pull the resources he had access to and stretch his budget further. They shot at a lot of great timeless locations and brought in some period cars and lots of wardrobe to really sell it. They did an incredible job during production, so VFX-wise, it was mostly just enhancing here and there.

Can you talk about some of those more challenging scenes?
There were two shots in particular that were challenging, and we handled each case very differently. I had lengthy discussions with Phil about how to handle them. If you watch our VFX reel, they are the first and last shots shown.

In the first shot, we see our two lead characters pull up to a restaurant. Phil wanted to change the environment and add a parking lot of period-appropriate cars. He had some images that were photographed in the ’70s and he wanted to composite them into the live-action plate the crew had shot. It was really interesting trying to blend something that was nearly 50 years old into a high-quality Alexa shot. It took a lot of cleanup work on the old images since they were littered with artifacts, as well as cutting them up to fit into the shot more seamlessly. It also involved adding some CG period cars. It was a fun challenge, and once it was all put together it created a unique look.

       
Before and After: Bama Theater

In the second challenging shot, the live-action plate featured a modern-day Minnesota street with a few period vehicles driving down it. We needed to transform this, as you’d expect, into a 70s Alabama street — this time featuring the Bama Theater. This involved a lot of detailed work. I had Conrad focus most of his attention on this shot because I knew he could pull it off in the limited time we had, and his attention to the period’s details would go a long way. There wasn’t a lot of reference material for us to analyze from the ’70s that was taken on that particular street, so we did our best looking at other images we could find from the time and area.

Phil had a lot of notes and details to help us along. We had live-action plates shot on the Red camera to build upon — some buildings, period cars, the extras walking around and a handful of other small objects. But because so much had to be reconstructed, the shot had to be put together from scratch.

Based on details we spotted in images from the '70s, we removed lots of the foliage and trees, added the fancy signs above stores and added the stoplights that hung on wires, among other changes. It also involved adding lots of CG cars to the environment to fill out the street and add some movement to the foreground and background.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

GTC: GPUs power The Mandalorian‘s in-camera VFX, realtime workflows

By Mike McCarthy

Each year, Nvidia hosts a series of conferences that focus on new developments in GPU-based computing. Originally, these were about graphics and visualization, which were the most advanced things being done with GPUs. Now they focus on everything from supercomputing and AI to self-driving cars and VR. The first GTC conference I attended was in 2016, when Nvidia announced its Pascal architecture. While that was targeted to supercomputer users, there was still a lot of graphics-based content to explore, especially with VR.

Over time, the focus has shifted from visual applications to AI applications that aren’t necessarily graphics-based; they just have similar parallel computing requirements to graphics processing and are optimal tasks to be accelerated on GPU hardware. This has made GTC more relevant to programmers and similar users, but the hardware developments that enable those capabilities also accelerate the more traditional graphics workflows — and new ways of using that power are constantly being developed.

I was looking forward to going to March's GTC to hear the details of what was expected to be an announcement of Nvidia's next generation of hardware architecture and to see all of the other presentations about how others have been using current GPU technology. Then came the coronavirus, and the world changed. Nvidia canceled the online keynote; a few SDK updates were released, but all major product announcements have been deferred for the time being. What Nvidia did offer was a selection of talks and seminars that were remotely recorded and hosted as videos to watch. These are available to anyone who registers for the free online version of GTC, instead of the hundreds of dollars it would cost to attend in person.

One that really stood out to me was “Creating In-Camera VFX with Realtime Workflows.” It highlighted the Unreal Engine and what that technology allowed on The Mandalorian — it was amazing. The basic premise is to replace greenscreen composites with VFX projections behind the elements being photographed. This was done years ago for exteriors of in-car scenes using flat prerecorded footage, but technology has progressed dramatically since then. The main advances are in motion capture, 3D rendering and LED walls.

From the physical standpoint, LED video walls have greater brightness, allowing them not only to match the lit foreground subjects, but to light those subjects for accurate shadows and reflections without post compositing. And if that background imagery can be generated in real time — instead of recordings or renders — it can respond to the movement of the camera as well. That is where Unreal comes in — as a 3D game rendering engine that is repurposed to generate images corrected for the camera’s perspective in order to project on the background. This allows live-action actors to be recorded in complex CGI environments as if they were real locations. Actors can see the CGI elements they are interacting with, and the crew can see it all working together in real time without having to imagine how it’s going to look after VFX. We looked at using this technology on the last film I worked on, but it wasn’t quite there yet at the scale we needed; we used greenscreens instead, but it looks like this use of the technology has arrived. And Nvidia should be happy, because it takes a lot more GPU power to render the whole environment in real time than it does to render just what the camera sees after filming. But the power is clearly available, and even more is coming.
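The "images corrected for the camera's perspective" piece comes down to an off-axis (asymmetric) projection: the render frustum is rebuilt every frame from the tracked camera position relative to the wall, rather than from a fixed, symmetric field of view. Below is a minimal numpy sketch of the standard generalized perspective projection math that underlies this kind of setup. It illustrates the textbook technique, not Epic's actual code, and the wall corners and camera position are made-up values:

    # Off-axis frustum: given an LED wall's corners and the tracked camera
    # ("eye") position, compute asymmetric near-plane extents so the wall's
    # imagery stays perspective-correct from the camera's point of view.
    import numpy as np

    def offaxis_frustum(pa, pb, pc, eye, near=0.1):
        # pa, pb, pc: wall lower-left, lower-right, upper-left (world space).
        vr = (pb - pa) / np.linalg.norm(pb - pa)   # wall "right" axis
        vu = (pc - pa) / np.linalg.norm(pc - pa)   # wall "up" axis
        vn = np.cross(vr, vu)                      # wall normal, toward camera

        va, vb, vc = pa - eye, pb - eye, pc - eye  # eye-to-corner vectors
        d = -np.dot(va, vn)                        # eye-to-wall distance

        # Asymmetric extents at the near plane (glFrustum-style l, r, b, t).
        left   = np.dot(vr, va) * near / d
        right  = np.dot(vr, vb) * near / d
        bottom = np.dot(vu, va) * near / d
        top    = np.dot(vu, vc) * near / d
        return left, right, bottom, top

    # Example: 4m x 2.5m wall, camera 3m back and 0.5m right of center.
    pa = np.array([-2.0, 0.0, 0.0])
    pb = np.array([ 2.0, 0.0, 0.0])
    pc = np.array([-2.0, 2.5, 0.0])
    print(offaxis_frustum(pa, pb, pc, eye=np.array([0.5, 1.25, 3.0])))

Feed those extents into a standard projection matrix each frame, and the wall behaves like a window into the CG environment as the camera moves.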

While no new Nvidia technology has been announced, something is always right around the corner. The current Turing generation of GPUs, which has been available for over 18 months, brought dedicated RT cores for realtime raytracing. The coming generation is expected to scale up the number of CUDA cores and the amount of memory by using smaller transistors than Turing's 12nm process. This should offer more processing power for less money, which is always a welcome development.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Chaos offering V-Ray Education Collection via single license

Chaos Group has launched its V-Ray Education Collection, a new offering that provides access to 11 V-Ray and Phoenix FD products through a single license. Students, schools and educators can now use Chaos Group’s renderer and fluid simulation software for $149 a year (an 86% savings compared to purchasing individually).

“We wanted to make it easier for students and educators to access a full suite of software, as they learn and teach visualization for architecture, design and visual effects,” says Chaos Group education director Veselina Zheleva. “With our new collection, users can easily switch between the leading 3D applications, helping them branch out, reduce costs and expand their curriculums.”

The V-Ray Education Collection offers students more flexibility as they begin to master workflows and build their portfolios. As coursework and interests develop, students can apply different versions of V-Ray to their own challenges. With options for architecture (Revit, SketchUp, Rhino, 3ds Max), visual effects (Maya, Houdini), realtime (Unreal) and more, students can home in on the areas they are most focused on.

With one price across products, schools and teachers can choose the right product for the job, and then switch it if plans or industry trends start to shift. As cost is a historical barrier to departmental growth, Chaos says the V-Ray Education Collection has been priced to spur new classes and expanded curriculums, so administrators won’t have to wait for new budgets when ideas hit.

The V-Ray Education Collection includes full versions of 11 products and free upgrades for the length of the license. Free access to Chaos Group's commercial support team is also included, providing on-demand support from setup to settings.

The V-Ray Education Collection Includes: V-Ray for 3ds Max, V-Ray for Maya, V-Ray for SketchUp, V-Ray for Rhino, V-Ray for Revit, V-Ray for Modo, V-Ray for Unreal, V-Ray for Houdini, V-Ray for Cinema 4D, Phoenix FD for 3ds Max and Phoenix FD for Maya.

Main Image: Courtesy of ZEILT Productions 

Autodesk’s 3ds Max 2021 now available

Autodesk has introduced 3ds Max 2021, a new version of its 3D modeling, animation and rendering software. This latest release offers new tools designed to give 3D artists the ability to work across design visualization and media and entertainment with a fully scriptable baking experience, simpler install process, viewport and rendering improvements, and integrated Python 3 support, among other enhancements.

Highlights include:
• New texture baking experience supports physically based rendering (PBR), overrides and OSL workflows and provides an intuitive new tool set.
• Updated installer allows users to get up and running quickly and easily.
• Integrated support for Python 3 and an improved pymxs API that ensure developers and technical artists can better customize pipelines.
• Native integration with the Arnold Renderer V6.0 offers a high-end rendering experience out of the box, while included scripts efficiently convert V-Ray and Corona files to the Physical Material for added flexibility.
• Performance enhancements simplify the use of PBR workflows across renderers, including with realtime game engines; provide direct access to high-fidelity viewports; improve the OSL user experience; significantly accelerate file I/O; and enhance control over modeling with a new weighted normals modifier.
• Tool set advancements to SketchUp import, Substance, ProSound and FBX streamline the creation and movement of high-quality 3D assets.
• New plugin interop and improvements – from support for AMG and OSL shaders to scene converter extensions – allow for a broader range of plugins to easily hook into 3ds Max while also simplifying plugin development and installation.

Early 3ds Max 2021 adopter Eloi Andaluz Fullà, a freelance VFX artist on the beta, reported, "The revamped viewport, IBL controls and persistent ambient occlusion speed up the client review process because I can easily share high-quality viewport images without having to wait for renders. The new bake-to-texture update is also a huge time-saver because we can easily modify multiple parameters at once, while other updates simplify day-to-day tasks."

3ds Max 2021 is now available as a stand-alone subscription or with the Autodesk Media & Entertainment Collection.

Workstations Roundtable

By Randi Altman

In our Workstations Special Edition, we spoke to pros working in offline editing, visual effects and finishing about what they need technically in order to keep creating. Here in our Workstations Roundtable, we reached out to both users and those who make computers and related tools, all of whom talk about what they need from their workstations in order to get the job done.

The Foundation’s Director of Engineering, John Stevens 

John Stevens

Located just across the street from the Warner Bros. lot, The Foundation provides post production picture services and workflows in HD, 2K, 4K, UHD, HDR10 and Dolby Vision HDR. They work on many episodic shows, including Black-ish, Grown-ish, Curb Your Enthusiasm and American Soul.

Do you typically buy off the shelf or custom? Both?
Both. It depends on the primary application the system will be running. Typically, we buy off-the-shelf systems that have the CPU and memory configurations we are looking for.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
There is no defined time frame. We look at every system manufacturer’s offerings, look at specs and request demo systems for test after we have narrowed it to a few systems.

How important is the GPU to your work?
The GPU is extremely important, as almost every application uses the GPU to allow for faster processing. A lot of applications allow for multiple GPUs, so I look for systems that will support them.

Curb Your Enthusiasm

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
What is the primary application that the system is being purchased for? Does the software vendor have a list of certified configurations? Is the application well-threaded, meaning, can the application make efficient use of multiple cores, or does a higher core clock rate make the application perform faster? How many PCI slots are available? What is the power supply capability? What’s the reputation and experience of the manufacturer?
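That well-threaded question has a handy rule of thumb behind it: Amdahl's law, which caps the speedup from adding cores by whatever fraction of the application stays serial. Here is a quick illustrative sketch (the 90%-parallel figure is a hypothetical workload, not a measurement of any particular application):

    # Amdahl's law: the serial remainder of a workload caps the speedup
    # you can get from adding more cores.
    def amdahl_speedup(parallel_fraction, cores):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    # A hypothetically 90%-parallel app: ~3.1x on 4 cores, 6.4x on 16,
    # and it can never exceed 10x -- which is why a higher clock rate can
    # beat a higher core count for some applications.
    for cores in (4, 16, 64):
        print(cores, round(amdahl_speedup(0.9, cores), 2))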

Do you feel mobile workstations are just as powerful for your work as desktops these days?
No — mobile systems are limited in expandability.

 

Puget Systems’ Solutions Research & Development, Matt Bach

Based in Auburn, Washington, Puget Systems specializes in high-performance, custom-built computers for media and entertainment.

Matt Bach

What is your definition of a workstation? We know there are a few definitions out there in the world.
While many people tend to focus on the hardware to define what a workstation is, to us it is really whether or not the computer is able to effectively allow you to get your work done. In order to do so, it has to be not only fast but reliable. In the past, you had to purchase very expensive “workstation-class” hardware to get the proper balance of performance and stability, but these days it is more about getting the right brands and models of parts to complement your workflow than just throwing money at the problem.

For users looking to buy a computer but are torn between off-the-shelf and building their own, what would you tell them?
The first thing I would clarify is that there are vastly different kinds of “off-the-shelf” computers. There are the systems you get from a big box store, where you have a handful of choices but no real customization options. Then there are systems from companies like us, where each system is tailor-made to match what applications you use and what you do in those applications. The sticker price on these kinds of systems might appear to be a bit higher, but in reality — because it is the right hardware for you — the actual performance you get per dollar tends to be quite a bit better.

Of course, you can build a system yourself, and in fact, many of our customers used to do exactly that. But when you are a professional trying to get your work done, most people don’t want to spend their time keeping up on the latest hardware, figuring out what exact components they should use and troubleshooting any issues that come up. Time spent fiddling with your computer is time that you could spend getting your job done. Working with a company like us that understands what it is you are doing — and how to quickly get you back up and running — can easily offset any cost of building your own system.

What questions would you suggest pros ask before deciding on the right computer for their work?
This could easily be an entire post all its own, and this is the reason why we highly encourage every customer to talk to one of our consultants — if not on the phone, then at least by email. The right configuration depends on a huge number of factors that are never quite the same from one person to the next. It includes what applications you use and what you do in those applications. For example, if you are a video editor, what resolution, fps and codec do you tend to work with? Do you do any multicam work? What about VFX or motion graphics?

Depending on what applications you use, it is often also the case that you will run into times when you have opposing “optimal” hardware. A program like After Effects prefers CPUs with high per-core performance, while Premiere Pro can benefit from a CPU with more cores. That means there is no single “best” option if you use both of those applications, so it comes down to determining which application is more likely to benefit from more performance in your own personal workflow.

This really only scratches the surface, however. There is also the need to make sure the system supports your existing peripherals (Thunderbolt, 10G networking, etc.), the physical size of the system and upgradability. Not to mention the quality of support from the system manufacturer.

How do you decide on what components to include in your systems … GPUs, for example?
We actually have an entire department (Puget Labs) that is dedicated to this exact question. Not only does hardware change very quickly, but software is constantly evolving as well. A few years back, developers were working on making their applications multi-threaded. Now, much of that dev time has switched over to GPU acceleration. And in the very near future, we expect work in AI and machine learning to be a major focus.

Keeping up with these trends — and how each individual application is keeping up with them — takes a lot of work. We do a huge amount of internal testing that we make available to the public to determine exactly how individual applications benefit from things like more CPU cores, more powerful GPUs or faster storage.

Can you talk about warranties and support? What do you offer?
As for support and warranty, our systems come with lifetime tech support and a one- to three-year parts warranty. What makes us most different from big box stores is that we understand your workflow. We do not want your tech support experience to be finger-pointing between Adobe, Microsoft and Puget Systems. Our goal is to get you up and running, regardless of the root cause, and often that means we need to be creative and work with you individually on the best solution to the problem.

 

Goldcrest Post’s Technical Director, Barbary Ahmed

Barbary Ahmed

Goldcrest Post New York, located in the heart of the bustling Meatpacking District, is a full-service post facility offering offline editing as well as picture and sound finishing. Recent credits include The Laundromat, Godfather of Harlem, Russian Doll, High Flying Bird, Her Smell, Sorry to Bother You, Billions and Unsane.

Do you typically buy off the shelf or custom? Both?
We do both. But for most cases, we do custom builds because color grading workstations need more power, more GPUs and a lot of I/O options.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
This is a long research process. We depend on our trusted vendors, and it also depends on the pricing and availability of items and how quickly we need them.

How important is the GPU to your work?
For color grading and visual effects using applications such as Autodesk's Maya and Flame, Blackmagic's Resolve and Adobe's Premiere, a high-end workstation provides a smoother and faster workflow. 4K/UHD media and above can tax a computer, so having access to a top-of-the-line machine is key for us.

The importance of GPUs is that the video software mentioned above is now able to offload much of the heavy lifting onto the GPU (or even several GPUs), leaving the CPU free to do its job of delegating tasks, applications, APIs, hardware processes, I/O device requests and so on. The CPU makes sure all the basic tasks run in harmony, while the GPU takes care of crunching the more complex and intensive computation needed by the application. That holds for all but the most basic video work — and certainly for any form of 4K.

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
There are many questions to ask here: Is this system scalable? Can we upgrade it in the future? What real change will it bring to our workflow? What are others in my industry using? Does my team like it? These are the kind of questions we start with for any job.

In terms of what to do with older systems, there are a couple things that we think about: Can we use it as a secondary system? Can we donate it? Can we turn it into an experimental box? Can we recycle it? These are the kind of questions we ask ourselves.

Do you feel mobile workstations are just as powerful for your work as desktops these days? Especially now, with the coronavirus shutdowns?
During these unprecedented times, it seems that mobile workstations are the only way to keep up with our clients’ needs. But we were innovative about it; we established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and main desktop workstations via remote collaboration software.

This allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

 

Dell’s M&E Strategist, Client Solutions, Matt Allard

Matt Allard

Dell Technologies helps users create, manage and deliver media through a complete and scalable IT infrastructure, including workstations, monitors, servers, shared storage, switches and virtualization solutions, paired with support and services.

What is Dell’s definition of a workstation? We know there are a few definitions.
One of the most important definitions is the International Data Corporation’s (IDC) definition that assesses the overall market for workstations. This definition includes several important elements:

1. Workstations should be highly configurable and include workstation-grade components, including:
a. Workstation-grade CPUs (like Intel Xeon processors)
b. Professional and discrete GPUs, like those in the Nvidia Quadro line and AMD Radeon Pro line
c. Support for ECC memory

2. Workstations must be certified with commonly used professional ISV software, like that from Adobe, Autodesk, Avid, Blackmagic and others.

3. IDC requires a brand that is dedicated and known for workstations.

Beyond the IDC’s requirements, we understand that workstation customers are seeking the utmost in performance and reliability to run the software they use every day. We feel that workstation-grade components and Dell Precision’s engineering deliver that environment. Reliability can also include the security and manageability that large enterprises expect, and our designs provide the hooks that allow IT to manage and maintain workstations across a large studio or media enterprise. Consumer PCs rarely include these commercial-grade IT capabilities.

Additionally, software and technology (such as the Dell Precision Optimizer, our Reliable Memory Technology, Dell Client Command Suite) can extend the performance, reliability and manageability on top of the hardware components in the system.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
It’s a common misconception that a computer is just a sum of its parts. It can be better to deal with a vendor that has the supply chain volume and market presence to have advantageous access during times like these, when supply constraints exist on popular CPUs and GPUs. Additionally, most professional ISV software is not qualified or certified on a set of off-the-shelf components, but on specific vendor PC models. If users want absolute confidence that their software will run optimally, using a certified/qualified platform is the best choice. Warranties are also important, but more on that in a bit.

What questions would you suggest pros ask before deciding on the right computer for their work?
The first question is to be clear about the nature of your work as a pro and which media and entertainment software applications you use. Your working resolution has a large bearing on the ideal configuration for the workstation. We try to make deciding easier with Dell’s Precision Workstation Advisor, which provides pros an easy way to find configuration choices based on our certification testing and interaction with our ISV partners.

Do you think we are at a time when mobile workstations are as powerful as desktops?
The reality is that it is not challenging to build a desktop configuration that is more powerful than the most powerful mobile workstation. For instance, Dell Precision fixed workstations support configurations with multiple CPUs and GPUs, and those configurations require beefier power supplies, more slots and thermal designs with more physical space than a reasonably sized mobile allows.

A more appropriate question might be: Can a mobile workstation be an effective tool for M&E professionals who need to be on the road or on a shoot? And the answer to that is a resounding yes.

How do you decide on what components to include in your systems … GPUs, for example?
As mentioned above, workstations tend to be highly configurable, often with multiple options for CPUs, GPUs and other components. We work to stay at the forefront of our suppliers’ roadmap offerings and to provide a variety of options so customers can choose the right price/performance configuration for their needs. This is where clear guidance on certified systems for the ISV software a customer is using makes selecting the right configuration easier.

Can you talk about warranties and support?
An advantage of dealing with a Tier 1 workstation vendor like Dell is that pros can pick the right warranty and support level for their business, from a basic hardware warranty to our ProSupport with aggressive availability and response times. All Dell Precision fixed workstations come with a three-year Dell Limited Hardware warranty, and users can opt for up to five years. Precision mobile workstations come with a one-year warranty (except the 7000 series mobile, which has three years standard), and users can opt for up to five years of warranty with ProSupport.

 

Performance Post’s Owner/President, Fausto Sanchez

Fausto Sanchez

Burbank’s independently owned Performance Post focuses on broadcast television work for clients including Disney, Warner Bros. and NBCUniversal. Credits include TV versions of the Guardians of the Galaxy franchise as well as SD-to-UHD upconversions and framerate conversions for HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

Do you typically buy off the shelf or custom? Both?
We look to the major suppliers like HP, Dell and Apple for off-the-shelf products. We have also purchased custom workstations, and we build our own.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
If we have done our homework well, our workstations can last for three to five years. That timeline is becoming shorter, though, as new technologies bring higher core counts and clock speeds.

In evaluating our needs, first we look at the community for best practices, to see what has been successful for others. I love that we can get that info and those stories here on postPerspective! We also look at what the main suppliers are providing; their offerings are great if you have a lot of extra cash. For many of us, though, the market is always demanding and squeezing everything it can, and we are no different. We have bought both preconfigured systems from the primary manufacturers and custom systems.

HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

How important is the GPU to your work?
In our editorial workflows — Avid Media Composer, Adobe Premiere, Blackmagic Resolve (for editing) — GPU use is not a big deal because these applications currently don’t rely on the GPU much for basic editing. Mostly, you select the one that’s best for your applications. Nvidia has been the mainstay for a long time, but AMD has gained great support, especially in the new Mac Pro workstation.

For color work or encoding, the GPU selection becomes critical. Currently, we are using the Nvidia Titan series GPUs for some of our heaviest processor-intensive workflows.

What are the questions you ask yourself before buying new systems? And what do you do with your older systems?
When buying a new system, obviously the first questions are: What is it for? Can we expand it? How much? What kind of support is there? These questions become key, especially if you decide to build a custom workstation. Our old systems are often repurposed for other work; many can function in other duties for years.

Do you feel mobile workstations are just as powerful for your work as desktops these days?
We have had our eye on mobile workstations for some time. Many are extremely powerful and can find a good place for a specific purpose, but there can be a few problems with this setup: additional monitor capabilities, external GPUs and external mass storage connectivity. For a lot of work, mobile workstations make sense; if I do not have to connect a lot of peripherals and can work mostly self-contained or cloud-based, they can be great. In many cases, though, you quickly learn that the keyboard, screen and battery life are not conducive to a long-term workflow. For the right workflow, these can be great. They’re just not for us right now.

 

AMD’s Director of VFX/Media & Entertainment, James Knight

James Knight

AMD provides Threadripper and Epyc CPUs that accelerate workflows in M&E.

How does AMD describe a workstation?
Some companies have different definitions of what makes a workstation. Essentially, AMD thinks of workstations as a combination of powerful CPUs and GPUs that enable professionals to create, produce, analyze, design, visualize, simulate and investigate without having to compromise on power or workload performance to achieve their desired results. In the specific case of media and entertainment, AMD designs and tests products aligned with the workstation ecosystem to enable professionals to do much more within the same deadlines. We are giving them more time to create.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
Ultimately, professionals need to choose the best solution to meet their creative goals. We work closely with major OEMs to provide them with the best we have to offer for the market. For example, 64-core Threadripper has certainly been recognized by workstation manufacturers. System builders can offer these new CPUs to achieve great results.

What questions should pros ask before purchasing a workstation, in order to make sure they are getting the right workstation for their needs?
I typically ask professionals to focus on their pain points and how they want the new workstation to resolve those issues. More often than not, they tell me they want more time to create and time to try various renderings. With an optimized workstation matched with an optimal balance of powerful CPUs and reliable GPUs, pros can achieve the results they demand over and over.

What trends have you seen happening in this space over the last couple of years?
As memory technology improves and larger models of higher resolution are created, I’ve seen user expectations increase dramatically, as has their desire to work easily with these files. The demand for reliable tools for creating, editing and producing content has been constantly growing. For example, in the case of movie mastering and encoding, AMD’s 32-core and 64-core Threadripper CPUs have exceeded expectations when working with these large files.

PFX’s Partner/VFX Supervisor, Jan Rybar

Jan Rybar

PFX is a Czech company focused on animation, post and visual effects. It works on international projects ranging from short films to commercials, TV series and feature films. The 110-member team works from its studios in Prague.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
We upgrade the workstations themselves maybe every two or three years. We try to select good quality vendors and robust specs so we won’t be forced to replace workstations too often.

Do you also build your own workstations and renderfarms?
Not really — we have a vendor we like and buy all the hardware there. A long time ago, we found out that the reliability of HP and their Z line of workstations is what we need. So 99% of our workstations and blade renderfarms are HP.

How do your needs as a VFX house differ from a traditional post house?
It blends together a lot — it’s more about what the traditional post house specializes in. If it’s focused on animation or film, then the needs are quite similar, meaning they’re based more on CPU power. Lately, as we’ve been involved more and more in realtime engine-based workflows, state-of-the-art GPU technology has become crucial. The Last Whale Singer teaser we did was created with the help of the latest GeForce RTX 2080 Ti hardware, which allowed us to work both efficiently and with the desired quality (raytracing).

Can you walk us through your typical workflow and how your workstations and their components play a part?
The workflow is quite similar to any other production: design/concept, sculpting, modeling, rigging, layout, animation, lighting/effects, rendering, compositing, color grading, etc.

The main question these days is whether the project runs in a classic animation pipeline, on a realtime engine pipeline or a hybrid. Based on this, we change our approach and adapt it to the technology. For example, when Telescope Animation works on a scene in Unreal, it requires different technology compared to a team that’s working in Maya/Houdini.

PNY’s Nvidia Quadro Product Marketing Manager, Carl Flygare

Carl Flygare

Nvidia’s Quadro RTX-powered workstations, featuring Nvidia Turing GPU architecture, allow for realtime raytracing, AI and advanced graphics capabilities for visualization pros. PNY is Nvidia’s Quadro channel partner throughout North America, Latin America, Europe and India.

How does PNY describe a workstation? Some folks have different definitions of what makes a workstation.
The traditional definition of the term comes from CAD — a system optimized for computer-aided design — with a professional CPU (e.g., Xeon, Ryzen), generous DRAM capacity with ECC (Error Correction Code), a significant amount of mass storage, a graphics board capable of running the range of pro applications required by a given workflow, and a power supply and system enclosure sufficient to handle all of the above. Markets and use cases also matter.

Contemporary M&E requires realtime cinematic-quality rendering in application viewports, with an AI denoising assist. Retiming video (e.g., from 30 fps to 120 fps) for a slow-motion effect can be done by AI, with results essentially indistinguishable from a slow-motion session on the set. A data scientist would see things differently: that work calls for GPU Tensor TFLOPS to enable rapid model training that meets inference accuracy requirements, GPU memory capacity to hold extremely large datasets, and a CPU/GPU combination that offers a balanced architectural approach to performance. With so many different markets and needs, practically speaking, a workstation is a system that allows a professional to do their best work in the least amount of time. Have the hardware address that need, and you’ve got a workstation.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
As Henry Ford famously said about the Model T: “Any customer can have a car painted any color that he wants so long as it is black.” That is the off-the-shelf approach to acquiring a workstation. Large Tier 1 OEMs offer extensive product lines and daunting Configure to Order options, but ultimately, all offer similar classes of systems. Off-the-shelf is easy; once you successfully navigate the product line and specifications maze, you order a product, and a box arrives. But building your own system is not for the faint-hearted. Pick up CPU data sheets from Intel or AMD — you can read them for days.

The same applies to GPUs. System memory is easier, but mass storage offers a dizzying array of options. HDD (hard disk drive) or SSD (solid state drive)? RAID (and if so, what kind) or no RAID? How much power supply capacity is required for stable performance? A built-from-scratch workstation can result in a dream system, but with a system of one (or a few), how well will critical applications run on it? What if an essential workflow component doesn’t behave correctly? In many instances this will leave you on your own. Do you want to buy a system to perform the work you went into business to do, or do you want to spend time maintaining a system you need to do your work?

A middle path is available: a vibrant, agile community of system builders with deep market and solutions knowledge. Vendors like Boxx Technologies, Exxact, Rave Computer, Silverdraft Supercomputing and @Xi Computer (among others) come to mind. These companies specialize in workstations (as defined by any of the definitions discussed earlier), have deep vertical knowledge, react quickly to technological advances that provide a performance and productivity edge, and vigorously support what they sell.

What questions would you suggest pros ask before deciding on the right computer for their work?
Where is their current system lacking? How are those deficits affecting creativity and productivity? What use cases does a new system need to perform well? What other parts of my work environment do I need to interact with, and what do they expect me to provide? These top-line questions lead to many others. What is the model or scene size I need to fit into GPU memory to benefit from full GPU acceleration? Will marketing show up in my office or cubicle and ask for a photorealistic render even though a project is early in the design stage? Will a client want to interact with and request changes by using VR? Is a component of singular significance — the GPU — certified and supported by the ISVs that my workflow is built around? Answer these questions first, and you’ll find the remainder of the process goes much more easily. Use case first, last and always!

You guys have a relationship with Nvidia and your system-builder partners use their Nvidia GPUs in their workstations. Can you talk about that?
PNY is Nvidia’s sole authorized channel partner for Nvidia Quadro products throughout North America, Latin America, Europe, the Middle East, Africa and India. Every Quadro board is designed, tested and built by Nvidia, whether it comes from PNY, Dell, HP or Lenovo. The difference is that PNY supports Quadro in any system brand, while Tier 1 OEMs only support a Quadro board’s “slot win” in systems they build. This makes PNY a much better choice for GPU upgrades — a great way to extend the life of existing workstations — or when looking for suppliers that can deliver the technical support required for a wonderful out-of-box experience with a new system. That’s true whether the workstation is custom-built or purchased through a PNY partner that specializes in delivering turnkey workstations built for professionals.

Can you talk about warranties and support? What do you offer?
PNY offers support for Nvidia in any system brand. We have dedicated Nvidia Quadro technical support reps available by phone or email. PNY never asks for a credit card number before offering product or technical support. We also have full access to Nvidia product and technical specialists should escalation be necessary – and direct access to the same Nvidia bug reporting system used by Nvidia employees around the world.

Finally, what trends do you see in the workstation market currently?
First the good: Nvidia Quadro RTX has enabled a workstation renaissance. It’s driving innovation for design, visualization and data science professionals across all major market segments. An entirely new class of product — the data science workstation — has been developed. Quadro RTX in the data centers and virtual GPU technology can bring the benefits of Quadro RTX to many users while protecting essential intellectual property. This trend toward workstation specialization by use case offers buyers more choices that better fit their specific criteria. Workstations — however defined — have never been more relevant or central to creative pros across the globe. Another good trend is the advent of true mobile workstations and notebooks, including thin and light systems, with up to Quadro RTX 5000 class GPUs.

The bad? With choice comes confusion. So many to choose from. Which best meets my needs? Companies with large IT staff can navigate this maze, but what about small and medium businesses? They can find the expertise necessary to make the right choice with PNY’s extensive portfolio of systems builders. For that matter, enterprises can find solutions built from the chassis up to support a given use case. Workstations are better than ever before and purchasing one can be easier than ever as well.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Maxon to live-stream NAB news and artist presentation

With the Las Vegas NAB Show now cancelled, Maxon will be hosting a virtual NAB presence on C4DLive.com featuring a lineup of working artists. Starting on Monday, April 20, and running through Thursday, April 23, these pros — who were originally slated to appear in Las Vegas — will share production tips, techniques and inspiration and talk about working with Maxon’s Cinema 4D, Red Giant and Redshift product lines.

For over a decade, Maxon has supplemented its physical booth presence with live streaming presentations. This has allowed show attendees, and those unable to attend events in person, to benefit from demos, technology updates and interaction with the guest artists in real time. First up will be CEO Dave McGavran, who will talk about Maxon’s latest news and recent merger with Red Giant.

In terms of artists, Penelope Nederlander will break down her latest end credit animation for Birds of Prey; filmmaker Seth Worley will walk through some of the visual effects shots from his latest short film, Darker Colors; Doug Appleton will share the creative processes behind creating the technology for Spider-Man: Far From Home; Jonathan Winbush will demo importing C4D scenes into Unreal Engine for rendering or VR/AR output; and Veronica Falconieri Hays will share how she builds cellular landscapes and molecular structures in order to convey complex scientific stories.

The line-up of artists also includes Mike “Beeple” Winkelmann, Stu Maschwitz, EJ Hassenfratz, Chris Schmidt, Angie Feret, Kelcey Steele, Daniel “Hashi” Hashimoto, Dan Pierse, Andy Needham, Saida Saetgareeva and many more.

Additional presenters’ info and a live streaming schedule will be available at C4DLive.com.

Main Image: (L-R) Saida Saetgareeva and Penelope Nederlander

Workstations and Visual Effects

By Karen Moltenbrey

A couple of decades or so ago, studios needed the exceptional power of a machine offered by the likes of SGI for complex visual effects. Non-specialized PCs simply were not cut out for this type of work. But then a sea change occurred, and suddenly those big blue and purple boxes were being replaced with options in the form of workstations from companies such as Sun, DEC, HP, IBM and others, which offered users tremendous value for something that could conveniently fit on a desktop.

Those hardware companies began to duke it out, leading to the demise of some and the rise of others. But the big winners in this early war for 3D content creators’ business were the users. With a price point that was affordable, these workstations were embraced by facilities big and small, leading to an explosion of 3D content.

Here, we look at two VFX facilities that have taken different approaches when it comes to selecting workstations for their digital artists. NYC’s Bonfire, a boutique studio, uses a range of Boxx and custom-built machines, along with iMacs. Meanwhile, Digital Domain, an Oscar-winning VFX powerhouse, recently set up a new site in Montreal outfitted with Dell workstations.

Dave Dimeola

Bonfire
Bonfire is a relative newcomer to the industry, founded three years ago by Flame artist Brendan O’Neil, who teamed up with Dave Dimeola to get the venture off the ground. Their goal was to create a boutique-style base for those working in post production. The front end would comprise a beautiful, comfortable space where Autodesk Flame and motion graphics artists could work and interact with clients in suites within a townhouse setting, while the back end would consist of a cloud-based pipeline.

“We figured that if we combined these two things — the traditional boutique shop with the client experience and the power of a cloud-based pipeline — we’d have something,” says Dimeola, whose prior experience in leveraging the cloud proved invaluable in this regard.

Soon after, Peter Corbett, who had sold his Click 3X creative digital studio in 2018, agreed to be part of Bonfire’s advisory board. Believing Dimeola and O’Neil were on to something, Corbett came aboard as a partner and brought “some essential talent” into the company as well. Currently, Bonfire has 11 people on staff, with great talent across the gamut of production and post — from CG and creative directing to producing and more. One of the first key people who Corbett brought in was managing director Jason Mayo.

And thanks to the company’s unconventional production pipeline, it is able to expand its services with remote teams as needed. (Currently, Bonfire has more than 3,000 vetted artists in its network, with a core group of around 150 who are constantly on rotation for work.)

“It’s a game-changer in the current climate,” Dimeola says of the company’s setup. The group is performing traditional post work for advertising agencies and direct to client. “We’re doing content, commercials, social content and brand films,” says Dimeola, “anything that requires storytelling and visual communication and is design-centric.” One of the studio’s key offerings is now color grading, handled by colorist Dario Bigi. In terms of visual effects, Dimeola notes that Bonfire can indeed make animals talk or blow things up, although the VFX work Bonfire does “is more seamless, artful, abstract and weird. We get into all those facets of creation.”

Bonfire has approximately 10 workstations at its location in New York and is expanding into the second floor. The company just ordered a new set of customized PCs with Nvidia GeForce RTX 2070 Super graphics cards and some new ultra-powerful Apple iMacs, which will be used for motion graphics work and editing. The core software running on the machines includes the major industry packages: Autodesk’s Maya, Maxon’s Cinema 4D, Side Effects’ Houdini, Autodesk’s Flame, Foundry’s Nuke and Adobe’s Creative Suite, in addition to Thinkbox’s Krakatoa for particle rendering and Autodesk’s 3ds Max for architectural work. Cinema 4D and the Adobe software will run on the new iMacs, while the more render-intensive projects within Maya, Houdini and Nuke will run on the PC network.

As Dimeola points out, workstations have taken some interesting turns in the past two to three years, and Bonfire has embraced the move from CPU-based systems to ones that are GPU-based. As such, the company’s PC workstations — a mix of Boxx and custom-built machines — pack powerful Nvidia RTX-series cards; the custom builds pair an AMD Threadripper 2950X CPU with two Asus GeForce RTX 2080 Ti video cards and significant memory. He attributes Bonfire’s move in processing to the changing needs of CGI rendering and lighting, which increasingly rely on GPU power.

“We still use CPU power, however, because we feel some of the best lighting is still being done in the CPU with software like [Autodesk’s] Arnold,” Dimeola contends. “But we’re flexible enough to be using GPU-based lighting, like Otoy’s OctaneRender and Maxon’s Redshift for jobs that need to look great but also move very quickly through the pipeline. Some shops own one way of rendering, but we really keep a flexible pipeline so we can pivot and render just about any way we want based on the creative, the look, the schedule. It has to be very flexible in order for us to be efficient.”

Media works off the SAN, a communal server that is partitioned into segments: one for CGI, another for editing (Flame) and a third for color (Blackmagic DaVinci Resolve). “We partitioned a cloud section for the server, which allows us to have complete control over how we sync media with an external team,” explains Dimeola. “That’s a big part of how we share, collaborate and move assets quickly with a team inside and outside, and how we scale for projects. This is unique to Bonfire. It’s the innovative part of the post production that I don’t think any other shops are really talking about at this time.”

In addition to the local machines and software, Bonfire also runs applications on virtual machines in the cloud. The key, says Dimeola, is knowing how to create harmony between the internal and external infrastructures. The backbone is built on Amazon Web Services (AWS) and Google Cloud Platform (GCP) and functions much the same way as its internal pipeline does. A proprietary project tracker built by Bonfire enables the teams to manage shots and assets; it also has an array of services and tools that help the staff efficiently manage projects that vary in complexity and scale.

Brooklyn Nets

“It’s not any single piece of our pipeline that’s so innovative; rather, it’s the way that we’ve configured it between our talent and infrastructure,” says Dimeola, noting that in addition to being able to take on big projects, the company is able to get its clients what they need in real time and make complete changes literally overnight. Dimeola recalls a recent project for Google requiring intensive CGI fluid simulations. The team sat with the client one day to work out the direction and was able to post fresh edits, which included rendered shots, for the client the very next morning. “[In a traditional setup], that never would have been possible,” he points out.

However, getting the best deal on cloud services requires additional work. “We play that market like the stock market, where we’re finding the best deals and configurations based on our needs at the time,” Dimeola says, and the result is a dramatic increase in throughput. “You can ramp up a team and be rendering and working 24/7 because you’re using people in different time zones, and you’re never down a machine for rendering and working.”
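Dimeola doesn’t spell out the mechanism behind that market-watching, but on AWS the usual lever is spot pricing, which fluctuates by instance type and availability zone. What follows is a hypothetical minimal sketch of that kind of price check using boto3; the instance types and region are illustrative placeholders, not Bonfire’s actual pipeline.

```python
# Hypothetical sketch: compare EC2 spot prices before launching render nodes.
from datetime import datetime, timedelta, timezone

import boto3

CANDIDATE_TYPES = ["g4dn.xlarge", "c5.4xlarge"]  # placeholder GPU/CPU node types

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

resp = ec2.describe_spot_price_history(
    InstanceTypes=CANDIDATE_TYPES,
    ProductDescriptions=["Linux/UNIX"],
    StartTime=datetime.now(timezone.utc) - timedelta(hours=1),
)

# Keep only the most recent quote per (instance type, availability zone).
latest = {}
for entry in resp["SpotPriceHistory"]:
    key = (entry["InstanceType"], entry["AvailabilityZone"])
    if key not in latest or entry["Timestamp"] > latest[key]["Timestamp"]:
        latest[key] = entry

for (itype, az), entry in sorted(latest.items()):
    print(f"{itype} in {az}: ${float(entry['SpotPrice']):.4f}/hr")
```

A render wrangler could run a check like this before each burst and launch the night’s spot render nodes wherever the price/performance looks best.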

Best of all, the setup goes unnoticed by the customer. “The client doesn’t feel or see anything different,” says Dimeola. That is, with one exception: “a dramatic change in the cost of doing the work, particularly if they are requiring a lot of speed.”

Digital Domain Montreal
A longtime creative and technological force in the visual effects industry, Digital Domain has crafted a range of work spanning feature films, video games, commercials and virtual reality experiences. With global headquarters in LA, plus locations in Vancouver, Beijing, Shanghai and elsewhere around the globe, the studio has been the driving force behind many memorable and cutting-edge projects, including the Oscar-winning The Curious Case of Benjamin Button and more. In fact, Digital Domain is known for its technological prowess within visual effects, particularly in the area of realistic digital humans, recently recreating a photoreal 3D version of Martin Luther King Jr. for a groundbreaking immersive project.

Michael Quan

A year ago, Digital Domain expanded its North American footprint by opening an office in Montreal, which celebrated its grand opening this past February. The office has approximately 100 employees, with plans to expand in the future. Most of the work produced by Digital Domain is shared by its five worldwide studios, and that trend will continue with Digital Domain Montreal, particularly with the LA and Vancouver offices; it also will tackle regional projects, focusing mostly on features and episodic content.

Setting up the Montreal site’s infrastructure fell to Digital Domain’s internal IT department, including senior systems engineer Michael Quan, who helped outfit the facility with the same classes of machines that the Los Angeles and Vancouver locations use: the Dell Precision R7920 and R7910 rack workstation PCs. “All the studios share common configuration specifications,” he notes. “Having a common specification makes it tremendously easy to move resources around when necessary.”

In fact, the majority of the machines were purchased around the third quarter of 2019. Prior to that, the location was started up with resources from the company’s other sites; because all the studios use a common configuration, doing so did not present an issue.

Quan notes that the studio is able to cover all aspects of typical VFX production, such as modeling, rigging, animation, lighting, rotoscoping, texture painting, compositing and so forth, using the machines. And with some additional hardware, the office can also leverage those workstations for dailies review, he adds. As for the software, Digital Domain runs the typical programs: Autodesk’s Maya, Foundry’s Mari and Nuke, Chaos’ V-Ray, Maxon’s Redshift, Adobe’s Photoshop and so on, in addition to proprietary software.

Terminator: Dark Fate

As Quan points out, Digital Domain has specific requirements for its workstations, aside from the general CPU, RAM and hard drive specs. The machines must be able to handle the GPUs required by Digital Domain along with additional support devices. While that might seem obvious, when a requirement comes into play, it reduces the number of systems that are available for evaluation, he points out. Furthermore, the workstations must be rack-mountable and of a “reasonable” size (2U) to fit within the data center as opposed to deskside. Also, since the workstations are deployed in the data center, they must be manageable remotely.
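To make “manageable remotely” concrete: rack workstations like these carry a baseboard management controller (BMC), and most modern BMCs (Dell’s iDRAC among them) speak the DMTF Redfish REST API. Here is a hypothetical sketch of polling a system’s power state that way; the host, credentials and Dell-style system path are placeholders, and none of this is Digital Domain’s actual tooling.

```python
# Hypothetical sketch: query a rack workstation's power state over Redfish,
# the DMTF REST standard exposed by BMCs such as Dell's iDRAC.
import requests

BMC_URL = "https://bmc.example.local"                  # placeholder BMC address
SYSTEM_PATH = "/redfish/v1/Systems/System.Embedded.1"  # Dell-style system ID

resp = requests.get(
    BMC_URL + SYSTEM_PATH,
    auth=("admin", "changeme"),  # placeholder credentials
    verify=False,                # sketch only; verify certificates in production
    timeout=10,
)
resp.raise_for_status()
system = resp.json()
print(system.get("PowerState"), system.get("Status", {}).get("Health"))
```

The same API also covers power cycling, firmware inventory and health alerts, which is what lets a small IT team run racks of artist machines without walking the data center floor.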

“Preferably, it is commodity hardware, meaning using a vendor that is stable, has a relatively large market share and isn’t using some exotic technology,” Quan says, “so if necessary, we could source from secondary markets.” Unfortunately, the company learned this the hard way in the past by using a vendor that implemented custom power and management hardware; the vendor exited the market, leaving the studio without an option for repair and no secondary market to source defective parts.

Just how long Digital Domain retains its workstations depends on their performance effectiveness: If an artist can no longer work efficiently due to resource constraints, Quan says, then a new round of hardware specification is initiated.

“The workstations we use are multi-processor-based, have a relatively high amount of memory and are capable of running the higher-performing professional GPUs that our work requires,” he says. “These days, ‘workstations’ could mean what would normally be called gaming rigs but with more memory, a top-end GPU and a high-clock-speed single processor. It just depends on what software will be used and the hardware configuration that is optimized for that.”

Lost in Space, Season 2

As Quan points out, graphics workstations have evolved to where they have the same capabilities as some low- to mid-class servers. “For example, the Dell R7910/R7920 that we are using definitely could be used as servers, since they share the same hardware capability as their server class,” he says. “It used to be that if you wanted performance, you might have to sacrifice manageability and footprint. Now there are systems deployed with one, eight and 10 GPU configurations in a relatively small footprint, which is a fully remotely manageable system in one of our data centers.” He predicts that workstations are evolving to a point where they will just be a specification. “In the near future, it will just be an abstract for us. Gone will be the days of one workstation equating to one physical system.”

According to Quan, the Montreal studio is still ramping up and has several major projects on the horizon, including feature films from Marvel, Sony, 20th Century Studios and more. Some of Digital Domain’s more recent work includes Avengers: Endgame, Lost in Space (Season 2), Terminator: Dark Fate and several others. Globally, its New Media and Digital Humans groups are doing incredible things, he notes, and the Ads/Games Group is producing some exceptional work as well.

“The workstations at Digital Domain have constantly evolved. We went from generic white boxes to Tier 1 systems, back to white boxes, and now to a more sophisticated Tier 1 data center-targeted ecosystem. With the evolutionary steps we are taking, we are iterating to a more efficient management of these resources,” Quan says. “One of the great advantages of having the workstations placed in a remote location is the security aspects. And on a more human level, the reduction of the fan noises and the beeps all those workstations would have created in the artist locations is notable.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

VFX studio One Of Us adds CTO Benoit Leveau

Veteran post technologist Benoit Leveau has joined London’s One of Us as CTO. The studio, which is in its 16th year, employs 200 VFX artists.

Leveau, who joins One of Us from Milk VFX, has been in the industry for 18 years, starting out in his native France before moving to MPC in London. He then joined Prime Focus, integrating the company’s Vancouver and Mumbai pipelines with London. In 2013, he joined Milk in its opening year as head of pipeline. He helped to build that department and later led the development of Milk’s cloud rendering system.

The studio, which depends on what it calls “the efficient use of existing technology and the timely adoption of new technology,” says Leveau’s knowledge and experience will ensure that “their artists’ creativity has the technical foundation which allows it to flourish.”

Maxon plugin allows for integration of Cinema 4D assets into Unity


Maxon is now a Unity Technologies Verified Solutions Partner and is distributing a plugin for Unity called Cineware by Maxon. The new plugin provides developers and creatives with seamless integration of Cinema 4D assets into Unity. Artists can easily create models and animations in Cinema 4D for use in realtime 3D (RT3D), interactive 2D, 3D, VR and AR experiences. The Cineware by Maxon plugin is now available free of charge on the Unity Asset Store.

The plugin is compatible with Cinema 4D Release 21, the latest version of the software, and Unity’s latest release, 2019.3. The plugin does not require a license of Cinema 4D as long as Cinema 4D scenes have been “Saved for Cineware.” By default, imported assets will appear relative to the asset folder or imported asset. The plugin also supports user-defined folder hierarchies.
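For pipelines that batch-export assets, a save along these lines can be scripted from Cinema 4D’s own Python API. The sketch below rests on our reading rather than Maxon’s documented recipe: we take the SAVECACHES flag (which embeds object caches for Melange/Cineware-style interchange) to be the scripted counterpart of the “Saved for Cineware” option, and the output path is a placeholder; check Maxon’s SDK documentation before relying on it.

```python
# Runs inside Cinema 4D's Script Manager, where the c4d module is available.
import c4d

doc = c4d.documents.GetActiveDocument()

# SAVEDOCUMENTFLAGS_SAVECACHES embeds object caches in the .c4d file, which
# Melange/Cineware-style interchange reads; we assume this matches the UI's
# "Save for Cineware" behavior the Unity plugin requires.
ok = c4d.documents.SaveDocument(
    doc,
    "/path/to/asset_for_unity.c4d",     # placeholder output path
    c4d.SAVEDOCUMENTFLAGS_SAVECACHES,
    c4d.FORMAT_C4DEXPORT,               # save as a native .c4d file
)
print("Saved for Cineware" if ok else "Save failed")
```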

Cineware by Maxon currently supports Geometry:
• Vertex Position, Normals, UV, Skinning Weight, Color
• Skin and Binding Rig
• Pose Morphs as Blend Shapes
• Lightmap UV2 Generation on Import

Materials:
• PBR Reflectance Channel Materials conversion
• Albedo/Metal/Rough
• Normal Map
• Bump Map
• Emission

Animated Materials:
• Color including Transparency
• Metalness
• Roughness
• Emission Intensity, Color
• Alpha Cutout Threshold

Lighting:
• Spot, Directional, Point
• Animated properties supported:
• Cone
• Intensity
• Color

Cameras:
• Animated properties supported:
• Field of View (FOV)

Main Image: Courtesy of Cornelius Dämmrich

Director Vincent Lin discusses colorful Seagram’s Escapes spot

By Randi Altman

Valiant Pictures, a New York-based production house, recently produced a commercial spot featuring The Bachelor/Bachelorette host Chris Harrison promoting Seagram’s Escapes and its line of alcohol-based fruit drinks. A new addition to the product line is Tropical Rosé, which was co-developed by Harrison and contains natural passion fruit, dragon fruit and rosé flavors.

Valiant’s Vincent Lin directed the piece, which features Harrison in a tropical-looking room — brightened with sunny pinks and yellows thanks to NYC’s Nice Shoes — describing the rosé and signing off with the Seagram’s Escapes brand slogan, “Keep it colorful!”

Here, director Lin — and his DP Alexander Chinnici — talks about the project’s conception, shoot and post.

How early did you get involved? Did Valiant act as the creative agency on this spot?
Valiant has a long-standing history with the Seagram’s Escapes brand team, and we were fortunate enough to have the opportunity to brainstorm a few ideas with them early on for their launch of Seagram’s Escapes Tropical Rosé with Chris Harrison. The creative concept was developed by Valiant’s in-house creative agency, headed by creative directors Nicole Zizila and Steven Zizila, and me. Seagram’s was very instrumental in the creative for the project, and we collaborated to make sure it felt fresh and new — like an elevated evolution of their “Keep It Colorful” campaign rather than a replacement.

Clearly, it’s meant to have a tropical vibe. Was it shot greenscreen?
We had considered doing this greenscreen, which would open up some interesting options but would also pose some challenges. What was important for this campaign creatively was to seamlessly take Chris Harrison into the magical world of Seagram’s Escapes Tropical Rosé. A practical approach was chosen so it didn’t feel too “out of this world,” and the live action still felt real and relatable. We had considered putting Chris in a tropical location — either in greenscreen or on location — but we really wanted to play to Chris’ personality and strengths and have him lead us to this world, rather than throw him into it. Plus, they didn’t sign off on letting us film in the Maldives. I tried (smiles).

L-R: Vincent Lin and Alex Chinnici

What was the spot shot on?
I worked with the very talented DP Alex Chinnici, who recommended shooting on the ARRI Alexa for many reasons. I’ll let Alex answer this one.

Alex Chinnici: Some DPs would likely answer with something sexier like, “I love the look!” But that is ignoring a lot of the technical realities available to us these days. A lot of these cameras are wonderful. I can manipulate the look, so I choose a camera based on other reasons. Without an on-set live, color-capable DIT, I had to rely on the default LUT seen on set and through post. The Alexa’s default LUT is my preference among the digital cameras. For lighting and everyone on the set, we start in a wonderful place right off the bat. Post houses also know it so well, along with colorists and VFX. Knowing our limitations and expecting not to be entirely involved, I prefer giving these departments the best image/file possible.

Inherently, the color, highlight retention and skin tone are wonderful right off the bat without having to bend over backward for anyone. With the Alexa, you end up being much closer to the end rather than having to jump through hoops to get there like you would with some other cameras. Lastly, the reliability is key. With the little time that we had, and a celebrity talent, I would never put a production through the risk of some new tech. Being in a studio, we had full control but still, I’d rather start in a place of success and only make it better from there.

What about the lenses?
Chinnici: I chose the Zeiss Master Primes for similar reasons. While sharp, they are not overbearing. With some mild filtration and very soft and controlled lighting, I can adjust that in other ways. Plus, I know that post will beautify anything that needs it; giving them a clean, sharp image (especially considering the seltzer can) is key.

I shot at a deeper stop to ensure that the lenses are even cleaner and sharper, although the Master Primes do hold up very well wide open. I also wanted the Seagram’s can to be in focus as much as possible and for us to be able to see the set behind Chris Harrison, as opposed to a very shallow depth of field. I also wanted to ensure little to no flares, solid contrast, sharpness across the field and no surprises.

Thanks Alex. Back to you Vincent. How did you work with Alex to get the right look?
There was a lot of back and forth between Alex and me, and we pulled references to discuss. Ultimately, we knew the two most important things were to highlight Chris Harrison and the product. We also knew we wanted the spot to feel like a progression from the brand’s previous work. We decided the best way to do this was to introduce some dimensionality by giving the set depth with lighting, while keeping a clean, polished and sophisticated aesthetic. We also introduced a bit of camera movement to further pull the audience in, and we composed the shots in a way that put all the focus on Chris Harrison to bring us into that vibrant CG world.

How did you work with Nice Shoes colorist Chris Ryan to make sure the look stayed on point? 
Nice Shoes is always one of our preferred partners, and Chris Ryan was perfect for the job. Our creatives, Nicole and Steven, had worked with him a number of times. As with all jobs, there are certain challenges and limitations, and we knew we had to work fast. Chris is not only detail oriented, creative and a wizard with color correction, but also able to work efficiently.

He worked on a FilmLight Baselight system off the Alexa raw files. The color grading really brought out the saturation to further reinforce the brand’s slogan, “Keep It Colorful,” but also to manage the highlights and whites so it felt inviting and bright throughout, but not at all sterile.

What about the VFX? Can you talk about how that was accomplished? 
Much like the camera work, we wanted to continue giving dimensionality to the spot by having depth in each of our CG shots. Not only depth in space but also in movement and choreography. We wanted the CG world to feel full of life and vibrant in order to highlight key elements of the beverage — the flavors, dragonfruit and passionfruit — and give it a sense of motion that draws you in while making you believe there’s a world outside of it. We wanted the hero to shine in the center and the animation to play out as if a kaleidoscope or tornado was pulling you in closer and closer.

We sought the help of creative production studio Taylor James to build the CG elements. We chose to work with a core of 3ds Max artists who could handle a range of tasks using Autodesk 3ds Max and Chaos Group’s V-Ray (we also used Autodesk Maya and Arnold for some of the 3D asset creation, animation and lighting). We used Foundry Nuke to composite the shots and integrate the CGI into the footage, with some additional compositing done in Adobe After Effects.

One of the biggest challenges was making sure the live action felt connected to the CG world, but with each still having its own personality. There is a modern and clean feel to these spots that we wanted to uphold while still making it feel fun and playful with colors and movement. There were definitely a few earlier versions that we went a bit crazy with and had to scale down a bit.

Does a lot of your work feature live action and visual effects combined?
I think of VFX like any film technique: It’s simply a tool for directors and creatives to use. The most essential thing is to understand the brand, if it’s a commercial, and to understand the story you are trying to tell. I’ve been fortunate to do a number of spots that involve live-action and VFX now, but truth be told, VFX almost always sneaks its way in these days.

Even if I do a practical effect, there are limitless possibilities in post production and VFX. Anything from simple cleanup to enhancing, compositing, set building and extending — it’s all possible. It’d be foolish not to consider it as a viable tool. Now, that’s not to say you should rely solely on VFX to fix problems, but if there’s a way it can improve your work, definitely use it. For this particular project, obviously, the CG was crucial to let us really be immersed in a magical world at the level of realism and proximity we desired.

Anything challenging about this spot that you’d like to share?
Chris Harrison was terrible to work with and refused to wear a shirt for some reason … I’m just kidding! Chris was one of the most professional, humblest and kindest celebrity talents that I’ve had the pleasure to work with. This wasn’t a simple endorsement for him; he actually did work closely with Seagram’s Escapes over several months to create and flavor-test the Tropical Rosé beverage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The-Artery sees red, creates VFX for Huawei’s AppGallery

The-Artery recently worked on a global campaign for Israeli agency LH and consumer electronics brand Huawei’s official app distribution platform, AppGallery.

The campaign — set to an original musical track called Explore It by artist Tomer Biran — is meant to show the AppGallery not just as a mobile app store, but as a gateway to an endless world of digital content that comes with data protection and privacy.

Each scene features the platform’s signature red square logo, shown in a variety of creative ways thanks to The-Artery’s visual effects work: floating Tetris-like cubes that change with the beat of the music, shifts in camera focus, red-seated subway cars with a floating red cube and more.

“Director Eli Sverdlov, editor Noam Weissman and executive producer Kobi Hoffman all have distinct artistic processes that are unforgiving to conventional storytelling,” explains founder/executive creative director Vico Sharabani. “We had ongoing conversations about how to create a deeper connection between the brand and audiences. The agency, LH, gave us the freedom to really explore the fun, convenience and security behind downloading apps on the Huawei AppGallery.”

Filming took place in Kiev, Ukraine, via production company Jiminy Creative Tel Aviv, while editing, design, animation, visual effects and color grading were all done under one roof at The-Artery’s New York studio. The entire production was completed in only 16 days.

The studio used Autodesk’s Flame and 3ds Max, Side Effects Houdini and Adobe’s After Effects and Photoshop for the visual effects and graphics. Colorist Steve Picano called on Blackmagic’s DaVinci Resolve, and Asaf Bitton provided sound design.

Foundry Nuke 12.1 offers upgrades across product line

Foundry has released Nuke 12.1, with UI enhancements and tool improvements across the entire Nuke family. The largest update to Blink and BlinkScript in recent years improves Cara VR node performance and introduces new tools for developers, while extended functionality in the timeline-based applications speeds up and enriches artist and team review.

Here are the upgrade highlights:
– New Shuffle node updates the classic checkboxes with an artist-friendly, node-based UI that supports up to eight channels per layer (Nuke’s limit) and consistent channel ordering, offering a more robust tool set at the heart of Nuke’s multi-channel workflow.
– Lens Distortion Workflow improvements: The LensDistortion node in NukeX is updated to have a more intuitive workflow and UI, making it easier and quicker to access the faster and more accurate algorithms and expanded options introduced in Nuke 11.
– Blink and BlinkScript improvements: Nuke’s architecture for GPU-accelerated nodes and the associated API can now store data on the GPU between operations, resulting in what Foundry says are “dramatic performance improvements to chains of nodes with GPU caching enabled.” This new functionality is available to developers using BlinkScript (see the sketch after this list), along with bug fixes and a debug print out on Linux.
– Cara VR GPU performance improvements: The Cara VR nodes in NukeX have been updated to take advantage of the new GPU-caching functionality in Blink, offering performance improvements in viewer processing and rendering when using chains of these nodes together. Foundry’s internal tests on production projects show rendering time that’s up to 2.4 times faster.
– Updated Nuke Spherical Transform and Bilateral: The Cara VR versions of the Spherical Transform and Bilateral nodes have been merged with the Nuke versions of these nodes, adding increased functionality and GPU support in Nuke. Both nodes take advantage of the GPU performance improvements added in Nuke 12.1. They are now available in Nuke and no longer require a NukeX license.
– New ParticleBlinkScript node: NukeX now includes a new ParticleBlinkScript node, allowing developers to write BlinkScripts that operate on particles. Nuke 12.1 ships with more than 15 new gizmos, offering a starting point for artists who work with particle effects and developers looking to use BlinkScript.
– QuickTime audio and surround sound support: Nuke Studio, Hiero and HieroPlayer now support multi-channel audio. Artists can now import MOV containers holding audio on Linux and Windows without needing to extract and import the audio as a separate WAV file.
– Faster HieroPlayer launch and Nuke Flipbook integration: Foundry says new instances of HieroPlayer launch 1.2 times faster on Windows and up to 1.5 times faster on Linux in internal tests, improving the experience for artists using HieroPlayer for review. With Nuke 12.1, artists can also use HieroPlayer as the Flipbook tool for Nuke and NukeX, giving them more control when comparing different versions of their work in progress.
– High DPI on Windows and Linux: UI scaling when using high-resolution monitors is now available on Windows and Linux, bringing all platforms in line with the high-resolution display support added for macOS in Nuke 12.0v1.
– Extended ARRI camera support: Nuke 12.1 adds support for ARRI formats, including Codex HDE .arx files, ProRes MXFs and the popular Alexa Mini LF. Foundry also says there are performance gains when debayering footage on CUDA GPUs, and there’s an SDK update.
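As a rough illustration of what “developers using BlinkScript” means in practice, here is a minimal pixel-wise Blink kernel wired up through Nuke’s Python API. The kernel (a simple gain) is our own example, and the knob names (kernelSource, useGPUIfAvailable, recompile) reflect how recent Nuke releases expose the BlinkScript node; verify them against your version rather than treating this as Foundry reference code.

```python
# Run from Nuke's Script Editor (the nuke module only exists inside Nuke).
import nuke

# A pixel-wise Blink kernel that multiplies the input by a user-set gain.
# Chains of GPU-enabled kernels like this are what benefit from 12.1's
# new GPU caching between operations.
GAIN_KERNEL = """
kernel GainKernel : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessPoint, eEdgeClamped> src;  // input image
  Image<eWrite> dst;                             // output image

  param:
    float gain;  // exposed as a knob on the node

  void define() {
    defineParam(gain, "gain", 1.2f);
  }

  void process() {
    dst() = src() * gain;  // runs per pixel, on the GPU when available
  }
}
"""

node = nuke.createNode("BlinkScript")
node["kernelSource"].setValue(GAIN_KERNEL)  # load the kernel text
node["useGPUIfAvailable"].setValue(True)    # opt in to GPU execution
node["recompile"].execute()                 # compile the kernel
```

With GPU caching enabled, chains of kernels like this can keep intermediate images on the card instead of round-tripping through main memory, which is where the release notes’ speedups come from.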

The Call of the Wild director Chris Sanders on combining live-action, VFX

By Iain Blair

The Fox family film The Call of the Wild, based on the Jack London tale, tells the story of a big-hearted dog named Buck who is stolen from his California home and transported to the Canadian Yukon during the Gold Rush. Director Chris Sanders called on the latest visual effects and animation technology to bring the animals in the film to life. The film stars Harrison Ford and is based on a screenplay by Michael Green.

Sanders’ crew included two-time Oscar–winning cinematographer Janusz Kaminski; production designer Stefan Dechant; editors William Hoy, ACE, and David Heinz; composer John Powell; and visual effects supervisor Erik Nash.

I spoke with Sanders — who has helmed the animated films Lilo & Stitch, The Croods and How to Train Your Dragon — about making the film, which features a ton of visual effects.

You’ve had a very successful career in animation but wasn’t this a very ambitious project to take on for your live-action debut?
It was. It’s a big story, but I felt comfortable because it has such a huge animated element, and I felt I could bring a lot to the party. I also felt up to the task of learning — and having such an amazing crew made all of that as easy as it could possibly be.

Chris Sanders on set.

What sort of film did you set out to make?
As true a version as we could tell in a family-friendly way. No one’s ever tried to do the whole story. This is the first time. Before, people just focused on the last 30 pages of the novel and focused on the relationship between Buck and John Thornton, played by Harrison. And that makes perfect sense, but what you miss is the whole origin story of how they end up together — how Buck has to learn to become a sled dog, how he meets the wolves and joins their world. I loved all that, and also all the animation needed to bring it all alive.

How early on did you start integrating post and all the visual effects?
Right away, and we began with previs.

Your animation background must have helped with all the previs needed on this. Did you do a lot of previs, and what was the most demanding sequence?
We did a ton. In animation it’s called layout, a rough version, and on this we didn’t arrive on set without having explored the sequence many times in previs. It helped us place the cameras and block it all, and we also improvised and invented on set. But previs was a huge help with any heavy VFX element, like when Thornton’s going down river. We had real canoes in a river in Canada with inertial measurement devices and inertial recorders, and that was the most extensive recording we had to do. Later in post, we had to replace the stuntman in the canoe with Thornton and Buck in an identical canoe with identical movements. That was so intensive.

 

How was it working with Harrison Ford?
The devotion to his craft and professionalism… he really made me understand what “preparing for a role” really means, and he really focused on Thornton’s back story. The scene where he writes the letter to his wife? Harrison dictated all of that to me and I just wrote it down on top of the script. He invented all that. He did that quite a few times. He made the whole experience exciting and easy.

The film has a sort of retro look. Talk about working with DP Janusz Kaminski.
We talked about the look a lot, and we both wanted to evoke those old Disney films we saw as kids — something very rich with a magical storybook feel to it. We storyboarded a lot of the film, and I used all the skills I’d learned in animation. I’d see sequences a certain way, draw them out, and sometimes we’d keep them and cut them into editorial, which is exactly what you do in animation.

How tough was the shoot? It must have been quite a change of pace for you.
You’re right. It was about 50 days, and it was extremely arduous. It’s the hardest thing I’ve ever done physically, and I was not fully prepared for how exhausted you get — and there’s no time to rest. I’d be driving to set by 4:30am every day, and we’d be shooting by 6am. And we weren’t even in the Yukon — we shot here in California, a mixture of locations doubling for the Yukon and stage work.

 

Where did you post?
All on the Fox lot, and MPC Montreal did all the VFX. We cut it in relatively small offices. I’m so used to post, as all animation is basically post. I wish it was faster, but you can’t rush it.

You had two editors — William Hoy and David Heinz. How did that work?
We sent them dailies and they divided up the work since we had so much material. Having two great voices is great, as long as everyone’s making the same movie.

What were the big editing challenges?
The creative process in editorial is very different from animation, and I was floored by how malleable this thing was. I wasn’t prepared for that. You could change a scene completely in editorial, and I was blown away at what they could accomplish. It took a long time because we came back with over three hours of material in the first assembly, and we had to crush that down to 90 minutes. So we had to lose a huge amount, and what we kept had to be really condensed, and the narrative would shift a lot. We’d take comedic bits and make them more serious and vice versa.

Visual effects play a key role. Can you talk about working on them with VFX supervisor Erik Nash?
I love working with VFX, and they were huge in this. I believe there are fewer than 30 shots in the whole film that don’t have some VFX. And apart from creating Buck and most of the other dogs and animals, we had some very complex visual effects scenes, like the avalanche and the sledding sequence.

L-R: Director Chris Sanders and writer Iain Blair

We had VFX people on set at all times. Erik was always there supervising the reference. He’d also advise us on camera angles now and then, and we’d work very closely with him all the time. The cameras were hooked up to send data to our recording units so that we always knew what lens was on what camera at what focal length and aperture, so later the VFX team knew exactly how to lens the scenes with all the set extensions and how to light them.

The music and sound also play a key role, especially for Buck, right?
Yes, because music becomes Buck’s voice. The dogs don’t talk like they do in The Lion King, so the music was critical. John Powell wrote a beautiful score that we recorded on the Newman Stage at Fox, and then we mixed at 5 Cat Studios.

Where did you do the DI, and how important is it to you?
We did it at Technicolor with colorist Mike Hatzer, and I’m pretty involved. Janusz did the first pass and set the table, and then we fine-tuned it, and I’m very happy with the rich look we got.

Do you want to direct another live-action film?
Yes. I’m much more comfortable with the idea now that I know what goes into it. It’s a challenge, but a welcome one.

What’s next?
I’m looking at all sorts of projects, and I love the idea of doing another hybrid like this.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: BlueBolt VFX supervisor Richard Frazer

“If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.”

Name: Richard Frazer

Company: London’s BlueBolt

Can you describe your company?
For the last four years, I’ve worked at BlueBolt, a Soho-based visual effects company in London. We work on high-end TV and feature films, and our main area of specialty is creating CG environments and populating them. BlueBolt is a privately owned company run by two women, which is pretty rare. They believe in nurturing good talent and training artists up to help them break through the glass ceiling, if they’re up for it.

What’s your job title?
I joined as a lead compositor with a view to becoming a 2D supervisor, and now I am one of the studio’s core VFX supervisors.

What does that entail?
It means I oversee all of the visual effects work for a specific TV show or movie — from script stage to final delivery. That includes working with the director and DP in preproduction to determine what they would like to depict on the screen. We then work out what is possible to shoot practically, or if we need to use visual effects to help out.

I’ll then often be on the set during the shoot to make sure we correctly capture everything we need for post work. I’ll work with the VFX producer to calculate the costs and time scales of the VFX work. Finally, I will creatively lead our team of talented artists to create those rendered images and make sure it all fits in with the show in a visually seamless way.

What would surprise people the most about what falls under that title?
The staggering amount of time and effort invested by many talented people to create something that an audience should be totally unaware exists. If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.

How long have you been working in VFX?
For around a decade. I started out as a rotoscope artist in 2008 and then became a compositor. I did my first supervisor job back in 2012.

How has the VFX industry changed in the time you’ve been working?
A big shift has been just how much more visual effects work there is on TV shows and how much the standard of that work has improved. It used to be that TV work was looked down on as the poor cousin of feature film work. But shows like Game of Thrones have set audience expectations so much higher now. I worked on nothing but movies for the first part of my career, but the majority of my work now is on TV shows.

Did a particular film inspire you along this path in entertainment?
I grew up on ‘80s sci-fi and horror, so movies like Aliens and The Thing were definitely inspirations. This was back when effects were almost all done practically, so I wanted to get into model-making or prosthetics. The first time I remember being blown away by digital VFX work was seeing Terminator 2 at the cinema. I’ve ended up doing the type of work I dreamed of as a kid, just in a digital form.

Did you go to film school?
No, I actually studied graphic design. I worked for some time doing animation, video editing and motion graphics, and I taught myself compositing for commercials using After Effects. But I always had a love of cinema and decided to try specializing in this area. Almost all of what I’ve learned has been on the job. I think there’s no better training than throwing yourself at the work, absorbing everything you can from the people around you and being passionate about what you do.

What’s your favorite part of the job?
Each project has its own unique set of challenges, and every day involves creative problem-solving. I love the process of taking something that exists only in someone’s imagination and creating those images in a way that looks entirely real.

I also love the mix of being at the offices one day creating things that only exist in a virtual world, while the next day I might be on a film set shooting things in the real world. I get to travel to all kinds of random places and get paid to do so!

What’s your least favorite?
There are so many moving parts involved in creating a TV show or movie — so many departments all working together trying to complete the task at hand, as well as factors that are utterly out of your control. You have to have a perfectly clear idea of what needs to be done, but also be able to completely scrap that and come up with another idea at a moment’s notice.

If you didn’t have this job, what would you be doing instead?
Something where I can be creative and make things that physically exist. I’m always in awe of people who build and craft things with their hands.

Can you name some recent projects you have worked on?
Recent work has included Peaky Blinders, The Last Kingdom and Jamestown, as well as a movie called The Rhythm Section.

What is the project that you are most proud of?
I worked on a movie called Under the Skin a few years ago, which was a very technically and creatively challenging project. It was a very interesting piece of sci-fi that people seem to either love or hate, and everyone I ask seems to have a slightly different interpretation of what it was actually about.

What tools do you use day to day?
Almost exclusively Foundry Nuke. I use it for everything from drawing up concepts to reviewing artists’ work. If I need functionality that doesn’t exist, I’ll just write Python code to add it.
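Nuke ships with a Python API for exactly this kind of extension, typically by registering commands in an artist’s menu.py. As a hedged illustration only (the menu path and node choice are invented for the example, not Frazer’s actual tools), a small quality-of-life command might look like this; it runs only inside Nuke, where the nuke module is available.

# Illustrative sketch: add a custom menu command inside Nuke.
import nuke

def grade_selected():
    """Drop a Grade node after each currently selected node."""
    for node in nuke.selectedNodes():
        grade = nuke.nodes.Grade()  # create an unattached Grade node
        grade.setInput(0, node)     # wire its input to the selected node

# Register the command under a custom menu (path is hypothetical).
nuke.menu("Nuke").addCommand("Custom/Grade Selected", grade_selected)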

Where do you find inspiration now?
In the real world, if you just spend the time observing it in the right way. I often find myself distracted by how things look in certain light. And Instagram — it’s the perfect social media for me, as it’s just beautiful images, artwork and photography.

What do you do to de-stress from it all?
The job can be quite mentally and creatively draining and you spend a lot of time in dark rooms staring at screens, so I try to do the opposite of that. Anything that involves being outdoors or doing something physical — I find cycling or boxing are good ways to unwind.

I recently went on a paragliding trip in the French Alps, which was great, but I found myself looking at all these beautiful views of sunsets over mountains and just analyzing how the sunlight was interacting with the fog and the atmospheric haze. Apparently, I can never entirely turn off that part of my brain.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed inside a massive, immersive 20-foot-high, 270-degree semicircular LED video wall and ceiling with a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls and could be edited in realtime during the shoot. Pixel-accurate tracking allowed perspective-correct 3D imagery to be rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment with accurate interactive light on the actors and practical sets. This gave showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine; and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve realtime in-camera composites on set.
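At the heart of that camera-perspective rendering is a classic piece of math: treat the LED wall as a window and rebuild the view frustum every frame from the tracked camera position, i.e. an off-axis projection. The numpy sketch below follows Kooima’s well-known generalized perspective projection; it is a textbook illustration of the technique, not ILM’s StageCraft code.

import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Projection and view matrices for an eye at pe looking 'through' a
    rectangular screen with corners pa (lower-left), pb (lower-right) and
    pc (upper-left). After Kooima, 'Generalized Perspective Projection'."""
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))
    vr = (pb - pa) / np.linalg.norm(pb - pa)          # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)          # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -va @ vn                                      # eye-to-screen distance
    l = (vr @ va) * near / d                          # frustum extents at near
    r = (vr @ vb) * near / d
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d
    proj = np.array([[2*near/(r-l), 0, (r+l)/(r-l), 0],
                     [0, 2*near/(t-b), (t+b)/(t-b), 0],
                     [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
                     [0, 0, -1, 0]])
    rot = np.eye(4); rot[0, :3], rot[1, :3], rot[2, :3] = vr, vu, vn
    trans = np.eye(4); trans[:3, 3] = -pe
    return proj, rot @ trans                          # recompute per frame

# Example: one 4m-wide, 2.25m-tall wall section, camera 3m back, off-center.
proj, view = off_axis_projection([-2, 0, 0], [2, 0, 0], [-2, 2.25, 0],
                                 [0.5, 1.2, 3.0], near=0.1, far=100.0)

Because the frustum is rebuilt from the tracked camera pose on every frame, the wall shows correct parallax from the taking camera’s vantage point, which is why the composite holds up in-camera.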

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau. “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with all of the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control of work typically handed off and reinterpreted in post, improving the quality of visual effects shots with perfectly integrated elements and reducing visual effects requirements in post, a major benefit given today’s compressed schedules.

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI scanned each of the three-strip Technicolor nitrate film negatives at 8K 16-bit, composited them together and then applied a new color grade. The film was rescanned on the Lasergraphics Director 10K scanner. “We have just under 15 petabytes of storage here,” said Bailey. “That’s working storage, because we’re working on 8K movies since [some places in the world] are now broadcasting 8K.”
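Conceptually, compositing the three records is a channel merge: each black and white negative is inverted and becomes one of the red, green and blue channels. The toy numpy/imageio sketch below shows only that final step (the file names are hypothetical, and a real restoration pipeline adds per-strip registration, shrinkage compensation and cleanup long before any merge).

import numpy as np
import imageio.v3 as iio

# Hypothetical 16-bit grayscale scans of one frame's three records.
red   = iio.imread("frame0001_red.tif").astype(np.float64)
green = iio.imread("frame0001_green.tif").astype(np.float64)
blue  = iio.imread("frame0001_blue.tif").astype(np.float64)

# Invert each negative to a positive, normalize, then stack as R, G, B.
max16 = 65535.0
rgb = np.stack([(max16 - ch) / max16 for ch in (red, green, blue)], axis=-1)

iio.imwrite("frame0001_rgb.tif", (rgb * max16).astype(np.uint16))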

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault is kept at 25% humidity and 35 degrees Fahrenheit, the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the black and white scanned negative and the sepia color corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add this sepia area to the inside of the house as Dorothy transitioned into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”

Wilson referred back to the Technicolor three-strip, explaining that because you’ve got three different pieces of film — the different records — they’re receiving the light in different ways. “So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it’s going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that’s being projected by a film projector, it’s less noticeable than when you’re looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out.”
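One way to picture that kind of correction is gain-matching each channel against a temporally smoothed target, so the red, green and blue records stop “breathing” against each other from frame to frame. The numpy sketch below illustrates the principle only; it is not MPI’s Baselight workflow, where a colorist shapes these corrections by hand.

import numpy as np

def smooth_channel_breathing(frames, window=5):
    """frames: (N, H, W, 3) float array of a shot's frames.
    Nudges each channel's per-frame mean toward a moving average over
    `window` frames, damping frame-to-frame color flicker."""
    means = frames.mean(axis=(1, 2))               # (N, 3) per-frame means
    kernel = np.ones(window) / window
    target = np.stack([np.convolve(means[:, c], kernel, mode="same")
                       for c in range(3)], axis=1) # smoothed target means
    gains = target / np.maximum(means, 1e-6)       # per-frame channel gains
    return frames * gains[:, None, None, :]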

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson whether she tried to map the HDR into a kind of sweet spot, so that it’s spectacular yet not overpowering at the same time.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”
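The “curving off” Wilson describes corresponds to a soft-knee highlight roll-off: values below a chosen knee pass through unchanged, while everything above is compressed asymptotically toward a peak instead of clipping. A hedged sketch of such a curve follows; the knee and peak values are arbitrary illustrations, not the actual grade.

import numpy as np

def highlight_rolloff(x, knee=0.75, peak=1.0):
    """Soft-knee roll-off: linear below `knee`, easing toward `peak` above.
    x is intensity normalized so 1.0 is the nominal maximum."""
    x = np.asarray(x, dtype=np.float64)
    over = np.maximum(x - knee, 0.0)                 # amount above the knee
    span = peak - knee                               # remaining headroom
    compressed = knee + span * over / (over + span)  # approaches peak
    return np.where(x <= knee, x, compressed)

# Sparkles at 4x nominal brightness are eased in rather than clipped.
print(highlight_rolloff([0.5, 0.9, 2.0, 4.0]))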

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate the slippers and bring them down a little bit so that they weren’t the first and only thing you saw in the image.”

If you are wondering if audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. “As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved,” explained Feltenstein. “Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was done to create that track, so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production.”

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension Instructor, Post Production. You can follow him on Twitter at @Gochya

Kevin Lau heads up advertising, immersive at Digital Domain

Visual effects studio Digital Domain has brought on Kevin Lau as executive creative director of advertising, games and new media. In this newly created position, Lau will oversee all short-form projects and act as a creative partner for agencies and brands.

Lau brings over 18 years of ad-based visual effects and commercial production experience, working on campaigns for brands such as Target, Visa and Sprint.

Most recently, he was the executive creative director and founding partner at Timber, an LA-based studio focused on ads (GMC, Winter Olympics) and music videos (Kendrick Lamar’s Humble). Prior to that, he held creative director positions at Mirada, Brand New School and Superfad. Throughout his career, his work has been honored with multiple awards including Clios, AICP Awards, MTV VMAs and a Cannes Gold Lion for Sprint’s “Now Network” campaign via Goodby.

Lau, who joins Digital Domain EPs Nicole Fina and John Canning as they continue to build the studio’s short-form business, will help unify the vision for the advertising, games and new media/experiential groups, promoting a consistent voice across campaigns.

Lau joins the team as the new media group prepares to unveil its biggest project to date: Time’s The March, a virtual reality recreation of the 1963 March on Washington for Jobs and Freedom. Digital Domain’s experience with digital humans will play a major role in the future of both groups as they continue to build on the photoreal cinematics and in-game characters previously created for Activision, Electronic Arts and Ubisoft.

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard Dance Monkey by Tones and I, you soon will. Australia’s Visible Studios provided production and post on the video for the song, which has hit number one in more than 30 countries, gone seven-times platinum and stayed at the top of the Australian charts for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic DaVinci Resolve Studio for the edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was done against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and Power Windows to turn gray footage into brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.
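Reduced to first principles, that fix is a key-and-composite: pull a matte from the flat, bright sky, soften its edge and blend a sky plate in behind the foreground. Resolve does this interactively with its qualifier and Power Windows rather than with code, but a toy numpy equivalent of the logic looks like this (the threshold and softness values are arbitrary):

import numpy as np

def replace_sky(frame, sky_plate, lum_threshold=0.82, softness=0.08):
    """frame, sky_plate: (H, W, 3) float arrays in [0, 1].
    Keys the near-white overcast sky by luminance, blends in the plate."""
    # Rec.709 luma as a crude stand-in for a colorist's qualifier.
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])
    # Soft matte: 0 below the threshold, ramping to 1 over `softness`.
    matte = np.clip((luma - lum_threshold) / softness, 0.0, 1.0)[..., None]
    return frame * (1.0 - matte) + sky_plate * matte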

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at good debayer quality, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.


Missing Link, The Lion King among VES Award winners

The Visual Effects Society (VES), the industry’s global professional honorary society, held its 18th Annual VES Awards, the yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the 9th time to the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 25 award categories. The Lion King was named the photoreal feature winner, garnering three awards. Missing Link was named top animated film, winning two awards. The Mandalorian was named best photoreal episode and garnered two awards, with Game of Thrones and Stranger Things 3 also winning two awards each. Hennessy: The Seven Worlds topped the commercial field with two wins.

Andy Serkis presented the VES Award for Creative Excellence to visual effects supervisor Sheena Duggal. Joey King presented the VES Visionary Award to director-producer-screenwriter Roland Emmerich. VFX supervisor Pablo Helman presented the Lifetime Achievement Award to director/producer/screenwriter Martin Scorsese, who accepted via video from New York. Scorsese’s The Irishman also picked up two awards, including Outstanding Supporting Visual Effects in a Photoreal Feature.

Presenters also included directors J.J. Abrams, Jon Favreau, Rian Johnson and Josh Cooley.

Winners of the 18th Annual VES Awards are as follows:

Outstanding Visual Effects in a Photoreal Feature

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

THE IRISHMAN

Pablo Helman

Mitchell Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

Outstanding Visual Effects in a Photoreal Episode

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy K. Cancino

 

Outstanding Supporting Visual Effects in a Photoreal Episode

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

Outstanding Visual Effects in a Real-Time Project

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Outstanding Visual Effects in a Commercial

Hennessy: The Seven Worlds

Carsten Keller

Selçuk Ergen

Kiril Mirkov

William Laban

 

Outstanding Visual Effects in a Special Venue Project

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Outstanding Animated Character in a Photoreal Feature

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

Outstanding Animated Character in an Animated Feature

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

Outstanding Animated Character in an Episode or Real-Time Project

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

Outstanding Animated Character in a Commercial

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

Outstanding Created Environment in a Photoreal Feature

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

Outstanding Created Environment in an Animated Feature

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

Outstanding Virtual Cinematography in a CG Project

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

Outstanding Model in a Photoreal or Animated Project

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

François-Maxence Desplanques

 

Outstanding Effects Simulations in an Animated Feature

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

Outstanding Compositing in a Feature

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

Outstanding Compositing in an Episode

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

Outstanding Compositing in a Commercial

Hennessy: The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for Skyworth’s latest version of its W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery’s newly opened Resolve-based color room and expanded design capabilities, spearheaded by colorist Stephen Picano and design director Lauren Indovina, the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundra while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider as a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process ran on a condensed schedule and required a truly unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed in Autodesk Flame in realtime, all under one roof, while filming took place in Serbia. Additional footage was carefully curated, color graded and cut to fit the tone and flow of the rest of the piece. Creature imagery, such as the jellyfish, was created in CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Framestore launches FPS preproduction services

VFX studio Framestore has launched FPS (Framestore Pre-production Services) for the global film and content production industries. An expansion of Framestore’s existing capability, FPS is available to clients in need of standalone preproduction support or an end-to-end production solution.

The move builds out and aligns the company’s previz, virtual production, techviz and postviz services with Framestore’s art department (which operates either as part of the Framestore workflow or as a stand-alone creative service), virtual production team and R&D unit, and integrates with the company’s VFX and animation teams. The move builds on work on films such as Gravity and the knowledge gained during the company’s eight-year London joint venture with visualization company The Third Floor. FPS is working on feature film projects as part of an integrated offering and as a standalone visualization partner, with more projects slated in the coming months.

The new team is led by Alex Webster, who joins as FPS managing director after running The Third Floor London. He will report directly to Fiona Walkinshaw, Framestore’s global managing director, film.

“This work aligns Framestore’s singular VFX and animation craft with a granular understanding of the visualization industry,” says Webster. “It marries the company’s extraordinary legacy in VFX with established visualization and emergent virtual production processes, supported by bleeding-edge technology and dedicated R&D resource to inform the nimble approach which our clients need. Consolidating our preproduction services represents a significant creative step forward.”

“Preproduction is a crucial stage for filmmakers,” says chief creative officer Tim Webber. “From mapping out environments to developing creatures and characters to helping plot action sequences, it provides unparalleled freedom in terms of seeing how a story unfolds or how characters interact with the worlds we create. Bringing together our technical innovation with an understanding of filmmaking, we want to offer a bespoke service for each film and each individual to help tell compelling, carefully crafted stories.”

“Our clients’ needs are as varied as the projects they bring to us, with some needing a start-to-finish service that begins with concept art and ends in post, while others want a bespoke, standalone solution to specific creative challenges, be that in early-stage concepting, through layout and visualization or in final animation and VFX,” says Framestore CEO William Sargent. “It makes sense to bring all these services in-house — even more so when you consider how our work in adjacent fields like AR, VR and MR has helped the likes of HBO, Marvel and Warner Bros. bring their IP to new, immersive platforms. What we’ll ultimately deliver goes well beyond previz and beyond visualization.”

Main Image: (L-R) Tim Webber, Fiona Walkinshaw and Alex Webster.