
Sony Imageworks provides big effects, animation for Warner’s Smallfoot

By Randi Altman

The legend of Bigfoot: a giant, hairy two-legged creature roaming the forests and giving humans just enough of a glimpse to freak them out. Sightings have been happening for centuries with no sign of slowing down — seriously, Google it.

But what if that story was turned around, and it was Bigfoot who was freaked out by a Smallfoot (human)? Well, that is exactly the premise of the new Warner Bros. film Smallfoot, directed by Karey Kirkpatrick. It’s based on the book “Yeti Tracks” by Sergio Pablos.

Karl Herbst

Instead of a human catching a glimpse of the mysterious giant, a yeti named Migo (Channing Tatum) sees a human (James Corden) and tells his entire snow-filled village about the existence of Smallfoot. Of course, no one believes him so he goes on a trek to find this mythical creature and bring him home as proof.

Sony Pictures Imageworks was tasked with all of the animation and visual effects work on the film, while Warner Animation Group did all of the front-end work — such as adapting the script, creating the production design, editing, directing, producing and more. We reached out to Imageworks VFX supervisor Karl Herbst (Hotel Transylvania 2) to find out more about creating the animation and effects for Smallfoot.

The film has a Looney Tunes-type feel with squash and stretch. Did this provide more freedom or less?
In general, it provided more freedom since it allowed the animation team to really have fun with gags. It also gave them a ton of reference material to pull from and come up with new twists on older ideas. Once out of animation, depending on how far the performance was pushed, other departments — like the character effects team — would have additional work due to all of the exaggerated movements. But all of the extra work was worth it because everyone really loved seeing the characters pushed.

We also found that as the story evolved, Migo’s journey became more emotionally driven; we needed to find a style that also let the audience truly connect with what he was going through. We brought in a lot more subtlety, and a more truthful physicality to the animation when needed. As a result, we have these incredibly heartfelt performances and moments that would feel right at home in an old Road Runner short. Yet it all still feels like part of the same world with these truly believable characters at the center of it.

Was scale between such large and small characters a challenge?
It was one of the first areas we wanted to tackle, since the look of the yeti’s fur next to a human was really important to the filmmakers. In the end, we found that the thickness and fidelity of the yeti hair had to be very high so you could see each hair next to the hairs of the humans.

It also meant allowing the rigs for the humans and yetis to be flexible enough to scale as needed, so we could have moments where they are very close together without feeling disproportionate to each other. Everything in our character pipeline from animation down to lighting had to be flexible in dealing with these scale changes. Even things like subsurface scattering in the skin had dials to deal with moments when Percy, or any human character, was scaled up or down in a shot.

How did you tackle the hair?
We updated a couple of key areas in our hair pipeline starting with how we would build our hair. In the past, we would make curves that look more like small groups of hairs in a clump. In this case, we made each curve its own strand of a single hair. To shade this hair in a way that allowed artists to have better control over the look, our development team created a new hair shader that used true multiple-scattering within the hair.

We then extended that hair shading model to add control over the distribution around the hair fiber to model the effect of animal hair, which tends to scatter differently than human hair. This gave artists the ability to create lots of different hair looks, which were not based on human hair, as was the case with our older models.

Was rendering so many furry characters on screen at a time an issue?
Yes. In the past, shading that much fur all at once would have been difficult, mostly due to our reliance on opacity to create the soft shadows needed for fur. With the new shading model, we were no longer using opacity at all, so the number of rays needed to resolve the hair was lower than in the past. But we now needed to resolve the aliasing caused by the sheer number of fine hairs (9 million for LeBron James’ Gwangi).

We developed a few other new tools within our version of the Arnold renderer to help with aliasing and render time in general. The first was adaptive sampling, which allowed us to raise the anti-aliasing samples drastically: some pixels would use only a few samples while others would use very high sampling, whereas in the past all pixels got the same number. This focused our render times where we needed them, helping to reduce overall rendering. Our development team also added the ability to resume a render from its previous point. This meant we could do all of our lighting work at a lower quality level, get creative approval from the filmmakers, then pick the renders up and bring them to full quality without losing the time already spent.
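The adaptive-sampling idea Herbst describes can be sketched in a few lines. This is a minimal illustration of the general technique, not Imageworks’ Arnold code; the `shade` function and all thresholds here are hypothetical stand-ins.

```python
import random

def shade(px, py):
    """Hypothetical stand-in for tracing one camera ray through a pixel."""
    return 0.5 + random.uniform(-0.2, 0.2)

def adaptive_pixel(px, py, min_samples=4, max_samples=256, threshold=0.005):
    """Sample a pixel until its estimate stabilizes or the budget runs out.

    Flat pixels stop after a few samples; noisy ones (fine hair, soft
    shadows) keep sampling, which is what focuses render time where it
    is actually needed.
    """
    samples = [shade(px, py) for _ in range(min_samples)]
    while len(samples) < max_samples:
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        # The standard error of the mean shrinks as samples accumulate
        if (var / len(samples)) ** 0.5 < threshold:
            break
        samples.append(shade(px, py))
    return sum(samples) / len(samples), len(samples)

value, n = adaptive_pixel(0, 0)
```

In a real renderer the error estimate is computed per color channel and the sample budget is set per frame, but the stopping logic is the same shape.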

What tools were used for the hair simulations specifically, and what tools did you call on in general?
We used Maya and the Nucleus solvers for all of the hair simulations, but developed tools over them to deal with so much hair per character and so many characters on screen at once. The simulation for each character was driven by their design and motion requirements.

The Looney Tunes-inspired design and motion created a challenge around how to keep hair simulations from breaking with all of the quick and stretched motion while being able to have light wind for the emotional subtle moments. We solved all of those requirements by using a high number of control hairs and constraints. Meechee (Zendaya) used 6,000 simulation curves with over 200 constraints, while Migo needed 3,200 curves with around 30 constraints.

Stonekeeper (Common) was the most complex of the characters, with long braided hair on his head, a beard, shaggy arms and a cloak made of stones. He required a cloth simulation pass for the cloak, a rigid-body simulation for the stones and a hair simulation on top of the stones. Our in-house tool, Kami, builds all of the hair at render time and also allows us to add procedurals to the hair at that point. We relied on those procedurals to create the many varied hair looks for all of the generics needed to fill the village full of yetis.

How many different types of snow did you have?
We created three different snow systems for environmental effects. The first was a particle simulation of flakes for near-ground detail. The second was volumetric effects to create lots of atmosphere in the backgrounds, with texture and movement. We used this on each of the large sets and then stored those so lighters could pick which parts they wanted in each shot. To help artistically drive the look of each shot, our third system was a library of 2D elements that the effects team rendered, which could be added during compositing to add details late in shot production.

For ground snow, we had different systems based on the needs in each shot. For shallow footsteps, we used displacement of the ground surface with additional little pieces of geometry to add crumble detail around the prints. This could be used in foreground or background.

For heavy interactions, like tunneling or sliding in the snow, we developed a new tool we called Katyusha. This new system combined rigid-body destruction with fluid simulations to achieve all of the different states snow can take in any given interaction. We then rendered these simulations as volumetrics to give the complex lighting look the filmmakers were after. The snow, being in essence a cloud, allowed light to transport through all of the different layers of geometry and volume that could be present at any given point in a scene. This made it easier for the lighters to dial in the snow’s look in any given lighting situation.

Was there a particular scene or effect that was extra challenging? If so, what was it and how did you overcome it?
The biggest challenge to the film as a whole was the environments. The story was very fluid, so design and build of the environments came very late in the process. Coupling that with a creative team that liked to find their shots — versus design and build them — meant we needed to be very flexible on how to create sets and do them quickly.

To achieve this, we began by breaking the environments into a subset of source shapes that could be combined in any fashion to build Yeti Mountain, Yeti Village and the surrounding environments. Surfacing artists then created materials that could be applied to any set piece, allowing for quick creative decisions about what was rock, snow and ice, and creating many different looks. All of these materials were created using PatternCreate networks as part of our OSL shaders. With them, we could heavily leverage portable procedural texturing between assets, making location construction quicker, more flexible and easier to dial.

To get the right snow look for all levels of detail needed, we used a combination of textured snow, modeled snow and a simulation of geometric snowfall, which all needed to shade the same. For the simulated snowfall we created a padding system that could be run at any time on an environment giving it a fresh coating of snow. We did this so that filmmakers could modify sets freely in layout and not have to worry about broken snow lines. Doing all of that with modeled snow would have been too time-consuming and costly. This padding system worked not only in organic environments, like Yeti Village, but also in the Human City at the end of the film. The snow you see in the Human City is a combination of this padding system in the foreground and textures in the background.

Tom Cruise in MISSION: IMPOSSIBLE - FALLOUT. Director Chris McQuarrie.

Mission: Impossible — Fallout writer/director Christopher McQuarrie

By Iain Blair

It’s hard to believe, but it’s been 22 years since Tom Cruise first launched the Mission: Impossible franchise. Since then, it’s become a global cultural phenomenon that’s grossed more than $2.8 billion, making it one of the most successful series in movie history.

With Mission: Impossible — Fallout, Cruise reprises his role of Impossible Missions Force (IMF) team leader Ethan Hunt for the sixth time. And writer/director/producer Christopher McQuarrie, who directed the series’ previous film Mission: Impossible — Rogue Nation, also returns. That makes him the first filmmaker ever to return to direct a second film in a franchise where one of its signature elements is that there’s been a different director for every movie.


Christopher McQuarrie

In the latest twisty adventure, Hunt and his IMF team (Alec Baldwin, Simon Pegg, Ving Rhames), along with some familiar allies (Rebecca Ferguson, Michelle Monaghan), find themselves in a race against time to stop a nuclear bomb disaster after a mission gone wrong. The film, which also stars Henry Cavill, Angela Bassett, Sean Harris and Vanessa Kirby, features a stellar team behind the camera as well, including director of photography Rob Hardy, production designer Peter Wenham, editor Eddie Hamilton, visual effects supervisor Jody Johnson and composer Lorne Balfe.

In 1995, McQuarrie got his start writing the script for The Usual Suspects, which won him the Best Original Screenplay Oscar. In 2000, he made his directorial debut with The Way of the Gun. Then in 2008 he reteamed with Usual Suspects director Bryan Singer, co-writing the WWII film Valkyrie, starring Tom Cruise. He followed that up with his 2010 script for The Tourist, then two years later, he worked with Cruise again on Jack Reacher, which he wrote and directed.

I recently talked with the director about making the film, dealing with all the visual effects and the workflow.

How did you feel when Tom asked for you to come back and do another MI film?
I thought, “Oh no!” In fact, when he asked me to do Rogue Nation, I was very hesitant because I’d been on the set of Ghost Protocol, and I saw just how complicated and challenging these films are. I was terrified. So after I’d finished Rogue, I said to myself, “I feel really sorry for the poor son-of-a-bitch who does the next one.” After five movies, I didn’t think there was anything left to do, but the joke turned out to be on me!

What’s the secret of its continuing appeal?
First off, Tom himself. He’s always pushing himself and entertaining the audience with stuff they’ve never seen before. Then it’s all about character and story. The emphasis is always on that and the humanity of these characters. On every film, and with the last two we’ve done together, he’s learned how much deeper you can go with that and refined the process. You’re always learning from the audience as well. What they want.

How do you top yourself and make this different from the last one?
To make it different, I replaced my core crew — new DP, new composer and so on — and went for a different visual language. My intention on both films was not to even try to top the previous one. So when we started this I told Tom, “I just want to place somewhere in the Top 6 of Mission: Impossible films. I’m not trying to make the greatest action film ever.”

You say that, but it’s stuffed full of nail-biting car chases and really ambitious action sequences.
(Laughs) Well, at the same time you’re always trying to do something different from the other films in the franchise, so in Rogue I had this idea for a female counterpart for Tom — Ilsa (Rebecca Ferguson) was a more dynamic love interest. I looked at the other five films and realized that the biggest action scene of any of those films had not come in the third act. So it was a chance to create the biggest and most climactic third act — a huge team sequence that involved everyone. That was the big goal. But we didn’t set out to make this giant movie, and it wasn’t till we began editing that we realized just how much action there is.

Women seem to have far larger roles this time out.
That was very intentional from the start. In my earliest talks with Tom, we discussed the need to resolve the Julia (Michelle Monaghan) character and find closure to that story. So we had her and Rebecca, and then Angela Bassett came on board to replace Alec Baldwin’s character at the CIA after he moves to IMF, and it grew from there. I had an idea for the White Widow (Vanessa Kirby) character, and we just stayed open to all possibilities and the idea that these strong women, who own all the scenes they’re in, throw Ethan off balance all the time.

How early did you integrate post into the shoot?
Right at the start, since we had so many visual effects. We also had a major post challenge as Tom broke his ankle doing a rooftop chase stunt in London. So we had to shut down totally for six weeks and re-arrange the whole schedule to accommodate his recovery, and even when he got back on the movie his ankle wasn’t really healed enough.

We then had to shoot a lot of stuff piecemeal, and I knew, in order to make the release date, we had to start cutting right away when we had to stop for six weeks. But that also gave me a chance to re-evaluate it all, since you don’t really know the film you’ve shot until you get in the edit room, and that let me do course corrections I couldn’t have done otherwise. So, I essentially ended up doing re-shoots while still shooting the film. I was able to rewrite the second act, and it also meant that we had a finished cut done just six days after we wrapped. And we were able to test that movie four times and keep fine-tuning it.

Where did you do the post?
All in London, around Soho, and we did the sound at De Lane Lea.

Like Rogue, this was edited by Eddie Hamilton. Was he on the set?
Yes, and he’s invaluable because he’s got a very good eye, is a great storyteller and has a great sense of the continuity. He can also course-correct very quickly and let me know when we need to grab another shot. On Rogue Nation, he also did a lot of 2nd unit stuff, and he has great skills with the crew. We didn’t really have a 2nd unit on this one, which I think is better because it can get really chaotic with one. Basically, I love the edit, and I love being in the editing room and working hand in hand with my editor, shot for shot, and communicating all the time during production. It was a great collaboration.

There’s obviously a huge number of visual effects shots in the film. How many are there?
I’d say well over 3,000, and our VFX supervisor Jody Johnson at Double Negative did an amazing job. DNeg, Lola, One of Us, Bluebolt and Cheap Shot all worked on them. There was a lot of rig removal and cleanup along with the big set pieces.

Mission: Impossible Fallout

What was the most difficult VFX sequence/shot to do and why?
The big “High Altitude Low Opening,” or HALO sequence, where Tom jumps out of a Boeing Globemaster at 25,000 feet was far and away the most difficult one. We shot part of it at an RAF base in England, but then with Tom’s broken ankle and the changed schedule, we ended up shooting some of it in Abu Dhabi. Then we had to add in the Paris backdrop and the lightning for the storm, and to maintain the reality we had to keep the horizon in the shot. As the actors were falling at 160 MPH toward the Paris skyline, all of those shots had to be tracked by hand. No computer could do it, and that alone took hundreds of people working on it for three months to complete. It was exhausting.

Can you talk about the importance of music and sound to you as a filmmaker?
It’s so vital, and for me it’s always a three-pronged approach — music, sound and silence, and then the combination of all three elements. It’s very important to maintain the franchise aesthetic, but I wanted to have a fresh approach, so I brought in composer Lorne Balfe, and he did a great job.

The DI must have been vital. How did that process help?
We did it at Molinare in London with colorist Asa Shoul, who is just so good. I’m fairly hands on, especially as the DP was off on another project by the time we did the DI, although he worked on it with Asa as well. We had a big job dealing with all the stuff we shot in New Zealand, bringing it up to the other footage. I actually try to get the film as close as possible to what I want on the day, and then use the DI as a way of enhancing and shaping that, but I don’t actually like to manipulate things too much, although we gave all the Paris stuff this sort of hazy, sweaty look and feel which I love.

What’s next?
A very long nap.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Mark Thorley joins Mill Film Australia as MD

Mill Film in Australia, a Technicolor VFX studio, has named Mark Thorley as managing director. His appointment comes in the wake of the February launch of Mill Film in Adelaide, Australia.

Thorley brings with him more than 15 years of executive experience, working at such studios as Lucasfilm Singapore, where he oversaw studio operations and production strategies. Prior to that, Thorley spent nine years at Animal Logic, in both its Los Angeles and Sydney locations, as head of production. He also held senior positions at Screen Queensland and Omnicom.

Throughout his career, Thorley has received credits on numerous blockbuster feature films, including Kong: Skull Island, Rogue One, Jurassic World and Avengers: Age of Ultron. Thorley will drive all aspects of VFX production, client relations and business development for Australia, reporting to the global head of Mill Film, Lauren McCallum.

Milk provides VFX for Adrift, adds new head of production Kat Mann

As it celebrates its fifth anniversary, Oscar-, Emmy- and BAFTA-winning VFX studio Milk has taken an additional floor at its London location on Clipstone Street. This visual effects house has worked on projects such as Annihilation, Altered Carbon and Fantastic Beasts and Where to Find Them.

Milk’s expansion increases its artist capacity to 250, and includes two 4K FilmLight Baselight screening rooms and a dedicated client area. The studio has upgraded its pipeline, with all its rendering requirements (along with additional storage and workstation capacity) now entirely in the cloud, allowing full scalability for its roster of film and TV projects.

Annihilation

Milk has just completed production as the main vendor on STXfilms’ new feature film Adrift, the Baltasar Kormákur-directed true story of survival at sea, starring Shailene Woodley and Sam Claflin. The Milk team created all the major water and storm sequences for the feature, which were rendered entirely in the cloud.

Milk has just begun work on new projects, including Four Kids And It — Dan Films/Kindle Entertainment’s upcoming feature film based on Jacqueline Wilson’s modern-day variation on the 1902 E. Nesbit classic novel Five Children And It — for which the Milk team will create the protagonist CG sand fairy character. Milk is also in production as sole VFX vendor on Neil Gaiman and Terry Pratchett’s six-part TV adaptation of Good Omens for Amazon/BBC.

In other news, Milk has brought on VFX producer Kat Mann as head of production. She will oversee all aspects of the studio’s production at its premises in London and at its Cardiff location. Mann has held senior production roles at ILM and Weta Digital, with credits including Jurassic World: Fallen Kingdom, Thor: The Dark World and Avatar. Milk’s former head of production, Clare Norman, has been promoted to business development director.

Milk was founded by a small team of VFX supervisors and producers in June 2013.

Framestore London adds joint heads of CG

Framestore has named Grant Walker and Ahmed Gharraph as joint heads of CG at its London studio. The two will lead the company’s advertising, television and immersive work alongside head of animation Ross Burgess.

Gharraph has returned to Framestore after a two-year stint at ILM, where he was lead FX artist on Star Wars: The Last Jedi, receiving a VES nomination for Outstanding Effects Simulations in a Photoreal Feature. His credits on the advertising side as CG supervisor include Mog’s Christmas Calamity, Sainsbury’s 2015 festive campaign, and Shell V-Power Shapeshifter, directed by Carl Erik Rinsch.

Walker joined Framestore in 2009, and in his time at the company he has worked across film, advertising and television, building a portfolio as a CG artist with campaigns including Freesat’s VES-nominated Sheldon. He was also instrumental in Framestore’s digital recreation of Audrey Hepburn in Galaxy’s 2013 campaign Chauffeur for AMV BBDO. Most recently, he was BAFTA-nominated for his creature work on the Black Mirror episode “Playtest.”

Lindsay Seguin upped to EP at NYC’s FuseFX

Visual effects studio FuseFX has promoted Lindsay Seguin to executive producer in the studio’s New York City office. Seguin is now responsible for overseeing all client relationships at the FuseFX New York office, acting as a strategic collaborator for current and future productions spanning television, commercial and film categories. The company also has an office in LA.

Seguin, who first joined FuseFX in 2014, was previously managing producer. During her time with the company, she has worked with a number of high-profile client productions, including The Blacklist, Luke Cage, The Punisher, Iron Fist, Mr. Robot, The Get Down and the feature film American Made.

“Lindsay has played a key role in the growth and success of our New York office, and we’re excited for her to continue to forge partnerships with some of our biggest clients in her new role,” says Joseph Bell, chief operating officer and executive VP of production at FuseFX.

“We have a really close-knit team that enjoys working together on exciting projects,” Seguin added about her experience working at FuseFX. “Our crew is very savvy and hardworking, and they manage to maintain a great work/life balance, even as the studio delivers VFX for some of the most popular shows on television. Our goal is to have a healthy work environment and produce awesome visual effects.”

Seguin is a member of the Visual Effects Society and the Post New York Alliance. Prior to making the transition to television and feature work, her experience was primarily in national broadcast and commercial projects, which included campaigns for Wendy’s, Garnier, and Optimum. She is a graduate of Penn State University with a degree in telecommunications. Born in Toronto, Seguin is a dual citizen of Canada and the United States.

Review: HP’s lower-cost DreamColor Z24x display

By Dariush Derakhshani

So, we all know how important a color-accurate monitor is in making professional-level graphics, right? Right?!? Even at the most basic level, when you’re stalking online for the perfect watch band for your holiday present of a smart watch, you want the orange band you see in the online ad to be what you get when it arrives a few days later. Even if your wife thinks orange doesn’t suit you, and makes you look like “you’re trying too hard.”

Especially as a content developer, you want to know that what you’re looking at is an accurate representation of the image. Ever walk into a Best Buy and see multiple screens showing the same content, but with wildly ranging differences in color? You can’t have that discrepancy working as a pro, especially in collaboration; you need color accuracy. In my own experience, that position has been filled by HP’s 10-bit DreamColor displays for many years now, but not everyone is awash in bitcoins, and a price tag of over $1,200 is sometimes hard to justify, even for a studio professional.

Enter HP’s DreamColor Z24x display at half the price, coming in around $550 online. Yes, DreamColor for half the cost. That’s pretty significant. For the record, I haven’t used a 24-inch monitor since the dark ages, when Lost was the hot TV show. I’ve been fortunate enough to be running at 27 inches and higher, so there was a little shock when I started using the Z24x HP sent me for review. But this is something I quickly got used to.

With my regular 32-inch 4K display still my primary — so I can fit loads of windows all over the place — I used this DreamColor screen as my secondary display, primarily to check output for my Adobe After Effects comps and Adobe Premiere Pro edits, and to hold my render view window as I develop shaders and lighting in Autodesk Maya. I felt comfortable knowing the images I shared with my colleagues across town would be seen as I intended them, leveling the playing field when working collaboratively (as long as everyone is on the same LUT and color space). Speaking of color spaces, the Z24x hits 100% of sRGB, 99% of AdobeRGB and 96% of DCI-P3, which is just slightly under HP’s Z27x DreamColor. It is, however, slightly faster, with a 6ms response rate.

The Z24x has a 24-inch IPS panel from LG that exhibits color in 10-bit, like its bigger 27-inch Z27x sibling. This gives you over a billion colors, which I have personally verified by counting them all — that was one long weekend, I can tell you. Unlike the highest-end DreamColor screens, though, the Z24x dithers up from 8-bit to 10-bit (called 8-bit+FRC). This means it’s better than an 8-bit color display, for sure, but not quite up to real 10-bit, making it color accurate but not color critical. HP’s implementation of dithering is quite good when subjectively compared to my full 10-bit main display. Frankly, a lot of screens that claim 10-bit may actually be 8-bit+FRC anyway!
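The 8-bit+FRC trick is easy to picture: over successive refreshes the panel flips between two adjacent 8-bit levels so the time-average lands on the 10-bit target. Here is a toy sketch of that idea (my own illustration, not HP’s or LG’s actual algorithm):

```python
def frc_frames(level_10bit, n_frames=4):
    """Temporal dithering: emit n_frames 8-bit levels whose average
    approximates one 10-bit level (there are four 10-bit steps per 8-bit step)."""
    base, remainder = divmod(level_10bit, 4)
    # Show base+1 on `remainder` of the frames, base on the rest
    return [min(base + 1, 255) if f < remainder else base for f in range(n_frames)]

frames = frc_frames(513)          # a level between 8-bit 128 and 129
avg = sum(frames) / len(frames)   # 128.25, i.e. 513 / 4
```

Because the flicker happens at panel refresh rate, the eye integrates the two levels into an intermediate shade; a true 10-bit panel simply drives that shade directly.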

While the Z27x gives you 2560×1440, as you expect of most 27-inch displays if not full-on 4K, the Z24x is at a comfortable 1920×1200, just enough for a full 1080p image and a little room for a slider or info bar. Being the res snob that I am, I had wondered if that was just too low, but at 24 inches I don’t think you would want a higher resolution, even if you’re sitting only 14 inches away from it. And this is a sentiment echoed by the folks at HP, who consulted with many of their professional clients to build this display. That gives a pixel density of about 94PPI, a bit lower than the 109PPI of the Z27x. This density is about the same as a 1080p HD display at 27 inches, so it’s still crisp and clean.
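Those pixel-density figures fall out of simple arithmetic: the diagonal resolution in pixels divided by the diagonal size in inches.

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixel density: pixels along the diagonal per inch of diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

z24x = ppi(1920, 1200, 24)   # ~94 PPI
z27x = ppi(2560, 1440, 27)   # ~109 PPI
```

The same formula confirms the 1080p-at-27-inch comparison: `ppi(1920, 1080, 27)` comes out to roughly 82 PPI, in the same ballpark as the Z24x.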

Viewing angles are good at about 178 degrees, and the screen is matte, with an anti-glare coating, making it easier to stare at without blinking for 10 hours at a clip, as digital artists usually do. Compared to my primary display, this HP’s coating was more matte and still gave me a richer black in comparison, which I liked to see.

Connection options are fairly standard with two DisplayPorts, one HDMI, and one DVI dual link for anyone still living in the past. You also get four USB ports and an analog 3.5mm audio jack if you want to drive some speakers, since you can’t from your phone anymore (Apple, I’m looking at you).

Summing Up
So while 24 inches is a bit small for my taste in a display, I am seriously impressed by the street price of the Z24x, which allows a lot more pros and semi-pros to get the DreamColor accuracy HP offers at half the price. While I wouldn’t recommend color grading a show on the Z24x, this DreamColor does a nice job of bringing a higher level of color confidence at an attractive price. As a secondary display, the Z24x is a nice addition to the workflow of an artist with budget in mind — or one with a mean, orange-watch-band-hating spouse.


Dariush Derakhshani is a VFX supervisor and educator in Southern California. You can follow his random tweets at @koosh3d.

Young pros with autism contribute to Oscar-nominated VFX films

Exceptional Minds Studio, the LA-based visual effects and animation studio made up of young people on the autism spectrum, earned screen credit on three of the five films nominated for Oscars in the visual effects category — Star Wars: The Last Jedi, Guardians of the Galaxy Vol. 2 and War for the Planet of the Apes.

L-R: Lloyd Hackl, Kenneth Au, Mason Taylor and Patrick Brady.

For Star Wars: The Last Jedi, the artists at Exceptional Minds Studio were contracted to do visual effects cleanup work that involved roto and paint for several shots. “We were awarded 20 shots for this film that included very involved rotoscoping and paint work,” explains Exceptional Minds Studio executive producer Susan Zwerman.

The studio was also hired to create the end titles for Star Wars: The Last Jedi, which involved compositing the text into a star-field background.

For Guardians of the Galaxy Vol. 2, Exceptional Minds provided the typesetting for the end credit crawl. For War for the Planet of the Apes, the studio provided visual effects cleanup on 10 shots — this involved tracker marker removal using roto and paint.

Exceptional Minds used Foundry’s Nuke for much of their work, in addition to Silhouette and Mocha for After Effects.

Star Wars: The Last Jedi. Courtesy of ILM

Since opening its doors almost four years ago, this small studio has worked on visual effects for more than 50 major motion pictures and/or television series, including The Good Doctor, Game of Thrones and Doctor Strange.

“The VFX teams we worked with on each of these movies were beyond professional, and we are so thankful that they gave our artists the opportunity to work with them,” says Zwerman, adding that “many of our artists never even dreamed they would be working in this industry.”

An estimated 90 percent of the autism population is underemployed or unemployed, and few training programs exist to prepare young adults with autism for meaningful careers, which is what makes this program so important.

“I couldn’t imagine doing this when I was young,” agreed Patrick Brady, an Exceptional Minds VFX artist.

VFX supervisor Lesley Robson-Foster on Amazon’s Mrs. Maisel

By Randi Altman

If you are one of the many who tend to binge-watch streaming shows, you’ve likely already enjoyed Amazon’s The Marvelous Mrs. Maisel. This new comedy focuses on a young wife and mother living in New York City in 1958, when men worked and women tended to, well, not work.

After her husband leaves her, Mrs. Maisel chooses stand-up comedy over therapy — or you could say stand-up comedy chooses her. The show takes place in a few New York neighborhoods, including the tony Upper West Side, the Garment District and the Village. The storyline brings real-life characters into this fictional world — Midge Maisel studies by listening to Redd Foxx comedy albums, and she also befriends comic Lenny Bruce, who appears in a number of episodes.

Lesley Robson-Foster on set.

The show, created by Amy Sherman-Palladino and Dan Palladino, is colorful and bright and features a significant amount of visual effects — approximately 80 per episode.

We reached out to the show’s VFX supervisor, Lesley Robson-Foster, to find out more.

How early did you get involved in Mrs. Maisel?
The producer Dhana Gilbert brought my producer Parker Chehak and me in early to discuss feasibility issues, as this is a period piece, and to see if Amy and Dan liked us! We’ve been on since the pilot.

What did the creators/showrunners say they needed?
They needed 1958 New York City, weather changes and some very fancy single-shot blending. Also, some fantasy and magic realism.

As you mentioned, this is a period piece, so I’m assuming a lot of your work is based on that.
The big period shots in Season 1 are the Garment District reconstruction. We shot on 19th Street between 5th and 6th — the brilliant production designer Bill Groom did a third of the street practically, and VFX took care of the rest with crowd duplication and CG cars. Then we shot on Park Avenue and had to remove the Met Life building down near Grand Central, and knock out anything post-1958.

We also did a major gag with the driving footage. We shot driving plates around the Upper West Side and had a flotilla of period-correct cars with us, but could not get rid of all the parked cars. My genius design partner on the show, Douglas Purver, created a wall of parked period CG cars and put them over the modern ones. Phosphene then did the compositing.

What other types of effects did you provide?
Amy and Dan — the creators and showrunners — haven’t done many VFX shows, but they are very, very experienced. They write and ask for amazing things that allow me to have great fun. For example, I was asked to make a shot where our heroine is standing inside a subway car, and then the camera comes hurtling backwards through the end of the carriage and then sees the train going away down the tunnel. All we had was a third of a carriage with two and a half walls on set. Douglas Purver made a matte painting of the tunnel, created a CG train and put it all together.

Can you talk about the importance of being on set?
For me being on set is everything. I talk directors out of VFX shots and fixes all day long. If you can get it practically you should get it practically. It’s the best advice you’ll ever give as a VFX supervisor. A trust is built that you will give your best advice, and if you really need to shoot plates and interrupt the flow of the day, then they know it’s important for the finished shot.

Having a good relationship with every department is crucial.

Can you give an example of how being on set might have saved a shot or made a shot stronger?
This is a character-driven show. The directors really like Steadicam and long, long shots following the action. Even though a lot of the effects we want to do really demand motion control, I know I just can’t have it. It would kill the performances and take up too much time and room.

I run around with string and tennis balls to line things up. I watch the monitors carefully and use QTake to make sure things line up within acceptable parameters.

In my experience you have to have the production’s best interests at heart. Dhana Gilbert knows that a VFX supervisor on the crew and as part of the team smooths out the season. They really don’t want a supervisor who is intermittent and doesn’t have the whole picture. I’ve done several shows with Dhana; she knows my idea of how to service a show with an in-house team.

You shot b-roll for this? What camera did you use, and why?
We used a Blackmagic Ursa Mini Pro. We rented one on The OA for Netflix last year and found it to be really easy to use. We liked that it’s self-contained and that we can use the Canon glass from our DSLR kits. It’s got a built-in monitor and it can shoot RAW 4.6K. It cut in just fine with the Alexa Mini for establishing shots and plates. It fits into a single backpack, so we could get a shot at a moment’s notice. The user interface on the camera is so intuitive that anyone on the VFX team could pick it up and learn how to get the shot in 30 minutes.

What VFX houses did you employ, and how do you like to work with them?
We keep as much as we can in New York City, of course. Phosphene is our main vendor, and we like Shade and Alkemy X. I like RVX in Iceland, El Ranchito in Spain and Rodeo in Montreal. I also have a host of secret-weapon individuals dotted around the world. For Parker and me, it’s always horses for courses. Whom we send the work to depends on the shot.

For each show we build a small in-house team — we do the temps and figure out the design, and shoot plates and elements before shots leave us to go to the vendor.

You’ve worked on many critically acclaimed television series. Television is famous for quick turnarounds. How do you and your team prepare for those tight deadlines?
Television schedules can be relentless — prep, shoot and post all happen at the same time. I like it very much, as it keeps the wheels of the machine oiled. We work on features in between the series and enjoy that slower process too. It’s all the same skill set and workflow — just different paces.

If you have to offer a production a tip or two about how to make the process go more smoothly, what would it be?
I would say be involved with EVERYTHING. Keep your nose close to the ground. Really familiarize yourself with the scripts — head trouble off at the pass by discussing upcoming events with the relevant person. Be fluid and flexible and engaged!

Jogger moves CD Andy Brown from London to LA

Creative director Andy Brown has moved from Jogger’s London office to its Los Angeles studio. Brown led the development of boutique VFX house Jogger London, including credits for the ADOT PSA Homeless Lights via Ogilvy & Mather, as well as projects for Adidas, Cadbury, Valentino, Glenmorangie, Northwestern Mutual, La-Z-Boy and more. He’s also been involved in post and VFX for short films such as Foot in Mouth, Containment and Daisy as well as movie title sequences (via The Morrison Studio), including Jupiter Ascending, Collide, The Ones Below and Ronaldo.

Brown got his start in the industry at MPC, where he worked for six years, eventually assuming the role of digital online editor. He then went on to work in senior VFX roles at several of London’s post houses before becoming head of VFX at One Post. Following One Post’s merger with Rushes, Brown founded his own company, Four Walls, establishing its reputation for creative visual effects and finishing.

Brown oversaw Four Walls’ merger with LA’s Jogger Studios in 2016. He has since helped form connections among Jogger’s teams in London, New York, Los Angeles, San Francisco and Austin, with high-end VFX, motion graphics and color grading carried out on projects globally.

VFX house Jogger is a sister company of editing house Cut + Run.