Category Archives: VFX

Cinesite London promotes Caroline Garrett to head of VFX

Cinesite in London has upped Caroline Garrett to head of its London VFX studio. She will oversee day-to-day operations at the independent VFX facility. Garrett will work with her colleagues at Cinesite Montreal, Cinesite Vancouver and partner facilities Image Engine and Trixter to increase Cinesite’s global capacity for feature films, broadcast and streaming shows.

Garrett started her career in 1998 as an artist at the Magic Camera Company in Shepperton Film Studios. In 2002 she co-founded the previsualization company Fuzzygoat, working closely with directors at the start of the creative process. In 2009 she joined Cinesite, taking on the position of CG manager and overseeing the management of the 3D department as well as serving as both producer and executive producer. Prior to her most recent promotion, she was head of production, overseeing all aspects of production for the London studio.

With 20 years of industry experience and her own background as an animator and CG artist, Garrett understands the rigors that artists face while solving technical and creative challenges. Since joining Cinesite in 2009, she has worked as senior production executive on high-profile features, including Harry Potter and the Deathly Hallows, Skyfall, World War Z, The Revenant, Fantastic Beasts and Where to Find Them and, most recently, Avengers: Infinity War.

Garrett is the second woman to be appointed to a head of studio role within the Cinesite group, following Tara Kemes’ appointment as GM of Cinesite’s Vancouver animation studio earlier this year.

Sony Imageworks provides big effects, animation for Warner’s Smallfoot

By Randi Altman

The legend of Bigfoot: a giant, hairy two-legged creature roaming the forests and giving humans just enough of a glimpse to freak them out. Sightings have been happening for centuries with no sign of slowing down — seriously, Google it.

But what if that story was turned around, and it was Bigfoot who was freaked out by a Smallfoot (human)? Well, that is exactly the premise of the new Warner Bros. film Smallfoot, directed by Karey Kirkpatrick. It’s based on the book “Yeti Tracks” by Sergio Pablos.

Karl Herbst

Instead of a human catching a glimpse of the mysterious giant, a yeti named Migo (Channing Tatum) sees a human (James Corden) and tells his entire snow-filled village about the existence of Smallfoot. Of course, no one believes him so he goes on a trek to find this mythical creature and bring him home as proof.

Sony Pictures Imageworks was tasked with all of the animation and visual effects work on the film, while the Warner Animation Group did all of the front-end work — such as adapting the script, creating the production design, editing, directing, producing and more. We reached out to Imageworks VFX supervisor Karl Herbst (Hotel Transylvania 2) to find out more about creating the animation and effects for Smallfoot.

The film has a Looney Tunes-type feel with squash and stretch. Did this provide more freedom or less?
In general, it provided more freedom since it allowed the animation team to really have fun with gags. It also gave them a ton of reference material to pull from and come up with new twists on older ideas. Once out of animation, depending on how far the performance was pushed, other departments — like the character effects team — would have additional work due to all of the exaggerated movements. But all of the extra work was worth it because everyone really loved seeing the characters pushed.

We also found that as the story evolved, Migo’s journey became more emotionally driven, so we needed to find a style that also let the audience truly connect with what he was going through. We brought in a lot more subtlety, and a more truthful physicality to the animation when needed. As a result, we have these incredibly heartfelt performances and moments that would feel right at home in an old Road Runner short. Yet it all still feels like part of the same world with these truly believable characters at the center of it.

Was scale between such large and small characters a challenge?
It was one of the first areas we wanted to tackle, since the look of the yeti’s fur next to a human was really important to the filmmakers. In the end, we found that the thickness and fidelity of the yeti hair had to be very high so you could see each hair next to the hairs of the humans.

It also meant making the rigs for the humans and yetis flexible enough to be scaled as needed, so that in moments where they are very close together they did not feel disproportionate to each other. Everything in our character pipeline, from animation down to lighting, had to be flexible in dealing with these scale changes. Even things like the subsurface scattering in the skin had dials to deal with moments when Percy, or any human character, was scaled up or down in a shot.
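To make the idea of a scale “dial” concrete, here is a minimal sketch assuming a simple uniform rig scale. The names and numbers are illustrative assumptions, not Imageworks’ shader code: the point is that a subsurface scattering radius is a world-space length, so it has to be compensated whenever a character is resized in a shot.

```python
# Illustrative sketch: keeping subsurface scattering consistent when a
# character is rescaled in a shot. Not production shader code.

def compensated_sss_radius(base_radius_cm, rig_scale):
    """Return a world-space scattering radius for a rescaled character.

    base_radius_cm -- scattering radius authored at the character's
                      default (1.0) scale, in centimetres
    rig_scale      -- uniform scale applied to the rig in this shot
                      (e.g. 0.6 when a human is shrunk next to a yeti)
    """
    # Scattering distance is a world-space length, so it must grow or
    # shrink with the character; otherwise a scaled-down human looks
    # waxy and a scaled-up one looks unnaturally translucent.
    return base_radius_cm * rig_scale

# Example: a human skin radius authored at 0.4 cm, scaled to 60% in a shot.
print(compensated_sss_radius(0.4, 0.6))  # -> 0.24
```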

How did you tackle the hair?
We updated a couple of key areas in our hair pipeline starting with how we would build our hair. In the past, we would make curves that look more like small groups of hairs in a clump. In this case, we made each curve its own strand of a single hair. To shade this hair in a way that allowed artists to have better control over the look, our development team created a new hair shader that used true multiple-scattering within the hair.

We then extended that hair shading model to add control over the distribution around the hair fiber to model the effect of animal hair, which tends to scatter differently than human hair. This gave artists the ability to create lots of different hair looks, which were not based on human hair, as was the case with our older models.
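As a rough illustration of what controlling “the distribution around the hair fiber” can mean, here is a generic toy model, not the Imageworks shader: the azimuthal term of a fiber shader is widened and blended toward an isotropic response, which reads more like animal fur than like shiny human hair. All parameters are made up for the example.

```python
import numpy as np

# Toy azimuthal term for a fiber BSDF: a Gaussian lobe around the mirror
# direction blended with an isotropic term. Narrow lobe, low mix -> shiny,
# human-hair-like; wide lobe, high mix -> diffuse, fur-like. Generic
# illustration only.

def azimuthal_distribution(width, isotropic_mix, samples=4096):
    phi = np.linspace(-np.pi, np.pi, samples)
    lobe = np.exp(-0.5 * (phi / width) ** 2)
    lobe /= lobe.sum() * (phi[1] - phi[0])            # normalise lobe over the circle
    iso = np.full_like(phi, 1.0 / (2.0 * np.pi))      # already normalised
    return phi, (1.0 - isotropic_mix) * lobe + isotropic_mix * iso

phi, human_like = azimuthal_distribution(width=0.15, isotropic_mix=0.1)
_,   fur_like   = azimuthal_distribution(width=0.60, isotropic_mix=0.5)
print(human_like.max(), fur_like.max())   # the fur-like lobe is much flatter
```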

Was rendering so many furry characters on screen at a time an issue?
Yes. In the past, this would have been hard to shade all at once, mostly due to our reliance on opacity to create the soft shadows needed for fur. With the new shading model we were no longer using opacity at all, so the number of rays needed to resolve the hair was lower than before. But we then needed to resolve the aliasing caused by the sheer number of fine hairs (9 million for LeBron James’ Gwangi).

We developed a few other new tools within our version of the Arnold renderer to help with aliasing and render times in general. The first was adaptive sampling, which allowed us to raise the anti-aliasing samples drastically: some pixels would use only a few samples while others used very high sampling, whereas in the past all pixels would get the same number. This focused our render time where we needed it, helping to reduce overall rendering. Our development team also added the ability to resume a render from its previous point, which meant we could do all of our lighting work at a lower quality level, get creative approval from the filmmakers and then pick the renders up and bring them to full quality without losing the time already spent.
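Here is a bare-bones sketch of the adaptive-sampling idea described above, in generic Python rather than the studio’s modified Arnold: a pixel keeps taking samples only while the estimated error of its running average stays above a noise threshold, so clean pixels stop early and noisy ones (fine fur, say) get the budget. The shading function is a placeholder assumption.

```python
import random

# Generic adaptive-sampling loop for one pixel: sample until the estimated
# error of the running mean drops below a noise threshold or we hit the cap.
# 'shade' stands in for tracing a camera ray; it is a placeholder here.

def sample_pixel(shade, min_samples=16, max_samples=1024, noise_threshold=0.002):
    values = []
    for n in range(1, max_samples + 1):
        values.append(shade())
        if n < min_samples:
            continue
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        stderr = (var / n) ** 0.5            # estimated error of the mean
        if stderr < noise_threshold:
            break
    return sum(values) / len(values), len(values)

# A flat pixel converges in a few samples; a noisy one (e.g. fine fur) uses many.
flat  = sample_pixel(lambda: 0.5 + random.uniform(-0.01, 0.01))
noisy = sample_pixel(lambda: random.choice([0.0, 1.0]))
print(flat[1], noisy[1])
```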

What tools were used for the hair simulations specifically, and what tools did you call on in general?
We used Maya and the Nucleus solvers for all of the hair simulations, but developed tools over them to deal with so much hair per character and so many characters on screen at once. The simulation for each character was driven by their design and motion requirements.

The Looney Tunes-inspired design and motion created a challenge around how to keep hair simulations from breaking with all of the quick and stretched motion while being able to have light wind for the emotional subtle moments. We solved all of those requirements by using a high number of control hairs and constraints. Meechee (Zendaya) used 6,000 simulation curves with over 200 constraints, while Migo needed 3,200 curves with around 30 constraints.
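Driving millions of render hairs from a few thousand simulated control (guide) curves is a standard approach; the sketch below is a hypothetical illustration of that interpolation step, not the Imageworks pipeline, and the weighting scheme is an assumption made for the example.

```python
import numpy as np

# Drive dense render hairs from a sparse set of simulated guide curves.
# Each render hair follows a weighted blend of its nearest guides, so only
# the guides (e.g. a few thousand per character) ever need to be simulated.

def interpolate_render_hairs(guide_roots, guide_curves, render_roots, k=3):
    """guide_roots:   (G, 3) root positions of the simulated guides
       guide_curves:  (G, P, 3) simulated points along each guide
       render_roots:  (R, 3) root positions of the final render hairs
       returns        (R, P, 3) interpolated render curves"""
    render_curves = np.zeros((len(render_roots),) + guide_curves.shape[1:])
    for i, root in enumerate(render_roots):
        dists = np.linalg.norm(guide_roots - root, axis=1)
        nearest = np.argsort(dists)[:k]
        weights = 1.0 / (dists[nearest] + 1e-6)        # inverse-distance weights
        weights /= weights.sum()
        # Blend the guide deformations, then re-root the curve at this follicle.
        blended = np.tensordot(weights,
                               guide_curves[nearest] - guide_roots[nearest, None],
                               axes=1)
        render_curves[i] = root + blended
    return render_curves

# Tiny example: 4 guides, 5 points each, driving 100 render hairs.
rng = np.random.default_rng(0)
guides_r = rng.random((4, 3))
guides_c = guides_r[:, None, :] + np.linspace(0, 1, 5)[None, :, None] * 0.1
hairs = interpolate_render_hairs(guides_r, guides_c, rng.random((100, 3)))
print(hairs.shape)  # (100, 5, 3)
```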

Stonekeeper (Common) was the most complex of the characters, with long braided hair on his head, a beard, shaggy arms and a cloak made of stones. He required a cloth simulation pass for the cloak, a rigid-body simulation for the stones and hair simulated on top of the stones. Our in-house tool called Kami builds all of the hair at render time and also allows us to add procedurals to the hair at that point. We relied on those procedurals to create many varied hair looks for all of the generics needed to fill the village full of yetis.

How many different types of snow did you have?
We created three different snow systems for environmental effects. The first was a particle simulation of flakes for near-ground detail. The second was volumetric effects to create lots of atmosphere in the backgrounds that had texture and movement. We used this on each of the large sets and then stored those so lighters could pick which parts they wanted in each shot. To also help with artistically driving the look of each shot, our third system was a library of 2D elements that the effects team rendered and could be added during compositing to add details late in shot production.

For ground snow, we had different systems based on the needs in each shot. For shallow footsteps, we used displacement of the ground surface with additional little pieces of geometry to add crumble detail around the prints. This could be used in foreground or background.
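A toy sketch of the shallow-footstep displacement idea follows; it is purely illustrative (the production setup also scattered small crumble geometry around each print, which is omitted here).

```python
import numpy as np

# Stamp shallow footprints into a snow heightfield as simple displacement.
# Real shots would also scatter small "crumble" pieces around each print;
# this sketch only shows the displacement part.

def stamp_footprint(height, center, radius, depth):
    """height: (H, W) snow heightfield, modified in place
       center: (row, col) of the footprint
       radius: footprint radius in grid cells
       depth:  maximum depression at the centre"""
    rows, cols = np.indices(height.shape)
    dist = np.hypot(rows - center[0], cols - center[1])
    # Smooth, bowl-shaped depression that fades out at the footprint edge.
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0) ** 2
    height -= depth * falloff
    return height

snow = np.full((64, 64), 0.3)              # 30 cm of undisturbed snow
stamp_footprint(snow, center=(32, 20), radius=6, depth=0.05)
stamp_footprint(snow, center=(32, 34), radius=6, depth=0.05)
print(snow.min(), snow.max())              # deepest point of the prints vs. untouched snow
```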

For heavy interactions, like tunneling or sliding in the snow, we developed a new tool we called Katyusha. This new system combined rigid-body destruction with fluid simulations to achieve all of the different states snow can take in any given interaction. We then rendered these simulations as volumetrics to give the complex lighting look the filmmakers were after. The snow, being in essence a cloud, allowed light transport through all of the different layers of geometry and volume present at any given point in a scene. This made it easier for the lighters to get the right look for the snow in any given lighting situation.

Was there a particular scene or effect that was extra challenging? If so, what was it and how did you overcome it?
The biggest challenge to the film as a whole was the environments. The story was very fluid, so design and build of the environments came very late in the process. Coupling that with a creative team that liked to find their shots — versus design and build them — meant we needed to be very flexible on how to create sets and do them quickly.

To achieve this, we began by breaking the environments into a subset of source shapes that could be combined in any fashion to build Yeti Mountain, Yeti Village and the surrounding environments. Surfacing artists then created materials that could be applied to any set piece, allowing for quick creative decisions about what was rock, snow and ice, and creating many different looks. All of these materials were created using PatternCreate networks as part of our OSL shaders. With them, we could heavily leverage portable procedural texturing between assets, making location construction quicker, more flexible and easier to dial.
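To give a sense of how this kind of procedural material assignment can work, here is a generic sketch in Python rather than OSL. It is not the actual PatternCreate network; driving rock/snow/ice masks from slope and altitude, and the thresholds used, are assumptions made for illustration.

```python
import numpy as np

# Generic procedural material masks driven by slope and altitude, so the same
# "material" can be applied to any set piece and re-dialled per location.
# Sketch only; the production setup used OSL shader networks.

def material_masks(normals, heights, snow_slope=0.6, ice_height=0.2):
    """normals: (N, 3) unit surface normals (y is up)
       heights: (N,)   normalised altitude of each sample (0..1)
       returns  dict of per-sample weights for snow, ice and rock"""
    up = np.array([0.0, 1.0, 0.0])
    facing_up = normals @ up                          # 1 = flat, 0 = vertical
    snow = np.clip((facing_up - snow_slope) / (1.0 - snow_slope), 0.0, 1.0)
    ice = np.clip(1.0 - heights / ice_height, 0.0, 1.0) * (1.0 - snow)
    rock = np.clip(1.0 - snow - ice, 0.0, 1.0)
    return {"snow": snow, "ice": ice, "rock": rock}

# Example: a flat high ledge, a near-vertical cliff face, a low flat shelf.
n = np.array([[0.0, 1.0, 0.0], [1.0, 0.05, 0.0], [0.0, 0.95, 0.3]])
n /= np.linalg.norm(n, axis=1, keepdims=True)
print(material_masks(n, np.array([0.9, 0.5, 0.05])))
```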

To get the right snow look for all levels of detail needed, we used a combination of textured snow, modeled snow and a simulation of geometric snowfall, which all needed to shade the same. For the simulated snowfall we created a padding system that could be run at any time on an environment, giving it a fresh coating of snow. We did this so that the filmmakers could modify sets freely in layout and not have to worry about broken snow lines. Doing all of that with modeled snow would have been too time-consuming and costly. This padding system worked not only in organic environments, like Yeti Village, but also in the Human City at the end of the film. The snow you see in the Human City is a combination of this padding system in the foreground and textures in the background.
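A minimal sketch of what such a padding pass might do conceptually, assuming snow is deposited along surface normals in proportion to how much each point faces the sky. This is an assumption about the general approach, not the studio’s actual tool.

```python
import numpy as np

# Conceptual snow "padding": push vertices outward along their normals by an
# amount that depends on how much each faces the sky, giving any re-dressed
# set a fresh coating with a consistent snow line. Illustrative only.

def pad_with_snow(vertices, normals, max_thickness=0.08):
    """vertices: (N, 3) mesh points; normals: (N, 3) unit normals (y is up)."""
    up_facing = np.clip(normals[:, 1], 0.0, 1.0)       # 0 on undersides and walls
    thickness = max_thickness * up_facing ** 1.5       # thinner on steep faces
    return vertices + normals * thickness[:, None]

# Example: a flat roof vertex gets the full coat, a wall vertex gets none.
v = np.array([[0.0, 2.0, 0.0], [1.0, 1.0, 0.0]])
n = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
print(pad_with_snow(v, n))
```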


MPC Film provides VFX for new Predator movie

From outer space to suburban streets, the hunt comes to Earth in director Shane Black’s reinvention of the Predator series. Now, the universe’s most lethal hunters are stronger, smarter and deadlier than ever before, having genetically upgraded themselves with DNA from other species.

When a young boy accidentally triggers their return to Earth, only a crew of ex-soldiers and an evolutionary biology professor can prevent the end of the human race.

 

MPC Film’s team, led by VFX supervisors Richard Little and Arundi Asregadoo, created 500 shots for the new Predator movie. The majority of the work involved hero animation for the Upgrade Predator; additional work included the Predator dogs, effects simulations and a CG swamp environment.

MPC Film’s character lab department modeled, sculpted and textured the Upgrade and Predator dogs to a high level of detail, allowing the director to have flexibility to view closeups, if needed. The technical animation department applied movement to the muscle system and added the flowing motion to the dreadlocks on all of the film’s hero alien characters, an integral part of the Predator’s aesthetic in the franchise.

MPC’s artists also created photorealistic Predator One and digital double mercenary characters for the film. Sixty-four other assets were created, ranging from stones and sticks for the swamp floor to grenades, a grenade launcher and bombs.

MPC’s artists also worked on the scene where we first meet the Upgrade’s dogs, which are sent out to hunt the Predator. The sequence was shot on a bluescreen stage on location. The digital environments team built a 360-degree baseball field matching the location shoot from reference photography. Creating simple geometry and re-projecting the textures helped create a realistic environment.

Once the Upgrade tracks down the “fugitive” Predator the fight begins. To create the scene, MPC used a mixture of the live-action Predator intercut with its full CG Predator. The battle culminates with the Upgrade ripping the head and spine away from the body of the fugitive. This shot was a big challenge for the FX and tech animation team, who also added green Predator blood into the mix, amplifying the gore factor.

In the hunt scene, the misfit heroes trap the Upgrade and set the ultimate hunter alight. This sequence was technically challenging for the animation, lighting and FX team, which worked very closely to create a convincing Upgrade that appeared to be on fire.

For the final battle, MPC Film’s digital environments artists created a full CG swamp where the Upgrade’s ship crash-lands. The team was tasked with matching the set and creating a 360-degree CG set extension with water effects.

A big challenge was how to ensure the Upgrade Predator interacted realistically with the actors and set. The animation, tech animation and FX teams worked hard to make the Upgrade Predator fit seamlessly into this environment.


London design, animation studio Golden Wolf sets up shop in NYC

Animation studio Golden Wolf, headquartered in London, has launched its first stateside location in New York City. The expansion comes on the heels of an alliance with animation/VFX/live-action studio Psyop, a minority investor in the company. Golden Wolf now occupies studio space in SoHo adjacent to Psyop and its sister company Blacklist, which formerly represented Golden Wolf stateside and was instrumental to the relationship.

Among the year’s highlights from Golden Wolf are an integrated campaign for Nike FA18 Phantom (client direct), a spot for the adidas x Parley Run for the Oceans initiative (TBWA Amsterdam) in collaboration with Psyop, and Marshmello’s “Fly” music video for Disney. Golden Wolf also received an Emmy nomination for its main title sequence for Disney’s DuckTales reboot.

Heading up Golden Wolf’s New York office are two transplants from the London studio, executive producer Dotti Sinnott and art director Sammy Moore. Both joined Golden Wolf in 2015, Sinnott from motion design studio Bigstar, where she was a senior producer, and Moore after a run as a freelance illustrator/designer in London’s agency scene.

Sinnott comments: “Building on the strength of our London team, the Golden Wolf brand will continue to grow and evolve with the fresh perspective of our New York creatives. Our presence on either side of the Atlantic not only brings us closer to existing clients, but also positions us perfectly to build new relationships with New York-based agencies and brands. On top of this, we’re able to use the time difference to our advantage to work on faster turnarounds and across a range of budgets.”

Founded in 2013 by Ingi Erlingsson, the studio’s executive creative director, Golden Wolf is known for youth-oriented work — especially content for social media, entertainment and sports — that blurs the lines of irreverent humor, dynamic action and psychedelia. Erlingsson was once a prolific graffiti artist and, later, illustrator/designer and creative director at U.K.-based design agency ilovedust. Today he inspires Golden Wolf’s creative culture and disruptive style fed in part by a wave of next-gen animation talent coming out of schools such as Gobelins in France and The Animation Workshop in Denmark.

“I’m excited about our affiliation with Psyop, which enjoys an incredible legacy producing industry-leading animated advertising content,” Erlingsson says. “Golden Wolf is the new kid on the block, with bags of enthusiasm and an aim to disrupt the industry with new ideas. The combination of the two studios means that we are able to tackle any challenge, regardless of format or technical approach, with the support of some of the world’s best artists and directors. The relationship allows brands and agencies to have complete confidence in our ability to solve even the biggest challenges.”

Golden Wolf’s initial work out of its New York studio includes spots for Supercell (client direct) and Bulleit Bourbon (Barton F. Graf). Golden Wolf is represented in the US market by Hunky Dory for the East Coast, Baer Brown for the Midwest and In House Reps for the West Coast. Stink represents the studio for Europe.

Main Photo: (L-R) Dotti Sinnott, Ingi Erlingsson and Sammy Moore.


Montreal’s Real by Fake acquires LA’s Local Hero

Montréal-based post company Real by Fake has acquired Santa Monica’s Local Hero, whose credits include Mr. Robot, Captain Fantastic and Pitch Perfect. The acquisition creates an international post production and VFX entity that gives Real by Fake (C.R.A.Z.Y., Café de Flore, Dallas Buyers Club, Wild, Demolition) a significant presence in the Los Angeles area, while also bringing Local Hero’s established brand to Montréal. The two companies recently collaborated on all post and VFX for HBO’s popular limited series Big Little Lies and Sharp Objects. Check out our coverage of Sharp Objects here.

The companies will offer a full suite of services in both Montréal and Los Angeles, including: workflow consulting and look development; film and television tax credit-claiming services in Québec, California and Georgia; dailies; VFX; editorial suites; digital intermediates; full sound packages; mastering and deliverables (including 4K, VR and Dolby Vision HDR).

The companies will retain the Real by Fake brand for VFX and the Local Hero brand for all other post services. Marc Côté will continue to serve as the president of Real by Fake and run all Canadian operations. Steve Bannerman will continue as CEO of Local Hero and run all US operations. Leandro Marini will continue to manage all Local Hero creative services and customer support.

“I have collaborated closely with Marc Côté and Real by Fake on many projects, and they have become the trusted producing and post production partner for all my projects,” says director and executive producer Jean-Marc Vallée. “I witnessed the teamwork between Real by Fake and Local Hero first-hand while working on Big Little Lies and Sharp Objects.”

“This acquisition is perfect for Local Hero for several reasons,” according to Steve Bannerman, CEO of Local Hero. “First, it gives us significant scale. Many of our best clients, like Lynette Howell and Matt Ross, are now taking on some of the biggest projects in Hollywood, and we need scale to work on those projects — particularly in VFX. We now have that. We can also offer our clients access to the lucrative tax incentives in Montreal and Georgia, so they can allocate more of their budget above the line. This is a crucial component in getting any project made in today’s climate. And, lastly, we get a fantastic partner in Marc Côté, and the team at Real by Fake. Marc brings a tremendous amount of cutting-edge VFX producing and on-set technical skill to the company. As more of our business trends toward VFX, this expertise is crucial in winning the big projects, while efficiently managing their VFX budgets. In short, we now have the complete package — world-class scale, skill and tax incentives.”


CoreMelt’s new PaintX provides planar tracking for FCPX

CoreMelt’s PaintX planar tracking-based paint utility software for Apple Final Cut Pro X editors and artists is now available. CoreMelt PaintX is powered by Boris FX’s Mocha technology, which tracks camera motion, objects and people for seamless visual effects and screen composites. PaintX enables editors and artists to quickly and accurately apply “fixes” to footage and sequences by using standard color, blur, sharpen, clone, smear and warp brushes and applying the planar tracker.

PaintX integrates a number of features designed to significantly simplify and accelerate the editing workflow within FCPX.

Here are some highlights:
– Each paint stroke is editable after being made and is fully non-destructive, with unlimited undo;
– The tracked clone brush solves many common problems quickly;
– The integrated Mocha tracker is accessible with a single button press;
– Each stroke can have a different track applied, so multiple tracks can be used in one plugin (see the conceptual sketch after this list);
– Users can copy and paste track data from one stroke to another in order to apply different effects with the same track data;
– Users can save and restore brush preset shapes and sizes.
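Conceptually, applying a planar track to a paint stroke means re-projecting the stroke’s control points with the tracker’s per-frame plane transform (a homography). The sketch below is a generic illustration of that idea, not CoreMelt’s or Boris FX’s actual API, and the transform values are invented for the example.

```python
import numpy as np

# Conceptual illustration of track-driven paint: a planar tracker yields a
# 3x3 homography per frame; applying it to a stroke's control points keeps
# the paint "stuck" to the tracked surface. Not the actual plugin API.

def apply_track_to_stroke(stroke_points, homography):
    """stroke_points: (N, 2) stroke control points on the reference frame
       homography:    (3, 3) reference-frame -> current-frame plane transform
       returns        (N, 2) points repositioned for the current frame"""
    pts = np.hstack([stroke_points, np.ones((len(stroke_points), 1))])  # to homogeneous
    warped = pts @ homography.T
    return warped[:, :2] / warped[:, 2:3]                               # back to pixels

# Example: a stroke drawn on frame 1, and a track where the surface has
# shifted and slightly scaled by frame 2 (numbers are made up).
stroke = np.array([[100.0, 200.0], [120.0, 210.0], [140.0, 205.0]])
frame2 = np.array([[1.02, 0.0, 15.0],
                   [0.0, 1.02, -8.0],
                   [0.0, 0.0, 1.0]])
print(apply_track_to_stroke(stroke, frame2))
```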

CoreMelt has worked closely with the FCPX community to develop a series of tutorials to help new users familiarize themselves with the PaintX workflow within FCPX. The entire library of tutorials is available here.

PaintX is available for $99. CoreMelt is offering significant discounts on its Everything Bundle (which also includes PaintX) and its Chromatic + PaintX bundles. For more information on these bundles click here.


postPerspective Impact Award winners from SIGGRAPH 2018

postPerspective has announced the winners of our Impact Awards from SIGGRAPH 2018 in Vancouver. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and professionals. After all, it’s working pros who are going to be using these new tools, so we let them make the call.

The awards honor innovative products and technologies for the visual effects, post production and production industries that will influence the way people work. They celebrate companies that push the boundaries of technology to produce tools that accelerate artistry and actually make users’ working lives easier.

While SIGGRAPH’s focus is on VFX, animation, VR/AR, AI and the like, the types of gear on display vary. Some products are suited for graphics and animation, while others have uses that slide into post production, which makes these SIGGRAPH Impact Awards doubly interesting.

The winners are as follows:

postPerspective Impact Award — SIGGRAPH 2018 MVP Winner:

They generated a lot of buzz at the show, as well as a lot of votes from our team of judges, so our MVP Impact Award goes to Nvidia for its Quadro RTX raytracing GPU.

postPerspective Impact Awards — SIGGRAPH 2018 Winners:

  • Maxon for its Cinema 4D R20 3D design and animation software.
  • StarVR for its StarVR One headset with integrated eye tracking.

postPerspective Impact Awards — SIGGRAPH 2018 Horizon Winners:

This year we started a new Impact Award category. Our Horizon Award celebrates the next wave of impactful products being previewed at a particular show. At SIGGRAPH, the winners were:

  • Allegorithmic for its Substance Alchemist tool powered by AI.
  • OTOY and Epic Games for their OctaneRender 2019 integration with Unreal Engine 4.

And while these products and companies didn’t win enough votes for an award, our voters believe they do deserve a mention and your attention: Wrnch, Google Lightfields, Microsoft Mixed Reality Capture and Microsoft Cognitive Services integration with PixStor.

 


Artifex provides VFX limb removal for Facebook Watch’s Sacred Lies

Vancouver-based VFX house Artifex Studios created CG amputation effects for the lead character in Blumhouse Productions’ new series for Facebook Watch, Sacred Lies. In the show, the lead character, Minnow Bly (Elena Kampouris), emerges after 12 years in the Kevinian cult missing both of her hands. Artifex was called on to remove the actress’ limbs.

VFX supervisor Rob Geddes led the Artifex team that created the hand-to-stump transposition, which encompassed 165 shots across the series. This involved detailed paint work to remove the real hands, while Artifex’s 3D artists simultaneously performed tracking and matchmove in SynthEyes to align the CG stump assets to the actress’ forearms.

This was followed up with some custom texture and lighting work in Autodesk Maya and Chaos V-Ray to dial in the specific degree of scarring or level of healing on the stumps, depending on each scene’s context in the story. While the main focus of Artifex’s work was on hand removal, the team also created a pair of severed hands for the first episode after rubber prosthetics didn’t pass the eye test. VFX work was run through Side Effects Houdini and composited in Foundry’s Nuke.

“The biggest hurdle for the team during this assignment was working with the actress’ movements and complex performance demands, especially the high level of interaction with her environment, clothing or hair,” says Adam Stern, founder of Artifex. “In one visceral sequence, Rob and his team created the actual severed hands. These were originally shot practically with prosthetics, however the consensus was that the practical hands weren’t working. We fully replaced these with CG hands, which allowed us to dial in the level of decomposition, dirt, blood and torn skin around the cuts. We couldn’t be happier with the results.”

Geddes adds, “One interesting thing we discovered when wrangling the stumps is that the logical and accurate placement of the wrist bone of the stumps didn’t necessarily feel correct when the hands weren’t there. There was quite a bit of experimentation to keep the ‘hand-less’ arms from looking unnaturally long or thin.”

Artifex also created a scene of absolute devastation in a burnt forest for Episode 101, using matte painting and set extensions to depict extensive fire damage that couldn’t safely be achieved on set. The team drew on its experience in environment VFX creation, using matte paintings and projections tied together with ample rotoscope work.

Approximately 20 Artifex artists worked on Sacred Lies across 3D, compositing, matte painting, I/O and production.

Watch Artifex founder Adam Stern talk about the show from the floor of SIGGRAPH 2018:


Sony Pictures Post adds three theater-style studios

Sony Pictures Post Production Services has added three theater-style studios inside the Stage 6 facility on the Sony Pictures Studios lot in Culver City. All studios feature mid-size theater environments and include digital projectors and projection screens.

Theater 1 is set up for sound design and mixing with two Avid S6 consoles and immersive Dolby Atmos capabilities, while Theater 3 is geared toward sound design with a single S6. Theater 2 is designed for remote visual effects and color grading review, allowing filmmakers to monitor ongoing post work at other sites without leaving the lot. Additionally, centralized reception and client services facilities have been established to better serve studio sound clients.

Mix Stage 6 and Mix Stage 7 within the sound facility have been upgraded, each featuring two S6 mixing consoles, six Pro Tools digital audio workstations, Christie digital cinema projectors, 24x13 projection screens and a variety of support gear. The stages will be used to mix features and high-end television projects. The new resources add capacity and versatility to the studio’s sound operations.

Sony Pictures Post Production Services now has 11 traditional mix stages, the largest being the Cary Grant Theater, which seats 344. It also has mix stages dedicated to IMAX and home entertainment formats. The department features four sound design suites, 60 sound editorial rooms, three ADR recording studios and three Foley stages. Its Barbra Streisand Scoring Stage is among the largest in the world and can accommodate a full orchestra and choir.

Patrick Ferguson joins MPC LA as VFX supervisor

MPC’s Los Angeles studio has added Patrick Ferguson to its staff as visual effects supervisor. He brings with him experience working in both commercials and feature films.

Ferguson started out in New York and moved to Los Angeles in 2002, and he has since worked at a range of visual effects houses on the West Coast, including The Mission, where he was VFX supervisor, and Method, where he was head of 2D. “No matter where I am in the world or what I’m working on, one thing has remained consistent since I started working in the industry: I still love what I do. I think that’s the most important thing.”

Ferguson has collaborated with directors such as Stacy Wall, Mark Romanek, Melina Matsoukas, Brian Billow and Carl Rinsch, and has worked on campaigns for big global brands, including Nike, Apple, Audi, HP and ESPN.

He has also worked on high-profile films, including Pirates of the Caribbean and Alice in Wonderland, and he was a member of the Academy Award-winning team for The Curious Case of Benjamin Button.

“In this new role at MPC, I hope to bring my varied experience of working on large-scale feature films as well as on commercials that have a much quicker turnaround time,” he says. “It’s all about knowing what the correct tools are for the particular job at hand, as every project is unique.”

For Ferguson, there is no substitute for being on set: “Being on set is vital, as that’s when key relationships are forged between the director, the crew, the agency and the entire team. Those shared experiences go a long way in creating a trust that is carried all the way through to the end of the project and beyond.”