
A Conversation: Jungle Book’s Oscar-Winner Rob Legato

By Randi Altman

Rob Legato’s resume includes some titles that might be considered among the best visual effects films of all time: Titanic, Avatar, Hugo, Harry Potter and the Sorcerer’s Stone, Apollo 13 and, most recently, The Jungle Book. He has three Oscars to his credit (Titanic, Hugo, The Jungle Book) along with one other nomination (Apollo 13). And while Martin Scorsese’s The Wolf of Wall Street and The Aviator don’t scream effects, he worked on those as well.

While Legato might be one of the most prolific visual effects supervisors of all time, he never intended for this to be his path. “The magic of movies, in general, was my fascination more than anything else,” he says, and that led him to study cinematography and directing at Santa Barbara’s Brooks Institute, which provided intensive courses on the intricacies of working with cameras and film.

Rob Legato worked closely with Technicolor and MPC to realize Jon Favreau’s vision for The Jungle Book, which is nominated for a VFX Oscar this year.

It was this technical knowledge that came in handy at his first job, working as a producer at a commercials house. “I knew that bizarre, esoteric end of the business, and that became known among my colleagues.” So when a spot came in that had a visual effect in it, Legato stepped up. “No one knew how to do it, and this was before on-set visual effects supervisors worked on commercials. I grabbed the camera and I figured out a way of doing it.”

After working on commercials, Legato transitioned to longer-form work, specifically television. He started on the second season of The Twilight Zone series, where he got the opportunity to shoot some footage. He was hoping to direct an episode, but the show got cancelled before he had a chance.

Legato then took his experience to Star Trek at a time when they were switching from opticals to a digital post workflow. “There were very few people then who had any kind of visual effects and live-action experience in television. I became second-unit director and ultimately directed a few shows. It was while working on Next Generation and Deep Space Nine that I learned how to mass produce visual effects on as big a scale as television allows, and that led me to Digital Domain.”

It was at Digital Domain that Legato transitioned to films, starting with Interview With the Vampire, on which he served as visual effects supervisor. “Director Neil Jordan asked me to do the second unit. I got along really well with DP Philippe Rousselot and was able to direct live-action scenes and personally direct and photograph anything that was not live-action related — including the Tom Cruise puppet that looked like he was bleeding to death.” This led to Apollo 13, on which he was VFX supervisor.

On set for Hugo (L-R): Martin Scorsese, DP Bob Richardson and Rob Legato.

“I thought as a director did, and I thought as a cameraman, so I was able to answer my own questions. This made it easy to communicate with directors and cameramen, and that was my interest. I attacked everything from the perspective of, ‘If I were directing this scene, what would I do?’ It then became easy for me to work with directors who weren’t very fluent in the visual effects side. And because I shot second unit too, especially on Marty Scorsese’s movies, I could determine what the best way of getting that image was. I actually became quite a decent cameraman with all this practice emulating Bob Richardson’s extraordinary work, and I studied the masters (Marty and Bob) and learned how to emulate their work to blend into their sequences seamlessly. I was also able to maximize the smaller dollar amount I was given by designing both second unit direction and cinematography together to maximize my day.”

Ok, let’s dig in a bit deeper with Legato, a card-carrying member of the ASC, and find out how he works with directors, what his workflow looks like and why he loves trying out, and helping to create, new technology in service of the story.

Over the years you’ve embraced virtual production. How has that technology evolved?
When I was working on Harry Potter, I had to previs a sequence for time purposes, and we used a computer. I would tell the CG animators where to put the camera and lights, but there was something missing — a lot of times you get inspired by what’s literally in front of you, which is ever-changing in realtime. We were able to click the mouse and move it where we needed, but it was still missing this other sense of life.

For example, when I did Aviator, I had to shoot the plane crash; something I’d never done before, and I was nervous. It was a Scorsese film, so it was a given that it was to be beautifully designed and photographed. I didn’t have a lot of money, and I didn’t want to blow my opportunity. On Harry Potter and Titanic we had a lot of resources, so we could fix a mistake pretty easily. Here, I had one crack at it, and it had to be a home run.

So I prevised it, but added realtime live-action pan and tilt wheels so we could operate and react in realtime — instead of using a mouse, I was basically using what we use on a stage. It was a great way of working. I was doing the entire scene from one vantage point. I then re-staged it, put a different lens on it and shot the same exact scene from another angle. Then I could edit it as you would a real sequence, just as if I had all the angles I would have if I had photographed it conventionally and produced a full set of multi-angle live-action dailies.

You edit as well?
I love editing. I would operate the shot and then cut it in the Avid, instantly. All of a sudden I was able to build a sequence that had a certain photographic and editorial personality to it — it felt like there was someone quite specific shooting it.

Is that what you did for Avatar?
Yes. Cameron loves to shoot, operate and edit. He has no fear of technology. I told him what I did on Aviator and that I couldn’t afford to add the more expensive, but extremely flexible, motion capture to it. So on Avatar, the camera didn’t just have live pan and tilt wheels; it could also be handheld. You could do Steadicam shots, you could do running shots, you could do handheld things, anything you wanted, including adding a motion capture live performance by an actor. You could easily stage them, or a representation of that character, at any place or scale in the scene, because in Avatar the characters were nine feet tall. You could preview the entire movie in a very free-form and analog way. Jim loved the fact he could impart his personality — the way he moves the camera, the way he frames, the way he cuts — and that the CG-created film would bear the unmistakable stamp of his distinctive live-action movies.

You used the “Avatar-way” on Jungle Book, yes?
Yes. It wasn’t until Jungle Book that I could afford the Avatar way — a full-on stage with a lot of people to man it. I was able to take what I gave to Jim on Avatar and do it myself, with the bells and whistles and some improvements, and that gave a lifelike sensibility to what could have been an animated film. Instead it became a live film, because we used a live-action analog methodology of acquiring images and choosing the right, exact moment per the cut.

The idea behind virtual cinematography is that you shoot it like you would a regular movie. All the editors, cameramen or directors who’ve never done this before are now operating the way they would if it were real. That flavor and personality starts to rub off on the patina of the film, and it begins to feel like a real movie, not an animated or computer-generated one.

Our philosophy on Jungle Book was we would not make the computer camera do anything that a real camera could not do, so we limited the way we could move it and how fast we could move it, so it wouldn’t defy any kind of gravity. That went part and parcel with the animation and movement of the animals and the actor performing stunts that only a human can accomplish.
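That philosophy is easy to express in code. Below is a minimal, hypothetical sketch of the idea: a virtual camera whose pan rate is filtered so it can never turn or accelerate faster than a physical camera head could. The limits and names are illustrative assumptions, not the actual Jungle Book rig.

```python
import math

# Hypothetical limits, loosely modeled on what an operator on a fluid
# head can physically do; the real production values are not public.
MAX_PAN_RATE = math.radians(90.0)   # rad/s
MAX_ACCEL = math.radians(180.0)     # rad/s^2

def constrain_pan(prev_rate, requested_rate, dt):
    """Clamp a requested pan rate so the virtual camera never turns
    faster, or accelerates harder, than a physical head could."""
    max_delta = MAX_ACCEL * dt  # how much the rate may change this frame
    rate = max(prev_rate - max_delta, min(prev_rate + max_delta, requested_rate))
    return max(-MAX_PAN_RATE, min(MAX_PAN_RATE, rate))

# Per-frame update at 24fps: the operator's raw wheel input is
# filtered before it drives the CG camera.
dt = 1.0 / 24.0
pan_rate = 0.0
for raw in [2.0, 3.5, 3.5, 0.5]:  # rad/s, straight off the encoder wheels
    pan_rate = constrain_pan(pan_rate, raw, dt)
    print(f"applied pan rate: {math.degrees(pan_rate):6.2f} deg/s")
```

The same clamp applied to tilt, dolly speed and crane acceleration would keep every move inside what a real crew could execute.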

So you are in a sense limiting what you can do with the technology?
There was an operator behind the camera and behind the wheels, massaging and creating the various compositional choices that generally are not made in a computer. They’re not just setting keyframes, and because somebody’s behind the camera, this sense of live-action-derived movement is consistent from shot to shot to shot. It’s one person doing it, whereas normally on a CG film, there are as many as 50 people who are placing cameras on different characters within the same scene.

You have to come up with these analog methodologies that are all tied together without even really knowing it. Your choices at the end of the day end up being strictly artistic choices. We tapped into that for Jungle Book, and it’s what Jim tapped into when he did Avatar. The only difference between Avatar and our film is that we set ours in an instantly recognizable place, so everybody can judge whether it’s photorealistic or not.

When you start a film, do you create your own system or use something off the shelf?
With every film there is a technology advance. I typically take whatever is off-the-shelf and glue it together with something not necessarily designed to work in unison. Each year you perfect it. The only way to really keep on top of technology is by being on the forefront of it, as opposed to waiting for it to come out. Usually, we’re doing things that haven’t been done before, and invariably it causes something new and innovative.

We’re totally revamping what we did on Jungle Book to achieve the same end on my next film for Disney, but we hope to make it that much better, faster and more intuitive. We are also taking advantage of VR tools to make our job easier, more creative and faster. The faster you can create options, the more iterations you get. More iterations get you a better product sooner and help you elevate the art form by taking it to the next level.

Technology is always driven by the story. We ask ourselves what we want to achieve. What kind of shot do we want to create that creates a mood and a tone? Then once we decide what that is, we figure out what technology we need to invent, or coerce into being, to actually produce it. It’s always driven that way. For example, on Titanic, the only way I could tell that story and make these magic transitions from the Titanic to the wreck and from the wreck back to the Titanic, was by controlling the water, which was impossible. We needed to make computer-generated water that looked realistic, so we did.

THE JUNGLE BOOK (Pictured) BAGHEERA and MOWGLI. ©2016 Disney Enterprises, Inc. All Rights Reserved.

CG water was a big problem back then.
But now that’s very commonplace. The water work in Jungle Book is extraordinary compared to the crudeness of what we did on Titanic, but we started on that path, and then over the years other people took over and developed it further.

Getting back to Marty Scorsese, and how you work with him. How does having his complete trust make you better at what you do?
Marty is not as interested in the technical side as Jim is. Jim loves all this stuff, and he likes to tinker and invent. Marty’s not like that. Marty likes to tinker with emotions and explore a performance editorially. His relationship with me is, “I’m not going to micro-manage you. I’m going to tell you what feeling I want to get.” It’s very much like how he would talk to an actor about what a particular scene is about. You then start using your own creativity to come up with the idea he wants, and you call on your own experience and interpretation to realize it. You are totally engaged, and the more engaged you are, the more creative you become in terms of what the director wants to tell his story. Tell me what you want, or even don’t want, and then I’ll fill in the blanks for you.

Marty is an incredible cinema master — it’s not just the performance, it’s not just the camera, it’s not just the edit, it’s all those things working in concert to create something new. His encouragement for somebody like me is to do the same and then only show him something that’s working. He can then put his own creative stamp on it as well once he sees the possibilities properly presented. If it’s good, he’s going to use it. If it’s not good, he’ll tell you why, but he won’t tell you how to fix it. He’ll tell you why it doesn’t feel right for the scene or what would make it more eloquent. It’s a very soft, artistic push in his direction of the film. I love working with him for this very reason.

You too surround yourself with people you can trust. Can you talk about this for just a second?
I learned early on to surround myself with geniuses. You can’t be afraid of hiring people who are smarter than you are, because they bring more to the party. I want to be the lowest common denominator, not the highest. I’ll start with my idea, but if someone else can do it better, I want it to be better. I can show them what I did and tell them to make it better, and they’ll go off and come up with something that maybe I wouldn’t have thought of, or the collaboration between you and them creates a new gem.

When I was doing Titanic someone asked me how I did what I did. My answer was that I hired geniuses and told them what I wanted to accomplish creatively. I hire the best I can find, the smartest, and I listen. Sometimes I use it, sometimes I don’t. Sometimes the mistake of somebody literally misunderstanding what you meant delivers something that you never thought of. It’s like, “Wow, you completely misunderstood what I said, but I like that better, so we’re going to do that.”

Part and parcel of doing this is that you’re a little fearless. It’s like, “Well, that sounds good. There’s no proof to it, but we’re going to go for it,” as opposed to saying, “Well, no one has done it before so we better not try it.” That’s what I learned from Cameron and Marty and Bob Zemeckis. They’re fearless.

Can you mention what you’re working on now, or no?
I’m working on Lion King.

ILM’s Richard Bluff talks VFX for Marvel’s Doctor Strange

By Daniel Restuccio

Comic book fans have been waiting for over 30 years for Marvel’s Doctor Strange to come to the big screen, and dare I say it was worth the wait. This is in large part because of the technology now available to create the film’s stunning visual effects.

Fans have the option to see the film in traditional 2D, Dolby Cinema (worthy of an interstate or plane fare pilgrimage, in my opinion) and IMAX 3D. Doctor Strange, Marvel Studios’ 15th film offering, is also receiving good critical reviews and VFX Oscar buzz — it’s currently on the list of 20 films still in the running in the Visual Effects category for the 89th Academy Awards.

Marvel Doctor Strange

The unapologetically dazzling VFX shots, in many cases directly inspired by Steve Ditko’s original comic visuals, were created by multiple visual effects houses, including Industrial Light & Magic, Luma Pictures, Lola VFX, Method Studios, Rise FX, Crafty Apes, Framestore, Perception and previs house The Third Floor. Check out our interview with the film’s VFX supervisor Stephane Ceretti.

Director Scott Derrickson said in a recent Reddit chat that Doctor Strange is “a fantastical superhero movie.”

“Watching the final cut of the film was deeply satisfying,” commented Derrickson. “A filmmaker cannot depend upon critical reviews or box office for satisfaction — even if they are good. The only true reward for any artist is to pick a worthy target and hit it. When you know you’ve hit your target that is everything. On this one, I hit my target.”

Since we got an overview of how the visual effects workflow went from Ceretti, we decided to talk to one of the studios that provided VFX for the film, specifically ILM and their VFX supervisor Richard Bluff.

Richard Bluff

According to Bluff, early in pre-production Marvel presented concept art, reference images and previsualization on “what were the boundaries of what the visuals could be.” After that, he says, they had the freedom to search within those bounds.

During VFX presentations with Marvel, they frequently showed three versions of the work. “They went with the craziest version, to the point where the next time we would show three more versions, and we continued to up the ante on the crazy,” recalls Bluff.

As master coordinator of this effort for ILM, Bluff encouraged his artists, “to own the visuals and try to work out how the company could raise the quality of the work or the designs on the show to another level. How could we introduce something new that remains within the fabric of the movie?”

As a result, says Bluff, they had some amazing ideas flow from individuals on the film. Jason Parks came up with the idea of traveling through the center of a subway train as it fractured. Matt Cowey invented the notion of continually rotating the camera to heighten the sense of vertigo. Andrew Graham designed the kaleidoscope-fighting arena “largely because his personal hobby is building and designing real kaleidoscopes.”

Unique to Doctor Strange is that the big VFX sequences are all very “self-contained.” For example, ILM did the New York and Hong Kong sequences, Luma did the Dark Dimension and Method did the multiverse. ILM also designed and developed the original concept for the Eldritch Magic and provided all the shared “digital doubles” — CG-rigged, animatable versions of the actors — that tied sequences together. The digital doubles were customized to the needs of each VFX house.

Previs
In some movies, previs material is generated and thrown away. Not so with Doctor Strange. This time, ILM developed a previs workflow where they could actually hang assets on the previs and continue to develop it, so it became part of the shot from the earliest iteration.

There was extensive previs done for Marvel by The Third Floor as a creative and technical guide across the movie, and further iterations internal to ILM done by ILM’s lead visualization artist, Landis Fields.

Warning! Spoiler! Once Doctor Strange moves the New York fight scene into the mirror universe, the city starts coming apart in an M.C. Escher-meets-Chris Nolan-Inception kind of way. To make that sequence, ILM created a massive tool kit of New York set pieces and geometry, including subway cars, buildings, vehicles and fire escapes.

In the previs, Fields started breaking apart, duplicating and animating those objects, like the fire escapes, to tell the story of what a kaleidoscoping city would look like. The artists then fleshed out a sequence of shots, a.k.a. “mini beats.” The previs was absorbed into the pipeline by later switching out the rough placeholder geometry in Fields’ previs for the actual New York hero assets.
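Mechanically, that kind of swap works because shots reference assets indirectly: the previs animation keeps pointing at stable asset IDs, and upgrading a shot is just re-pointing a registry. Here is a toy sketch of the idea; all the names and paths are hypothetical, not ILM’s pipeline.

```python
# Toy sketch of proxy-to-hero asset swapping in a previs scene.
# Asset IDs and file paths are hypothetical, not ILM's pipeline.
from dataclasses import dataclass

@dataclass
class ShotElement:
    asset_id: str    # stable ID the animation references
    transform: tuple # animation blocked in previs is kept as-is

# The previs scene references assets only by ID...
scene = [ShotElement("nyc/fire_escape_03", (0, 12, -40)),
         ShotElement("nyc/subway_car_01", (5, 0, -80))]

registry = {"nyc/fire_escape_03": "previs/fire_escape_lo.abc",
            "nyc/subway_car_01": "previs/subway_car_lo.abc"}

# ...so upgrading the shot means re-pointing one entry at the hero
# build; none of the previs animation has to be redone.
registry["nyc/fire_escape_03"] = "hero/fire_escape_hi.abc"

for el in scene:
    print(el.asset_id, "->", registry[el.asset_id], "at", el.transform)
```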

Strange Cam
Landis and the ILM team also designed and built what ILM dubbed the “strange cam,” a custom 3D-printed 360° GoPro rig that had to withstand the rigors of being slung off the edge of skyscrapers. ILM wanted to capture 360 degrees of rolling footage from that vantage point to be used as moving background “plates” that could be reflected in the New York City glass buildings.
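Using a 360° plate as a reflection source comes down to a standard lat-long lookup: every reflection direction maps to a pixel of the stitched panorama. A minimal sketch follows; the axis convention is an assumption, since ILM hasn’t published the rig’s pipeline.

```python
import math

def equirect_uv(direction):
    """Map a normalized 3D reflection direction to (u, v) in [0, 1]
    on an equirectangular (lat-long) 360 plate, such as stitched
    GoPro footage. Assumes Y-up, -Z forward."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)          # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi  # latitude
    return u, v

# A ray bouncing off a glass facade straight toward the horizon
# samples the exact center of the plate.
print(equirect_uv((0.0, 0.0, -1.0)))  # (0.5, 0.5)
```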

VFX, Sound Design and the Hong Kong Sequence
One of the big challenges with the Hong Kong sequence was that time was reversing and moving forward at the same time. “What we had to do was ensure the viewer understands that time is reversing throughout that entire sequence,” says Bluff. During the tight hand-to-hand action moments that are moving forward in time, there’s not really much screen space to show time reversing in the background. So they designed the reversing destruction sequence to work in concert with the sound design. “We realized we had to move away from a continuous shower of debris toward rhythmic beats of debris being sucked out of frame.”

Bluff says the VFX shot count on the film — 1,450 shots — was actually a lot lower than on Captain America: Civil War. From a VFX point of view, the Avengers movies lean on the assets generated in Iron Man and Captain America, and the Thor movies help provide the context for what an Avengers movie would look and feel like. In Doctor Strange, “almost everything in the movie had to be designed (from scratch) because it hadn’t already existed in a previous Marvel film. It’s a brand-new character in the Marvel world.”

Bluff started development on the movie in October of 2014 and really started doing hands-on work in February of 2016, frequently traveling between Vancouver, San Francisco and London. A typical day, working out of the ILM London office, would see him get in early and immediately deal with review requests from San Francisco. Then he would jump into “dailies” in London and work with that team until the afternoon. After “nightlies” with London there was a “dailies” session with San Francisco and Vancouver; he would work with them until evening, hit the hotel, grab some dinner, come back around 11:30pm or midnight and do nightlies with San Francisco. “It just kept the team together, and we never missed a beat.”

2D vs. IMAX 3D vs. Dolby Cinema
Bluff saw the entire movie for the first time in IMAX 3D and is looking forward to seeing it in 2D. Considering that sequences in the movie are surreal and Escher-like, there’s an argument that IMAX 3D is the better way to see it, because it enhances an already bizarre version of that world. However, he believes the 2D and 3D versions are really “two different experiences.”

Dolby Cinema is the merging of Dolby Atmos — 128-channel surround sound — with the high dynamic range of Dolby Vision, plus really comfortable seats. It is, arguably, the best way to see a movie. Bluff says as far as VFX goes, high dynamic range information has been there for years. “I’m just thankful that exhibition technology is finally catching up with what’s always been there for us on the visual effects side.”
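Bluff’s point is that VFX compositing has long been done in scene-linear floating point, where values far above display white are preserved; only the final display transform used to throw that range away. A small illustrative example (values invented for demonstration):

```python
import numpy as np

# A scene-linear pixel from a bright specular highlight: well above
# 1.0 "display white" in all three channels. Values are invented.
highlight = np.array([8.0, 4.0, 2.5])

# Legacy SDR display transform: everything clips to flat white and
# the color detail is gone on screen...
print(np.clip(highlight, 0.0, 1.0))          # [1. 1. 1.]

# ...but the float data still holds the detail: stop the image down
# two stops and structure reappears, which is the range HDR displays
# such as Dolby Vision can finally show directly.
print(np.clip(highlight * 0.25, 0.0, 1.0))   # [1. 1. 0.625]
```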

During that Reddit interview, Derrickson commented, “The EDR (Extended Dynamic Range) print is unbelievable — if you’re lucky enough to live where an EDR print is playing. As for 3D and/or IMAX, see it that way if you like that format. If you don’t, see it 2D.”

Doctor Strange is probably playing in a theater near you; go see it in Dolby Cinema if you can.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

ILM welcomes Oscar-winning VFX supervisor Eric Barba

ILM (Industrial Light & Magic) has brought Academy Award-winning visual effects supervisor Eric Barba to its Vancouver-based studio as creative director. In addition to supervising effects work, Barba will also provide creative oversight across all of the studio’s projects. He will work closely with ILM Vancouver’s executive in charge, Randal Shore, and collaborate with ILM’s global talent base.

For the past two years, Barba was chief creative officer of Digital Domain. A visual effects supervisor since 1999, he supervised the visual effects on David Fincher’s Zodiac, The Girl With the Dragon Tattoo, Gone Girl and The Curious Case of Benjamin Button, for which he was honored with an Oscar and a BAFTA Film Award for Outstanding Visual Effects.

Barba often collaborates with Joseph Kosinski, having supervised work on his films Tron: Legacy and Oblivion. Most recently Barba has been consulting on a number of feature projects.

Outside of his feature work, Barba has supervised effects work on dozens of commercials for brands such as Nike, Heineken and Microsoft Xbox/Epic Games. He has directed ad campaigns for American Express, Cingular, Honda, Jaguar and Nike. He has received eight AICP Awards, and three gold and two bronze Clio Awards for his spot work.

Barba began his career as a digital artist on sci-fi programs from Steven Spielberg’s Amblin Imaging. He is a graduate of Art Center College of Design and is a member of The Academy of Motion Picture Arts & Sciences.

ILM Vancouver is currently in production on Warcraft for Duncan Jones, Luc Besson’s Valerian and David Green’s Teenage Mutant Ninja Turtles 2.

Blog: My three goals for NAB 2015

This VFX supervisor shares his NAB game plan.

By Adrian Winter

It has been about four or five years since I was at an NAB Show, so I am very much looking forward to this year’s trip. In the past, my plan has been to fly out for just a day or two, walk the floor and then take a redeye home. This year I will be there for almost the entire week, and I plan to take in as many demos and seminars as I can squeeze in.

I have three areas of focus for the convention:

The big players
Adobe, Autodesk, The Foundry and FilmLight always have a big presence at NAB, and I’ll be checking in with them to see what they have in the pipeline. I also plan to swing by a few of my other favorite exhibitor booths to see if I come across any gems. Continue reading

Behind the Title: Method VFX supervisor Alvin Cruz

NAME: Eduardo “Alvin” Cruz

COMPANY: Method Studios (@method_studios)

CAN YOU DESCRIBE YOUR COMPANY?
Method Studios is an artist-driven global studio that offers high-end visual effects for the film, commercial, television, gaming and design industries.

WHAT’S YOUR JOB TITLE?
Visual Effects Supervisor

WHAT DOES THAT ENTAIL?
A visual effects supervisor is the person who determines creative and technical approaches for Continue reading

PostChat: VFX/post supervisor Eric Alba

By Randi Altman

This week’s PostChat, the weekly Twitter conversation about post production, featured veteran VFX/post supervisor Eric Alba. Jesse Averna (@dr0id), one of the hosts of PostChat, interviewed Alba about “the relationship between editors and the visual effects departments and how the demands of Alba’s job have changed over the years from more practical to more CGI and maybe now back to more practical.”

Alba’s diverse background makes him ideally suited for this type of discussion. Alba (@alba) started in post and found his way to VFX, and his story is one of inspiration. He got his start as a videotape operator in a VTR machine room. From there he learned all aspects of post Continue reading

Zoic VFX helps time travel for ‘Hot Tub Time Machine 2’

Zoic Studios in Vancouver created 225 visual effects shots for the sequel to the popular, and silly, Hot Tub Time Machine. Zoic’s VFX supervisor Rocco Passionino travelled to New Orleans to be on hand during the shoot.

The work included visual effects to help ramp up the funny, including futuristic matte paintings, high-tech graphical interfaces for a number of devices, and vortexes of light and water. Passionino and the Zoic Vancouver office worked closely with director Steve Pink to deliver just the right visuals. The film stars Craig Robinson, Clark Duke, Rob Corddry and Adam Scott.

Since the guys use a hot tub to travel through time, the water vortex that serves as their mode of transportation to the future was hugely important to the story. To intensify this time portal for the sequel, the Zoic team crafted a water vortex that becomes a column of water and light. Passionino worked with cinematographer Declan Quinn to create an animated LED light system to simulate the interactive light from the water vortex that would later be augmented in post.

The CG team used a wide array of tools and software to delve into the dynamics and water solutions involved in creating the water vortex. Once the main column of the vortex was created using Realflow, additional elements of foam, splashes, droplets, blobs, steam, mist and debris were created in Maya with Phoenix, Fury and Vray, then added in compositing to create the shaft of the vortex.

Additional render passes for subsurface, refraction, depth, luminosity and light rays were created, along with an RGB lighting utility pass to give the compositors further flexibility to light the CG in 2D. To connect the chaos on the ground to the sky, a swirling mass of volumetric clouds was created with Phoenix and rendered in Vray.
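An RGB lighting utility pass typically packs the contribution mask of one light group into each color channel, so a compositor can rebalance the lights without a 3D re-render. The sketch below shows the general technique with NumPy stand-ins; the pass layout and names are assumptions, not Zoic’s actual setup.

```python
import numpy as np

# Stand-ins for rendered passes (H x W x 3 float images); in
# production these would be EXR layers.
h, w = 270, 480
beauty = np.random.rand(h, w, 3).astype(np.float32)

# Assumed RGB lighting utility pass: each channel is the normalized
# contribution mask of one light group (say key, fill, rim).
light_util = np.random.rand(h, w, 3).astype(np.float32)
light_util /= light_util.sum(axis=-1, keepdims=True)

def relight(beauty, light_util, gains):
    """Scale the portion of the beauty attributable to each light
    group by a per-group gain, entirely in 2D."""
    weights = (light_util * np.asarray(gains, np.float32)).sum(axis=-1)
    return beauty * weights[..., None]

# Push the key up 20% and pull the rim down 40% in comp.
graded = relight(beauty, light_util, gains=(1.2, 1.0, 0.6))
```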

Another central effect for the comedy was the visualization of a character’s existence being threatened by the time travel. To show Lou (Rob Corddry) flickering — in danger of not surviving long enough to reach 2024 — clean plates were shot on set for each scene where the effect would be needed and for moments where the actors crossed. In post production, compositors painted back in the sections that were needed and added multiple layers of static, distortion and chromatic aberration to the images to create the final effect.
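Those layers are straightforward to approximate digitally. Below is a minimal sketch of one flicker frame using NumPy: blend toward the clean plate, offset the red and blue channels for chromatic aberration, and add static. It is an illustration of the commonly used ingredients, not Zoic’s actual comp script.

```python
import numpy as np

def flicker_frame(plate, clean, amount, rng):
    """One frame of the flicker: fade the actor toward the clean
    plate, split the R/B channels, and sprinkle in static."""
    out = (1.0 - amount) * plate + amount * clean       # actor fades out
    shift = max(1, int(6 * amount))                     # aberration grows
    out[..., 0] = np.roll(out[..., 0], shift, axis=1)   # red shifts right
    out[..., 2] = np.roll(out[..., 2], -shift, axis=1)  # blue shifts left
    static = rng.standard_normal(out.shape[:2])[..., None] * 0.05 * amount
    return np.clip(out + static, 0.0, 1.0)

rng = np.random.default_rng(7)
plate = np.random.rand(270, 480, 3)  # stand-in for the actor plate
clean = np.random.rand(270, 480, 3)  # stand-in for the clean plate
frame = flicker_frame(plate, clean, amount=0.5, rng=rng)
```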

Since Hot Tub Time Machine 2 includes traveling to the future, the Zoic team also crafted a number of holographic elements, such as a holographic television and holographic phone communications. To cater to the ever-dwindling attention spans of viewers, Zoic worked with director Pink on a holographic television that allows an endless number of channels to display simultaneously.

Motion graphics designer Jeremy Price crafted the multi-planed interface and composited the content into the windows to create an interactive and holographic device that was activated by touch. This design was also used for telecommunications throughout the film.

Best practices for working with practical and digital effects

By Andre Bustanoby

As a follow-up to a short interview postPerspective did with MastersFX about working with and integrating practical and digital effects, I wanted to take this opportunity to share some insights and guidelines for producers, directors and artists who wish to blend both art forms within their film and TV projects.

Start With a Plan and Make Decisions Early
Because of today’s digital tools, it is now possible to create any image one can imagine… if one has the time, resources and talent, of course. However, despite advances in technology, big-screen images developed in the era before computer filmmaking can often be the most visually satisfying, even today.

Continue reading

MPC creates a variety of fairies, environments for Disney’s ‘Maleficent’

MPC, led by VFX supervisors Adam Valdez (based in London) and Seth Maury (based in Vancouver), completed 875 shots for Disney’s Maleficent, starring Angelina Jolie in a take on the story of Sleeping Beauty. Working closely with director Robert Stromberg and production VFX supervisor Carey Villegas, the team created a host of animated creatures and fairy world environments.

At the start of the movie, the young Maleficent flies into MPC’s full CG fairy world environment with colorful trees, lakes and waterfalls, interacting with MPC’s hero fairies as she travels. MPC’s environment team built a library of photographic elements taken from a second-unit shoot for their human and fairy environments. These included trees, rocks and bushes. This CG environment was built in Maya, using IDV Speedtree as a basis for tree geometry. The team created 15 different types of creatures, all with their own unique characteristics and features. These ranged from the larger, humanistic mushroom fairies and Wallerbogs, to the more animalistic Cheeps, smaller delicate dew fairies and water pixies.

MPC was also tasked with creating Maleficent’s castle, built as a full 3D model. A great deal of attention was paid to the texturing of the walls: the texture team layered different brick, mortar and masonry effects onto the model to enhance detail and give it a realistic look. The lighting team took inspiration from moonlit walks around Vancouver, photographing how moonlight hit different buildings. They used a combination of subtle light sources on the model, including strong internal lights and firelight, to make the castle feel as grand as possible, and the compositing team added flags and soldiers to the battlements.

One of the most iconic scenes of the movie includes an establishing shot of MPC’s CG castle with a CG crowd arriving for Aurora’s christening. MPC’s FX team created the green magic Maleficent uses to curse the baby.

In the moorland battle scene, MPC used its proprietary crowd software, ALICE, to multiply 100 soldier actors into a 1,500-member army. The team also turned the on-set standing stones into giant obelisks and added CG grass, distant trees and mountains, and developed the Dark Rider chief, four additional Dark Riders, and the boar, troll and serpent fairies.
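ALICE itself is proprietary, but the underlying crowd-replication idea is simple: reuse a pool of captured performances many times over, with randomized placement, timing and mirroring so the duplication doesn’t read. A generic, purely illustrative sketch:

```python
import random

random.seed(42)
captured = [f"soldier_take_{i:03d}" for i in range(100)]  # filmed actors

# Instance 100 performances into a 1,500-agent army, varying each
# copy so no two duplicates read as identical.
army = [{
    "take": random.choice(captured),
    "pos": (random.uniform(-200, 200), random.uniform(0, 400)),
    "time_offset": random.uniform(0.0, 2.0),  # de-sync the loops
    "flip": random.random() < 0.5,            # mirror for variety
    "scale": random.uniform(0.97, 1.03),
} for _ in range(1500)]

print(len(army), "agents from", len(captured), "captured performances")
```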

MPC also created the raven Diaval, which Maleficent transforms into a man. MPC’s animation team researched the bird’s behavior, from its flight to its feather and wing movements, spending time with a bird handler to gather photo and video reference for the CG model. To transform the bird into a human, the VFX team used fluid-simulation ink effects and deep compositing.

MPC built a CG thorn wall, which surrounds the fairy world, and used its destruction tool, Kali, for the disruption of the soil. As soldiers attempt to destroy the wall, the team added CG fireballs, trebuchets, soldiers, fire and flame FX, and destruction elements.

MPC created a full CG dragon and Great Hall interior for the climactic battle sequence. MPC worked from designs created by production concept artists and continued to develop the fire-breathing creature using Pixologic ZBrush; the team also had to transform the creature back into the raven. Digital burning embers, dragon fire and smoke FX fill the room as MPC transformed the character of Diaval from raven to dragon. Full CG soldiers were created to battle the charging dragon.

MPC’s artists also created a Disney logo using the CG Sleeping Beauty castle in place of the traditional Cinderella castle, which involved FX work to match the Disney logo’s fireworks and sparkle VFX. A full CG environment was also built to illustrate the opening voiceover and introduce the audience to the world Maleficent inhabits.

Quick chat: ‘Sleepy Hollow’ VFX supervisor Jason Zimmerman

BURBANK — The first season of Fox’s Sleepy Hollow, which recently had its season finale, featured anywhere from 30 to 300 visual effects shots per episode.

The show’s VFX supervisor, Jason Zimmerman, had three go-to houses — Synaptic VFX, Pixomondo (www.pixomondo.com) and Fuse (www.fusefx.com) — to create what the Continue reading