Tag Archives: visual effects

The A-list — Kong: Skull Island director Jordan Vogt-Roberts

By Iain Blair

Plucky explorers! Exotic locations! A giant ape! It can only mean one thing: King Kong is back… again. This time, the new Warner Bros. and Legendary Pictures’ Kong: Skull Island re-imagines the origin of the mythic Kong in an original adventure from director Jordan Vogt-Roberts (The Kings of Summer).

Jordan Vogt-Roberts

With an all-star cast that includes Tom Hiddleston, Samuel L. Jackson, Oscar-winner Brie Larson, John Goodman and John C. Reilly, it follows a diverse team of explorers as they venture deep into an uncharted island in the Pacific — as beautiful as it is treacherous — unaware that they’re crossing into the domain of the mythic Kong.

The legendary Kong was brought to life on a whole new scale by Industrial Light & Magic, with two-time Oscar-winner Stephen Rosenbaum (Avatar, Forrest Gump) serving as visual effects supervisor.

To fully immerse audiences in the mysterious Skull Island, Vogt-Roberts, his cast and filmmaking team shot across three continents over six months, capturing its primordial landscapes on Oahu, Hawaii — where shooting commenced in October 2015 — on Australia’s Gold Coast and, finally, in Vietnam, where production took place across multiple locations, some of which had never before been seen on film. Kong: Skull Island was released worldwide in 2D, 3D and IMAX beginning March 10.

I spoke with Vogt-Roberts about making the film and his love of post.

What’s the eternal appeal of doing a King Kong movie?
He’s King Kong! But the appeal is also this burden, as you’re playing with film history and this cinematic icon of pop culture. Obviously, the 1933 film is this impeccable genre story, and I’m a huge fan of creature features and people like Ray Harryhausen. I liked the idea of taking my love for all that and then giving it my own point of view, my sense of style and my voice.

With just one feature film credit, you certainly jumped in the deep end with this — pun intended — monster production, full of complex moving parts and cutting-edge VFX. How scary was it?
Every movie is scary because I throw myself totally into it. I vanish from the world. If you asked my friends, they would tell you I completely disappear. Whether it’s big or small, any film’s daunting in that sense. When I began doing shorts and my own stuff, I did the shooting, the lighting, the editing and so on, and I thrived off all that new knowledge, so even all the complex VFX stuff wasn’t that scary to me. The truly daunting part is that a film like this is two and a half years of your life! It’s a big sacrifice, but I love a big challenge like this one.

What were the biggest challenges, and how did you prepare?
How do you make it special — and relevant in 2017? I’m a bit of a masochist when it comes to a challenge, and when I made the jump to The Kings of Summer it really helped train me. But there are certain things that are the same as they always are, such as there’s never enough time or money or daylight. Then there are new things on a movie of this size, such as the sheer endurance you need and things you simply can’t prepare yourself for, like the politics involved, all the logistics and so on. The biggest thing for me was, how do I protect my voice and point of view and make sure my soul is present in the movie when there are so many competing demands? I’m proud of it, because I feel I was able to do that.

How early on did you start integrating post and all the VFX?
Very early on — even before we had the script ready. We had concept artists and began doing previs and discussing all the VFX.

Did you do a lot of previs?
I’m not a huge fan of it. Third Floor did it and it’s a great tool for communicating what’s happening and how you’re going to execute it, but there’s also that danger of feeling like you’re already making the movie before you start shooting it. Think of all the great films like Blade Runner and the early Star Wars films, all shot before they even had previs, whereas now it’s very easy to become too reliant on it; you can see a movie sequence where it just feels like you’re watching previs come to life. It’s lost that sense of life and spontaneity. We only did three previs sequences — some only partially — and I really stressed with the crew that it was only a guide.

Where did you do the post?
It was all done at Pivotal in Burbank, and we began cutting as we shot. The sound mix was done at Skywalker and we did our score in London.

Do you like the post process?
I love post. I love all aspects of production, but post is where you write the film again and where it ceases being what was on the page and what you wanted it to be. Instead you have to embrace what it wants to be and what it needs to be. I love repurposing things and changing things around and having those 3am breakthroughs! If we moved this and use that shot instead, then we can cut all that.

You had three editors — Richard Pearson, Bob Murawski and Josh Schaeffer. How did that work?
Rick and Bob ran point, and Rick was the lead. Josh was the editor who had done The Kings of Summer with me, and my shorts. He really understands my montages and comedy. It was so great that Rick and Bob were willing to bring him on, and they’re all very different editors with different skills — and all masters of their craft. They weren’t on set, except for Hawaii. Once we were really globe-trotting, they were in LA cutting.

VFX play a big role. Can you talk about working on them with VFX supervisor Jeff White and ILM, who did the majority of the effects work?
He ran the team there, and they’re all amazing. It was a dream come true for me. They’re so good at taking kernels of ideas and turning them into reality. I was able to do revisions as I got new ideas. Creating Kong was the big one, and it was very tricky because the way he moves isn’t totally realistic. It’s very stylized, and Jeff really tapped into my anime and videogame sensibility for all that. We also used Hybride and Rodeo for some shots.

What was the hardest VFX sequence to do?
The helicopter sequence was really very difficult, juggling the geography of that, with this 100-foot creature and people spread all over the island, and also the final battle sequence. The VFX team and I constantly asked ourselves, “Have we seen this before? Is it derivative? Is it redundant?” The goal was to always keep it fresh and exciting.

Where did you do the DI?
At Fotokem with colorist Dave Cole, who worked on The Lord of the Rings and so many others. I love color, and we did a lot of very unusual stuff for a movie like this, with a lot of saturation.

Did the film turn out the way you hoped?
A movie never quite turns out the way you hope or think it will, but I love the end result and I feel it represents my voice. I’m very proud of what we did.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Recreating history for Netflix’s The Crown

By Randi Altman

If you, like me, binge-watched Netflix’s The Crown, you are now considerably better educated on the English monarchy, have a very different view of Queen Elizabeth, and were impressed with the show’s access to Buckingham Palace.

Well, it turns out they didn’t actually have access to the Palace. This is where London-based visual effects house One of Us came in. While the number of shots provided for the 10-part series varied, the average was 43 per episode.

In addition to Buckingham Palace, One of Us worked on photoreal digital set extensions, crowd replications and environments, including Downing Street and London Airport. The series follows a young Elizabeth who inherits the crown after her father, King George VI, dies. We see her transition from a vulnerable young married lady to a more mature woman who takes her role as head monarch very seriously.

We reached out to One of Us VFX supervisor Ben Turner to find out more.

How early did you join the production?
One of Us was heavily involved during an eight-month pre-production process, until shooting commenced in July 2015.

Ben Turner

Did they have a clear vision of what they needed as VFX vs. practical?
As we were involved from the pre-production stage, we were able to engage in discussions about how best to approach shooting the scenes with the VFX work in mind. It was important to us and the production that actors interacted with real set pieces and the VFX work would be “thrown away” in the background, not drawing attention to itself.

Were you on set?
I visited all relevant locations, assisted on set by Jon Pugh who gathered all VFX data required. I would attend all recces at these locations, and then supervise on the shoot days.

Did you do previs? If so, what software did you use?
We didn’t do much previs in the traditional sense. We did some tech-vis to help us figure out how best to film some things, such as the arrivals at the gates of Buckingham Palace and the Coronation sequence. We also did some concept images to help inform the shoot and design of some scenes. This work was all done in Autodesk Maya, The Foundry’s Nuke and Adobe Photoshop.

Were there any challenges in working in 4K? Did your workflow change at all, and how much of your work currently is in 4K?
Working in 4K didn’t really change our workflow too much. At One of Us, we are used to working on film projects that come in all different shapes and sizes (we recently completed work on Terrence Malick’s Voyage of Time in IMAX 5K), but for The Crown we invested in the infrastructure that enabled us to take it in our stride — larger and faster disks to hold the huge amounts of data, as well as a new 4K monitor to review all the work.


What were some of your favorite, or most challenging, VFX for the show?
The most challenging work was the kind of shots that many people are already very familiar with. So the Queen’s Coronation, for example, was watched by 20 million people in 1953, and with Buckingham Palace and Downing Street being two of the most famous and recognizable addresses in the world, there wasn’t really anywhere for us to hide!

Some of my favorite shots are the ones where we were recreating real events for which there are amazing archive references, such as the tilt down on the scaffolding at Westminster Abbey on the eve of the Coronation, or the unveiling of the statue of King George VI.


Can you talk about the tools you used, and did you create any proprietary tools during the workflow?
We used Enwaii and Maya for photogrammetry, Photoshop for digital matte painting and Nuke for compositing. For crowd replication we created our own in-house 2.5D tool in Nuke, which was a card generator that gave the artist a choice of crowd elements, letting them choose the costume, angle, resolution and actions required.
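One of Us hasn’t published that in-house tool, but the selection side of a crowd-card generator like the one Turner describes can be sketched in plain Python. This is a hypothetical illustration of the idea — the names, fields and logic here are assumptions, not their actual Nuke code:

```python
from dataclasses import dataclass

@dataclass
class CrowdElement:
    """One pre-shot crowd plate: a 2D card to be placed in 3D space."""
    costume: str      # e.g. "1950s-suit", "coronation-robe"
    angle: int        # camera angle (degrees) the element was shot at
    resolution: str   # "hi" or "lo" -- lo-res cards suffice for distant rows
    action: str       # e.g. "cheer", "wave", "stand"

def pick_elements(library, costume, action, max_angle_error, shot_angle):
    """Return candidate cards matching costume/action, closest angle first."""
    matches = [e for e in library
               if e.costume == costume and e.action == action
               and abs(e.angle - shot_angle) <= max_angle_error]
    return sorted(matches, key=lambda e: abs(e.angle - shot_angle))

# Example: for a shot looking into the crowd at roughly 20 degrees,
# prefer the element photographed closest to that angle.
library = [CrowdElement("suit", 0, "hi", "cheer"),
           CrowdElement("suit", 30, "lo", "cheer"),
           CrowdElement("robe", 0, "hi", "cheer")]
best = pick_elements(library, "suit", "cheer", 45, 20)[0]
```

The appeal of a 2.5D approach like this is that each card is cheap to render and comp in Nuke, while still parallaxing believably once placed in the tracked 3D scene.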

What are you working on now?
We are currently hard at work on Season 2 of The Crown, which is going to be even bigger and more ambitious, so watch this space! Recent work also includes King Arthur: Legend Of The Sword (Warner Bros.) and Assassin’s Creed (New Regency).

Swedish post/VFX company Chimney opens in LA

Swedish post company Chimney has opened a Los Angeles facility, its first in the US and one of its 12 offices in eight countries. Founded in Stockholm in 1995, Chimney produces over 6,000 pieces for more than 60 countries each year, averaging 1,000 projects and 10,000 VFX shots. The company, which is privately held by 50 of its artists, is able to offer 24-hour service thanks to its many locations around the world.

When asked why Chimney decided to open an office in LA, founder Henric Larsson said, “It was not the palm trees and beaches that made us open up in LA. We’re film nerds and we want to work with the best talent in the world, and where do we find the top directors, DPs, ADs, CDs and producers if not in the US?”

The Chimney LA crew.

The Chimney LA team was busy from the start, working with Team One to produce two Lexus campaigns, including one that debuted during the Super Bowl. For the Lexus Man & Machine Super Bowl Spot, they took advantage of the talent at sister facilities in Poland and Sweden.

Chimney also reports that it has signed with Shortlist Mgmt, joining other companies like RSA, Caviar, Tool and No6 Editorial. Charlie McBrearty, founding partner of Shortlist Mgmt, says that Chimney has “been on our radar for quite some time, and we are very excited to be part of their US expansion. Shortlist is no stranger to managing director Jesper Palsson, and we are thrilled to be reunited with him after our past collaboration through Stopp USA.”

Tools used for VFX include Autodesk’s Flame and Maya, The Foundry’s Nuke and Adobe After Effects. Audio is via Avid Pro Tools. Color is done in Digital Vision’s Nucoda. For editing they call on Avid Media Composer, Apple Final Cut and Adobe Premiere.

Quick Chat: Brent Bonacorso on his Narrow World

Filmmaker Brent Bonacorso has written, directed and created visual effects for The Narrow World, which examines the sudden appearance of a giant alien creature in Los Angeles and the conflicting theories on why it’s there, what its motivations are, and why it seems to ignore all attempts at human interaction. It’s told through the eyes of three people with differing ideas of its true significance. Bonacorso shot on a Red camera with Panavision Primo lenses, along with a bit of Blackmagic Pocket Cinema Camera for random B-roll.

Let’s find out more…

Where did the idea for The Narrow World come from?
I was intrigued by the idea of subverting the traditional alien invasion story and using that as a way to explore how we interpret the world around us, and how our subconscious mind invisibly directs our behavior. The creature in this film becomes a blank canvas onto which the human characters project their innate desires and beliefs — its mysterious nature revealing more about the characters than the actual creature itself.

As with most ideas, it came to me in a flash, a single image that defined the concept. I was riding my bike along the beach in Venice, and suddenly, in my head, I saw a giant Kaiju as big as a skyscraper sitting on the sand, gazing out at the sea. Not directly threatening, not exactly friendly either, with a mutual understanding with all the tiny humans around it — we don’t really understand each other at all, and probably never will. Suddenly, I knew why he was here, and what it all meant. I quickly sketched the image and the story followed.

What was the process like bringing the film to life as an independent project?
After I wrote the script, I shot principal photography with producer Thom Fennessey in two stages – first with the actor who plays Raymond Davis (Karim Saleh) and then with the actress playing Emily Field (Julia Cavanaugh).

I called in a lot of favors from my friends and connections here in LA and abroad — the highlight was getting some amazing Primo lenses and equipment from Panavision to use because they love Magdalena Górka’s (the cinematographer) work. Altogether it was about four days of principal photography, a good bit of it guerrilla style, and then shooting lots of B-roll all over the city.

Kacper Sawicki, head of Papaya Films which represents me for commercial work in Europe, got on board during post production to help bring The Narrow World to completion. Friends of mine in Paris and Luxembourg designed and textured the creature, and I did the lighting and animation in Maxon Cinema 4D and compositing in Adobe After Effects.

Our editor was the genius Jack Pyland (who cut on Adobe Premiere), based in Dallas. Sound design and color grading (via Digital Vision’s Nucoda) were completed by Polish companies Głośno and Lunapark, respectively. Our composer was Cedie Janson from Australia. So even though this was an indie project, it became an amazing global collaborative effort.

Of course, with any no-budget project like this, patience is key — lack of funds is offset by lots of time, which is free, if sometimes frustrating. Stick with it — directing is generally a war of attrition, and it’s won by the tenacious.

As a director, how did you pull off so much of the VFX work yourself, and what lessons do you have for other directors?
I realized early on in my career as a director that the more you understand about post, and the more you can do yourself, the more you can control the scope of the project from start to finish. If you truly understand the technology and what is possible with what kind of budget and what kind of manpower, it removes a lot of barriers.

I taught myself After Effects and Cinema 4D in graphic design school, and later I figured out how to make those tools work for me in visual effects and to stretch the boundaries of the short films I was making. It has proved invaluable in my career — in the early stages I did most of the visual effects in my work myself. Later on, when I began having VFX companies do the work, my knowledge and understanding of the process enabled me to communicate very efficiently with the artists on my projects.

What other projects do you have on the horizon?
In addition to my usual commercial work, I’m very excited about my first feature project coming up this year through Awesomeness Films and DreamWorks — You Get Me, starring Bella Thorne and Halston Sage.

VFX house Jamm adds Flame artist Mark Holden

Santa Monica-based visual effects boutique Jamm has added veteran Flame artist Mark Holden to its roster. Holden comes to Jamm with over 20 years of experience in post production, including stints in London and Los Angeles.

It didn’t take long for Holden to dive right in at Jamm; he worked on Space 150’s Buffalo Wild Wings Super Bowl campaign directed by the Snorri Bros. and starring Brett Favre. The Super Bowl teaser kicked off the pre-game.

Holden is known not only for his visual effects talent, but also for turning projects around under tight deadlines and offering his clients as many possible solutions within the post process. This has earned him work with leading agencies such as Fallon, Mother, Saatchi & Saatchi, Leo Burnett, 180, TBWA/Chiat/Day, Goodby Silverstein & Partners, Deutsch, David & Goliath, and Team One. He has worked with brands including Lexus, Activision, Adidas, Chevy, Geico, Grammys, Kia, Lyft, Pepsi, Southwest Airlines, StubHub, McDonald’s, Kellogg’s, Stella Artois, Silk, Heineken and Olay.

 

Ingenuity Studios helps VFX-heavy spot get NASCAR-ready

Hollywood-based VFX house Ingenuity Studios recently worked on a 60-second Super Bowl spot for agency Pereira & O’Dell promoting Fox Sports’ coverage of the Daytona 500, which takes place on February 26. The ad, directed by Joseph Kahn, features people from all over the country gearing up to watch the Daytona 500, including footage from NASCAR races, drivers and, for some reason, actor James Van Der Beek.

The Ingenuity team had only two weeks to turn around this VFX-heavy spot, called Daytona Day. Some CG elements include a giant robot, race cars and crowds. While they were working on the effects, Fox was shooting footage in Charlotte, North Carolina and Los Angeles.

“When we were initially approached about this project we knew the turnaround would be a challenge,” explains creative director/VFX supervisor Grant Miller. “Editorial wasn’t fully locked until Thursday before the big game! With such a tight deadline preparing as much as we could in advance was key.”

Portions of the shoot took place at the Daytona Speedway, and since it was an off day the stadium and infield were empty. “In preparation, our CG team built the entire Daytona stadium while we were still shooting, complete with cheering CG crowds, RVs filling the interior, pit crews, etc.,” says Miller. “This meant that once shots were locked we simply needed to track the camera, adjust the lighting and render all the stadium passes for each shot.”

Additional shooting took place at the Charlotte Motor Speedway, Downtown Los Angeles and Pasadena, California.

In addition to prepping CG for set extensions, Ingenuity also got a head start on the giant robot that shows up halfway through the commercial. “Once the storyboards were approved and we were clear on the level of detail required, we took our ‘concept bot’ out of ZBrush, retopologized and unwrapped it, then proceeded to do surfacing and materials in Substance Painter. While we had some additional detailing to do, we were able to get the textures 80 percent completed by applying a variety of procedural materials to the mesh, saving a ton of manual painting.”

Other effects work included over 40 CG NASCAR vehicles to fill the track, additional cars for the traffic jam and lots of greenscreen and roto work to get the scenes shot in Charlotte into Daytona. There was also a fair bit of invisible work that included cleaning up sets, removing rain, painting out logos, etc.

Other tools used include Autodesk’s Maya, The Foundry’s Nuke and BorisFX’s Mocha.


Review: Nvidia’s new Pascal-based Quadro cards

By Mike McCarthy

Nvidia has announced a number of new professional graphics cards, filling out their entire Quadro line-up with models based on their newest Pascal architecture. At the absolute top end, there is the new Quadro GP100, which is a PCIe card implementation of their supercomputer chip. It has similar 32-bit (graphics) processing power to the existing Quadro P6000, but adds 16-bit (AI) and 64-bit (simulation). It is intended to combine compute and visualization capabilities into a single solution. It has 16GB of new HBM2 (High Bandwidth Memory) and two cards can be paired together with NVLink at 80GB/sec to share a total of 32GB between them.

This powerhouse is followed by the existing P6000 and P5000 announced last July. The next addition to the line-up is the single-slot VR-ready Quadro P4000. With 1,792 CUDA cores running at 1200MHz, it should outperform a previous-generation M5000 for less than half the price. It is similar to its predecessor the M4000 in having 8GB RAM, four DisplayPort connectors, and running on a single six-pin power connector. The new P2000 follows next with 1024 cores at 1076MHz and 5GB of RAM, giving it similar performance to the K5000, which is nothing to scoff at. The P1000, P600 and P400 are all low-profile cards with Mini-DisplayPort connectors.
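As a back-of-the-envelope check on those specs, peak single-precision throughput is conventionally estimated as cores × 2 ops per clock (one fused multiply-add) × clock speed. A quick sketch — theoretical peak only, and note that real Pascal boost clocks vary by workload:

```python
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: cores x 2 ops (FMA) x clock, in TFLOPS."""
    return cuda_cores * 2 * clock_ghz / 1000.0

# Quadro P4000: 1,792 CUDA cores at the quoted 1.2 GHz
print(round(fp32_tflops(1792, 1.2), 1))  # -> 4.3
```

That ballpark figure is why a single-slot, single six-pin card comfortably outruns the previous generation’s M5000-class parts.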

All of these cards run on PCIe Gen3 x16 and use DisplayPort 1.4, which adds support for HDR and DSC. They all support 4Kp60 output, with the higher-end cards allowing 5K and 4Kp120 displays. Nvidia also continues to push forward on high-resolution display walls, allowing up to 32 synchronized displays to be connected to a single system, provided you have enough slots for eight Quadro P4000 cards and two Quadro Sync II boards.

Nvidia also announced a number of Pascal-based mobile Quadro GPUs last month, with the mobile P4000 having roughly comparable specifications to the desktop version. But you can read the paper specs for the new cards elsewhere on the Internet. More importantly, I have had the opportunity to test out some of these new cards over the last few weeks, to get a feel for how they operate in the real world.

DisplayPorts

Testing
I was able to run tests and benchmarks with the P6000, P4000 and P2000 against my current M6000 for comparison. All of these tests were done on a top-end Dell 7910 workstation, with a variety of display outputs, primarily using Adobe Premiere Pro, since I am a video editor after all.

I ran a full battery of benchmark tests on each of the cards using Premiere Pro 2017. I measured both playback performance and encoding speed, monitoring CPU and GPU use, as well as power usage throughout the tests. I had HD, 4K, and 6K source assets to pull from, and tested monitoring with an HD projector, a 4K LCD and a 6K array of TVs. I had assets that were RAW R3D files, compressed MOVs and DPX sequences. I wanted to see how each of the cards would perform at various levels of production quality and measure the differences between them to help editors and visual artists determine which option would best meet the needs of their individual workflow.
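For anyone reproducing this kind of testing, the encode-speed half can be captured with a minimal timing harness. This is a generic sketch in Python — the command being timed is a stand-in of your choosing; the numbers in this review came from Premiere Pro and Media Encoder themselves:

```python
import statistics
import subprocess
import time

def time_command(cmd, runs=3):
    """Run a shell command several times; return (mean, stdev) wall-clock seconds.

    Multiple runs smooth out disk-cache and clock-boost variance, which
    matters when the deltas between cards are only a few percent.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.pstdev(samples)
```

Pair the wall-clock numbers with GPU/CPU utilization logging (e.g. `nvidia-smi` polling) to see whether an encode is actually GPU-bound before crediting the card.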

I started with the intuitive expectation that the P2000 would be sufficient for most HD work, but that a P4000 would be required to effectively handle 4K. I also assumed that a top-end card would be required to play back 6K files and split the image between my three Barco Escape formatted displays. And I was totally wrong.

Except when using the higher-end options within Premiere’s Lumetri-based color corrector, all of the cards were fully capable of every editing task I threw at them. To be fair, the P6000 usually renders out files about 30 percent faster than the P2000, but that is a minimal difference compared to the difference in cost. Even the P2000 was able to play back my uncompressed 6K assets onto my array of Barco Escape displays without issue. It was only when I started making heavy color changes in Lumetri that I began to observe any performance differences at all.

Lumetri

Color correction is an inherently parallel, graphics-related computing task, so this is where GPU processing really shines. Premiere’s Lumetri color tools are based on SpeedGrade’s original CUDA processing engine, and they can really harness the power of the higher-end cards. The P2000 can make basic corrections to 6K footage, but it is possible to max out the P6000 with HD footage if I adjust enough different parameters. Fortunately, most people aren’t looking for footage more stylized than the film 300, so in this case my original assumptions seem to be accurate. The P2000 can handle reasonable corrections to HD footage, the P4000 is probably a good choice for VR and 4K footage, while the P6000 is the right tool for the job if you plan to do a lot of heavy color tweaking or are working on massive frame sizes.

The other way I expected to measure a difference between the cards was in playback while rendering in Adobe Media Encoder. By default, Media Encoder pauses exports during timeline playback, but this behavior can be disabled by reopening Premiere after queuing your encode. Even with careful planning to avoid reading from the same disks the encoder was accessing, I was unable to get significantly better playback performance from the P6000 than from the P2000. This says more about the software than it does about the cards.

P6000

The largest difference I was able to consistently measure across the board was power usage, with each card averaging about 30 watts more as I stepped up from the P2000 to the P4000 to the P6000. But they all are far more efficient than the previous M6000, which frequently sucked up an extra 100 watts in the same tests. While “watts” may not be a benchmark most editors worry too much about, among other things it does equate to money for electricity. Lower wattage also means less cooling is needed, which results in quieter systems that can be kept closer to the editor without distracting from the creative process or interfering with audio editing. It also allows these new cards to be installed in smaller systems with smaller power supplies, using up fewer power connectors. My HP Z420 workstation only has one 6-pin PCIe power plug, so the P4000 is the ideal GPU solution for that system.
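To put those wattage deltas in money terms, here is a rough sketch. The electricity rate and daily hours below are assumptions for illustration, not figures from the review:

```python
def annual_cost_usd(extra_watts, hours_per_day=8.0, rate_per_kwh=0.15):
    """Yearly electricity cost of an extra sustained load.

    Assumes an 8-hour editing day and $0.15/kWh -- adjust for your market.
    """
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

print(round(annual_cost_usd(30), 2))   # ~30 W step between adjacent Quadros
print(round(annual_cost_usd(100), 2))  # ~100 W premium of the older M6000
```

The dollar amounts are modest per seat, but as the review notes, the quieter cooling and smaller power-supply requirements are often the more tangible wins in an edit bay.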

Summing Up
It appears that we have once again reached a point where hardware processing capabilities have surpassed the software capacity to use them, at least within Premiere Pro. This leads to the cards performing relatively similar to one another in most of my tests, but true 3D applications might reveal much greater differences in their performance. Further optimization of CUDA implementation in Premiere Pro might also lead to better use of these higher-end GPUs in the future.


Mike McCarthy is an online editor and workflow consultant with 10 years of experience on feature films and commercials. He has been on the forefront of pioneering new solutions for tapeless workflows, DSLR filmmaking and now multiscreen and surround video experiences. If you want to see more specific details about performance numbers and benchmark tests for these Nvidia cards, check out techwithmikefirst.com.

Fantastic Beasts VFX workflow employs previs and postvis

By Daniel Restuccio

Warner Bros.’ Fantastic Beasts and Where to Find Them is considered by some a Harry Potter prequel and by others an entirely new J.K. Rowling franchise. Filled with nearly 1,500 VFX shots, this live-action, CG and character-driven movie put a huge emphasis on pre-pro and established the increased role of postvis in the film’s visual effects post pipeline.

For the film’s overall visual effects supervisors, Tim Burke and Christian Manz, it was a family reunion of sorts, reteaming with many of the companies and individuals that worked on the Harry Potter movies, including director David Yates and producers David Heyman, Steve Kloves, J.K. Rowling and Lionel Wigram.

According to Manz, one of the most significant aspects of this film was how visual effects were integrated into the story from the very beginning. The direction from Yates was very clear: “Make things fantastic, but not fantasy.” For every creature design presented, Yates would ask, “What would be the story behind that creature? What would that character do if the audience saw it from one moment to the next?” Says Manz, “It all had to work to support the story, but not be the story.”

Manz feels that this movie speaks to a new way of storytelling with VFX. “Visual effects is now a part of that filmmaking and storytelling team rather than being the guys who stick in everything afterwards.”

Starting in January 2015, while Burke was busy as VFX supervisor on The Legend of Tarzan, Manz worked with Framestore animation director Pablo Grillo, a Framestore art and animation team and a group of freelance concept and previs artists doing creature development and scene design. Over eight months of design sessions they created 18 main animal types based on hundreds of variations, and worked with the Framestore art department to conceive the turn-of-the-century New York City sets and set extensions.

“Of course, there were creatures we tested that didn’t make the film,” says Framestore animator Andras Ormos, “but it was about the process of whittling it down, and that was the real joy of working on this project. The creative input stretched beyond the post production stage, deciding what worked and what wouldn’t in the overall film.”

“J.K. Rowling’s wonderful script was filled with characters and creatures,” explains Grillo. “Having seen how animation is such a big part of the process in a film like this, we decided that it was important to be involved from the concept stage onwards.” The character development and scene work sessions were so impressive they actually influenced subsequent drafts of the script.

Burke came on full-time in June 2015, and they split the movie in half. Manz took the lead developing the world inside Newt’s magical case, and Burke did the “Obscurus” and the third act. Principal photography took place from August 2015 to January 2016, and they took turns on set supervising their own and each other’s VFX sequences.

With Framestore and Double Negative taking the lead, the shots were spread out among nine main VFX and three previs/postvis companies including: Cinesite, Image Engine, Method Studios, Milk Visual Effects, Moving Picture Company, Nvizible, Proof, Rodeo FX, Secret Lab, The Third Floor and others. Burke says they divided the work by “the strengths of the companies and without overlapping them too much.”

Framestore
Framestore took on the majority of the complex character animation: the Niffler, Gnarlack, the Erumpent and Pickett the Bowtruckle, as well as many goblins and elves. Grillo first tackled the Niffler, described by Rowling as “a long-snouted, burrowing creature native to Britain with a penchant for anything shiny.” The creature design was a mash-up of a spiny anteater, platypus and mole, and it went through hundreds of iterations and many animated prototypes. Framestore used Flesh and Flex, the skin and muscle rigging toolkit developed for Tarzan, on the Niffler’s magic “loot stuffing” pouch.

The reason the audience is so delighted when this character first appears, explains Ormos, is that “this scene is driven by the relationship between Newt and the Niffler. There was a history we had to get across — the fact that the Niffler was notorious for escaping and pick-pocketing, and that Newt was going through the motions in trying to catch him. They understood each other and there were little looks, a language in their movement.”

Gnarlack, an American, cigar-chewing, snarky goblin, voiced and facially mocapped by actor Ron Perlman, “is one of the best digital humanoids yet,” reports Grillo. Perlman donned a Vicon Cara 3D facial motion capture headset, surrounded by four high-resolution, high-speed witness cameras. According to Framestore VFX supervisor Andy Kind, Perlman also sat in front of 98 cameras for a facial action coding system (FACS) session so the team could sculpt the face directly in 3D.

“We created CG characters for the giants, elves and band ensemble,” says Kind. “Then we gave them crazy instruments, including a sousaphone/trumpet concoction.”

A 17-foot carbon fiber puppet, built by Handspring Puppet, substituted for the amorous rhinoceros Erumpent during the Central Park chase scene. It was later switched out for the CG version, and dynamic simulations of shattering ice, explosive snow and water effects were added to the concluding shots. There’s a liquid, light-filled sack on the Erumpent’s forehead that Manz says “made her slightly more unusual than a normal creature.”

“There was an awful lot of digital environment as well as the beast itself,” continues Manz. “David Yates fell in love with the postvis for this scene. It was great to be able to play with shots and offer up suggestions for the edit. It was a very organic way of filmmaking.”

Newt’s pocket-hiding creature sidekick, Pickett the Bowtruckle, took two months and 200 versions to get right. “We were told that Pickett moved too slowly at first and that he appeared too old. We played with the speed but kept his movements graceful,” explains Manz. “He didn’t really have any facial animation, but he does blow a raspberry at one point. In the end, we added more shots to carry Pickett’s story through, as everyone just loved him.”

MPC
The Moving Picture Company (MPC) completed more than 220 shots and created the Demiguise, Occamy and Billiwig, as well as 3D set extensions of period Manhattan.

For the Demiguise’s long, flowing hair and invisibility effect, MPC used their Furtility groom technology. According to MPC VFX supervisor Ferran Domenech, using Furtility “allows for the hair to move naturally and interact with the creature’s arms, legs and the environment around it.” The Demiguise was animated using enhanced mocap with keyframed facial expressions.

MPC built the large feathered dragon-snake Occamy in sections to fill the real and CG-extended attic. They used Furtility once more, this time to add feathers, and they augmented the code so that in the climactic fight scene they could scale the giant version of the creature down to mouse-size. MPC’s effects team then used its in-house Kali destruction technology to wreck the attic.

Finally, MPC worked on the Billiwig, a magical bug that can change its flight mode from dragonfly to propeller plane. “This little creature has lots of character and was great fun to bring to life,” reports Domenech.

Previs and Postvis
A major technical advance for Fantastic Beasts can be found in the workflow. It’s been 15 years since the first Harry Potter movie and five years since Deathly Hallows. Over that time Burke had designed a very efficient, streamlined, mostly film-based VFX workflow.

“In the past, we were always stuck at the point where when we shot the film, it was put into editorial, they cut it and then gave it back to us — quite often with big holes where creatures would exist or environments needed to be placed,” describes Burke. “Then we would have to involve the facilities to use their real power to push things through and start blocking out all of the characters. This took quite a bit of time and would always slow the process down, and time is really the key difference with everything we do these days.”

In the past, says Burke, he might wait two months to see an early block of an animated character, “which always then restricts what you can do at the back end or restricts the director’s ability to make creative changes.”

Thankfully this wasn’t the case with Fantastic Beasts. “In the middle of the shoot, Christian and I started supervising the postvis of the scenes we’d already shot,” he explains. They assembled a 50-artist in-house postvis team made up of members of The Third Floor, Proof and Framestore. While some of the movie was prevised, all of the movie was postvised.

“The path from previs to postvis varied from sequence to sequence,” explains Peter McDonald, previs/postvis supervisor for The Third Floor, London. “At one end of the scale, we had sequences that never really survived through shooting, while at the other end we had sequences that were followed shot-for-shot during the shoot and subsequent editorial process.”

Third Floor postvis

“As an example,” he continues, “the Demiguise and Occamy scene in the department store attic was heavily prevised. The final previs was a pretty polished and spectacular piece in its own right with some relatively sophisticated animation and a highly refined edit. This previs edit was taken onto the stage, with printouts of the shots being referenced as the shoot day progressed. What later came back our way for postvis was very similar to what had been designed in the previs, which was very satisfying from our point of view. It’s nice to know that previs can help drive a production at this level of fidelity!”

One of the benefits of this process was having a movie during editorial that had no “holes” where VFX shots were to later appear. The “postvis” was so good that it was used during audience screenings before the VFX shots were actually built and rendered.

“There were a couple of factors that elevated the postvis,” says McDonald. “Probably the primary one was integration between Framestore’s team and our team at The Third Floor London. Having them present and being supervised by Pablo Grillo guaranteed that the work we were putting together was being judged from almost a ‘finals’ point of view, as Pablo and his artists would also be the ones finishing the job in post. It meant that our postvis wasn’t a throwaway — it was the first step in the post production pipeline. This philosophy extended beyond the personnel involved. We also had creature rigs that could be translated, with their animation, down the line.”

Third Floor’s previs of subway rampage.

One example of a scene that followed through from previs to postvis was a portion of the Obscurus rampage in the subway. “Pablo and I worked very closely with artists at both Framestore and The Third Floor on this ‘sequence within a sequence,’” says McDonald. “We started with Bifrost fluid simulations created in Maya by our own senior asset builder Chris Dawson. We then had our animators distort these simulations into the virtual subway set. Through iteration, we developed the choreography of the piece and refined its overall rhythm and shape with our previs editor. This previs then became the template for what was shot on the actual set with Eddie Redmayne and Colin Farrell in the roles of Newt and Graves. When the plate edit returned to us for postvis, we were pretty much able to drop the same distorted simulations onto the plates. The camera angles and cutting pattern in the previs edit had been followed very closely by the live-action unit. We then animated a layer of environment destruction and comped it into the shots to help tie it all together.”

During postvis, says Manz, “A character could go into a shot within a day or two. You would track the shot plate, put the character in, light it and stick it back into editorial. That sort of turnaround, that in-house work we did, was the big, big difference in how the film worked. It allowed us to feed all that stuff to David Yates.”

Yates showed his director’s cut to the studio with every single shot of the film blocked out. There were no empty spaces. “We even got the environments in so he had a representation of every street,” says Manz. They completed a three-hour assembly of the film in about five months.

Creatively, it was very liberating, and it enabled them to do additional shoots just to enhance the movie. Burke says they were designing and changing shots right up to the end; the final reveal of Johnny Depp as dark wizard Gellert Grindelwald came together at the last minute.

Fantastic Beasts is like a Harry Potter movie because it exists in the J.K. Rowling story universe and is part of the Harry Potter lore. “Where it’s similar to Potter in terms of the filmmaking,” says Manz, “is in making it feel very real and not fantasy. What I always enjoyed about the Potter films was they really felt like they were set in a real version of the UK; you could almost believe that magical sort of school existed.”

Third Floor previs

How Fantastic Beasts is different, says Burke, is that it is set in turn-of-the-century New York City, a real city, and not the magical school environment of Hogwarts. “We were dealing with adults,” continues Burke. “We’re not talking about a child growing up and running everything through a child’s life. We’re talking about a series of adults. In that sense, when we were making it, it felt like we were making a film for adults, which obviously has great appeal to children as well. But I do feel it’s more of a film for grown-ups in terms of its storyline and the issues it challenges and discusses.”

Someone told Manz they “felt like Fantastic Beasts was made for the audience that read the books and watched those films and has now grown up.”

IMDb lists four new Fantastic Beasts movies in development. Burke and Manz are already in pre-production on the second, penciled in for a November 16, 2018 release date. “I think it’s fair to say,” says Burke, “that we’ll obviously be trying to expand on the universe that we’ve started to create. Newt will be there with his animals and various other characters, which are going to bring a lot more interest as the story evolves.”

Manz predicts, “It’s just trying to push the believability of the interactions and that world even further. The first film was all about creating that new world, and now it’s out there. It will be a new city (Paris), so we’ll have that challenge again, but we’ll build on what we’ve learned. You don’t often get an opportunity to work with the same team of people, so that’s going to be great for the second one.”

Bill Hewes

Behind the Title: Click 3X executive producer Bill Hewes

NAME: Bill Hewes

COMPANY: Click 3X (@Click3X) in New York City.

CAN YOU DESCRIBE YOUR COMPANY?
We are a digital creation studio that also provides post and animation services.

WHAT’S YOUR JOB TITLE?
I am an executive producer with a roster of animation and live-action directors.

WHAT DOES THAT ENTAIL?
Overseeing everything from the initial creative pitch onward: working closely with directors, budgeting, developing the approach to a given project, overseeing line producers for shooting, animation and post, client relations and problem solving.

PGIM Prudential

One recent project was this animated spot for a Prudential Global Investment Management campaign.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably that there is no limit to the job description — it involves business skills, a creative sensibility, communication and logistics. It is not about the big decisions, but more about the hundreds of small ones made moment to moment in a given day that add up.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Winning projects.

WHAT’S YOUR LEAST FAVORITE?
Losing projects.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Depends on the day and where I am.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
A park ranger at Gettysburg.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I didn’t choose it. I had been on another career path in the maritime transportation industry and did not want to get on another ship, so I took an entry-level job at a video production company. From day one, there was not a day I did not want to go to work. I was fortunate to have had great mentors that made it possible to learn and advance.

Click it or Ticket

‘Click it or Ticket’ for the National Highway Traffic Safety Administration.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Two animated spots for Prudential Global Investment Management, commercials and a social media campaign for Ford Trucks, and two humorous online animated spots for the NHTSA’s “Click It or Ticket” campaign.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
A few years back, I took some time off and worked with a director for several months creating films for Amnesty International. Oh, and putting a Dodge Viper on a lava field on a mountain in Hawaii.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The wheel, anesthesia and my iPhone.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I share an office, so we take turns picking the music selections. Lately, we’ve been listening to a lot of Kamasi Washington, Telemann, J Mascis and My Bloody Valentine.

I would also highly recommend “I Plan to Stay a Believer” by William Parker and the album “The Inside Songs” by Curtis Mayfield.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Jeet Kune Do, boxing, Muay Thai, Kali/Escrima, knife sparring and some grappling. But I do this outside of the office.