Fantastic Beasts VFX workflow employs previs and postvis

By Daniel Restuccio

Warner Bros’ Fantastic Beasts and Where to Find Them is considered by some a Harry Potter prequel and by others an entirely new J.K. Rowling franchise. Filled with nearly 1,500 VFX shots, this live-action, CG- and character-driven movie put a huge emphasis on pre-production and established an increased role for postvis in the film’s visual effects pipeline.

For the film’s overall visual effects supervisors, Tim Burke and Christian Manz, it was a family reunion of sorts, reteaming with many of the companies and individuals that worked on the Harry Potter movies, including director David Yates and producers David Heyman, Steve Kloves, J.K. Rowling and Lionel Wigram.

According to Manz, one of the most significant aspects of this film was how visual effects were integrated into the story from the very beginning. The direction from Yates was very clear: “Make things fantastic, but not fantasy.” For every creature design presented, Yates would ask, “What would be the story behind that creature? What would that character do if the audience saw it from one moment to the next?” Says Manz, “It all had to work to support the story, but not be the story.”

Manz feels that this movie speaks to a new way of storytelling with VFX. “Visual effects is now a part of that filmmaking and storytelling team rather than being the guys who stick in everything afterwards.”

Starting in January 2015, while Burke was busy as VFX supervisor on The Legend of Tarzan, Manz worked with Framestore animation director Pablo Grillo, a Framestore art and animation team and a group of freelance concept and previs artists doing creature development and scene design. Over eight months of design sessions they created 18 main animal types based on hundreds of variations, and worked with the Framestore art department to conceive the turn-of-the-century New York City sets and set extensions.

“Of course, there were creatures we tested that didn’t make the film,” says Framestore animator Andras Ormos, “but it was about the process of whittling it down, and that was the real joy of working on this project. The creative input stretched beyond the post production stage, deciding what worked and what wouldn’t in the overall film.”

“J.K. Rowling’s wonderful script was filled with characters and creatures,” explains Grillo. “Having seen how animation is such a big part of the process in a film like this, we decided that it was important to be involved from the concept stage onwards.” The character development and scene work sessions were so impressive they actually influenced subsequent drafts of the script.

Burke came on full-time in June 2015, and they split the movie in half. Manz took the lead developing the world inside Newt’s magical case, and Burke did the “Obscurus” and the third act. Principal photography took place from August 2015 to January 2016, and they took turns on set supervising their own and each other’s VFX sequences.

With Framestore and Double Negative taking the lead, the shots were spread among nine main VFX companies and three previs/postvis companies, including Cinesite, Image Engine, Method Studios, Milk Visual Effects, Moving Picture Company, Nvizible, Proof, Rodeo FX, Secret Lab and The Third Floor, among others. Burke says they divided the work by “the strengths of the companies and without overlapping them too much.”

Framestore
Framestore took on the majority of the complex character animation: the Niffler, Gnarlack, the Erumpent and Pickett the Bowtruckle, as well as many goblins and elves. Grillo first tackled the Niffler, described by Rowling as “a long-snouted, burrowing creature native to Britain with a penchant for anything shiny.” The creature design was a mash-up of a spiny anteater, platypus and mole, and went through hundreds of iterations and many animated prototypes. Framestore used Flesh and Flex, the skin and muscle rigging toolkit developed for Tarzan, on the Niffler’s magic “loot-stuffing” pouch.

The reason the audience is so delighted when this character first appears, explains Ormos, is that “this scene is driven by the relationship between Newt and the Niffler. There was a history we had to get across — the fact that the Niffler was notorious for escaping and pick-pocketing, and that Newt was going through the motions in trying to catch him. They understood each other and there were little looks, a language in their movement.”

Gnarlack, an American, cigar-chewing, snarky goblin, voiced and facial-mocapped by actor Ron Perlman, “is one of the best digital humanoids yet,” reports Grillo. Perlman donned a Vicon Cara 3D facial motion capture headset, surrounded by four high-resolution, high-speed witness cameras. According to Framestore VFX supervisor Andy Kind, Perlman also sat in front of 98 cameras for a facial action coding system (FACS) session so the team could sculpt the face directly in 3D.

“We created CG characters for the giants, elves and band ensemble,” says Kind. “Then we gave them crazy instruments, including a sousaphone/trumpet concoction.”

A 17-foot carbon fiber puppet, built by Handspring Puppet, stood in for the amorous rhinoceros Erumpent during the Central Park chase scene. It was switched out for the CG version later, and dynamic simulations of shattering ice, explosive snow and water effects were added to the concluding shots. The Erumpent has a liquid, light-filled sack on her forehead that Manz says “made her slightly more unusual than a normal creature.”

“There was an awful lot of digital environment as well as the beast itself,” continues Manz. “David Yates fell in love with the postvis for this scene. It was great to be able to play with shots and offer up suggestions for the edit. It was a very organic way of filmmaking.”

Newt’s pocket-hiding creature sidekick, Pickett the Bowtruckle, took two months and 200 versions to get right. “We were told that Pickett moved too slowly at first and that he appeared too old. We played with the speed but kept his movements graceful,” explains Manz. “He didn’t really have any facial animation, but he does blow a raspberry at one point. In the end, we added more shots to get Pickett’s story to go through, as everyone just loved him.”

MPC
The Moving Picture Company (MPC) completed more than 220 shots and created the Demiguise, Occamy and Billiwig, as well as 3D set extensions of period Manhattan.

For the Demiguise’s long, flowing hair and invisibility effect, MPC used its Furtility groom technology. According to MPC VFX supervisor Ferran Domenech, using Furtility “allows for the hair to move naturally and interact with the creature’s arms, legs and the environment around it.” The Demiguise was animated using enhanced mocap with keyframed facial expressions.

MPC built the large feathered dragon-snake Occamy in sections to fill the real and CG-extended attic. They used Furtility once more, this time to add feathers, and they augmented the code so that in the climactic fight scene they could scale the giant version of the creature down to mouse-size. MPC’s effects team then used its in-house Kali destruction technology to wreck the attic.

Finally, MPC worked on the Billiwig, a magical bug that can change its flight mode from dragonfly to propeller plane. “This little creature has lots of character and was great fun to bring to life,” reports Domenech.

Previs and Postvis
A major technical advance for Fantastic Beasts can be found in the workflow. It’s been 15 years since the first Harry Potter movie and five years since Deathly Hallows. Over that time Burke had designed a very efficient, streamlined, mostly film-based VFX workflow.

“In the past, we were always stuck at the point where when we shot the film, it was put into editorial, they cut it and then gave it back to us — quite often with big holes where creatures would exist or environments needed to be placed,” describes Burke. “Then we would have to involve the facilities to use their real power to push things through and start blocking out all of the characters. This took quite a bit of time and would always slow the process down, and time is really the key difference with everything we do these days.”

In the past, says Burke, he might wait two months to see an early block of an animated character, “which always then restricts what you can do at the back end or restricts the director’s ability to make creative changes.”

Thankfully this wasn’t the case with Fantastic Beasts. “In the middle of the shoot, Christian and I started supervising the postvis of the scenes we’d already shot,” he explains. They assembled a 50-artist in-house postvis team made up of members of The Third Floor, Proof and Framestore. While some of the movie was prevised, all of it was postvised.

“The path from previs to postvis varied from sequence to sequence,” explains Peter McDonald, previs/postvis supervisor for The Third Floor, London. “At one end of the scale, we had sequences that never really survived through shooting, while at the other end we had sequences that were followed shot-for-shot during the shoot and subsequent editorial process.”

Third Floor postvis

“As an example,” he continues, “the Demiguise and Occamy scene in the department store attic was heavily prevised. The final previs was a pretty polished and spectacular piece in its own right with some relatively sophisticated animation and a highly refined edit. This previs edit was taken onto the stage, with printouts of the shots being referenced as the shoot day progressed. What later came back our way for postvis was very similar to what had been designed in the previs, which was very satisfying from our point of view. It’s nice to know that previs can help drive a production at this level of fidelity!”

One of the benefits of this process was having a movie during editorial that had no “holes” where VFX shots would later appear. The postvis was so good that it was used during audience screenings before the final VFX shots were built and rendered.

“There were a couple of factors that elevated the postvis,” says McDonald. “Probably the primary one was the integration between Framestore’s team and our team at The Third Floor London. Having them present and being supervised by Pablo Grillo guaranteed that the work we were putting together was being judged from almost a ‘finals’ point of view, as Pablo and his artists would also be the ones finishing the job in post. It meant that our postvis wasn’t a throwaway — it was the first step in the post production pipeline. This philosophy was present beyond the personnel involved. We also had creature rigs that could be translated, with their animation, down the line.”

Third Floor’s previs of the subway rampage.

One example of a scene that followed through from previs to postvis was the Obscurus rampage in the subway. “Pablo and I worked very closely with artists at both Framestore and The Third Floor on this ‘sequence within a sequence,’” says McDonald. “We started with Bifrost fluid simulations created in Maya by our own senior asset builder Chris Dawson. We then had our animators distort these simulations into the virtual subway set. Through iteration, we developed the choreography of the piece and further refined its overall rhythm and shape with our previs editor. This previs then became the template for what was shot on the actual set with Eddie Redmayne and Colin Farrell in the roles of Newt and Graves. When the plate edit returned to us for postvis, we were pretty much able to drop the same distorted simulations onto the plates. The camera angles and cutting pattern in the previs edit had been followed very closely by the live-action unit. We then animated a layer of environment destruction and comped it into the shots to help tie it all together.”

During postvis, says Manz, “A character could go into a shot within a day or two. You would track the shot plate, put the character in, light it and stick it back into editorial. That sort of turnaround, that in-house work that we did, was the big, big difference in how the film worked. It allowed us to feed all that stuff to David Yates.”

Yates showed his director’s cut to the studio with every single shot of the film blocked out. There were no empty spaces. “We even got the environments in so he had a representation of every street,” says Manz. They completed a three-hour assembly of the film in about five months.

Creatively, it was very liberating, which enabled them to do additional shoots just to enhance the movie. Burke says they were designing and changing shots right up to the end. The final reveal of Johnny Depp as dark wizard Gellert Grindelwald came together at the last minute.

Fantastic Beasts is like a Harry Potter movie because it exists in the J.K. Rowling story universe and is part of the Harry Potter lore. “Where it’s similar to Potter in terms of the filmmaking,” says Manz, “is in making it feel very real and not fantasy. What I always enjoyed about the Potter films was they really felt like they were set in a real version of the UK; you could almost believe that magical sort of school existed.”

Third Floor previs

How Fantastic Beasts is different, says Burke, is that it is set in turn-of-the-century New York City, a real city, and not the magical school environment of Hogwarts. “We were dealing with adults,” continues Burke. “We’re not talking about a child growing and running everything through a child’s life. We’re talking about a series of adults. In that sense, it felt when we were making it like we were making a film for adults, which obviously has great appeal to children as well. But I do feel it’s more of a film for grown-ups in terms of its storyline and the issues it’s challenging and discussing.”

Someone told Manz that it “felt like Fantastic Beasts was made for the audience that read the books and watched those films and has now grown up.”

IMDb lists four new Fantastic Beasts movies in development. Burke and Manz are already in pre-production on the second, penciled in for a November 16, 2018 release date. “I think it’s fair to say,” says Burke, “that we’ll obviously be trying to expand on the universe that we’ve started to create. Newt will be there with his animals and various other characters, which are going to bring a lot more interest as the story evolves.”

Manz predicts, “It’s just trying to push the believability of the interactions and that world even further. The first film was all about creating that new world, and now it’s out there. It will be a new city (Paris), so we’ll have that challenge again, but we’ll build on what we’ve learned. You don’t often get an opportunity to work with the same team of people, so that’s going to be great for the second one.”


Blue Sky Studios’ Mikki Rose named SIGGRAPH 2019 conference chair

Mikki Rose has been named conference chair of SIGGRAPH 2019. A fur technical director at Greenwich, Connecticut-based Blue Sky Studios, Rose chaired the Production Sessions at SIGGRAPH 2016 this past July in Anaheim and has been a longtime volunteer and active member of SIGGRAPH for the last 15 years.

Rose has worked on such films as The Peanuts Movie and Hotel Transylvania. She refers to herself as a “CG hairstylist” due to her specialization in fur at Blue Sky Studios — everything from hair to cloth to feathers and even vegetation. She studied general CG production at college and holds BS degrees in Computer Science and Digital Animation from Middle Tennessee State University, as well as an MFA in Digital Production Arts from Clemson University. Prior to Blue Sky, she lived in California and held positions with Rhythm & Hues Studios and Sony Pictures Imageworks.

“I have grown to rely on each SIGGRAPH as an opportunity for renewal of inspiration in both my professional and personal creative work. In taking on the role of chair, my goal is to provide an environment for those exact activities to others,” said Rose. “Our industries are changing and developing at an astounding rate. It is my task to incorporate new techniques while continuing to enrich our long-standing traditions.”

SIGGRAPH 2019 will take place in Los Angeles from July 29 to August 2, 2019.


Main Image: SIGGRAPH 2016 — Jim Hagarty Photography

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming service released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jake Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks into a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri, came in. Yes, you read that right, Kansas City, and his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush in Photoshop. Once the exact look was determined regarding the number of attacking roaches, they animated them in 3D and composited. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.
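
To make the idea concrete, here is a toy Python sketch of what a 2D track drives in a composite (BranitFX’s actual work was done in Fusion and SynthEyes, not Python): per-frame tracked positions place a rendered roach element onto each plate. OpenCV is assumed, and the file names and track values are invented.

```python
# Toy illustration of 2D-track-driven compositing: tracked per-frame
# positions decide where a rendered roach element is pasted onto each
# plate. File names and track values here are hypothetical.
import cv2
import numpy as np

def comp_sprite(plate, sprite, x, y):
    """Alpha-composite an RGBA sprite onto a BGR plate at (x, y).
    Assumes the sprite fits entirely within the plate bounds."""
    h, w = sprite.shape[:2]
    roi = plate[y:y + h, x:x + w].astype(np.float32)
    rgb = sprite[..., :3].astype(np.float32)
    alpha = sprite[..., 3:4].astype(np.float32) / 255.0
    plate[y:y + h, x:x + w] = (rgb * alpha + roi * (1.0 - alpha)).astype(np.uint8)
    return plate

# Per-frame (frame, x, y) positions as exported from a 2D tracker.
track = [(1, 410, 620), (2, 414, 618), (3, 419, 615)]

sprite = cv2.imread("roach.png", cv2.IMREAD_UNCHANGED)  # RGBA element
for frame, x, y in track:
    plate = cv2.imread(f"plate.{frame:04d}.png")
    cv2.imwrite(f"comp.{frame:04d}.png", comp_sprite(plate, sprite, x, y))
```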

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used those to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple of base roach walks and behaviors and then populated each scene with instances of those. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications to texture and modeling. Some of this affected the rig we’d built, so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D, 3D and by-hand 3D tracking of the roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those, too, had to be painted out for many shots prior to the roaches reaching Franco.

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red-light and white-light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave with double the anamorphic squeeze to duplicate the vertical bokeh and depth of field of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.
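
As a rough illustration of the anamorphic round trip Branit describes, here is a minimal Python sketch, assuming OpenCV and a clean 2:1 squeeze; real plates would also need the lens-distortion and light-mitigation passes he mentions, and the file names are hypothetical.

```python
# Hypothetical sketch of the 2:1 anamorphic round trip: trackers want
# square pixels, so the plate is stretched to double width for tracking
# and comp work, then squeezed back to the original raster for delivery.
import cv2

SQUEEZE = 2.0  # 2:1 anamorphic squeeze factor

def desqueeze(frame):
    # Stretch horizontally so the image has correct on-screen geometry.
    h, w = frame.shape[:2]
    return cv2.resize(frame, (int(w * SQUEEZE), h),
                      interpolation=cv2.INTER_LANCZOS4)

def resqueeze(frame):
    # Return a desqueezed frame to the original anamorphic raster.
    h, w = frame.shape[:2]
    return cv2.resize(frame, (int(w / SQUEEZE), h),
                      interpolation=cv2.INTER_AREA)

plate = cv2.imread("plate.0001.png")     # hypothetical frame
tracking_plate = desqueeze(plate)        # hand this to the tracker
cv2.imwrite("tracking_plate.0001.png", tracking_plate)
```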

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses, so I had recently developed some familiarity and techniques for dealing with the unusual attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretch out-of-focus elements vertically while the image is actually squeezed the other way.

What are you working on now?
I’ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

Republic Editorial launches design/VFX studio

Republic Editorial in Dallas has launched a design- and VFX-focused sister studio, Infinite Fiction, and leading the charge as executive producer is visual effects industry veteran Joey Cade. In her new role, she will focus on developing Infinite Fiction’s sales and marketing strategy, growing its client roster and expanding the creative team and its capabilities. More on her background in a bit.

Infinite Fiction, which is being managed by Republic partners Carrie Callaway, Chris Gipson and Keith James, focuses on high-end, narrative-driven motion design and visual effects work for all platforms, including virtual reality. Although it shares management with Republic Editorial, Infinite Fiction is a stand-alone creative shop and will service agencies, outside post houses and entertainment studios.

Infinite Fiction is housed separately, but located next door to Republic Editorial’s uptown Dallas headquarters. It adds nearly 2,000 square feet of creative space to Republic’s recently renovated 8,000 square feet and is already home to a team of motion designers, visual effects artists, CG generalists and producers.

Cade began her career in live-action production, working with Hungry Man, Miramax and NBC. She gained expertise in visual effects and animation at Reel FX, which grew from a 30-person boutique to an over-300-person studio with several divisions during her tenure. As its first entertainment division executive producer, Cade won business with Sony TV, Universal, A&E Networks and ABC Family, and produced Reel FX’s first theatrical VFX project for Disney. She broadened her skill set by launching and managing a web-based business, and gained branding, marketing and advertising experience at small independent agencies, including Tractorbeam.

Infinite Fiction already has projects in its pipeline, including design-driven content pieces for TM Advertising, Dieste and Tracy Locke.

Credit: Film Frame ©2016 Marvel. All Rights Reserved.

Digging Deeper: Doctor Strange VFX supervisor Stephane Ceretti

By Daniel Restuccio

Marvel’s Doctor Strange — about an arrogant neurosurgeon who loses the use of his hands in an accident and sets off on a self-obsessed journey to find a cure — has been doing incredibly well at the box office. You’ve got the winning combination of Benedict Cumberbatch, Marvel, a compelling story and a ton of visual effects created by some of the biggest houses in the business, including ILM (London, San Francisco, Vancouver), Method (LA, Vancouver), Luma (LA, Melbourne), Framestore London, Lola, Animal Logic, Crafty Apes, Exceptional Minds and Technicolor VFX.

Stephane Ceretti

Leading the VFX charge was visual effects supervisor Stephane Ceretti, whose credit list reads like a Top 10 list for films based on Marvel comics, including Guardians of the Galaxy, Thor: The Dark World, Captain America: The First Avenger and X-Men: First Class. His resume is long and impressive.

We recently reached out to Ceretti to find out more about Doctor Strange‘s VFX process…

When did you start on the project? When were all the shots turned in?
I started in September 2014 as Scott Derrickson, the director, was working on the script. Production got pushed a few months while we waited for Benedict Cumberbatch to be available, but we worked extensively on previz and visual development during all this time. Production moved to London in June 2015 and shooting began in November 2015 and went until March 2016. Shots and digital asset builds got turned over as we were shooting and in post, as the post production period was very short on the film. We only had 5.5 months to do the visual effects. We finished the film sometime in October, just a few weeks before the release.

What criteria did you use to distribute the shots among the different VFX companies?  For example, was it based on specialty areas?
It’s like casting; you try to pick the best company and people for each style of effects. For example, ILM had done a lot of NYC work before, especially with Marvel on Avengers. Plus, they are a VFX behemoth, so for us it made sense to have them on board the project for these two major sequences, especially with Richard Bluff as their supervisor. He worked with my VFX producer Susan Pickett on the New York battle sequence in Avengers, and she knew he would be great for what we wanted to achieve.

What creative or technical breakthroughs were there on this project? For example, ILM talked about the 360 Dr. Strange Camera. What were some of the other things that had never been done before?
I think we pushed the envelope on a lot of visual things that had been touched on before, but not to that level. We also made huge use of digital doubles extremely close to camera, both in the astral world and the magic mystery tour. It was a big ask for the vendors.

ILM said they did the VFX at IMAX 2K. Were any of the VFX shots done at 4K? If yes, why?
No, we didn’t do a 4K version for IMAX on this project. IMAX takes care of upresing the shots to IMAX resolution with their DMR process. The quality of the Alexa 65, which we used to shoot the movie, makes that a much smoother process. Images were much sharper and more detailed to begin with.

It may be meaningless to talk about how many effects shots there were in the movie when it seems like every shot is a VFX shot.  Is there a more meaningful way to describe the scale of the VFX work? 
It is true that just looking at the numbers isn’t a good indication. We had 1,450 VFX shots in the film, about 900 fewer than Guardians of the Galaxy, but the shot complexity and design were far more involved because every shot was a bit of a puzzle, plus the R&D effort.

Some shots with the Mandelbrot 3D fractals required a lot of computing power, a fully bending CG New York required tons of assets, and the destruction simulation in Hong Kong had to be extremely precise, as we were right inside the entire street as it was being rebuilt in reversed time. All of these were extremely time- and process-consuming and needed to be choreographed and designed precisely.

Can you talk about the design references Marvel gave you for the VFX work done in this movie?
Well, most of the references that Marvel gave us came from the comics, especially the ones from Steve Ditko, who created all the most iconic psychedelic moments in Doctor Strange in the ‘60s and ‘70s. We also looked at a Doctor Strange comic called “The Oath,” which inspired some of the astral projection work.

How did you draw the line stylistically and creatively between impressively mind-blowing and over-the-top psychedelic?
It was always our concern to push the limits but not break them. We want to take the audience to these new places but not lose them on the way. It was a joint effort between the VFX artists and the director, editors and producers to always keep in mind what the goal of the story was and to make sure that the VFX wouldn’t take over when it was not necessary. It’s important that the VFX don’t overtake the story and the characters at any time. Sometimes we allow ourselves to shine and show off but it’s always in the service of pushing the story further.

What review and submission technology did you use to coordinate all the VFX houses? Was there a central server?
We used CineSync to review all the submissions live with the vendors. Marvel has a very strong IT department and servers that allow the various vendors to send their submission securely and quickly. We used a system called Signiant that allows all submissions to be automatically sorted and put in a database for review. It’s very efficient and necessary when you get a huge amount of submissions daily as we did toward the end of the project. Our team of amazing coordinators made sure everything was reviewed and presented to the studio so we could give immediate feedback to our vendors, who worked 24/7 around the globe to finish the movie.
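
As a generic illustration of that kind of automated ingest (a hypothetical sketch, not Marvel’s or Signiant’s actual system), a watch-folder script might parse vendor, shot and version from each arriving file and log it to a review database:

```python
# Hypothetical watch-folder ingest: new submissions land in a folder,
# get parsed into vendor/shot/version, and are logged for review.
import re
import sqlite3
from pathlib import Path

db = sqlite3.connect("review.db")
db.execute("""CREATE TABLE IF NOT EXISTS submissions
              (vendor TEXT, shot TEXT, version INTEGER, path TEXT)""")

# Assumed naming convention: vendor_shot_v###.mov, e.g. ilm_ny0120_v004.mov
PATTERN = re.compile(r"(?P<vendor>[a-z0-9]+)_(?P<shot>[a-z0-9]+)_v(?P<version>\d+)\.mov")

for f in Path("incoming").glob("*.mov"):
    m = PATTERN.match(f.name)
    if m:
        db.execute("INSERT INTO submissions VALUES (?, ?, ?, ?)",
                   (m["vendor"], m["shot"], int(m["version"]), str(f)))
db.commit()
db.close()
```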

What project management software did you use?
Our database is customized, and we use FileMaker. Our review sessions are a mixture of CineSync (QuickTime and interactive reviews) and Tweak RV for 2K viewing and finalizing.

In talking to ILM about the film, they mentioned previs, production and postvis. Can you talk a bit about that whole workflow?
We do extensive previz/techviz and stuntviz before production, but as soon as the shots are in the can, editors cut them into the movie. They are then turned over to our postviz team so we can quickly check that everything works, and the editors can cut in a version of the shot that represents the idea of what it will be in the end. It’s a fantastic tool that allows us to shape the film before we turn it over to the vendors, so we nail basic ideas and concepts before they get executed. Obviously, there is a lot that the vendors will add on top of the postviz, but this process is necessary for a lot of reasons (editing, R&D, preview screenings) and is very efficient and useful.

Collectively how many hundreds of people worked on the VFX on this movie?
I would say about 1,000 people in the VFX overall. That does not count the 3D conversion people.

What was the personal challenge for you? How did you survive and thrive while working on this one project?
I worked two years on it! It was really difficult, but also very exciting. Sometimes mentally draining and challenging, but always interesting. What makes you survive is the excitement of making something special and getting to see it put together by such a talented group of people across the board. When you work on this kind of film everybody does their best, so the outcome is worth it. I think we definitely tried to do our best, and the audience seems to respond to what we did. It’s incredibly rewarding and in the end, it’s the reason why we make these movies — so that people can enjoy the ride.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

New Wacom Cintiq Pro line offers portability, updated pen, more

Wacom has introduced a new line of Cintiq Pro creative pen displays: the Cintiq Pro 13 and Cintiq Pro 16. Both feature a thin, portable form factor, making them suitable for working on the road or remotely.

Cintiq Pro’s new Pro Pen 2, according to Wacom, offers four times greater accuracy and pressure sensitivity than the previous Pro Pen. The improved Pro Pen 2 creates an intuitive experience with virtually lag-free tracking on a glass surface that produces the right amount of friction, and is coated to reduce reflection.

Additionally, the new optical bonding process reduces parallax, providing a pen-on-screen performance that feels natural and has the feedback of a traditional pen or brush. Both Cintiq Pro models also feature multi-touch for easy and fast navigation, as well as the ability to pinch, zoom and rotate illustrations, photos or models within supporting 2D or 3D creative software apps.

Both high-resolution Cintiq Pro models come with an optimized edge-to-edge etched-glass workspace. The Cintiq Pro also builds on its predecessor, the Cintiq 13HD touch, offering the ExpressKey Remote as an optional accessory so users can customize their most commonly used shortcuts and modifiers for the software they work in most. In addition, ergonomic features, such as ErgoFlex, fully integrated pop-out legs and an optional three-position desk stand (available in February), let users focus on their work instead of constantly adjusting for comfort.

The Wacom Cintiq Pro 13 and 16 are compatible with both Macs and PCs and feature full HD (1920×1080) and UHD (3840×2160) resolution, respectively. Both configurations deliver vivid color, with the 13-inch model covering 87 percent of the Adobe RGB gamut and the 16-inch model 94 percent.

Priced at $999.95 USD, the Cintiq Pro 13 is expected to be available online and at select retail locations at the beginning of December. The Cintiq Pro 16, $1499.95 USD, is expected in February.

GenPop’s Bill Yukich directs, edits gritty open for Amazon’s Goliath 

Director/editor Bill Yukich helmed the film noir-ish opening title sequence for Amazon’s new legal drama, Goliath. Produced by LA-based content creation studio GenPop, the black-and-white intro starts with Goliath lead actor Billy Bob Thornton jumping into the ocean. While underwater, smoking a cigarette and holding a briefcase, he casually strolls through rooms filled with smoke and fire. At the end of the open, he rises from the water as the Santa Monica Pier appears next to him and the picture turns from B&W to color. The Silent Comedy’s “Bartholomew” plays throughout.

The ominous backdrop of a man underwater but not drowning is a perfect visual description of Thornton’s role as disgraced lawyer Billy McBride. Yukich’s visuals, he says, are meant to strike a balance between dreamlike and menacing.

The approved concept called for a dry shoot, so Yukich came up with solutions to make it seem as though the sequence was actually filmed underwater. Shooting on a Red Magnesium Weapon camera, Yukich used a variety of in-camera techniques to achieve the illusion of water, smoke and fire existing within the same world, including the ingenious use of smoke to mimic the movement of crashing waves.

After wrapping the live-action shoot with Thornton, Yukich edited and color corrected the sequence. The VFX work was mostly supplementary, used to enhance the practical effects captured on set, such as adding extra fireballs into the frame to make the pyrotechnics feel fuller. Editing was done in Adobe Premiere, while VFX and color were handled in Autodesk Flame. In the end, 80 percent was live action and only 20 percent visual effects.

Once post production was done, Yukich projected the sequence onto a screen that was submerged underwater and reshot the projected footage. Though technically challenging, Yukich says, this Inception-style method of reshooting the footage gave the film the organic quality he was looking for.

Yukich recently worked as lead editor for Beyoncé’s visual album Lemonade. Stepping behind the lens was a natural progression for Yukich, who began directing concerts for bands like Godsmack and The Hollywood Undead, as well as music videos for HIM, Vision of Disorder and The Foo Fighters.

Marvel’s Victoria Alonso to receive VES Visionary Award

The VES (Visual Effects Society) has named Victoria Alonso, producer and Marvel Studios EVP of production, as the next recipient of its Visionary Award in recognition of her contributions to visual arts and filmed entertainment. The award will be presented to Alonso at the 15th Annual VES Awards on February 7 at the Beverly Hilton.

The VES Visionary Award, voted on by the VES board of directors, “recognizes an individual who has uniquely and consistently employed the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.” VES will honor Alonso for her dedication to the industry and advancement of storytelling through visual effects.

Alonso is currently executive producing James Gunn’s Guardians of the Galaxy Vol. 2 and Taika Waititi’s Thor: Ragnarok. In her executive role, she oversees post and visual effects for Marvel’s slate. She executive produced Scott Derrickson’s Doctor Strange, Joe and Anthony Russo’s Captain America: Civil War, Peyton Reed’s Ant-Man, Joss Whedon’s Avengers: Age of Ultron, James Gunn’s Guardians of the Galaxy, Joe and Anthony Russo’s Captain America: The Winter Soldier, Alan Taylor’s Thor: The Dark World and Shane Black’s Iron Man 3, as well as Marvel’s The Avengers for Joss Whedon. She co-produced Iron Man and Iron Man 2 with director Jon Favreau, Kenneth Branagh’s Thor and Joe Johnston’s Captain America: The First Avenger.

Alonso’s career began as a commercial VFX producer. From there, she VFX-produced numerous feature films, working with such directors as Ridley Scott (Kingdom of Heaven), Tim Burton (Big Fish) and Andrew Adamson (Shrek), to name a few.

Over the years, Alonso’s dedication to the industry has been admired and her achievements recognized. Alonso was the keynote speaker at the 2014 Visual Effects Society Summit, where she exemplified her role as an advocate for women in the visual effects industry. In 2015, she was an honoree of the New York Women in Film & Television’s Muse Award for Outstanding Vision and Achievement.  This past January she was presented with the Advanced Imaging Society’s Harold Lloyd Award and was recently named to Variety’s 2016 Power of Women L.A. Impact Report, which spotlights creatives and executives who’ve ‘rocked’ the industry in the past year.

Alonso is in good company. Previous recipients of the VES Visionary Award include Christopher Nolan, Ang Lee, Alfonso Cuarón, J.J. Abrams and Syd Mead.

Behind the Title: Reel FX editor Chris Collins

NAME: Chris Collins
 
COMPANY: Reel FX (@wearereelfx) in Dallas
 
CAN YOU DESCRIBE YOUR COMPANY?
Reel FX is made up of directors, editors, animators, VFX artists, audio engineers and more. We work on everything from feature-length projects to commercials to VR/360 experiences.

WHAT’S YOUR JOB TITLE?
Editor

WHAT DOES THAT ENTAIL?
What it means to be an editor depends on what kind of editor you ask. If you ask me, the editor is the final director — the person responsible for compiling and composing the hard work of production into a finalized coherent piece of media. Sometimes it’s simple and sometimes there is a lot of creative problem-solving. Sometimes you only cut footage, sometimes you dive into Photoshop, After Effects and other programs to execute a vision. Sometimes there is only one way to make a video work, and sometimes there are infinite ways a piece can be cut. It all depends on the concept and production.

Now with VR, a whole new aspect of editing has opened up by being able to put on a headset and be transported into your footage. I couldn’t be more excited to see the new places that VR can take editing.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think people look at editors and think the job is easy because they sit in a cozy office on the computer… and sometimes, they’re not wrong. But there is a lot of hidden stress, problem-solving and creativity that is invisible within a finished piece of media. They may watch a final cut and never notice all the things an editor did or fixed — and that’s what makes a good editor.

WHAT DO YOU EDIT ON?
I cut on Avid Media Composer and Adobe Premiere, but I also use Adobe’s After Effects and Photoshop in my work.

DO YOU HAVE A FAVORITE PLUG-IN?
Right now it would have to be Mettle’s Skybox VR Player because it allows me to edit and view my cut of 360 footage within the Oculus headset — so ridiculously cool!

WHAT’S YOUR FAVORITE PART OF THE JOB?
Screening a cut to someone for the first time and watching their reaction.

WHAT’S YOUR LEAST FAVORITE?
The fact that the majority of people will never see or know all the unused footage and options that didn’t make the cut.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I’d have to say those first few hours in the morning with my coffee and late at night after hours because that is when I am the most creative.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Photography.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I’ve been shooting and editing videos since I was a little kid. It carried on into high school and then into college, since that’s really what my hobby and passion was. It was one of the only things I was good at besides video games so it seemed like a no-brainer.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
TGI Fridays Countdown, Ram Division of Labor, Jeep Renegade campaign, Tostitos Recipe Videos and Texas Health Resources.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
A video I cut called Jeep Legendary Lives. It started out as a personal project of mine that I was cutting after hours, and eventually became part of a presentation video that opened the Detroit Auto Show. It’s also one of the first things that got my foot in the door as an editor.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Phone. Computer. Camera.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Facebook. Instagram. Twitter.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? 
Only if the footage does not have audio and I am in the organization and melting phase. It’s usually some sort of chill electronic — typically instrumental.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Playing video games or taking photos really helps to distract my brain, but typically all I have to do is remind myself that I am getting paid to do something I love and something that I’ve been passionate about since I was a kid. With that thought, it’s hard to be anything but grateful.

Ronen Tanchum brought on to run The Artery’s new AR/VR division

New York City’s The Artery has named Ronen Tanchum head of its newly launched virtual reality/augmented reality division. He will serve as creative director/technical director.

Tanchum has a rich VFX background, having produced complex effects setups and overseen digital tools development for feature films including Deadpool, Transformers, The Amazing Spider-Man, Happy Feet 2, Teenage Mutant Ninja Turtles and The Wolverine. He is also the creator of the original VR film When We Land: Young Yosef. His work on The Future of Music — a 360-degree virtual experience from director Greg Barth and Phenomena Labs, which immerses the viewer in a surrealist musical space — won the D&AD Silver Award in the “Best Branded Content” category in 2016.

“VR today stands at just the tip of the iceberg,” says Tanchum. “Before VR came along, we were just observers and controlled our worlds through a mouse and a keyboard. Through the VR medium, humans become active participants in the virtual world — we get to step into our own imaginations with a direct link to our brains for the first time, experiencing the first impressions of a virtual world. As creators, VR offers us a very powerful tool by which to present a unique new experience.”

Tanchum says the first things he asks a potential new VR client are, “Why VR? What is the role of VR in your story?” “Coming from our long experience in the CG world working on highly demanding creative visual projects, we at The Artery have evolved our collective knowledge and developed a strong pipeline for this new VR platform,” he explains, adding that The Artery’s new division is currently gearing up for a big VR project for a major brand. “We are using it to its fullest to tell stories. We inform our clients that VR shouldn’t be created just because it’s ‘cool.’ The new VR platform should play an integral part in the storyline itself — a well-crafted VR experience should embellish and complement the story.”