Tag Archives: VFX

Quick Chat: Brent Bonacorso on his Narrow World

Filmmaker Brent Bonacorso has written, directed and created visual effects for The Narrow World, which examines the sudden appearance of a giant alien creature in Los Angeles and the conflicting theories on why it’s there, what its motivations are, and why it seems to ignore all attempts at human interaction. It’s told through the eyes of three people with differing ideas of its true significance. Bonacorso shot on a Red camera with Panavision Primo lenses, along with a bit of Blackmagic Pocket Cinema Camera footage for random B-roll.

Let’s find out more…

Where did the idea for The Narrow World come from?
I was intrigued by the idea of subverting the traditional alien invasion story and using that as a way to explore how we interpret the world around us, and how our subconscious mind invisibly directs our behavior. The creature in this film becomes a blank canvas onto which the human characters project their innate desires and beliefs — its mysterious nature revealing more about the characters than the actual creature itself.

As with most ideas, it came to me in a flash, a single image that defined the concept. I was riding my bike along the beach in Venice, and suddenly saw in my head a giant Kaiju as big as a skyscraper sitting on the sand, gazing out at the sea. Not directly threatening, not exactly friendly either, with a mutual understanding with all the tiny humans around it — we don’t really understand each other at all, and probably never will. Suddenly, I knew why he was here, and what it all meant. I quickly sketched the image and the story followed.

What was the process like bringing the film to life as an independent project?
After I wrote the script, I shot principal photography with producer Thom Fennessey in two stages – first with the actor who plays Raymond Davis (Karim Saleh) and then with the actress playing Emily Field (Julia Cavanaugh).

I called in a lot of favors from my friends and connections here in LA and abroad — the highlight was getting some amazing Primo lenses and equipment from Panavision to use because they love Magdalena Górka’s (the cinematographer) work. Altogether it was about four days of principal photography, a good bit of it guerrilla style, and then shooting lots of B-roll all over the city.

Kacper Sawicki, head of Papaya Films which represents me for commercial work in Europe, got on board during post production to help bring The Narrow World to completion. Friends of mine in Paris and Luxembourg designed and textured the creature, and I did the lighting and animation in Maxon Cinema 4D and compositing in Adobe After Effects.

Our editor was the genius Jack Pyland (who cut on Adobe Premiere), based in Dallas. Sound design and color grading (via Digital Vision’s Nucoda) were completed by Polish companies Głośno and Lunapark, respectively. Our composer was Cedie Janson from Australia. So even though this was an indie project, it became an amazing global collaborative effort.

Of course, with any no-budget project like this, patience is key — lack of funds is offset by lots of time, which is free, if sometimes frustrating. Stick with it — directing is generally a war of attrition, and it’s won by the tenacious.

As a director, how did you pull off so much of the VFX work yourself, and what lessons do you have for other directors?
I realized early on in my career as a director that the more you understand about post, and the more you can do yourself, the more you can control the scope of the project from start to finish. If you truly understand the technology and what is possible with what kind of budget and what kind of manpower, it removes a lot of barriers.

I taught myself After Effects and Cinema 4D in graphic design school, and later I figured out how to make those tools work for me in visual effects and to stretch the boundaries of the short films I was making. It has proved invaluable in my career — in the early stages I did most of the visual effects in my work myself. Later on, when I began having VFX companies do the work, my knowledge and understanding of the process enabled me to communicate very efficiently with the artists on my projects.

What other projects do you have on the horizon?
In addition to my usual commercial work, I’m very excited about my first feature project coming up this year through Awesomeness Films and DreamWorks — You Get Me, starring Bella Thorne and Halston Sage.


Craig Zerouni joins Deluxe VFX as head of technology

Deluxe has named Craig Zerouni as head of technology for Deluxe Visual Effects. In this role, he will focus on continuing to unify software development and systems architecture across Deluxe’s Method studios in Los Angeles, Vancouver, New York and India, and its Iloura studios in Sydney and Melbourne, as well as LA’s Deluxe VR.

Based in LA and reporting to president/GM of Deluxe VFX and VR Ed Ulbrich, Zerouni will lead VFX and VR R&D and software development teams and systems worldwide, working closely with technology teams across Deluxe’s Creative division.

Zerouni has been working in media technology and production for nearly three decades, joining Deluxe most recently from DreamWorks, where he was director of technology at its Bangalore, India-based facility, overseeing all technology. Prior to that he spent nine years at Digital Domain, where he was first head of R&D, responsible for software strategy and teams in five locations across three countries, and then senior director of technology, overseeing software, systems, production technology, technical directors and media systems. He has also directed engineering, products and teams at software/tech companies Silicon Grail, Side Effects Software and Critical Path. In addition, he was co-founder of London-based computer animation company CFX.

Zerouni’s work has contributed to features including Tron: Legacy, Iron Man 3, Maleficent, X-Men: Days of Future Past, Ender’s Game and more than 400 commercials and TV IDs and titles. He is a member of BAFTA, ACM/SIGGRAPH, IEEE and the VES. He has served on the AMPAS Digital Imaging Technology Subcommittee and is the author of the technical reference book “Houdini on the Spot.”

Says Ulbrich on the new hire: “Our VFX work serves both the features world, which is increasingly global, and the advertising community, which is increasingly local. Behind the curtain at Method, Iloura, and Deluxe, in general, we have been working to integrate our studios to give clients the ability to tap into integrated global capacity, technology and talent anywhere in the world, while offering a high-quality local experience. Craig’s experience leading global technology organizations and distributed development teams, and building and integrating pipelines is right in line with our focus.”

Last Chance to Enter to Win an Amazon Echo… Take our Storage Survey Now!

If you’re working in post production, animation, VFX and/or VR/AR/360, please take our short survey and tell us what works (and what doesn’t work) for your day-to-day needs.

What do you need from a storage solution? Your opinion is important to us, so please complete the survey by Wednesday, March 8th.

We want to hear your thoughts… so click here to get started now!


Quick Chat: Freefolk US executive producer Celia Williams

By Randi Altman

A few months back, UK-based post house Finish purchased VFX studio Realise and renamed the company Freefolk. They also expanded into the US with a New York City-based studio. Industry vet Celia Williams, who was most recently head of production at agency Arnold NY, is heading up Freefolk US. To find out more about the recently rebranded entity, we reached out to Williams.

Can you describe Freefolk? What kind of services do you offer?
Freefolk is a team of creative artists, technicians and problem solvers who use post production as their toolbox. We offer services including high-end FilmLight Baselight color grading, remote grading, 2D and 3D visual effects, final conform, shoot supervision, animation, data management and direction of special projects. We work across the mediums of advertising, film, TV and digital content.

L-R: Celia Williams, Paul Harrison and Jason Watts.

What spurred on Freefolk’s expansion to the US?
Having carved out a reputation in London over the last 13 years as a commercials post house, the expansion to the US seemed like a natural progression for the founders, allowing them to export a boutique service and high-quality work rather than becoming another large machine in London.

Will you be offering the same services in both locations?
The services we offer in London will all be represented in New York. Color grading plays such an important role in the process these days, so we are spearheading with a Baselight suite driven by Paul Harrison and a 2D VFX department being set up by Jason Watts.

Will you share staff between New York and the UK?
Yes, there will be a sharing of resources and, obviously, experience across the offices. A great thing about opening in New York is being able to offer our staff the experience of working in a foreign city. It also gives clients who are increasingly working across multiple markets a seamless global service.

Why the rebrand from Finish to Freefolk?
The rebrand from Finish to Freefolk came about as part of the expansion into the US and the acquisition of Realise. It was also a timely opportunity to express one of the core values of the company, and the way it values its staff and clients — Freefolk is about the people involved in the process.

What does the acquisition of Realise mean to the company?
Realise has brought a wealth of experience and talent to the table. They combine creative skill and technical understanding in equal measure. They are known in both commercials and now film and TV for offering very specialized capabilities with Side Effects Houdini and customized software.

We have just completed VFX work on 400 shots over 10 episodes of NBC’s Emerald City TV series (due to be released early 2017) and have just embarked on our next long-form project. It’s really exciting to be expanding into other mediums such as TV, film, installation work, projection mapping and other experimental and experiential arenas.

You have an ad agency background. From your own experience how important is that to clients?
It’s extremely important and comforting, actually. Understanding what the producers and creatives are challenged with on a daily basis gives me the ability to offer workable solutions to their problems in a very collaborative way. They don’t have to wonder if I “get” where they’re coming from. Frankly, I do.

I think that it’s emotionally helpful as well. To know someone can be an understanding shoulder to lean on and is taking their concerns seriously is beyond important. Everyone is working at breakneck speed in our industry, which can lead to a lack of humanity in our interactions. One of the main reasons I was attracted to working with Freefolk is that they are deeply dedicated to keeping that humanity and personal touch in the way they do business.

The way that post companies service agencies has changed due to the way that products are now being marketed — online ads, social media, VR. Can you talk about that?
To be well informed and prepped as early on in the process as you can be is key. And to truly partner with the producers and creatives, as much as they need or want, is critical. What might work in one medium may be less impactful in another, so from the get-go, how do we plan to ensure all deliverables are strong, and to offer insights into new technology that might impact the outcome? It’s all about sharing and collaboration.

I may be one of the few people who’ve never really panicked about the different ways we deliver final work — our industry has always been about change, which is what keeps it interesting. At the end of the day, it’s always been about delivering content, in one form or another. So you need to know your final deliverables list and plan accordingly.

Fantastic Beasts VFX workflow employs previs and postvis

By Daniel Restuccio

Warner Bros’ Fantastic Beasts and Where to Find Them is considered by some a Harry Potter prequel and by others an entirely new J.K. Rowling franchise. Filled with nearly 1,500 VFX shots, this live-action, CG and character-driven movie put a huge emphasis on pre-pro and established the increased role of postvis in the film’s visual effects post pipeline.

For the film’s overall visual effects supervisors, Tim Burke and Christian Manz, it was a family reunion of sorts, reteaming with many of the companies and individuals that worked on the Harry Potter movies, including director David Yates and producers David Heyman, Steve Kloves, J.K. Rowling and Lionel Wigram.

According to Manz, one of the most significant aspects of this film was how visual effects were integrated into the story from the very beginning. The direction from Yates was very clear: “Make things fantastic, but not fantasy.” For every creature design presented, Yates would ask, “What would be the story behind that creature? What would that character do if the audience saw it from one moment to the next?” Says Manz, “It all had to work to support the story, but not be the story.”

Manz feels that this movie speaks to a new way of storytelling with VFX. “Visual effects is now a part of that filmmaking and storytelling team rather than being the guys who stick in everything afterwards.”

Starting in January 2015, while Burke was busy as VFX supervisor on The Legend of Tarzan, Manz worked with Framestore animation director Pablo Grillo, a Framestore art and animation team and a group of freelance concept and previs artists doing creature development and scene design. Over eight months of design sessions they created 18 main animal types based on hundreds of variations, and worked with the Framestore art department to conceive the turn-of-the-century New York City sets and set extensions.

“Of course, there were creatures we tested that didn’t make the film,” says Framestore animator Andras Ormos, “but it was about the process of whittling it down, and that was the real joy of working on this project. The creative input stretched beyond the post production stage, deciding what worked and what wouldn’t in the overall film.”

“J.K. Rowling’s wonderful script was filled with characters and creatures,” explains Grillo. “Having seen how animation is such a big part of the process in a film like this, we decided that it was important to be involved from the concept stage onwards.” The character development and scene work sessions were so impressive they actually influenced subsequent drafts of the script.

Burke came on full-time in June 2015, and they split the movie in half. Manz took the lead developing the world inside Newt’s magical case, and Burke did the “Obscurus” and the third act. Principal photography took place from August 2015 to January 2016, and they took turns on set supervising their own and each other’s VFX sequences.

With Framestore and Double Negative taking the lead, the shots were spread out among nine main VFX and three previs/postvis companies, including Cinesite, Image Engine, Method Studios, Milk Visual Effects, Moving Picture Company, Nvizible, Proof, Rodeo FX, Secret Lab, The Third Floor and others. Burke says they divided the work by “the strengths of the companies and without overlapping them too much.”

Framestore
Framestore took on the majority of the complex character animation: the Niffler, Gnarlack, the Erumpent and Pickett the Bowtruckle, as well as many goblins and elves. Grillo first tackled the Niffler, described by Rowling as “a long-snouted, burrowing creature native to Britain with a penchant for anything shiny.” The creature design was a mash-up of a spiny anteater, platypus and mole, and went through hundreds of iterations and many animated prototypes. Framestore used the skin and muscle rigging toolkit Flesh and Flex, developed for Tarzan, on the Niffler’s magic “loot stuffing” pouch.


The reason the audience is so delighted when this character first appears, explains Ormos, is that “this scene is driven by the relationship between Newt and the Niffler. There was a history we had to get across — the fact that the Niffler was notorious for escaping and pick-pocketing, and that Newt was going through the motions in trying to catch him. They understood each other and there were little looks, a language in their movement.”

Gnarlack, an American, cigar-chewing, snarky goblin, voiced and facially mocapped by actor Ron Perlman, “is one of the best digital humanoids yet,” reports Grillo. Perlman donned a Vicon Cara 3D facial motion capture headset, surrounded by four high-resolution, high-speed witness cameras. According to Framestore VFX supervisor Andy Kind, Perlman also sat in front of 98 cameras for a Facial Action Coding System (FACS) session so the team could sculpt the face directly in 3D.

“We created CG characters for the giants, elves and band ensemble,” says Kind. “Then we gave them crazy instruments, including a sousaphone/trumpet concoction.”

A 17-foot carbon fiber puppet, built by Handspring Puppet, substituted for the amorous rhinoceros Erumpent during the Central Park chase scene. It was switched out with the CG version later and dynamic simulations of shattering ice, explosive snow and water effects were added to the concluding shots. There’s this liquid, light-filled sack on the Erumpent’s forehead that Manz says, “made her slightly more unusual than a normal creature.”

“There was an awful lot of digital environment as well as the beast itself,” continues Manz. “David Yates fell in love with the postvis for this scene. It was great to be able to play with shots and offer up suggestions for the edit. It was a very organic way of filmmaking.”

Newt’s pocket-hiding creature sidekick, Pickett the Bowtruckle, took two months and 200 versions to get right. “We were told that Pickett moved too slowly at first and that he appeared too old. We played with the speed but kept his movements graceful,” explains Manz. “He didn’t really have any facial animation, but he does blow a raspberry at one point. In the end, we added more shots to get Pickett’s story to go through, as everyone just loved him.”

MPC
The Moving Picture Company (MPC) completed more than 220 shots and created the Demiguise, Occamy and Billiwig, as well as 3D set extensions of period Manhattan.

For Demiguise’s long, flowing hair and invisibility effect, MPC used their Furtility groom technology. According to MPC VFX supervisor Ferran Domenech, using Furtility “allows for the hair to move naturally and interact with the creature’s arms, legs and the environment around it.” Demiguise was animated using enhanced mocap with keyframed facial expressions.

MPC built the large feathered dragon-snake Occamy in sections to fill the real and CG-extended attic. They used Furtility once more, this time to add feathers, and they augmented the code so that in the climactic fight scene they could scale the giant version of the creature down to mouse-size. MPC’s effects team then used its in-house Kali destruction technology to wreck the attic.

Finally, MPC worked on the Billiwig, a magical bug that can change its flight mode from dragonfly to propeller plane. “This little creature has lots of character and was great fun to bring to life,” reports Domenech.

Previs and Postvis
A major technical advance for Fantastic Beasts can be found in the workflow. It’s been 15 years since the first Harry Potter movie and five years since Deathly Hallows. Over that time Burke had designed a very efficient, streamlined, mostly film-based VFX workflow.

“In the past, we were always stuck at the point where when we shot the film, it was put into editorial, they cut it and then gave it back to us — quite often with big holes where creatures would exist or environments needed to be placed,” describes Burke. “Then we would have to involve the facilities to use their real power to push things through and start blocking out all of the characters. This took quite a bit of time and would always slow the process down, and time is really the key difference with everything we do these days.”

In the past, says Burke, he might wait two months to see an early block of an animated character, “which always then restricts what you can do at the back end or restricts the director’s ability to make creative changes.”

Thankfully this wasn’t the case with Fantastic Beasts. “In the middle of the shoot, Christian and I started supervising the postvis of the scenes we’d already shot,” he explains. They assembled a 50-artist in-house postvis team comprising members of The Third Floor, Proof and Framestore. While only some of the movie was prevised, all of the movie was postvised.

“The path from previs to postvis varied from sequence to sequence,” explains Peter McDonald, previs/postvis supervisor for The Third Floor, London. “At one end of the scale, we had sequences that never really survived through shooting, while at the other end we had sequences that were followed shot-for-shot during the shoot and subsequent editorial process.”

Third Floor postvis

“As an example,” he continues, “the Demiguise and Occamy scene in the department store attic was heavily prevised. The final previs was a pretty polished and spectacular piece in its own right with some relatively sophisticated animation and a highly refined edit. This previs edit was taken onto the stage, with printouts of the shots being referenced as the shoot day progressed. What later came back our way for postvis was very similar to what had been designed in the previs, which was very satisfying from our point of view. It’s nice to know that previs can help drive a production at this level of fidelity!”

One of the benefits of this process was having a movie during editorial that had no “holes” where VFX shots were to later appear. The “postvis” was so good that it was used during audience screenings before the VFX shots were actually built and rendered.

“There were a couple of factors that elevated the postvis,” says McDonald. “Probably the primary one was integration between Framestore’s team and our team at The Third Floor London. Having them present and being supervised by Pablo Grillo guaranteed that the work we were putting together was being judged from almost a ‘finals’ point of view, as Pablo and his artists would also be the ones finishing the job in post. It meant that our postvis wasn’t a throwaway — it was the first step in the post production pipeline. This philosophy was present beyond the personnel involved. We also had creature rigs that could be translated with their animation down the line.”

Third Floor’s previs of subway rampage.

One example of a scene that followed through from previs to postvis was part of the Obscurus rampage in the subway. “Pablo and I worked very closely with artists at both Framestore and The Third Floor on this ‘sequence within a sequence,’” says McDonald. “We started with Bifrost fluid simulations created in Maya by our own senior asset builder Chris Dawson. We then had our animators distort these simulations into the virtual subway set. Through iteration, we developed the choreography of the piece and further refined its overall rhythm and shape with our previs editor. This previs then became the template for what was shot on the actual set with Eddie Redmayne and Colin Farrell in the roles of Newt and Graves. When the plate edit returned to us for postvis, we were pretty much able to drop the same distorted simulations onto the plates. The camera angles and cutting pattern in the previs edit had been followed very closely by the live-action unit. We then animated a layer of environment destruction and comped it into the shots to help tie it all together.”

During postvis, says Manz, “A character could go into a shot within a day or two. You would track the shot plate, put the character in, light it and stick it back into editorial. That sort of turnaround, that in-house work that we did, was the big, big difference in how the film worked. It allowed us to feed all that stuff to David Yates.”

Yates showed his director’s cut to the studio with every single shot of the film blocked out. There were no empty spaces. “We even got the environments in so he had a representation of every street,” says Manz. They completed a three-hour assembly of the film in about five months.

Creatively, it was very liberating, which enabled them to do additional shoots just to enhance the movie. Burke says they were designing and changing shots right up to the end. The final reveal of Johnny Depp as dark wizard Gellert Grindelwald came together at the last minute.

Fantastic Beasts is like a Harry Potter movie because it exists in the J.K. Rowling story universe and is part of the Harry Potter lore. “Where it’s similar to Potter in terms of the filmmaking,” says Manz, “is in making it feel very real and not fantasy. What I always enjoyed about the Potter films was they really felt like they were set in a real version of the UK; you could almost believe that magical sort of school existed.”

Third Floor previs

How Fantastic Beasts is different, says Burke, is that it is set in turn-of-the-century New York City, a real city, and not the magical school environment of Hogwarts. “We were dealing with adults,” continues Burke. “We’re not talking about a child growing up and running everything through a child’s life. We’re talking about a series of adults. In that sense, when we were making it, it felt like we were making a film for adults, which obviously has great appeal to children as well. But I do feel it’s more of a film for grown-ups in terms of its storyline and the issues it’s challenging and discussing.”

Someone told Manz that Fantastic Beasts “felt like it was made for the audience that read the books and watched those films and has now grown up.”

IMDb lists four new Fantastic Beasts movies in development. Burke and Manz are already in pre-production on the second, penciled in for a November 16, 2018, release date. “I think it’s fair to say,” says Burke, “that we’ll obviously be trying to expand on the universe that we’ve started to create. Newt will be there with his animals and various other characters, which are going to bring a lot more interest as the story evolves.”

Manz predicts, “It’s just trying to push the believability of the interactions and that world even further. The first film was all about creating that new world, and now it’s out there. It will be a new city (Paris), so we’ll have that challenge again, but we’ll build on what we’ve learned. You don’t often get an opportunity to work with the same team of people, so that’s going to be great for the second one.”


Blue Sky Studios’ Mikki Rose named SIGGRAPH 2019 conference chair

Mikki Rose has been named conference chair of SIGGRAPH 2019. A fur technical director at Greenwich, Connecticut-based Blue Sky Studios, Rose chaired the Production Sessions at SIGGRAPH 2016 this past July in Anaheim and has been a longtime volunteer and active member of SIGGRAPH for the last 15 years.

Rose has worked on such films as The Peanuts Movie and Hotel Transylvania. She refers to herself as a “CG hairstylist” due to her specialization in fur at Blue Sky Studios — everything from hair to cloth to feathers and even vegetation. She studied general CG production in college and holds BS degrees in computer science and digital animation from Middle Tennessee State University, as well as an MFA in digital production arts from Clemson University. Prior to Blue Sky, she lived in California and held positions with Rhythm & Hues Studios and Sony Pictures Imageworks.

“I have grown to rely on each SIGGRAPH as an opportunity for renewal of inspiration in both my professional and personal creative work. In taking on the role of chair, my goal is to provide an environment for those exact activities to others,” said Rose. “Our industries are changing and developing at an astounding rate. It is my task to incorporate new techniques while continuing to enrich our long-standing traditions.”

SIGGRAPH 2019 will take place in Los Angeles from July 29 to August 2, 2019.


Main Image: SIGGRAPH 2016 — Jim Hagarty Photography

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jake Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks into a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri, came in. Yes, you read that right: Kansas City. And his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush in Photoshop. Once the exact look was determined regarding the number of attacking roaches, they animated them in 3D and composited. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple of base roach walks and behaviors and then populated each scene with instances of those. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications to texture and modeling. Some of this affected the rig we’d built, so a lot of the animations had to be rebuilt.
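
The cycle-instancing approach described above can be sketched in a few lines. This is a hypothetical illustration, not the actual LightWave rig: a handful of hand-animated base cycles are scattered across many instances with randomized phase, playback speed and placement, so a small library of animation fills a shot without visible repetition.

```python
import random

# Hypothetical cycle names; a real setup would reference animation clips.
BASE_CYCLES = ["walk_fast", "walk_slow", "climb", "idle_twitch"]

def populate(n_roaches: int, seed: int = 1):
    """Scatter n_roaches instances, each picking a base cycle plus
    random phase, speed, position and heading."""
    rng = random.Random(seed)  # seeded so a shot is reproducible
    roaches = []
    for i in range(n_roaches):
        roaches.append({
            "id": i,
            "cycle": rng.choice(BASE_CYCLES),
            "phase": rng.uniform(0.0, 1.0),   # offset into the cycle
            "speed": rng.uniform(0.8, 1.2),   # playback-rate variation
            "pos": (rng.uniform(-5, 5), 0.0, rng.uniform(-5, 5)),
            "heading": rng.uniform(0.0, 360.0),
        })
    return roaches

swarm = populate(200)
```

Because the generator is seeded, the same swarm layout can be regenerated shot after shot while still looking randomized.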

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D tracking, 3D tracking and 3D tracking by hand for the roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those had to be painted out for many shots prior to the roaches reaching Franco.

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red light and white light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that was actually double the anamorphic squeeze to duplicate the vertical bokeh and DOF of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.
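The unsqueeze/resqueeze bookkeeping Branit describes can be sketched as simple arithmetic. This is a minimal illustration only; the frame dimensions are hypothetical, and a real pipeline would resample actual image data in a compositor such as Fusion rather than just compute sizes.

```python
SQUEEZE = 2.0  # 2:1 anamorphic squeeze factor, per the interview

def unsqueeze(width, height, squeeze=SQUEEZE):
    """Desqueeze a camera-native anamorphic frame for tracking:
    the horizontal axis expands by the squeeze factor."""
    return int(width * squeeze), height

def resqueeze(width, height, squeeze=SQUEEZE):
    """Map a desqueezed interim plate back to the original plate
    geometry so renders line up with the untouched anamorphic plates."""
    return int(width / squeeze), height

# Example: a hypothetical 2880x2160 anamorphic capture becomes a
# 5760x2160 interim plate for 2D/3D tracking, then maps back for the
# final composite over the original anamorphic plates.
tracking_plate = unsqueeze(2880, 2160)
comp_plate = resqueeze(*tracking_plate)
```

The round trip is lossless on paper, which is why tracking data pulled from the interim plates can be reapplied to the original anamorphic frames in the comp.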

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses, so I had recently developed some familiarity and techniques to deal with the unusual attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus elements vertically while the image was actually stretched the other way.

What are you working on now?
I’ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world, and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

Infinite Fiction

Republic Editorial launches design/VFX studio

Republic Editorial in Dallas has launched a design- and VFX-focused sister studio, Infinite Fiction, and leading the charge as executive producer is visual effects industry veteran Joey Cade. In her new role, she will focus on developing Infinite Fiction’s sales and marketing strategy, growing its client roster and expanding the creative team and its capabilities. More on her background in a bit.

Infinite Fiction, which is being managed by Republic partners Carrie Callaway, Chris Gipson and Keith James, focuses on high-end, narrative-driven motion design and visual effects work for all platforms, including virtual reality. Although it shares management with Republic Editorial, Infinite Fiction is a stand-alone creative shop and will service agencies, outside post houses and entertainment studios.

Infinite Fiction is housed separately, but located next door to Republic Editorial’s uptown Dallas headquarters. It adds nearly 2,000 square feet of creative space to Republic’s recently renovated 8,000 square feet and is already home to a team of motion designers, visual effects artists, CG generalists and producers.

Cade began her career in live-action production working with Hungry Man, Miramax and NBC. She gained expertise in visual effects and animation at Reel FX, which grew from a 30-person boutique to an over 300-person studio with several divisions during her tenure. As its first entertainment division executive producer, Cade won business with Sony TV, Universal, A&E Networks and ABC Family as well as produced Reel FX’s first theatrical VFX project for Disney. She broadened her skill set by launching and managing a web-based business and gained branding, marketing and advertising experience within small independent agencies, including Tractorbeam.

Infinite Fiction already has projects in its pipeline, including design-driven content pieces for TM Advertising, Dieste and Tracy Locke.

Credit: Film Frame ©2016 Marvel. All Rights Reserved.

Digging Deeper: Doctor Strange VFX supervisor Stephane Ceretti

By Daniel Restuccio

Marvel’s Doctor Strange — about an arrogant neurosurgeon who loses the use of his hands in an accident and sets off on a self-obsessed journey to find a cure — has been doing incredibly well in terms of box office. You’ve got the winning combination of Benedict Cumberbatch, Marvel, a compelling story and a ton of visual effects created by some of the biggest houses in the business, including ILM (London, San Francisco, Vancouver), Method (LA, Vancouver), Luma (LA, Melbourne), Framestore London, Lola, Animal Logic, Crafty Apes, Exceptional Minds and Technicolor VFX.

Stephane Ceretti

Leading the VFX charge was visual effects supervisor Stephane Ceretti, whose credit list reads like a Top 10 list for films based on Marvel comics, including Guardians of the Galaxy, Thor: The Dark World, Captain America: The First Avenger and X-Men: First Class. His resume is long and impressive.

We recently reached out to Ceretti to find out more about Doctor Strange‘s VFX process…

When did you start on the project? When were all the shots turned in?
I started in September 2014 as Scott Derrickson, the director, was working on the script. Production got pushed a few months while we waited for Benedict Cumberbatch to be available, but we worked extensively on previz and visual development during all this time. Production moved to London in June 2015 and shooting began in November 2015 and went until March 2016. Shots and digital asset builds got turned over as we were shooting and in post, as the post production period was very short on the film. We only had 5.5 months to do the visual effects. We finished the film sometime in October, just a few weeks before the release.

What criteria did you use to distribute the shots among the different VFX companies?  For example, was it based on specialty areas?
It’s like casting; you try to pick the best company and people for each style of effects. For example, ILM had done a lot of NYC work before, especially with Marvel on Avengers. Plus, they are a VFX behemoth, so for us it made sense to have them on board the project for these two major sequences, especially with Richard Bluff as their supervisor. He worked with my VFX producer Susan Pickett on the New York battle sequence in Avengers, and she knew he would be great for what we wanted to achieve.

What creative or technical breakthroughs were there on this project? For example, ILM talked about the 360 Dr. Strange Camera. What were some of the other things that had never been done before?
I think we pushed the envelope on a lot of visual things that had been touched before, but not to that level. We also made huge use of digital doubles extremely close to camera, both in the astral world and the magic mystery tour. It was a big ask for the vendors.

ILM said they did the VFX at IMAX 2K, were any of the VFX shots done at 4K? If yes, why?
No, we couldn’t do a 4K version for IMAX on this project. IMAX takes care of upresing the shots to IMAX resolution with their DMR process. The quality of the Alexa 65, which we used to shoot the movie, makes it a much smoother process; images were much sharper and more detailed to begin with.

It may be meaningless to talk about how many effects shots there were in the movie when it seems like every shot is a VFX shot.  Is there a more meaningful way to describe the scale of the VFX work? 
It is true that just looking at the numbers isn’t a good indication … we had 1,450 VFX shots in the film, and that’s about 900 fewer than Guardians of the Galaxy, but the shot complexity and design were way more involved, because every shot was a bit of a puzzle, plus the R&D effort.

Some shots with the Mandelbrot 3D fractals required a lot of computing power, a fully bending CG NY required tons of assets, and the destruction simulation in Hong Kong had to be extremely precise, as we were right inside the entire street as it was being rebuilt in reversed time. All of these were extremely time- and process-consuming and needed to be choreographed and designed precisely.

Can you talk about the design references Marvel gave you for the VFX work done in this movie?
Well most of the references that Marvel gave us came from the comics, especially the ones from Steve Ditko, who created all the most iconic psychedelic moments in Doctor Strange in the ‘60s and ‘70s. We also looked at a Doctor Strange comic called “The Oath,” which inspired some of the astral projection work.

How did you draw the line stylistically and creatively between impressively mind-blowing and over-the-top psychedelic?
It was always our concern to push the limits but not break them. We want to take the audience to these new places but not lose them on the way. It was a joint effort between the VFX artists and the director, editors and producers to always keep in mind what the goal of the story was and to make sure that the VFX wouldn’t take over when it was not necessary. It’s important that the VFX don’t overtake the story and the characters at any time. Sometimes we allow ourselves to shine and show off but it’s always in the service of pushing the story further.

What review and submission technology did you use to coordinate all the VFX houses? Was there a central server?
We used CineSync to review all the submissions live with the vendors. Marvel has a very strong IT department and servers that allow the various vendors to send their submission securely and quickly. We used a system called Signiant that allows all submissions to be automatically sorted and put in a database for review. It’s very efficient and necessary when you get a huge amount of submissions daily as we did toward the end of the project. Our team of amazing coordinators made sure everything was reviewed and presented to the studio so we could give immediate feedback to our vendors, who worked 24/7 around the globe to finish the movie.

What project management software did you use?
Our database is customized and we use FileMaker. Our review sessions are a mixture of CineSync (QuickTime and interactive reviews) and Tweak RV for 2K viewing and finalizing.

In talking to ILM about the film, they mentioned previs, production and postvis. Can you talk a bit about that whole workflow?
We do extensive previz/techviz and stuntviz before production, but as soon as the shots are in the can, the editors cut them into the movie. They are then turned over to our postviz team so we can quickly check that everything works, and the editors can cut in a version of the shot that represents the idea of what it will be in the end. It’s a fantastic tool that allows us to shape the film before we turn it over to the vendors, so we nail basic ideas and concepts before they get executed. Obviously, there is a lot that the vendors will add on top of the postviz, but this process is necessary for many reasons (editing, R&D, preview screenings) and is very efficient and useful.

Collectively how many hundreds of people worked on the VFX on this movie?
I would say about 1,000 people in the VFX overall. That does not count the 3D conversion people.

What was the personal challenge for you? How did you survive and thrive while working on this one project?
I worked two years on it! It was really difficult, but also very exciting. Sometimes mentally draining and challenging, but always interesting. What makes you survive is the excitement of making something special and getting to see it put together by such a talented group of people across the board. When you work on this kind of film everybody does their best, so the outcome is worth it. I think we definitely tried to do our best, and the audience seems to respond to what we did. It’s incredibly rewarding and in the end, it’s the reason why we make these movies — so that people can enjoy the ride.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

New Wacom Cintiq Pro line offers portability, updated pen, more

Wacom has introduced a new line of Wacom Cintiq Pro creative pen displays: the Cintiq Pro 13 and Cintiq Pro 16. The Cintiq Pro models feature a thin, portable form factor, making them suitable for working on the road or remotely.

Cintiq Pro’s new Pro Pen 2, according to Wacom, offers four times greater accuracy and pressure sensitivity than the previous Pro Pen. The improved Pro Pen 2 creates an intuitive experience with virtually lag-free tracking on a glass surface that produces the right amount of friction, and is coated to reduce reflection.

Additionally, the new optical bonding process reduces parallax, providing a pen-on-screen performance that feels natural and has the feedback of a traditional pen or brush. Both Cintiq Pro models also feature multi-touch for easy and fast navigation, as well as the ability to pinch, zoom and rotate illustrations, photos or models within supporting 2D or 3D creative software apps.

Both high-resolution Cintiq Pro models come with an optimized edge-to-edge etched glass workspace. The Cintiq Pro also builds on its predecessor, the Cintiq 13HD touch, offering the ExpressKey Remote as an optional accessory so users can customize their most commonly used shortcuts and modifiers when working with their most-used software applications. In addition, ergonomic features, such as ErgoFlex, fully integrated pop-out legs and an optional three-position desk stand (available in February), let users focus on their work instead of constantly adjusting for comfort.

The Wacom Cintiq Pro 13 and 16 are compatible with both Macs and PCs and feature full HD (1920×1080) and UHD (3840×2160) resolution, respectively. Both Cintiq Pro configurations deliver vivid colors, the 13-inch model providing 87 percent Adobe RGB and the 16-inch, 94 percent.

Priced at $999.95 USD, the Cintiq Pro 13 is expected to be available online and at select retail locations at the beginning of December. The Cintiq Pro 16, $1499.95 USD, is expected in February.