Tag Archives: previs

Fantastic Beasts VFX workflow employs previs and postvis

By Daniel Restuccio

Warner Bros’ Fantastic Beasts and Where to Find Them is considered by some a Harry Potter prequel and by others an entirely new J.K. Rowling franchise. Filled with nearly 1,500 VFX shots, this live-action, CG and character-driven movie put a huge emphasis on pre-pro and established the increased role of postvis in the film’s visual effects post pipeline.

For the film’s overall visual effects supervisors, Tim Burke and Christian Manz, it was a family reunion of sorts, reteaming with many of the companies and individuals that worked on the Harry Potter movies, including director David Yates and producers David Heyman, Steve Kloves, J.K. Rowling and Lionel Wigram.

According to Manz, one of the most significant aspects of this film was how visual effects were integrated into the story from the very beginning. The direction from Yates was very clear: “Make things fantastic, but not fantasy.” For every creature design presented, Yates would ask, “What would be the story behind that creature? What would that character do if the audience saw it from one moment to the next?” Says Manz, “It all had to work to support the story, but not be the story.”

Manz feels that this movie speaks to a new way of storytelling with VFX. “Visual effects is now a part of that filmmaking and storytelling team rather than being the guys who stick in everything afterwards.”

Starting in January 2015, while Burke was busy as VFX supervisor on The Legend of Tarzan, Manz worked with Framestore animation director Pablo Grillo, a Framestore art and animation team and a group of freelance concept and previs artists doing creature development and scene design. Over eight months of design sessions they created 18 main animal types based on hundreds of variations, and worked with the Framestore art department to conceive the turn-of-the-century New York City sets and set extensions.

“Of course, there were creatures we tested that didn’t make the film,” says Framestore animator Andras Ormos, “but it was about the process of whittling it down, and that was the real joy of working on this project. The creative input stretched beyond the post production stage, deciding what worked and what wouldn’t in the overall film.”

“J.K. Rowling’s wonderful script was filled with characters and creatures,” explains Grillo. “Having seen how animation is such a big part of the process in a film like this, we decided that it was important to be involved from the concept stage onwards.” The character development and scene work sessions were so impressive they actually influenced subsequent drafts of the script.

Burke came on full-time in June 2015, and they split the movie in half. Manz took the lead developing the world inside Newt’s magical case, and Burke did the “Obscurus” and the third act. Principal photography took place from August 2015 to January 2016, and they took turns on set supervising their own and each other’s VFX sequences.

With Framestore and Double Negative taking the lead, the shots were spread across nine main VFX companies and three previs/postvis companies, including Cinesite, Image Engine, Method Studios, Milk Visual Effects, Moving Picture Company, Nvizible, Proof, Rodeo FX, Secret Lab and The Third Floor, among others. Burke says they divided the work by “the strengths of the companies and without overlapping them too much.”

Framestore
Framestore took on the majority of the complex character animation: the Niffler, Gnarlack, the Erumpent and Pickett the Bowtruckle, as well as many goblins and elves. Grillo first tackled the Niffler, described by Rowling as “a long-snouted, burrowing creature native to Britain with a penchant for anything shiny.” The creature design was a mash-up of a spiny anteater, platypus and mole, and it went through hundreds of iterations and many animated prototypes. Framestore used Flesh and Flex, the skin and muscle rigging toolkit developed for Tarzan, on the Niffler’s magic “loot-stuffing” pouch.

The reason the audience is so delighted when this character first appears, explains Ormos, is that “this scene is driven by the relationship between Newt and the Niffler. There was a history we had to get across — the fact that the Niffler was notorious for escaping and pick-pocketing, and that Newt was going through the motions in trying to catch him. They understood each other and there were little looks, a language in their movement.”

Gnarlack, an American, cigar-chewing, snarky goblin, voiced and facially mocapped by actor Ron Perlman, “is one of the best digital humanoids yet,” reports Grillo. Perlman donned a Vicon Cara 3D facial motion capture headset, surrounded by four high-resolution, high-speed witness cameras. According to Framestore VFX supervisor Andy Kind, Perlman also sat in front of 98 cameras for a facial action coding system (FACS) session so the team could sculpt the face directly in 3D.

“We created CG characters for the giants, elves and band ensemble,” says Kind. “Then we gave them crazy instruments, including a sousaphone/trumpet concoction.”

A 17-foot carbon-fiber puppet, built by Handspring Puppet Company, stood in for the amorous rhinoceros Erumpent during the Central Park chase scene. It was swapped out for the CG version later, and dynamic simulations of shattering ice, explosive snow and water effects were added to the concluding shots. A liquid, light-filled sack on the Erumpent’s forehead, says Manz, “made her slightly more unusual than a normal creature.”

“There was an awful lot of digital environment as well as the beast itself,” continues Manz. “David Yates fell in love with the postvis for this scene. It was great to be able to play with shots and offer up suggestions for the edit. It was a very organic way of filmmaking.”

Newt’s pocket-hiding creature sidekick, Pickett the Bowtruckle, took two months and 200 versions to get right. “We were told that Pickett moved too slowly at first and that he appeared too old. We played with the speed but kept his movements graceful,” explains Manz. “He didn’t really have any facial animation, but he does blow a raspberry at one point. In the end, we added more shots to get Pickett’s story to go through, as everyone just loved him.”

MPC
The Moving Picture Company (MPC) completed more than 220 shots and created the Demiguise, Occamy and Billywig, as well as 3D set extensions of period Manhattan.

For Demiguise’s long, flowing hair and invisibility effect, MPC used their Furtility groom technology. According to MPC VFX supervisor Ferran Domenech, using Furtility “allows for the hair to move naturally and interact with the creature’s arms, legs and the environment around it.” Demiguise was animated using enhanced mocap with keyframed facial expressions.
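Furtility itself is proprietary, but strand dynamics of the general kind Domenech describes are commonly modeled as chains of particles held together by distance constraints. Below is a minimal, purely illustrative Python sketch of one such strand under gravity (no collisions or styling forces, and no claim to MPC’s actual methods; all names and values are assumptions):

```python
import numpy as np

GRAVITY = np.array([0.0, -9.8, 0.0])
DT = 1.0 / 24.0   # one step per frame at 24fps
SEGMENT = 0.1     # rest length between strand points, in meters

def step_strand(pos, prev, root):
    """Advance one hair strand by a frame; pos/prev are (N, 3) arrays."""
    # Verlet integration: new = pos + (pos - prev) + a * dt^2
    new = pos + (pos - prev) + GRAVITY * DT * DT
    new[0] = root  # the root point stays attached to the skin
    # Re-enforce segment lengths a few times so the strand doesn't stretch.
    for _ in range(4):
        for i in range(len(new) - 1):
            d = new[i + 1] - new[i]
            dist = np.linalg.norm(d)
            if dist < 1e-9:
                continue
            corr = (dist - SEGMENT) / dist * 0.5 * d
            if i == 0:
                new[i + 1] -= 2 * corr  # root is pinned, move only the child
            else:
                new[i] += corr
                new[i + 1] -= corr
    return new, pos

# A 10-point strand hanging from a fixed root, simulated for one second.
pts = np.array([[0.0, -SEGMENT * i, 0.0] for i in range(10)])
pos, prev = pts.copy(), pts.copy()
for _ in range(24):
    pos, prev = step_strand(pos, prev, pts[0])
```

A production groom runs millions of such strands, adds collisions against the creature and its surroundings, and layers styling and clumping on top; this sketch shows only the core integrate-then-constrain loop.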

MPC built the large feathered dragon-snake Occamy in sections to fill the real and CG-extended attic. They used Furtility once more, this time to add feathers, and they augmented the code so that in the climactic fight scene they could scale the giant version of the creature down to mouse-size. MPC’s effects team then used its in-house Kali destruction technology to wreck the attic.

Finally, MPC worked on the Billywig, a magical bug that can change its flight mode from dragonfly to propeller plane. “This little creature has lots of character and was great fun to bring to life,” reports Domenech.

Previs and Postvis
A major technical advance for Fantastic Beasts can be found in the workflow. It’s been 15 years since the first Harry Potter movie and five years since Deathly Hallows. Over that time Burke had designed a very efficient, streamlined, mostly film-based VFX workflow.

“In the past, we were always stuck at the point where when we shot the film, it was put into editorial, they cut it and then gave it back to us — quite often with big holes where creatures would exist or environments needed to be placed,” describes Burke. “Then we would have to involve the facilities to use their real power to push things through and start blocking out all of the characters. This took quite a bit of time and would always slow the process down, and time is really the key difference with everything we do these days.”

In the past, says Burke, he might wait two months to see an early block of an animated character, “which always then restricts what you can do at the back end or restricts the director’s ability to make creative changes.”

Thankfully this wasn’t the case with Fantastic Beasts. “In the middle of the shoot, Christian and I started supervising the postvis of the scenes we’d already shot,” he explains. They assembled a 50-artist in-house postvis team made up of members of The Third Floor, Proof and Framestore. While some of the movie was prevised, all of it was postvised.

“The path from previs to postvis varied from sequence to sequence,” explains Peter McDonald, previs/postvis supervisor for The Third Floor, London. “At one end of the scale, we had sequences that never really survived through shooting, while at the other end we had sequences that were followed shot-for-shot during the shoot and subsequent editorial process.”

Third Floor postvis

“As an example,” he continues, “the Demiguise and Occamy scene in the department store attic was heavily prevised. The final previs was a pretty polished and spectacular piece in its own right with some relatively sophisticated animation and a highly refined edit. This previs edit was taken onto the stage, with printouts of the shots being referenced as the shoot day progressed. What later came back our way for postvis was very similar to what had been designed in the previs, which was very satisfying from our point of view. It’s nice to know that previs can help drive a production at this level of fidelity!”

One of the benefits of this process was having a movie during editorial that had no “holes” where VFX shots were to later appear. The “postvis” was so good that it was used during audience screenings before the VFX shots were actually built and rendered.

“There were a couple of factors that elevated the postvis,” says McDonald. “Probably the primary one was integration between Framestore’s team and our team at The Third Floor London. Having them present and being supervised by Pablo Grillo guaranteed that the work we were putting together was being judged from almost a ‘finals’ point of view, as Pablo and his artists would also be the ones finishing the job in post. It meant that our postvis wasn’t a throwaway — it was the first step in the post production pipeline. This philosophy extended beyond the personnel involved. We also had creature rigs that could be translated, along with their animation, down the line.”

The Third Floor’s previs of the subway rampage.

One example of a scene that carried through from previs to postvis was the Obscurus rampage in the subway. “Pablo and I worked very closely with artists at both Framestore and The Third Floor on this ‘sequence within a sequence,’” says McDonald. “We started with Bifrost fluid simulations created in Maya by our own senior asset builder Chris Dawson. We then had our animators distort these simulations into the virtual subway set. Through iteration, we developed the choreography of the piece and refined its overall rhythm and shape with our previs editor. This previs then became the template for what was shot on the actual set with Eddie Redmayne and Colin Farrell in the roles of Newt and Graves. When the plate edit returned to us for postvis, we were pretty much able to drop the same distorted simulations onto the plates. The camera angles and cutting pattern in the previs edit had been followed very closely by the live-action unit. We then animated a layer of environment destruction and comped it into the shots to help tie it all together.”
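To make that “distort the simulations into the set” step concrete: one common way to bend cached simulation points into a new environment is to re-map them along a guide path. The Python/NumPy sketch below is an illustration under stated assumptions, not Framestore’s or The Third Floor’s actual tooling; the toy curve stands in for the subway tunnel, and the sim’s original downstream axis is assumed to be Z:

```python
import numpy as np

def warp_points_along_path(points, path_fn):
    """Remap cached sim points so their original Z axis follows a path.

    points  : (N, 3) array of cached simulation positions
    path_fn : t -> (3,) point on the target path, t measured as distance
              along the original Z axis (assumed arc-length parameterized
              and never vertical, so the fixed up vector stays valid)
    """
    warped = np.empty_like(points)
    eps = 1e-3
    for i, (x, y, z) in enumerate(points):
        center = path_fn(z)                          # where this slice lands
        tangent = (path_fn(z + eps) - center) / eps  # local path direction
        tangent /= np.linalg.norm(tangent)
        up = np.array([0.0, 1.0, 0.0])
        side = np.cross(up, tangent)
        side /= np.linalg.norm(side)
        up = np.cross(tangent, side)
        # Keep the point's original cross-section offset, in the new frame.
        warped[i] = center + x * side + y * up
    return warped

# Toy example: bend a straight 20m sim run around a tunnel corner.
rng = np.random.default_rng(0)
sim = rng.uniform([-1, -1, 0], [1, 1, 20], size=(1000, 3))
curve = lambda t: np.array([10 * (1 - np.cos(t / 10.0)), 0.0, 10 * np.sin(t / 10.0)])
bent = warp_points_along_path(sim, curve)
```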

During postvis, says Manz, “A character could go into a shot within a day or two. You would track the shot plate, put the character in, light it and stick it back into editorial. That sort of turnaround, that in-house work that we did, was the big, big difference with how the film worked. It allowed us to feed all that stuff to David Yates.”
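The compositing step at the heart of that one-to-two-day turnaround is the classic premultiplied “over” operation: the lit CG element is laid onto the tracked plate. A minimal NumPy sketch, with array names and sizes that are purely illustrative:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': premultiplied foreground onto a background.

    fg_rgb   : (H, W, 3) premultiplied CG render, floats in [0, 1]
    fg_alpha : (H, W, 1) CG alpha/matte
    bg_rgb   : (H, W, 3) live-action plate
    """
    return fg_rgb + bg_rgb * (1.0 - fg_alpha)

# Composite a rendered creature pass over one plate frame.
h, w = 4, 4
plate = np.full((h, w, 3), 0.5)                 # stand-in for the shot plate
creature = np.zeros((h, w, 3)); creature[1:3, 1:3] = 0.8
alpha = np.zeros((h, w, 1)); alpha[1:3, 1:3] = 1.0
comp = over(creature, alpha, plate)
```

Tracking supplies the camera so the render lines up; after that, each frame reduces to exactly this blend.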

Yates showed his director’s cut to the studio with every single shot of the film blocked out. There were no empty spaces. “We even got the environments in so he had a representation of every street,” says Manz. They completed a three-hour assembly of the film in about five months.

Creatively, it was very liberating, which enabled them to do additional shoots just to enhance the movie. Burke says they were designing and changing shots right up to the end. The final reveal of Johnny Depp as dark wizard Gellert Grindelwald came together at the last minute.

Fantastic Beasts is like a Harry Potter movie because it exists in the J.K. Rowling story universe and is part of the Harry Potter lore. “Where it’s similar to Potter in terms of the filmmaking,” says Manz, “is in making it feel very real and not fantasy. What I always enjoyed about the Potter films was they really felt like they were set in a real version of the UK; you could almost believe that magical sort of school existed.”

Third Floor previs

How Fantastic Beasts is different, says Burke, is that it is set in turn-of-the-century New York City, a real city, and not the magical school environment of Hogwarts. “We were dealing with adults,” continues Burke. “We’re not talking about a child growing up and running everything through a child’s life. We’re talking about a series of adults. In that sense, when we were making it, it felt like we were making a film for adults, which obviously has great appeal to children as well. But I do feel it’s more of a film for grown-ups in terms of its storyline and the issues it’s challenging and discussing.”

Someone told Manz that it “felt like Fantastic Beasts was made for the audience that read the books and watched those films and has now grown up.”

IMDb lists four new Fantastic Beasts movies in development. Burke and Manz are already in pre-production on the second, penciled in for a November 16, 2018 release date. “I think it’s fair to say,” says Burke, “that we’ll obviously be trying to expand on the universe that we’ve started to create. Newt will be there with his animals and various other characters, which are going to bring a lot more interest as the story evolves.”

Manz predicts, “It’s just trying to push the believability of the interactions and that world even further. The first film was all about creating that new world, and now it’s out there. It will be a new city (Paris), so we’ll have that challenge again, but we’ll build on what we’ve learned. You don’t often get an opportunity to work with the same team of people, so that’s going to be great for the second one.”

VFX Supervisor Volker Engel: ‘Independence Day,’ technology and more

Uncharted Territory’s Volker Engel is one of Hollywood’s leading VFX supervisors, working on movies as diverse as White House Down, Hugo and Roland Emmerich’s Shakespeare movie Anonymous. Most recently he was in charge of the huge number of effects for Emmerich’s Independence Day: Resurgence.

Engel was kind enough to make time in his schedule to discuss his 28-year history with Emmerich, his favorite scenes from Independence Day, his experience with augmented reality on set and more.

When did you get involved with Independence Day?
I was probably the earliest person involved after Roland Emmerich himself! He kept me posted over the years while we were working on other projects because we were always going to do this movie.

I think it was 2009 when the first negotiations with 20th Century Fox started, but the important part was early 2014. Roland had to convince the studio regarding the visuals of the project. Everyone was happy with the screenplay, but they said it would be great to get some key images. I hired a company called Trixter — they are based in Germany, but also have an office in LA. They have a very strong art department. In about six weeks we finished 16 images that are what you can call “concept art,” but they are extremely detailed. Most of these concepts can be seen as finished shots in the movie. This artwork was presented to 20th Century Fox and the movie was greenlit.

Concept art via Trixter.

You have worked with Emmerich many times. You must have developed a sort of shorthand?
This is now a 28-year working relationship. Obviously, we haven’t done every movie as a team but I think this is our eighth movie together. There is a shorthand and that helps a lot. I don’t think we really know what the actual shorthand is other than things that we don’t need to talk about because we know what needs to happen.

Technology continues to advance. Does that make life easier, or because you have more options does it make it even more complex?
It’s less the fact that there’s more options, it’s that the audience is so much more sophisticated. We now have better tools available to make better pictures. We can do things now that we were not able to do before. So, for example, now we can imagine a mothership that’s 3,000 miles in diameter and actually lands on Earth. There is a reason we had a smaller mothership in the first movie and that it didn’t touch down anywhere on the planet.

The mothership touching down in DC.

So it changes the way you tell stories in a really fundamental way?
Absolutely. If you look at a movie like Ex Machina, for example, you can show a half-human/half-robot and make it incredibly visually convincing. So all of a sudden you can tell a story that you wouldn’t have been able to tell before.

If you look at the original Independence Day movie, you really only see glimpses of the aliens because we had to do it with practical effects and men in suits. For Independence Day: Resurgence we had the chance to go much further. What I like actually is that Roland decided not to make it too gratuitous, but at least we were able to fully show the aliens.

Reports vary, but they suggest about 1,700 effects shots in Independence Day: Resurgence. Is that correct?
It was 1,748. Close to two-thirds of the movie!

What was your previs process like?
We had two different teams: one joined us from Method Studios and the other was our own Uncharted Territory team, and we split the task in half. The Method artists were working in our facility, so we were all under one roof.

Method focused on the whole lunar sequence, for example, while our in-house team started with the queen/bus chase toward the end of the movie. Roland loves to work with two specific storyboard artists and has several sessions during the week with them, and we used this as a foundation for the previs.

Trixter concept art.

So Roland was involved at the previs stage looking at how it was all going to fit together?
He had an office where the previs team was working, so we could get him over and go literally from artist to artist. We usually did these sessions twice a day.

What tools were you using?
Our in-house artists are Autodesk 3D Studio Max specialists, and the good folks from Method worked with Autodesk Maya.

The live shoot used camera-tracking technology from Ncam to marry the previs graphics and the live action in realtime to give a precise impression of how the final married shot would work.

How were you using the Ncam exactly?
The advantage is that we took the assets we had already built for previs and then re-used them inside the Ncam setup, doing this with Autodesk MotionBuilder. But some of the animation had to be done right there on set.

After: Area 51

I’ll give you an example. When we’re inside the hangar at Area 51, Roland wanted to pan off an actor’s face to reveal 20 jet fighters lifting off and flying into the distance. The Ncam team and Marion [Spates, the on-set digital effects supervisor] had to do the animation for the fighters right there, on the spot. In about five minutes they came up with something, and what’s more, it worked. That’s why Roland also loves to work with Ncam, because it gives him the flexibility to make decisions right there in the moment.

So you’re actually updating or even creating shots on set?
Yes, exactly. We have the toolbox there — the assets like the interior of the hangar — but then we do it right there to the picture. Sometimes for both the A-camera and the B-camera.

We did a lot of extensions and augmentations on this movie and what really helped was our experience of working with Ncam on White House Down. For Roland, as the director, it helps him compose his images instead of just looking at a gigantic bluescreen. That’s really what it is, and he’s really good at that.

The Ncam in use on set.

I explain it this way: imagine you already have your first composite right there, which goes straight to editorial. They immediately have something to work with. We just deliver two video files: the clean one with the bluescreen and another from Ncam that has the composite.

Did using Ncam add to the shooting time?
Working with AR on set always adds some shooting time, and it’s really important that the director is briefed and wants to use this tool. The Ncam prep often runs parallel to the rehearsals with the actors, but sometimes it adds two or three additional minutes. When you have someone who’s not prepared for it, two or three minutes can feel like a lifetime. It does, however, save a lot of time in post.

On White House Down, when we used Ncam for the first time, it actually took a little over a week until everything grooved and everyone was aware of it — especially the camera department. After a little while they just knew this is exactly what needed to be done. It all became instant teamwork. It is something that supports the picture and it’s not a hindrance. It’s something that the director really wants.

Do you have a favorite scene from Resurgence?
There is a sequence inside the mothership where our actors are climbing up one of these gigantic columns. We had a small set piece being built for the actors to climb, and it was really important for Roland to compose the whole image. He could ask for a landing platform to be removed and more columns to be added to create a sense of depth, then move the view around another 50 or 60 degrees.

He was creating his images right there, and that’s why the guys have to be really quick on their feet and build these things in and make it work. At the same time, the assistant director is there saying the cameras are ready, the actors are ready and we’re ready to shoot, and of course no one wants them to wait around, so they better have their stuff ready!

The destruction of Singapore.

Some of my other favorite sequences from the film are the destruction of Singapore while the mothership enters the atmosphere and the alien queen chasing the school bus!

What is next for you?
In 1999, when I started Uncharted Territory with my business partner Marc Weigert, we set it up as a production company and started developing our own projects. We joke that Roland interrupts us from developing our projects because he comes to us with projects of his own that we just cannot say no to! But we have just come back from a trip to Ireland where we scouted two studios and met with several potential production partners for a new project of our own. Stay tuned!

The Third Floor’s Eric Carney on the evolution of previs

When people hear the word previs, they likely think of visual effects, but today’s previs goes way beyond VFX. Our industry is made up of artists who think visually, so why not get a mock-up of what a scene might look like before it’s shot, whether it includes visual effects or not?

Eric Carney is a previs supervisor and co-founder of The Third Floor, which has studios in Los Angeles, Montreal and London. He defines today’s previs as a Swiss army knife that helps define the vision and solutions for all departments on a project. “Previs is not exclusively for visual effects,” he explains. “Previs teams work with producers, directors, cinematographers, stunt coordinators, special effects crews, grips, locations, editorial and many other collaborators, including visual effects, to help map out ideas for scenes and how they can be effectively executed on the day.”

Eric Carney

Let’s find out more from Carney about previs’ meaning and evolution.

How has the definition of previs changed over the years?
While previs is often categorized as a visual effects tool, it’s really a process for the entire production and is being regularly used that way. In a heads-of-department meeting, where scenes are being discussed with a large group of people, it can be hard to describe with words something that is a moving image. Being able to say, “Why don’t we mock up something in previs?” makes everyone happy because they know everyone will get something they can watch and understand, and we can move on to the next item in the meeting.

We’re also seeing previs used more frequently to develop the storytelling and to visualize a large percentage of a film — 80 to 100 percent on some we’ve collaborated on. If you can sit down and “see” a version of the movie, what works (or doesn’t) really comes to light.

Can you give an example?
Maybe a certain scene doesn’t play very well when placed after a certain other scene — maybe the order should be flipped. Maybe there are two scenes that are too similar. Maybe the pacing should be changed, or maybe the last part of the scene is unnecessary. You used to have to wait until the first cut to have this type of insight, but with previs filmmakers and studios can discover these things much earlier, before visual effects may have been ordered and oftentimes before the scenes get filmed.

Postvis for Age of Ultron

What is the relationship between previs and production?
Previs helps to produce a blueprint for production from which everything can be planned. Once all departments have a good idea of the desired scene, they can apply their specialized knowledge for how to accomplish that — from the equipment and personnel they are going to need on the day to figuring out how many days they will be filming or where they are going to shoot. All of the nuts and bolts become easier and more efficient when the production has invested in accurate previs.

What is the relationship between previs and post production?
In post production, previs becomes something called “postvis,” which can be the editorial department’s best friend. Many big-budget movies have so many visual effects that it can be challenging to produce a truly representative cut prior to visual effects delivery if your footage is mostly greenscreen. Postvis is able to fill the live plates with temp effects, characters or environments so the creatures, backgrounds or other elements that are important to the shot appear in context. Because postvis can be done quickly, editors can request shots on the fly to help them try out and drop in different options. It’s such a useful process that we’re spending as much and sometimes more time on postvis as we do on previs.
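As a rough illustration of what “filling the live plates” involves, here is a toy green-screen key in Python/NumPy: pull a matte so the actors are kept and a temp background shows through. Real keyers handle spill, motion blur and edge detail far better; the threshold and all names below are assumptions:

```python
import numpy as np

def green_matte(plate, green_dominance=0.15):
    """Return alpha ~1 where the plate is NOT green (i.e., the actors).

    plate : (H, W, 3) float RGB in [0, 1]
    """
    r, g, b = plate[..., 0], plate[..., 1], plate[..., 2]
    # Screen pixels have green well above both other channels.
    screen = (g - np.maximum(r, b)) > green_dominance
    return np.where(screen, 0.0, 1.0)[..., None]

def fill_background(plate, temp_bg):
    """Keep the actors, replace the greenscreen with a temp element."""
    a = green_matte(plate)
    return plate * a + temp_bg * (1.0 - a)

# Demo with synthetic frames.
plate = np.zeros((4, 4, 3)); plate[..., 1] = 0.8   # all screen
plate[1:3, 1:3] = [0.6, 0.45, 0.4]                 # "actor" pixels
temp = np.full((4, 4, 3), 0.2)                     # temp environment
out = fill_background(plate, temp)
```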

Can you describe the creative aspect of previs?
Previs involves all aspects of filmmaking, and there are no creative boundaries. This is why directors love previs; it’s a giant sandbox, free from the realities of physical production. A previs team is typically small, so the work can be very collaborative. Operating in service of the director, frequently also including producers, visual effects supervisors or other collaborators, creative visualization helps find effective ways to visually tell the story, to show the key beats and the way a scene goes together. The previs team’s starting point is often the script or storyboards, but this can also be general descriptions of the action that needs to occur. Through previs, we often have the latitude to explore possible flows of action and brainstorm different details or gags that might be a creative fit.

While previs supports having a very fully realized creative vision, it’s also important that what is visualized translates into shots and scenes that are possible for real-world production and budgets. It’s all well and good to come up with great ideas, but eventually someone has to actually film or post produce the shot.

Can you talk about the technical aspects of previs?
Previs has an important function in helping plan complicated technical aspects of production. We call it “techvis.” This is where we incorporate input and information from all the key departments to produce detailed shooting logistics and plans. By working collaboratively to bring these details into the previs, any number of shooting and visual effects details can be determined and shots can be rehearsed virtually with a good deal of technical accuracy corresponding to the setup for the shooting day.

Many things can be figured out using techvis, including positions for the camera, how far and fast it should move, which lenses are needed and where the actors need to be. It’s also possible to define equipment needs and a host of specific details. Can the shot be done on a Fisher Dolly or will you need a jib arm or a Technocrane? Should it be Techno 30 or Techno 50? Where should it go and how much track are you going to need? Or maybe the move is too fast for a crane and you’d be better off with a Spidercam?
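Much of that arithmetic is standard optics and kinematics. A small Python sketch of the kinds of numbers techvis produces, with an assumed 36mm sensor width and made-up shot values:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view for a given lens and sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def frame_width_at(distance_m, focal_mm, sensor_width_mm=36.0):
    """Width of the area framed at a given subject distance (pinhole model)."""
    return distance_m * sensor_width_mm / focal_mm

def track_needs(move_m, shot_seconds, ramp_m=0.5):
    """Track length for a dolly move, padded for ramps, plus average speed."""
    return move_m + 2 * ramp_m, move_m / shot_seconds

print(horizontal_fov_deg(24))    # a 24mm lens sees ~73.7 degrees
print(frame_width_at(5.0, 50))   # a 50mm frames ~3.6m wide at 5m
print(track_needs(6.0, 4.0))     # a 6m move in 4s: ~7m of track, 1.5 m/s
```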

By interfacing with all the departments and bringing together the collective wisdom about the scene at hand, we can produce on-set specifications ahead of time so everyone can refer to the diagrams that have been created without spending time figuring it out on the day.

One area where previs and techvis artists often contribute is in scenes with motion control work. We might be charged with visualizing the types of moves that the motion control crane can achieve, or looking at the best places to position the rig. We’ve built a large library of motion control cranes in 3D that can be dropped into the virtual scene to aid this process. Not only can the move be calculated in advance, the camera path from the computer can be loaded directly to the physical rig to have the on-set equipment execute the same move.
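A hedged sketch of that previs-to-rig handoff: sample the virtual camera path at the shoot frame rate and write out per-frame values. Real motion control systems ingest vendor-specific formats through their own software, so the CSV layout and the toy move below are purely illustrative assumptions:

```python
import csv
import math

FPS = 24

def sample_camera_path(path_fn, seconds):
    """path_fn: time_s -> (x, y, z, pan_deg, tilt_deg, roll_deg)."""
    return [path_fn(f / FPS) for f in range(int(seconds * FPS))]

def write_moco_csv(samples, filename="camera_move.csv"):
    with open(filename, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["frame", "x_m", "y_m", "z_m", "pan", "tilt", "roll"])
        for i, s in enumerate(samples):
            writer.writerow([i, *(round(v, 4) for v in s)])

# Toy move: a slow 90-degree arc around the subject over four seconds.
arc = lambda t: (3 * math.cos(t * math.pi / 8), 1.5,
                 3 * math.sin(t * math.pi / 8),
                 -math.degrees(t * math.pi / 8), 0.0, 0.0)
write_moco_csv(sample_camera_path(arc, 4.0))
```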

We are at a point where virtual planning and on-set production can really work hand in hand, each process feeding the other to realize the vision more effectively.

Previs for Ant-Man.

Name some common challenges that can be solved via the previs process.
One common request for previs is in planning large-scale fight scenes. Knowing what each character, creature or ship is doing at any given moment is important for the story as well as for orchestrating filming and visual effects. Scenes like the final car chase in Mad Max: Fury Road, armies clashing in Game of Thrones or the epic action in a Marvel film like Avengers: Age of Ultron or Captain America: Civil War are good examples. Visualizing heroes, villains and the inevitable destruction that will happen can be lots of fun.

As mentioned, previs also comes up to help with things that pose specific technical challenges, such as those that rely on representing physics or scale. If you’re depicting astronauts in zero gravity, a fleet of hover cars or a superhero shrinking from human to ant-size, you are likely using previs to help conceptualize, as well as realize, the scene.

What are some less common ways previs is used?
A newer trend is using previs within a virtual camera system to explore frame-ups of the shot. Previs visuals appear in a display that the director can control and reposition to see what type of coverage works best. On The Walk, postvis composites actually fed a virtual camera that was used to explore shot plates and extend practical camera moves. On some shows, previs versions of real locations or CG environments might be used to virtually “scout” and more extensively develop the shots, or a location might be sought matching the size or description suggested in a previs mockup of the scene.

Beyond showing the action, previs artists are sometimes asked to develop and test the characteristics of a character, environment, prop or type of effect. On Godzilla, we did animation tests for our director with possible fighting styles for Godzilla and the MUTOs, cueing off large animals from nature. For Thor, we looked at things like how the hero’s hammer and cape would fly and behave. On Total Recall, we considered different sets of rules that might apply to cities and vehicles in a futuristic world. On special venue projects, we’ve tested things like the flow of a ride from the audience’s POV.

Previs for Game of Thrones

While previs is used a lot on CG-heavy scenes, it’s worth noting that visualization can also be vital for scenes largely based on practical filming. This is especially true for coordinating complex stunts. On Mission: Impossible – Rogue Nation, for example, stunt and camera teams tightly coordinated their work with previs to identify requirements for safely and effectively pulling off in-camera stunts that ranged from Tom Cruise riding on the wing of an Airbus to being rotated in an underwater chamber. The same is true of Season 5 of Game of Thrones, where the approach to realizing the ambitious arena scene in Episode 9 relied on syncing up the actions of a digital dragon with real pyrotechnics and stunt performances on the location set.

What do you see for the future of previs?
In film, we’re seeing higher proportions of movies being previsualized and previs being requested by directors and studios on productions of all sizes and scale. We’re doing more techvis, virtual production and visualization from on set. We’re looking into how modern game engines can support the process with increased interactivity and visual quality. And we are applying skills, tools and collaborations from the previs process to create content for platforms like VR.

You’ve won two Emmys as part of the Game of Thrones team. Can you talk about your work on the show?
We don’t actually think about it as a television program but more like a 10-hour movie. The trick is that we have a smaller team and less time than we would on a two-hour big film. To be able to visualize the large set-piece sequences — like Drogon in the arena or the battle at Hardhome — is an indispensable part of the production process, and it’s difficult to imagine being able to achieve such sequences without this type of process. Everything involving visual effects can be planned down to the inch, with it all being done in half the time of normal films.

All of the contributors on the show — from the producers to the directors, special effects, stunts, camera and the visual effects teams headed up by Joe Bauer and Steve Kullback — are so very collaborative.

Being on a show like this only inspires innovation even more. Last season, we had a flame-throwing Technodolly playing a CG dragon with real actors in a real location in Spain. This season…stay tuned!

Ncam hires industry vet Vincent Maza to head up LA office

Ncam, makers of camera tracking for augmented reality production and previs, has opened a new office in Los Angeles, and they have brought on Vincent Maza to run the operation.

Maza spent much of his career at Avid and as an HD engineer at Fletcher Chicago. More recently he has been working with the professional imaging division of Dolby and with data transfer specialist Aspera. He is also a member of the board of directors of the HPA (Hollywood Post Alliance), now part of SMPTE. He will be in Indian Wells, California next week representing Ncam at the HPA Tech Retreat.

“2016 is going to be a great year for augmented reality and we believe we will see a huge uptake in people using it to make television more engaging, more exciting and more challenging,” commented Maza. “Ncam’s camera tracking technology makes augmented reality a practical proposition, and I am very excited to be at the heart of it and supporting our US presence.”

Ncam’s tracking system captures all six degrees of camera movement: XYZ position in 3D space plus pan, tilt and roll. Even handheld cameras can be precisely tracked with minimal latency.
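Those six degrees map directly onto a rigid transform: three translations plus three rotations. A minimal Python sketch of assembling one tracked sample into a 4x4 camera matrix (the pan-tilt-roll rotation order here is an assumption; any real system documents its own convention):

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def camera_pose(x, y, z, pan_deg, tilt_deg, roll_deg):
    """4x4 camera-to-world matrix from one tracked 6-DOF sample."""
    pan, tilt, roll = np.radians([pan_deg, tilt_deg, roll_deg])
    pose = np.eye(4)
    # Pan = yaw about Y, tilt = pitch about X, roll = spin about the lens axis.
    pose[:3, :3] = rot_y(pan) @ rot_x(tilt) @ rot_z(roll)
    pose[:3, 3] = [x, y, z]
    return pose

# One such matrix per frame is what lets a CG camera follow a handheld move.
print(camera_pose(1.0, 1.7, -4.0, 30.0, -5.0, 0.0))
```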

Broadcasters have embraced augmented reality with Ncam, including CNN, ESPN, Fox Sports and the NFL. The same technology is used to provide directors and cinematographers with realtime visualization of effects shots. Recent movies using the technology include Avengers: Age of Ultron, Edge of Tomorrow and White House Down.

The A-List: An interview with ‘The Martian’ director Ridley Scott

By Iain Blair

A mysterious alien world in deep space, hundreds of years in the future. The gore and glory of imperial Rome, and the spectacle of its doomed gladiators. The nightmarish vision of a dystopian Los Angeles and its rogue replicants. The colossal grandeur of ancient Egypt and its massive monuments. The bloody battlefields of the Crusades. The pastoral glow of vineyards in southern France.

Those are just a few of the “other worlds” that Ridley Scott, one of the supreme stylists of contemporary cinema, has brought to life in the nearly four decades since making his feature debut with The Duellists in 1977. Scott’s directorial resume also includes Blade Runner, Alien and Thelma and Louise. Of all his contemporaries working today, Scott alone seems equally at ease creating vast landscapes set in both the distant past and the distant future, in the process channeling David Lean, Cecil B. DeMille and Jim Cameron along with his own prodigious gifts as an epic storyteller and visual artist.

Ridley Scott on location in Jordan for ‘The Martian.’

Now, the three-time Oscar-nominated director — whose credits include such varied fare as Hannibal, Robin Hood, Black Hawk Down, Exodus: Gods and Kings, A Good Year and G.I. Jane — has turned his attention to the red badlands of Mars in his new sci-fi thriller The Martian. Starring Matt Damon, Jessica Chastain, Kate Mara and Jeff Daniels, it tells the story of a botanist astronaut (Damon) left behind on the dead, hostile planet after an aborted mission and the efforts of NASA and a team of international scientists to rescue him.

I recently spoke to Scott, whose other credits include Prometheus, Matchstick Men, American Gangster and Legend, about making the 3D film, which was shot in Jordan and Hungary. We discussed his love of previs and post and — hold onto your seats! — why post schedules are way too long for his liking.

You’ve made a lot of sci-fi films. What’s the appeal?
It’s a new canvas, it takes you into the arena of “anything goes,” but you also need to create a rulebook so the world you create is coherent, otherwise you just get rubbish. You also need a story that’s valid in that universe… and to create parameters. Anything doesn’t go!

The appeal here was it’s a sort of Robinson Crusoe survival story, set in space, five years in the future. There are no aliens and we went for a very realistic look and approach — I knew exactly what to do with it. Even as I was reading it, I was seeing Wadi Rum in Jordan, where we shot the landscapes, and I knew grading would be simple, as I could adjust terra cotta to orange landscapes.

How early on did you decide to go 3D?
Immediately. I loved 3D when I first tried it out on Prometheus, and then we used it on Exodus, so this is the third one. Again, DP Dariusz Wolski used the 3ality Technica TS-5 rigs with Red Epic Dragons and Scarlet Dragons. I love it! It’s only a problem if you allow it to become brain surgery, so you just need to know what you’re doing. It’s a bit like shooting four cameras, which I do anyway.

All the visual effects were obviously crucial. How soon did you integrate post and VFX with the production?
I start it almost immediately, and I also do a lot of boarding. I started well before we began The Martian, with a particular view or rock. I board it all myself, which makes it more accurate, and it allows you to pace a scene. They’re very instructive and they become the bible for everyone, and you can tell the VFX guys, “Here’s the lead-in, this is the cross-over, now we’re in the full VFX shot.”

What about digi-data animation?
I absolutely love it. I think it’s essential before you go into anything complex, because, first, you see what the problems are and, second, in editing you invariably haven’t got the greenscreen, so digital data enables you to cut it into the film instead of having blank space, and it stays there until you get a complete shot.

Matt Damon portrays an astronaut who draws upon his ingenuity to subsist on a hostile planet.

Did you do a lot of previs?
Yes, at MPC and Argon. I love that too, as it lets me see what’s what. It can be very sophisticated now in terms of working out the pacing and how you’ll cut. You can get very close to what the final thing will be.

Where did you post?
Partly in London and Budapest, and we did a lot of post as we shot. I cut as we go, every night, so by the end of the shoot I’m pretty close to the director’s cut. I hate waiting until the end of the shoot to start editing, so editor Pietro Scalia just got on with it. That lets me see where I am.

We did the sound mix at Twickenham in the big new Dolby Atmos room. The mix is amazing as it gives you all this clarity and separation between dialogue and all the effects and other layers.

A lot of filmmakers complain about today’s accelerated post schedules. I assume you’re not one of them?
Are you kidding me? It’s like watching ivy grow when you’re waiting for all the VFX shots and so on. We worked 25 weeks on post for this, and I still think it’s a bit long. Today’s digital technology means you no longer travel with a million feet [of film], just digital output and data, and post is getting faster and faster, thank God. Shooting and posting in 35mm drove me crazy! To be honest, I’d be happy with an even shorter post. I love post, but if you know what you’re doing you don’t need to spend all that time. And digital has changed everything.

How many visual effects shots are there?
Probably 1,300, and we had a lot of companies — Framestore, ILM, Milk, Prime Focus, The Senate, [The Territory for screen graphics] and my usual VFX supervisor Richard Stammers, who’s been with me since Kingdom of Heaven. Funnily enough, the hardest shot to do was the [scene] with the tape, where it floats around and curls in a rather balletic fashion. That was very tricky to get right.

Where was the DI?
At Company 3 in London. I love the DI. For me it’s the final touch, like grading still photographs, and I used my favorite colorist, Stephen Nakamura, who’s a top guy at their LA office and he would travel to London. He’s very fast, and we did the whole film in just two weeks. I’m very happy with the way it looks. (Nakamura used Blackmagic’s DaVinci Resolve on the film.)

What’s next?
I plan to start Alien: Paradise Lost in February, maybe in Toronto. It’s a sequel to Prometheus and a prequel to Alien. I’m also doing a lot of TV projects, including The Hot Zone, a drama with Fox about the Ebola virus.

You seem to be working at a flat-out pace these days, directing a huge movie every year. You turn 78 in November. Do you ever see yourself slowing down?
(Laughs) Hopefully not! I actually think I’m speeding up, and as long as I find great projects to make that really interest me, I’ll keep working.

Photos by Giles Keyte and Aidan Monaghan.


Check back in soon for our audio post coverage of The Martian.

Gaining more control over VFX shoots

By Randi Altman

During IBC this year, I saw many companies, some I was familiar with and some that were new to me. One I was eager to get to know was SolidAnim.

Alas, fate intervened. Well, more accurately, the size of the RAI Convention Center where IBC took place intervened — it is huge and has a crazy number of exhibit halls to navigate. I never made it there (damn you RAI, shaking fist in air!). But happily I did get to connect with the company’s Lamia Nouri on a call recently.

Founded by three artists — mocap supervisors and animation directors Isaac Partouche, Emmanuel Linot and Jean-Francois Szlapka — this French animation company was born in…

Zoic’s Mike Romey discusses previs app ZEUS:Scout

By Randi Altman

Visual effects studio Zoic has released to the masses an iPad-based previs tool they developed while providing shots for VFX-heavy shows such as Once Upon a Time, Intelligence, Pan Am and V. The app is available now via iTunes for $9.99.

According to Zoic (@zoicstudios), ZEUS:Scout (Zoic Environmental Unification System) offers seven main modes: View allows the user to move the camera around and save camera positions for shot-blocking purposes; Measurements mode allows users to bring real-world measurements into a virtual world; Characters mode can be used to insert character cards into the virtual location; Props lets users add, move, rotate and scale set props that are available for in-app purchase; Previs Animation lets users explore camera moves for previs and rehearsal purposes; and Tracking mode allows users to use the tablet as a virtual camera, with the CG view matching the…

Baraboom Studios Launches in Culver City

Culver City — Baraboom Studios, a specialist in previsualization, has opened a full-service production office here. The launch of the new digs, which feature workspace for a small team of artists and an assortment of cutting-edge technology, is a prelude to a broader expansion planned for next year. Over the next six months, the company expects to hire additional staff, add a scanning and motion capture stage and extend its menu of services.

The move provides Baraboom (www.baraboomstudios.com) with a base of operations in close proximity to several major studios and room to grow. “We’re very excited to have a home in Culver City that’s convenient to our clients and provides a comfortable working environment for our talent,” says executive producer Mike Pryor. “It’s also a signal of the strength of the company, with much more to come.”

Interior

Baraboom’s expansion plans are designed to capitalize on Hollywood’s growing use of previsualization as a time- and cost-saving tool throughout the production cycle. “We help directors, editors, art directors and others create great visuals,” explains founder Pepe Valencia. “We provide tools that allow them to experiment and test ideas before they arrive on the set or become involved in expensive post-production processes.”

Founded in 2009 by animation supervisor Pepe Valencia, Baraboom Studios has provided previsualization services for a wide range of film, television and commercial projects, including Summit Entertainment’s The Impossible and Universal Pictures’ Hop. It is currently working on a television series for a major US network, under strict NDA.

For The Impossible, Baraboom assisted in planning shots involving combinations of practical and digital visual effects. “We were able to lay it all out for everyone involved in the production to see,” recalls Pryor. “The producers were able to determine very clearly both the technical requirements and the financial aspects of each shot.”

Baraboom Studios also offers previs for animated features, Pryor adds. “With Pepe’s background in virtual cinematography and layout, we are able to provide assets directly into rough layout and then into final layout,” he explains. “That results in further cost savings while ensuring creative continuity.”

Valencia launched Baraboom Studios on the basis of his more than 20 years of experience as an animator and animation supervisor. His background includes 11 years at Sony Pictures Imageworks, where he was animation supervisor on such films as The Aviator, Peter Pan and Charlie’s Angels: Full Throttle. He also produced director’s layouts for Robert Zemeckis on The Polar Express, Gil Kenan on Monster House and Bryan Singer on Superman Returns. After leaving SPI in 2007, he joined Imagi Animation Studios and served as director of photography on the film Astro Boy.