
This DIT talks synchronization and action scenes

By Peter Welch

As a digital imaging technician on feature films, I work closely with the director of photography, camera crew, editorial and visual effects teams to make sure the right data is collected for post production to transform raw digital footage into a polished movie as efficiently as possible.

A big part of this role is to anticipate how things could go wrong during a shoot, so I can line up the optimal combination of technical kit to minimize the impact should the worst happen. With all the inherent unpredictability that comes with high-speed chases and pyrotechnics, feature film action sequences can offer up some particularly tough challenges. These pivotal scenes are incredibly costly to get wrong, so every technological requirement is amplified. This includes the need to generate robust and reliable timecode.

I take timecode very seriously, seeing it as an essential part of the camera rather than a nice-to-have add-on. Although it doesn’t really affect us on set, further down the line, a break in timecode can cause other areas a whole world of problems. In the case of last year’s Avengers: Age of Ultron, creating the spectacular scenes and VFX we’ve come to expect from Marvel involved developing solid workflows for some very large multi-camera set-ups. For some shots, as many as 12 cameras were rolling with a total camera package of 27 cameras, including Arri Alexa XTs, Canon C500s with Codex recorders, Red Epics and Blackmagic cameras. The huge amounts of data generated made embedding accurate, perfectly synced timecode into every piece of footage an important technical requirement.

One of the largest action sequences for Age of Ultron was filmed in Korea with eight cameras rigged to capture footage — four Arri Alexas and four Canon C500s — and huge volumes of RAW output going to Codex recorders. With this shoot, there was a chance that cameras could be taken out while filming, putting footage at risk of being lost. As a result, while the Alexas were strategically rigged a safe distance from the main action, the less costly C500s were placed in and around the explosion, putting them at an increased risk of being caught in the line of fire.

As an added complication, once the set was built, and definitely once it was hot with explosives, we couldn’t go back in to adjust camera settings. So while I was able to manually jam-sync the Alexas, the C500s had to be set to record with timecode running at the point of rigging. There wasn’t an opportunity to go back later and re-jam midway through the day — they had to stay in sync throughout, whatever twists and turns the filming process took.

With the C500 cameras placed in strategic positions to maximize the action, the Codex recorders, Preston MDRs and power were built into recording and camera control boxes (or ‘safe boxes’) and positioned at a distance from the cameras and then connected via a bespoke set of cables. Within each C500’s “safe box,” I also placed a Timecode Systems Minitrx+ set in receive mode. This was synced over RF to a master unit back outside of the “hot” zone.

With an internal Li-Polymer battery powering it for up to 12 hours, the Minitrx+ units in the C500 “safe boxes” could be left running throughout a long shooting day with complete confidence and no requirement for manual jamming or resetting. This set-up ensured all footage captured by the C500s in the “hot” zone was stamped with the same frame-accurate timecode as the Alexas. The timecode could also be monitored via the SDI feed embedded in the return video signals.

But it’s not just the pyrotechnics that inject unpredictability into shooting this kind of scene — the sheer scale of the locations can be as much of a challenge. The ability to synchronize timecode over RF definitely helps, but even with long-range RF it’s good to have a backup. For example, for one scene in 2015’s Spectre, 007 piloted a motorboat down a sizeable stretch of the Thames in London. For this scene, I rigged one camera with a Minitrx+ on a boat in Putney, powered it up and left it onboard filming James Bond. I then got in my car and raced down the Embankment to Westminster to set up the main technical base with the camera crews, with a Timecode Systems unit set to the same timecode as that on the boat.

Even though the boat started its journey out of range of its paired unit, the crystal inside the Minitrx+ continued to feed accurate timecode to that camera. As soon as the boat came back into range, it synced to the master unit again with zero drift. There was no need to reset or re-jam.
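To make the stakes of drift concrete, here is a minimal sketch of how timecode maps to frame counts, assuming non-drop-frame SMPTE timecode at an integer frame rate. This is an illustration only, not any tool used on set; the timecode values and the 1 ppm oscillator figure are assumptions for the example.

```python
def tc_to_frames(tc: str, fps: int) -> int:
    """Convert non-drop SMPTE timecode 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def drift_frames(tc_a: str, tc_b: str, fps: int = 24) -> int:
    """Offset, in frames, between two cameras' timecode at the same instant."""
    return tc_to_frames(tc_a, fps) - tc_to_frames(tc_b, fps)

# A hypothetical free-running 1 ppm oscillator accumulates about
# 86400 s * 1e-6 = 0.0864 s of error per day -- roughly 2 frames at 24 fps --
# which is why re-jamming (or continuous RF sync) matters on long shooting days.
print(drift_frames("10:00:01:12", "10:00:01:10"))  # 2
```

Even a two-frame offset like this between cameras means manual re-conforming in post, which is exactly what a stable sync box avoids.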

Action sequences are certainly becoming more ambitious, with footage captured from a growing number and variety of camera sources. Although it’s possible to fix sync problems in post, doing so is time consuming and expensive. Getting it right at the point of shooting offers considerable efficiencies to the production process, something every production demands — even those working with superhero budgets.

Peter Welch is a London-based digital imaging technician (DIT) with Camera Facilities.

The Third Floor’s Eric Carney on the evolution of previs

When people hear the word previs, they likely think of visual effects, but today’s previs goes way beyond VFX. Our industry is made up of artists who think visually, so why not get a mock-up of what a scene might look like before it’s shot, whether it includes visual effects or not?

Eric Carney is a previs supervisor and co-founder of The Third Floor, which has studios in Los Angeles, Montreal and London. He defines today’s previs as a Swiss army knife that helps define the vision and solutions for all departments on a project. “Previs is not exclusively for visual effects,” he explains. “Previs teams work with producers, directors, cinematographers, stunt coordinators, special effects crews, grips, locations, editorial and many other collaborators, including visual effects, to help map out ideas for scenes and how they can be effectively executed on the day.”

Eric Carney

Let’s find out more from Carney about previs’ meaning and evolution.

How has the definition of previs changed over the years?
While previs is often categorized as a visual effects tool, it’s really a process for the entire production and is being regularly used that way. In a heads-of-department meeting, where scenes are being discussed with a large group of people, it can be hard to describe with words something that is a moving image. Being able to say, “Why don’t we mock up something in previs?” makes everyone happy because they know everyone will get something they can watch and understand, and we can move on to the next item in the meeting.

We’re also seeing previs used more frequently to develop the storytelling and to visualize a large percentage of a film — 80 to 100 percent on some we’ve collaborated on. If you can sit down and “see” a version of the movie, what works (or doesn’t) really comes to light.

Can you give an example?
Maybe a certain scene doesn’t play very well when placed after a certain other scene — maybe the order should be flipped. Maybe there are two scenes that are too similar. Maybe the pacing should be changed, or maybe the last part of the scene is unnecessary. You used to have to wait until the first cut to have this type of insight, but with previs filmmakers and studios can discover these things much earlier, before visual effects may have been ordered and oftentimes before the scenes get filmed.


Postvis for Age of Ultron

What is the relationship between previs and production?
Previs helps to produce a blueprint for production from which everything can be planned. Once all departments have a good idea of the desired scene, they can apply their specialized knowledge for how to accomplish that — from the equipment and personnel they are going to need on the day to figuring out how many days they will be filming or where they are going to shoot. All of the nuts and bolts become easier and more efficient when the production has invested in accurate previs.

What is the relationship between previs and post production?
In post production, previs becomes something called “postvis,” which can be the editorial department’s best friend. Many big-budget movies have so many visual effects that it can be challenging to produce a truly representative cut prior to visual effects delivery if your footage is mostly greenscreen. Postvis is able to fill the live plates with temp effects, characters or environments so the creatures, backgrounds or other elements that are important to the shot appear in context. Because postvis can be done quickly, editors can request shots on the fly to help them try out and drop in different options. It’s such a useful process that we’re spending as much and sometimes more time on postvis as we do on previs.

Can you describe the creative aspect of previs?
Previs involves all aspects of filmmaking, and there are no creative boundaries. This is why directors love previs; it’s a giant sandbox, free from the realities of physical production. A previs team is typically small, so the work can be very collaborative. Working in service of the director, and frequently of producers, visual effects supervisors and other collaborators, creative visualization helps find effective ways to visually tell the story, to show the key beats and the way a scene goes together. The previs team’s starting point is often the script or storyboards, but this can also be general descriptions of the action that needs to occur. Through previs, we often have the latitude to explore possible flows of action and brainstorm different details or gags that might be a creative fit.

While previs supports having a very fully realized creative vision, it’s also important that what is visualized translates into shots and scenes that are possible for real-world production and budgets. It’s all well and good to come up with great ideas, but eventually someone has to actually film or post produce the shot.

Can you talk about the technical aspects of previs?
Previs has an important function in helping plan complicated technical aspects of production. We call it “techvis.” This is where we incorporate input and information from all the key departments to produce detailed shooting logistics and plans. By working collaboratively to bring these details into the previs, any number of shooting and visual effects details can be determined and shots can be rehearsed virtually with a good deal of technical accuracy corresponding to the setup for the shooting day.

Many things can be figured out using techvis, including positions for the camera, how far and fast it should move, which lenses are needed and where the actors need to be. It’s also possible to define equipment needs and a host of specific details. Can the shot be done on a Fisher Dolly or will you need a jib arm or a Technocrane? Should it be Techno 30 or Techno 50? Where should it go and how much track are you going to need? Or maybe the move is too fast for a crane and you’d be better off with a Spidercam?
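The lens and camera-position questions above come down to simple geometry. As a rough illustration of the arithmetic behind a techvis pass (the sensor width and distances here are assumptions for the example, not figures from an actual session), the pinhole-camera model relates focal length, sensor width and angle of view:

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float = 24.9) -> float:
    """Horizontal angle of view for a lens on a given sensor.

    Defaults to a nominal 24.9 mm Super 35-width sensor (an assumption).
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def scene_width_covered(focal_mm: float, distance_m: float,
                        sensor_width_mm: float = 24.9) -> float:
    """Approximate width of scene filling the frame at a given subject
    distance (thin-lens approximation, distance much larger than focal length)."""
    return distance_m * sensor_width_mm / focal_mm

# e.g. a 35 mm lens on Super 35 sees roughly a 39-degree horizontal angle,
# covering about 7 m of scene width at 10 m from the subject.
print(round(horizontal_fov_deg(35), 1))
print(round(scene_width_covered(35, 10), 2))
```

Running numbers like these virtually is how a previs scene can tell you in advance whether a given lens and crane position will actually frame the action.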

By interfacing with all the departments and bringing together the collective wisdom about the scene at hand, we can produce on-set specifications ahead of time so everyone can refer to the diagrams that have been created without spending time figuring it out on the day.

One area where previs and techvis artists often contribute is in scenes with motion control work. We might be charged with visualizing the types of moves that the motion control crane can achieve, or looking at the best places to position the rig. We’ve built a large library of motion control cranes in 3D that can be dropped into the virtual scene to aid this process. Not only can the move be calculated in advance, the camera path from the computer can be loaded directly to the physical rig to have the on-set equipment execute the same move.

We are at a point where virtual planning and on-set production can really work hand in hand, each process feeding the other to realize the vision more effectively.

Previs for Ant-Man.

Name some common challenges that can be solved via the previs process.
One common request for previs is in planning large-scale fight scenes. Knowing what each character, creature or ship, etc. is doing at any given moment is important for the story as well as in orchestrating filming and visual effects. Scenes like the final car chase in Mad Max: Fury Road, armies clashing in Game of Thrones or the epic action in a Marvel film like Avengers: Age of Ultron or Captain America: Civil War are good examples. Visualizing heroes, villains and the inevitable destruction that will happen can be lots of fun.

As mentioned, previs also comes up to help with things that pose specific technical challenges, such as those that rely on representing physics or scale. If you’re depicting astronauts in zero gravity, a fleet of hover cars or a superhero shrinking from human to ant-size, you are likely using previs to help conceptualize, as well as realize, the scene.

What are some less common ways previs is used?
A newer trend is using previs within a virtual camera system to explore frame-ups of the shot. Previs visuals appear in a display that the director can control and reposition to see what type of coverage works best. On The Walk, postvis composites actually fed a virtual camera that was used to explore shot plates and extend practical camera moves. On some shows, previs versions of real locations or CG environments might be used to virtually “scout” and more extensively develop the shots, or a location might be sought matching the size or description suggested in a previs mockup of the scene.

Beyond showing the action, previs artists are sometimes asked to develop and test the characteristics of a character, environment, prop or type of effect. In Godzilla, we did animation tests for our director with possible fighting styles for Godzilla and the Mutos, cueing off large animals from nature. For Thor, we looked at things like how the hero’s hammer and cape would fly and behave. On Total Recall, we considered different sets of rules that might apply to cities and vehicles in a futuristic world. On special venue projects, we’ve tested things like the flow of a ride from the audience’s POV.

Previs for Game of Thrones

While previs is used a lot on CG-heavy scenes, it’s worth noting that visualization can also be vital for scenes largely based on practical filming. This is especially true for coordinating complex stunts. On Mission: Impossible – Rogue Nation, for example, stunt and camera teams tightly coordinated their work with previs to identify requirements for safely and effectively pulling off in-camera stunts that ranged from Tom Cruise riding on the wing of an Airbus to being rotated in an underwater chamber. The same is true of Season 5 of Game of Thrones, where the approach to realizing the ambitious arena scene in Episode 9 relied on syncing up the actions of a digital dragon with real pyrotechnics and stunt performances on the location set.

What do you see for the future of previs?
In film, we’re seeing higher proportions of movies being previsualized and previs being requested by directors and studios on productions of all sizes and scale. We’re doing more techvis, virtual production and visualization from on set. We’re looking into how modern game engines can support the process with increased interactivity and visual quality. And we are applying skills, tools and collaborations from the previs process to create content for platforms like VR.

You’ve won two Emmys as part of the Game of Thrones team. Can you talk about your work on the show?
We don’t actually think about it as a television program but more like a 10-hour movie. The trick is that we have a smaller team and less time than we would on a two-hour feature film. Being able to visualize the large set-piece sequences — like Drogon in the arena or the battle at Hardhome — is an indispensable part of the production process, and it’s difficult to imagine achieving such sequences without it. Everything involving visual effects can be planned down to the inch, with it all being done in half the time of normal films.

All of the contributors on the show — from the producers to the directors, special effects, stunts, camera, visual effects teams headed up by Joe Bauer and Steve Kullback — are so very collaborative.

Being on a show like this only inspires innovation even more. Last season, we had a flame-throwing Technodolly playing a CG dragon with real actors in a real location in Spain. This season…stay tuned!