Author Archives: Randi Altman

Avid at IBC with new post workflows based on MediaCentral 

At IBC2017, Avid introduced new MediaCentral solutions for post production. Dubbed MediaCentral for Post, the solutions integrate Media Composer video editing software with a new collaborative asset management module, new video I/O hardware and Avid Nexis software-defined storage.

MediaCentral for Post is a scalable solution for small and mid-sized creative teams to enhance collaboration and deliver their best work faster, as well as to work more efficiently with 4K and other demanding formats. Avid collected feedback from working editors while developing a collaborative workflow that goes beyond bin locking and project sharing to include integrated storage, editing, I/O acceleration and media management.

Besides Media Composer, MediaCentral solutions for post integrate Avid’s products in a single, open platform that includes:

• MediaCentral Editorial Management is a new media asset management tool that enables everyone in a creative organization to collaborate in secure, reliable and simply configured media workflows from a web browser. MediaCentral Editorial Management gives a view into an entire organization’s media assets. Without needing an NLE, assistants and producers can ingest files, create bins, add locators and metadata, create subclips and perform other asset management tasks — all from a simple browser interface. Users can collaborate using the new MediaCentral Panel for Media Composer, which provides direct access to MediaCentral content right in the Media Composer interface.

• MediaCentral Cloud UX is an easy-to-use, task-oriented graphical user interface that runs on any OS or mobile device and is available to everyone connected to the platform. Creative team members can easily collaborate with each other from wherever they are.

• The Artist DNxIV video interface offers a wide range of analog and digital I/O to plug into diverse media productions. It works with a broad spectrum of Avid and third-party video editing, audio, visual effects and graphics software.

• The MediaCentral Panel for Media Composer within the Media Composer user interface enables users to see media outside of their active project as well as drag and drop assets from MediaCentral directly into any Media Composer project, bin or sequence.

• Avid Nexis Pro now scales to 160 terabytes – twice its previous capacity – to give small post facilities the ease-of-use, security and performance advantages that larger Avid Nexis customers have access to. Avid Nexis E2 now supports SSD drives to deliver the extreme performance required when working with multiple streams of ultra-high-resolution media in real time. Additionally, Avid Nexis Enterprise now leverages 100 terabyte media packs to scale up to 4.8 petabytes.

Adobe intros updates to Creative Cloud, including Team Projects

Later this year, Adobe will be offering new capabilities within its Adobe Creative Cloud video tools and services. This includes updates for VR/360, animation, motion graphics, editing, collaboration and Adobe Stock. Many of these features are powered by Adobe Sensei, the company’s artificial intelligence and machine learning framework. Adobe will preview these advancements at IBC.

The new capabilities coming later this year to Adobe Creative Cloud for video include:
• Access to motion graphics templates in Adobe Stock and through Creative Cloud Libraries, as well as usability improvements to the Essential Graphics panel in Premiere Pro, including responsive design options for preserving spatial and temporal relationships.
• Character Animator 1.0 with changes to core and custom animation functions, such as pose-to-pose blending, new physics behaviors and visual puppet controls. Adobe Sensei will help improve lip-sync capability by accurately matching mouth shape with spoken sounds.
• Virtual reality video creation with a dedicated viewing environment in Premiere Pro. Editors can experience the deeply engaging qualities of content, review their timeline and use keyboard-driven editing for trimming and markers while wearing the same VR head-mounted displays as their audience. In addition, audio can be determined by orientation or position and exported as ambisonics audio for VR-enabled platforms such as YouTube and Facebook. VR effects and transitions are now native and accelerated via the Mercury playback engine.
• Improved collaborative workflows with Team Projects on the local area network, including managed access features that allow users to lock bins and provide read-only access to others. Formerly in beta, the release of Team Projects will offer smoother workflows hosted in Creative Cloud and the ability to more easily manage versions with auto-save history.
• Flexible session organization with multi-take workflows and continuous playback while editing in Adobe Audition. Powered by Adobe Sensei, auto-ducking has been added to the Essential Sound panel, automatically adjusting levels by type: dialogue, background sound or music.

Integration with Adobe Stock
Adobe Stock is now offering over 90 million assets, including photos, illustrations and vectors. Customers now have access to over 4 million HD and 4K Adobe Stock video clips directly within their Creative Cloud video workflows and can search and scrub assets in Premiere Pro.

Coming to this new release are hundreds of professionally created motion graphics templates for Adobe Stock, available later this year. Additionally, motion graphics artists will be able to sell motion graphics templates for Premiere Pro through Adobe Stock. Earlier this year, Adobe added editorial and premium collections from Reuters, USA Today Sports, Stocksy and 500px.

LumaForge offering support for shared projects in Adobe Premiere

LumaForge, which designs and sells high-performance servers and shared storage appliances for video workflows, will be at IBC this year showing full support for new collaboration features in Adobe Premiere Pro CC. When combined with LumaForge’s Jellyfish or ShareStation post production servers, the new Adobe features — including multiple open projects and project locking — allow production groups and video editors to work more effectively with shared projects and assets. This is something that feature film and TV editors have been asking for from Adobe.

Project locking allows multiple users to work with the same content. In a narrative workflow, an editing team can divide their film into shared projects per reel or scene. An assistant editor can get to work synchronizing and logging one scene, while the editor begins assembling another. Once the assistant editor is finished with their scene, the editor can refresh their copy of the scene’s Shared Project and immediately see the changes.

An added benefit of using Shared Projects on productions with large amounts of footage is the significantly reduced load time of master projects. When a master project is broken into multiple shared project bins, footage from those shared projects is only loaded once that shared project is opened.

“Adobe Premiere Pro facilitates a broad range of editorial collaboration scenarios,” says Sue Skidmore, partner relations for Adobe Professional Video. “The LumaForge Jellyfish shared storage solution complements and supports them well.”

All LumaForge Jellyfish and LumaForge ShareStation servers will support the Premiere Pro CC collaboration features for both macOS and Windows users, connecting over 10Gb Ethernet.


Sony Imageworks’ VFX work on Spider-Man: Homecoming

By Daniel Restuccio

With Sony’s Spider-Man: Homecoming getting ready to release digitally on September 26 and on 4K Ultra HD/Blu-ray, Blu-ray 3D, Blu-ray and DVD on October 17, we thought this was a great opportunity to talk about some of the film’s VFX.

Sony Imageworks has worked on every single Spider-Man movie in some capacity since the 2002 Sam Raimi version. On Spider-Man: Homecoming, Imageworks worked mostly on the “third act,” which encompasses the warehouse, hijacked plane and beach destruction scenes. This meant delivering over 500 VFX shots, created by a team of over 30 artists and compositors (at one point peaking at 200), and rendering out finished scenes at 2K.

All of the Imageworks artists used Dell R7910 workstations with 24-core Intel Xeon E5-2620 CPUs, 64GB of memory and Nvidia Quadro P5000 graphics cards. They used cineSync for client reviews, and internally they used their in-house Itview software. Rendering was handled by SPI Arnold (not the commercial version) and their custom shading system. Software used included Autodesk Maya 2015, Foundry’s Nuke X 10.0 and Side Effects Houdini 15.5. They avoided plug-ins so that their auto-vend (breaking comps into layers for the 3D conversion process) would be as smooth as possible. Everything was rendered internally on their on-premises renderfarm. They also used a modified Kinect-based scanning technique that allowed their artists to do performance capture on themselves and rapidly prototype ideas and generate reference.

We sat down with Sony Imageworks VFX supervisor Theo Bailek, who talks about the studio’s contribution to this latest Spidey film.

You worked on The Amazing Spider-Man in 2012 and The Amazing Spider-Man 2 in 2014. From a visual effects standpoint, what was different?
You know, not a lot. Most of the changes have been iterative improvements. We used many of the same technologies that we developed on the first few movies. How we do our city environments is a specific example of how we build off of our previous assets and techniques, leveraging off the library of buildings and props. As the machines get faster and the software more refined, it allows our artists increased iterations. This alone gave our team a big advantage over the workflows from five years earlier. As the software and pipeline here at Sony has gotten more accessible, it has allowed us to more quickly integrate new artists.

It’s a lot of very small, incremental improvements along the way. The biggest technological change between now and the early Spider-Man films is our rendering technology. We use a more physically accurate incarnation of our global illumination Arnold renderer. As the shaders and rendering algorithms become more naturalistic, we’re able to conform our assets and workflows. In the end, this translates to a more realistic image out of the box.

The biggest thing on this movie was the inclusion of Spider-Man in the Marvel Universe: a different take on this film and how they wanted it to go. That would probably be the biggest difference.

Did you work directly with director Jon Watts, or did you work with production VFX supervisor Janek Sirrs in terms of the direction on the VFX?
During the shooting of the film I had the advantage of working directly with both Janek and Jon. The entire creative team pushed for open collaboration, and Janek was very supportive toward this goal. He would encourage and facilitate interaction with both the director and Tom Holland (who played Spider-Man) whenever possible. Everything moved so quickly on set that oftentimes if you waited to suggest an idea you’d lose the chance, as they would have to set up for the next scene.

The sooner Janek could get his vendor supervisors comfortable interacting, the bigger our contributions. While on set I often had the opportunity to bring our asset work and designs directly to Jon for feedback. There were times on set when we’d iterate on a design three or four times over the span of the day. Getting this type of realtime feedback was amazing. Once post work began, most of our reviews were directly with Janek.

When you had that first meeting about the tone of the movie, what was Jon’s vision? What did he want to accomplish in this movie?
Early on, it was communicated from him through Janek. It was described as, “This is sort of like a John Hughes, Ferris Bueller’s take on Spider-Man. Being a teenager, he’s not meant to be fully in control of his powers or the responsibility that comes with them. This translates to not always being super-confident or proficient in his maneuvers.” That was the basis of it.

Their goal was a more playful, relatable character. We accomplished this by being more conservative in our performances, of what Spider-Man was capable of doing. Yes, he has heightened abilities, but we never wanted every landing and jump to be perfect. Even superheroes have off days, especially teenage ones.

This being part of the Marvel Universe, was there a pool of common assets that all the VFX houses used?
Yes. With the Marvel movies, they’re incredibly collaborative and always use multiple vendors. We’re constantly sharing the assets. That said, there are a lot of things you just can’t share because of the different systems under the hood. Textures and models are easily exchanged, but how the textures are combined in the material and shaders… that makes them not reusable given the different renderers at companies. Character rigs are not reusable across vendors as facilities have very specific binding and animation tools.

It is typical to expect only models, textures, base joint locations and finished turntable renders for reference when sending or receiving character assets. As an example, we were able to leverage, to some extent, the Avengers Tower model we received from ILM. We also supplied our Toomes costume model and our Spider-Man character and costume models to other vendors.

The scan data of Tom Holland, was it a 3D body scan of him or was there any motion capture?
Multiple sessions were done through the production process. A large volume of stunts and test footage were shot with Tom before filming that proved to be invaluable to our team. He’s incredibly athletic and can do a lot of his own stunts, so the mocap takes we came away with were often directly usable. Given that Tom could do backflips and somersaults in the air we were able to use this footage as a reference for how to instruct our animators later on down the road.
Toward the later-end of filming we did a second capture session, focusing on the shots we wanted to acquire using specific mocap performances. Then again several months later, we followed up with a third mocap session to get any new performances required as the edit solidified.

As we were trying to create a signature performance that felt like Tom Holland, we exclusively stuck to his performances whenever possible. On rare occasions when the stunt was too dangerous, a stuntman was used. Other times we resorted to using our own in-house method of performance capture using a modified Xbox Kinect system to record our own animators as they acted out performances.

In the end performance capture accounted for roughly 30% of the character animation of Spider-Man and Vulture in our shots, with the remaining 70% being completed using traditional key-framed methods.

How did you approach the fresh take on this iconic film franchise?
It was clear from our first meeting with the filmmakers that Spider-Man in this film was intended to be a more relatable and light-hearted take on the genre. Yes, we wanted to take the characters and their stories seriously, but not at the expense of having fun with Peter Parker along the way.

For us that meant that despite Spider-Man’s enhanced abilities, how we displayed those abilities on screen needed to always feel grounded in realism. If we faltered on this goal, we ran the risk of eroding the sense of peril and therefore any empathy toward the characters.

When you’re animating a superhero it’s not easy to keep the action relatable. When your characters possess abilities that you never see in the real world, it’s a very thin line between something that looks amazing and something that is amazingly silly and unrealistic. Over-extend the performances and you blow the illusion. Given that Peter Parker is a teenager and he’s coming to grips with the responsibilities and limits of his abilities, we really tried to key into the performances from Tom Holland for guidance.

The first tool at our disposal and the most direct representation of Tom as Spider-Man was, of course, motion capture of his performances. On three separate occasions we recorded Tom running through stunts and other generic motions. For the more dangerous stunts, wires and a stuntman were employed as we pushed the limit of what could be recorded. Even though the cables allowed us to record huge leaps, you couldn’t easily disguise the augmented feel to the actor’s weight and motion. Even so, every session provided us with amazing reference.

Though the bulk of the shots were keyframed, the work was always informed by reference. We looked at everything that was remotely relevant for inspiration. For example, we have a scene in the warehouse where the Vulture’s wings are racing toward you as Spider-Man leaps into the air, stepping on the top of the wings before flipping to avoid the attack. We found this amazing reference of people who leap over cars racing in excess of 70mph. It’s absurdly dangerous and hard to justify why someone would attempt a stunt like that, and yet it was the perfect inspiration for our shot.

In trying to keep the performances grounded and stay true to the goals of the filmmakers, we also found it was always better to err on the side of simplicity when possible. Typically, when animating a character, you look for opportunities to create strong silhouettes so the actions read clearly, but we tended to ignore these rules in favor of keeping everything dirty and with an unscripted feel. We let his legs cross over and knees knock together. Our animation supervisor, Richard Smith, pushed our team to follow the guidelines of “economy of motion.” If Spider-Man needed to get from point A to B he’d take the shortest route — there’s no time to strike an iconic pose in-between!


Let’s talk a little bit about the third act. You had previsualizations from The Third Floor?
Right. All three of the main sequences we worked on in the third act had extensive previs completed before filming began. Janek worked extremely closely with The Third Floor and the director throughout the entire process of the film. In addition, Imageworks was tapped to help come up with ideas and takes. From early on it was a very collaborative effort on the part of the whole production.
The previs for the warehouse sequence was immensely helpful in the planning of the shoot. Given we were filming on location and the VFX shots would largely rely on carefully choreographed plate photography and practical effects, everything had to be planned ahead of time. In the end, the previs for that sequence resembled the final shots in most cases.

The digital performances of our CG Spider-Man varied at times, but the pacing and spirit remained true to the previs. As our plane battle sequence was almost entirely CG, the previs stage was more of an ongoing process for this section. Given that we weren’t locked into plates for the action, the filmmakers were free to iterate and refine ideas well past the time of filming. In addition to The Third Floor’s previs, Imageworks’ internal animation team also contributed heavily to the ideas that eventually formed the sequence.

For the beach battle, we had a mix of plate and all-CG shots. Here the previs was invaluable once again in informing the shoot and subsequent reshoots later on. As there were several all-CG beats to the fight, we again had sections where we continued to refine and experiment till late into post. As with the plane battle, Imageworks’ internal team contributed extensively to pre and postvis of this sequence.

One scene you mentioned — the fight in the warehouse — the production notes say it was inspired by an actual scene from the comic The Amazing Spider-Man #33.
Yes, in our warehouse sequence there is a series of shots that are directly inspired by the comic book’s panels. Different circumstances in the comic and our sequence lead to Spider-Man being trapped under debris. However, Tom’s performance and the camera angles that were shot pay homage to the comic as he escapes. As a side note, many of those shots were added later in the production and filmed as reshoots.

What sort of CG enhancements did you bring to that scene?
For the warehouse sequence, we added a digital Spider-Man, the Vulture wings and CG destruction, enhanced the practical effects, and extended or repaired the plate as needed. The columns that the Vulture wings slice through as they circle Spider-Man were practically destroyed with small detonation charges. These explosives were rigged within cement that encased the actual warehouse’s steel girder columns. They had fans on set that were used to help mimic interaction from the turbulence that would be present from a flying wingsuit powered by turbines. These practical effects were immensely helpful for our effects artists, as they provided the best-possible in-camera reference. We kept much of what was filmed, adding our fully reactive FX on top to help tie it into the motion of our CG wings.

There’s quite a bit of destruction when the Vulture wings blast through walls as well. For those shots we relied entirely on CG rigid body dynamic simulations, as filming the destruction would have been prohibitive and unreliable. Though most of the shots in this sequence had photographed plates, there were still a few that required the background to be generated in CG. One shot in particular, with Spider-Man sliding back and rising up, stands out. As the shot was conceived later in the production, there was no footage for us to use as our main plate. We did, however, have many tiles shot of the environment, which we were able to use to quickly reconstruct the entire set in CG.

I was particularly proud of our team for their work on the warehouse sequence. The quality of our CG performances and the look of the rendering is difficult to discern from the live action. Even the rare all-CG shots blended seamlessly between scenes.

When you were looking at that ending plane scene, what sort of challenges were there?
Since over 90 shots within the plane sequence were entirely CG we faced many challenges, for sure. With such a large number of shots without the typical constraints that practical plates impose, we knew a turnkey pipeline was needed. There just wouldn’t be time to have custom workflows for each shot type. This was something Janek, our client-side VFX supervisor, stressed from the onset, “show early, show often and be prepared to change constantly!”

To accomplish this, a balance of 3D and 2D techniques was developed to make the shot production as flexible as possible. Using the 3D abilities of Nuke, our compositing software, we were able to offload significant portions of the shot production into the compositors’ hands. For example, the city ground plane you see through the clouds, the projections of the imagery on the plane’s cloaking LEDs and the damaged, flickering LEDs were all techniques done in the composite.

A unique challenge to the sequence that stands out is definitely the cloaking. Making an invisible jet was only half of the equation. The LEDs that made up the basis for the effect also needed to be able to illuminate our characters. This was true for wide and extreme close-up shots. We’re talking about millions of tiny light sources, which is a particularly expensive rendering problem to tackle. Mix in the fact that the design of these flashing light sources is highly subjective and thus prone to needing many revisions to get the look right.

Painting control texture maps for the location of these LEDs wouldn’t be feasible for the detail needed on our extreme close-up shots. Modeling them in would have been prohibitive as well, resulting in excessive geometric complexity. Instead, using Houdini, our effects software, we built algorithms to automate the distribution of point clouds of data to intelligently represent each LED position. This technique could be reprocessed as necessary without incurring the large amounts of time a texture or model solution would have required. As the plane base model often needed adjustments to accommodate design or performance changes, this was a real factor. The point cloud data was then used by our rendering software to instance geometric approximations of inset LED compartments on the surface.
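As a rough illustration of the approach described above (procedurally distributing a point cloud over a surface so that each point can later drive an instanced LED compartment), here is a minimal Python sketch. This is not Imageworks’ Houdini setup; the function name, density parameter and NumPy-based approach are assumptions for illustration only:

```python
import numpy as np

def scatter_points_on_mesh(vertices, triangles, points_per_unit_area, seed=0):
    """Distribute points uniformly over a triangle mesh, weighted by area.

    vertices:  (V, 3) float array of vertex positions
    triangles: (T, 3) int array of vertex indices per triangle
    Returns an (N, 3) array of positions -- one candidate LED site each.
    """
    rng = np.random.default_rng(seed)
    v0, v1, v2 = (vertices[triangles[:, i]] for i in range(3))

    # Triangle areas via the cross product; they weight the sampling.
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    total_area = areas.sum()
    n_points = int(total_area * points_per_unit_area)

    # Pick triangles proportionally to area, then sample barycentric coords.
    tri = rng.choice(len(triangles), size=n_points, p=areas / total_area)
    u, v = rng.random(n_points), rng.random(n_points)
    flip = u + v > 1.0              # fold samples back inside the triangle
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    w = 1.0 - u - v

    return w[:, None] * v0[tri] + u[:, None] * v1[tri] + v[:, None] * v2[tri]
```

Because the cloud is generated procedurally, it can be regenerated in moments whenever the base plane model changes, which is exactly the flexibility a painted texture or modeled solution would have lacked.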

Interestingly, this was a technique we adopted from the rendering technology we use to create room interiors for our CG city buildings. When rendering large CG buildings we can’t afford to model the hundreds and sometimes thousands of rooms you see through the office windows. Instead of modeling that complex geometry, we procedurally generate small inset boxes for each window that have randomized pictures of different rooms. This is the same underlying technology we used to create the millions of highly detailed LEDs on our plane.
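In the same spirit, here is a toy sketch of the window-interior idea (illustrative Python only; the data layout, box depth and picture-pool size are assumptions, not Imageworks’ actual implementation):

```python
import random

def window_interiors(windows, inset_depth=3.0, num_room_pictures=32, seed=7):
    """Replace modeled rooms with one procedural inset box per window.

    windows: list of (origin, width_vec, height_vec) tuples, one tuple per
             window rectangle on the building facade.
    Returns lightweight instancing records that a renderer could expand at
    render time -- no heavy room geometry ever exists as a modeled asset.
    """
    rng = random.Random(seed)
    return [
        {
            "origin": origin,          # lower-left corner of the window
            "width_vec": width_vec,    # spans the window horizontally
            "height_vec": height_vec,  # spans the window vertically
            "depth": inset_depth,      # how far the fake room recedes
            "room_picture": rng.randrange(num_room_pictures),  # randomized interior
        }
        for origin, width_vec, height_vec in windows
    ]
```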

First our lighters supplied base renders for our compositors to work with inside of Nuke. The compositors quickly animated flashing damage to the LEDs by projecting animated imagery on the plane using Nuke’s 3D capabilities. Once we got buyoff on the animation of the imagery, we’d pass this work back to the lighters as 2D layers that could be used as texture maps for our LED lights in the renderer. These images would instruct each LED when it was on and what color it needed to be. This back-and-forth technique allowed us to more rapidly iterate on the look of the LEDs in 2D before committing and submitting final 3D renders that would have all of the expensive interactive lighting.
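Conceptually, the 2D layers act as lookup tables: each LED’s surface position maps to a UV coordinate in the comp-painted image, which is sampled once per frame to decide whether that LED is lit and in what color. A minimal sketch of that lookup, assuming frames arrive as NumPy arrays and the LED UVs were exported alongside the point cloud:

```python
import numpy as np

def led_colors_for_frame(frame, led_uvs):
    """Sample an animated control image to drive per-LED emission.

    frame:   (H, W, 3) float array -- one frame of the comp-painted LED map
    led_uvs: (N, 2) float array of each LED's UV position on the plane
    Returns an (N, 3) array of RGB emission colors; black means "off".
    """
    h, w, _ = frame.shape
    # Nearest-neighbor lookup: UV in [0, 1] -> integer pixel coordinates.
    xs = np.clip((led_uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    ys = np.clip(((1.0 - led_uvs[:, 1]) * (h - 1)).astype(int), 0, h - 1)
    return frame[ys, xs]
```

Everything upstream of this lookup stays a fast 2D iteration in Nuke; only the final pass pays the cost of millions of instanced, interactive light sources.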

Is that a proprietary system?
Yes, this is a shading system that was actually developed for our earlier Spider-Man films back when we used RenderMan. It has since been ported to work in our proprietary version of Arnold, our current renderer.

Broadway Video’s Sue Pelino and team win Emmy

Sue Pelino and the sound mixing team at New York City’s Broadway Video have won the Emmy for Outstanding Sound Mixing for a Variety Series or Special for their work on the 2017 Rock & Roll Hall of Fame Induction Ceremony, which aired on HBO in April. Pelino served as re-recording mixer on the project.

Says Pelino, who is VP of audio post production at Broadway Video, “Our goal in preparing the televised package was to capture the true essence of the night. We wanted viewers to experience the energy and feel as if they were sitting in the tenth row of the Barclays Center. It’s a remarkable feeling to know that we have achieved that goal.”

Pelino is already the proud owner of two Emmy awards and has nine nominations under her belt. Her career as an audio post production engineer is rooted in her early years playing guitar in rock bands and recording original songs in her home studio.

Additional members of the winning sound team for the 2017 Rock & Roll Hall of Fame Induction Ceremony — produced by HBO Entertainment in association with Playtone, Line by Line Productions, Alex Coletti Productions and the Rock & Roll Hall of Fame Foundation — include Al Centrella, John Harris, Dave Natale, Jay Vicari, Erik Von Ranson and Simon Welch.

Behind the Title: Park Road Post’s Anthony Pratt

NAME: Anthony Pratt

COMPANY: Park Road Post Production

CAN YOU DESCRIBE YOUR COMPANY?
Park Road is a bespoke post production facility and part of the Weta Group of Companies, based on the Miramar Peninsula in Wellington, New Zealand.

We are internationally recognized for our award-winning sound and picture finishing for TV and film. We walk alongside all kinds of storytellers, supporting them from shoot through to final delivery.

WHAT’S YOUR JOB TITLE?
Workflow Architect — Picture

WHAT DOES THAT ENTAIL?
I get to think about how we can work with a production to wrap process and people around a project, all with a view of achieving the best result at the end of that process. It’s about taking a step back and challenging our current view while thinking about what’s next.

We spend a lot of time working with the camera department and dailies team, and integrating their work with editorial and VFX. I work alongside our brilliant director of engineering for the picture department, and our equally skilled systems technology team — they make me look good!

From a business development perspective, I try to integrate the platforms and technologies we advance into new opportunities for Park Road as a whole.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I quite like outreach around the company and the group, so presenting and sharing is fun — and it’s certainly not always directly related to the work in the picture department. Our relationships with film festivals, symposia, the local industry guilds and WIFT always excite me.

WHAT’S YOUR FAVORITE PART OF THE JOB?
My favorite time of all is when we get to see our clients work in a cinema with an audience for the first time — then the story is really real.

It’s great when our team is actively engaged as a creative partner, especially during the production phase. I enjoy learning from our technical team alongside our creative folk, and there’s always something to learn.

We have fantastic coffee and baristas; I get to help QC that throughout my day!

WHAT’S YOUR LEAST FAVORITE?
It’s always hard when a really fantastic story we’ve helped plan for isn’t greenlit. That’s the industry, of course, but there are some stories we really want to see told! Like everyone, there are a few Monday mornings that really need to start later.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I get a huge kick on the days we get to sign off the final DCP for a theatrical release. It’s always inspiring seeing all that hard work come together in our cinema.

I am also particularly fond of summer days where we can get away from the facility for a half hour and eat lunch on a beach somewhere with the crew — in Miramar a beach is only 10 minutes away.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d be building my own business and making my own work — if it wasn’t strictly film related it would still be narrative — and I’m always playing with technology, so no doubt I’d be asking questions about what that meant from a lived perspective, regardless of the medium. I’d quite probably be distilling a bit more gin as well!

WHY DID YOU CHOOSE THIS PROFESSION? HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I think it kind of chose me in the end… I’ve always loved the movies and experimented with work in various media types from print and theatre through animation and interactivity — there was always a technology overtone — before landing where I needed to be: in cinema.

I came to high-end film post somewhat obliquely, having built an early tapeless TV pipeline; I was able to bring that comfort with digital acquisition to an industry transitioning from 35mm in the mid 2000s.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’m profoundly privileged to work for a company owned by Peter Jackson, and I have worked on every project of his since The Lovely Bones. We are working on Christian Rivers’ Mortal Engines at present. We recently supported the wonderful Jess Hall shooting on the Alexa 65 for Ghost in the Shell. He’s a really smart DOP.

I really enjoy our offshore clients. As well as the work we do with our friends in the USA, we’ve done some really great work recently with clients in China and the Middle East. Cultural fusion is exhilarating.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
We worked with director Geoff Murphy to restore and revisit his seminal 1983 New Zealand feature Utu as UTU Redux, which was the opening night feature for the 2013 NZ International Film Festival. It was incredibly good fun, was honorable and is a true taonga (treasure) in our national narrative.

A Park Road Mistika grading suite.

The Hobbit films were a big chunk of the last decade for us, and our team was recognized with multiple awards. The partnerships we built with SGO, Quantum, Red and Factorial are strong to this day. I was very fortunate to collect some of those awards on our team’s behalf, and was delighted to have that honor.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I rely on clean water and modern medicine to help keep myself, and our wider community, safe from harm. And I am really conscious that to keep that progress moving forward we’re going to have to shepherd our natural world one hell of a lot better.

Powerful computing and fast Internet transformed not only our work, but time and distance for me. I’ve learned more about film, music and art because of the platforms running without friction on the Internet than I would have dared dream in the ‘90s.

I hold in my hand a mobile access point that can not only access a mind-bogglingly large world of knowledge and media, but can also dynamically represent that information for my benefit and allow me to acknowledge the value of trust in that connection — there’s hope for us in the very large being accessible by way of the very small.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I kind of abandoned Facebook a few years ago, but Film Twitter has some amazing writers and cinematographers represented. I tend to be a bit of a lurker most other places — sometimes the most instructive exercise is to observe! Our private company Slack channels supplement the rest of my social media time.

To be honest, most of our world is respectfully private, but I do follow @ParkRoadPost on Instagram and Twitter.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Our team has a very broad range of musical tastes, and we tend to try and share that with each other… and there is always Radiohead. I have a not-so-secret love of romantic classical music and lush film scores. My boss and I agree very much on what rock (and a little alt-country) should sound like, so there’s a fair bit of that!

When my headphones are on there is sometimes old-school liquid or downbeat electronica, but mostly I am listening to the best deep house that Berlin and Hamburg have to offer while I work.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
A quick walk around the peninsula is a pretty good way of chilling out, especially if it’s dusk or dawn — then I can watch some penguins on the rocks while ships come in and out of the harbor!

My family (including the furry ones!) are incredible, and they help provide perspective in all things.

Wellington is the craft beer capital of New Zealand, so there’s always an opportunity for some food and an interesting drop from Garage Project (or Liberty brewing out of Auckland) with mates in town. I try and hang out with a bunch of my non-industry friends every month or so — those nights are definitely my favorite for music, and are good for my soul!

ATTO intros quad-port version of 32Gb Fibre line of HBAs

ATTO Technology has added a new quad-port host bus adapter (HBA) to its Fibre Channel portfolio. The ATTO Celerity 32Gb Gen 6 FC-324E HBA will enable companies to use their existing storage area network infrastructure and address the growing need for high-performing, scalable and secure storage. Celerity is intended to support exponential data growth in applications such as 4K/8K editing, high-performance computing and data warehousing, along with the proliferation of virtualized servers and flash arrays.

According to ATTO, the 32Gb HBAs support data throughput of 3,200 MB/s per channel, maximizing the number of virtual machines per physical server. With 16 PCIe bus connections and four 32 Gb/s Fibre Channel ports, the FC-324E eliminates the bottlenecks created by I/O data-intensive applications.

With data centers moving to all-flash arrays, there’s a need to drive greater performance to more solid-state drives. Having four Fibre Channel ports in a single PCIe slot ensures high-density connectivity at the highest available performance, up to 12.8 GB/s of aggregate throughput, making it well suited for environments that rely on next-generation, flash-based storage.
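(For context, that figure is simple arithmetic, assuming near-linear scaling across ports: four ports at 3,200 MB/s each gives 4 × 3,200 = 12,800 MB/s, or roughly 12.8 GB/s, out of a single PCIe slot.)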

Celerity 32Gb HBAs also make it possible to increase the distance between servers and storage. Because they support more data in flight, users can extend their connection up to 10 kilometers without degrading throughput in demanding long-distance applications, such as a stretch cluster.

The ATTO Gen 6 line includes Celerity 32Gb and 16Gb HBAs in low-profile single- and dual-port versions and now a full-height quad-port version. All versions are backward-compatible and take advantage of advancements in reliability and forward error correction to improve network performance and resiliency.

Celerity 32Gb Gen 6 quad-port FC-324E HBAs are available now.

VFX Roundtable: Trends and inspiration

By Randi Altman

The world of visual effects is ever-changing, and the speed at which artists are being asked to create new worlds, or to make things invisible, keeps increasing. How do visual effects artists (and studios) prepare for these challenges, and what inspired them to get into this business? We reached out to a small group of visual effects pros working in television, commercials and feature films to find out how they work and what gets their creative juices flowing.

Let’s find out what they had to say…

KEVIN BAILLIE, CEO, ATOMIC FICTION
What do you wish clients would know before jumping into a VFX-heavy project?
The core thing for every filmmaking team to recognize is that VFX isn’t a “post process.” Careful advance planning and a tight relationship between the director, production designer, stunt team and cinematographer will yield a far superior result much more cost effectively.

In the best-looking and best-managed productions I’ve ever been a part of, the VFX team is the first department to be brought onto the show and the last one off. It truly acts as a partner in the filmmaking process. After all, once the VFX post phase starts, it’s effectively a continuation of production — with there being a digital corollary to every single department on set, from painters to construction to costume!

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
The move to cloud computing is one of the most exciting trends in VFX. The cloud is letting smaller teams do much bigger work, allowing bigger teams to do things that have never been seen before, and will ultimately result in compute resources no longer being a constraint on the creative process.

Cloud computing allowed Atomic Fiction to play alongside the most prestigious companies in the world, even when we were just 20 people. That capability has allowed us to grow to over 200 people, and now we’re able to take the lead vendor position on A-list shows. It’s remarkable what dynamic and large-scale infrastructure in the cloud has enabled Atomic to accomplish.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I grew up in Seattle and started dabbling in 3D as a hobby when I was 14 years old, having been immensely inspired by Jurassic Park. Soon thereafter, I started working at Microsoft in the afternoons, developing visual content to demonstrate their upcoming technologies. I was fortunate enough to land a job with Lucasfilm right after graduating high school, which was 20 years ago at this point! I’ve been lucky enough to work with many of the directors that inspired me as a child, such as George Lucas and Robert Zemeckis, and modern pioneers like JJ Abrams.

Looking back on my career so far, I truly feel like I’ve been living the dream. I can’t wait for what’s next in this exciting, ever-changing business.

ROB LEGATO, OSCAR-WINNING VFX SUPERVISOR, SECOND UNIT DIRECTOR, SECOND UNIT DIRECTOR OF PHOTOGRAPHY
What do you wish clients would know before jumping into a VFX-heavy project?
It takes a good bit of time to come up with a plan that will ensure a sustainable attack when making the film. They need to ask someone in authority, “What does it take to do it?” and then make a reasonable plan. Everyone wants to do a great job all the time, and if they could maneuver the schedule — even with the same timeframe — it could be a much less frustrating job.

It happens time and time again: someone comes up with a budget and a schedule that don’t really fit the task and forces you to live with them. That makes for a very difficult assignment that gets done because of the hard work of the people who are in the trenches.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
For me, it’s how realistic you can make something. The rendering capabilities — like what we did on Jungle Book with the animals — are so sophisticated that they fool your eye into believing it’s real. Once you do that you’ve opened the magic door that allows you to do anything with a tremendous amount of fidelity. You can make good movies without it being a special-venue movie or a VFX movie. The computer power and rendering abilities — along with the incredible artistic talent pool that we have created over the years — are very impressive, especially for me, coming from a more traditional camera background. I tended to shy away from computer-generated things because they never had the authenticity you would have wanted.

Then there is the happy accident of shooting something, where an angle you wouldn’t have considered appears as you look through the camera; now you can do that in the computer, which I find infinitely fascinating. This is where all the virtual cinematography things I’ve done in the past come in to help create that happy accident.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been working in VFX since about 1984. Visual effects wasn’t my dream. I wanted to make movies: direct, shoot and be a cameraman and editor. I fell into it and then used it as an avenue to allow me to create sequences in films and commercials.

The reason you go to movies is to see something you have never seen before, and for me that was Close Encounters. The first time I saw the mothership in Close Encounters, it wasn’t just an effect, it became an art form. It was beautifully realized and it made the story. Blade Runner was another where it’s no longer a visual effect, it’s filmmaking as an art form.

There was also my deep appreciation for Doug Trumbull, whose quality of work was so high it transcended being a visual effect or a photographic effect.

LISA MAHER, VP OF PRODUCTION, SHADE VFX 
What do you wish clients would know before jumping into a VFX-heavy project?
That it’s less expensive in the end to have a VFX representative involved on the project from the get-go, just like all the other filmmaking craftspeople who are represented. It’s getting better all the time though, and we are definitely being brought on board earlier these days.

At Shade we specialize in invisible or supporting VFX. So-called invisible effects are often much harder to pull off. It’s all about integrating digital elements that support the story but don’t pull the audience out of a scene. Being able to assist in the planning stages of a difficult VFX sequence often results in the filmmakers achieving what they envisioned more readily. It also helps tremendously to keep the costs in line with what was originally budgeted. It also goes without saying that it makes for happier VFX artists as they receive photography captured with their best interests in mind.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
I would say the most exciting development affecting visual effects is the explosion of opportunities offered by the OTT content providers such as Netflix, Amazon, HBO and Hulu. Shade primarily served the feature film market up to three years ago, but with the expanding needs of television, our offices in Los Angeles and New York are now evenly split between film and TV work.

We often find that the film work is still being done at the good old reliable 2K resolution while our TV shows are always 4K plus. The quality and diversity of projects being produced for TV now make visual effects a much more buoyant enterprise for a mid-sized company and also a real source of employment for VFX professionals who were previously so dependent on big studio generated features.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been working in visual effects for close to 20 years now. I grew up in Ireland; as a child, the world of film, and especially images of sunny California, was always a huge draw for me. They helped me survive the many grey and rainy days of the Irish climate. I can’t point to one project that inspired me to get into filmmaking — there have been so many — just a general love for storytelling, I guess. Films like Westworld (the 1973 version), Silent Running, Cinema Paradiso, Close Encounters of the Third Kind, Blade Runner and, of course, the original Star Wars were truly inspirational.

DAVID SHELDON-HICKS, CO-FOUNDER/EXECUTIVE CREATIVE DIRECTOR, TERRITORY STUDIO
What do you wish clients would know before jumping into a VFX-heavy project?
The craft and care and love that go into VFX are often forgotten in the “business” of it all. As a design-led studio that straddles art and VFX departments in our screen graphics and VFX work, we prefer to work with the director from the preproduction phase. This ensures that all aspects of our work are integrated into story and world building.

The talent and gut instinct, eye for composition and lighting, appreciation of form, choreography of movement and, most notably, appreciation of the classics are so pertinent to the art of VFX, and are undersold in conversations about shot counts, pipelines, bidding and numbers of artists. Bringing the filmmakers into the creative process has to be the way forward for an art form still finding its own voice.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
The level of concept art and postviz coming through from VFX studios is quite staggering. It gets back to my point above about bringing filmmakers and VFX artists into dialogue, concentrated on world building and narrative expansion. It’s so exciting to see concept art and postviz getting to a new level of sophistication and influence in the filmmaking process.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I have been working professionally in VFX for over 15 years. My love of VFX and creativity in general came from the moment I picked up a pencil and imagined new possibilities. But once I cut my film teeth designing screen graphics on Casino Royale, followed by The Dark Knight, I left my freelance days behind and co-founded Territory Studio. Our first film as a studio was Prometheus, and working with Ridley Scott was a formative experience that has influenced our design-led approach to motion graphics and VFX, which has established us in the industry and seen the studio grow and expand.

MARK BREAKSPEAR, VFX SUPERVISOR, SONY PICTURES IMAGEWORKS
What do you wish clients would know before jumping into a VFX-heavy project?

Firstly, I think the clients I have worked with have always been extremely cognizant of the key areas affecting VFX-heavy projects and consequently have built frameworks that help plan and execute these mammoth shows successfully.

Ironically, it’s the smaller shows that sometimes have the surprising “gotchas” in them. The big shows come with built-in checks and balances in the form of experienced people who are looking out for the best interests of the project and how to navigate the many pitfalls that can make the VFX costs increase.

Smaller shows sometimes don’t allow enough discussion and planning time for the VFX components in pre-production, which could result in the photography not being captured as well as it could have been. Everything goes wrong from there.

So, when I approach any show, I always look for the shots that are going to be underestimated and try to give them the attention they need to succeed. You can get taken out of a movie by a bad driving comp as much as you can a monster space goat biting a planet in half.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
I think there are several red herrings out there right now… the big one being VR. To me, VR is like someone has invented teleportation, but it only works on feet.

So, right now, it’s essentially useless and won’t make creating VFX any easier or make the end result any more spectacular. I would like to see VR used to aid artists working on shots. If you could comp in VR I could see that being a good way to help create more complex and visually thrilling shots. The user interface world is really the key area VR can benefit.

Suicide Squad

I do think, however, that AR is very interesting. The real world, with added layers of information, is a hugely powerful prospect. Imagine looking at a building in any city of the world, and the apartments for sale in it are highlighted in realtime, with facts like cost, square footage, etc. all right there in front of you.

How does AR benefit VFX? An artist could use AR to get valuable info about shots just by looking at them. How often do we look at a shot and ask, “What lens was this?” AR could have all that metadata ready to display at any point on any shot.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been in VFX for 25 years. When I started, VFX was not really a common term. I came to this industry through the commercial world… as a compositor on TV shows and music videos. Lots of (as we would call it now) visual effects, but done in a world bereft of pipelines and huge cloud-based renderfarms.

I was never inspired by a specific project to get into the visual effects world. I was a creative kid who also liked the sciences. I liked to work out why things ticked, and also to draw them, sometimes with improvements or updates that I could imagine. It’s a common set of passions that I find in my colleagues.

I watched Star Wars and came out wondering why there were black lines around some of the space ships. Maybe there’s your answer… I was inspired by the broken parts of movies, rather than being swept up in the worlds they portrayed. After all that effort, time and energy… why did it still look wrong? How can I fix it for next time?

CHRIS HEALER, CEO/CTO/VFX SUPERVISOR, THE MOLECULE
What do you wish clients would know before jumping into a VFX-heavy project?

Plan, plan, plan… previs, storyboarding and initial design are crucial to VFX-heavy projects. The mindset should ideally be that most (or all) decisions have been made before the shoot starts, as opposed to a “we’ll figure it out in post” approach.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
Photogrammetry, image-based modeling and data capture are so much more available than ever before. Instead of an expensive Lidar rig that only produces geometry without color, there are many, many new ways to capture the color and geometry of the physical world, even using a simple smartphone or DSLR.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been doing VFX now for over 16 years. I would have to say that The Matrix (part 1) was really inspiring when I saw it the first time, and it made clear that VFX as an art form was coming and available to artists of all kinds all over the world. Prior to that, VFX was very difficult to approach for the average student with limited resources.

PAUL MARANGOS, SENIOR VFX FLAME ARTIST, HOOLIGAN
What do you wish clients would know before jumping into a VFX-heavy project?

The more involved I can be in the early stages, the more I can educate clients on all of the various effects they could use, as well as technical hurdles to watch out for. In general, I wish more clients involved the VFX guys earlier in the process — even at the concepting and storyboarding stages — because we can consult on a range of critical matters related to budgets, timelines, workflow and, of course, bringing the creative to life with the best possible quality.

Fortunately, more and more agencies realize the value of this. For instance, with a recent campaign Hooligan finished for Harvoni, we were able to plan shots for a big scene featuring hundreds of lanterns in the sky, which required lanterns of various sizes for every angle that Elma Garcia’s production team shot. With everything well storyboarded, and under the direction of Elma, who left no detail unnoticed, we managed to create a spectacular display of lantern composites for the commercial.

We were also involved early on in a campaign for MyHeritage DNA, via creative agency Berlin Cameron, featuring spoken word artist Prince Ea and directed by Jonathan Augustavo of Skunk. Devised as if projected on a wall, the motion graphics were mapped into the 3D environments.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
Of course VR and 360 live TV shows are exciting, but augmented reality is what I find particularly interesting — mixing the real world with graphics and video all around you. The interactivity of both of these emerging platforms presents an endless area of growth, as our industry is on the cusp of a sea change that hasn’t quite yet begun to directly affect my day-to-day.

Meanwhile, at Hooligan, we’re always educating ourselves on the latest software, tools and technological trends in order to prepare for the future of media and entertainment — which is wise if you want to be relevant 10 years from now. For instance, I recently attended the TED conference, where Chris Milk spoke on the birth of virtual reality as an art form. I’m also seeing advances in Google Cardboard, which is making the platform affordable, too. Seeing companies open up VR departments is an exciting step for us all, and it shows the vision for the future of advertising.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I have worked in VFX for 25 years. After initially studying fine art and graphic design, the craft aspect of visual effects really appealed to me. Seeing special effects genius Ray Harryhausen’s four-minute skeleton fight in Jason and the Argonauts was a big inspiration. He rear-projected footage of the actual actors and then combined the shots to make a realistic skeleton-Argonaut battle. It took him over four and a half months to shoot the stop-motion animation.

Main Image: Deadpool/Atomic Fiction.

The hybridization of VFX and motion design

Plus the rise of the small studio

By Miguel Lee

There has long been a dichotomy between motion graphics and VFX because they have traditionally serviced very different creative needs. However, with the democratization of tools and the migration of talent between these two pillars of the CG industry, a new “hybrid” field of content creators is emerging. And for motion designers like myself, this trend reflects the many exciting things taking place in our industry today, especially as content platforms increase at an incredible rate with smartphones and new LED technologies, not to mention a renaissance in the fields of VR, AR, and projection mapping, to name a few.

Miguel Lee

I’ve always likened the comparison of motion graphics and VFX to the Science Club and the Art Club we remember at school. VFX has its roots in an objective goal: to seamlessly integrate CG into the narrative or spectacle in a convincing and highly technical way. Motion graphics, on the other hand, can be highly subjective. One studio, for instance, might produce a broadcast package laden with 3D animations, whereas another studio will opt for a more minimal, graphical approach to communicating the same brand. A case can typically be made for either direction.

So where do the new “hybrid” studios fit into this analogy? Let’s call them the “Polymath Club,” given their abilities to tap into the proverbial hemispheres of the brain — the “left” representing their affinity for the tools, and the “right” driving the aesthetics and creative. With this “Polymath” mentality, CG artists are now able to generate work that was once only achievable by a large team of artists and technicians. Concurrently, it is influencing the hybridization of the CG industry at large, as VFX companies build motion design teams in-house, while motion graphics studios increasingly incorporate VFX tools into their own workflows.

As a result, we’ve seen a proliferation of “lean-and-mean” production studios over the last few years. Their rise is a direct result of the democratization of our industry, where content creation tools have evolved significantly in technology, accessibility and reliability. One example is the dramatic increase in rendering power delivered by third-party GPU renderers, such as Otoy’s Octane and Redshift, which have made 3D photorealism far more attainable. Cloud rendering services have also emerged for conventional and third-party renderers, mitigating the need to build out expensive renderfarms, a luxury still reserved for companies of a certain size.

Otoy’s Octane being used on one of Midnight Sherpa’s jobs.

Motion artists, too, have become far more adventurous in employing VFX-specific software like Houdini, which has become far more accessible and egalitarian without any compromise to its capability. Maxon’s Cinema 4D, the heavily favored 3D application in motion graphics, has a long tradition of efficient workflows that bridge its ecosystem to other programs. Coding and script-based animation have also found a home in the motion design repertoire as inventive, efficient ways to generate content, as in the sketch below. Even the barrier to entry for creating VR and AR content has eased considerably with the latest releases of both the Unity and Unreal engines.
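To make “script-based animation” concrete, here is a minimal, tool-agnostic sketch in Python. It is an illustration only, not any studio’s production code: the frame rate, amplitude and damping values are assumptions, and in an actual package such as Cinema 4D, After Effects or Houdini these values would be written onto a position channel or driven by an expression rather than printed.

import math

# A minimal sketch of procedural, script-based animation:
# instead of hand-setting keyframes, we compute one value per frame.
# A damped cosine wave gives an element a settling "bounce".
# All constants below are illustrative assumptions.

FPS = 24            # frames per second
DURATION_S = 2.0    # length of the move in seconds
AMPLITUDE = 120.0   # initial vertical offset, in pixels
FREQUENCY = 3.0     # oscillations per second
DAMPING = 2.5       # how quickly the bounce settles

def bounce(t):
    """Vertical offset at time t (in seconds): a damped oscillation."""
    return AMPLITUDE * math.exp(-DAMPING * t) * math.cos(2 * math.pi * FREQUENCY * t)

# Emit one keyframe per frame. In a real DCC these values would be
# assigned to an object's position channel instead of printed.
for frame in range(int(FPS * DURATION_S) + 1):
    t = frame / FPS
    print(f"frame {frame:3d}: y_offset = {bounce(t):8.2f}")

Changing a constant, or swapping in a different falloff function, retimes the entire move instantly across every frame, which is exactly the kind of efficiency the scripting approach buys a small team.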

Aside from lower overhead costs, the horizontal work structure of the “lean-and-mean” model has also cultivated truly collaborative environments, where artists from trans-disciplinary backgrounds can work together in more streamlined ways than an oversized team allows. In many cases, these smaller studios are forced to develop workflows that better reflect their team’s makeup, and these systems often enjoy more success precisely because they reflect the styles, and even the personalities, of the core teams that institute them.

The nature of being small also pushes you to innovate and develop greater efficiencies rather than simply throwing more bodies at the problem. These solutions and workflows are often baked into the core team and rolled out on future projects. Smaller studios also have a reputation for cultivating talent: junior artists and interns are put on a wider range of projects, and into more roles, out of necessity, whereas at larger studios they are typically relegated to a single role and often not afforded the opportunity to branch out. This, in turn, creates an incentive to hire artists with the intent of developing them over the long term.

There are downsides, of course, to being small, chief among them how quickly these studios reach capacity, at which point jobs have to be turned down. Still, the proliferation of small studios means more voices in the content landscape, which in turn contributes directly to the greater evolution of design as a whole.

Now that the playing field has been technologically equalized, the difference between failure and success for many of these companies lies in whether they can craft a voice that is unique among their peers in an increasingly saturated landscape.

Main Image: Audi – Photorealism is more achievable in a streamlined production pipeline.


Miguel Lee is partner/creative director at LA’s Midnight Sherpa, a boutique creative studio for brands and entertainment.

The importance of on-set VFX supervision

By Karen Maierhofer

Some contend that having a visual effects supervisor present on set during production is a luxury; others deem it a necessity. However, few, if any, see it as unnecessary.

Today, more and more VFX supes can be found alongside directors and DPs during filming, advising and problem-solving, with the goal of saving valuable time and expense during production and, later, in post.

John Kilshaw

“A VFX supervisor is on set and in pre-production to help the director and production team achieve their creative goals. By having the supervisor on set, they gain the flexibility to cope with the unexpected and allow for creative changes in scope or creative direction,” says Zoic Studios creative director John Kilshaw, a sought-after VFX supervisor known for his collaborative creative approach.

Kilshaw, who has worked at a number of top VFX studios including ILM, Method and Double Negative, has an impressive resume of features, among them The Avengers, Pirates of the Caribbean: On Stranger Tides, Mission: Impossible – Ghost Protocol and various Harry Potter films. More recently, he was visual effects supervisor for the TV series Marvel’s The Defenders and Iron Fist.

Weta Digital’s Erik Winquist (Apes trilogy, Avatar, The Hobbit: An Unexpected Journey) believes the biggest contribution a VFX supervisor can make while on set comes during prep. “Involving the VFX supervisor as early as possible can only mean fewer surprises during principal photography. This is when the important conversations are taking place between the various heads of departments. ‘Does this particular effect need to be executed with computer graphics, or is there a way to get this in-camera? Do we need to build a set for this, or would it be better for the post process to be greenscreen? Can we have practical smoke and air mortars firing debris in this shot, or is that going to mess with the visual effects that have to be added behind it later?’”

War for the Planet of the Apes via Weta Digital

According to Winquist, who is VFX supervisor on Rampage (2018), currently in post production, having a VFX supe around can help clear up misconceptions in the mind of the director or other department heads: “No, putting that guy in a green suit doesn’t make him magically disappear from the shot. Yes, replacing that sky is probably relatively straightforward. No, modifying the teeth of that actor to look more like a vampire’s while he’s talking is actually pretty involved.”

Both Kilshaw and Winquist note that it is not uncommon to have a VFX supervisor on set whenever there are shots that include visual effects. In fact, Winquist has not heard of a major production that didn’t have a visual effects supervisor present for principal photography. “From the filmmaker’s point of view, I can’t imagine why you would not want to have your VFX supervisor there to advise,” he says. “Film is a collaborative medium. Building a solid team is how you put your vision up on the screen in the most cost-effective way possible.”

At Industrial Light & Magic, which has a long list of major VFX film credits, it is a requirement. “We always have a visual effects supervisor on set, and we insist on it. It is critical to our success on a project,” says Lindy De Quattro, VFX supervisor at ILM. “Frankly, it terrifies me to think about what could happen without one present.”

Lindy De Quattro

For some films, such as Evan Almighty, Pacific Rim, Mission: Impossible — Ghost Protocol and the upcoming Downsizing, De Quattro spent an extended period on set, while for many others she was only present for a week or two while big VFX scenes were shot. “No matter how much time you have put into planning, things rarely go entirely as planned. And someone has to be present to make last-minute adjustments and changes, and deal with new ideas that might arise on that day — it’s just part of the creative process,” she says.

For instance, while working on Pacific Rim, director Guillermo del Toro would stay up until the wee hours of the night making new boards for what would be shot the following day. The next morning, everyone would crowd around his hand-drawn sketches and notebooks as he said, “OK, this is what we are shooting.” The team then had to be prepared and do everything in its power to help ensure that the director’s vision became reality on screen.

“I cannot imagine how they would have gone about setting up the shots if they didn’t have a VFX supervisor on set. Someone has to be there to be sure we are gathering the data needed to recreate the environment and the camera move in post, and to be sure these things, and the greenscreens, are set up correctly so the post is successful,” De Quattro says. If you don’t know to put in a greenscreen, you may be in a position where you cannot extract the foreground elements the way you need to, she warns. “So, suddenly, two days of an extraction and composite turns into three weeks of roto and hair replacement, and a bunch of other time-consuming and expensive work, because it wasn’t set up properly in initial photography.”

Sometimes, a VFX supervisor ends up running the second unit, where the bulk of the VFX work is done, if the director is at a different location with the first unit. This was the case recently when De Quattro was in Norway for the Downsizing shoot. She ended up overseeing the plate unit and did location scouting with the DP each morning to find shots or elements that could be used in post. “It’s not that unusual for a VFX supervisor to operate as a second unit director and get a credit for that work,” she adds.

Kilshaw often finds himself discussing the best way of achieving the show’s creative goals with the director and producer while on set. Also, he makes sure that the producer is always informed of changes that will impact the budget. “It becomes very easy for people to say, ‘we can fix this in post.’ It is at this time when costs can start to spiral, and having a VFX supervisor on set to discuss options helps stop this from happening,” he adds. “At Zoic, we ensure that the VFX supervisor is also able to suggest alternative approaches that may help directors achieve what they need.”

Erik Winquist

According to Winquist, the tasks a VFX supe does on set depend on the size of the budget and crew. On a low-budget production, one person might be doing a myriad of tasks themselves: creating previs and techvis, working with the cinematographer and key grip on greenscreen or bluescreen placement, placing tracking markers, collecting camera information for each setup or take, shooting reference photos of the set, helping with camera or lighting placement, and gathering lighting measurements with gray and chrome reference spheres. In short, any information that will help best execute the visual effects requirements of the shot. “And all the while being available to answer questions the director might have,” he says.

If the production has a large budget, the role is more about spreading out and managing those tasks among an on-set visual effects team: data wranglers, surveyors, photographers, coordinators, PAs, perhaps a motion capture crew, “so that each aspect of it is done as thoroughly as possible,” says Winquist. “Your primary responsibility is being there for the director and staying in close communication with the ADs so that you or your team are able to get all the required data from the shoot. You only have one chance to do so.”

The benefits of on-set VFX supervision are not just for those working on big-budget features, however. As Winquist points out, the larger the budget, the more demanding the VFX work and the higher the shot count, and therefore the more important it is to involve the VFX supervisor in the shoot. “But it could also be argued that a production with a shoestring budget also can’t afford to get it wrong or be wasteful during the shoot, and the best way to ensure that footage is captured in a way that will make for a cost-effective post process is to have the VFX supervisor there to help.”

Kilshaw concurs. “Regardless of whether it is a period drama or superhero show, whether you need to create a superpower or a digital version of 1900 New York, the advantages of visual effects and visual effects supervision on set are equally important.”

While De Quattro’s resume overflows with big-budget VFX films, she has assisted on smaller projects where a VFX supervisor’s presence proved just as critical. She recalls a commercial shoot, one that prompted her to question the need for her presence. Production hit a snag when a young actor was unable to physically accomplish a task over multiple takes, and she was able to step in and offer a suggestion, knowing it would require just a minor VFX fix. “It’s always something like that. Even if the shoot is simple and you think there is no need, inevitably someone will need you and the input of someone who understands the process and what can be done,” she says.

De Quattro’s husband is also a VFX supervisor who is presently working on a non-VFX-driven Netflix series. While he is not on set every day, he is called when there is an effects shoot scheduled.

Mission: Impossible – Ghost Protocol

So, with so many benefits to be had, why would someone opt not to have a VFX supervisor on set? De Quattro assumes it is the cost. “What’s that saying, ‘penny wise and pound foolish’? A producer thinks he or she is saving money by eliminating the line item of an on-set supervisor but doesn’t realize the invisible costs, including how much more expensive the work can be, and often is, on the back end,” she notes.

“On set, people always tell me their plans, and I find myself advising them not to bother building this or that — we are not going to need it, and the money saved could be better utilized elsewhere,” De Quattro says.

On Mission: Impossible, for example, the crew was filming a complicated underwater escape scene with Tom Cruise and finally got the perfect take, only to have his emergency rig become exposed. Rather than send the actor back into the frigid water for another take, De Quattro assured the team that the rig could be removed in post within the original scope of the VFX work. While most people are aware that this can be done now, having someone with the authority and knowledge to know that for sure was a relief, she says.

Despite their extensive knowledge of VFX, these supervisors all say they support the best tool for the job on set and, mostly, that is to capture the shot in-camera first. “In most instances, the best way to make something look real is to shoot it real, even if it’s ultimately just a small part of the final frame,” Winquist says. However, when factors conspire against that, whether it be weather, animals, extras, or something similar, “having a VFX supervisor there during the shoot will allow a director to make decisions with confidence.”

Main Image: Weta’s Erik Winquist on set for Planet of the Apes.