
Brickyard VFX now offering editing via Andre Betz and Bug

Brickyard VFX has added editor Andre Betz to its team. Brickyard and Betz have been longtime collaborators through Betz’s shop Bug Editorial, and Bug will now be the official banner for Brickyard’s editorial roster and services.

“We have had a fabulous relationship with Andre over the years, and as more and more of our clients were asking for in-house editorial services, it made sense to officially join forces,” explains Andrew Bell, managing director at Brickyard VFX. “Adding editorial under the same roof will streamline post for our clients and be a huge benefit, and we’re excited to now offer this option in both our Boston and Santa Monica offices.”

Betz’s work has appeared in Super Bowl spots and in the Museum of Modern Art’s permanent collection. He has cut projects for brands such as Mercedes, Nationwide, VW, Chobani, Honda and many more. He is based in the Boston office and his editing tool of choice is Avid Media Composer.

“I’m thrilled to join their team and work to build out their editorial offerings on both coasts, so that clients can get results more efficiently and cost-effectively, all through one vendor,” says Betz.

Autodesk Flame family updates offer pipeline enhancements

Autodesk has updated its Flame 2018 family of 3D visual effects and finishing software, which includes Flame, Flare, Flame Assist and Lustre. Flame 2018.3 offers more efficient ways of working in post, with feature enhancements that deliver greater pipeline flexibility, speed and support for emerging formats and technology.

Flame 2018.3 highlights include:

• Action Selective: Apply FX color to an image surface or the whole action scene via the camera

• Motion Warp Tracking: Organically distort objects that are changing shape, angle and form with new 32-bit motion vector-based tracking technology

• 360-degree VR viewing mode: View LatLong images in a 360-degree VR viewing mode in the Flame player or any viewport during compositing and manipulate the field of view

• HDR waveform monitoring: Set viewport to show luminance waveform; red, green, blue (RGB) parade; color vectorscope or 3D cube; and monitor a range of HDR and wide color gamut (WCG) color spaces including Rec2100 PQ, Rec2020 and DCI P3

• Shotgun Software Loader: Load assets for a shot and build custom batches via Flame’s Python API, and browse a Shotgun project for a filtered view of individual shots (see the sketch after this list)

• User-requested improvements for Action, Batch, Timeline and Media Hub
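As a rough illustration of what the Shotgun Software Loader is scripting against, the sketch below queries Shotgun for a filtered view of a project’s shots using the standard shotgun_api3 Python package. The server URL, script credentials, project name and status filter are placeholders, and the loader’s actual Flame-side hooks are Autodesk’s own, so treat this only as a conceptual sketch of the query side.

```python
import shotgun_api3

# Connect with a script key (URL and credentials are placeholders).
sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",
    script_name="flame_loader",
    api_key="replace-with-your-key",
)

# A filtered view of individual shots for one project, similar in spirit
# to what the Shotgun Loader surfaces inside Flame.
shots = sg.find(
    "Shot",
    filters=[
        ["project.Project.name", "is", "Spot_01"],
        ["sg_status_list", "is", "ip"],  # in-progress shots only
    ],
    fields=["code", "description", "sg_status_list"],
)

for shot in shots:
    print(shot["code"], "-", shot["description"] or "")
```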

“The new standalone Python console in Flame 2018.3 is a great addition,” says Treehouse Edit finishing artist John Fegan, a Flame family beta tester. “We’re also excited about the enhanced FBX export with physically based renderer (PBR) for Maya and motion analysis updates. Using motion vector maps, we can now achieve things we couldn’t with a planar tracker or 3D track.”

Flame Family 2018.3 is available today at no additional cost to customers with a current Flame Family 2018 subscription.


Behind the Title: Postal director of operations Jason Mayo

NAME: Jason Mayo

COMPANY: Postal

CAN YOU DESCRIBE YOUR COMPANY?
Postal is a VFX and animation studio made up of artists and producers that like to make cool shit. We experiment and push the envelope, but we’re also adults, so we get it done on time and on budget. Oh and we’re not assholes. That would be a cool t-shirt. “Postal: We’re not assholes.”

Postal is a creative studio that believes everything starts with great design. That’s our DNA. We believe that it’s always about the talent and not the tools. Whether it’s motion graphics, animation, visual effects, or even editorial, our desire to create transcends all mediums.

Postal’s live-action parent company, Humble, is a NY- and LA-based home for makers — directors, writers, creatives, artists and designers — to create culture-defining content.

Coke Freestyle

WHAT’S YOUR JOB TITLE?
Director of Operations

WHAT DOES THAT ENTAIL?
I spend a lot of my time on biz dev, recruiting interesting talent and developing strategic partnerships that lead to new pipelines of business.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably picking up garbage. Creatives are pretty messy. They leave their stuff all over the place. The truth of the matter is, it’s a small company so no matter what your title is, you’re always on the front lines. That’s what makes my days interesting.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Definitely competing for projects we’re passionate about. I love the thrill of the chase. Also I love trying to keep our artists and producers inspired. Not every project needs to win awards but it’s important to me that my team finds the work interesting and challenging to tackle.

WHAT’S YOUR LEAST FAVORITE?
Probably the picking up the garbage part. I’ve ruined a lot of shirts. I also hate seeing content on TV or on the web that could have been produced by us. Especially if it turned out killer.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I have two daughters and a puppy so by 8am I’m basically a broken man. But as soon as I hit the office with my iced coffee in hand, I’m on fire. I love the start of the workday. Endless possibilities abound.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Probably a cool middle school English teacher. The kids would call me Jay and talk to me about their problems. Honestly though, when I’m done working I’ll probably just disappear into the woods or something and chase possums with a BB gun.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
It was an accident. I wanted to be an actor. My mom’s best friend’s ex-husband owned a small post house and he hired me as a receptionist. I was probably the greatest receptionist of all time. I thought being in “entertainment” would get me to Hollywood through the back door. I still have about 500 headshots that I never got to use.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We’ve had such a crazy year. We’ve done projects for Pepsi, Coke, Panera, Morgan Stanley, TED, Canon, Billboard and Nike.

TED Zipline

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I really love the TED stuff we do. They are a dream client. They come to us with a challenge and they allow us to go away, come up with some really imaginative stuff and then present them with a solution. As long as it’s on brief, it can be any style or any execution we think is right. We love that type of open collaboration with our clients.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
If we’re talking about apps, as well as hardware, then that’s easy. Sonos because it’s all about the music, Netflix because… zombies, and ride sharing apps because cabs are dirty and they make me nauseous.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
In general, I’m pretty active on social media and we actually just launched Facebook and Instagram pages for Postal. In a parallel universe I’m a dad blogger so I’ve always been big on community via social media. Facebook, Instagram and Twitter are the standards for me, but I’ve been Snapchatting with my daughter for years. I do have a Pinterest page somewhere, but it’s devoted solely to Ryan Gosling.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I’m a heavy metal guy so pretty much anything heavy. I do also love me some Jackson Browne and some Dawes. Oh, and the Pretty in Pink soundtrack, of course.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I try not to let it get to me. It’s way tougher raising two daughters and two dogs. The rest is a cakewalk. I do binge eat from time to time and love to watch horror movies on the train. Always a good way for me to decompress.


Behind the Title: Milk VFX supervisor Jean-Claude Deguara

NAME: Jean-Claude Deguara

COMPANY: Milk Visual Effects (@milkvfx)

CAN YOU DESCRIBE YOUR COMPANY?
Milk is an independent visual effects company. We create complex sequences for high-end television and feature films, and we have studios in London and Cardiff, Wales. We launched four years ago and we pride ourselves on our friendly working culture and ability to nurture talent.

WHAT’S YOUR JOB TITLE?
VFX Supervisor

WHAT DOES THAT ENTAIL?
Overseeing the VFX for feature films, television and digital content — from the initial concept development right through to delivery. This includes on-set supervision and supervising teams of artists.

HOW DID YOU TRANSITION TO VFX?
I started out as a runner at London post house Soho 601, and got my first VFX role at The Hive. Extinct was my very first animation job — a Channel 4 dinosaur program. Then I moved to Mill Film to work on Harry Potter.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
Over 20 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING?
In London, the industry has grown from what was a small cottage industry in the late 1990s, pre-Harry Potter. More creative freedom has come with the massive technology advances.

When I started out, TV was all done on Digi Beta, but now, with the quality of cameras, television VFX has caught up with film.

Dinosaurs in the Wild

Being able to render huge amounts of data in the cloud as we did recently on our special venue project Dinosaurs in the Wild means that smaller companies can compete better.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
Ray Harryhausen’s films inspired me as a child. We’d watch them at Christmas in awe!

I was also massively inspired by Spitting Image. I applied for a job only to find they were about to close down.

DID YOU GO TO FILM SCHOOL?
No, I went to Weston Supermare College of Art (Bristol University) and studied for an art and design diploma. Then I went straight into the film/TV industry as a runner.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The creative planning and building of shots and collaborating with all the other departments to try to problem solve in order to tell the best possible story visually, within the budget.

WHAT’S YOUR LEAST FAVORITE?
Answering emails, and the traveling.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
If I didn’t do this, I’d like to be directing.

Sherlock

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Sherlock (BBC/Hartswood), Dinosaurs in the Wild, Jonathan Strange & Mr Norrell (BBC) and Beowulf (ITV). I am currently VFX supervisor on Good Omens (BBC/Amazon).

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
It’s really hard to choose, but the problem solving on Sherlock has been very satisfying. We’ve created invisible effects across three series.

WHAT TOOLS DO YOU USE DAY TO DAY?
I previz in Autodesk Maya.

WHERE DO YOU FIND INSPIRATION NOW?
Scripts. I get creative “triggers” when I’m reading scripts or discussing a new scene or idea, which for me, pushes it to the next level. I also get a lot of inspiration working with my fellow artists at Milk. They’re a talented bunch.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’d go to the gym, but the pub tends to get in the way!


VFX company Kevin launches in LA

VFX vets Tim Davies, Sue Troyan and Darcy Parsons have partnered to open the Los Angeles-based VFX house, Kevin. The company is currently up and running in a temp studio in Venice, while construction is underway on Kevin’s permanent Culver City location, scheduled to open early next year.

When asked about the name, as none of the partners are actually named Kevin, Davies said, “Well, Kevin is always there for you! He’s your best mate and will always have your back. He’s the kind of guy you want to have a beer with whenever he’s in town. Kevin knows his stuff and works his ass off to make sure you get what you need and then some!” Troyan added, “Kevin is a state of mind.”

Davies is on board as executive creative director, overseeing the collective creative output of the company. Having led teams of artists for over 25 years, he was formerly at Asylum Visual Effects and The Mill as creative director and head of 2D. Among his works are multiple Cannes Gold Lion-winning commercials, including HBO’s “Voyeur” campaign for Jake Scott, Nike Golf’s Ripple for Steve Rogers, Old Spice’s Momsong for Steve Ayson, Old Spice’s Dadsong for Andreas Nilsson, and Old Spice’s Whale and Rocket Car for Steve Rogers.

Troyan will serve as senior executive producer of Kevin, having previously worked on campaigns at The Mill and Method. Parsons, owner and partner of Kevin, has enjoyed a career covering multiple disciplines, including producer, VFX producer and executive producer.

Launch projects for Kevin include recent spots for Wieden + Kennedy Portland, The Martin Agency and Spark44.

Main Image: L-R: Darcy Parsons, Sue Troyan, Tim Davies


Neill Blomkamp’s Oats Studios uses Unity 2017 on ADAM shorts

Academy Award-nominated director Neill Blomkamp (District 9) is directing the next installments in the ADAM franchise — ADAM: The Mirror and ADAM: The Prophet — using the latest version of Unity, which launches today.

Created and produced by Blomkamp’s Oats Studios, these short films show the power of working within an integrated realtime environment — allowing the team to build, texture, animate, light and render all in Unity to deliver high-quality graphics at a fraction of the cost and time of a normal film production cycle.

ADAM: The Mirror will premiere during the live stream of the Unite Austin 2017 keynote, which begins at 4pm Pacific tonight, and will be available on the Oats YouTube channel shortly after. ADAM: The Prophet will follow before the end of 2017.

“Ever since I started making films I’ve dreamed of a virtual sandbox that would let me build, shoot and edit photorealistic worlds all in one place. Today that dream came true thanks to the power of Unity 2017,” said Neill Blomkamp, founder of Oats Studios. “The fact that we could achieve near photorealistic visuals at half the average time of our production cycles is astounding. The future is here, and I can’t wait to see what our fans think.”

Neill Blomkamp

The original ADAM was released in 2016 as a short film to demonstrate technical innovations on Unity. It won a Webby Award and was screened at several film festivals, including the Annecy Film Festival and the Nashville Film Festival. ADAM: The Mirror picks up after the events of ADAM, where the cyborg hero discovers a clue about what and who he is. ADAM: The Prophet gives viewers their first glimpse of one of the villains in the ADAM universe.

Using the power of realtime rendering, Oats used Unity 2017 to help them create photorealistic graphics and lifelike digital humans. This was achieved through a combination of Unity’s advanced high-end graphics power, new materials using the Custom Render Texture feature, advanced photogrammetry techniques, Alembic-streamed animations for facial and cloth movement and Unity’s Timeline feature.

Innovations will be highlighted in the coming months via a series of behind-the-scenes videos and articles on the Unity website. Innovations in these short films include:
• Lifelike digital humans in realtime: Oats created the best-looking human ever in Unity using custom sub-surface scattering shaders for skin, eyes and hair.
• Alembic-based realtime facial performance capture: Oats has created a new facial performance capture technique that streams 30 scanned heads per second for lifelike animation, all without the use of morph targets or rigs.
• Virtual worlds via photogrammetry: Staying true to their live-action background, Oats shot more than 35,000 photos of environments and props and, after the initial photogrammetry solve, imported these into Unity using the de-lighting tool. This allowed them to quickly create rich, complex materials without spending time building high-resolution models.
• Rapid streamlined iteration in realtime: Working with realtime rendering lets artists and designers “shoot” the story as if on a set, with a live responsiveness that allows room to experiment and make creative decisions anywhere in the process.
• Unity’s timeline backbone for collaboration: Unity’s Timeline feature, a visual sequencing tool that allows artists to orchestrate scenes without additional programming, combined with Multi-Scene Authoring allowed a team of 20 artists to collaborate on the same shot simultaneously.


Cabin Editing Company opens in Santa Monica focusing on editing, VFX

Cabin Editing Company has opened in Santa Monica, started by four industry veterans: managing partner Carr Schilling and award-winning editors Chan Hatcher, Graham Turner and Isaac Chen.

“We are a company of film editors with a passion for storytelling who are committed to mentoring talent and establishing lasting relationships with directors and agencies,” says Schilling, who formerly worked alongside Hatcher, Turner and Chen at NO6.

L-R: Isaac Chen, Carr Schilling, Graham Turner and Chan Hatcher.

Cabin, which also features creative director/Flame artist Verdi Sevenhuysen and editor Lucas Spaulding, will offer creative editorial, visual effects, finishing, graphics and color. The boutique’s work spans mediums across broadcast, branded content, web, film and more.

Why was now the right time to open a studio? “Everything aligned to make it possible, and at Cabin we have a collective of top creative talent where each of us bring our individual style to our projects to create great work with our clients,” says Schilling.

The boutique studio has already been busy working with agencies such as 215 McCann, BBDO, CP+B, Deutsch, GSD&M, Mekanism and Saatchi & Saatchi.

In terms of tools, Cabin uses Avid Media Composer and Autodesk Flame Premium, all centralized on a Facilis TerraBlock shared storage system connected via Fibre Channel.


Zoic Studios adds feature film vet Lou Pecora as VFX supervisor

Academy Award-nominated Lou Pecora has joined Zoic Studios’ Culver City studio as VFX supervisor. Pecora has over two decades of experience in visual effects, working across commercial, feature film and series projects. He comes to Zoic from Digital Domain, where he spent 18 years working on large-scale feature film projects as a visual effects supervisor and compositing supervisor.

Pecora has worked on films including X-Men: Apocalypse, Spider-Man: Homecoming, X-Men: Days of Future Past (his Oscar nom), Maleficent, Pirates of the Caribbean: At World’s End, I, Robot, Transformers: Revenge of the Fallen, Transformers: Dark of the Moon, Star Trek, G.I. Joe: Retaliation, Stealth, The Mummy: Tomb of the Dragon Emperor, How the Grinch Stole Christmas, Flags of Our Fathers and Letters From Iwo Jima, among others.

“There has been a major shift in the television landscape, with a much greater volume of work and substantially higher production values than ever before,” says Pecora. “Zoic has their hands in a diverse range of high-end television projects, and I’m excited to bring my experience in the feature film space to this dynamic sector of the entertainment industry.”

The addition of Pecora comes on the heels of several high-profile projects at Zoic, including work on Darren Aronofsky’s thriller Mother!, Game of Thrones for HBO and Marvel’s The Defenders for Netflix.

 


Sony Imageworks’ VFX work on Spider-Man: Homecoming

By Daniel Restuccio

With Sony’s Spider-Man: Homecoming getting ready to release digitally on September 26 and on 4K Ultra HD/Blu-ray, Blu-ray 3D, Blu-ray and DVD on October 17, we thought this was a great opportunity to talk about some of the film’s VFX.

Sony Imageworks has worked on every single Spider-Man movie in some capacity since the 2002 Sam Raimi version. On Spider-Man: Homecoming, Imageworks worked mostly on the “third act,” which encompasses the warehouse, hijacked plane and beach destruction scenes. This meant delivering over 500 VFX shots, created by a core team of over 30 artists and compositors (at one point peaking at 200), and rendering out finished scenes at 2K.

All of the Imageworks artists used Dell R7910 workstations with Intel Xeon CPU E5-2620 24 cores, 64GB memory and Nvidia Quadro P5000 graphics cards. They used Cinesync for client reviews, and internally they used their in-house Itview software. Rendering technology was SPI Arnold (not the commercial version) and their custom shading system. Software used was Autodesk 2015, Foundry’s NukeX 10.0 and Side Effects Houdini 15.5. They avoided plug-ins so that their auto-vend (breaking comps into layers for the 3D conversion process) would be as smooth as possible. Everything was rendered internally on their on-premises renderfarm. They also used an in-house Kinect-based scanning technique that allowed their artists to do performance capture on themselves and rapidly prototype ideas and generate reference.

We sat down with Sony Imageworks VFX supervisor Theo Bailek, who talks about the studio’s contribution to this latest Spidey film.

You worked on The Amazing Spider-Man in 2012 and The Amazing Spider-Man 2 in 2014. From a visual effects standpoint, what was different?
You know, not a lot. Most of the changes have been iterative improvements. We used many of the same technologies that we developed on the first few movies. How we do our city environments is a specific example of how we build off of our previous assets and techniques, leveraging off the library of buildings and props. As the machines get faster and the software more refined, it allows our artists increased iterations. This alone gave our team a big advantage over the workflows from five years earlier. As the software and pipeline here at Sony has gotten more accessible, it has allowed us to more quickly integrate new artists.

It’s a lot of very small, incremental improvements along the way. The biggest technological change between now and the early Spider-Mans is our rendering technology. We use a more physically accurate, physically based incarnation of our global illumination Arnold renderer. As the shaders and rendering algorithms become more naturalistic, we’re able to conform our assets and workflows. In the end, this translates to a more realistic image out of the box.

The biggest thing on this movie was the inclusion of Spider-Man in a Marvel Universe: a different take on this film and how they wanted it to go. That would be probably the biggest difference.

Did you work directly with director Jon Watts, or did you work with production VFX supervisor Janek Sirrs in terms of the direction on the VFX?
During the shooting of the film I had the advantage of working directly with both Janek and Jon. The entire creative team pushed for open collaboration, and Janek was very supportive toward this goal. He would encourage and facilitate interaction with both the director and Tom Holland (who played Spider-Man) whenever possible. Everything moved so quickly on set; oftentimes, if you waited to suggest an idea, you’d lose the chance, as they would have to set up for the next scene.

The sooner Janek could get his vendor supervisors comfortable interacting, the bigger our contributions. While on set I often had the opportunity to bring our asset work and designs directly to Jon for feedback. There were times on set when we’d iterate on a design three or four times over the span of the day. Getting this type of realtime feedback was amazing. Once post work began, most of our reviews were directly with Janek.

When you had that first meeting about the tone of the movie, what was Jon’s vision? What did he want to accomplish in this movie?
Early on, it was communicated from him through Janek. It was described as, “This is sort of like a John Hughes, Ferris Bueller take on Spider-Man. Being a teenager, he’s not meant to be fully in control of his powers or the responsibility that comes with them. This translates to not always being super-confident or proficient in his maneuvers.” That was the basis of it.

Their goal was a more playful, relatable character. We accomplished this by being more conservative in our performances, of what Spider-Man was capable of doing. Yes, he has heightened abilities, but we never wanted every landing and jump to be perfect. Even superheroes have off days, especially teenage ones.

This being part of the Marvel Universe, was there a pool of common assets that all the VFX houses used?
Yes. With the Marvel movies, they’re incredibly collaborative and always use multiple vendors. We’re constantly sharing the assets. That said, there are a lot of things you just can’t share because of the different systems under the hood. Textures and models are easily exchanged, but how the textures are combined in the material and shaders… that makes them not reusable given the different renderers at companies. Character rigs are not reusable across vendors as facilities have very specific binding and animation tools.

It is typical to expect only models, textures, base joint locations and finished turntable renders for reference when sending or receiving character assets. As an example, we were able to leverage somewhat on the Avengers Tower model we received from ILM. We did supply our Toomes costume model and Spider-Man character and costume models to other vendors as well.

The scan data of Tom Holland, was it a 3D body scan of him or was there any motion capture?
Multiple sessions were done through the production process. A large volume of stunts and test footage were shot with Tom before filming that proved to be invaluable to our team. He’s incredibly athletic and can do a lot of his own stunts, so the mocap takes we came away with were often directly usable. Given that Tom could do backflips and somersaults in the air we were able to use this footage as a reference for how to instruct our animators later on down the road.
Toward the later end of filming we did a second capture session, focusing on the shots we wanted to acquire using specific mocap performances. Then, several months later, we followed up with a third mocap session to get any new performances required as the edit solidified.

As we were trying to create a signature performance that felt like Tom Holland, we exclusively stuck to his performances whenever possible. On rare occasions when the stunt was too dangerous, a stuntman was used. Other times we resorted to using our own in-house method of performance capture using a modified Xbox Kinect system to record our own animators as they acted out performances.

In the end performance capture accounted for roughly 30% of the character animation of Spider-Man and Vulture in our shots, with the remaining 70% being completed using traditional key-framed methods.

How did you approach the fresh take on this iconic film franchise?
It was clear from our first meeting with the filmmakers that Spider-Man in this film was intended to be a more relatable and light-hearted take on the genre. Yes, we wanted to take the characters and their stories seriously, but not at the expense of having fun with Peter Parker along the way.

For us that meant that despite Spider-Man’s enhanced abilities, how we displayed those abilities on screen needed to always feel grounded in realism. If we faltered on this goal, we ran the risk of eroding the sense of peril and therefore any empathy toward the characters.

When you’re animating a superhero it’s not easy to keep the action relatable. When your characters possess abilities that you never see in the real world, it’s a very thin line between something that looks amazing and something that is amazingly silly and unrealistic. Over-extend the performances and you blow the illusion. Given that Peter Parker is a teenager and he’s coming to grips with the responsibilities and limits of his abilities, we really tried to key into the performances from Tom Holland for guidance.

The first tool at our disposal and the most direct representation of Tom as Spider-Man was, of course, motion capture of his performances. On three separate occasions we recorded Tom running through stunts and other generic motions. For the more dangerous stunts, wires and a stuntman were employed as we pushed the limit of what could be recorded. Even though the cables allowed us to record huge leaps, you couldn’t easily disguise the augmented feel to the actor’s weight and motion. Even so, every session provided us with amazing reference.

Though a bulk of the shots were keyframed, it was always informed by reference. We looked at everything that was remotely relevant for inspiration. For example, we have a scene in the warehouse where the Vulture’s wings are racing toward you as Spider-Man leaps into the air, stepping on the top of the wings before flipping to avoid the attack. We found this amazing reference of people who leap over cars racing in excess of 70mph. It’s absurdly dangerous and hard to justify why someone would attempt a stunt like that, and yet it was the perfect inspiration for our shot.

In trying to keep the performances grounded and stay true to the goals of the filmmakers, we also found it was always better to err on the side of simplicity when possible. Typically, when animating a character, you look for opportunities to create strong silhouettes so the actions read clearly, but we tended to ignore these rules in favor of keeping everything dirty and with an unscripted feel. We let his legs cross over and knees knock together. Our animation supervisor, Richard Smith, pushed our team to follow the guidelines of “economy of motion.” If Spider-Man needed to get from point A to B he’d take the shortest route — there’s no time to strike an iconic pose in between!


Let’s talk a little bit about the third act. You had previsualizations from The Third Floor?
Right. All three of the main sequences we worked on in the third act had extensive previs completed before filming began. Janek worked extremely closely with The Third Floor and the director throughout the entire process of the film. In addition, Imageworks was tapped to help come up with ideas and takes. From early on it was a very collaborative effort on the part of the whole production.
The previs for the warehouse sequence was immensely helpful in the planning of the shoot. Given we were filming on location and the VFX shots would largely rely on carefully choreographed plate photography and practical effects, everything had to be planned ahead of time. In the end, the previs for that sequence resembled the final shots in most cases.

The digital performances of our CG Spider-Man varied at times, but the pacing and spirit remained true to the previs. As our plane battle sequence was almost entirely CG, the previs stage was more of an ongoing process for this section. Given that we weren’t locked into plates for the action, the filmmakers were free to iterate and refine ideas well past the time of filming. In addition to The Third Floor’s previs, Imageworks’ internal animation team also contributed heavily to the ideas that eventually formed the sequence.

For the beach battle, we had a mix of plate and all-CG shots. Here the previs was invaluable once again in informing the shoot and subsequent reshoots later on. As there were several all-CG beats to the fight, we again had sections where we continued to refine and experiment till late into post. As with the plane battle, Imageworks’ internal team contributed extensively to pre and postvis of this sequence.

The one scene, you mentioned — the fight in the warehouse — in the production notes, it talks about that scene being inspired by an actual scene from the comic The Amazing Spider-Man #33.
Yes, in our warehouse sequence there are a series of shots that are directly inspired by the comic book’s panels. Different circumstances in the comic and our sequence lead to Spider-Man being trapped under debris. However, Tom’s performance and the camera angles that were shot pay homage to the comic as he escapes. As a side note, many of those shots were added later in the production and filmed as reshoots.

What sort of CG enhancements did you bring to that scene?
For the warehouse sequence, we added digital Spider-Man, Vulture wings, CG destruction, enhanced any practical effects, and extended or repaired the plate as needed. The columns that the Vulture wings slice through as it circles Spider-Man were practically destroyed with small detonated charges. These explosives were rigged within cement that encased the actual warehouse’s steel girder columns. They had fans on set that were used to help mimic interaction from the turbulence that would be present from a flying wingsuit powered by turbines. These practical effects were immensely helpful for our effects artists, as they provided the best possible in-camera reference. We kept much of what was filmed, adding our fully reactive FX on top to help tie it into the motion of our CG wings.

There’s quite a bit of destruction when the Vulture wings blast through walls as well. For those shots we relied entirely on CG rigid body dynamic simulations for the CG effects, as filming it would have been prohibitive and unreliable. Though most of the shots in this sequence had photographed plates, there were still a few that required the background to be generated in CG. One shot, with Spider-Man sliding back and rising up, stands out in particular. As the shot was conceived later in the production, there was no footage for us to use as our main plate. We did, however, have many tiles shot of the environment, which we were able to use to quickly reconstruct the entire set in CG.

I was particularly proud of our team for their work on the warehouse sequence. The quality of our CG performances and the look of the rendering is difficult to discern from the live action. Even the rare all-CG shots blended seamlessly between scenes.

When you were looking at that ending plane scene, what sort of challenges were there?
Since over 90 shots within the plane sequence were entirely CG, we faced many challenges, for sure. With such a large number of shots without the typical constraints that practical plates impose, we knew a turnkey pipeline was needed. There just wouldn’t be time to have custom workflows for each shot type. This was something Janek, our client-side VFX supervisor, stressed from the outset: “show early, show often and be prepared to change constantly!”

To accomplish this, a balance of 3D and 2D techniques was developed to make the shot production as flexible as possible. Using the 3D abilities of our compositing software, Nuke, we were able to offload significant portions of the shot production into the compositors’ hands. For example: the city ground plane you see through the clouds, the projections of imagery on the plane’s cloaking LEDs and the damaged, flickering LEDs were all techniques done in the composite.
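As a hedged example of that kind of 3D-in-the-composite setup, the Nuke Python sketch below builds a textured card, a camera, a scene and a ScanlineRender, standing in for something like a distant city ground plane seen through clouds. The file path, node names and transform values are hypothetical placeholders, not the production script.

```python
import nuke

# Aerial city imagery used as a texture (file path is hypothetical).
city = nuke.nodes.Read(file="city_aerial.%04d.exr")

# A card standing in for the ground plane far below the clouds.
ground = nuke.nodes.Card2(name="CityGroundPlane")
ground.setInput(0, city)                        # texture the card with the Read
ground["rotate"].setValue([-90, 0, 0])          # lay the card flat like a ground plane
ground["translate"].setValue([0, -8000, 0])     # placeholder values
ground["uniform_scale"].setValue(20000)

# Shot camera; in production this would be a matchmoved or imported camera.
cam = nuke.nodes.Camera2(name="ShotCam")

scene = nuke.nodes.Scene()
scene.setInput(0, ground)

# Render the 3D scene back into the 2D comp.
render = nuke.nodes.ScanlineRender()
render.setInput(1, scene)   # input 1: obj/scene
render.setInput(2, cam)     # input 2: camera
```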

A unique challenge to the sequence that stands out is definitely the cloaking. Making an invisible jet was only half of the equation. The LEDs that made up the basis for the effect also needed to be able to illuminate our characters. This was true for wide and extreme close-up shots. We’re talking about millions of tiny light sources, which is a particularly expensive rendering problem to tackle. Mix in the fact that the design of these flashing light sources is highly subjective and thus prone to needing many revisions to get the look right.

Painting control texture maps for the location of these LEDs wouldn’t be feasible for the detail needed on our extreme close-up shots. Modeling them in would have been prohibitive as well, resulting in excessive geometric complexity. Instead, using Houdini, our effects software, we built algorithms to automate the distribution of point clouds of data to intelligently represent each LED position. This technique could be reprocessed as necessary without incurring the large amounts of time a texture or model solution would have required. As the plane base model often needed adjustments to accommodate design or performance changes, this was a real factor. The point cloud data was then used by our rendering software to instance geometric approximations of inset LED compartments on the surface.
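To make the point-cloud idea concrete, here is a heavily simplified sketch of what such a distribution step could look like in a Houdini Python SOP: one stand-in LED point per incoming primitive, carrying a color attribute that instancing could later pick up. The centroid placement and attribute names are illustrative assumptions; Imageworks’ actual tools were custom and far denser than this.

```python
# Runs inside a Houdini Python SOP, where the hou module is available implicitly.
node = hou.pwd()
geo = node.geometry()

# Per-LED color attribute; a real setup would also carry IDs, orientation, etc.
cd = geo.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0))

for prim in geo.prims():
    # Crude placement: the centroid of the primitive's points stands in for
    # where an inset LED compartment would be instanced at render time.
    verts = prim.vertices()
    pos = hou.Vector3(0.0, 0.0, 0.0)
    for v in verts:
        pos += v.point().position()
    pos = pos * (1.0 / len(verts))

    led = geo.createPoint()
    led.setPosition(pos)
    led.setAttribValue(cd, (1.0, 1.0, 1.0))
```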

Interestingly, this was a technique we adopted from rendering technology we use to create room interiors for our CG city buildings. When rendering large CG buildings we can’t afford to model the hundreds and sometimes thousands of rooms you see through the office windows. Instead of modeling the complex geometry you see through the windows, we procedurally generate small inset boxes for each window that have randomized pictures of different rooms. This is the same underlying technology we used to create the millions of highly detailed LEDs on our plane.

First, our lighters supplied base renders for our compositors to work with inside of Nuke. The compositors quickly animated flashing damage to the LEDs by projecting animated imagery on the plane using Nuke’s 3D capabilities. Once we got buy-off on the animation of the imagery, we’d pass this work back to the lighters as 2D layers that could be used as texture maps for our LED lights in the renderer. These images would instruct each LED when it was on and what color it needed to be. This back-and-forth technique allowed us to more rapidly iterate on the look of the LEDs in 2D before committing and submitting final 3D renders that would have all of the expensive interactive lighting.
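A toy version of that “control image drives the LEDs” hand-off might look like the sketch below, where each LED samples the compositors’ 2D output at its UV coordinate to decide its on/off state and color. Pillow, the file name and the UV convention are assumptions for illustration only; the actual mapping lived inside Imageworks’ renderer and pipeline tools.

```python
from PIL import Image

# One frame of the compositors' animated LED control imagery (name is hypothetical).
ctrl = Image.open("led_control.0101.png").convert("RGB")
width, height = ctrl.size

def led_color(u, v):
    """Sample the control image at an LED's UV coordinate (u and v in [0, 1])."""
    x = min(int(u * (width - 1)), width - 1)
    y = min(int((1.0 - v) * (height - 1)), height - 1)  # flip V: image origin is top-left
    r, g, b = ctrl.getpixel((x, y))
    return (r / 255.0, g / 255.0, b / 255.0)

# An LED in the middle of a panel: black means off, anything else is its emission color.
print(led_color(0.5, 0.5))
```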

Is that a proprietary system?
Yes, this is a shading system that was actually developed for our earlier Spider-Man films back when we used RenderMan. It has since been ported to work in our proprietary version of Arnold, our current renderer.

VFX Roundtable: Trends and inspiration

By Randi Altman

The world of visual effects is ever-changing, and the speed at which artists are being asked to create new worlds, or to make things invisible, is moving full speed ahead. How do visual effects artists (and studios) prepare for these challenges, and what inspired them to get into this business? We reached out to a small group of visual effects pros working in television, commercials and feature films to find out how they work and what gets their creative juices flowing.

Let’s find out what they had to say…

KEVIN BAILLIE, CEO, ATOMIC FICTION
What do you wish clients would know before jumping into a VFX-heavy project?
The core thing for every filmmaking team to recognize is that VFX isn’t a “post process.” Careful advance planning and a tight relationship between the director, production designer, stunt team and cinematographer will yield a far superior result much more cost effectively.

In the best-looking and best-managed productions I’ve ever been a part of, the VFX team is the first department to be brought onto the show and the last one off. It truly acts as a partner in the filmmaking process. After all, once the VFX post phase starts, it’s effectively a continuation of production — with there being a digital corollary to every single department on set, from painters to construction to costume!

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
The move to cloud computing is one of the most exciting trends in VFX. The cloud is letting smaller teams do much bigger work, allowing bigger teams to do things that have never been seen before, and will ultimately result in compute resources no longer being a constraint on the creative process.

Cloud computing allowed Atomic Fiction to play alongside the most prestigious companies in the world, even when we were just 20 people. That capability has allowed us to grow to over 200 people, and now we’re able to take the lead vendor position on A-list shows. It’s remarkable what dynamic and large-scale infrastructure in the cloud has enabled Atomic to accomplish.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I grew up in Seattle and started dabbling in 3D as a hobby when I was 14 years old, having been immensely inspired by Jurassic Park. Soon thereafter, I started working at Microsoft in the afternoons, developing visual content to demonstrate their upcoming technologies. I was fortunate enough to land a job with Lucasfilm right after graduating high school, which was 20 years ago at this point! I’ve been lucky enough to work with many of the directors that inspired me as a child, such as George Lucas and Robert Zemeckis, and modern pioneers like JJ Abrams.

Looking back on my career so far, I truly feel like I’ve been living the dream. I can’t wait for what’s next in this exciting, ever-changing business.

ROB LEGATO, OSCAR-WINNING VFX SUPERVISOR, SECOND UNIT DIRECTOR, SECOND UNIT DIRECTOR OF PHOTOGRAPHY
What do you wish clients would know before jumping into a VFX-heavy project?
It takes a good bit of time to come up with a plan that will ensure a sustainable attack when making the film. They need to ask someone in authority, “What does it take to do it?” and then make a reasonable plan. Everyone wants to do a great job all the time, and if they could maneuver the schedule — even with the same timeframe — it could be a much less frustrating job.

It happens time and time again, someone comes up with a budget and a schedule that doesn’t really fit with the task and forces you to live with it. That makes for a very difficult assignment that gets done because of the hard work of the people who are in the trenches.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
For me, it’s how realistic you can make something. The rendering capabilities — like what we did on Jungle Book with the animals — are so sophisticated that it fools your eye into believing it’s real. Once you do that you’ve opened the magic door that allows you to do anything with a tremendous amount of fidelity. You can make good movies without it being a special-venue movie or a VFX movie. The computer power and rendering abilities — along with the incredible artistic talent pool that we have created over the years — is very impressive, especially for me, coming from a more traditional camera background. I tended to shy away from computer-generated things because they never had the authenticity you would have wanted.

Then there is the happy accident of shooting something, where an angle you wouldn’t have considered appears as you look through the camera; now you can do that in the computer, which I find infinitely fascinating. This is where all the virtual cinematography things I’ve done in the past come in to help create that happy accident.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been working in VFX since about 1984. Visual effects wasn’t my dream. I wanted to make movies: direct, shoot and be a cameraman and editor. I fell into it and then used it as an avenue to allow me to create sequences in films and commercials.

The reason you go to movies is to see something you have never seen before, and for me that was Close Encounters. The first time I saw the mothership in Close Encounters, it wasn’t just an effect, it became an art form. It was beautifully realized and it made the story. Blade Runner was another where it’s no longer a visual effect, it’s filmmaking as an art form.

There was also my deep appreciation for Doug Trumbull, whose quality of work was so high it transcended being a visual effect or a photographic effect.

LISA MAHER, VP OF PRODUCTION, SHADE VFX 
What do you wish clients would know before jumping into a VFX-heavy project?
That it’s less expensive in the end to have a VFX representative involved on the project from the get-go, just like all the other filmmaking craft-persons that are represented. It’s getting better all the time though, and we are definitely being brought on board earlier these days.

At Shade we specialize in invisible or supporting VFX. So-called invisible effects are often much harder to pull off. It’s all about integrating digital elements that support the story but don’t pull the audience out of a scene. Being able to assist in the planning stages of a difficult VFX sequence often results in the filmmakers achieving what they envisioned more readily. It also helps tremendously to keep the costs in line with what was originally budgeted. It also goes without saying that it makes for happier VFX artists as they receive photography captured with their best interests in mind.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
I would say the most exciting development affecting visual effects is the explosion of opportunities offered by the OTT content providers such as Netflix, Amazon, HBO and Hulu. Shade primarily served the feature film market up to three years ago, but with the expanding needs of television, our offices in Los Angeles and New York are now evenly split between film and TV work.

We often find that the film work is still being done at the good old reliable 2K resolution while our TV shows are always 4K plus. The quality and diversity of projects being produced for TV now make visual effects a much more buoyant enterprise for a mid-sized company, and also a real source of employment for VFX professionals who were previously so dependent on big studio-generated features.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been working in visual effects close to 20 years now. I grew up in Ireland; as a child, the world of film, and especially images of sunny California, were always a huge draw for me. They helped me survive the many grey and rainy days of the Irish climate. I can’t point to one project that inspired me to get into filmmaking — there have been so many — just a general love for storytelling, I guess. Films like Westworld (the 1973 version), Silent Running, Cinema Paradiso, Close Encounters of the Third Kind, Blade Runner and, of course, the original Star Wars were truly inspirational.

DAVID SHELDON-HICKS, CO-FOUNDER/EXECUTIVE CREATIVE DIRECTOR, TERRITORY STUDIO
What do you wish clients would know before jumping into a VFX-heavy project?
The craft and care and love that go into VFX are often forgotten in the “business” of it all. As a design-led studio that straddles art and VFX departments in our screen graphics and VFX work, we prefer to work with the director from the preproduction phase. This ensures that all aspects of our work are integrated into story and world building.

The talent and gut instinct, the eye for composition and lighting, the appreciation of form, the choreography of movement and, most notably, the appreciation of the classics are all so pertinent to the art of VFX, yet they are undersold in conversations about shot counts, pipelines, bidding and numbers of artists. Bringing the filmmakers into the creative process has to be the way forward for an art form still finding its own voice.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
The level of concept art and postviz coming through from VFX studios is quite staggering. It gets back to my point above about bringing filmmakers into the VFX dialogue, with VFX artists concentrating on world building and narrative expansion. It’s so exciting to see concept art and postviz getting to a new level of sophistication and influence in the filmmaking process.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I have been working professionally in VFX for over 15 years. My love of VFX and creativity in general came from the moment I picked up a pencil and imagined new possibilities. But once I cut my film teeth designing screen graphics on Casino Royale, followed by The Dark Knight, I left my freelance days behind and co-founded Territory Studio. Our first film as a studio was Prometheus, and working with Ridley Scott was a formative experience that has influenced our own design-led approach to motion graphics and VFX, which has established us in the industry and seen the studio grow and expand.

MARK BREAKSPEAR, VFX SUPERVISOR, SONY PICTURES IMAGEWORKS
What do you wish clients would know before jumping into a VFX-heavy project?

Firstly, I think the clients I have worked with have always been extremely cognizant of the key areas affecting VFX heavy projects and consequently have built frameworks that help plan and execute these mammoth shows successfully.

Ironically, it’s the smaller shows that sometimes have the surprising “gotchas” in them. The big shows come with built-in checks and balances in the form of experienced people who are looking out for the best interests of the project and how to navigate the many pitfalls that can make the VFX costs increase.

Smaller shows sometimes don’t allow enough discussion and planning time for the VFX components in pre-production, which could result in the photography not being captured as well as it could have been. Everything goes wrong from there.

So, when I approach any show, I always look for the shots that are going to be underestimated and try to give them the attention they need to succeed. You can get taken out of a movie by a bad driving comp as much as you can by a monster space goat biting a planet in half.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
I think there are several red herrings out there right now… the big one being VR. To me, VR is like someone has invented teleportation, but it only works on feet.

So, right now, it’s essentially useless and won’t make creating VFX any easier or make the end result any more spectacular. I would like to see VR used to aid artists working on shots. If you could comp in VR I could see that being a good way to help create more complex and visually thrilling shots. The user interface world is really the key area VR can benefit.

Suicide Squad

I do think however, that AR is very interesting. The real world, with added layers of information is a hugely powerful prospect. Imagine looking at a building in any city of the world, and the apartments for sale in it are highlighted in realtime, with facts like cost, square footage etc. all right there in front of you.

How does AR benefit VFX? An artist could use AR to get valuable info about shots just by looking at them. How often do we look at a shot and ask, “What lens was this?” AR could have all that metadata ready to display at any point on any shot.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been in VFX for 25 years. When I started, VFX was not really a common term. I came to this industry through the commercial world… as a compositor on TV shows and music videos. Lots of (as we would call it now) visual effects, but done in a world bereft of pipelines and huge cloud-based renderfarms.

I was never inspired by a specific project to get into the visual effects world. I was a creative kid who also liked the sciences. I liked to work out why things ticked, and also draw them, and sometimes try to draw them with improvements or updates as I could imagine. It’s a common set of passions that I find in my colleagues.

I watched Star Wars and came out wondering why there were black lines around some of the space ships. Maybe there’s your answer… I was inspired by the broken parts of movies, rather than being swept up in the worlds they portrayed. After all that effort, time and energy… why did it still look wrong? How can I fix it for next time?

CHRIS HEALER, CEO/CTO/VFX SUPERVISOR, THE MOLECULE
What do you wish clients would know before jumping into a VFX-heavy project?

Plan, plan, plan… previs, storyboarding and initial design are crucial to VFX-heavy projects. The mindset should ideally be that most (or all) decisions have been made before the shoot starts, as opposed to a “we’ll figure it out in post” approach.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
Photogrammetry, image modeling and data capture are so much more available than ever before. Instead of an expensive Lidar rig that only produces geometry without color, there are many, many new ways to capture the color and geometry of the physical world, even using a simple smartphone or DSLR.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been doing VFX now for over 16 years. I would have to say that The Matrix (part 1) was really inspiring when I saw it the first time, and it made clear that VFX as an art form was coming and available to artists of all kinds all over the world. Previous to that, VFX was very difficult to approach for the average student with limited resources.

PAUL MARANGOS, SENIOR VFX FLAME ARTIST, HOOLIGAN
What do you wish clients would know before jumping into a VFX-heavy project?

The more involved I can be in the early stages, the more I can educate clients on all of the various effects they could use, as well as technical hurdles to watch out for. In general, I wish more clients involved the VFX guys earlier in the process — even at the concepting and storyboarding stages — because we can consult on a range of critical matters related to budgets, timelines, workflow and, of course, bringing the creative to life with the best possible quality.

Fortunately, more and more agencies realize the value of this. For instance, with a recent campaign Hooligan finished for Harvoni, we were able to plan shots for a big scene featuring hundreds of lanterns in the sky, which required lanterns of various sizes for every angle that Elma Garcia’s production team shot. With everything well storyboarded, and under the direction of Elma, who left no detail unnoticed, we managed to create a spectacular display of lantern composites for the commercial.

We were also involved early on in a campaign for MyHeritage DNA via creative agency Berlin Cameron, featuring spoken word artist Prince Ea and directed by Jonathan Augustavo of Skunk. Devised as if projected onto a wall, the motion graphics were mapped into the 3D environments.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
Of course VR and 360 live TV shows are exciting, but augmented reality is what I find particularly interesting — mixing the real world with graphics and video all around you. The interactivity of both of these emerging platforms presents an endless area of growth, as our industry is on the cusp of a sea change that hasn’t quite yet begun to directly affect my day-to-day.

Meanwhile, at Hooligan, we’re always educating ourselves on the latest software, tools and technological trends in order to prepare for the future of media and entertainment — which is wise if you want to be relevant 10 years from now. For instance, I recently attended the TED conference, where Chris Milk spoke on the birth of virtual reality as an art form. I’m also seeing advances in Google Cardboard, which is making the platform affordable, too. Seeing companies open up VR departments is an exciting step for us all, and it shows the vision for the future of advertising.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I have worked in VFX for 25 years. After initially studying fine art and graphic design, the craft aspect of visual effects really appealed to me. Seeing special effects genius Ray Harryhausen’s four-minute skeleton fight was a big inspiration. He rear-projected footage of the actual actors and then combined the shots to make a realistic skeleton-Argonaut battle. It took him over four and a half months to shoot the stop-motion animation.

Main Image: Deadpool/Atomic Fiction.