The sounds of Spider-Man: Homecoming

By Jennifer Walden

Columbia Pictures and Marvel Studios’ Spider-Man: Homecoming, directed by Jon Watts, casts Tom Holland as Spider-Man, a role he first played in 2016 for Marvel Studios’ Captain America: Civil War (directed by Joe and Anthony Russo).

Homecoming reprises a few key character roles, like Tony Stark/Iron Man (Robert Downey Jr.) and Aunt May Parker (Marisa Tomei), and it picks up a thread of Civil War’s storyline. In Civil War, Peter Parker/Spider-Man helped Tony Stark’s Avengers in their fight against Captain America’s Avengers. Homecoming picks up after that battle, as Parker settles back into his high school life while still fighting crime on the side to hone his superhero skills. He seeks to prove himself to Stark but ends up becoming entangled with the supervillain Vulture (Michael Keaton).

Steven Ticknor

Spider-Man: Homecoming supervising sound editors/sound designers Steven Ticknor and Eric A. Norris — working at Culver City’s Sony Pictures Post Production Services — both brought Spidey experience to the film. Ticknor was a sound designer on director Sam Raimi’s Spider-Man (2002) and Norris was supervising sound editor/sound designer on director Marc Webb’s The Amazing Spider-Man 2 (2014). Having worked on two different incarnations of Spider-Man, Ticknor and Norris together brought a well-rounded knowledge of the superhero’s sound history to Homecoming. They knew what had worked in the past and what would make this Spider-Man sound fresh. “This film took a ground-up approach but we also took into consideration the magnitude of the movie,” says Ticknor. “We had to keep in mind that Spider-Man is one of Marvel’s key characters and he has a huge fan base.”

Web Slinging
Because Homecoming picks up after Captain America: Civil War, Ticknor and Norris honored the web-slinging sound established in that film, but they also enhanced it to create a subtle difference between Spider-Man’s two suits in Homecoming. There’s the teched-out, Tony Stark-built suit that uses the Civil War web-slinging sound, and then there’s Spider-Man’s homemade suit. “I recorded a couple of 5,000-foot magnetic tape cores unraveling very fast, and to that I added whooshes and other elements that gave a sense of speed. Underneath, I had some of the web sounds from the Tony Stark suit. That way the sound for the homemade suit had the same feel as the Stark suit but with an old-school flair,” explains Ticknor.

One new feature of Spider-Man’s Stark suit is that it has expressive eye movements. His eyes can narrow or grow wide with surprise, and those movements are articulated with sound. Norris says, “We initially went with a thin servo-type sound, but the filmmakers were looking for something less electrical. We had the idea to use the lens of a DSLR camera to manually zoom it in and out, so there’s no motor sound. We recorded it up close in the quiet environment of an unused ADR stage. That’s the primary sound for his eye movement.”

Droney
Another new feature is the addition of Droney, a small reconnaissance drone that pops off of Spider-Man’s suit and flies around. The sound of Droney was one of director Watts’ initial focal points. He wanted it to sound fun and have a bit of personality. He wanted Droney “to be able to vocalize in a way, sort of like Wall-E,” explains Norris.

Ticknor had the idea of creating Droney’s sound using a turbo toy — a small toy that has a mouthpiece and a spinning fan. Blowing into the mouthpiece makes the fan spin, which generates a whirring sound. The faster the fan spins, the higher the pitch of the generated sound. By modulating the pitch, they created a voice-like quality for Droney. Norris and sound effects editor Andy Sisul performed and recorded an array of turbo toy sounds to use during editorial. Ticknor also added in the sound of a reel-to-reel machine rewinding, which he sped up and manipulated “so that it sounded like Droney was fluttering as it was flying,” Ticknor says.

The Vulture
Supervillain the Vulture offers a unique opportunity for sound design. His alien-tech enhanced suit incorporates two large fans that give him the ability to fly. Norris, who was involved in the initial sound design of Vulture’s suit, created whooshes using Whoosh by Melted Sounds — a whoosh generator that runs in Native Instruments Reaktor. “You put individual samples in there and it creates a whoosh by doing a Doppler shift and granular synthesis as a way of elongating short sounds. I fed different metal ratcheting sounds into it because Vulture’s suit almost has these metallic feathers. We wanted to articulate the sound of all of these different metallic pieces moving together. I also fed sword shings into it and came up with these whooshes that helped define the movement as the Vulture was flying around,” he says. Sound designer/re-recording mixer Tony Lamberti was also instrumental in creating Vulture’s sound.
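For readers curious about the underlying idea, here is a minimal sketch of the technique Norris describes — elongating a short metallic sample with overlapping grains while sweeping its pitch and level to suggest a Doppler-style pass-by. This is not the Melted Sounds Whoosh ensemble or the editors’ actual Reaktor setup; the source file name and every parameter below are illustrative assumptions.

```python
# Granular "whoosh" sketch: stretch a short sample with overlapping grains
# and sweep pitch/level like a Doppler pass-by. "ratchet.wav" is a placeholder.
import numpy as np
from scipy.io import wavfile

rate, sample = wavfile.read("ratchet.wav")            # short metallic source
sample = sample.astype(np.float32)
if sample.ndim > 1:
    sample = sample.mean(axis=1)                      # fold to mono
sample /= np.max(np.abs(sample)) + 1e-9

out_len = rate * 3                                    # stretch to ~3 seconds
grain = int(0.05 * rate)                              # 50 ms grains
hop = grain // 2
window = np.hanning(grain)
out = np.zeros(out_len + grain)

for start in range(0, out_len, hop):
    t = start / out_len                               # 0..1 position in the whoosh
    pitch = 1.0 + 0.5 * np.sin(np.pi * t)             # pitch rises, then falls
    src = int(t * max(1, len(sample) - 2 * grain))    # walk slowly through the source
    idx = src + np.arange(grain) * pitch              # read the grain at the swept pitch
    g = np.interp(idx, np.arange(len(sample)), sample) * window
    out[start:start + grain] += g * np.sin(np.pi * t) # quiet -> loud -> quiet pass-by

out /= np.max(np.abs(out)) + 1e-9
wavfile.write("whoosh.wav", rate, (out * 32767).astype(np.int16))
```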

Alien technology is prevalent in the film. For instance, it’s a key ingredient to Vulture’s suit. The film’s sound needed to reflect the alien influence but also had to feel realistic to a degree. “We started with synthesized sounds, but we then had to find something that grounded it in reality,” reports Ticknor. “That’s always the balance of creating sound design. You can make it sound really cool, but it doesn’t always connect to the screen. Adding organic elements — like wind gusts and debris — makes it suddenly feel real. We used a lot of synthesized sounds to create Vulture, but we also used a lot of real sounds.”

The Washington Monument
One of the big scenes that Ticknor handled was the Washington Monument elevator sequence. Spider-Man stands on the top of the Washington Monument and prepares to jump over a helicopter that looms ever closer. He clears the helicopter’s blades and shoots a web onto the helicopter’s skid, using that to sling himself through a window just in time to shoot another web that grabs onto the compromised elevator car that contains his friends. “When Spider-Man jumps over the helicopter, I couldn’t wait to make that work perfectly,” says Ticknor. “When he is flying over the helicopter blades it sounds different. It sounds more threatening. Sound creates an emotion but people don’t realize how sound is creating the emotion because it is happening so quickly sometimes.”

To achieve a more threatening blade sound, Ticknor added in scissor slicing sounds, which he treated using a variety of tools like zPlane Elastique Pitch 2 and plug-ins from FabFilter and Soundtoys, all within the Avid Pro Tools 12 environment. “This made the slicing sound like it was about to cut his head off. I took the helicopter blades and slowed them down and added low-end sweeteners to give a sense of heaviness. I put all of that through the plug-ins and basically experimented. The hardest part of sound design is experimenting and finding things that work. There’s also music playing in that scene as well. You have to make the music play with the sound design.”
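As a rough illustration of that kind of treatment — and emphatically not Ticknor’s actual Elastique/FabFilter/Soundtoys chain — the sketch below slows a blade recording down (which also drops its pitch) and layers a low-end sweetener that follows the blades’ envelope. The file names and settings are placeholders.

```python
# Slow-down + low-end sweetener sketch. "blades.wav" is a placeholder name.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

rate, blades = wavfile.read("blades.wav")
blades = blades.astype(np.float32)
if blades.ndim > 1:
    blades = blades.mean(axis=1)
blades /= np.max(np.abs(blades)) + 1e-9

# Stretch the sample to 1.4x its length: slower and pitched down when played back.
slowed = resample(blades, int(len(blades) * 1.4))

# Low-end sweetener: a quiet 45 Hz sine shaped by the blades' smoothed envelope.
t = np.arange(len(slowed)) / rate
envelope = np.abs(slowed)
kernel = np.ones(int(0.05 * rate)) / int(0.05 * rate)   # ~50 ms moving average
envelope = np.convolve(envelope, kernel, mode="same")
sweetener = 0.4 * np.sin(2 * np.pi * 45 * t) * envelope

mix = slowed + sweetener
mix /= np.max(np.abs(mix)) + 1e-9
wavfile.write("blades_heavy.wav", rate, (mix * 32767).astype(np.int16))
```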

When designing sounds, Ticknor likes to generate a ton of potential material. “I make a library of sound effects — it’s like a mad science experiment. You do something and then wonder, ‘How did I just do that? What did I just do?’ When you are in a rhythm, you do it all because you know there is no going back. If you just do what you need, it’s never enough. You always need more than you think. The picture is going to change and the VFX are going to change and timings are going to change. Everything is going to change, and you need to be prepared for that.”

Syncing to Picture
To help keep the complex soundtrack in sync with the evolving picture, Norris used Conformalizer by Cargo Cult. Using the EDL of picture changes, Conformalizer makes the necessary adjustments in Pro Tools to resync the sound to the new picture.
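Conceptually, conforming means remapping every sound event from its position in the old cut to its position in the new one. The toy sketch below illustrates that mapping under an assumed, simplified change-list format; it is not Conformalizer’s implementation or its file format.

```python
# Toy conform: map sound-event times from the old cut to the new cut using a
# hypothetical change list of retained picture segments (times in seconds).
from dataclasses import dataclass

@dataclass
class Segment:
    old_start: float   # where the segment sat in the previous cut
    old_end: float
    new_start: float   # where the same segment now sits in the new cut

def conform(event_time: float, change_list: list[Segment]) -> float | None:
    """Return the event's new position, or None if its picture was cut."""
    for seg in change_list:
        if seg.old_start <= event_time < seg.old_end:
            return seg.new_start + (event_time - seg.old_start)
    return None

# Example: a 10 s scene trimmed at the head and moved later in the reel.
changes = [Segment(old_start=120.0, old_end=130.0, new_start=145.0)]
print(conform(123.5, changes))   # -> 148.5, the event follows the recut
print(conform(100.0, changes))   # -> None, that picture no longer exists
```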

Norris explains some key benefits of Conformalizer. “First, when you’re working in Pro Tools you can only see one picture at a time, so you have to go back and forth between the two different pictures to compare. With Conformalizer, you can see the two different pictures simultaneously. It also does a mathematical computation on the two pictures in a separate window, a difference window, which shows the differences in white. It highlights all the subtle visual effects changes that you may not have noticed.

Eric Norris

For example, in the beginning of the film, Peter leaves school and heads out to do some crime fighting. In an alleyway, he changes from his school clothes into his Spider-Man suit. As he’s changing, he knocks into a trash can and a couple of rats fall out and scurry away. Those rats were CG and they didn’t appear until the end of the process. So the rats in the difference window were bright white while everything else was a dark color.”
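The difference-window idea is easy to picture in a few lines of code: compare the same frame from two versions of the cut and paint any changed pixels white. This is only a conceptual sketch, not Conformalizer’s algorithm; the file names and threshold are assumptions.

```python
# Frame-difference sketch: changed pixels (e.g., newly added CG rats) turn white.
import numpy as np
from PIL import Image

old = np.asarray(Image.open("old_cut_frame.png").convert("L"), dtype=np.int16)
new = np.asarray(Image.open("new_cut_frame.png").convert("L"), dtype=np.int16)

diff = np.abs(new - old)                  # per-pixel difference
mask = (diff > 8).astype(np.uint8) * 255  # threshold: changed pixels become white
Image.fromarray(mask).save("difference.png")
```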

Another benefit is that the Conformalizer change list can be used on multiple Pro Tools sessions. Most feature films have the sound effects, including Foley and backgrounds, in one session. For Spider-Man: Homecoming, it was split into multiple sessions, with Foley and backgrounds in one session and the sound effects in another.

“Once you get that change list you can run it on all the Pro Tools sessions,” explains Norris. “It saves time and it helps with accuracy. There are so many sounds and details that match the visuals and we need to make sure that we are conforming accurately. When things get hectic, especially near the end of the schedule, and we’re finalizing the track and still getting new visual effects, it becomes a very detail-oriented process and any tools that can help with that are greatly appreciated.”

Creating the soundtrack for Spider-Man: Homecoming required collaboration on a massive scale. “When you’re doing a film like this, it just has to run well. Unless you’re really organized, you’ll never be able to keep up. That’s the beautiful thing, when you’re organized you can be creative. Everything was so well organized that we got an opportunity to be super creative and for that, we were really lucky. As a crew, we were so lucky to work on this film,” concludes Ticknor.


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter @audiojeney.

ILM’s Richard Bluff talks VFX for Marvel’s Doctor Strange

By Daniel Restuccio

Comic book fans have been waiting for over 30 years for Marvel’s Doctor Strange to come to the big screen, and dare I say it was worth the wait. This is in large part because of the technology now available to create the film’s stunning visual effects.

Fans have the option to see the film in traditional 2D, Dolby Cinema (worthy of an interstate or plane fare pilgrimage, in my opinion) and IMAX 3D. Doctor Strange, Marvel Studios’ 15th film offering, is also receiving good critical reviews and VFX Oscar buzz — it’s currently on the list of 20 films still in the running in the Visual Effects category for the 89th Academy Awards.

The unapologetically dazzling VFX shots, in many cases directly inspired by the original comic visuals by Steve Ditko, were created by multiple visual effects houses, including Industrial Light & Magic, Luma Pictures, Lola VFX, Method Studios, Rise FX, Crafty Apes, Framestore, Perception and previs house The Third Floor. Check out our interview with the film’s VFX supervisor Stephane Ceretti.

Director Scott Derrickson said in a recent Reddit chat that Doctor Strange is “a fantastical superhero movie.”

“Watching the final cut of the film was deeply satisfying,” commented Derrickson. “A filmmaker cannot depend upon critical reviews or box office for satisfaction — even if they are good. The only true reward for any artist is to pick a worthy target and hit it. When you know you’ve hit your target that is everything. On this one, I hit my target.”

Since we got an overview of how the visual effects workflow went from Ceretti, we decided to talk to one of the studios that provided VFX for the film, specifically ILM and their VFX supervisor Richard Bluff.

Richard Bluff

According to Bluff, early in pre-production Marvel presented concept art, reference images and previsualization on “what were the boundaries of what the visuals could be.” After that, he says, they had the freedom to search within those bounds.

During VFX presentations with Marvel, ILM frequently showed three versions of the work. “They went with the craziest version to the point where the next time we would show three more versions and we continued to up the ante on the crazy,” recalls Bluff.

As master coordinator of this effort for ILM, Bluff encouraged his artists, “to own the visuals and try to work out how the company could raise the quality of the work or the designs on the show to another level. How could we introduce something new that remains within the fabric of the movie?”

As a result, says Bluff, they had some amazing ideas flow from individuals on the film. Jason Parks came up with the idea of traveling through the center of a subway train as it fractured. Matt Cowey invented the notion of continually rotating the camera to heighten the sense of vertigo. Andrew Graham designed the kaleidoscope-fighting arena “largely because his personal hobby is building and designing real kaleidoscopes.”

Unique to Doctor Strange is that the big VFX sequences are all very “self-contained.” For example, ILM did the New York and Hong Kong sequences, Luma did the Dark Dimension and Method did the multi-universe. ILM also designed and developed the original concept for the Eldritch Magic and provided all the shared “digital doubles” — CGI rigged and animatable versions of the actors — that tied sequences together. The digital doubles were customized to the needs of each VFX house.

Previs
In some movies, previs material is generated and thrown away. Not so with Doctor Strange. This time, ILM developed a previs workflow where they could actually hang assets and continue to develop them, so the previs became part of the shot from the earliest iteration.

There was extensive previs done for Marvel by The Third Floor as a creative and technical guide across the movie, with further iterations done internally at ILM by its lead visualization artist, Landis Fields.

Warning! Spoiler! Once Doctor Strange moves the New York fight scene into the mirror universe, the city starts coming apart in an M.C. Escher-meets-Chris Nolan-Inception kind of way. To make that sequence, ILM created a massive tool kit of New York set pieces and geometry, including subway cars, buildings, vehicles and fire escapes.

In the previs, Fields started breaking apart, duplicating and animating those objects, like the fire escapes, to tell the story of what a kaleidoscoping city would look like. The artists then fleshed out a sequence of shots, a.k.a. “mini beats.” They absorbed the previs into the pipeline by later switching out the gross geometry elements in Fields’ previs with the actual New York hero assets.

Strange Cam
Fields and the ILM team also designed and built what ILM dubbed the “strange cam,” a custom 3D-printed 360° GoPro rig that had to withstand the rigors of being slung off the edge of skyscrapers. ILM wanted to capture 360 degrees of rolling footage from that vantage point to use as moving background “plates” that could be reflected in New York City’s glass buildings.

VFX, Sound Design and the Hong Kong Sequence
One of the big challenges with the Hong Kong sequence was that time was reversing and moving forward at the same time. “What we had to do was ensure the viewer understands that time is reversing throughout that entire sequence,” says Bluff. During the tight hand-to-hand action moments that move forward in time, there’s not much screen space to show time reversing in the background, so they designed the reversing destruction to work in concert with the sound design. “We realized we had to move away from a continuous shower of debris toward rhythmic beats of debris being sucked out of frame.”


Bluff says the VFX shot count on the film — 1,450 shots — was actually a lot less than Captain America: Civil War. From a VFX point of view, the Avengers movies lean on the assets generated for Iron Man and Captain America, and the Thor movies help provide the context for what an Avengers movie would look and feel like. In Doctor Strange, “almost everything in the movie had to be designed (from scratch) because it didn’t already exist in a previous Marvel film. It’s a brand-new character to the Marvel world.”

Bluff started development on the movie in October of 2014 and really started doing hands-on work in February of 2016, frequently traveling between Vancouver, San Francisco and London. A typical day, working out of the ILM London office, would see him get in early and immediately deal with review requests from San Francisco. Then he would jump into “dailies” in London and work with that team until the afternoon. After “nightlies” with London, there was a “dailies” session with San Francisco and Vancouver; he would work with them until evening, hit the hotel, grab some dinner, then come back around 11:30pm or midnight for “nightlies” with San Francisco. “It just kept the team together, and we never missed a beat.”

2D vs. IMAX 3D vs. Dolby Cinema
Bluff saw the entire movie for the first time in IMAX 3D and is looking forward to seeing it in 2D. Considering that sequences in the movie are surreal and Escher-like, there’s an argument that IMAX 3D is the better way to see it, since it enhances that already bizarre version of the world. However, he believes the 2D and 3D versions are really “two different experiences.”

Dolby Cinema is the merging of Dolby Atmos — 128-channel surround sound — with the high dynamic range of Dolby Vision, plus really comfortable seats. It is, arguably, the best way to see a movie. Bluff says as far as VFX goes, high dynamic range information has been there for years. “I’m just thankful that exhibition technology is finally catching up with what’s always been there for us on the visual effects side.”

During that Reddit interview, Derrickson commented, “The EDR (Extended Dynamic Range) print is unbelievable — if you’re lucky enough to live where an EDR print is playing. As for 3D and/or IMAX, see it that way if you like that format. If you don’t, see it 2D.”

Doctor Strange is probably currently playing in a theater near you, but go see it in Dolby Cinema if you can.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

Marvel’s Victoria Alonso to receive VES Visionary Award

The VES (Visual Effects Society) has named Victoria Alonso, producer and Marvel Studios EVP of production, as the next recipient of its Visionary Award in recognition of her contributions to visual arts and filmed entertainment. The award will be presented to Alonso at the 15th Annual VES Awards on February 7 at the Beverly Hilton.

The VES Visionary Award, voted on by the VES board of directors, “recognizes an individual who has uniquely and consistently employed the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.” VES will honor Alonso for her dedication to the industry and advancement of storytelling through visual effects.

Alonso is currently executive producing James Gunn’s Guardians of the Galaxy Vol. 2 and Taika Waititi’s Thor: Ragnarok. In her executive role, she oversees post and visual effects for Marvel’s slate. She executive produced Scott Derrickson’s Doctor Strange, Joe and Anthony Russo’s Captain America: Civil War, Peyton Reed’s Ant-Man, Joss Whedon’s Avengers: Age of Ultron, James Gunn’s Guardians of the Galaxy, Joe and Anthony Russo’s Captain America: The Winter Soldier, Alan Taylor’s Thor: The Dark World and Shane Black’s Iron Man 3, as well as Marvel’s The Avengers for Joss Whedon. She co-produced Iron Man and Iron Man 2 with director Jon Favreau, Kenneth Branagh’s Thor and Joe Johnston’s Captain America: The First Avenger.

Alonso’s career began as a commercial VFX producer. From there, she VFX-produced numerous feature films, working with such directors as Ridley Scott (Kingdom of Heaven), Tim Burton (Big Fish) and Andrew Adamson (Shrek), to name a few.

Over the years, Alonso’s dedication to the industry has been admired and her achievements recognized. Alonso was the keynote speaker at the 2014 Visual Effects Society Summit, where she exemplified her role as an advocate for women in the visual effects industry. In 2015, she was an honoree of the New York Women in Film & Television’s Muse Award for Outstanding Vision and Achievement.  This past January she was presented with the Advanced Imaging Society’s Harold Lloyd Award and was recently named to Variety’s 2016 Power of Women L.A. Impact Report, which spotlights creatives and executives who’ve ‘rocked’ the industry in the past year.

Alonso is in good company. Previous winners of the VES Visionary Award have been Christopher Nolan, Ang Lee, Alfonso Cuarón, J.J. Abrams and Syd Mead.