Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco's character, Jake Epping, is being chased by Kennedy's security after he tries to sneak into a campaign rally. He ducks into a storage room to hide, but he's already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I'm not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri, came in. Yes, you read that right: Kansas City. His resume is impressive, and he is a frequent collaborator with Jay Worth, Bad Robot's VFX supervisor.

For this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush in Photoshop. Once the exact look was determined, including the number of attacking roaches, they animated the roaches in 3D and composited them. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple base roach walks and behaviors and then populated each scene with instances of that. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications with texture and modeling. Some of this affected the rig we’d built so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D, 3D and by-hand 3D tracking of the roaches on Franco's body as he turns and swats at them in a panic. The production had taped large "Xs" on his jacket to help with this roto-tracking, but those too had to be painted out for many shots prior to the roaches reaching Franco.

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red-light and white-light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that was actually double the anamorphic squeeze to duplicate the vertical bokeh and DOF of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.
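To make the desqueeze step concrete, here is a minimal sketch of how an anamorphic plate's display size relates to its captured size. This is illustrative only, not BranitFX's actual pipeline; the 2:1 squeeze factor is the one quoted in the interview, while the frame dimensions are hypothetical.

```python
def desqueeze_size(width, height, squeeze=2.0):
    """Return the display (desqueezed) resolution of an anamorphic plate.

    An anamorphic lens optically squeezes the scene horizontally by
    `squeeze`, so unsqueezing a tracking plate stretches the width back
    out while the height is untouched.
    """
    return int(round(width * squeeze)), height

# Hypothetical 2:1 capture area: tracking happens on the wider,
# desqueezed plate, while the final composite goes back onto the
# original squeezed frames.
print(desqueeze_size(2048, 1716))
```

Tracking on desqueezed plates keeps the solvers working with geometrically correct pixels; the matching squeeze is then effectively reapplied (here, via the doubled-squeeze LightWave camera) so renders line up with the original anamorphic frames.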

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses. So I had just recently developed some familiarity and techniques to deal with the unusual lens attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus element vertically while the image was actually stretched the other way.

What are you working on now?
I've wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I've been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

Infinite Fiction

Republic Editorial launches design/VFX studio

Republic Editorial in Dallas has launched a design- and VFX-focused sister studio, Infinite Fiction, and leading the charge as executive producer is visual effects industry veteran Joey Cade. In her new role, she will focus on developing Infinite Fiction’s sales and marketing strategy, growing its client roster and expanding the creative team and its capabilities. More on her background in a bit.

Infinite Fiction, which is being managed by Republic partners Carrie Callaway, Chris Gipson and Keith James, focuses on high-end, narrative-driven motion design and visual effects work for all platforms, including virtual reality. Although it shares management with Republic Editorial, Infinite Fiction is a stand-alone creative shop and will service agencies, outside post houses and entertainment studios.

Infinite Fiction is housed separately, but located next door to Republic Editorial’s uptown Dallas headquarters. It adds nearly 2,000 square feet of creative space to Republic’s recently renovated 8,000 square feet and is already home to a team of motion designers, visual effects artists, CG generalists and producers.

Cade began her career in live-action production working with Hungry Man, Miramax and NBC. She gained expertise in visual effects and animation at Reel FX, which grew from a 30-person boutique to an over 300-person studio with several divisions during her tenure. As its first entertainment division executive producer, Cade won business with Sony TV, Universal, A&E Networks and ABC Family as well as produced Reel FX’s first theatrical VFX project for Disney. She broadened her skill set by launching and managing a web-based business and gained branding, marketing and advertising experience within small independent agencies, including Tractorbeam.

Infinite Fiction already has projects in its pipeline, including design-driven content pieces for TM Advertising, Dieste and Tracy Locke.

Credit: Film Frame ©2016 Marvel. All Rights Reserved.

Digging Deeper: Doctor Strange VFX supervisor Stephane Ceretti

By Daniel Restuccio

Marvel's Doctor Strange — about an arrogant neurosurgeon who loses the use of his hands in an accident and sets off on a self-obsessed journey to find a cure — has been doing incredibly well in terms of box office. You've got the winning combination of Benedict Cumberbatch, Marvel, a compelling story and a ton of visual effects created by some of the biggest houses in the business, including ILM (London, San Francisco, Vancouver), Method (LA, Vancouver), Luma (LA, Melbourne), Framestore London, Lola, Animal Logic, Crafty Apes, Exceptional Minds and Technicolor VFX.

Stephane Ceretti

Leading the VFX charge was visual effects supervisor Stephane Ceretti, whose credit list reads like a Top 10 list for films based on Marvel comics, including Guardians of the Galaxy, Thor: The Dark World, Captain America: The First Avenger and X-Men: First Class. His resume is long and impressive.

We recently reached out to Ceretti to find out more about Doctor Strange‘s VFX process…

When did you start on the project? When were all the shots turned in?
I started in September 2014 as Scott Derrickson, the director, was working on the script. Production got pushed a few months while we waited for Benedict Cumberbatch to be available, but we worked extensively on previz and visual development during all this time. Production moved to London in June 2015 and shooting began in November 2015 and went until March 2016. Shots and digital asset builds got turned over as we were shooting and in post, as the post production period was very short on the film. We only had 5.5 months to do the visual effects. We finished the film sometime in October, just a few weeks before the release.

What criteria did you use to distribute the shots among the different VFX companies?  For example, was it based on specialty areas?
It’s like a casting; you try to pick the best company and people for each style of effects. For example, ILM had done a lot of NYC work before, especially with Marvel on Avengers. Plus they are a VFX behemoth, so for us it made sense to have them on board the project for these two major sequences, especially with Richard Bluff as their supervisor. He worked with my VFX producer Susan Pickett on the New York battle sequence in Avengers and she knew he would totally be great for what we wanted to achieve.

What creative or technical breakthroughs were there on this project? For example, ILM talked about the 360 Dr. Strange Camera. What were some of the other things that had never been done before?
I think we pushed the envelope on a lot of visual things that had been touched before, but not to that level. We also made huge use of digital doubles extremely close to camera, both in the astral world and the magic mystery tour. It was a big ask for the vendors.

ILM said they did the VFX at IMAX 2K, were any of the VFX shots done at 4K? If yes, why?
No, we couldn't do a 4K version for the IMAX on this project. IMAX takes care of upresing the shots to IMAX resolution with their DMR process. The quality of the Alexa 65, which we used to shoot the movie, makes it a much smoother process, as the images were much sharper and more detailed to begin with.

It may be meaningless to talk about how many effects shots there were in the movie when it seems like every shot is a VFX shot.  Is there a more meaningful way to describe the scale of the VFX work? 
It is true that just looking at the numbers isn’t a good indication … we had 1,450 VFX shots in the film, and that’s about 900 less than Guardians of the Galaxy, but the shot complexity and design was way more involved because every shot was a bit of a puzzle, plus the R&D effort.

Some shots with the Mandelbrot 3D fractals required a lot of computing power, having a fully bending CG NY required tons of assets, and the destruction simulation in Hong Kong had to be extremely precise, as we were really inside the entire street being rebuilt in reversed time. All of these were extremely time- and process-consuming and needed to be choreographed and designed precisely.

Can you talk about the design references Marvel gave you for the VFX work done in this movie?
Well most of the references that Marvel gave us came from the comics, especially the ones from Steve Ditko, who created all the most iconic psychedelic moments in Doctor Strange in the ‘60s and ‘70s. We also looked at a Doctor Strange comic called “The Oath,” which inspired some of the astral projection work.

How did you draw the line stylistically and creatively between impressively mind-blowing and over-the-top psychedelic?
It was always our concern to push the limits but not break them. We want to take the audience to these new places but not lose them on the way. It was a joint effort between the VFX artists and the director, editors and producers to always keep in mind what the goal of the story was and to make sure that the VFX wouldn’t take over when it was not necessary. It’s important that the VFX don’t overtake the story and the characters at any time. Sometimes we allow ourselves to shine and show off but it’s always in the service of pushing the story further.

What review and submission technology did you use to coordinate all the VFX houses? Was there a central server?
We used CineSync to review all the submissions live with the vendors. Marvel has a very strong IT department and servers that allow the various vendors to send their submission securely and quickly. We used a system called Signiant that allows all submissions to be automatically sorted and put in a database for review. It’s very efficient and necessary when you get a huge amount of submissions daily as we did toward the end of the project. Our team of amazing coordinators made sure everything was reviewed and presented to the studio so we could give immediate feedback to our vendors, who worked 24/7 around the globe to finish the movie.

What project management software did you use?
Our database is customized and we use Filemaker. Our review sessions are a mixture of CineSync (QuickTime and interactive reviews) and Tweak RV for 2K viewing and finalizing.

In talking to ILM about the film, they mentioned previs, production and postvis. Can you talk a bit about that whole workflow?
We do extensive previz/techviz and stuntviz before production, but as soon as the shots are in the can, the editors cut them into the movie. They are then turned over to our postviz team so we can quickly check that everything works, and the editors can cut in a version of the shot that represents the idea of what it will be in the end. It's a fantastic tool that allows us to shape the film before we turn it over to the vendors, so we nail basic ideas and concepts before they get executed. Obviously, there is lots that the vendors will add on top of the postviz, but this process is necessary for a lot of reasons (editing, R&D, preview screenings) and is very efficient and useful.

Collectively how many hundreds of people worked on the VFX on this movie?
I would say about 1,000 people in the VFX overall. That does not count the 3D conversion people.

What was the personal challenge for you? How did you survive and thrive while working on this one project?
I worked two years on it! It was really difficult, but also very exciting. Sometimes mentally draining and challenging, but always interesting. What makes you survive is the excitement of making something special and getting to see it put together by such a talented group of people across the board. When you work on this kind of film everybody does their best, so the outcome is worth it. I think we definitely tried to do our best, and the audience seems to respond to what we did. It’s incredibly rewarding and in the end, it’s the reason why we make these movies — so that people can enjoy the ride.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

New Wacom Cintiq Pro line offers portability, updated pen, more

Wacom has introduced a new line of Wacom Cintiq Pro creative pen displays: the Cintiq Pro 13 and Cintiq Pro 16. Both displays feature a thin, portable form factor, making them suitable for working on the road or remotely.

Cintiq Pro’s new Pro Pen 2, according to Wacom, offers four times greater accuracy and pressure sensitivity than the previous Pro Pen. The improved Pro Pen 2 creates an intuitive experience with virtually lag-free tracking on a glass surface that produces the right amount of friction, and is coated to reduce reflection.

Additionally, the new optical bonding process reduces parallax, providing a pen-on-screen performance that feels natural and has the feedback of a traditional pen or brush. Both Cintiq Pro models also feature multi-touch for easy and fast navigation, as well as the ability to pinch, zoom and rotate illustrations, photos or models within supporting 2D or 3D creative software apps.

Both high-resolution Cintiq Pro models come with an optimized edge-to-edge etched glass workspace. The Cintiq Pro also builds on its predecessor, the Cintiq 13HD touch, offering the ExpressKey Remote as an optional accessory so users can customize their most commonly used shortcuts and modifiers when working with their most-used software applications. In addition, ergonomic features, such as ErgoFlex, fully integrated pop out legs and an optional three-position desk stand (available in February), let users focus on their work instead of constantly adjusting for comfort.

The Wacom Cintiq Pro 13 and 16 are compatible with both Macs and PCs and feature full HD (1920×1080) and UHD (3840×2160) resolution, respectively. Both Cintiq Pro configurations deliver vivid colors, the 13-inch model providing 87 percent Adobe RGB and the 16-inch, 94 percent.

Priced at $999.95 USD, the Cintiq Pro 13 is expected to be available online and at select retail locations at the beginning of December. The Cintiq Pro 16, $1499.95 USD, is expected in February.

GenPop’s Bill Yukich directs, edits gritty open for Amazon’s Goliath 

Director/editor Bill Yukich helmed the film noir-ish opening title sequence for Amazon's new legal drama, Goliath. Produced by LA-based content creation studio GenPop, the black-and-white intro starts with Goliath lead actor Billy Bob Thornton jumping into the ocean. While underwater, smoking a cigarette and holding a briefcase, he casually strolls through rooms filled with smoke and fire. At the end of the open, he rises from the water as the Santa Monica Pier appears next to him and the picture turns from B&W to color. The Silent Comedy's "Bartholomew" track plays throughout.

The ominous backdrop, of a man underwater but not drowning, is a perfect visual description of Thornton's role as disgraced lawyer Billy McBride. Yukich's visuals, he says, are meant to strike a balance between dreamlike and menacing.

The approved concept called for a dry shoot, so Yukich came up with solutions to make it seem as though the sequence was actually filmed underwater. Shot on a Red Magnesium Weapon camera, Yukich used a variety of in-camera techniques to achieve the illusion of water, smoke and fire existing within the same world, including the ingenious use of smoke to mimic the movement of crashing waves.

After wrapping the live-action shoot with Thornton, Yukich edited and color corrected the sequence. The VFX work was mostly supplementary, used to enhance the practical effects captured on set, such as adding extra fireballs into the frame to make the pyrotechnics feel fuller. Editing was done in Adobe Premiere, while VFX and color were handled in Autodesk Flame. In the end, 80 percent was live action and only 20 percent visual effects.

Once post production was done, Yukich projected the sequence onto a screen which was submerged underwater and reshot the projected footage. Though technically challenging, Yukich says, this Inception-style method of re-shooting the footage gave the film the organic quality that he was looking for.

Yukich recently worked as lead editor for Beyoncé’s visual album Lemonade. Stepping behind the lens was a natural progression for Yukich, who began directing concerts for bands like Godsmack and The Hollywood Undead, as well as music videos for HIM, Vision of Disorder and The Foo Fighters.

Marvel’s Victoria Alonso to receive VES Visionary Award

The VES (Visual Effects Society) has named Victoria Alonso, producer and Marvel Studios EVP of production, as the next recipient of its Visionary Award in recognition of her contributions to visual arts and filmed entertainment. The award will be presented to Alonso at the 15th Annual VES Awards on February 7 at the Beverly Hilton.

The VES Visionary Award, voted on by the VES board of directors, “recognizes an individual who has uniquely and consistently employed the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.” VES will honor Alonso for her dedication to the industry and advancement of storytelling through visual effects.

Alonso is currently executive producing James Gunn’s Guardians of the Galaxy Vol. 2 and Taika Waititi’s Thor: Ragnarok. In her executive role, she oversees post and visual effects for Marvel’s slate. She executive produced Scott Derrickson’s Doctor Strange, Joe and Anthony Russo’s Captain America: Civil War, Peyton Reed’s Ant-Man, Joss Whedon’s Avengers: Age of Ultron, James Gunn’s Guardians of the Galaxy, Joe and Anthony Russo’s Captain America: The Winter Soldier, Alan Taylor’s Thor: The Dark World and Shane Black’s Iron Man 3, as well as Marvel’s The Avengers for Joss Whedon. She co-produced Iron Man and Iron Man 2 with director Jon Favreau, Kenneth Branagh’s Thor and Joe Johnston’s Captain America: The First Avenger.

Alonso’s career began as a commercial VFX producer. From there, she VFX-produced numerous feature films, working with such directors as Ridley Scott (Kingdom of Heaven), Tim Burton (Big Fish) and Andrew Adamson (Shrek), to name a few.

Over the years, Alonso’s dedication to the industry has been admired and her achievements recognized. Alonso was the keynote speaker at the 2014 Visual Effects Society Summit, where she exemplified her role as an advocate for women in the visual effects industry. In 2015, she was an honoree of the New York Women in Film & Television’s Muse Award for Outstanding Vision and Achievement.  This past January she was presented with the Advanced Imaging Society’s Harold Lloyd Award and was recently named to Variety’s 2016 Power of Women L.A. Impact Report, which spotlights creatives and executives who’ve ‘rocked’ the industry in the past year.

Alonso is in good company. Previous winners of the VES Visionary Award include Christopher Nolan, Ang Lee, Alfonso Cuarón, J.J. Abrams and Syd Mead.

Behind the Title: Reel FX editor Chris Collins

NAME: Chris Collins
 
COMPANY: Reel FX (@wearereelfx) in Dallas
 
CAN YOU DESCRIBE YOUR COMPANY?
Reel FX is made up of directors, editors, animators, VFX artists, audio engineers and more. We work on everything from feature-length projects to commercials to VR/360 experiences.

WHAT’S YOUR JOB TITLE?
Editor

WHAT DOES THAT ENTAIL?
What it means to be an editor depends on what kind of editor you ask. If you ask me, the editor is the final director — the person responsible for compiling and composing the hard work of production into a finalized coherent piece of media. Sometimes it’s simple and sometimes there is a lot of creative problem-solving. Sometimes you only cut footage, sometimes you dive into Photoshop, After Effects and other programs to execute a vision. Sometimes there is only one way to make a video work, and sometimes there are infinite ways a piece can be cut. It all depends on the concept and production.

Now with VR, a whole new aspect of editing has opened up by being able to put on a headset and be transported into your footage. I couldn’t be more excited to see the new places that VR can take editing.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think people look at editors and think the job is easy because they sit in a cozy office on the computer… and sometimes, they’re not wrong. But there is a lot of hidden stress, problem-solving and creativity that is invisible within a finished piece of media. They may watch a final cut and never notice all the things an editor did or fixed — and that’s what makes a good editor.

WHAT DO YOU EDIT ON?
I cut on Avid Media Composer and Adobe Premiere but I also use Adobe’s After Effects and Photoshop in my work.

DO YOU HAVE A FAVORITE PLUG-IN?
Right now it would have to be Mettle’s Skybox VR Player because it allows me to edit and view my cut of 360 footage within the Oculus headset — so ridiculously cool!

WHAT’S YOUR FAVORITE PART OF THE JOB?
Screening a cut to someone for the first time and watching their reaction.

WHAT’S YOUR LEAST FAVORITE?
The fact that the majority of people will never see or know all the unused footage and options that didn’t make the cut.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I’d have to say those first few hours in the morning with my coffee and late at night after hours because that is when I am the most creative.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Photography.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I’ve been shooting and editing videos since I was a little kid. It carried on into high school and then into college, since that’s really what my hobby and passion was. It was one of the only things I was good at besides video games so it seemed like a no-brainer.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
TGI Fridays Countdown, Ram Division of Labor, Jeep Renegade campaign, Tostitos Recipe Videos and Texas Health Resources.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
A video I cut called Jeep Legendary Lives. It started out as a personal project of mine that I was cutting after hours, and eventually became part of a presentation video that opened the Detroit Auto Show. It’s also one of the first things that got my foot in the door as an editor.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Phone. Computer. Camera.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Facebook. Instagram. Twitter.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? 
Only if the footage does not have audio and I am in the organization and melting phase. It’s usually some sort of chill electronic — typically instrumental.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Playing video games or taking photos really helps to distract my brain, but typically all I have to do is remind myself that I am getting paid doing something I love and something that I’ve been passionate about since I was a kid. With that thought, it’s hard not to be anything but grateful.

Ronen Tanchum brought on to run The Artery’s new AR/VR division

New York City’s The Artery has named Ronen Tanchum head of its newly launched virtual reality/augmented reality division. He will serve as creative director/technical director.

Tanchum has a rich VFX background, having produced complex effects set-ups and overseen digital tools development for feature films including Deadpool, Transformers, The Amazing Spider-Man, Happy Feet 2, Teenage Mutant Ninja Turtles and The Wolverine. He is also the creator of the original VR film When We Land: Young Yosef. His work on The Future of Music — a 360-degree virtual experience from director Greg Barth and Phenomena Labs, which immerses the viewer in a surrealist musical space — won the D&AD Silver Award in the "Best Branded Content" category in 2016.

“VR today stands at just the tip of the iceberg,” says Tanchum. “Before VR came along, we were just observers and controlled our worlds through a mouse and a keyboard. Through the VR medium, humans become active participants in the virtual world — we get to step into our own imaginations with a direct link to our brains for the first time, experiencing the first impressions of a virtual world. As creators, VR offers us a very powerful tool by which to present a unique new experience.”

Tanchum says the first thing he asks a potential new VR client is, "Why VR? What is the role of VR in your story?" He explains, "Coming from our long experience in the CG world working on highly demanding creative visual projects, we at The Artery have evolved our collective knowledge and developed a strong pipeline into this new VR platform," adding that The Artery's new division is currently gearing up for a big VR project for a major brand. "We are using it to its fullest to tell stories. We inform our clients that VR shouldn't be created just because it's 'cool.' The new VR platform should be used to play an integral part of the storyline itself — a well crafted VR experience should embellish and complement the story."

 

The Colonie provides editing, VFX for Toyota Corolla spot

Chicago’s The Colonie has teamed with Burrell Communications to provide editorial, visual effects and design services for I Do, a broadcast spot introducing the 2017 Toyota Corolla.

Creative editor Bob Ackerman edited with the carmaker’s tagline, “Let’s Go Places,” in mind. Fast-paced cuts and editing effects helped create an upbeat message that celebrates the new Corolla, as well as its target audience — what Toyota refers to as today’s “on-the-go” generation of young adults.

Lewis Williams, Burrell’s EVP/CCO, brought The Colonie onboard early in the process to collaborate with the production company The Cavalry, ensuring the seamless integration of a variety of visual effects that were central to the style of this spot.

The commercial integrates three distinct vignettes. The spot opens with a young woman behind the wheel of her Toyota. She arrives at a city park and her friends help her yarn bomb the surroundings — from hand-knitted tree trunk covers to a slipcover for a love seat and a garbage pail cozy in the likeness of whimsical characters.

From the get-go, art director Winston Cheung was very focused on keeping the tone of the spot fresh and young. When selecting footage during the edit session, Ackerman and Cheung made sure to use some of the more playful set-ups from the yarn vignette to provide the bold color palette for the final transfer.

The second scenario finds an enterprising man parking his Corolla and unloading his "Pop-Up Barbershop" in front of a tall wall featuring artful graffiti. A well-placed painting of a young man's face extending over the top of the wall completes the picture. As soon as the barber sets up his chair, his first customer arrives.

The third vignette features a young filmmaker shooting footage of the 2017 Toyota as her crew adds some illuminating effects. Taking her cues from this scene, The Colonie senior designer Jen Moody crafted a series of shots that use a "light painting" technique to create a trail-of-light effect. One of the characters writes the spot's title, I Do, with a light, which Moody layered to create a more tangible quality that really sells the effect. VFX supervisor Tom Dernulc took a classic Toyota Corolla from a previous segment and seamlessly integrated it into the background of the scene.

The Colonie’s team explored several methods for creating the various VFX in the spot before deciding upon a combination of Autodesk Flame Premium and Adobe After Effects. Then it was a matter of picking the right moments. Ackerman grabbed some of their top choices, roughed in the effect on the Avid Media Composer, and presented the client with a nearly finished look right from the very first rough cuts.

"Early on, creative director Lisa McConnell had expressed a desire to explore using a series of stills flashing (à la TV's Scandal) to advance the spot's story," says Ackerman. "We loved the idea. Condensing short sequences of footage into rapid progressions of imagery provided us with an innovative way to convey the full scope of these three scenarios in a very limited 30-second time frame — while also adding an interesting visual element to the final spot."

Fred Keller of Chicago’s Filmworkers provided the color grade, CRC’s Ian Scott performed the audio mix and sound design, and composers Mike Dragovic, Michael Yessian and Brian Yessian provided the score.

The A-List: Director Terrence Malick’s team behind Voyage of Time

By Iain Blair

A new Terrence Malick film is always a cinematic event, and his latest, Voyage of Time, doesn’t disappoint. Thought provoking and visually transcendent, it’s nothing less than a celebration of life and the grand history of the cosmos. It’s a journey that spans the eons from the Big Bang to the dinosaur age to our present human world… and beyond.

A labor of love, several decades in the making, it also represents Malick’s first foray into documentary storytelling and will be released in two formats: Voyage of Time: Life’s Journey, the 90-minute version narrated by Cate Blanchett, and Voyage of Time: The IMAX Experience, a 45-minute version narrated by Brad Pitt.

Of course, as all cinephiles know, it’s hard to say which is the bigger mystery, Malick or the origin of the universe. The reclusive, enigmatic, thrice Oscar-nominated director — who taught philosophy at MIT before attending AFI — is the writer/director of such films as Badlands, Days of Heaven, The Thin Red Line, The New World, The Tree of Life, To the Wonder, Knight of Cups and an upcoming untitled project starring Ryan Gosling, Michael Fassbender and Rooney Mara. But in the 43 years since the release of Badlands, Malick has rarely done any press.

Happily, longtime collaborators, such as producers Sarah Green and Nicolas Gonda, as well as the film’s VFX supervisor, Dan Glass, and producer, Sophokles Tasioulis, were eager to talk about making the film and Malick’s vision for it.

How long has this labor of love been in the making?
Nicolas Gonda: The ideas behind this film have been gestating ever since Terry’s childhood. He’s always been fascinated by nature and man’s place in the universe. It’s always been on his mind. Sarah and I began working very actively on it about 14 years ago, making budgets and planning all the different shoots we knew we’d need. About four years ago, we joined forces with Sophokles, who brought his great experience making natural history documentaries to both the post part of it and the international distribution.

Why did he decide to make two versions of the film?
Sarah Green: Terry had a very strong vision for it, and he saw opportunities to expand on it this way — he saw it could be for both the educational and entertainment markets. So when he shot, it was for both films, and then we could just edit for each version.

Having worked on many of his films, tell us about Terry’s approach to post.
Green: He absolutely loves post and sees it as this wonderful, exploratory experience for him. Whatever the film is, he always likes to shoot a lot of footage during principal photography, and then he settles down to really explore it in post.

On this one, we all knew the beats he wanted to hit — whether they were emotional or cerebral — but post was also about trying to find the best way to tell this amazing story succinctly and emotionally.

Where did you do all the post?
Gonda: At our offices in Austin, and then we finished it all in LA, here at IMAX headquarters in Playa Vista, including the DI.

Sophokles Tasioulis: We had this great post supervisor, Jini Durr, and we were able to project all the VFX on the IMAX screen and really see what we had. It was a real education for all of us, because finishing in 4K is fairly standard, but here at IMAX, that’s their lowest resolution.

How did it work using two editors — Keith Fraase and Rehman Ali — who had also worked with Terry before?
Gonda: They were the main editors, but Terry used a number of editors on the project — a lot of young people who worked on specific shots or scenes. We’d joke that anyone over 25 would be kicked out of the edit room by Terry, because this film really gave him a chance to experiment and work with young kids, which he loved doing.

After all the years it took to become a reality, was Terry happy with the final film?
Gonda: Very, and I don’t think he could have done it 10 years ago. All the recent scientific discoveries put him in a position where it was possible, along with the great imagery of outer space from Hubble and NASA. But then, Terry could have continued working on it for another 10 or 20 years.

Green: He’s so curious and knowledgeable, and there was always a new study, a new theory, a new discovery that would excite him. In the early days, he’d give me lists of cutting-edge scientists to contact and learn from. (Laughs) And when they recently announced they had finally discovered gravity waves, I think we all panicked — “Oh no, how will we fit that into the film now?”

The Visual Effects

In a career spanning more than 20 years in the industry, Daniel Glass has built an extensive list of credits as a visual effects supervisor on such films as The Hateful Eight, Batman Begins and The Matrix franchise. Voyage of Time is the culmination of a 10-year working relationship with Malick that began with the Oscar-nominated and Palme d’Or-winning The Tree of Life.

When did you start on this film?
Ten years ago, but he’s been working on it for decades. He has footage in it that he shot back in the ‘70s. But it’s never really had a script, although obviously the story itself provides a timeline, and he had a huge number of notes, which formed the structure of the film. The whole thing was very much a journey in itself, and one major challenge was keeping up with all the scientific advances made over the past decade or so, and trying to incorporate those into the film.

I heard that you and Terry formed a “skunkworks” operation to deal with some of the imagery you developed?
Yeah, we set up this lab in Austin to do chemical experiments to see how various elements such as liquids, dyes, gases and fluids — and combinations of them — behaved as we filmed them at high speed. We used everything from gels and glass to smoke machines and fluid tanks to create a whole range of effects. We’d build up all these layers to illustrate tiny debris in the cosmos or floating particulates at the microbial level, along with various models of orbs with strong backlight and so on. It really was fascinating work.

We also built our own “flow tables” that we filled with milk and dyes and paints, to mimic galaxies. And we used water tanks, because just a few drops of dye in a water tank can feel like vast nebulae or strange microscopic environments. Those were really effective, and we also used ferrofluids that become strongly magnetized in the presence of a magnetic field to explore other unusual effects, like black holes. When we played with the ferrofluids, we were able to create these wild, amazing shapes by controlling the current around them. So you get very bizarre yet organic effects that suggest some of the theoretical ideas surrounding black holes. We also used salt crystals on a disc, which we then spun very slowly to replicate the movement in asteroid belts. The big challenge was that nearly every effects shot was a one-off, so the breadth of it was a bit daunting.

Where did you do the rest of the VFX?
Like the film, they were very disparate, and from vendors all over the world. Terry’s original mandate to me was that he wanted every shot to feel as if it was created by a different artist, and while that was a beautiful ideal, it was a very difficult process to manage. But we got pretty close to it in the end. He always wanted the VFX to stem from an analog place first, and not have that overly polished, artificial CG look.

Is it fair to say that every shot has some kind of VFX?
Yes, although some are relatively minor, while others used layer upon layer to create an effect. In the end, we used pretty much every digital tool out there, including Maya, Nuke, 3ds Max, Flame for clean-up, Houdini and a lot of custom renderers.

How do you look back on the project now?
In many ways this was a culmination of taking everything I’ve learned over all the films I’ve done, and then reapplying it in a different way, which made it immensely satisfying.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.