Tag Archives: visual effects

Bill Hewes

Behind the Title: Click 3X executive producer Bill Hewes

NAME: Bill Hewes

COMPANY: Click 3X (@Click3X) in New York City.

CAN YOU DESCRIBE YOUR COMPANY?
We are a digital creation studio that also provides post and animation services.

WHAT’S YOUR JOB TITLE?
I am an executive producer with a roster of animation and live-action directors.

WHAT DOES THAT ENTAIL?
Overseeing everything from the initial creative pitch through final delivery: working closely with directors, budgeting, determining the approach to a given project, overseeing line producers for shooting, animation and post, client relations and problem solving.

PGIM Prudential

One recent project was this animated spot for a Prudential Global Investment Management campaign.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably that there is no limit to the job description — it involves business skills, a creative sensibility, communication and logistics. It is not about the big decisions, but more about the hundreds of small ones made moment to moment in a given day that add up.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Winning projects.

WHAT’S YOUR LEAST FAVORITE?
Losing projects.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Depends on the day and where I am.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
A park ranger at Gettysburg.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I didn’t choose it. I had been on another career path in the maritime transportation industry and did not want to get on another ship, so I took an entry-level job at a video production company. From day one, there was not a day I did not want to go to work. I was fortunate to have had great mentors who made it possible to learn and advance.

Click it or Ticket

‘Click it or Ticket’ for the National Highway Traffic Safety Administration.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Two animated spots for Prudential Global Investment Management, commercials and a social media campaign for Ford Trucks, and two humorous online animated spots for the NHTSA’s “Click It or Ticket” campaign.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
A few years back, I took some time off and worked with a director for several months creating films for Amnesty International. Oh, and putting a Dodge Viper on a lava field on a mountain in Hawaii.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The wheel, anesthesia and my iPhone.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I share an office, so we take turns picking the music selections. Lately, we’ve been listening to a lot of Kamasi Washington, Telemann, J Mascis and My Bloody Valentine.

I also would highly recommend “I Plan to Stay a Believer” by William Parker and the album “The Inside Songs” by Curtis Mayfield.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Jeet Kune Do, boxing, Muay Thai, Kali/Escrima, knife sparring and some grappling. But I do this outside of the office.

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar nominee James Franco (127 Hours), and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jack Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks into a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri, came in. Yes, you read that right, Kansas City, and his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush via Photoshop. Once the exact look was determined regarding the amount of attacking roaches, they animated it in 3D and composited it. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple base roach walks and behaviors and then populated each scene with instances of that. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications with texture and modeling. Some of this affected the rig we’d built so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D tracking, 3D tracking and 3D tracking by hand for the roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those had to be painted out for many shots prior to the roaches reaching Franco.

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red light and white light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that was actually double the anamorphic squeeze to duplicate the vertical bokeh and DOF of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.
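The interim-plate round trip Branit describes amounts to a horizontal scale and its inverse: tracking happens in desqueezed space, and the results are mapped back onto the original 2:1 anamorphic plates for the final composite. A minimal sketch of that coordinate round trip (illustrative only, not BranitFX's actual tooling):

```python
def unsqueeze(width, height, squeeze=2.0):
    """Return the desqueezed frame size for an anamorphic plate.

    An anamorphic lens squeezes the image horizontally by `squeeze`;
    stretching the width back out restores natural geometry for tracking.
    """
    return (int(round(width * squeeze)), height)


def to_unsqueezed(x, y, squeeze=2.0):
    """Map a pixel coordinate from the squeezed plate into desqueezed space."""
    return (x * squeeze, y)


def to_squeezed(x, y, squeeze=2.0):
    """Map a tracked coordinate back onto the original anamorphic plate."""
    return (x / squeeze, y)
```

A track solved on the desqueezed interim plate can then be converted with `to_squeezed` before compositing over the untouched anamorphic original.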

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses. So I had just recently developed some familiarity and techniques to deal with the unusual lens attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus element vertically while the image was actually stretched the other way.

What are you working on now?
I’ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

Infinite Fiction

Republic Editorial launches design/VFX studio

Republic Editorial in Dallas has launched a design- and VFX-focused sister studio, Infinite Fiction, and leading the charge as executive producer is visual effects industry veteran Joey Cade. In her new role, she will focus on developing Infinite Fiction’s sales and marketing strategy, growing its client roster and expanding the creative team and its capabilities. More on her background in a bit.

Infinite Fiction, which is being managed by Republic partners Carrie Callaway, Chris Gipson and Keith James, focuses on high-end, narrative-driven motion design and visual effects work for all platforms, including virtual reality. Although it shares management with Republic Editorial, Infinite Fiction is a stand-alone creative shop and will service agencies, outside post houses and entertainment studios.

Infinite Fiction is housed separately, but located next door to Republic Editorial’s uptown Dallas headquarters. It adds nearly 2,000 square feet of creative space to Republic’s recently renovated 8,000 square feet and is already home to a team of motion designers, visual effects artists, CG generalists and producers.

Cade began her career in live-action production working with Hungry Man, Miramax and NBC. She gained expertise in visual effects and animation at Reel FX, which grew from a 30-person boutique to an over 300-person studio with several divisions during her tenure. As its first entertainment division executive producer, Cade won business with Sony TV, Universal, A&E Networks and ABC Family as well as produced Reel FX’s first theatrical VFX project for Disney. She broadened her skill set by launching and managing a web-based business and gained branding, marketing and advertising experience within small independent agencies, including Tractorbeam.

Infinite Fiction already has projects in its pipeline, including design-driven content pieces for TM Advertising, Dieste and Tracy Locke.

ILM’s Richard Bluff talks VFX for Marvel’s Doctor Strange

By Daniel Restuccio

Comic book fans have been waiting for over 30 years for Marvel’s Doctor Strange to come to the big screen, and dare I say it was worth the wait. This is in large part because of the technology now available to create the film’s stunning visual effects.

Fans have the option to see the film in traditional 2D, Dolby Cinema (worthy of an interstate or plane fare pilgrimage, in my opinion) and IMAX 3D. Doctor Strange, Marvel Studios’ 15th film offering, is also receiving good critical reviews and VFX Oscar buzz — it’s currently on the list of 20 films still in the running in the Visual Effects category for the 89th Academy Awards.

The unapologetically dazzling VFX shots, in many cases directly inspired by the original comic visuals by Steve Ditko, were created by multiple visual effects houses, including Industrial Light & Magic, Luma Pictures, Lola VFX, Method Studios, Rise FX, Crafty Apes, Framestore, Perception and previs house The Third Floor. Check out our interview with the film’s VFX supervisor Stephane Ceretti.

Director Scott Derrickson said in a recent Reddit chat that Doctor Strange is “a fantastical superhero movie.”

“Watching the final cut of the film was deeply satisfying,” commented Derrickson. “A filmmaker cannot depend upon critical reviews or box office for satisfaction — even if they are good. The only true reward for any artist is to pick a worthy target and hit it. When you know you’ve hit your target that is everything. On this one, I hit my target.”

Since we got an overview of how the visual effects workflow went from Ceretti, we decided to talk to one of the studios that provided VFX for the film, specifically ILM and their VFX supervisor Richard Bluff.

Richard Bluff

According to Bluff, early in pre-production Marvel presented concept art, reference images and previsualization on “what were the boundaries of what the visuals could be.” After that, he says, they had the freedom to search within those bounds.

During VFX presentations with Marvel, they frequently showed three versions of the work. “They went with the craziest version to the point where the next time we would show three more versions and we continued to up the ante on the crazy,” recalls Bluff.

As master coordinator of this effort for ILM, Bluff encouraged his artists “to own the visuals and try to work out how the company could raise the quality of the work or the designs on the show to another level. How could we introduce something new that remains within the fabric of the movie?”

As a result, says Bluff, they had some amazing ideas flow from individuals on the film. Jason Parks came up with the idea of traveling through the center of a subway train as it fractured. Matt Cowey invented the notion of continually rotating the camera to heighten the sense of vertigo. Andrew Graham designed the kaleidoscope-fighting arena “largely because his personal hobby is building and designing real kaleidoscopes.”

Unique to Doctor Strange is that the big VFX sequences are all very “self-contained.” For example, ILM did the New York and Hong Kong sequences, Luma did the Dark Dimension and Method did the multi-universe. ILM also designed and developed the original concept for the Eldritch Magic and provided all the shared “digital doubles” — CGI-rigged, animatable versions of the actors — that tied sequences together. The digital doubles were customized to the needs of each VFX house.

Previs
In some movies previs material is generated and thrown away. Not so with Doctor Strange. What ILM did this time was develop a previs workflow where they could actually hang assets and continue to develop, so it became part of the shot from the earliest iteration.

There was extensive previs done for Marvel by The Third Floor as a creative and technical guide across the movie, and further iterations internal to ILM done by ILM’s lead visualization artist, Landis Fields.

Warning! Spoiler! Once Doctor Strange moves the New York fight scene into the mirror universe, the city starts coming apart in an M.C. Escher-meets-Chris Nolan-Inception kind of way. To make that sequence, ILM created a massive tool kit of New York set pieces and geometry, including subway cars, buildings, vehicles and fire escapes.

In the previs, Fields started breaking apart, duplicating and animating those objects, like the fire escapes, to tell the story of what a kaleidoscoping city would look like. The artists then fleshed out a sequence of shots, a.k.a. “mini beats.” They absorbed the previs into the pipeline by later switching out the gross geometry elements in Fields’ previs with the actual New York hero assets.

Strange Cam
Landis and the ILM team also designed and built what ILM dubbed the “strange cam,” a custom 3D-printed 360 GoPro rig that had to withstand the rigors of being slung off the edge of skyscrapers. What ILM wanted was to capture 360 degrees of rolling footage from that vantage point to be used as moving background “plates” that could be reflected in New York City’s glass buildings.

VFX, Sound Design and Hong Kong
One of the big challenges with the Hong Kong sequence was that time was reversing and moving forward at the same time. “What we had to do was ensure the viewer understands that time is reversing throughout that entire sequence,” says Bluff. During the tight hand-to-hand action moments that are moving forward in time, there’s not really much screen space to show you time reversing in the background. So they designed the reversing destruction sequence to work in concert with the sound design. “We realized we had to move away from a continuous shower of debris toward rhythmic beats of debris being sucked out of frame.”


Bluff says the VFX shot count on the film — 1,450 shots — was actually a lot less than Captain America: Civil War. From a VFX point of view, The Avengers movies lean on the assets generated in Iron Man and Captain America. The Thor movies help provide the context for what an Avengers movie would look and feel like. In Doctor Strange, “almost everything in the movie had to be designed (from scratch) because it hadn’t already existed in a previous Marvel film. It’s a brand-new character to the Marvel world.”

Bluff started development on the movie in October of 2014 and really started doing hands-on work in February of 2016, frequently traveling between Vancouver, San Francisco and London. A typical day, working out of the ILM London office, would see him get in early and immediately deal with review requests from San Francisco. Then he would jump into “dailies” in London and work with them until the afternoon. After “nightlies” with London, there was a “dailies” session with San Francisco and Vancouver; he would work with them until evening, hit the hotel, grab some dinner, come back around 11:30pm or midnight and do nightlies with San Francisco. “It just kept the team together, and we never missed a beat.”

2D vs. IMAX 3D vs. Dolby Cinema
Bluff saw the entire movie for the first time in IMAX 3D, and is looking forward to seeing it in 2D. Considering sequences in the movie are surreal in nature and Escher-like, there’s an argument that suggests that IMAX 3D is a better way to see it because it enhances the already bizarre version of that world. However, he believes the 2D and 3D versions are really “two different experiences.”

Dolby Cinema is the merging of Dolby Atmos — 128-channel surround sound — with the high dynamic range of Dolby Vision, plus really comfortable seats. It is, arguably, the best way to see a movie. Bluff says as far as VFX goes, high dynamic range information has been there for years. “I’m just thankful that exhibition technology is finally catching up with what’s always been there for us on the visual effects side.”

During that Reddit interview, Derrickson commented, “The EDR (Extended Dynamic Range) print is unbelievable — if you’re lucky enough to live where an EDR print is playing. As for 3D and/or IMAX, see it that way if you like that format. If you don’t, see it 2D.”

Doctor Strange is probably currently playing in a theater near you, but go see it in Dolby Cinema if you can.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

New Wacom Cintiq Pro line offers portability, updated pen, more

Wacom has introduced a new line of Wacom Cintiq Pro creative pen displays: the Cintiq Pro 13 and Cintiq Pro 16. The new Cintiq Pros feature a thin, portable form factor, making them suitable for working on the road or remotely.

Cintiq Pro’s new Pro Pen 2, according to Wacom, offers four times greater accuracy and pressure sensitivity than the previous Pro Pen. The improved Pro Pen 2 creates an intuitive experience with virtually lag-free tracking on a glass surface that produces the right amount of friction, and is coated to reduce reflection.

Additionally, the new optical bonding process reduces parallax, providing a pen-on-screen performance that feels natural and has the feedback of a traditional pen or brush. Both Cintiq Pro models also feature multi-touch for easy and fast navigation, as well as the ability to pinch, zoom and rotate illustrations, photos or models within supporting 2D or 3D creative software apps.

Both high-resolution Cintiq Pro models come with an optimized edge-to-edge etched glass workspace. The Cintiq Pro also builds on its predecessor, the Cintiq 13HD touch, offering the ExpressKey Remote as an optional accessory so users can customize their most commonly used shortcuts and modifiers when working with their most-used software applications. In addition, ergonomic features, such as ErgoFlex, fully integrated pop out legs and an optional three-position desk stand (available in February), let users focus on their work instead of constantly adjusting for comfort.

The Wacom Cintiq Pro 13 and 16 are compatible with both Macs and PCs and feature full HD (1920×1080) and UHD (3840×2160) resolution, respectively. Both Cintiq Pro configurations deliver vivid colors, the 13-inch model providing 87 percent Adobe RGB and the 16-inch, 94 percent.

Priced at $999.95 USD, the Cintiq Pro 13 is expected to be available online and at select retail locations at the beginning of December. The Cintiq Pro 16, $1499.95 USD, is expected in February.

GenPop’s Bill Yukich directs, edits gritty open for Amazon’s Goliath 

Director/editor Bill Yukich helmed the film noir-ish opening title sequence for Amazon’s new legal drama, Goliath. Produced by LA-based content creation studio GenPop, the black-and-white intro starts with Goliath lead actor Billy Bob Thornton jumping into the ocean. Underwater, smoking a cigarette and holding a briefcase, he casually strolls through rooms filled with smoke and fire. At the end of the open, he rises from the water as the Santa Monica Pier appears next to him and the picture turns from B&W to color. The Silent Comedy’s “Bartholomew” plays throughout.

The ominous backdrop, of a man underwater but not drowning, is a perfect visual description of Thornton’s role as disgraced lawyer Billy McBride. Yukich’s visuals, he says, are meant to strike a balance between dreamlike and menacing.

The approved concept called for a dry shoot, so Yukich came up with solutions to make it seem as though the sequence was actually filmed underwater. Shooting on a Red Magnesium Weapon camera, Yukich used a variety of in-camera techniques to achieve the illusion of water, smoke and fire existing within the same world, including the ingenious use of smoke to mimic the movement of crashing waves.

After wrapping the live-action shoot with Thornton, Yukich edited and color corrected the sequence. The VFX work was mostly supplementary, used to enhance the practical effects captured on set, such as adding extra fireballs into the frame to make the pyrotechnics feel fuller. Editing was done in Adobe Premiere, while VFX and color were done in Autodesk Flame. In the end, 80 percent was live action and only 20 percent visual effects.

Once post production was done, Yukich projected the sequence onto a screen which was submerged underwater and reshot the projected footage. Though technically challenging, Yukich says, this Inception-style method of re-shooting the footage gave the film the organic quality that he was looking for.

Yukich recently worked as lead editor for Beyoncé’s visual album Lemonade. Stepping behind the lens was a natural progression for Yukich, who began directing concerts for bands like Godsmack and The Hollywood Undead, as well as music videos for HIM, Vision of Disorder and The Foo Fighters.

Marvel’s Victoria Alonso to receive VES Visionary Award

The VES (Visual Effects Society) has named Victoria Alonso, producer and Marvel Studios EVP of production, as the next recipient of its Visionary Award in recognition of her contributions to visual arts and filmed entertainment. The award will be presented to Alonso at the 15th Annual VES Awards on February 7 at the Beverly Hilton.

The VES Visionary Award, voted on by the VES board of directors, “recognizes an individual who has uniquely and consistently employed the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.” VES will honor Alonso for her dedication to the industry and advancement of storytelling through visual effects.

Alonso is currently executive producing James Gunn’s Guardians of the Galaxy Vol. 2 and Taika Waititi’s Thor: Ragnarok. In her executive role, she oversees post and visual effects for Marvel’s slate. She executive produced Scott Derrickson’s Doctor Strange, Joe and Anthony Russo’s Captain America: Civil War, Peyton Reed’s Ant-Man, Joss Whedon’s Avengers: Age of Ultron, James Gunn’s Guardians of the Galaxy, Joe and Anthony Russo’s Captain America: The Winter Soldier, Alan Taylor’s Thor: The Dark World and Shane Black’s Iron Man 3, as well as Marvel’s The Avengers for Joss Whedon. She co-produced Iron Man and Iron Man 2 with director Jon Favreau, Kenneth Branagh’s Thor and Joe Johnston’s Captain America: The First Avenger.

Alonso’s career began as a commercial VFX producer. From there, she VFX-produced numerous feature films, working with such directors as Ridley Scott (Kingdom of Heaven), Tim Burton (Big Fish) and Andrew Adamson (Shrek), to name a few.

Over the years, Alonso’s dedication to the industry has been admired and her achievements recognized. Alonso was the keynote speaker at the 2014 Visual Effects Society Summit, where she exemplified her role as an advocate for women in the visual effects industry. In 2015, she was an honoree of the New York Women in Film & Television’s Muse Award for Outstanding Vision and Achievement.  This past January she was presented with the Advanced Imaging Society’s Harold Lloyd Award and was recently named to Variety’s 2016 Power of Women L.A. Impact Report, which spotlights creatives and executives who’ve ‘rocked’ the industry in the past year.

Alonso is in good company. Previous winners of the VES Visionary Award have been Christopher Nolan, Ang Lee, Alfonso Cuarón, J.J. Abrams and Syd Mead.

Talking with new Shade VFX NY executive producer John Parenteau

By Randi Altman

John Parenteau, who has a long history working in visual effects, has been named executive producer of Shade VFX’s New York studio. Shade VFX, which opened in Los Angeles in 2009, provides feature and television visual effects, as well as design, stereoscopic, VR and previs services. In 2014, the company opened its New York office to take advantage of the state’s fairly aggressive tax incentives and all that the city has to offer.

“As a native New Yorker, with over a decade of working as an artist there, the decision to open an office back home was an easy one,” explains owner Bryan Godwin. “With John coming on board as our New York executive producer, I feel our team is complete and poised to grow — continuing to provide feature-film-level visuals. John’s deep experience running large facilities, working with top-tier tent-pole clients and access to even more potential talent convinced me that he is the right choice to helm the production efforts out east.”

Shade’s New York office is already flush with work, including Rock that Body for Sony, The OA and The Get Down for Netflix, Mosaic for HBO and Civil for TNT. Not long ago, the shop finished work on Daredevil and Jessica Jones, two of Marvel’s Netflix collaborations. As John helps grow the client list in NYC, he will be supporting NY visual effects supervisor Karl Coyner, and working directly with Shade’s LA-based EP/VP of production Lisa Maher.

John has a long history in visual effects, starting at Amblin Entertainment in the early ‘90s all the way through to his recent work with supercomputer company Silverdraft, which provides solutions for VFX, VR and more. I’ve known him for many years. In fact, I first started spelling John Parenteau’s name wrong when he was co-owner and VFX supervisor at Digital Muse back in the mid to late ‘90s — kidding, I totally know how to spell it… now.

We kept in touch over the years. His passion and love for filmmaking and visual effects has always been at the forefront of our conversations, along with his interest in writing. John even wrote some NAB blogs for me when he was managing director of Pixomondo (they won the VFX Oscar for Hugo during that time) and I was editor-in-chief of Post Magazine. We worked together again when he was managing director of Silverdraft.

“I’ve always been the kind of guy who likes a challenge, and who likes to push into new areas of entertainment,” says John. “But leaving visual effects was less an issue of needing a change and more of a chance to stretch my experience into new fields. After Pixomondo, Silverdraft was a great opportunity to delve into the technology behind VFX and to help develop some unique computer systems for visual effects artists.”

Making the decision to leave the industry a couple years ago to take care of his mother was difficult, but John knew it was the right thing to do. “While moving to Oregon led me away from Hollywood, I never really left the industry; it gets under your skin, and I think it’s impossible to truly get out, even if you wanted to.”

Parenteau realized quickly that the Portland scene wasn’t a hotbed of film and television VFX, so he took the opportunity to apply his experience in entertainment to a new market, founding marketing boutique Bigfoot Robot. “I discovered a strong need for marketing for small- to mid-sized companies, including shooting and editing content for commercials and marketing videos. But I did keep my hand in media and entertainment thanks to one of my first clients, the industry website postPerspective. Randi and I had known each other for so many years, and our new relationship helped her out technically while allowing me to stay in touch with the industry.”

John’s mom passed over a year ago, and while he was enjoying his work at Bigfoot Robot, he realized how much he missed working in visual effects. “Shade VFX had always been a company I was aware of, and one that I knew did great work,” he says. “In returning to the industry, I was trying to avoid landing in too safe of a spot and doing something I’d already done before. That’s when Bryan Godwin and Dave Van Dyke (owner and president of Shade, respectively) contacted me about their New York office. I saw a great opportunity to help build an already successful company into something even more powerful. Bryan, Lisa and Dave have become known for producing solid work in both feature and television, and they were looking for a missing component in New York to help them grow. I felt like I could fill that role and work with a company that was fun and exciting. There’s also something romantic about living in Manhattan, I have to admit.”

And it’s not just about building Shade for John. “I’m the kind of guy who likes to become part of a community. I hope I can contribute in various ways to the success of visual effects for not only Shade but for the New York visual effects community as a whole.”

While I’ll personally miss working with John on a day-to-day basis, I’m happy for him and for Shade. They are getting a very talented artist, who also happens to be a really nice guy.

The A-List — ‘Independence Day: Resurgence’ director Roland Emmerich

The director talks about this VFX-heavy sequel and how it takes advantage of today’s technology to tell its story. 

By Iain Blair

After two decades of rumors and speculation, “The Master of Disaster” — German director/writer/producer Roland Emmerich — is finally back with Independence Day: Resurgence. This is the long-awaited sequel to his seminal 1996 alien invasion epic Independence Day, one of the most financially successful movies in the history of Hollywood — it ended up making over $817 million worldwide and turning Will Smith into a superstar.

Following that smash, Emmerich went on to make other apocalyptic mega-productions, including Godzilla (the 1998 version), The Day After Tomorrow, 10,000 BC and 2012, all of which were huge box office hits despite little love from the critics. And while Emmerich has also made smaller movies, such as Anonymous, The Patriot and Stonewall, which didn’t involve aliens, the destruction of cities, rising sea levels or vast armies of VFX artists, his latest blockbuster will only further cement his legacy as an ambitious filmmaker who doesn’t just love to blow shit up but who has always seen the big picture. The Fox release opens June 24.

INDEPENDENCE DAY: RESURGENCE

I recently spoke with Emmerich about making the film, which features many visual effects shots, and the post process.

It’s been two decades since Independence Day became a global blockbuster. Why did it take so long to do a sequel?
I made the first one as a stand-alone film, and for 10 years I felt that way. Plus, ideas that were pitched for a sequel didn’t work for me. Then, about six, seven years ago, I was shooting for the first time on digital cameras for the film 2012. We did all of the 1,500 VFX shots in the computer, and it suddenly hit me that the technology had changed so much that maybe it was time to try a sequel.

On the first one I was just so frustrated as I couldn’t do everything I wanted and had imagined, because of all the limitations with VFX and technology back then. I had these scissors in my head — this I can do, that I cannot do — but this time I had no scissors and no limitations, and that was a huge difference for me.

How much pressure was there to top the last film?
I honestly didn’t feel much pressure, although I’m very aware that times have changed. I see all the other big VFX films out there and I keep up on it all and I know how competitive it is now. But I felt pretty good about what we could do with this one. And I feel I’ve always been able to create these “impossible images” where people go, “Oh my God! What is that?” Like water coming over the Himalayas. This time it was this enormous 3,000-mile-long alien spaceship that comes down to Earth, like this giant spider. That was the first image I had in my head for the film.

It’s a very image-driven business I’m in, and while you obviously work hard on characters and themes and so on, most of the time it’s these images that pop into my head that inspire everything else. And this giant spaceship wasn’t something we could do back in ’96. It was just impossible.

How different was the approach on this and what sort of film did you set out to make?
I tried very hard to avoid making a classic sequel. And it’d been so long anyway. It’s a different society now, one in which humanity has stayed united and fights together. The other big idea was that we’ve harvested all the alien technology. We can’t rebuild it, but we’ve harvested it, and humans are so ingenious, so we can take it and adapt it for human use and machines. So all these themes and ideas were very interesting to me.

How early on did you start integrating post and all the VFX?
Even when I’m writing I’m already thinking about all the VFX and post, and the moment the script is there it’s well under way. I like to make 25-30 big paintings of key scenes that really show you where the movie’s going — the style, the size of the film. They’re so helpful for showing everyone from production executives at the studio to the visual effects teams. It gives a very clear visual idea of what I want. Then you break it down into sequences and start storyboarding and so on.

INDEPENDENCE DAY: RESURGENCE

You must have done a lot of previz for this one?
Yes, but we had very little time because of the release date, and it was very complicated. I had started shooting already and still had to do previz since we weren’t able to previz the whole film before. We needed to previz everything, so I had double duties: at lunch and after shooting I always had to meet with my previz team. When I look back, the film was like a long race against time.

Post and VFX have evolved so much since the first film. What have been the biggest changes for you?
The biggest for me is the whole digital revolution. Digital cameras can now make far better blue- and greenscreen composites, and we shot with Red Weapon Dragons. That’s huge for me as I used to hate the old look of composites and all the limits you had, whereas now, if you can imagine it, you can do it. The computer gives you infinite possibilities in VFX. On the first one I would have these images in my head and then find out we couldn’t do them. Anything is possible today.

Where did you post?
We rented offices in North Hollywood, and we had our editing suite there… the 3D people, and the VFX team. For sound, I always work with sound designer Paul Ottosson, who has his whole set-up at Sony on the lot. So we did all the mixing there, including a Dolby Atmos mix.

INDEPENDENCE DAY: RESURGENCE
This was edited by Adam Wolfe, who cut Stonewall and White House Down for you. Tell us about that relationship and how it worked.
He’s a very active editor and he’ll run onto the set saying, “I need this or that.” He’s not on the set all the time, but he’s close by when I shoot, and we’ll work together on the weekends so I can get a feel for the film and what we’ve shot so far.

This is obviously a VFX-driven piece, and the VFX play a big role. Can you talk about that and working with the visual effects supervisor?
I really enjoy working with VFX — from the concepts to cutting the shots in — and working with a relatively small team of maybe 15 people on them every day, talking on Skype or in person, ideally. I feel that you can also cast VFX companies like actors — for their special talents. Some excel at this, some excel at that. If you’re doing a creature film, then Weta is great. If it’s a very complicated sequence with a lot of water and buildings collapsing and fires, then Scanline is great.

I always try and inspire them to do VFX they’ve never done before, so it’s not boring for them. In the end, we used 10 big companies and another five smaller ones, including Weta, Cinesite, Scanline, Image Engine, Trixter, MPC, Digital Domain and Buf.

What was the hardest VFX sequence to pull off?
The hardest was the big sequence where the mothership starts sucking up Singapore — the whole city and all the ships — before throwing it on London. That was very complicated to do, and Scanline did an amazing job. The whole scene at the end with the alien battle was also very hard to pull off. That took months and months to do, and the companies started doing tests and simulations at a very early stage. They also sent some of their people to the set to advise us on how best to shoot the live action to go with their VFX.

What’s next?
Another huge film, I hope. I love them. It’s my job, my business.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Quick Chat: VFX Legion’s James Hattin on his visual effects collective

By Randi Altman

While VFX Legion does have a brick-and-mortar location in Burbank, California, their team of 50 visual effects artists is spread around the world. Started in 2012 by co-founder and VFX supervisor James Hattin and six others who were weary of the old VFX house model — including large overhead and long hours away from family — the virtual studio was set up to allow artists to work where they live, instead of having to move to where the work is.

VFX Legion has provided visual effects for television shows like Scandal and How to Get Away With Murder, as well as feature films such as Insidious: Chapter 3, Jem and the Holograms, and Sinister 2. We recently reached out to Hattin to find out more about his collective and how they make sure their remote collaboration workflow is buttoned up.

Sinister 2

Sinister 2

Can you talk about the work/services you provide?
VFX Legion is a full-service visual effects facility that provides on-set supervision, tracking, matchmove, animation, 3D, dynamics and compositing. We favor the compositing side of the work because we have so many skilled compositors on the team. However, we have talent all over the world for dynamics, lighting and animation as well.

You co-founded VFX Legion as a collective?
Legion was started by myself and six equal partners. We are mostly artists and production people. This has been the key to our early success — the partners alone could deliver a significant amount of work. Early on, Legion was designed to be a co-op, wherein everyone who worked for the company would have a vested interest in getting projects done profitably. However, in researching how that could be done on a legal and business level, we found that we were going to have to change the industry one step at a time. A fully remote workflow was enough to get VFX Legion off the ground. We will have to wait for that change to take hold industry-wide before we move into hundreds of “owners.”

You have an official office, but you have artists working all over the world. Why did you guys opt to do that as opposed to expanding in Burbank?
The brick-and-mortar office is for the management and supervision. We have an expandable team that handles everything from I/O to producing and supervising the artists around the world. We could expand this facility to house artists, but the goal of the company was to find the best artists around the world — not to open offices all over the world. We want people to be able to work wherever they want to live. We don’t mandate that they come in to the office and work 9 to 5. Artists get to work on their own schedule in their own offices and personal spaces. It’s the new way of giving talent their lives back. VFX can be insanely demanding on the people who work in the industry.

What are the benefits?
The benefits are that artists take control over their lives. They can work all night if they are night owls. They can walk the dog or go out to eat with their families and not be chained to a desk in one of the most expensive cities in the world — which is where all VFX hubs are based. It takes a certain kind of artist, with a certain level of experience, to manage themselves in this atmosphere. Those who do it well can live pretty well by working full time for Legion on projects.

Are there any negatives?
If the artist isn’t the kind of person that can start and finish something, if they can’t manage their time very well, or don’t communicate well, this can be very challenging. We’ve had a few artists bow out over the last few years because they simply weren’t cut out for the type of work that we do. Self management is very important to this pipeline, and if someone isn’t up to it, it can be frustrating.

What kind of software do you use for your VFX work?
We use Nuke and Maya, along with Redshift and VRay for rendering. We also call on After Effects, Mocha, Zoom, Aspera and Shotgun.

With people spread around the world, how do you communicate and review and approve projects? Can you walk us through a typical workflow, starting with how early you get involved on a project?
On many projects, we start at the very beginning. We are there for production meetings and help drive the visual effects workflow so that it is easier to deal with in post. Once we are done on set, we work with the editorial staff to manage shot turnovers and ingesting plates into our system. Once we have plates in our system, we assign the work out to the artists who are a good fit for the work that needs to be done.

Jem and the Holograms

Jem and the Holograms

We let them know what the budget is for the shot and they can accept or refuse the work. Once the artist is kicked off, they will start sending shots through Shotgun for review by a supervisor in-house in Burbank. We generally look at the Shotgun media first to see if the basics are in place. If that looks good, we download the uploaded QuickTime from Shotgun. When that is approved, we pull the synced DPX frames and evaluate them through a QC process to make sure that they meet the quality standards we have as a company.

There are a lot of moving parts, and that is why we have a team of trained coordinators, project managers and producers here in Burbank, to make sure that we facilitate all the work and track all the progress.
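Because each stage of the review chain described above — Shotgun media check, QuickTime download, DPX pull, in-house QC — happens strictly in order, a shot's progress can be modeled as a simple state machine. The sketch below is purely illustrative; the stage names and the `Shot` class are hypothetical and are not VFX Legion's actual tooling or the Shotgun API:

```python
# Hypothetical model of a sequential shot-review pipeline like the one
# described in the interview. All names here are illustrative.

REVIEW_STAGES = [
    "submitted",        # artist uploads media for review
    "supervisor_ok",    # in-house supervisor approves the Shotgun media
    "quicktime_ok",     # downloaded QuickTime passes review
    "qc_passed",        # synced DPX frames pass the in-house QC bar
    "final_approved",
]

class Shot:
    """Tracks one shot's position in the (hypothetical) review pipeline."""

    def __init__(self, name):
        self.name = name
        self.stage = "submitted"

    def advance(self):
        """Move to the next stage; a shot cannot skip steps."""
        i = REVIEW_STAGES.index(self.stage)
        if i + 1 < len(REVIEW_STAGES):
            self.stage = REVIEW_STAGES[i + 1]
        return self.stage

    def reject(self):
        """A failure at any stage sends the shot back to the artist."""
        self.stage = "submitted"
        return self.stage

shot = Shot("sc010_0040")
shot.advance()   # -> supervisor_ok
shot.advance()   # -> quicktime_ok
shot.reject()    # QC or review failure: back to submitted
```

The point of the structure is the one Hattin makes: with many shots in flight across many remote artists, what the Burbank team is really managing is each shot's position in this chain, which is why coordinators and producers track progress centrally.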

Can you talk about some recent projects?
We have been working on Scandal and How to Get Away With Murder for ABC Television. There are a number of challenges working on shows like this. The schedule can be very tight and we are tasked with updating many older elements from previous vendors and previous seasons.

This can also be a lot of fun because we get a chance to make sure that the effects look as good as possible, while we slowly update each of the assets to be a little more ‘Legion-like.’ These can be small secondary animations that weren’t there originally, or a seasonal change to a set extension. It is all very exciting and fast-paced.

——–

For more on VFX Legion, check out James Hattin’s LinkedIn blog here.