Tag Archives: visual effects

Infinite Fiction

Republic Editorial launches design/VFX studio

Republic Editorial in Dallas has launched a design- and VFX-focused sister studio, Infinite Fiction, and leading the charge as executive producer is visual effects industry veteran Joey Cade. In her new role, she will focus on developing Infinite Fiction’s sales and marketing strategy, growing its client roster and expanding the creative team and its capabilities. More on her background in a bit.

Infinite Fiction, which is being managed by Republic partners Carrie Callaway, Chris Gipson and Keith James, focuses on high-end, narrative-driven motion design and visual effects work for all platforms, including virtual reality. Although it shares management with Republic Editorial, Infinite Fiction is a stand-alone creative shop and will service agencies, outside post houses and entertainment studios.

Infinite Fiction is housed separately, but located next door to Republic Editorial’s uptown Dallas headquarters. It adds nearly 2,000 square feet of creative space to Republic’s recently renovated 8,000 square feet and is already home to a team of motion designers, visual effects artists, CG generalists and producers.

Cade began her career in live-action production working with Hungry Man, Miramax and NBC. She gained expertise in visual effects and animation at Reel FX, which grew from a 30-person boutique to an over 300-person studio with several divisions during her tenure. As its first entertainment division executive producer, Cade won business with Sony TV, Universal, A&E Networks and ABC Family as well as produced Reel FX’s first theatrical VFX project for Disney. She broadened her skill set by launching and managing a web-based business and gained branding, marketing and advertising experience within small independent agencies, including Tractorbeam.

Infinite Fiction already has projects in its pipeline, including design-driven content pieces for TM Advertising, Dieste and Tracy Locke.

ILM’s Richard Bluff talks VFX for Marvel’s Doctor Strange

By Daniel Restuccio

Comic book fans have been waiting for over 30 years for Marvel’s Doctor Strange to come to the big screen, and dare I say it was worth the wait. This is in large part because of the technology now available to create the film’s stunning visual effects.

Fans have the option to see the film in traditional 2D, Dolby Cinema (worthy of an interstate or plane fare pilgrimage, in my opinion) and IMAX 3D. Doctor Strange, Marvel Studios’ 15th film offering, is also receiving good critical reviews and VFX Oscar buzz — it’s currently on the list of 20 films still in the running in the Visual Effects category for the 89th Academy Awards.

The unapologetically dazzling VFX shots, in many cases directly inspired by the original comic visuals of Steve Ditko, were created by multiple visual effects houses, including Industrial Light & Magic, Luma Pictures, Lola VFX, Method Studios, Rise FX, Crafty Apes, Framestore, Perception and previs house The Third Floor. Check out our interview with the film’s VFX supervisor Stephane Ceretti.

Director Scott Derrickson said in a recent Reddit chat that Doctor Strange is “a fantastical superhero movie.”

“Watching the final cut of the film was deeply satisfying,” commented Derrickson. “A filmmaker cannot depend upon critical reviews or box office for satisfaction — even if they are good. The only true reward for any artist is to pick a worthy target and hit it. When you know you’ve hit your target that is everything. On this one, I hit my target.”

Since we got an overview of how the visual effects workflow went from Ceretti, we decided to talk to one of the studios that provided VFX for the film, specifically ILM and their VFX supervisor Richard Bluff.

Richard Bluff

According to Bluff, early in pre-production Marvel presented concept art, reference images and previsualization on “what were the boundaries of what the visuals could be.” After that, he says, they had the freedom to search within those bounds.

During VFX presentations with Marvel, ILM frequently showed three versions of the work. “They went with the craziest version to the point where the next time we would show three more versions, and we continued to up the ante on the crazy,” recalls Bluff.

As master coordinator of this effort for ILM, Bluff encouraged his artists, “to own the visuals and try to work out how the company could raise the quality of the work or the designs on the show to another level. How could we introduce something new that remains within the fabric of the movie?”

As a result, says Bluff, they had some amazing ideas flow from individuals on the film. Jason Parks came up with the idea of traveling through the center of a subway train as it fractured. Matt Cowey invented the notion of continually rotating the camera to heighten the sense of vertigo. Andrew Graham designed the kaleidoscope-fighting arena “largely because his personal hobby is building and designing real kaleidoscopes.”

Unique to Doctor Strange is that the big VFX sequences are all very “self-contained.” For example, ILM did the New York and Hong Kong sequences, Luma did the Dark Dimension and Method did the multiverse. ILM also designed and developed the original concept for the Eldritch magic and provided all the shared “digital doubles” — CG-rigged and animatable versions of the actors — that tied sequences together. The digital doubles were customized to the needs of each VFX house.

Previs
In some movies previs material is generated and thrown away. Not so with Doctor Strange. This time ILM developed a previs workflow in which artists could actually hang assets on the previs and keep developing it, so it became part of the shot from the earliest iteration.

There was extensive previs done for Marvel by The Third Floor as a creative and technical guide across the movie, and further iterations internal to ILM done by ILM’s lead visualization artist, Landis Fields.

Warning! Spoiler! Once Doctor Strange moves the New York fight scene into the mirror universe, the city starts coming apart in an M.C. Escher-meets-Chris Nolan-Inception kind of way. To make that sequence, ILM created a massive tool kit of New York set pieces and geometry, including subway cars, buildings, vehicles and fire escapes.

In the previs, Fields started breaking apart, duplicating and animating those objects, like the fire escapes, to tell the story of what a kaleidoscoping city would look like. The artists then fleshed out a sequence of shots, a.k.a. “mini beats.” They absorbed the previs into the pipeline by later switching out the gross geometry elements in Fields’ previs with the actual New York hero assets.

Strange Cam
Fields and the ILM team also designed and built what ILM dubbed the “strange cam,” a custom 3D-printed 360 GoPro rig that had to withstand the rigors of being slung off the edge of skyscrapers. ILM wanted to capture 360 degrees of rolling footage from that vantage point to be used as moving background “plates” that could be reflected in the New York City glass buildings.

VFX, Sound Design and the Hong Kong Sequence
One of the big challenges with the Hong Kong sequence was that time was reversing and moving forward at the same time. “What we had to do was ensure the viewer understands that time is reversing throughout that entire sequence,” explains Bluff. During the tight hand-to-hand action moments that are moving forward in time, there’s not really much screen space to show you time reversing in the background. So they designed the reversing destruction sequence to work in concert with the sound design. “We realized we had to move away from a continuous shower of debris toward rhythmic beats of debris being sucked out of frame.”

Bluff says the VFX shot count on the film — 1,450 shots — was actually a lot lower than on Captain America: Civil War. From a VFX point of view, The Avengers movies lean on the assets generated in Iron Man and Captain America. The Thor movies help provide the context for what an Avengers movie would look and feel like. In Doctor Strange, “almost everything in the movie had to be designed (from scratch) because it hadn’t already existed in a previous Marvel film. It’s a brand-new character to the Marvel world.”

Bluff started development on the movie in October of 2014 and really started doing hands-on work in February of 2016, frequently traveling between Vancouver, San Francisco and London. A typical day, working out of the ILM London office, would see him get in early and immediately deal with review requests from San Francisco. Then he would jump into “dailies” in London and work with that team until the afternoon. After “nightlies” with London there was a “dailies” session with San Francisco and Vancouver; he’d work with them until evening, hit the hotel, grab some dinner, come back around 11:30pm or midnight and do nightlies with San Francisco. “It just kept the team together, and we never missed a beat.”

2D vs. IMAX 3D vs. Dolby Cinema
Bluff saw the entire movie for the first time in IMAX 3D, and is looking forward to seeing it in 2D. Considering that sequences in the movie are surreal and Escher-like, there’s an argument that IMAX 3D is the better way to see it because it enhances the already bizarre version of that world. However, he believes the 2D and 3D versions are really “two different experiences.”

Dolby Cinema is the merging of Dolby Atmos — 128-channel surround sound — with the high dynamic range of Dolby Vision, plus really comfortable seats. It is, arguably, the best way to see a movie. Bluff says as far as VFX goes, high dynamic range information has been there for years. “I’m just thankful that exhibition technology is finally catching up with what’s always been there for us on the visual effects side.”

During that Reddit interview, Derrickson commented, “The EDR (Extended Dynamic Range) print is unbelievable — if you’re lucky enough to live where an EDR print is playing. As for 3D and/or IMAX, see it that way if you like that format. If you don’t, see it 2D.”

Doctor Strange is probably playing in a theater near you, so go see it in Dolby Cinema if you can.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

New Wacom Cintiq Pro line offers portability, updated pen, more

Wacom has introduced a new line of Wacom Cintiq Pro creative pen displays: the Cintiq Pro 13 and Cintiq Pro 16. Both models feature a thin, portable form factor, making them suitable for working on the road or remotely.

Cintiq Pro’s new Pro Pen 2, according to Wacom, offers four times greater accuracy and pressure sensitivity than the previous Pro Pen. The improved Pro Pen 2 creates an intuitive experience with virtually lag-free tracking on a glass surface that produces the right amount of friction, and is coated to reduce reflection.

Additionally, the new optical bonding process reduces parallax, providing a pen-on-screen performance that feels natural and has the feedback of a traditional pen or brush. Both Cintiq Pro models also feature multi-touch for easy and fast navigation, as well as the ability to pinch, zoom and rotate illustrations, photos or models within supporting 2D or 3D creative software apps.

Both high-resolution Cintiq Pro models come with an optimized edge-to-edge etched glass workspace. The Cintiq Pro also builds on its predecessor, the Cintiq 13HD touch, offering the ExpressKey Remote as an optional accessory so users can customize their most commonly used shortcuts and modifiers when working with their most-used software applications. In addition, ergonomic features, such as ErgoFlex, fully integrated pop out legs and an optional three-position desk stand (available in February), let users focus on their work instead of constantly adjusting for comfort.

The Wacom Cintiq Pro 13 and 16 are compatible with both Macs and PCs and feature full HD (1920×1080) and UHD (3840×2160) resolution, respectively. Both Cintiq Pro configurations deliver vivid colors, the 13-inch model providing 87 percent Adobe RGB and the 16-inch, 94 percent.

Priced at $999.95 USD, the Cintiq Pro 13 is expected to be available online and at select retail locations at the beginning of December. The Cintiq Pro 16, $1499.95 USD, is expected in February.

GenPop’s Bill Yukich directs, edits gritty open for Amazon’s Goliath 

Director/editor Bill Yukich helmed the film noir-ish opening title sequence for Amazon’s new legal drama, Goliath. Produced by LA-based content creation studio GenPop, the black-and-white intro starts with Goliath lead actor Billy Bob Thornton jumping into the ocean. While underwater, smoking a cigarette and holding a briefcase, he casually strolls through rooms filled with smoke and fire. At the end of the open, he rises from the water as the Santa Monica Pier appears next to him and the picture turns from B&W to color. The Silent Comedy’s “Bartholomew” plays throughout.

The ominous backdrop of a man underwater but not drowning is a perfect visual description of Thornton’s role as disgraced lawyer Billy McBride. Yukich’s visuals, he says, are meant to strike a balance between dreamlike and menacing.

The approved concept called for a dry shoot, so Yukich came up with solutions to make it seem as though the sequence was actually filmed underwater. Shooting on a Red Magnesium Weapon camera, he used a variety of in-camera techniques to achieve the illusion of water, smoke and fire existing within the same world, including the ingenious use of smoke to mimic the movement of crashing waves.

After wrapping the live-action shoot with Thornton, Yukich edited and color corrected the sequence. The VFX work was mostly supplementary, used to enhance the practical effects captured on set, such as adding extra fireballs into the frame to make the pyrotechnics feel fuller. Editing was done in Adobe Premiere, while VFX and color were handled in Autodesk Flame. In the end, the piece was 80 percent live action and only 20 percent visual effects.

Once post production was done, Yukich projected the sequence onto a screen which was submerged underwater and reshot the projected footage. Though technically challenging, Yukich says, this Inception-style method of re-shooting the footage gave the film the organic quality that he was looking for.

Yukich recently worked as lead editor for Beyoncé’s visual album Lemonade. Stepping behind the lens was a natural progression for Yukich, who began directing concerts for bands like Godsmack and The Hollywood Undead, as well as music videos for HIM, Vision of Disorder and The Foo Fighters.

Marvel’s Victoria Alonso to receive VES Visionary Award

The VES (Visual Effects Society) has named Victoria Alonso, producer and Marvel Studios EVP of production, as the next recipient of its Visionary Award in recognition of her contributions to visual arts and filmed entertainment. The award will be presented to Alonso at the 15th Annual VES Awards on February 7 at the Beverly Hilton.

The VES Visionary Award, voted on by the VES board of directors, “recognizes an individual who has uniquely and consistently employed the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.” VES will honor Alonso for her dedication to the industry and advancement of storytelling through visual effects.

Alonso is currently executive producing James Gunn’s Guardians of the Galaxy Vol. 2 and Taika Waititi’s Thor: Ragnarok. In her executive role, she oversees post and visual effects for Marvel’s slate. She executive produced Scott Derrickson’s Doctor Strange, Joe and Anthony Russo’s Captain America: Civil War, Peyton Reed’s Ant-Man, Joss Whedon’s Avengers: Age of Ultron, James Gunn’s Guardians of the Galaxy, Joe and Anthony Russo’s Captain America: The Winter Soldier, Alan Taylor’s Thor: The Dark World and Shane Black’s Iron Man 3, as well as Marvel’s The Avengers for Joss Whedon. She co-produced Iron Man and Iron Man 2 with director Jon Favreau, Kenneth Branagh’s Thor and Joe Johnston’s Captain America: The First Avenger.

Alonso’s career began as a commercial VFX producer. From there, she VFX-produced numerous feature films, working with such directors as Ridley Scott (Kingdom of Heaven), Tim Burton (Big Fish) and Andrew Adamson (Shrek), to name a few.

Over the years, Alonso’s dedication to the industry has been admired and her achievements recognized. Alonso was the keynote speaker at the 2014 Visual Effects Society Summit, where she exemplified her role as an advocate for women in the visual effects industry. In 2015, she was an honoree of the New York Women in Film & Television’s Muse Award for Outstanding Vision and Achievement.  This past January she was presented with the Advanced Imaging Society’s Harold Lloyd Award and was recently named to Variety’s 2016 Power of Women L.A. Impact Report, which spotlights creatives and executives who’ve ‘rocked’ the industry in the past year.

Alonso is in good company. Previous winners of the VES Visionary Award have been Christopher Nolan, Ang Lee, Alfonso Cuarón, J.J. Abrams and Syd Mead.

Talking with new Shade VFX NY executive producer John Parenteau

By Randi Altman

John Parenteau, who has a long history working in visual effects, has been named executive producer of Shade VFX’s New York studio. Shade VFX, which opened in Los Angeles in 2009, provides feature and television visual effects, as well as design, stereoscopic, VR and previs services. In 2014, they opened their New York office to take advantage of the state’s fairly aggressive tax incentives and all that they bring to the city.

“As a native New Yorker, with over a decade of working as an artist there, the decision to open an office back home was an easy one,” explains owner Bryan Godwin. “With John coming on board as our New York executive producer, I feel our team is complete and poised to grow — continuing to provide feature-film-level visuals. John’s deep experience running large facilities, working with top tier tent-pole clients and access to even more potential talent convinced me that he is the right choice to helm the production efforts out east.”

Shade’s New York office is already flush with work, including Rock that Body for Sony, The OA and The Get Down for Netflix, Mosaic for HBO and Civil for TNT. Not long ago, the shop finished work on Daredevil and Jessica Jones, two of Marvel’s Netflix collaborations. As John helps grow the client list in NYC, he will be supporting NY visual effects supervisor Karl Coyner, and working directly with Shade’s LA-based EP/VP of production Lisa Maher.

John has a long history in visual effects, starting at Amblin Entertainment in the early ‘90s all the way through to his recent work with supercomputer company Silverdraft, which provides solutions for VFX, VR and more. I’ve known him for many years. In fact, I first started spelling John Parenteau’s name wrong when he was co-owner and VFX supervisor at Digital Muse back in the mid to late ‘90s — kidding, I totally know how to spell it… now.

We kept in touch over the years. His passion and love for filmmaking and visual effects has always been at the forefront of our conversations, along with his interest in writing. John even wrote some NAB blogs for me when he was managing director of Pixomondo (they won the VFX Oscar for Hugo during that time) and I was editor-in-chief of Post Magazine. We worked together again when he was managing director of Silverdraft.

“I’ve always been the kind of guy who likes a challenge, and who likes to push into new areas of entertainment,” says John. “But leaving visual effects was less an issue of needing a change and more of a chance to stretch my experience into new fields. After Pixomondo, Silverdraft was a great opportunity to delve into the technology behind VFX and to help develop some unique computer systems for visual effects artists.”

Making the decision to leave the industry a couple years ago to take care of his mother was difficult, but John knew it was the right thing to do. “While moving to Oregon led me away from Hollywood, I never really left the industry; it gets under your skin, and I think it’s impossible to truly get out, even if you wanted to.”

Parenteau realized quickly that the Portland scene wasn’t a hot-bed of film and television VFX, so he took the opportunity to apply his experience in entertainment to a new market, founding marketing boutique Bigfoot Robot. “I discovered a strong need for marketing for small- to mid-sized companies, including shooting and editing content for commercials and marketing videos. But I did keep my hand in media and entertainment thanks to one of my first clients, the industry website postPerspective. Randi and I had known each other for so many years, and our new relationship helped her out technically while allowing me to stay in touch with the industry.”

John’s mom passed over a year ago, and while he was enjoying his work at Bigfoot Robot, he realized how much he missed working in visual effects. “Shade VFX had always been a company I was aware of, and one that I knew did great work,” he says. “In returning to the industry, I was trying to avoid landing in too safe of a spot and doing something I’d already done before. That’s when Bryan Godwin and Dave Van Dyke (owner and president of Shade, respectively) contacted me about their New York office. I saw a great opportunity to help build an already successful company into something even more powerful. Bryan, Lisa and Dave have become known for producing solid work in both feature and television, and they were looking for a missing component in New York to help them grow. I felt like I could fill that role and work with a company that was fun and exciting. There’s also something romantic about living in Manhattan, I have to admit.”

And it’s not just about building Shade for John. “I’m the kind of guy who likes to become part of a community. I hope I can contribute in various ways to the success of visual effects for not only Shade but for the New York visual effects community as a whole.”

While I’ll personally miss working with John on a day-to-day basis, I’m happy for him and for Shade. They are getting a very talented artist, who also happens to be a really nice guy.

The A-List — ‘Independence Day: Resurgence’ director Roland Emmerich

The director talks about this VFX-heavy sequel and how it takes advantage of today’s technology to tell its story. 

By Iain Blair

After two decades of rumors and speculation, “The Master of Disaster” — German director/writer/producer Roland Emmerich — is finally back with Independence Day: Resurgence. This is the long-awaited sequel to his seminal 1996 alien invasion epic Independence Day, one of the most financially successful movies in the history of Hollywood — it ended up making over $817 million worldwide and turning Will Smith into a superstar.

Following that smash, Emmerich went on to make other apocalyptic mega-productions, including Godzilla (the 1998 version), The Day After Tomorrow, 10,000 BC and 2012, all of which were huge box office hits despite little love from the critics. And while Emmerich has also made smaller movies, such as Anonymous, The Patriot and Stonewall, which didn’t involve aliens, the destruction of cities, rising sea levels or vast armies of VFX artists, his latest blockbuster will only further cement his legacy as an ambitious filmmaker who doesn’t just love to blow shit up but who has always seen the big picture. The Fox release opens June 24.

I recently spoke with Emmerich about making the film, which features many visual effects shots, and the post process.

It’s been two decades since Independence Day became a global blockbuster. Why did it take so long to do a sequel?
I made the first one as a stand-alone film, and for 10 years I felt that way. Plus, ideas that were pitched for a sequel didn’t work for me. Then, about six, seven years ago, I was shooting for the first time on digital cameras for the film 2012. We did all of the 1,500 VFX shots in the computer, and it suddenly hit me that the technology had changed so much that maybe it was time to try a sequel.

On the first one I was just so frustrated as I couldn’t do everything I wanted and had imagined, because of all the limitations with VFX and technology back then. I had these scissors in my head — this I can do, that I cannot do — but this time I had no scissors and no limitations, and that was a huge difference for me.

How much pressure was there to top the last film?
I honestly didn’t feel much pressure, although I’m very aware that times have changed. I see all the other big VFX films out there and I keep up on it all and I know how competitive it is now. But I felt pretty good about what we could do with this one. And I feel I’ve always been able to create these “impossible images” where people go, “Oh my God! What is that?” Like water coming over the Himalayas. This time it was this enormous 3,000-mile long alien spaceship that comes down to Earth, like this giant spider. That was the first image I had in my head for the film.

It’s a very image-driven business I’m in, and while you obviously work hard on characters and themes and so on, most of the time it’s these images that pop into my head that inspire everything else. And this giant spaceship wasn’t something we could do back in ’96. It was just impossible.

How different was the approach on this and what sort of film did you set out to make?
I tried very hard to avoid making a classic sequel. And it’d been so long anyway. It’s a different society, and we can stay united and fight together. The other big idea was that we’ve harvested all the alien technology. We can’t rebuild it, but we’ve harvested it, and humans are so ingenious, so we can take it and adapt it for human use and machines. So all these themes and ideas were very interesting to me.

How early on did you start integrating post and all the VFX?
Even when I’m writing I’m already thinking about all the VFX and post, and the moment the script is there it’s well under way. I like to make 25-30 big paintings of key scenes that really show you where the movie’s going — the style, the size of the film. They’re so helpful for showing everyone from production executives at the studio to the visual effects teams. It gives a very clear visual idea of what I want. Then you break it down into sequences and start storyboarding and so on.

You must have done a lot of previz for this one?
Yes, but we had very little time because of the release date, and it was very complicated. I had started shooting already and still had to do previz since we weren’t able to previz the whole film before. We needed to previz everything, so I had double duties: at lunch and after shooting I always had to meet with my previz team. When I look back, the film was like a long race against time.

Post and VFX have evolved so much since the first film. What have been the biggest changes for you?
The biggest for me is the whole digital revolution. Digital cameras can now make far better blue- and greenscreen composites, and we shot with Red Weapon Dragons. That’s huge for me as I used to hate the old look of composites and all the limits you had, whereas now, if you can imagine it, you can do it. The computer gives you infinite possibilities in VFX. On the first one I would have these images in my head and then find out we couldn’t do them. Anything is possible today.

Where did you post?
We rented offices in North Hollywood, and we had our editing suite there… the 3D people, and the VFX team. For sound, I always work with sound designer Paul Ottosson, who has his whole set-up at Sony on the lot. So we did all the mixing there, including a Dolby Atmos mix.

This was edited by Adam Wolfe, who cut Stonewall and White House Down for you. Tell us about that relationship and how it worked.
He’s a very active editor and he’ll run on to the set saying, “I need this or that.” He’s not on the set all the time, but he’s close by when I shoot, and we’ll work together on the weekends so I can get a feel for the film and what we’ve shot so far.

This is obviously a VFX-driven piece, and the VFX play a big role. Can you talk about that and working with the visual effects supervisor?
I really enjoy working with VFX — from the concepts to cutting the shots in — and working with a relatively small team of maybe 15 people on them every day, talking on Skype or in person, ideally. I feel that you can also cast VFX companies like actors — for their special talents. Some excel at this, some excel at that. If you’re doing a creature film, then Weta is great. If it’s a very complicated sequence with a lot of water and buildings collapsing and fires, then Scanline is great.

I always try and inspire them to do VFX they’ve never done before, so it’s not boring for them. In the end, we used 10 big companies and another five smaller ones, including Weta, Cinesite, Scanline, Image Engine, Trixter, MPC, Digital Domain and Buf.

What was the hardest VFX sequence to pull off?
The hardest was the big sequence where the mothership starts sucking up Singapore — the whole city and all the ships — before throwing it on London. That was very complicated to do, and Scanline did an amazing job. The whole scene at the end with the alien battle was also very hard to pull off. That took months and months to do, and the companies started doing tests and simulations at a very early stage. They also sent some of their people to the set to advise us on how best to shoot the live action to go with their VFX.

What’s next?
Another huge film, I hope. I love them. It’s my job, my business.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Quick Chat: VFX Legion’s James Hattin on his visual effects collective

By Randi Altman

While VFX Legion does have a brick-and-mortar location in Burbank, California, their team of 50 visual effects artists is spread around the world. Started in 2012 by co-founder and VFX supervisor James Hattin and six others who were weary of the old VFX house model — including large overhead and long hours away from family — the virtual studio was set up to allow artists to work where they live, instead of having to move to where the work is.

VFX Legion has provided visual effects for television shows like Scandal and How to Get Away With Murder, as well as feature films such as Insidious: Chapter 3, Jem and the Holograms, and Sinister 2. We recently reached out to Hattin to find out more about his collective and how they make sure their remote collaboration workflow is buttoned up.

Sinister 2

Can you talk about the work/services you provide?
VFX Legion is a full-service visual effects facility that provides on-set supervision, tracking, matchmove, animation, 3D, dynamics and compositing. We favor the compositing side of the work because we have so many skilled compositors on the team. However, we have talent all over the world for dynamics, lighting and animation as well.

You co-founded VFX Legion as a collective?
Legion was started by myself and six equal partners. We are mostly artists and production people. This has been the key to our early success — the partners alone could deliver a significant amount of work. Early on, Legion was designed to be a co-op, wherein everyone who worked for the company would have a vested interest in getting projects done profitably. However, in researching how that could be done on a legal and business level, we found that we were going to have to change the industry one step at a time. A fully remote workflow was enough to get VFX Legion off the ground. We will have to wait for that change to take hold industry-wide before we move into hundreds of “owners.”

You have an official office, but you have artists working all over the world. Why did you guys opt to do that as opposed to expanding in Burbank?
The brick-and-mortar office is for management and supervision. We have an expandable team that handles everything from I/O to producing and supervising the artists around the world. We could expand this facility to house artists, but the goal of the company was to find the best artists around the world — not to open offices all over the world. We want people to be able to work wherever they want to live. We don’t mandate that they come into the office and work 9 to 5. Artists get to work on their own schedule in their own offices and personal spaces. It’s the new way of giving talent their lives back. VFX can be insanely demanding on the people who work in the industry.

What are the benefits?
The benefits are that artists take control over their lives. They can work all night if they are night owls. They can walk the dog or go out to eat with their families and not be chained to a desk in one of the most expensive cities in the world — which is where all VFX hubs are based. It takes a certain kind of artist, with a certain level of experience, to manage themselves in this atmosphere. Those who do it well can live pretty well by working full time for Legion on projects.

Are there any negatives?
If the artist isn’t the kind of person that can start and finish something, if they can’t manage their time very well, or don’t communicate well, this can be very challenging. We’ve had a few artists bow out over the last few years because they simply weren’t cut out for the type of work that we do. Self management is very important to this pipeline, and if someone isn’t up to it, it can be frustrating.

What kind of software do you use for your VFX work?
We use Nuke and Maya, along with Redshift and VRay for rendering. We also call on After Effects, Mocha, Zoom, Aspera and Shotgun.

With people spread around the world, how do you communicate and review and approve projects? Can you walk us through a typical workflow, starting with how early you get involved on a project?
On many projects, we start at the very beginning. We are there for production meetings and help drive the visual effects workflow so that it is easier to deal with in post. Once we are done on set, we work with the editorial staff to manage shot turnovers and ingesting plates into our system. Once we have plates in our system, we assign the work out to the artists who are a good fit for the work that needs to be done.

Jem and the Holograms

We let them know what the budget is for the shot and they can accept or refuse the work. Once the artist is kicked off, they will start sending shots through Shotgun for review by a supervisor in-house in Burbank. We generally look at the Shotgun media first to see if the basics are in place. If that looks good, we download the uploaded QuickTime from Shotgun. When that is approved, we pull the synced DPX frames and evaluate them through a QC process to make sure that they meet the quality standards we have as a company.

There are a lot of moving parts, and that is why we have a team of trained coordinators, project managers and producers here in Burbank, to make sure that we facilitate all the work and track all the progress.
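
To make that hand-off a bit more concrete, here is a minimal sketch of what pulling artist submissions out of Shotgun for supervisor review might look like, using the standard shotgun_api3 Python client. The site URL, script credentials, status code and download path are hypothetical placeholders, not details of Legion’s actual pipeline.

```python
# Minimal sketch of a Shotgun review pull, assuming the shotgun_api3 client.
# The site URL, script credentials, status code and paths are hypothetical.
import os
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://example.shotgunstudio.com",  # hypothetical site
    script_name="review_pull",            # hypothetical script user
    api_key="REPLACE_ME",
)

# Find Versions that artists have submitted for supervisor review.
versions = sg.find(
    "Version",
    filters=[["sg_status_list", "is", "rev"]],  # assumed "pending review" code
    fields=["code", "user", "sg_uploaded_movie"],
)

review_dir = "/jobs/incoming_reviews"  # hypothetical local review folder
os.makedirs(review_dir, exist_ok=True)

for v in versions:
    movie = v.get("sg_uploaded_movie")
    if not movie:
        continue
    # Download the artist's uploaded QuickTime for a closer look in-house.
    dest = os.path.join(review_dir, "%s.mov" % v["code"])
    sg.download_attachment(movie, file_path=dest)
    print("Pulled %s for review" % v["code"])
```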

Can you talk about some recent projects?
We have been working on Scandal and How to Get Away With Murder for ABC Television. There are a number of challenges working on shows like this. The schedule can be very tight and we are tasked with updating many older elements from previous vendors and previous seasons.

This can also be a lot of fun because we get a chance to make sure that the effects look as good as possible, while we slowly update each of the assets to be a little more “Legion-like.” That can mean little secondary animations that weren’t there originally, or a change of season in a set extension. It is all very exciting and fast-paced.

——–

For more on VFX Legion, check out James Hattin’s LinkedIn blog here.

MPC goes into the storm for Disney’s ‘The Finest Hours’

Disney’s The Finest Hours is based on the true story of the greatest small boat rescue in Coast Guard history. As you can imagine, the film, which stars Casey Affleck and Chris Pine, is chock full of visual effects shots — 900 of which were supplied by London’s MPC. Over a 10-month period, MPC VFX supervisor Seth Maury and producer Felix Crawshaw worked closely with the film’s director Craig Gillespie and production VFX supervisor Kevin Hahn.

MPC’s work primarily consisted of recreating an immersive nor’easter storm, including ocean swells, turbulent seas, blowing rain and snow, and a sequence where a 36-foot Coast Guard boat must cross a digital version of Chatham bar, with 30- to 50-foot waves rolling, swelling and crashing around them.

Shooting began in the fall of 2014 in a 120-by-80-foot, 12-foot-deep water tank built for the shoot at a warehouse in Fore River Shipyard in Quincy, Massachusetts. The filmmakers built a large gimbal set of the Coast Guard CG36500 rescue boat, a full-size mock-up of the Pendleton hull and a replica of the Pendleton engine room that could be flooded with six feet of water. A number of scenes were also filmed at various locations on Cape Cod, and four period rescue boats were restored for production.

The visual effects work started early on in production with the team creating some full CG shots in order to understand what would be required to create a storm of the magnitude required to tell the story.

MPC’s team completed around 300 large-scale water simulation shots (using Flowline), 300 weather and environment shots and 300 shots of the ocean behind actors shot on bluescreen. Digital doubles of the crew members, as well as digital versions of the Pendleton T2 tanker and CG36500 Coast Guard boat were also built by MPC’s team.

For the large-scale full CG water simulation shots, MPC built a library of FX elements and CG renders that were needed to create the ocean, consisting of the base ocean, a foam pass, a mist pass, a fine spray pass, a bubble pass, a water texture pass and refraction and reflection passes for water surfaces. A constantly blowing CG mist pass that the filmmakers named Speed Mist was added to the real footage shot on location in Cape Cod and on set in Quincy to accentuate the storm. For the shots that were panoramic enough to show interaction between the practical or CG boat and the water surface, there was also a suite of elements rendered to connect the boat and water, such as splashes, sprays, bow wakes, tail wakes, an engine wash of bubbles, turbulent water, foam and spray.
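
The article doesn’t say which compositing package MPC used to assemble these elements, but as an illustration, here is a minimal sketch of how such a pass stack might be layered with Nuke’s Python API. The pass names, merge operations and file paths are hypothetical.

```python
# Illustrative only: a minimal Nuke Python sketch of stacking ocean render
# passes into one comp. Pass names, operations and paths are hypothetical.
import nuke

PASSES = [
    ("base_ocean", "over"),
    ("foam",       "plus"),
    ("bubbles",    "plus"),
    ("mist",       "screen"),
    ("fine_spray", "screen"),
]

def read_pass(name):
    """Create a Read node for one rendered element of the ocean."""
    return nuke.nodes.Read(
        file="/show/finest_hours/renders/%s/%s.%%04d.exr" % (name, name),
        first=1001,
        last=1100,
    )

# Start with the base ocean, then layer each element on top of the running comp.
stack = read_pass(PASSES[0][0])
for name, op in PASSES[1:]:
    stack = nuke.nodes.Merge2(inputs=[stack, read_pass(name)], operation=op)

nuke.nodes.Write(inputs=[stack], file="/show/finest_hours/comp/ocean_comp.%04d.exr")
```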

Simulations were created in Maya, Flowline and Houdini (including Houdini Engine). Houdini was used for streaming water and environmental effects such as rain, snow and blowing mist — it was chosen for its strength in handling many shots procedurally. Most of the shots were cached using the Houdini Engine plug-in within Maya. MPC built custom interfaces to simplify the workflow, manage assets and enable artists to handle multiple layers and shots together. The team called on V-Ray and PRMan (via Katana) for rendering.
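
As a rough illustration of that “many shots, one procedural setup” idea, here is a minimal batch-caching sketch using Houdini’s Python (hou) module, the kind of thing that would be run with hython. The template .hip file, node paths, parameter names and shot list are hypothetical stand-ins, not MPC’s actual tools.

```python
# Sketch of caching one procedural FX template across many shots with hython.
# The template file, node paths, parameters and shot list are hypothetical.
import hou

SHOTS = [
    {"name": "cb_0140", "start": 1001, "end": 1096},
    {"name": "cb_0210", "start": 1001, "end": 1152},
]

TEMPLATE_HIP = "/show/finest_hours/fx/templates/blowing_mist.hip"

for shot in SHOTS:
    # Reload the same procedural setup for every shot.
    hou.hipFile.load(TEMPLATE_HIP)

    # Point the template at this shot's camera (a spare parm assumed to exist
    # on the hypothetical template node).
    mist = hou.node("/obj/blowing_mist")
    mist.parm("camera_path").set("/obj/%s_cam" % shot["name"])

    # Write the geometry cache that downstream Maya/Houdini Engine scenes pick up.
    rop = hou.node("/obj/blowing_mist/OUT_cache")
    rop.parm("sopoutput").set(
        "/show/finest_hours/cache/%s/mist.$F4.bgeo.sc" % shot["name"]
    )
    rop.render(frame_range=(shot["start"], shot["end"]))
```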

The most challenging work involved a sequence where the CG36500 Coast Guard boat crosses the Chatham Bar in 30- to 50-foot waves. The waves were in various states of swelling, crashing, dumping and spilling. MPC developed systems for creating each of these waves, and additional systems for adding layers of waves into the same scene. Many shots in this sequence were crafted as single-solution waves, such as looking down the barrel of a crashing wave, or being pushed along and backwards by a wave that had already crashed. Maya was used for wave rigs, layout, animation, geometry preparation and some of the FX work. MPC built a complex (and quite flexible) wave rig used by the animation department to prototype and design waves. Once size, speed and shape were locked in animation/layout, the FX department brought the caches into Flowline for dynamic simulations, sprays and so on.

ILM welcomes Oscar-winning VFX supervisor Eric Barba

ILM (Industrial Light & Magic) has brought Academy Award-winning visual effects supervisor Eric Barba to its Vancouver-based studio as creative director. In addition to supervising effects work, Barba will also provide creative oversight across all of the studio’s projects. He will work closely with ILM Vancouver’s executive in charge, Randal Shore, and collaborate with ILM’s global talent base.

For the past two years, Barba was chief creative officer of Digital Domain. A visual effects supervisor since 1999, he supervised the visual effects on David Fincher’s Zodiac, The Girl With the Dragon Tattoo, Gone Girl and The Curious Case of Benjamin Button, for which he was honored with an Oscar and a BAFTA Film Award for Outstanding Visual Effects.

Barba often collaborates with Joseph Kosinski, having supervised work on his films Tron: Legacy and Oblivion. Most recently Barba has been consulting on a number of feature projects.

Outside of his feature work, Barba has supervised effects work on dozens of commercials for brands such as Nike, Heineken and Microsoft Xbox/Epic Games. He has directed ad campaigns for American Express, Cingular, Honda, Jaguar and Nike. He has received eight AICP Awards, and three gold and two bronze Clio Awards for his spot work.

Barba began his career as a digital artist working on sci-fi programs at Steven Spielberg’s Amblin Imaging. He is a graduate of Art Center College of Design and a member of The Academy of Motion Picture Arts & Sciences.

ILM Vancouver is currently in production on Warcraft for Duncan Jones, Luc Besson’s Valerian and David Green’s Teenage Mutant Ninja Turtles 2.

EP Blythe Klippsten returns to Zoic for series work

Blythe Klippsten has joined Culver City visual effects house Zoic as executive producer, working on television series. This almost-15-year VFX vet has worked with other busy LA studios in the past, including Psyop, MassMarket, Stardust Studios and Ntropic.

Actually Klippsten isn’t new to Zoic, having worked there near the start of her career — she was a VFX producer from 2006 to 2008. Klippsten, who is now working with EP Gina Fiore to continue building Zoic’s list of television clients, has an extensive background in both commercial and television visual effects.

Prior to joining Zoic in 2006, she coordinated VFX for CSI: Miami and CSI: New York. She gained experience working on a number of TV series, including HBO’s True Blood and CBS’ Jericho, for which she was nominated for an Emmy.

Klippsten spent the next eight years leading teams at a variety of VFX shops, overseeing the production on major campaigns for Pepsi, BMW, EA, Nintendo, Sony, Sprint, Starbucks and Honda, as well as a Super Bowl spot for Cars.com.

“Television has always been a first love of mine and I’m excited to return to Zoic where I really got my start in visual effects,” says Klippsten. “It’s a dynamic time in television, with a much wider distribution landscape and a more diverse range of creative content.”

Lucy Killick upped to managing director of Framestore Montréal

Framestore has promoted Lucy Killick from executive producer of the VFX house’s film team in London to managing director of its Montréal studio, which has over 300 employees. An established VFX producer with experience on both the facility and production sides of the business, Killick has credits that include Chris Columbus’ Harry Potter and the Chamber of Secrets, Alfonso Cuarón’s Children of Men and Guillermo del Toro’s Hellboy II: The Golden Army.

Earlier this year, Sir William Sargent, CEO of Framestore, talked about significantly expanding the workforce in Montréal. With solid foundations now built in the city, Framestore is looking to nurture, as well as recruit, talent in Montréal.

According to Matt Fox, joint MD of Framestore’s film division, “Lucy has in-depth knowledge of the film VFX industry, combining a prolific career as our client with a history in VFX production at Framestore. This makes her an ideal person to take on this senior management role, ensuring that Framestore’s ethos of a highly creative, technically cutting-edge, and production-focused approach underpins all that we do as we continue to grow and develop the superb talents of our Montréal team.”

“I am excited to be a part of the next stage of development in Montréal,” says Killick. “With successful projects like Edge of Tomorrow and Paddington already amongst the list of credits for Studio Framestore, I am keen to continue working with the filmmakers and studios to deliver VFX of the highest quality.”

Projects in Montreal at the moment include Beauty and the Beast, Knights of the Roundtable: King Arthur, Jungle Book: Origins and Fantastic Beasts and Where to Find Them.

VFX vet Andrea D’Amico joins FuseFX, working on ‘Agents of S.H.I.E.L.D.’

Veteran VFX producer Andrea D’Amico has joined Burbank-based FuseFX, bringing with her 25 years of experience, including tenures at Eden FX, Digital Domain, Riot and CIS Hollywood. Most recently, she served as VFX producer at MPC Montreal, working on Fox’s Victor Frankenstein.

D’Amico joins the FuseFX visual effects team on Marvel’s Agents of S.H.I.E.L.D.; FuseFX has received two Emmy nominations and three VES Award nominations for its work on the show.

D’Amico’s many television credits include Warner Bros.’ Person of Interest and Ghost Whisperer and HBO’s Angels in America. Features include The Girl With the Dragon Tattoo, The Social Network and Superman Returns. She was nominated for an Emmy Award in 2008 for her work on the History Channel’s TV movie Life After People.

D’Amico has also held executive posts at Ascent Media.  She began her career in New York at Charlex and EFX Unlimited.

SolidAnim to demo virtual production tech with Muto camera system

SolidAnim, which makes virtual production products, will showcase the latest version of its SolidTrack solution, combined with XD Motion’s Muto, at SATIS 2015.

The combination of SolidTrack with XD Motion’s aerial filming systems offers broadcasters and filmmakers a solution for using and integrating live virtual effects. The combined systems can be used on feature films, broadcast shows, sports, advertising and commercials. The SolidTrack system is compatible with any camera and supports high speeds.

SolidAnim will present a special demo at its SATIS booth, showcasing SolidTrack’s new version combined with Muto. SolidTrack is a realtime camera tracking solution for recording camera moves and logging data on the virtual set, while the Muto is a very light cable camera system. With its onboard engine and power supply, the Muto is designed to be rigged very quickly, either indoors or outdoors. Useful for the integration of post effects, the motion control version is able to reproduce multiple positions and movements. The XD Motion gyro-stabilized Mini Flight head is also motion-controlled.

SolidTrack’s new version includes a box that gathers all tracking and other data in a single wire, which makes the organization of data and global information significantly easier. In addition, a new feature to genlock SolidTrack tracking data and the film camera has been implemented. This two-way synchronization ensures there are no frame delays.

Winners of the 2015 HPA Awards

Last week at the Skirball Cultural Center, the Hollywood Post Alliance held its 10th Annual HPA Awards ceremony. The HPA Awards recognize individuals and companies for outstanding contributions made in the creation of feature films, television, commercials and entertainment content around the world.

In addition to awards for individual artistry, Leon D. Silverman, a founder and current president of the HPA, was the recipient of the prestigious HPA Lifetime Achievement Award.

postPerspective had its own Dan Restuccio at the event tweeting the winners live, but in case you missed it…

The winners of the 2015 HPA Awards are:

Outstanding Color Grading – Feature Film

Steve Scott

WINNER:
“Birdman”
Steven J. Scott // Technicolor

NOMINEES:
“Monsoon”
Charles Boileau // Post-Moderne

“Lady of Csejte”
Keith Roush // Roush Media

“The Boxtrolls”
John Daro // FotoKem

“Whiplash”
Natasha Leonnet // Modern VideoFilm

Outstanding Color Grading – Television

John Crowley

WINNER:
“Boardwalk Empire – Golden Days for Boys and Girls”
John Crowley // Technicolor PostWorks NY

NOMINEES:
“Game of Thrones – Hardhome”
Joe Finley // Chainsaw, Inc.

“Masters of Sex – A Parliament of Owls”
Matt Lear // Sony Pictures Television

“Olive Kitteridge – Incoming Tide”
Pankaj Bajpai // Encore

“Sense8 – What’s Going On?”
Tony Dustin // Technicolor

Outstanding Color Grading – Commercial

WINNER:
Lincoln – “Intro”
Tom Poole // Company 3

NOMINEES:
Caterpillar – “Lantern Festival”
Rob Sciarratta // Company 3

Dodge – “Wisdom”
Beau Leon // Company 3

Lexus – “Face Off”
Dave Hussey // Company 3

Toyota – “Harrier”
Siggy Ferstl // Company 3

Outstanding Editing – Feature Film

Tom Cross

WINNER:
“Whiplash”
Tom Cross, ACE

NOMINEES:
“American Sniper”
Joel Cox, ACE; Gary Roach, ACE

“Interstellar”
Lee Smith, ACE

“Selma”
Spencer Averick

“The Imitation Game”
William Goldenberg, ACE

Outstanding Editing – Television

WINNER:
“Foo Fighters: Sonic Highways – Nashville”
Kristin McCasey // Therapy Studios

NOMINEES:
“VICE on HBO – Cold War 2.0”
Rich Lowe

“Game of Thrones – Hardhome”
Tim Porter // Beyond the Wall Productions, Inc.

“House of Cards – Chapter 32”
Cindy Mollo, ACE // Netflix

“Foo Fighters: Sonic Highways – Austin”
Scott D. Hanson // Therapy Studios

Outstanding Editing – Commercial

WINNER:
GNP Seguros – “World Cup”
Doobie White // Therapy Studios

NOMINEES:
Fiat – “Alive”
Kristin McCasey // Therapy Studios

Adidas – “Takers”
Steve Gandolfi // Cut+Run

Google – “Young Together”
Miky Wolf // Big Sky Edit

Skullcandy – “Push Play”
Doobie White // Therapy Studios

Outstanding Sound – Feature Film

WINNER:
“American Sniper”
Alan Murray, Tom Ozanich, John Reitz, Gregg Rudloff // Warner Bros. Post Production Services

NOMINEES:
“Birdman”
Jon Taylor, Frank A. Montano, Martin Hernandez, Aaron Glascock // NBCUniversal StudioPost

“Interstellar”
Richard King, Gary Rizzo, Gregg Landaker, Mark Weingarten // Warner Bros. Post Production Services

“Unbroken”
Jon Taylor, Frank A. Montano, Becky Sullivan, Andrew DeCristofaro // NBCUniversal StudioPost

“Mad Max: Fury Road”
Mark Mangini, Scott Hecker // Formosa Group
Chris Jenkins, Gregg Rudloff // Warner Bros. Post Production Services

Outstanding Sound – Television

WINNER:
“Homeland – Redux”
Nello Torri, Alan Decker // NBCUniversal StudioPost
Craig Dellinger // Sony Sound Services

NOMINEES:
“Banshee – You Can’t Hide from the Dead”
Bradley North, Joseph DeAngelis, Ken Kobett, Tiffany Griffith, David Werntz // Technicolor

“Black Sails – XVIII”
Benjamin Cook, Stefan Hendrix, Jeffrey Pitts, Sue Cahill, Onnalee Blank, Mathew Waters // Starz

“Game of Thrones – Hardhome”
Tim Kimmel, Paula Fairfield, Bradley Katona, Paul Bercovitch, Onnalee Blank, Mathew Waters // Formosa Group

“Halt and Catch Fire – SETI”
Sue Cahill, Keith Rogers, Scott Weber, Jane Boegel, Mark Cleary, Kevin McCullough // NBCUniversal StudioPost

Outstanding Sound – Commercial

WINNER:
The Syria Campaign – “In Reverse”
Jon Clarke // Factory

NOMINEES:
Honda – “The Other Side”
Tom Joyce, Anthony Moore // Factory

Prada – “The Battlefield”
Miky Wolf // Big Sky Edit

Volvo – “The Swell”
Aaron Reynolds // Wave Studios

Medicontour – “Bi-Flex 1.8″
Phil Bolland // Factory

Outstanding Visual Effects – Feature Film

WINNER:
“The Hobbit: The Battle of the Five Armies”
Joe Letteri, Eric Saindon, David Clayton, R. Christopher White, Matt Aitken // Weta Digital

NOMINEES:
“Tomorrowland”
Craig Hammack, Eddie Pasquarello, Francois Lambert, Maia Kayser, Barry Williams // Industrial Light & Magic

“Birdman”
Ara Khanikian, Sebastien Moreau, Sebastien Francoeur, Patrick David, Laurent Spillemaecker // Rodeo FX

“Into the Woods”
Matt Johnson, Christian Irles, Daniel Tarmy, Nicolas Chevallier, Benoit Dubuc // MPC

“Jurassic World”
Tim Alexander, Glen McIntosh, Tony Plett, Kevin Martel, Martyn Culpitt // Industrial Light & Magic

Outstanding Visual Effects – Television

WINNER:
“Game of Thrones – The Dance of Dragons”
Joe Bauer, Steve Kullback, Derek Spears, Eric Carney, Jabbar Raisani // Fire and Blood Productions

NOMINEES:
“Marvel’s Agent Carter – Now Is Not The End”
Sheena Duggal, Richard Bluff, Jay Mehta, Chad Taylor, Cody Gramstad // Industrial Light & Magic

“Black Sails – XVIII”
Erik Henry // Starz
Ken Jones // Digital Domain
Nic Spier // Shade FX
Christina Spring, Bjorn Ahlstedt // Crazy Horse Effects

“Ripper Street – Whitechapel Terminus”
Ed Bruce, Nicolas Murphy, John O’Connell, Joseph Courtis, Ronan Gantly // Screen Scene

“The Flash – Grodd Lives”
Armen V. Kevorkian, Andranik Taranyan, Stefan Bredereck, Jason Shulman, Gevork Babityan // Encore VFX

Outstanding Visual Effects – Commercial

WINNER:
“Game Of War – Decisions”
Benjamin Walsh, Brian Burke, Ian Holland, Brandon Nelson // Method Studios

NOMINEES:
Shell – “Shapeshifter”
Russell Dodgson, Robert Herrington, Ahmed Gharraph, Rafael Camacho // Framestore UK

General Electric – “Invention Donkey”
Seth Gollub, Theo Jones, Russell Miller, Raul Ortego // Framestore NY

Game Of War – “Time”
Benjamin Walsh, Brian Burke, Ian Holland, Chris Perkowitz // Method Studios

Pepsi – “Halftime Touches Down”
Chris Eckhardt, Michael Ralla // Framestore

Not To Scale welcomes director/animator Lucinda Schreiber

New York-based director, animator and illustrator Lucinda Schreiber has signed with film and animation content studio Not To Scale for representation worldwide. Schreiber comes from the multi-national, multi-medium production company Photoplay.

She has worked with clients from all over the world, including the US, Europe, Australia and Asia. Schreiber has done work for Coca-Cola, Telstra, Kotex, Gotye, Saks 5th Avenue, NPR and the Yoko Ono Morning Peace event at MoMA. In any given spot, Schreiber can be found employing 2D digital animation, stop-motion, live action, or a hybrid.

Schreiber got her start when one of her early animated films, The Goat That Ate Time, generated widespread attention. Since then she has worked on films, music videos and animations, picking up awards along the way.

Not to Scale’s founder/EP/managing director, Dan O’Rourke, says this about Schreiber: “Lucinda is a versatile director who has an ability to let the craft skills and inventive transitions in her filmmaking add charm to any project that she is working on, or for any brand that she is working with.”

The A-List: Creating a VFX tightrope for ‘The Walk’

Visual effects supervisor Kevin Baillie talks about working with Robert Zemeckis on the director’s latest

By Iain Blair

Oscar-winner Bob Zemeckis has always been at the cutting edge of technology, and highly skilled at integrating that technology in the service of telling stories in such films as Forrest Gump, Back to the Future, Who Framed Roger Rabbit?, The Polar Express, Beowulf and Flight.

Now, in The Walk, he’s putting moviegoers in the shoes of Philippe Petit, the French aerialist who in 1974 stunned New Yorkers — and the world — with his high-wire walk between the iconic towers of the almost-completed World Trade Center.

L-R: Bob Zemeckis, Kevin Baillie, Joseph Gordon-Levitt, Steve Starkey and Jack Rapke on the last day of shooting.

“When I first heard this story, I thought, ‘My God, this is a movie that A: should be made under any circumstance, and B: should be absolutely presented in 3D,’” explains Zemeckis. “When you watch a wire walker, you always have to watch by looking up at him. You never get the perspective of what it’s like to be on the wire.”

But aided by DP Dariusz Wolski and VFX supervisor Kevin Baillie, Zemeckis has made an epic, big-screen spectacle that gives audiences that vertigo-inducing “you-are-on-the-wire” perspective and the chance to go where only one man has been or ever will be — 110 stories in the air, walking between the twin towers.

I spoke with Baillie — whose Atomic Fiction studio has locations in Oakland, LA and Montreal — about working with Zemeckis and creating the VFX.

Is it true you began working on this years ago?
Yes, Bob and I began discussing how to do it seven, eight years ago when I was still at ImageMovers Digital, the company Bob ran with Disney. Back then it was going to be completely motion capture, and I was shocked when TriStar later stepped in to make it after Bob and Disney went their separate ways, since it had been so long in development.

By then you’d co-founded Atomic Fiction, when the VFX industry was a bit rocky?
Right, we formed it out of the ashes of ImageMovers, and got to take some of the best talent with us. Back in 2010, there was a lot of VFX work going on, but companies weren’t making any money, and it was a tough time for the business.

Joseph Gordon-Levitt in The Walk.

So that’s when you pioneered cloud rendering?
Yes, we decided to use cloud computing for all our rendering instead of a traditional local render farm, and that’s been a critical part of being able to help filmmakers like Bob get their visions made on budget. So for this, our teams in Oakland and Montreal did about 9.1 million hours of processing in the cloud, which is over 1,000 years on a single processor. So the scale it allows a company of our size to go to is pretty epic.

We even built our own tool to do all that, the Conductor, and that’s given us a 50 percent cost savings versus doing local rendering, and artists are also about 20 percent more productive, since they’re getting results back quicker. Atomic has between 120 and 150 people total, yet we have on-demand access to a renderfarm as big as ILM’s. So we can have that one minute, and nothing the next.
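
As a rough sanity check on those figures (a sketch, not Atomic Fiction's Conductor code; the per-core-hour rates below are illustrative assumptions, not real pricing), the quoted core-hours convert to single-processor years like this:

```python
# Rough scale check for the cloud-rendering figures quoted above.
# The 9.1 million core-hours and 50 percent savings come from the interview;
# the per-core-hour rates are illustrative assumptions only.

CORE_HOURS = 9_100_000          # total processing reported for The Walk
HOURS_PER_YEAR = 24 * 365       # single-processor hours in one year

single_cpu_years = CORE_HOURS / HOURS_PER_YEAR
print(f"{single_cpu_years:,.0f} years on a single processor")  # ~1,039 years

# Hypothetical cost comparison illustrating the quoted 50 percent savings.
local_rate_per_core_hour = 0.10                      # assumed local cost
cloud_rate_per_core_hour = local_rate_per_core_hour * 0.5
print(f"local: ${CORE_HOURS * local_rate_per_core_hour:,.0f}")
print(f"cloud: ${CORE_HOURS * cloud_rate_per_core_hour:,.0f}")
```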

Face replacement: work-in-progress and final frames.

Which allows you to give filmmakers the budgets they need to hit, right?
Exactly, and it also makes Atomic healthier as a business, which lets us invest in more long-term things and in more talented artists. So when Bob came to us last year and said he wanted us to do all of The Walk, we decided to set up a new office in Montreal with 100 people. Even so, we had to bring in other companies to help handle the load — Rodeo Effects in Montreal, UPP in Prague and Legend 3D.

Tell us about working with Bob.
He’s a “story-first” guy, which I love, and which is very satisfying from a VFX standpoint, as it allows our visuals to be more effective. I don’t think there’s another director who knows how to use VFX and 3D as tools better than Bob. He has a clear vision, is very articulate and gives us an immense amount of freedom to work. This took just over a year to shoot, and post and VFX took about eight months.

Greenscreen plate and final composite.

I noticed that the film’s pacing also perfectly parallels the material.
That’s so true. Bob makes incredibly effective use of 3D, and part of that is having shots be long, which gives the audience a chance to sit back and explore the world instead of force-feeding them quick cuts all the time. A typical movie now has 2,000 to 2,500 shots, with 2,000 VFX shots in a heavy-effects film, but we had just 826 total shots — about a third of the usual number — and out of those, 672 were VFX shots. But that equates to over 2,000 VFX shots in a normal film, so there were a lot!

What was the hardest sequence to do?
Making New York feel like it was alive and bustling during the walk. As the sun comes up, it progresses into a darker, stormier look, so we couldn’t just build one matte painting of the city and reuse it. We had to create moving cars, people and constantly changing lighting. So we had to build out New York 1974 completely digitally — every AC unit, every gutter — in order to make it look real. That alone took four months, with a dedicated team, and that left us with less time to render it, but thanks to Conductor we rendered a ton of stuff in a very short time. And I think the film turned out — as is the case with a lot of Bob’s films — better than anyone could have imagined.

Industry insider Iain Blair has been interviewing the biggest directors and artists in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Quick Chat: CO3’s Stephen Nakamura on grading ‘The Martian’

Ridley Scott’s The Martian tells the story of an astronaut left behind on Mars. The director, who created that world, called on Company 3’s Stephen Nakamura for the color grade, which he completed in London to be closer to Scott and the production.

We checked in with Nakamura to find out more about his process on The Martian.

You and Ridley have collaborated in the past. We assume you have developed a shorthand of sorts?
There are definitely things I know he likes and doesn’t like, but each project is also a little bit different. Obviously, he is very interested in the visuals of every shot. The Martian was relatively straightforward. Something like Exodus: Gods and Kings was much more complex because of the kinds of things we were looking at, like the sea parting. On Prometheus, it was about helping to bring shape and definition to scenes that were really dark. Of course, he’s worked with [Dariusz Wolski, ASC], so a lot of the shaping has already happened between the two of them.

How early does he bring you on a film?
We speak very early on. I know before I see any images what kind of look he’s interested in.

Can you talk about the look of Mars? He referenced the terra cotta/orange look in our recent interview with him.
It was something that we all had a sense of conceptually, but it took a lot of work with Ridley, the visual effects supervisor Richard Stammers and me in the DI theater to get it to all look the way it does in the final film. Quite a few shots involved a lot of sky replacements and the addition of mountains in the background. Richard’s team created these additional elements with a combination of CGI and practical plates shot in Jordan and combined them with the first-unit photography of Matt Damon.

So then when I added the heavy color correction Ridley wanted for that kind of orange look he talks about, it would have an effect on every element in the shot. It’s impossible to know in advance exactly how that correction for the planet’s surface is going to look in context and in a theater until you actually see it. I could get some elements of some shots where we needed them using Power Windows [in the DaVinci Resolve] but sometimes that heavy correction was too much and the effects elements would have to be altered. Maybe the sky needed to be darkened or we needed more separation in the mountains. We might make a change to the foreground, and the background would “break,” or vice versa.

So we had quite a few sessions where Richard would sit with Ridley and me and we would figure that out shot by shot.

You work with Resolve. What is it about that system that helps your creative process?
I’ve worked in it as long as it’s been around. I like the way it’s laid out. I like the way I can work… the node-based corrections. I can get to the tools that most colorists use on a regular basis very quickly, and with very few keystrokes or buttons to push. That kind of time saving adds up to a really big deal when you’re coloring complicated movies.

I know there are other great color correctors out there too, but so far Resolve is just the most comfortable for me.

(from left) Matt Damon, Jessica Chastain, Sebastian Stan, Kate Mara, and Aksel Hennie portray the crewmembers of the fateful mission to Mars.

Was there one particular scene that was more challenging than others, or a scene that you are most proud of?
There are a number of shots set outside the ship Jessica Chastain’s character commands where we see the ship and some characters in the foreground and the surface of Mars further away and then blackness and stars in the far background.

Here again, we all have a strong conceptual sense of the look, but ultimately it’s something you can’t get to without seeing it in a theater and in context. How saturated should the color of Mars be? How sharp should the focus be on the planet’s surface, on the distant stars? It’s not simply a question of having it look “real.” Ridley’s the kind of filmmaker who wants it to feel right for the story. And so I might use Resolve’s aperture correction function to make the stars appear more vibrant, the way Ridley wants it, and that could “break” another part of the shot. And then it’s a question of whether I can use Power Windows to address that issue or if the VFX team needs to re-render and composite the element.

That kind of massaging of every shot takes a lot of time, but when it’s done you really see the results on the screen.

Can you talk about grading for the brighter Dolby Vision 3D?
It definitely gets rid of one of the major issues in 3D when you can effectively put a stereoscopic image onscreen at the traditional 2D spec of 14 foot-lamberts. Previously, doing a stereoscopic pass always involved putting a darker image on screen, and when you have that much less light to work with it affects the whole image. That’s particularly true with highlights that might have plenty of detail at 14 but will blow out when you’re working at 3.5.

Of course, we still did a pass for traditional 3D, since there are very few theaters currently able to show Dolby Vision 3D.

Does that involve a whole different pass or a trim pass, or is it just a LUT that translates everything from the 14 foot-lambert world to 3.5?
Company 3’s technology team is always building and updating LUTs that get us a lot of the way there. But there’s never a 100 percent “translation” from one set of display parameters to the other; image characteristics change. The relative brightness of that practical in the background to the character in the shadows may not feel the same at 14 as it does at 3.5.

So which pass would you do first?
The way I work when we’re doing multiple theatrical deliverables like this is to start with the most “constricted” version [the 3.5 fl 3D] and get that where we want it. Then we go and “open it up” for the wider space. It’s important to be consistent. Very often, it’s a question of building Power Windows around bright parts of the frame and bringing them down for the regular 3D version and then either taking them off or lessening the corrections for the brighter projection spec.
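
To put those projection specs in perspective, here is a minimal sketch of the arithmetic behind “opening up” a grade from the 3.5 foot-lambert stereo spec to the 14 foot-lambert spec. It illustrates the light ratio and the idea of lessening a highlight correction; it is not Company 3’s LUT pipeline, and the blend factor is an assumption.

```python
# Minimal sketch, not Company 3's pipeline: start from the "constricted"
# 3.5 fL stereo grade and lessen the highlight corrections for the brighter
# 14 fL pass, as described above. Numbers are illustrative assumptions.

DARK_PEAK_FL = 3.5     # traditional 3D projection brightness
BRIGHT_PEAK_FL = 14.0  # traditional 2D spec / Dolby Vision 3D target in the text

print(f"Available light ratio: {BRIGHT_PEAK_FL / DARK_PEAK_FL:.1f}x")  # 4.0x

def open_up_highlight_trim(dark_pass_gain: float, lessen: float = 0.5) -> float:
    """Given a Power Window gain used to pull highlights down for the
    3.5 fL pass (e.g. 0.7 = 30 percent darker), return a gentler trim
    for the 14 fL pass by lessening the correction. 'lessen' is an
    assumed blend factor, not a measured value."""
    return dark_pass_gain + (1.0 - dark_pass_gain) * lessen

print(open_up_highlight_trim(0.7))  # 0.85: half of the correction remains
```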


For more on The Martian, read our interview with director Ridley Scott.

Film producer Sam Mercer named head of studio at ILM

San Francisco-based Industrial Light & Magic (ILM), a division of Lucasfilm, has hired film producer Sam Mercer as head of ILM Studio. Reporting to Lucasfilm GM Lynwen Brennan, Mercer will oversee and coordinate the company’s operations across all four of ILM’s global studios — San Francisco, Singapore, Vancouver and London.

Mercer has produced many high-profile films over the years such as Brian De Palma’s Mission to Mars; seven of M. Night Shyamalan’s films, including Signs and The Sixth Sense for which he received both Academy Award and BAFTA Best Picture nominations; Sam Mendes’ Jarhead; and Rupert Sanders’ Snow White and the Huntsman. Mercer is also a producer on Steven Spielberg’s upcoming fantasy film The BFG, based on the novel by Roald Dahl.

Says Brennan, “[Sam has] vast experience as a producer, and together with his long history with visual effects, including eight films with ILM, he provides a valuable perspective within the VFX industry. As visual effects becomes increasingly integrated into filmmaking from pre-production through to post, Sam’s filmmaker point of view enables us to provide a unique level of creative collaboration and partnership with directors, producers and studios to bring their visions to the screen.”

Having previously been with The Walt Disney Studios as a production executive, Mercer supervised such films as Good Morning, Vietnam, Three Fugitives and Dead Poets Society. Within a few years, Mercer was promoted to VP of motion picture production for Hollywood Pictures, and in addition to Arachnophobia he was responsible for such releases as Quiz Show, The Joy Luck Club, Born Yesterday, Swing Kids and The Hand That Rocks the Cradle. Mercer left Hollywood Pictures to pursue independent producing on Frank Marshall’s second film, Congo.

Mercer started in the film business as a freelance location and unit production manager on such films as The Witches of Eastwick, Peggy Sue Got Married, Stripes, Swing Shift and The Escape Artist. He also served as the associate producer/unit manager for KCET-TV in Los Angeles where he received a Daytime Emmy for the live presentation of the San Francisco Opera’s production of La Gioconda.

Kathleen Kennedy, president of Lucasfilm, noted, “I have had the pleasure of knowing and working with Sam for many years and I have come to rely on his skill as a creative problem solver. Sam has that rare ability to head off potential issues before they become real problems, and he manages to do it while maintaining an even keel and level of professionalism that has earned him the respect and admiration of every crew he works with.”

Technicolor buys VFX house The Mill

Technicolor has purchased London-based The Mill, a large visual effects and content creation studio for the advertising industry, for €259 million ($292 million) on a debt-free basis.

Founded in 1990, The Mill has been providing high-end visual effects for both advertising agencies and brands, and has earned in excess of 1,000 industry awards. It has operations in the key markets of London, New York, Los Angeles and Chicago.

According to Technicolor, this acquisition accomplishes many objectives set out in Technicolor’s Drive 2020 strategic roadmap:
• It establishes visual effects and digital creation across all segments of high-end content, including cinema, TV and advertising.
• It reinforces Technicolor’s portfolio of brands, including MPC, Mr. X and Mikros Image, servicing a broad range of customers across 10 global locations.
• It brings talent and expertise around emerging technologies, such as virtual reality content, that will enable Technicolor to enhance its technology platform across the entire industry.
• It adds a significant financial contribution with a business that has grown revenues at a 16 percent CAGR since 2009 to reach €135 million in 2014 while delivering EBITDA margins of approximately 20 percent.
• It allows Production Services to better balance its portfolio through increased exposure to advertising and strengthens the financial profile of the Entertainment Services segment. With this acquisition, Production Services accounts for approximately 40 percent of Entertainment Services revenues.

Tim Sarnoff, president of production services and deputy CEO at Technicolor, wrote this about the acquisition in his blog:

“For a company like Technicolor, this phenomenon is a terrific opportunity to focus on our established strengths as a creative technology company and to invest further in the talent and resources that will help drive this evolving definition of artistic and persuasive storytelling.  Today’s acquisition of The Mill is aligned to the growing demand for premium content and creation of new consumer experiences.

Tim Sarnoff

“The Mill is a leading provider of VFX content creation for the advertising, gaming and music industries. The Mill’s leading position in the global advertising VFX/post-production market aligns to our desire for immediate scale. Their investment in developing and producing content in emerging spaces (they partnered with Google’s Advanced Technology and Projects group on a five-minute live-action virtual reality short film) also complements our company-wide efforts in these technologies. The Mill and their passionate talent are constantly pushing the frontiers of visual narrative, which means they will avidly leverage the technical chops of Technicolor to create new solutions for their clients.

“Equally, The Mill’s push into the content creation space, working closely with agencies to develop and realize their more technically complex ideas and stories, provides Technicolor visibility further upstream in the content creation value chain. 

“With this acquisition Technicolor will extend its current position and technology know-how in VFX to rapidly capture further market share within the advertising and branded experience sector.”

To read Sarnoff’s entire blog click here.

SFX and VFX veteran Scott Coulter joins MastersFX

LA- and Vancouver-based MastersFX (MFX) has added longtime VFX producer Scott Coulter as its new VP/executive producer of visual effects. He brings extensive experience in both practical and digital effects and is already overseeing day-to-day operations on visual effects at MFX’s Los Angeles headquarters. He is also heading up character and creature design work for their current slate of projects.

Coulter joins the company with over 30 years of effects experience, having worked on more than 200 films that have integrated almost every aspect of special and visual effects. His credits include the feature films The Expendables, Conan the Barbarian, Dogma, My Favorite Martian and The Crow.

He began his career as a production assistant on George Romero’s Creepshow. His first love was special effects makeup and monsters, and he quickly moved up the ranks, becoming a makeup FX supervisor and developing his animatronics and practical blood skills. While working on The Crow, Coulter saw Jurassic Park in the theater and became inspired by the digital visual effects. Not long after, he took a six-month sabbatical from his role as a makeup FX artist to learn everything he could about computer animation.

In 2001, Coulter founded and launched Worldwide FX, headquartered in Bulgaria, to meet the visual effects needs of Nu Image/Millennium Films — Millennium Studio owns Worldwide FX. He eventually expanded the facility, even opening a second facility in Shreveport, Louisiana. To date, Worldwide FX has created VFX for more than 140 films.

“Scott is a rare sort… his knowledge is diverse and his skill set is truly unmatched,” says MastersFX founder/president Todd Masters, noting that he and Coulter have a working relationship that spans two decades. “Scott and I worked together more than 15 years ago on early MFX projects such as Demon Knight, Dark Skies and Mortal Kombat. Back then we were using only puppets and prosthetics. Scott was there, coming up with great tricks, even then. But it was all practical FX, and long before the birth of digital.”

Today MastersFX combines both practical and digital techniques. “Because we’ve changed many of the ways we approach FX these days, both Scott and I did a bit of an adjustment over these past many, many years — working from separate locations and with slightly separate disciplines,” adds Masters. “Most recently, I’ve been helping develop and supervise the art-side of our advanced FX methodologies, while Scott’s been evolving himself into this amazingly creative and experienced, worldwide VFX executive. He’s one of the few effects executives I know of who has seen it all and done it all, from numerous vantage points.”

“The blending of these disciplines is the greatest challenge moving forward, and this company is the place to perfect this blending and really make it happen,” adds Coulter. “On top of that, the idea of working in LA again after many years based in Bulgaria also appealed to me!”

Shade VFX ups Lisa Maher to executive producer

Bi-coastal visual effects company Shade VFX has promoted Lisa Maher, former biz dev executive, to the position of executive producer with primary responsibility for overseeing the studio’s Los Angeles-based productions.

Maher assumes her new role not long after Shade VFX was nominated for an Emmy for Outstanding Special Visual Effects in a Supporting Role for the Netflix series Daredevil.

Maher joined the Shade VFX team in 2013, bringing with her over 20 years in the film industry, primarily producing visual effects for facilities such as Rhythm & Hues and CIS Hollywood. She completed production on over 25 feature films before transitioning into business development. Prior to joining Shade, she worked for Dr. D Studios and Fuel VFX.

Maher emphasizes that regardless of the title change, her key goals won’t change. “Focusing on the needs of our clients will still be my primary mission.”

“Lisa has been one of the driving forces of Shade’s recent and rapid expansion and growth. While acting as our business development executive, Shade’s revenues grew 300 percent by the end of 2014, and are on target for even more growth in 2015,” says CEO Bryan Godwin.

The Shade team has lent their expertise to projects such as Daredevil, True Detective, The Amazing Spider-Man 2, Teenage Mutant Ninja Turtles, The Intern, Black Sails, Olive Kitteridge, 22 Jump Street, Selma, Behind the Candelabra and The Wolverine among many others. They are in post for a number of projects at this time, including Batman v. Superman: Dawn of Justice, to be released in 2016.

Frame.io updates video collaboration tool

The minds behind Frame.io have added some new features to their cloud-based video collaboration tool, which they say has now surpassed 50,000 members from 120 countries.

These include the following for the Frame.io Web App 1.1:
• New Private Team Files and Folders — You can now set files and folders to be invisible to collaborators.
• New Collaborator Permissions — You can now restrict collaborators from downloading, sharing or inviting other collaborators.
• New Project Sharing — When project sharing is turned on, anyone with the link can join. You can invite large groups of collaborators without having to invite them individually.
• New Realtime Upload Status — Now all participants of a project can see upload progress in realtime, which can eliminate lots of confusion.
• Expanded Keyboard Shortcuts — You can now use the arrow keys to navigate through thumbnails: spacebar to Quicklook, Esc to exit, Enter key to enter the player, and Esc key again to exit the player.

Updates for the Final Cut Pro Companion App 1.1 include:
• Added support for queuing
• Drag and drop upload from the desktop
• Custom export locations allow access to rendered FCP X media
• Convert FCP X markers into timestamped Frame.io comments (see the sketch after this list)
• Added options to choose marker types when exporting only clips with markers
• New reduced bandwidth option
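
To illustrate what a marker-to-comment conversion like the one above involves, here is a generic sketch; it is not Frame.io’s companion app or API, and the payload fields and function name are assumptions.

```python
# Generic sketch of turning an editorial marker into a timestamped comment.
# Not Frame.io's actual API; names and structures here are assumptions.

def marker_to_comment(marker_name: str, frame: int, fps: float) -> dict:
    """Convert a marker at a given frame into a comment payload with a
    human-readable timecode (HH:MM:SS:FF, non-drop-frame for simplicity)."""
    total_seconds, frames = divmod(frame, round(fps))
    hours, rem = divmod(int(total_seconds), 3600)
    minutes, seconds = divmod(rem, 60)
    return {
        "text": marker_name,
        "timestamp_seconds": frame / fps,
        "timecode": f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}",
    }

print(marker_to_comment("Fix comp edge", frame=3600, fps=24.0))
# {'text': 'Fix comp edge', 'timestamp_seconds': 150.0, 'timecode': '00:02:30:00'}
```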

See our past coverage of Frame.io here.

Conceptual artist Syd Mead to get VES Visionary Award

The Visual Effects Society (VES) has named visual futurist and conceptual artist Syd Mead as the next recipient of its Visionary Award in recognition of his contributions to visual arts and filmed entertainment. Mead designed vehicles and robots for many classic sci-fi films, including Tron. The award will be presented at the 14th Annual VES Awards on February 2, 2016.

The VES Visionary Award, voted on by the VES board of directors, recognizes an individual who has “uniquely and consistently employed the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.”

Mead’s career, which spans almost six decades, began as he created characters and backgrounds for animated cinema intermission trailers after he graduated from high school. After serving in the US Army and receiving his education at the Art Center School in Los Angeles, Mead was recruited by Ford Motor Company’s Advanced Styling Studio. After Ford, he took on high-profile design assignments for companies including US Steel, Philips and InterContinental Hotels.

In 1979, Mead’s projects expanded to designing for Hollywood as he began to work with most major studios. He started with the creation of the V’ger entity for Star Trek: The Motion Picture, followed by two cult classics, Blade Runner and Tron. Mead’s designs for robots, vehicles and other-worldly environments have also been featured in films including 2010, Short Circuit, Aliens, Timecop, Johnny Mnemonic, Mission: Impossible III and Elysium.

In the 1980s, Mead established close working relationships with a number of major Japanese companies, including Sony, Minolta, Dentsu, Dyflex, Tiger, Seibu, Mitsukoshi, Bandai, NHK and Honda, as well as contributing to the Japanese film projects Yamato 2520 and Solar Crisis. In the 1990s, he supplied designs for all eight robot characters in the Turn A Gundam mobile suit series and TV show.

“Syd is truly a defining creative force in the world of visual arts,” said Mike Chambers, VES board chair. “He has a rare ability to create fiercely inventive images, both iconic and sublime, and he has contributed to some of our most unforgettable cinematic experiences. Syd’s legendary contribution to the field of design, and the inspiration he has provided for generations of visual effects artists, is immense.”

Previous winners of the VES Visionary Award have been Christopher Nolan, Ang Lee, Alfonso Cuarón and J.J. Abrams.

Photo: Jenny Risher

Quick Chat: ‘Ted 2’ previs/postvis supervisor Webster Colcord

By Randi Altman

Ted, the foul-mouthed but warm-hearted teddy bear, is back on the big screen, this time fighting for the right to be recognized as a person — he really wants to get married — in Ted 2 from director Seth MacFarlane. Once again, this digital character is seen out and about in Boston, in all sorts of environments, so previs, mocap and postvis played a huge role.

Cue Webster Colcord. He was previsualization and postvisualization supervisor on Ted 2, reporting to Culver City, California’s The Creative-Cartel. Colcord held similar titles on the first Ted, serving as the production’s previs/postvis artist and mocap integration artist. He also worked on Ted’s appearances between the two movies — The Jimmy Kimmel Show and the Oscars. He worked out of the production unit set up by Universal Pictures and studio MRC.

For Ted 2, Colcord and team used motion capture, via the Xsens MVN system, to record MacFarlane, who also voices Ted, as he acted out scenes. Because it’s an inertial system, MVN allowed the character (and director) to step out of the mocap volume and onto the streets, something that couldn’t be done with an optical system.

Colcord has been working in CG since 1997. Prior to that he was a stop-motion animation artist. “I do all kinds of things,” he says. “Previs, animation, postvis and supervision. Mocap is not my usual gig, actually! Right now, I’m animation supervising at Atomic Fiction (Flight, Star Trek Into Darkness, Game of Thrones) in the Bay Area.”

We reached out to Colcord to find out more about his process and the workflow on Ted 2.

You worked with The Creative-Cartel and Jenny Fulle. What was that relationship like?
Creative-Cartel has been the VFX management unit on the Ted movies, so they oversee planning and the dissemination of assets between the different parties involved. They are involved every step of the way, from pre-production through to final delivery.

The on-set duties for all of us tend to be all-engrossing, but after principal photography, when I am in-house with the editorial department doing postvis, I’m supporting just the editorial department and the VFX teams. At a couple of points in the schedule, however, we were prepping for re-shoots on stage with the main unit, mocap at the editorial office, postvis for upcoming screenings and delivery of synced mocap to the vendors. It could be overwhelming!

Whose decision was it to use Xsens? Do you know if that’s what they used on the first Ted?
During development on the first movie, producer Jason Clark and VFX producer Jenny Fulle researched and tested various mocap options and arrived at Xsens’ inertial mocap system, which was very new at the time. It was decided to go with the Xsens MVN system because of the ease of set-up on location. You don’t need to set up a volume, and it’s very portable — the set-up is minimal. Also, there are no marker occlusion issues.  It has a few limitations that the optical systems do not have, but with every update those differences become less and less.

There is a big dance sequence in the film. It must have been particularly challenging to capture the movements of a completely CG character?
It was a complex sequence, and it blends in from a previous scene with Ted dancing in a different environment, adding to the complexity.  The credit for working out the choreography goes to Rob Ashford, Sara O’Gleby and Chris Bailey.  Also, of course, VFX supervisor Blair Clark.  It’s important to understand, though, that the mocap system just provides a core performance and the final Ted is a blending of keyframe animation (Iloura did the dance sequence) and mocap. My role was to facilitate the performance and get it over to the VFX team with a high degree of fidelity and in a pipeline-ready form.

We ended up capturing it in about four sessions, with five different dancers, each of whom acted out Ted’s motions for various parts of the choreography. During production on Ted 2, Xsens released an updated version of their system, which they call MVN Link.  The sensors are smaller, the data has been improved and the wireless signal uses Wi-Fi rather than Bluetooth.  So we used that version of the system for the dancers. For Seth’s performances we use a fully wired system with an umbilical cable attached to the computer, as Ted is usually not being very acrobatic in his motions.

What’s the workflow like?
We recorded a live feed of the mocap on the low-res Ted model from Autodesk MotionBuilder as we captured the data. In some cases editorial was able to comp this into shots to use as postvis, pretty much right out of the box.

Postvis in progress on the “Ted Hooker” scene: Colcord and the postvis team were called in on the day of a screening to help make a joke “play” as per MacFarlane’s direction.

So you captured the data and sent it to Iloura and Tippett Studio?
Yes, I would retarget the data in MotionBuilder, then sync the data in Autodesk Maya with a minimal amount of clean-up and send it off to both houses. The data would also be used as the core performance for many of our postvis shots.
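
Conceptually, that sync step boils down to offsetting the captured take so a reference frame in the mocap lines up with the matching frame in the plate. The sketch below is an illustration under assumed frame numbers and function names, not the actual Ted 2 MotionBuilder/Maya setup.

```python
# Conceptual sketch of syncing a mocap take to a plate before handoff.
# Illustrative only; the real work happened in MotionBuilder and Maya.

def sync_offset(mocap_slate_frame: int, plate_slate_frame: int) -> int:
    """Frames to shift the mocap clip so its slate lands on the plate's slate."""
    return plate_slate_frame - mocap_slate_frame

def retime_keys(key_frames: list[int], offset: int) -> list[int]:
    """Apply the offset to every keyframe in the captured performance."""
    return [f + offset for f in key_frames]

offset = sync_offset(mocap_slate_frame=48, plate_slate_frame=1001)
print(retime_keys([48, 60, 72], offset))  # [1001, 1013, 1025]
```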

If Ted makes any more appearances on talk shows/awards shows will you be using Xsens for that too?
I assume that we will be using the MVN system since we have an established pipeline. It’s pretty much the same as what we do for the feature, but it depends on who is doing the editorial duties, since the first pass at deciding which part of a mocap performance is used is made by the editor.

Design director Leo Nguyen joins Carbon VFX

Carbon VFX’s New York-based studio has added design director Leo Nguyen to its staff. He brings with him experience that spans art direction, illustration, 3D, motion graphics and editing. Prior to Carbon VFX, Nguyen spent two years at Light of Day, where he started as design director and eventually became creative director. He joined Light of Day in 2013 when he moved from Australia, where he spent his childhood, to New York City.

Nguyen has worked on all sorts of projects, from branding to commercials, short films, music videos and broadcast promos. Recently he completed a campaign for Royal Canin pet food and the animated Christmas short Window Washer for DDB NY, as well as The Big Picture, a collection of shorts scored by jazz musician David Krakauer. Nguyen also directed a series of broadcast promos for the 54th and 55th Grammys and Australia’s Next Top Model.

Over his design career, Nguyen has won five PromaxBDA International Awards, one Rocket Award for Global Excellence (along with Alessandra Menozzi) and four additional awards for his design work on the Australian TV shows Cricket Superstar and Slide. Nguyen directed his own short film called Paper Boats & Paper Planes. It was an official selection at the Sydney International Animation Festival in 2010.

Behind the Title: rof vfx’s John Myers

NAME: John Myers @jmyersrof

COMPANY: rof vfx (Ring of Fire)

CAN YOU DESCRIBE YOUR COMPANY?
We are a ninja chameleon that can bench press 5,000-plus pounds and run a sub-one-second 40, all day, every day! We are also an award-winning visual arts, effects and animation company (with offices in Santa Monica and New York) that works across all platforms and specializes in custom creative approaches and high-end seamless effects, including animation, 2D and 3D compositing, motion graphics design and finishing for all formats and mediums.

WHAT’S YOUR JOB TITLE?
Executive Producer/Visual Effects Supervisor

WHAT DOES THAT ENTAIL?
It’s lots of creative thinking and action to navigate the day, on set or in the studio: helping to create energy and make things happen, keeping it simple and focused.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
That my partner — and our creative director — Jerry Spivack and I are not brothers or the same person.

WHAT HAVE YOU LEARNED OVER THE YEARS ABOUT RUNNING A BUSINESS?
You cannot wait for things to happen, and you have to be prepared so that absolutely nothing surprises you.

A LOT OF IT MUST BE ABOUT TRYING TO KEEP EMPLOYEES AND CLIENTS HAPPY. HOW DO YOU BALANCE THAT?
Beer.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Getting to work every day in the studio and on set with amazingly smart, talented and funny people. I am blown away every day by something or someone.

WHAT’S YOUR LEAST FAVORITE?
5am call times… only to wait until 11pm to start the set-up for the “VFX” shot.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
5am.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Driving combines in Nebraska or on a world tour as drum tech for Nicko McBrain.


Photographic proof that John and Jerry are two different people, and Heroes Reborn.

CAN YOU NAME SOME RECENT CLIENTS?
A Father John Misty music video for director Grant James, the TV series It’s Always Sunny in Philadelphia, integrated content for ad agency High, Wide & Handsome, TV series work on Instant Mom for Nickelodeon and Girl Meets World for Disney, and Heroes Reborn “Where Are The Heroes” for NBC and director Kendall Bowlin.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
iPhone, microwave oven and a giant HD monitor.

Quick Chat: Stargate Studios president Darren Frankel

Stargate Studios, a visual effects and high-end production company, has 10 offices in seven countries where they work on television, features, commercials and special venue projects. Their credits are impressive and include work on Gracepoint, The Walking Dead, Grey’s Anatomy, Ray Donovan, House of Lies and 12 Monkeys.

Stargate has been around for 25 years, which in this business is a rarity. We checked in with president Darren Frankel to find out how they have survived, thrived and more.

You’ve hit the quarter of a century mark, which is impressive. How have you not only survived in a very difficult market but also thrived and expanded around the world? Any wisdom to share?
You have to constantly reinvent yourself and your process. The industry is constantly changing and producers, directors, studio executives, etc. are looking for partners to help them stay ahead of the curve. We are always looking at new ways of doing things and which tools exist to allow us to do the things that we couldn’t achieve before. To stay current, you always have to be ready to break the model and improve it. You also need to look at your client’s problems as your own so that you are thinking along with them rather than just being reactive.

Can you talk about your different locations and is different work done at each or does the work at all locations mirror the others?
We have 10 offices — Los Angeles, Atlanta, Vancouver, Toronto, Mexico City, London, Berlin, Cologne, Malta and Dubai. At the crux of each facility is local work. All the facilities are interconnected using a proprietary system known as VOS (Visual Operating System). This inter-connectivity allows artists to share work and for VFX supervisors, producers and coordinators to communicate with greater ease. The business has grown internationally, so Stargate’s network of facilities also allows us to put talent in place regardless of location and bring that talent to the projects that need it.


Before and After: An example of work Stargate did for the show Gracepoint.

Can you talk about some of the work you have done and are working on currently?
Currently we’re working on approximately 25 different projects around the world, a sampling of which includes The Walking Dead, Dig, Grey’s Anatomy, Ray Donovan, Damien, El Principe and a host of other projects that I wish I was at liberty to talk about!

Any one that you are particularly proud of. Can you describe?
Every project brings its own set of challenges and some of the work that I’m most proud of is work that nobody would ever know we even did because it’s invisible. At the end of the day, a company is really about its people, and I’m extremely proud of all of them.

Before and After: More work for Gracepoint.

What are your main tools?
Our main software tools, in addition to the aforementioned VOS, are After Effects, Maya, LightWave, Premiere, Mocha Pro, RealFlow, Photoshop, Golaem and a host of other plugins and tools. In addition we use all kinds of cameras and production tools because real is always best when feasible, so we often shoot our own elements as well.

You are also using Signiant’s Media Shuttle to work with all your locations seamlessly. Can you walk us through that workflow and describe how you were doing this before Media Shuttle?
The business has become global. Shows often shoot in one city, do editorial in another city, and desire to gain tax incentives from yet another city. The ability to move and share data across Stargate’s network has become of paramount importance. Using our internal VOS system, data can be moved through the network automatically using preference settings rather than manual human interaction. It will even place files in the same directory on the network in a different city, without relying on moving files to a shared folder and then having to manually disperse those files to their proper locations once the transfer is complete. We have moved from using FTP over our private VPN network and externally to clients, to Signiant’s Media Shuttle.

The two major reasons we did this are:
1. File Transfer Speed: Media Shuttle optimizes the bandwidth of users on upload and download to make files move faster between locations, and helps to mitigate the need for shuttling drives from client editorial and post facilities.

2. Security: Password-protected FTP sites are only so secure, and the way Signiant packages files makes them less susceptible to being compromised in any way.

Internally it didn’t create more work on our end but provided a significant net gain.
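
Conceptually, that preference-driven placement behaves like a small table of routing rules that maps an incoming file to the same relative directory at every site. The sketch below is an illustration with hypothetical show names, suffixes and paths, not Stargate’s proprietary VOS.

```python
# Illustrative sketch of preference-based file routing between facilities.
# Show names, paths and rules are hypothetical, not Stargate's VOS config.

from pathlib import PurePosixPath

ROUTING_RULES = [
    # (show, file suffix) -> relative destination used at every facility
    (("walking_dead", ".exr"), "plates/incoming"),
    (("greys_anatomy", ".mov"), "editorial/refs"),
]

def route(show: str, filename: str, facility_root: str) -> PurePosixPath:
    """Pick the destination directory for a transferred file so the same
    relative path exists at each facility, without manual dispersal."""
    suffix = PurePosixPath(filename).suffix
    for (rule_show, rule_suffix), rel_dest in ROUTING_RULES:
        if rule_show == show and rule_suffix == suffix:
            return PurePosixPath(facility_root) / rel_dest / filename
    return PurePosixPath(facility_root) / "inbox" / filename  # fallback

print(route("walking_dead", "sh010_bg_v002.exr", "/mnt/toronto"))
# /mnt/toronto/plates/incoming/sh010_bg_v002.exr
```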

Check out the Stargate website for reels and more credits.

Ingenuity Engine effects ‘The Last Man on Earth’

Los Angeles-based Ingenuity Engine was the primary visual effects vendor on the first season of the Fox hit The Last Man on Earth — the story of a man who believes he is the only survivor of a deadly virus that has wiped out the world’s population. In an act of desperation, Phil (series creator and star Will Forte) puts up billboards telling anyone who might care that he is “Alive in Tucson.” Amazingly, he finds other survivors.

The series, scored by Mark Mothersbaugh (see our story here), has already been renewed for a second season.

Aside from the show’s virtual cul de sac exterior location (via CBS Digital) and a few small shots done by Mango LA, Ingenuity Engine was responsible for all of this season’s VFX on the show. The number of shots they provided per episode varied quite a bit, anywhere from 10 to more than 50. “Last Man takes place a few years after a global disaster kills most of the world’s population,” explains Ingenuity Engine’s VFX producer Oliver Taylor. “So any time the show’s characters went out into that world we wound up doing a large amount of work to help make the world look uninhabited.”

Oliver Taylor

The work ranged from basic paint-outs, sky replacements and general clean-up all the way up to creating a CG space station orbiting Earth for the season finale. “The Last Man crew is very experienced and pretty VFX savvy, so they had a very good grasp of what was possible in post, what they could do on their own and what we needed to be deeply involved in,” says Taylor.

Two VFX sequences from the first season stick out in Taylor’s mind in terms of how involved the studio got with the shots. The first is a sequence where Will Forte’s character climbs up onto the catwalk of a billboard, only to have the ladder fall. “They couldn’t put their actor on top of a real billboard because it would be unsafe, and putting a camera up that high would be slow and error-prone. So a billboard set was constructed with greenscreen around it, and a plate shoot was done to capture the backgrounds,” he describes. “Production brought us on a week or so before the shoot and we went over the storyboards in detail with their team. We also participated in the location scouts and helped them figure out the dimensions of the set they were building.”

The second VFX-heavy sequence is the reveal of the International Space Station (ISS) for the finale. Just as with the billboard sequence, Ingenuity Engine got involved from the very start. “We were looking at storyboards, recommending adjustments that were appropriate, walking through the set with the director and DP, and helping everyone come to a consensus as to how it was going to be photographed.”

The first shot of the sequence is a crane shot that rises from the surface of the Earth, through the clouds and up to the International Space Station. “We knew right away we couldn’t get away with a matte painting of the Earth, and that to some extent we had to do it for real,” explains Taylor. “Because the camera moves through the clouds we needed them to be volumetric. We also needed them to cast correct shadows on the Earth, interrupt reflections on the surface of the Earth’s terrain and water, etc. All this necessitated a detailed build of the Earth, its textures and displacement, and proper interaction with the clouds.”

Once the viewer gets inside the ISS they see a character that has been stranded for some time. Making the actor (Jason Sudeikis) look like he was in zero gravity involved placing him on a crane arm, which would be moved around as he pushed off the walls. Taylor explains that the scene and the removal of the zero-gravity rig from the actor on the ISS was a fun challenge. “The two complicating factors were that it was shot on a moving Steadicam and the stunts team used a crane rig, which entered the shot from just behind camera. To remove the crane from a shot that drifted around, we had to build geo for the interior of the ISS to match the set, get an accurate match-move for the camera and re-project clean-plate textures back onto the geo. It’s a tricky process, one that requires a lot of tweaking, but it allows production to be very flexible with how they shoot the scene.”

Remember that all of this was done on a TV production schedule, meaning fast turnarounds. Taylor says timing is a big challenge when it comes to creating CG-heavy shots like this. “The typical episode only gets a week or so for VFX, but with the ISS sequence it was necessary to lock the edit five weeks in advance. Editorial was able to do that because we did animated previsualizations specifically for their edit. They were able to cut in different versions of our animation and retime them to get the pacing and shots they wanted. In parallel we developed the model, texture and rendering work. The Last Man post team was great to work with. Doing a sequence like this in such a short time requires very tight collaboration, which keeps everyone on the same page and makes the creatives on their end feel more involved in the process and more at ease with where we’re going.”

Tools
Ingenuity Engine is in the process of testing and switching to The Foundry’s Modo for a lot of its 3D work, so they used this project as an opportunity to “jump in the deep end,” says Taylor. “The interactivity of the render preview is a great advantage for us and for this particular situation. To stay true to reality we knew there was only one way we could light the exterior of the ISS — with the light from the sun and a soft bounce from the Earth. Being able to work on the lighting and shading and quickly see results makes the process much easier to navigate and, ultimately, allows us to do much better work.”

The studio calls on The Foundry’s Nuke for compositing, as well as The Foundry’s Hiero for all I/O, which Taylor refers to as “a great integrated pipeline that makes everything a little faster and easier.” Side Effects Houdini was used to create the clouds around the Earth in the ISS sequence. “The primary benefit for us was that we could place rough geo in the scene, so that the placement of the clouds made sense artistically, and use that geo to generate VDB volumes of the clouds. It’s a process we nailed down working on commercials and have used again and again doing cloud work.”
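
For readers curious what that rough-geo-to-VDB setup looks like in practice, here is a hedged Houdini Python sketch of the general idea: place simple proxy geometry where the clouds should sit, then convert it to a VDB. The node layout and names are a minimal illustration under assumptions, not Ingenuity Engine’s actual scene, and it is meant to be run inside a Houdini Python shell.

```python
# Hedged Houdini Python sketch of the rough-geo-to-VDB approach described
# above. Illustrative only; not the Ingenuity Engine setup.

import hou

obj = hou.node("/obj")
cloud_geo = obj.createNode("geo", "cloud_layout")

# Rough placement geometry an artist would art-direct by eye.
proxy = cloud_geo.createNode("sphere", "cloud_proxy")

# Convert the proxy surface into a VDB (the "VDB from Polygons" SOP);
# enabling its fog output on the node gives a density volume for clouds.
to_vdb = cloud_geo.createNode("vdbfrompolygons", "cloud_vdb")
to_vdb.setInput(0, proxy)
to_vdb.setDisplayFlag(True)
```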

Ingenuity Engine called on another Foundry product, Mari, for texturing the exterior of the ISS. “Mari was very useful in this situation because it saved us a lot of time in model prep and skipping a lot of the grunt work in laying out UVs,” concludes Taylor.

If you haven’t seen The Last Man on Earth yet, check it out on Hulu and get your binge on.

Behind the Title: Method VFX supervisor Alvin Cruz

NAME: Eduardo “Alvin” Cruz

COMPANY: Method Studios (@method_studios)

CAN YOU DESCRIBE YOUR COMPANY?
Method Studios is an artist-driven global studio that offers high-end visual effects for the film, commercial, television, gaming and design industries.

WHAT’S YOUR JOB TITLE?
Visual Effects Supervisor

WHAT DOES THAT ENTAIL?
A visual effects supervisor is the person who determines creative and technical approaches for…

London’s Milk VFX house expands to Wales

London-based Milk VFX is opening a second studio in Cardiff, Wales, to support its expanding roster of TV and feature film projects, such as ITV’s 13-part warrior drama Beowulf and Thunderbirds Are Go, FX’s The Bastard Executioner, Hartswood/BBC’s Sherlock, as well as the feature film Poltergeist and the recently completed Insurgent, the second offering in the Divergent series.

Milk’s new studio is located at the GloWorks building in Cardiff Bay — the Welsh Government’s flagship center for creative industries. Milk will open at the end of April with 20 artist seats and will begin work immediately on projects including the upcoming ninth season of the BBC’s Doctor Who and Hartswood/BBC’s Sherlock Christmas Special 2015. The new studio will share seamless communications and workflow with Milk’s London office via Sohonet.

VFX supervisor Sue Land will manage the new studio, reporting to Will Cohen, Milk’s CEO and executive producer. Milk has already recruited a number of key crew locally and will work closely with the Welsh Government and Creative Skillset to maximize opportunities to hire and train local talent as the studio grows. Milk has received the support of the Welsh Government as part of its program to foster the creative industries in Wales.

“Wales has a growing reputation as a great location for the film and television industry, and the Milk team are excited to be part of it,” reports Cohen. “The Welsh Government has been extremely supportive. They have helped us to locate our premises and talked us through the various options for new business support in Wales. They are passionate about growing the infrastructure and it is infectious!”

And regarding the move to Cardiff in particular, he says, it was “a natural choice of location, given our long-term relationship with BBC Wales, as the BBC Roath Lock Studios are just opposite GloWorks, where Milk’s studio will be located. Milk’s new Cardiff hub will share seamless communications and workflow with our main London studio — enabling us to replicate its boutique style service.”

Milk’s location in Cardiff will use the same tools as its London location, including Maya, Houdini, Arnold, Golaem Crowd, Yeti, Mari, Nuke, Ocula, 3D Equalizer, Deadline, RV, Shotgun and some proprietary offerings.

Behind the Title: Zoic’s Peter Hunt

NAME: Peter Hunt

COMPANY: Zoic Studios (@zoicstudios)

CAN YOU DESCRIBE YOUR COMPANY?
Zoic Studios is a visual effects company, based in Culver City, California, and Vancouver, British Columbia, that creates VFX for films, TV, commercials, games and digital experiences.

WHAT’S YOUR JOB TITLE?
CG Supervisor

WHAT DOES THAT ENTAIL?
It means I have two sets of bosses and responsibilities.
1. Supporting the creative vision of our client through the VFX supervisor.
2. Making sure my team and departments have the technical and artistic means to bring that…

Quick Chat: Camille Geier, EP of Shade VFX New York

Earlier this month, bi-coastal studio Shade VFX brought on veteran visual effects producer Camille Geier as executive producer of its New York location, which recently expanded into a new 5,000-square-foot location.

Geier, who started her visual effects career at ILM as a VFX producer, will oversee Shade’s New York feature and television work, including shots on Marvel’s Netflix series Daredevil.

She comes to Shade after a recent stint at Rodeo FX. Prior to that Geier spearheaded the feature film division for RhinoFX where, as EP, she oversaw over 20 films, including The Adjustment Bureau, Salt, The Other Guys and Ghost Town. Before that, she worked at Curious Pictures in television animation.

Paul Marangos brings his Flame expertise to Hooligan in NYC

Paul Marangos has joined New York editing and VFX boutique Hooligan as senior visual effects Flame artist. He has over 20 years of post experience, with an eight-year tenure at The Mill and stops at London’s Cell, LA’s The Finish Line, Johannesburg’s Blade and Scarlet in NYC.

He has already wrapped up several projects with Hooligan, including commercials for Citi and Match.com, and Indrani’s short film Crescendo, curated by Pepsi in conjunction with the 2014 FIFA World Cup.

“I’ve always enjoyed working with editors, and Hooligan provides me the opportunity to step out of traditional facilities and work directly with them in order to continue doing what I do best,” says the South African born Marangos.

“Paul always seems to be way ahead of me,” adds Kane Platt, president/senior editor at Hooligan. “He always knows what I’m working on and what the creative challenges are, and will call me into his room to look at wonderful visual ideas before I’ve even started cutting. It’s an enthusiasm that one rarely finds — and it usually leads to great things.”

Over the years Marangos has contributed to commercials for brands such as Nike, BMW, Taco Bell, Lexus, Honda, Guinness, Cadburys, Samsung and Pepsi. His advertising reel is highlighted by Nando’s Cannes Lion-honored “Dictator” campaign and FNB’s 2010 World Cup ad in which he seamlessly composited a full stadium using only 150 people.

Marangos has also worked on graphics packages for major networks and TV shows, including Sucker TV, and on the 2001 feature film Hannibal. His special effects work can be found in music videos for Oasis (Right Here, Right Now) and Mariah Carey (My All), as well as Björk, Madonna, Goldfrapp, Kylie Minogue, Radiohead and Elton John.

After purchase of Eyeon, BMD releases free Fusion 7, Fusion Studio for $995

During IBC in September, the news broke that Blackmagic Design had purchased Eyeon and its popular Fusion visual effects software. The questions among those of us at the show began immediately. Will it be free? Will it be $995, the price Blackmagic has used in the past after buying software and then turning it around? Well, it turns out it’s yes on both counts.

Fusion 7, the advanced visual effects and motion graphics software, is now available for free; Fusion 7 for Windows can be downloaded from the Blackmagic Design website now.

The free Fusion 7 is not limited in its features — it offers an infinite 3D workspace and a node-based workflow for quickly building unlimited effects. Customers get advanced 3D compositing, paint, rotoscope, retiming, stabilization, titling, a 3D particle generator and multiple keyers, including Primatte. Fusion 7 also lets customers import and render 3D geometry and scenes from other applications as well as create their own elements from scratch.

The $995 Fusion 7 Studio includes everything found in the free Fusion 7 software, plus high-end features such as optical flow tools for advanced retiming, stabilization and stereoscopic 3D production, support for third-party OpenFX plug-ins, and distributed network rendering so customers can render jobs on an unlimited number of computers at no additional cost.
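
As a generic illustration of what distributed network rendering does (a sketch of the scheduling idea only, not Fusion Studio’s actual render manager), a frame range can be divided across however many machines are available:

```python
# Generic sketch of splitting a frame range across render nodes.
# Not Fusion Studio's render manager; purely an illustration of the idea.

def split_frames(first: int, last: int, nodes: int) -> list[range]:
    """Divide an inclusive frame range as evenly as possible among nodes."""
    total = last - first + 1
    chunks, start = [], first
    for i in range(nodes):
        count = total // nodes + (1 if i < total % nodes else 0)
        if count:
            chunks.append(range(start, start + count))
            start += count
    return chunks

for node, frames in enumerate(split_frames(1001, 1100, 3), start=1):
    print(f"render node {node}: frames {frames.start}-{frames.stop - 1}")
# render node 1: frames 1001-1034
# render node 2: frames 1035-1067
# render node 3: frames 1068-1100
```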

Fusion 7 Studio also includes Generation, a studio-wide multi-user workflow and collaboration tool that helps creative teams manage, track and review versions of every shot in a production. Customers can also move projects from the free Fusion 7 software to a workstation running Fusion 7 Studio and take advantage of workflow collaboration and unlimited distributed network rendering.

The company emphasizes that Fusion 7 Studio doesn’t require annual maintenance fees, subscriptions, a connection to the cloud or per-node render license costs. Fusion 7 Studio will be available from all Blackmagic Design resellers. Existing Fusion 7 customers and customers on a current Fusion support plan can upgrade to Fusion 7 Studio at no additional cost by contacting Blackmagic Design.

Fusion has been used on thousands of feature film and television projects, including Maleficent, Edge of Tomorrow, Sin City: A Dame to Kill For, The Amazing Spider-Man 2 and The Hunger Games, as well as TV shows like Battlestar Galactica, Orphan Black and others.

“Visual effects software has been expensive for way too long and it’s time that this changed. Consumers are screaming for more exciting movies and television programs and so we need to do everything we can to help our customers create stunning visual effects,” says Grant Petty, CEO of Blackmagic. “Now, with the free version of Fusion, everyone from individual artists to the biggest studios can create Hollywood-caliber visual effects and motion graphics. When combined with DaVinci Resolve Lite, customers can get advanced tools for editing, grading, 3D compositing, visual effects and motion graphics, all absolutely free.”

What about a Mac version?
Apparently postPerspective wasn’t the only outlet asking about a Mac version of Fusion. After some nudging, Grant Petty released a statement.

“Yes, we are working on a version of Fusion for Mac OS X, but there are some important things to know about that.  We are lucky that the engineering team who has been working on Fusion 7 has kept the code base very modern and clean so that allows us to move it forward. However, there is some Windows-specific code in the buttons and menus in Fusion and that code is being changed out right now. What that means is the time it’s going to take to do a Mac OS X version of Fusion is a bit unknown, and so it’s impossible right now to specify any kind of release day. It’s impossible to even know when we can show a Mac OS X version.

“However, it is early. We have only been working with Fusion as a Blackmagic Design product for a few weeks and so we will know more soon hopefully. We have already doubled the size of the engineering team, so this means we should be able to move faster, depending on how the team grows and works together. The trick is to work on a Mac OS X version of Fusion as well as doing all the other things we want to do, such as new features. A bigger team will help that. There is a lot more we want to do than just the Mac OS X version, even though that’s important!

“One thing I can say though, is that our plan is to allow anyone who purchases the Windows version of Fusion 7 Studio to use their dongle on the Mac and to be able to download that Mac OS X version of Fusion free of charge. That’s what we do with DaVinci Resolve and it’s very flexible, and I think helps people a lot. So we want to do that with Fusion also, even though they are very different types of software.

“I use a Mac, so I want to use Fusion without needing the VMware emulator I need to use now!”

Framestore launches live-action cat into CG universe

For agency VCCP and O2, a UK-based provider of mobile phones, mobile broadband and SIM-only plans, London’s Framestore put a real cat into a CG spacesuit and sent him out into a completely computer-generated universe.

The viewer first sees a planet floating in deep blue space. That transitions to a close-up of the adventurous kitty, his head encased in a helmet. The cat then floats off into space as a voiceover describes O2’s SIM-only offerings while a satellite made up of SIM cards is displayed. We get another view of the floating cat as a 4G “comet” logo whizzes by him. You can watch the Simplicity spot here.

“We treated the project very much like Gravity,” says the spot’s director, Framestore‘s Mike McGee, “creating a previs with The Third Floor and using Framestore’s art department to concept the key moments and how a cat would actually look in a spacesuit.”

“It was an ambitious project to launch our cat into outer space, and we knew there was no one better than Framestore to help us make it a reality,” reports VCCP creative director Jim Capp.

On set, McGee and producer David Hay had to shoot live action that would fit what they had prevised. The cat, a Maine Coon called Jonesy, was given a little 3D-printed space helmet in order to cast the right shadows across his face and was placed on a turntable so the team could move him smoothly.

“We had prevised the cat to do this upside down motion, as if he’s falling off into the distance,” says CG supervisor Jay Khan. “Obviously we couldn’t shoot that, so we reverse-engineered the camera moves so that the cat was doing as little as possible and the camera would compensate for that in its movement.”
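
Khan’s “move the camera, not the cat” trick can be pictured with a little transform math. The sketch below is not Framestore’s pipeline, just a minimal Python/NumPy illustration of the general idea under assumed, hypothetical values: if the subject has to stay (almost) still, you can bake its intended motion into the camera by applying the inverse of the subject’s transform each frame, and the relative pose on screen comes out the same.

```python
# Minimal sketch of transferring motion from the subject to the camera (illustrative only).
import numpy as np

def rigid_transform(rotation_z_deg: float, translation) -> np.ndarray:
    """Build a 4x4 matrix: rotation about the Z axis followed by a translation."""
    theta = np.radians(rotation_z_deg)
    m = np.eye(4)
    m[:2, :2] = [[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]]
    m[:3, 3] = translation
    return m

# Hypothetical "cat tumbles away" motion over five frames.
desired_cat_motion = [rigid_transform(5.0 * f, [0.0, 0.0, -0.2 * f]) for f in range(5)]

base_camera = rigid_transform(0.0, [0.0, 0.0, 5.0])  # static hero camera

# The real cat stays put, so the camera takes the inverse of the cat's motion;
# the camera-to-subject relationship is identical to "moving cat, fixed camera".
compensated_cameras = [np.linalg.inv(s) @ base_camera for s in desired_cat_motion]

for frame, cam in enumerate(compensated_cameras):
    print(f"frame {frame}: camera position {cam[:3, 3].round(3)}")
```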

The cat’s face is the only live-action element in the commercial, so Framestore needed to create his suit, the planets, nebula and comets with a combination of matte paintings, 2D elements and a lot of 3D. All of this was given a stylized look. “There was a conscious decision not to go photoreal and to lean slightly more towards the hyper-real. It doesn’t take itself too seriously, so once we finished we went to town with lens flares,” adds Khan.

The cat plates were tracked, stabilized and composited into the CG helmet and suit, which were built from scratch. The suit was then layered up with the HUD, the thrusters and reflections on the glass, while the studio animated displacement on top of the material to give a gradual creasing effect.

Maya was used for the 3D, with Houdini employed for effects. ZBrush and Mudbox were called on for modeling, and compositing was via Nuke.

The studio strategically placed debris from a planet to form the word ‘Priority.’ Framestore’s team also had to animate asteroid belts in a way that is reminiscent of a Wi-Fi symbol, while they created a pyro simulation for the 4G comet to give it a fluid-like trail to convey the speed.

“The satellite made of SIM cards was a big modeling task and we needed to get the size right so you could see what they are, while being small enough to give you a sense of the satellite’s huge scale,” explains Khan. “We went into a really high level of detail for the whole ad, but it’s all about the cat. We’re all really proud of it, with the end shot being a particular favorite. Vanessa DuQuesnay did a beautiful job of compositing the cat suits and added a real sense of realism to the 3D renders.”

European editor Martin Leroy heads to LA for Whitehouse Post

Edit studio Whitehouse Post, with offices in NYC, LA, London, Amsterdam and Chicago, has brought Martin Leroy on board at its Los Angeles studio. A 10-year veteran, Leroy has worked all over the world, editing for brands such as BMW, IKEA, Adidas, Ford and Audi.

The Belgian-born editor’s portfolio spans commercials, music videos, documentaries and short films, and he has established relationships with directors worldwide, including Raf Wathion of SKUNK, Koen Mortier of Czar Films and Arnaud Uyttenhove of Caviar.

“Martin is one of the most sought-after editors in Europe and we are delighted he is making the move to join us in the States,” says Whitehouse Post managing partner David Brixton. “His ability to craft beautiful and emotional visual narratives is exceptional. This, coupled with his unparalleled technical skills and knowledge of visual effects, makes him the perfect partner for big brand campaigns.”

Cloud-based collaboration tool Wipster is released

After an extensive beta program, Wipster is formally launching its new cloud-based, collaborative video review and approval platform created by filmmaker Rollo Wenlock and team.

It is designed specifically for content creators, filmmakers, in-house corporate media teams or anyone creating short-form video projects.

Wipster offers the ability to create content-rich emails and share secure folders with anyone via an invite. Its simple-to-use interface provides frame-accurate video playback, contextual commenting and version stacking. Wipster also allows users to deliver unlimited videos in their original formats, an important point for those working in post.


Wipster is available as a cloud-based subscription service with options to meet any team’s collaboration requirements. Wipster also provides users with a “Free Forever” plan that gives teams access to core Wipster features and 15 minutes of video uploads per month.

“Video is clearly exploding across the Internet — across every industry and every genre. And as a result, a whole new generation of content creator is emerging,” says Rollo Wenlock, CEO/founder of Wipster. “The problem is, while we all love creating our art, we honestly hate the process of getting it approved and delivered. Let’s face it, it’s painful. But it doesn’t have to be — that’s why we created Wipster.”

For premium features, Wipster is available on a monthly subscription model, with pricing starting at $25 for a single-user account, $50 for teams and $100 for a pro company account. Enterprise-level pricing is available upon request.

Summing up the offerings:
• Contextual Commenting – Comment and reply to feedback directly on the video
• To-Do List – Generate and share to-do lists directly from Wipster
• Version Stacking – Unlimited access to every version of your video
• Import from Dropbox – Get feedback on videos pulled straight from your Dropbox folder
• Frame-Accurate Video Playback – Let reviewers point to the exact frame
• Creating Teams – Create teams and invite people to work together on the same account
• Shared Folders – Share an entire folder of videos with anyone
• Security – Wipster uses 256-bit SSL encryption, the same standard used by banks
• Content-Rich Emails – Receive frame and comment activity notifications via email, sent directly from within Wipster
• Nudge – Gently remind reviewers when you’re waiting on their feedback
• Unlimited Sharing – Share your video with as many people as you want, wherever they are, whether or not they have an account
• Unlimited Archive – All your videos stored in one place, no limits
• Branded Presentation – When done, create tailored, branded presentations for clients.

Milk provides 117 VFX shots for ‘Doctor Who’ debut episode

The BBC’s Doctor Who is back, to the delight of television audiences worldwide. The series, now in its eighth season since its 2005 revival, recently had its season debut, and London-based VFX house Milk played a role.

The studio created the visual effects for the premiere episode “Deep Breath,” which featured Peter Capaldi’s debut as the Doctor. Ben Wheatley directed the 80-minute episode, which was simulcast and screened in cinemas globally on August 23.

The focus of Milk’s work was the sinister and mysterious “Half-Face Man,” who appears throughout the episode; the studio replaced one entire side of actor Peter Ferdinando’s head in 87 of the 117 digital shots it produced.


Dennis Hoffman running daily operations at Framestore Montreal

Dennis Hoffman has joined Framestore’s Montreal location as senior VP. He will be responsible for overseeing the day-to-day operation of the facility. Long-time industry vet Hoffman will be working closely with joint worldwide managing directors/presidents of film Fiona Walkinshaw and Matt Fox to refine Framestore’s multi-city production pipeline, while growing the capability of the Montreal studio. The VFX studio has locations in London, New York, Los Angeles and Montreal.

Over the years, Hoffman has held executive positions at a number of big-time visual effects facilities, including Digital Domain, Cinesite, Method and Dream Quest Images. He has been involved in many VFX projects, including Flags of Our Fathers and Changeling (VES Award winners for supporting VFX), as well as VFX Oscar nominees Mighty Joe Young and Armageddon, VES nominee Invictus and Earth 2, an Emmy Award-winner for best VFX. In addition, Hoffman is a founding member of the VES and currently serves on its board in Los Angeles as well as the board of the Vancouver section.


Bringing killer sharks to the Big Apple in ‘Sharknado 2’

Sharknado has become a phenomenon unto itself. After the first made-for-TV movie became a cult classic, it was only a matter of time until we saw those storm-traveling sharks on screen once more. This week Sharknado 2: The Second One premiered on Syfy and quickly started trending on Twitter.

Artists at Burbank’s The Asylum, producers of the Sharknado films, brought the chaos to the streets of New York City and to Citi Field in Queens. “From the outset, the entire Sharknado 2 project was challenging, but perhaps a little easier than ‘the first one,’” says Emile Smith, VFX supervisor on Sharknado 2.

Smith and team called on LightWave 3D to deliver final renders in a very short time. “We use LightWave for 99% of everything. We used a lot of the new Dynamics tools as well as the TurbulenceFD plugin [from Jawset] to generate the actual Sharknado, which is exactly what you might think it is: a monster tornado filled with, among other things, hungry, angry sharks.”


The opening teaser of Sharknado 2 takes place on an airplane in the middle of a storm. This presented the challenge of generating art-directable volumetric clouds that allowed the plane to fly through a ‘Sharknado’ before landing in New York. “This sequence brought in every effect we would use in the film, except for a water surface,” explains Smith. “There was a lot of digital double work for the latter part of the film and the volumetrics of the final Sharknado sequence will entertain for sure. Perhaps some people will even be impressed by how real the VFX looks. Of course, if it didn’t look real, it wouldn’t be funny.”

Placing debris (and the sharks) inside the cloud formation was where the crew started to have a lot of fun in LightWave. The creative brief called for the creation of an animatable 3D tornado; LightWave allowed the animators to animate the tornado before handing it over for simulation and rendering.
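
The “animate the funnel first, simulate later” idea is easy to picture outside of any particular package. The following is a hedged, generic Python/NumPy sketch, not The Asylum’s LightWave/TurbulenceFD setup, that scatters debris points around an art-directable tornado profile; all the parameters are made-up placeholders for the kind of shape an animator could keyframe before simulation and rendering take over.

```python
# Illustrative only: scatter debris around a simple, art-directable funnel shape.
import numpy as np

rng = np.random.default_rng(7)

def funnel_radius(height, base_radius=2.0, top_radius=18.0, funnel_height=60.0):
    """Funnel radius at a given height: narrow at the ground, wide at the top."""
    t = np.clip(height / funnel_height, 0.0, 1.0)
    return base_radius + (top_radius - base_radius) * t ** 1.5

def scatter_debris(count=500, funnel_height=60.0, swirl_turns=6.0):
    """Return (count, 3) world positions for debris swirling around the funnel."""
    heights = rng.uniform(0.0, funnel_height, count)
    # Spin angle increases with height, plus random jitter for a ragged silhouette.
    angles = (swirl_turns * 2.0 * np.pi * (heights / funnel_height)
              + rng.uniform(0.0, 2.0 * np.pi, count))
    radii = funnel_radius(heights, funnel_height=funnel_height) * rng.uniform(0.8, 1.2, count)
    return np.column_stack([radii * np.cos(angles), heights, radii * np.sin(angles)])

debris = scatter_debris()
print(debris.shape, debris[:3].round(2))
```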

There’s a particular scene in the film where a single taxi completely surrounded by man-eating sharks is stuck on a flooded New York City street. Parts of the scene were shot on location, while other elements and environments were completely greenscreened. “The visuals tell the story of the two occupants climbing onto the roof of the cab while several sharks are swimming around and sizing them up,” says Smith. “With LightWave, the multiple elements in the passes and layers for this sequence came together quickly.”

“This project was on a very tight timeline. LightWave’s ease of use and really straightforward nature helped The Asylum crew hit the deadline we were presented with,” says Mark Hennessy-Barrett, VFX artist on Sharknado 2. “Because LightWave is a really forgiving piece of software to use, we were able to break the rules and take immense liberties in the rendering engine to get the job done and out the door.”

To read about how the film was edited, see our interview with Vashi Nedomansky about his work on the project.

Kira Karlstrom joins Arsenal FX from Marvel Entertainment

Santa Monica’s Arsenal FX, which offers high-end commercial finishing, has brought on Kira Karlstrom in the role of business development executive. She comes to the studio from Marvel Entertainment, where she was a manager in the live events division.

In this role, Karlstrom will be responsible for growing the business in terms of new clients and other biz dev opportunities. Her immediate focus will be integrating animation in the Arsenal FX Design Department to customize content and messaging through visual storytelling across multiple platforms.

“I see opportunities to broaden our work into co-branded TV spots, network animation and live events, as well as commercial and digital content,” says Karlstrom. “These new avenues are crucial not only for our brand, but, more importantly, for our clients. And, it will allow us to compete with any full-service post production, VFX or design studio.”

Zoic’s Mike Romey discusses previs app ZEUS:Scout

By Randi Altman

Visual effects studio Zoic has released to the masses an iPad-based previs tool it developed while providing shots for VFX-heavy shows such as Once Upon a Time, Intelligence, Pan Am and V. The app is available now via iTunes for $9.99.

According to Zoic (@zoicstudios), ZEUS:Scout (Zoic Environmental Unification System) offers seven main modes: View allows the user to move the camera around and save camera positions for shot blocking purposes; Measurements mode allows users to bring real-world measurements into a virtual world; Characters mode can be used to insert character cards into the virtual location; Props lets users add, move, rotate and scale set props that are available for in-app purchase; Previs Animation lets users explore camera moves for previs and rehearsal purposes. The Tracking mode allows users to use the tablet as a virtual camera, with the CG view matching the…

Meet the Owner: Tone Visuals’ Brian Buongiorno

NAME: Brian Buongiorno

COMPANY: Austin, Texas-based Tone Visuals.

CAN YOU DESCRIBE YOUR COMPANY?
We are a post-production boutique specializing in digital color grading, 2D visual effects and finishing. We also have Tone Tags, which is a custom searchable database platform that we can provide for our clients.

WHAT’S YOUR JOB TITLE?
Creative Director/Colorist-Finishing Artist/Founder

WHAT DOES THAT ENTAIL?
I am involved in all aspects of our work, from bidding and pre-production to on-set VFX supervision to dailies, final color grading, VFX and finishing.


Meet the Artist/Owner: Mechanism Digital’s Lucien Harriot

NAME: Lucien Harriot (@mechdigi)

COMPANY: Mechanism Digital Inc.

CAN YOU DESCRIBE YOUR COMPANY?
A New York City-based production studio founded in 1996, providing design, visual effects and new development for the film, television and advertising industry. We are a smart, friendly group of passionate creatives who stay up late to help entertainment and marketing pros tell their stories in memorable ways.

WHAT’S YOUR JOB TITLE?

Executive producer and visual effects supervisor.

WHAT DOES THAT ENTAIL?
Being chief cook and bottle washer: sales, quality control, overseeing the marketing message and…

VFX house Hydraulx opens equipment rental and stage company

Hydraulx, the veteran visual effects shop, has opened a new production stage and camera equipment rental company called Hydraulx Filmz. Based in Playa Vista, CA, the new location offers clients a 40,000-square-foot facility with 22,000 square feet of soundstage space, 25-foot-high ceilings and 60×30-foot greenscreens.

Hydraulx Filmz is also offering rental services, including camera and lens packages featuring nine Red Epic Dragon 6K cameras, four fully stereo-capable camera rigs with matching lenses and a QTake Video Assist system.

L-R: Greg Strause and operations manager John Duke DuQuesnay at Hydraulx Filmz.

“We have decided now was a perfect time to expand our business model by launching a stage and equipment rentals business,” says Hydraulx owner/founder Greg Strause. “Our clients have the benefit of us owning everything here — we are literally a ‘one-stop shop’ solution to any production need. We have created a truly unique production pipeline, by connecting all the pieces. All of our equipment ‘talks to each other.’ We have assembled everything imaginable here under one roof, for any kind of film, TV, commercial, music video, corporate or multi-media production.”

The new Hydraulx Filmz facility also includes a DI projection theatre, connected to the Internet via 10Gb fiber, for color grading and watching dailies. The site also offers a dressing room, makeup room and three entry bays for loading materials, and provides ample parking.

Five tips for VFX supervising on set

By Hasraf “HaZ” Dulull

I’ve been on many sets over the course of my career as a visual effects supervisor and I’ve seen just about everything, from full-on greenscreen sets, motion control sets and high-speed photography element shoots to guerrilla-style handheld shoots that needed visual effects added later in post.

Although most projects are unique in their own way, the fundamentals of gathering VFX data on set are always the same, regardless of the scale or budget of the project. When I’m preparing to VFX supervise a shoot, there are five fundamentals I keep in mind. Here they are:

1) Bring a laptop loaded with editing software (Premiere or FCP) and compositing software (Nuke or After Effects) so you can play back animatics and do test…

Quick Chat: Vancouver Film School’s Marty Hasselbach

By Randi Altman

Recently, Vancouver Film School, whose program offerings are designed to prepare students to be independent filmmakers, game producers and animators, invested in 775 new HP Z420 workstations with AMD FirePro W7000 graphics for use in its new 155,000-square-foot campus in the heart of Vancouver’s Gastown. The goal was to create an environment that emulates real-world production studios.

The facilities used by the school feature the same tools you’ll find in professional studios. The new campus will be home to the Animation, Visual Effects and Film departments. Other departments will also have access to the resources as opportunities for collaboration arise.

They’ve committed 70,000 square feet of the new campus to classrooms and studio space…

Phosphene provides VFX shots for ‘The Fault in Our Stars’

Phosphene’s visual effects team, under the direction of creative director/VFX supervisor John Bair and executive producer Vivian Connolly, completed effects work and adapted the 20th Century Fox logo for the film The Fault in Our Stars. The film stars Shailene Woodley and Ansel Elgort as Hazel and Gus, who meet and fall in love at a cancer support group.

At several key points in the film, Hazel lies in the grass and stares up at the starry night sky, which was created by Phosphene. The very first time we see the night sky is during the 20th Century Fox logo animation. Phosphene was assigned the task of tying the iconic logo to the theme of the film. “We came to Phosphene with the idea of incorporating the starry sky motif into the Fox logo. Phosphene did an amazing job of bringing the night sky to life in a way that really helps launch the story,” explained director Josh Boone.


Phosphene’s primary visual effects work included the compositing of a prosthetic leg and stump for Gus. “From my first meeting with Josh, he and I talked about how real and naturalistic we wanted Gus’s leg to look. Phosphene was my first call,” said the film’s VFX supervisor Jake Braver. “We decided early on to stay away from CG and to use the combination of an amputee double and a 2D approach. Phosphene did an amazing job of seamlessly integrating the prosthetic leg and selling the illusion that Gus had lost his leg to cancer.”

Phosphene’s lead digital artist, Aaron Raff, said, “In order to replace the leg, we used camera projections and proxy geometry in NukeX to project the shape and textures of an amputee body double’s leg into the plate of Gus’s limb. Using this method, we were able to show Gus’s amputation in shots with dynamic camera moves, as well as in shots where the actor moved freely, shifting his position.” The Foundry’s NukeX, running on PCs, was the only system the New York-based VFX studio used on the shots.
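
The camera-projection approach Raff describes boils down to simple pinhole-camera math: the body double’s plate is projected through the shot camera onto rough proxy geometry, and because the render is taken from that same camera, the projected detail lines up with the live-action frame. Below is a hedged, self-contained NumPy sketch of just the projection step; the actual node graph and parameter names inside NukeX would of course differ, and the proxy points and camera values are hypothetical.

```python
# Minimal pinhole-projection sketch (illustrative, not the NukeX setup):
# find where each proxy-geometry point lands in the projecting image,
# which is where its color would be sampled from.
import numpy as np

def look_at_camera(eye, target, up=(0.0, 1.0, 0.0)):
    """Return a 3x4 world-to-camera matrix for a camera at `eye` looking at `target`."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    rot = np.stack([right, true_up, -forward])      # camera axes as rows
    return np.hstack([rot, (-rot @ eye)[:, None]])  # [R | -R @ eye]

def project(points_world, world_to_cam, focal=800.0, width=1920, height=1080):
    """Project Nx3 world points to pixel coordinates with a simple pinhole model."""
    homogeneous = np.hstack([points_world, np.ones((len(points_world), 1))])
    cam = (world_to_cam @ homogeneous.T).T              # Nx3 camera-space points
    u = focal * cam[:, 0] / -cam[:, 2] + width / 2.0    # perspective divide
    v = focal * cam[:, 1] / -cam[:, 2] + height / 2.0   # (points in front have negative Z)
    return np.column_stack([u, v])

# Hypothetical proxy "limb": a handful of points on a small cylinder.
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
proxy_points = np.column_stack([0.1 * np.cos(angles),
                                np.linspace(0.0, 0.5, 8),
                                0.1 * np.sin(angles)])

shot_camera = look_at_camera(eye=(0.0, 0.3, 2.0), target=(0.0, 0.25, 0.0))
print(project(proxy_points, shot_camera).round(1))
```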

Phosphene effects producer Ariela Rotenberg added, “Throughout the process, we took extreme care to remain true to the thematic and emotional tone of the story, particularly for our work on Gus’s leg, which is such an important plot point. We were very lucky that Josh Boone and his team came to us with such a specific and grounded vision, which allowed us to really focus on helping tell the story of these incredibly vivid characters.”

Technicolor-PostWorks in Los Angeles provided DI and lab processing.


Cut+Run ups Amburr Farls to head of production

Cut+Run has promoted senior producer Amburr Farls to head of production for its Los Angeles, San Francisco and Austin offices.

Since joining Cut+Run in 2013, Farls has produced projects for Burger King (Subservient Chicken Returns), Pepsi (World Cup), TurboTax (Super Bowl) and Miller Lite. She was previously a senior producer at Arcade Edit and at Beast, and also enjoyed successful producer tenures at both FilmCore and Trailer Park.

US managing director Michelle Eskin had this to say about Farls: “She has a wide breadth of production experience gathered during her tenure in the industry. Our Cut+Run culture is an important part of our overall brand philosophy and success. Amburr enhances this with a wealth of ideas and dynamic energy each day. Additionally, Amburr will oversee our production practices and team in an effort to hone continuity and efficiency.”

“Her work on complex projects, budgeting and adept skill with clients, along with her creative cultural ideas, have helped raise our C+R bar even higher,” agreed EP Carr Schilling.

In addition to editing services, Cut+Run offers resources for visual effects, design and finishing services for advertising, entertainment and art content.