
Framestore VFX will open in Mumbai in 2020

Oscar-winning creative studio Framestore will be opening a full-service visual effects studio in Mumbai in 2020 to target India’s booming creative industry. The studio will be located in the Nesco IT Park in Goregaon, in the center of Mumbai’s technology district. The news underscores Framestore’s continued interest in India, following its major investment in Jesh Krishna Murthy’s VFX studio, Anibrain, in 2017.

“Mumbai represents a rolling of wheels that were set in motion over two years ago,” says Framestore founder/CEO William Sargent. “Our investment in Anibrain has grown considerably, and we continue in our partnership with Jesh Krishna Murthy to develop and grow that business. Indeed, they will become a valued production partner to our Mumbai offering.”

Framestore looks to make considerable hires in the coming months, aiming to build an initial 500-strong team with existing Framestore talent combined with the best of local Indian expertise. Mumbai will work alongside the global network, including London and Montreal, to create a cohesive virtual team delivering high-quality international work.

“Mumbai has become a center of excellence in digital filmmaking. There’s a depth of talent that can deliver to the scale of Hollywood with the color and flair of Bollywood,” Sargent continues. “It’s an incredibly vibrant city and its presence on the international scene is holding us all to a higher standard. In terms of visual effects, we will set the standard here as we did in Montreal almost eight years ago.”


London’s Freefolk beefs up VFX team

Soho-based visual effects studio Freefolk, which has seen growing demand for its commercials and longform work, has expanded its staff to keep pace. As part of the uptick in work, Freefolk promoted Cheryl Payne from senior producer to head of commercial production. Additionally, Laura Rickets has joined as senior producer, and 2D artist Bradley Cocksedge has been added to the commercials VFX team.

Payne, who has been with Freefolk since the early days, has worked on some of the studio’s biggest commercials, including Warburtons for Engine, Peloton for Dark Horses and Cadburys for VCCP.

Rickets comes to Freefolk with over 18 years of production experience working at some of the biggest VFX houses in London, including Framestore, The Mill and Smoke & Mirrors, as well as agency side for McCann. Since joining the team, Rickets has VFX-produced work on the I’m A Celebrity IDs, a set of seven technically challenging and CG-heavy spots for the new series of the show, as well as ads for the Rugby World Cup and Who Wants to Be a Millionaire?

Cocksedge is a recent graduate who joins from Framestore, where he was working as an intern on Fantastic Beasts: The Crimes of Grindelwald. While in school at the University of Hertfordshire, he interned at Freefolk and is happy to be back in a full-time position.

“We’ve had an exciting year and have worked on some really stand-out commercials, like TransPennine for Engine and the beautiful spot for The Guardian we completed with Uncommon, so we felt it was time to add to the Freefolk family,” says Fi Kilroe, Freefolk’s co-managing director/executive producer.

Main Image: (L-R) Cheryl Payne, Laura Rickets and Bradley Cocksedge

Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.

Creating With Cloud: A VFX producer’s perspective

By Chris Del Conte

The ‘90s was an explosive era for visual effects, with films like Jurassic Park, Independence Day, Titanic and The Matrix shattering box office records and inspiring a generation of artists and filmmakers, myself included. I got my start in VFX working on seaQuest DSV, an Amblin/NBC sci-fi series that was ground-breaking for its time, but looking at the VFX of modern films like Gemini Man, The Lion King and Ad Astra, it’s clear just how far the industry has come. A lot of that progress has been enabled by new technology and techniques, from the leap to fully digital filmmaking and emergence of advanced viewing formats like 3D, Ultra HD and HDR to the rebirth of VR and now the rise of cloud-based workflows.

In my nearly 25 years in VFX, I’ve worn a lot of hats, including VFX producer, head of production and business development manager. Each role involved overseeing many aspects of a production and, collectively, they’ve all shaped my perspective when it comes to how the cloud is transforming the entire creative process. Thanks to my role at AWS Thinkbox, I have a front-row seat to see why studios are looking at the cloud for content creation, how they are using the cloud, and how the cloud affects their work and client relationships.

Chris Del Conte on the set of the IMAX film Magnificent Desolation.

Why Cloud?
We’re in a climate of high content demand and massive industry flux. Studios are incentivized to find ways to take on more work, and that requires more resources — not just artists, but storage, workstations and render capacity. This need to scale often motivates studios to consider the cloud for production, or to strengthen the role of cloud in their pipelines if it’s already in play. Cloud-enabled studios are much more agile than traditional shops. When opportunities arise, they can act quickly, spinning resources up and down at a moment’s notice. I realize that for some, the concept of the cloud is still a bit nebulous, which is why finding the right cloud partner is key. Every facility is different, and part of the benefit of cloud is resource customization. When studios use predominantly physical resources, they have to make decisions about storage and render capacity, electrical and cooling infrastructure, and staff accommodations up front (and pay for them). Using the cloud allows studios to adjust easily to better accommodate whatever the current situation requires.

Artistic Impact
Advanced technology is great, but artists are by far a studio’s biggest asset; automated tools are helpful but won’t deliver those “wow moments” alone. Artists bring the creativity and talent to the table, then, in a perfect world, technology helps them realize their full potential. When artists are free of pipeline or workflow distractions, they can focus on creating. The positive effects spill over into nearly every aspect of production, which is especially true when cloud-based rendering is used. By scaling render resources via the cloud, artists aren’t limited by the capacity of their local machines. Since they don’t have to wait as long for shots to render, artists can iterate more fluidly. This boosts morale because the final results are closer to what artists envisioned, and it can improve work-life balance since artists don’t have to stick around late at night waiting for renders to finish. With faster render results, VFX supervisors also have more runway to make last-minute tweaks. Ultimately, cloud-based rendering enables a higher caliber of work and more satisfied artists.
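As a concrete (and deliberately simplified) illustration of what scaling render resources via the cloud can look like at the API level, here is a minimal Python sketch using boto3 to burst Spot-priced render nodes on EC2. The AMI ID and instance type are hypothetical placeholders, and a real studio would typically drive this through a render farm manager rather than raw EC2 calls.

```python
# A minimal sketch of bursting cloud render capacity, assuming an AMI that
# boots straight into the studio's render client. Illustrative only.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

def burst_render_nodes(node_count: int) -> list[str]:
    """Request Spot-priced EC2 instances to absorb an overnight render spike."""
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical render-node image
        InstanceType="c5.24xlarge",       # large CPU instance for rendering
        MinCount=node_count,
        MaxCount=node_count,
        InstanceMarketOptions={"MarketType": "spot"},  # Spot pricing
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "render-node"}],
        }],
    )
    return [inst["InstanceId"] for inst in response["Instances"]]

ids = burst_render_nodes(50)               # spin up for the burst...
# ... render queue drains overnight ...
ec2.terminate_instances(InstanceIds=ids)   # ...and spin down when done
```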

Budget Considerations
There are compelling arguments for shifting capital expenditures to operational expenditures with the cloud. New studios get the most value out of this model since they don’t have legacy infrastructure to accommodate. Cloud-based solutions level the playing field in this respect; it’s easier for small studios and freelancers to get started because there’s no significant up-front hardware investment. This is an area where we’ve seen rapid cloud adoption. Considering how fast technology changes, it seems ill-advised to limit a new studio’s capabilities to today’s hardware when the cloud provides constant access to the latest compute resources.

When a studio has been in business for decades and might have multiple locations with varying needs, its infrastructure is typically well established. Some studios may opt to wait until their existing hardware has fully depreciated before shifting resources to the cloud, while others dive in right away, with an eye on the bigger picture. Rendering is generally a budgetary item on project bids, but with local hardware, studios are working to recoup a sunk cost. Using the cloud, render compute can be part of a bid and becomes a negotiable item. Clients can determine the delivery timeline based on render budget, and the elasticity of cloud resources allows VFX studios to pick up more work. (Even the most meticulously planned productions can run into 911 issues ahead of delivery, and cloud-enabled studios have bandwidth to be the hero when clients are in dire straits.)
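To make the bidding point concrete, here is a toy calculation (all numbers invented for the example, not a real rate card) showing why elastic render compute turns delivery time into the negotiable item: the total core-hours, and therefore the rough cost, stay about the same whether the work is spread across three days or fourteen.

```python
# Toy render-bid arithmetic with made-up numbers: cost tracks total
# core-hours, while concurrency determines the delivery schedule.
SHOT_COUNT = 200
CORE_HOURS_PER_SHOT = 400    # assumed average render cost per shot
RATE_PER_CORE_HOUR = 0.03    # assumed blended per-core-hour price, USD

total_core_hours = SHOT_COUNT * CORE_HOURS_PER_SHOT   # 80,000 core-hours

for days in (14, 7, 3):
    concurrent_cores = total_core_hours / (days * 24)
    cost = total_core_hours * RATE_PER_CORE_HOUR
    print(f"{days:2d} days: ~{concurrent_cores:,.0f} cores running, ${cost:,.0f}")
# With owned hardware the core count is fixed, so the schedule is too;
# with elastic cloud capacity the client can buy a shorter timeline.
```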

Looking Ahead
When I started in VFX, giant rooms filled with racks and racks of servers and hardware were the norm, and VFX studios were largely judged by the size of their infrastructure. I’ve heard from an industry colleague about how their VFX studio’s server room was so impressive that they used to give clients tours of the space, seemingly a visual reminder of the studio’s vast compute capabilities. Today, there wouldn’t be nearly as much to view. Modern technology is more powerful and compact but still requires space, and that space has to be properly equipped with the necessary electricity and cooling. With cloud, studios don’t need switchers and physical storage to be competitive off the bat, and they experience fewer infrastructure headaches, like losing freon in the AC.

The cloud also opens up the available artist talent pool. Studios can dedicate the majority of physical space to artists as opposed to machines and even hire artists in remote locations on a per-project or long-term basis. Facilities of all sizes are beginning to recognize that becoming cloud-enabled brings a significant competitive edge, allowing them to harness the power to render almost any client request. VFX producers will also start to view facility cloud-enablement as a risk management tool that allows control of any creative changes or artistic embellishments up until delivery, with the rendering output no longer a blocker or a limited resource.

Bottom line: Cloud transforms nearly every aspect of content creation into a near-infinite resource, whether storage capacity, render power or artistic talent.


Chris Del Conte is senior EC2 business development manager at AWS Thinkbox.

Blur Studio uses new AMD Threadripper for Terminator: Dark Fate VFX

By Dayna McCallum

AMD has announced new additions to its high-end desktop processor family. Built for demanding desktop and content creation workloads, the 24-core AMD Ryzen Threadripper 3960X and the 32-core AMD Ryzen Threadripper 3970X processors will be available worldwide November 25.

Tim Miller on the set of Dark Fate.

AMD states that the powerful new processors provide up to 90 percent more performance and up to 2.5 times more available storage bandwidth than competitive offerings, per testing and specifications by AMD performance labs. The 3rd Gen AMD Ryzen Threadripper lineup features two new processors built on 7nm “Zen 2” core architecture, claiming up to 88 PCIe 4.0 lanes and 144MB cache with 66 percent better power efficiency.

Prior to the official product launch, AMD made the 3rd Gen Threadrippers available to LA’s Blur Studio for work on the recent Terminator: Dark Fate, continuing a collaboration with the film’s director — and Blur Studio founder — Tim Miller.

Before the movie’s release, AMD hosted a private Q&A with Miller, moderated by AMD’s James Knight. Please note that we’ve edited the lively conversation for space and taken a liberty with some of Miller’s more “colorful” language. (Also watch this space to see if a wager is won that will result in Miller sporting a new AMD tattoo.) Here is the Knight/Miller conversation…

So when we dropped off the 3rd Gen Threadripper to you guys, how did your IT guys react?
Like little children left in a candy shop with no adult supervision. The nice thing about our atmosphere here at Blur is we have an open layout. So when (bleep) like these new AMD processors drops in, you know it runs through the studio like wildfire, and I sit out there like everybody else does. You hear the guys talking about it, you hear people giggling and laughing hysterically at times on the second floor where all the compositors are. That’s where these machines really kick ass — busting through these comps that would have had to go to the farm, but they can now do it on a desktop.

James Knight

As an artist, the speed is crucial. You know, if you have a machine that takes 15 minutes to render, you want to stop and do something else while you wait for a render. It breaks your whole chain of thought. You get out of that fugue state that you produce the best art in. It breaks the chain between art and your brain. But if you have a machine that does it in 30 seconds, that’s not going to stop it.

But really, more speed means more iterations. It means you deal with heavier scenes, which means you can throw more detail at your models and your scenes. I don’t think we do the work faster, necessarily, but the work is much higher quality. And much more detailed. It’s like you create this vacuum, and then everybody rushes into it and you have this silly idea that it is really going to increase productivity, but what it really increases most is quality.

When your VFX supervisor showed you the difference between the way it was done with your existing ecosystem and then with the third-gen Threadripper, what were you thinking about?
There was the immediate thing — when we heard from the producers about the deadline, shots that weren’t going to get done for the trailer, suddenly were, which was great. More importantly, you heard from the artists. What you started to see was that it allows for all different ways of working, instead of just the elaborate pipeline that we’ve built up — to work on your local box and then submit it to the farm and wait for that render to hit the queue of farm machines that can handle it, then send that render back to you.

It has a rhythm that is at times tiresome for the artists, and I know that because I hear it all the time. Now I say, “How’s that comp coming and when are we going to get it, tick tock?” And they say, “Well, it’s rendering in the background right now, as I’m watching them work on another comp or another piece of that comp.” That’s pretty amazing. And they’re doing it all locally, which saves so much time and frustration compared to sending it down the pipeline and then waiting for it to come back up.

I know you guys are here to talk about technology, but the difference for the artists is that instead of working here until 1:00am, they’re going home to put their children to bed. That’s really what this means at the end of the day. Technology is so wonderful when it enables that, not just the creativity of what we do, but the humanity… allowing artists to feel like they’re really on the cutting edge, but also have a life of some sort outside.

Endoskeleton — Terminator: Dark Fate

As you noted, certain shots and sequences wouldn’t have made it in time for the trailer. How important was it for you to get that Terminator splitting in the trailer?
Marketing was pretty adamant that that shot had to be in there. There’s always this push and pull between marketing and VFX as you get closer. They want certain shots for the trailer, but they’re almost always those shots that are the hardest to do because they have the most spectacle in them. And that’s one of the shots. The sequence was one of the last to come together because we changed the plan quite a bit, and I kept changing shots on Dan (Akers, VFX supervisor). But you tell marketing people that they can’t have something, and they don’t really give a (bleep) about you and your schedule or the path of that artist and shot. (Laughing)

Anyway, we said no. They begged, they pleaded, and we said, “We’ll try.” Dan stepped up and said, “Yeah, I think I can make it.” And we just made it, but that sounds like we were in danger because we couldn’t get it done fast enough. All of this was happening in like a two-day window. If you didn’t notice (in the trailer), that’s a Rev 7. Gabriel Luna is a Rev 9, which is the next gen. But the Rev 7s that you see in his future flashback are just pure killers. They’re still the same technology, which is looking like metal on the outside and a carbon endoskeleton that splits. So you have to run the simulation where the skeleton separates through the liquid that hangs off of it in strings; it’s a really hard simulation to do. That’s why we thought maybe it wasn’t going to get done, but running the simulation on the AMD boxes was lightning fast.


Carbon New York grows with three industry vets

Carbon in New York has grown with two senior hires — executive producer Nick Haynes and head of CG Frank Grecco — and the relocation of existing ECD Liam Chapple, who joins from the Chicago office.

Chapple joined Carbon in 2016, moving from Mainframe in London to open Carbon’s Chicago facility.  He brought in clients such as Porsche, Lululemon, Jeep, McDonald’s, and Facebook. “I’ve always looked to the studios, designers and directors in New York as the high bar, and now I welcome the opportunity to pitch against them. There is an amazing pool of talent in New York, and the city’s energy is a magnet for artists and creatives of all ilk. I can’t wait to dive into this and look forward to expanding upon our amazing team of artists and really making an impression in such a competitive and creative market.”

Chapple recently wrapped direction and VFX on films for Teflon and American Express (Ogilvy) and multiple live-action projects for Lululemon. The most recent shoot, conceived and directed by Chapple, was a series of eight live-action films focusing on Lululemon’s brand ambassadors and its new flagship store in Chicago.

Haynes joins Carbon from his former role as an EP at MPC, bringing over 20 years of experience earned at The Mill, MPC and Absolute. Haynes recently wrapped the launch film for the Google Pixel phone and the Chromebook, as well as an epic trailer for Monolith Games’ Middle Earth: Shadow of War that combined photo-real CGI elements with live action shot on the frozen Black Sea in Ukraine. “We want to be there at the inception of the creative and help steer it — ideally, lead it — and be there the whole way through the process, from concept and shoot to delivery. Over the years, whether working for the world’s most creative agencies or directly with prestigious clients like Google, Guinness and IBM, I aim to be as close to the project as possible from the outset, allowing my team to add genuine value that will garner the best result for everyone involved.”

Grecco joins Carbon from Method Studios, where he most recently led projects for Google, Target, Microsoft, Netflix and Marvel’s Deadpool 2.  With a wide range of experience from Emmy-nominated television title sequences to feature films and Super Bowl commercials, Grecco looks forward to helping Carbon continue to push its visuals beyond the high bar that has already been set.

In addition to New York and Chicago, Carbon has a studio in Los Angeles.

Main Image: (L-R) Frank Grecco, Liam Chapple, Nick Haynes

Sheena Duggal to get VES Award for Creative Excellence

The Visual Effects Society (VES) named acclaimed visual effects supervisor Sheena Duggal as the forthcoming recipient of the VES Award for Creative Excellence in recognition of her valuable contributions to filmed entertainment. The award will be presented at the 18th Annual VES Awards on January 29, 2020, at the Beverly Hilton Hotel.

The VES Award for Creative Excellence, bestowed by the VES Board of Directors, recognizes individuals who have made significant and lasting contributions to the art and science of the visual effects industry by uniquely and consistently creating compelling and creative imagery in service to story. The VES will honor Duggal for breaking new ground in compelling storytelling through the use of stunning visual effects. Duggal has been at the forefront of embracing emerging technology to enhance the moviegoing experience, and her creative vision and inventive techniques have paved the way for future generations of filmmakers.

Duggal is an acclaimed visual effects supervisor and artist whose work has shaped numerous studio tentpole and Academy Award-nominated productions. She is known for her design skills, creative direction and visual effects work on blockbuster films such as Venom, The Hunger Games, Mission: Impossible, Men in Black II, Spider-Man 3 and Contact. She has worked extensively with Marvel Studios as VFX supervisor on projects including Doctor Strange, Thor: The Dark World, Iron Man 3, Marvel One-Shot: Agent Carter and the Agent Carter TV series. She also contributed to Sci-Tech Academy Award wins for visual effects and compositing software Flame and Inferno. Since 2012, Duggal has been consulting with Codex (and now Codex and Pix), providing guidance on various new technologies for the VFX community. Duggal is currently visual effects supervisor for Venom 2 and recently completed design and prep for Ghostbusters 2020.

In 2007, Duggal made her debut as a director on an award-winning short film to showcase the Chicago Spire, simultaneously designing all of the visual effects. Her career in movies began when she moved to Los Angeles to work as a Flame artist on Super Mario Bros. for Roland Joffe and Jake Eberts’ Lightmotive. She had previously been based in London, where she created high-resolution digital composites for Europe’s top advertising and design agencies. Her work included album covers for Elton John and the Traveling Wilburys.

Already an accomplished compositor (she began in 1985 working on early generation paint software), in 1992 Duggal worked as a Flame artist on the world’s first Flame feature production. Soon after, she was hired by Industrial Light & Magic as a supervising lead Flame artist on a number of high-profile projects (Mission: Impossible, Congo and The Indian in the Cupboard). In 1996, Duggal left ILM to join Sony Pictures Imageworks as creative director of high-speed compositing and soon began to take on the additional responsibilities of visual effects supervisor. She was production-side VFX supervisor for multiple directors during this time, including Jane Anderson (The Prize Winner of Defiance, Ohio), Peter Segal (50 First Dates and Anger Management) and Ridley Scott (Body of Lies and Matchstick Men).
In addition to feature films, Duggal has also worked on a number of design projects. In 2013 she designed the logo and the main-on-ends for Agent Carter. She was production designer for SIGGRAPH Electronic Theatre 2001, and she created the title design for the groundbreaking Technology Entertainment and Design conference (TED) in 2004.

Duggal is also a published photographer. On her most recent assignment, for the UK water charity Pump Aid, she traveled to Zimbabwe and Malawi to photo-document how access to clean water has transformed the lives of thousands of people in rural areas.
Duggal is a member of the Academy of Motion Pictures Arts and Sciences and serves on the executive committee for the VFX branch.

Whiskytree experiences growth, upgrades tools

Visual effects and content creation company Whiskytree has gone through a growth spurt that included a substantial increase in staff, a new physical space and new infrastructure.

Providing content for films, television, the Web, apps, games and VR/AR, Whiskytree’s team of artists, designers and technicians uses applications such as Autodesk Maya, Side Effects Houdini, Autodesk Arnold, Gaffer and Foundry Nuke on Linux — along with custom tools — to create computer graphics and visual effects.

To help manage its growth and the increase in data that came with it, Whiskytree recently installed Panasas ActiveStor. The platform is used to store and manage Whiskytree’s computer graphics and visual effects workflows, including data-intensive rendering and realtime collaboration using extremely large data sets for movies, commercials and advertising; work for realtime render engines and games; and augmented reality and virtual reality applications.

“We recently tripled our employee count in a single month while simultaneously finalizing the build-out of our new facility and network infrastructure, all while working on a 700-shot feature film project [The Captain],” says Jonathan Harb, chief executive officer and owner of Whiskytree. “Panasas not only delivered the scalable performance that we required during this critical period, but also delivered a high level of support and expertise. This allowed us to add artists at the rapid pace we needed with an easy-to-work-with solution that didn’t require fine-tuning to maintain and improve our workflow and capacity in an uninterrupted fashion. We literally moved from our old location on a Friday, then began work in our new facility the following Monday morning, with no production downtime. The company’s ‘set it and forget it’ appliance resulted in overall smooth operations, even under the trying circumstances.”

In the past, Whiskytree operated a multi-vendor storage solution that was complex and time consuming to administer, modify and troubleshoot. With the office relocation and rapid team expansion, Whiskytree didn’t have time to build a new custom solution or spend a lot of time tuning. It also needed storage that would grow as project and facility needs change.

Projects from the studio include Thor: Ragnarok, Monster Hunt 2, Bolden, Mother, Star Wars: The Last Jedi, Downsizing, Warcraft and Rogue One: A Star Wars Story.

Game of Thrones’ Emmy-nominated visual effects

By Iain Blair

Once upon a time, only glamorous movies could afford the time and money it took to create truly imaginative and spectacular visual effects. Meanwhile, television shows either tried to avoid them altogether or had to rely on hand-me-downs. But the digital revolution changed all that, with technological advances and new tools quickly leveling the playing field. Today, television is giving the movies a run for their money when it comes to sophisticated visual effects, as evidenced by HBO’s blockbuster series Game of Thrones.

Mohsen Mousavi

This fantasy series was recently Emmy-nominated a record-busting 32 times for its eighth and final season — including a nomination for the visually ambitious VFX of the penultimate episode, “The Bells.”

The epic mass destruction presented Scanline’s VFX supervisor, Mohsen Mousavi, and his team with many challenges. But his expertise in high-end visual effects, and his reputation for constant innovation in advanced methodology, made him a perfect fit to oversee Scanline’s VFX for the crucial last three episodes of the final season of Game of Thrones.

Mousavi started his VFX career in the field of artificial intelligence and advanced-physics-based simulations. He spearheaded designing and developing many different proprietary toolsets and pipelines for doing crowd, fluid and rigid body simulation, including FluidIT, BehaveIT and CardIT, a node-based crowd choreography toolset.

Prior to joining Scanline VFX Vancouver, Mousavi rose through the ranks of top visual effects houses, working in jobs that ranged from lead effects technical director to CG supervisor and, ultimately, VFX supervisor. He’s been involved in such high-profile projects as Hugo, The Amazing Spider-Man and Sucker Punch.

In 2012, he began working with Scanline, acting as digital effects supervisor on 300: Rise of an Empire, for which Scanline handled almost 700 water-based sea battle shots. He then served as VFX supervisor on San Andreas, helping develop the company’s proprietary city-generation software. That software and pipeline were further developed and enhanced for scenes of destruction in director Roland Emmerich’s Independence Day: Resurgence. In 2017, he served as the lead VFX supervisor for Scanline on the Warner Bros. shark thriller, The Meg.

I spoke with Mousavi about creating the VFX and their pipeline.

Congratulations on being Emmy-nominated for “The Bells,” which showcased so many impressive VFX. How did all your work on Season 4 prepare you for the big finale?
We were heavily involved in the finale of Season 4; however, the scope was far smaller. What we learned was how the collaboration worked, the nature of the show, and what the expectations were in terms of the quality of the work and what HBO wanted.

You were brought onto the project by lead VFX supervisor Joe Bauer, correct?
Right. Joe was the “client VFX supervisor” on the HBO side and was involved since Season 3. Together with my producer, Marcus Goodwin, we also worked closely with HBO’s lead visual effects producer, Steve Kullback, who I’d worked with before on a different show and in a different capacity. We all had daily sessions and conversations, a lot of back and forth, and Joe would review the entire work, give us feedback and manage everything between us and other vendors, like Weta, Image Engine and Pixomondo. This was done both technically and creatively, so no one stepped on each other’s toes if we were sharing a shot and assets. But it was so well-planned that there wasn’t much overlap.

[Editor’s Note: Here is the full list of those nominated for their VFX work on Game of Thrones — Joe Bauer, lead visual effects supervisor; Steve Kullback, lead visual effects producer; Adam Chazen, visual effects associate producer; Sam Conway, special effects supervisor; Mohsen Mousavi, visual effects supervisor; Martin Hill, visual effects supervisor; Ted Rae, visual effects plate supervisor; Patrick Tiberius Gehlen, previz lead; and Thomas Schelesny, visual effects and animation supervisor.]

What were you tasked with doing on Season 8?
We were involved as one of the lead vendors on the last three episodes and covered a variety of sequences. In episode four, “The Last of the Starks,” we worked on the confrontation between Daenerys and Cersei in front of the King’s Landing’s gate, which included a full CG environment of the city gate and the landscape around it, as well as Missandei’s death sequence, which featured a full CG Missandei. We also did the animated Drogon outside the gate while the negotiations took place.

Then for “The Bells” we were responsible for most of the Battle of King’s Landing, which included the full digital city, Daenerys’ army campsite outside the walls of King’s Landing, the gathering of soldiers in front of the King’s Landing walls, Dany’s attack on the scorpions, the city gate, streets and the Red Keep, which had some very close-up set extensions, close-up fire and destruction simulations, and a full CG crowd of various factions — armies and civilians. We also did the iconic Cleganebowl fight between The Hound and The Mountain, and Jaime Lannister’s fight with Euron at the beach underneath the Red Keep. In episode five, we received raw animation caches of the dragon from Image Engine and did the full look-dev, lighting and rendering of the final dragon in our composites.

For the final episode, “The Iron Throne,” we were responsible for the entire Daenerys speech sequence, which included a full 360-degree digital environment of the city aftermath and the Red Keep plaza filled with digital Unsullied, Dothraki and CG horses, leading into the majestic confrontation between Jon and Drogon, where the dragon reveals itself from underneath a huge pile of snow outside the Red Keep. We were also responsible for the iconic throne-melt sequence, which included some advanced simulation of highly viscous fluid and destruction of the area around the throne, finishing the dramatic sequence with Drogon carrying Dany out of the throne room and away from King’s Landing into the unknown.

Where was all this work done?
The majority of the work was done here in Vancouver, which is the biggest Scanline office. Additionally we had teams working in our Munich, Montreal and LA offices. We’re a 100% connected company, all working under the same infrastructure in the same pipeline. So if I work with the team in Munich, it’s like they’re sitting in the next room. That allows us to set up and attack the project with a larger crew and get the benefit of the 24/7 scenario; as we go home, they can continue working, and it makes us far more productive.

How many VFX did you have to create for the final season?
We worked on over 600 shots across the final three episodes, which gave us over an hour of screen time of high-end, consistent visual effects.

Isn’t an hour of screen time unusual for 600 shots?
Yes, but we had a number of shots that were really long, including some ground coverage shots of Arya in the streets of King’s Landing that were over four or five minutes long. So we had the complexity along with the long duration.

How many people were on your team?
At the height, we had about 350 artists on the project, and we began in March 2018 and didn’t wrap till nearly the end of April 2019 — so it took us over a year of very intense work.

Tell us about the pipeline specific to Game of Thrones.
Scanline has an industry-wide reputation for delivering very complex, full CG environments combined with complex simulations of all sorts of fluid dynamics and destruction, built on our simulation framework, Flowline. We had a high-end digital character and hero creature pipeline that gave the final three episodes a boost up front. What was new were the additions to our procedural city generation pipeline for the recreation of King’s Landing, making sure it could deliver both in wide-angle shots and in some extreme close-up set extensions.

How did you do that?
We used a framework we had developed for Independence Day: Resurgence, a module-based procedural city generator that leveraged some incredible scans of the historic city of Dubrovnik as the blueprint and foundation of King’s Landing. Instead of doing the modeling conventionally, you model a lot of small modules, kind of like Lego blocks. You create various windows, stones, doors, shingles and so on, and once they’re encoded in the system, you can semi-automatically generate variations of buildings on the fly. The same goes for texturing. We had procedurally generated layers of facade textures, which gave us a lot of flexibility in texturing the entire city, with full control over the level of aging and damage. We could easily decide to make a block look older without going back to square one. That’s how we could create King’s Landing with its hundreds of thousands of unique buildings.
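To make the “Lego block” idea concrete, here is a purely illustrative Python sketch (not Scanline’s proprietary system; the module names and parameters are invented) of how a small library of encoded modules plus seeded randomness can generate endless building variations, with an age parameter standing in for the procedurally layered facade weathering:

```python
# Illustrative module-based building generation, loosely mirroring the
# "Lego block" approach described above. All names and values are invented.
import random

MODULES = {
    "window": ["arched", "shuttered", "narrow"],
    "door":   ["plank", "studded", "double"],
    "roof":   ["slate", "shingle", "thatch"],
}

def generate_building(floors: int, bays: int, age: float, rng: random.Random) -> dict:
    """Assemble one building from module variations; `age` (0..1) drives
    how weathered the procedurally layered facade textures should look."""
    return {
        "roof": rng.choice(MODULES["roof"]),
        "door": rng.choice(MODULES["door"]),
        "floors": [[rng.choice(MODULES["window"]) for _ in range(bays)]
                   for _ in range(floors)],
        "weathering": age,   # blend weight for aging/damage texture layers
    }

rng = random.Random(1795)    # fixed seed: the same "city" every run
city_block = [generate_building(rng.randint(2, 4), rng.randint(3, 6),
                                rng.random(), rng) for _ in range(12)]
print(city_block[0])         # one of twelve unique-but-consistent buildings
```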

The same technology was applied to the aftermath of the city in Episode 6. We took the intact King’s Landing and ran a number of procedural collapsing simulations on the buildings to get the correct weight based on references from the bombed city of Dresden during WWII, and then we added procedurally created CG snow on the entire city.

It didn’t look like the usual matte paintings were used at all.
You’re right, and there were a lot of shots that normally would be done that way, but to Joe’s credit, he wanted to make sure the environments weren’t cheated in any way. That was a big challenge, to keep everything consistent and accurate. Even if we used traditional painting methods, it was all done on top of an accurate 3D representation with correct lighting and composition.

What other tools did you use?
We use Autodesk Maya for all our front-end departments, including modeling, layout, animation, rigging and creature effects, and we bridge the results to Autodesk 3ds Max, which encapsulates our look-dev/FX and rendering departments, powered by Flowline and Chaos Group’s V-Ray as our primary render engine, followed by Foundry’s Nuke as our main compositing package.

At the heart of our crowd pipeline we use Massive, and our creature department is driven by Ziva muscles, a collaboration we started with Ziva Dynamics for the creation of the hero Megalodon in The Meg.

Fair to say that your work on Game of Thrones was truly cutting-edge?
Game of Thrones has pushed the limit above and beyond and has effectively erased the TV/feature line. In terms of environment and effects and the creature work, this is what you’d do for a high-end blockbuster for the big screen. No difference at all.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: MPC Senior Compositor Ruairi Twohig

After studying hand-drawn animation, this artist found his way to visual effects.

NAME: NYC-based Ruairi Twohig

COMPANY: Moving Picture Company (MPC)

CAN YOU DESCRIBE YOUR COMPANY?
MPC is a global creative and visual effects studio with locations in London, Los Angeles, New York, Shanghai, Paris, Bangalore and Amsterdam. We work with clients and brands across a range of different industries, handling everything from original ideas through to finished production.

WHAT’S YOUR JOB TITLE?
I work as a 2D lead/senior compositor.

Cadillac

WHAT DOES THAT ENTAIL?
The tasks and responsibilities can vary depending on the project. My involvement with a project can begin before there’s even a script or storyboard, and we need to estimate how much VFX will be involved and how long it will take. As the project develops and the direction becomes clearer, with scripts and storyboards and concept art, we refine this estimate and schedule and work with our clients to plan the shoot and make sure we have all the information and assets we need.

Once the commercial is shot and we have an edit, the bulk of the post work begins. This can involve anything from compositing fully CG environments, dragons or spaceships to beauty and product/pack-shot touch-ups or rig removal. So my role involves a combination of overall project management and planning, but I also get into the detailed shot work and ultimately delivering the final picture. Much of the work requires a large team of people with different specializations, and those are usually the projects I find the most fun and rewarding due to the collaborative nature of the work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think the variety of the work would surprise most people unfamiliar with the industry. In a single day, I could be working on two or three completely different commercials with completely different challenges while also bidding future projects or reviewing prep work in the early stages of a current project.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I’ve been working in the industry for over 10 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING?
The VFX industry is always changing. I find it exciting to see how quickly the technology is advancing and becoming more widely accessible, cost-effective and faster.

I still find it hard to comprehend the idea of using optical printers for VFX back in the day … before my time. Some of the most interesting areas for me at the moment are the developments in realtime rendering from engines such as Unreal and Unity, and the implementation of AI/machine learning tools that might be able to automate some of the more time-consuming tasks in the future.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I remember when I was 13, my older brother — who was studying architecture at the time — introduced me to 3ds Max, and I started playing around with some very simple modeling and rendering.

I would buy these monthly magazines like 3D World, which came with demo discs for different software and some CG animation compilations. One of the issues included the short CG film Fallen Art by Tomek Baginski. At the time I was mostly familiar with Pixar’s feature animation work like Toy Story and A Bug’s Life, so watching this short film created using similar techniques but with such a dark, mature tone and story really blew me away. It was this film that inspired me to pursue animation and, ultimately, visual effects.

DID YOU GO TO FILM SCHOOL?
I studied traditional hand-drawn animation at the Dun Laoghaire Institute of Art, Design and Technology in Dublin. This was a really fun course in which we spent the first two years focusing on the craft of animation and the fundamental principles of art and design, followed by another two years in which we had a lot of freedom to make our own films. It was during these final two years of experimentation that I started to move away from traditional animation and focus more on learning CG and VFX.

I really owe a lot to my tutors, who were really supportive during that time. I also had the opportunity to learn from visiting animation masters such as Andreas Deja, Eric Goldberg and John Canemaker. Although on the surface the work I do as a compositor is very different to animation, understanding those fundamental principles has really helped my compositing work; any additional disciplines or skills you develop in your career that require an eye for detail and aesthetics will always make you a better overall artist.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Even after 10 years in the industry, I still get satisfaction from the problem-solving aspect of the job, even on the smaller tasks. I love getting involved on the more creative projects, where I have the freedom to develop the “look” of the commercial/film. But, day to day, it’s really the team-based nature of the work that keeps me going. Working with other artists, producers, directors and clients to make a project look great is what I find really enjoyable.

WHAT’S YOUR LEAST FAVORITE?
Sometimes even if everything is planned and scheduled accordingly, a little hiccup along the way can easily impact a project, especially on jobs where you might only have a limited amount of time to get the work done. So it’s always important to work in such a way that allows you to adapt to sudden changes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I used to draw all day, every day as a kid. I still sketch occasionally, but maybe I would have pursued a more traditional fine art or illustration career if I hadn’t found VFX.

Tiffany & Co.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Over the past year, I’ve worked on projects for clients such as Facebook, Adidas, Samsung and Verizon. I also worked on the Tiffany & Co. campaign “Believe in Dreams” directed by Francis Lawrence, as well as the company’s holiday campaign directed by Mark Romanek.

I also worked on Cadillac’s “Rise Above” campaign for the 2019 Oscars, which was challenging since we had to deliver four spots within a short timeframe. But it was a fun project. There was also the Michelob Ultra Robots Super Bowl spot earlier this year. That was an interesting project, as the work was completed between our LA, New York and London studios.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Last year, I had the chance to work with my friend and director Sofia Astrom on the music video for the song “Bone Dry” by Eels. It was an interesting project since I’d never done visual effects for a stop-motion animation before. This had its own challenges, and the style of the piece was very different compared to what I’m used to working on day to day. It had a much more handmade feel to it, and the visual effects design had to reflect that, which was such a change to the work I usually do in commercials, which generally leans more toward photorealistic visual effects work.

WHAT TOOLS DO YOU USE DAY TO DAY?
I mostly work with Foundry Nuke for shot compositing. When leading a job that requires a broad overview of the project and timeline management/editorial tasks, I use Nuke Studio or Autodesk Flame, depending on the requirements of the project. I also use ftrack daily for project management.

WHERE DO YOU FIND INSPIRATION NOW?
I follow a lot of incredibly talented concept artists and photographers/filmmakers on Instagram. Viewing these images/videos on a tiny phone doesn’t always do justice to the work, but the platform is so active that it’s a great resource for inspiration and finding new artists.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to run and cycle around the city when I can. During the week it can be easy to get stuck in a routine of sitting in front of a screen, so getting out and about is a much-needed break for me.

Beecham House‘s VFX take viewers back in time

Cambridge, UK-based Vine FX was the sole visual effects vendor on Gurinder Chadha’s Beecham House, a new Sunday night drama airing on ITV in the UK. Set in the India of 1795, Beecham House is the story of John Beecham (Tom Bateman), an Englishman who resigned from military service to set up as an honorable trader of the East India Company.

The series was shot at Ealing Studios and at some locations in India, with the visual effects work focusing on the Port of Delhi, the emperor’s palace and Beecham’s house. Vine FX founder Michael Illingworth assisted during development of the series and supervised his team of artists, creating intricate set extensions, matte paintings and period assets.

To make the shots believable and true to the era, the Vine FX team consulted closely with the show’s production designer and researched the period thoroughly. All modern elements — wires, telegraph poles, cars and lamp posts — had to be removed from the shoot footage, but the biggest challenge for the team was the Port of Delhi itself, a key location in the series.

Vine FX created a digital matte painting to extend the port and added numerous 3D boats and 3D people working on the docks to create a busy working port of 1795 — a complex task achieved by the expert eye of the Vine team.

“The success of this type of VFX is in its subtlety. We had to create a Delhi of 1795 that the audience believed, and this involved a great deal of research into how it would have looked, which was essential to making it realistic,” says Illingworth. “Hopefully, we managed to do this. I’m particularly happy with the finished port sequences, as originally there were just three boats.

“I worked very closely with on-set supervisor Oliver Milburn while he was on set in India so was very much part of the production process in terms of VFX,” he continues. “Oliver would send me reference material from the shoot; this is always fundamental to the outcome of the VFX, as it allows you to plan ahead and work out any potential upcoming challenges. I was working on the VFX in Cambridge while Oliver was on set in Delhi — perfect!”

Vine FX used Photoshop and Nuke as its main tools. The artists modeled assets with Maya and ZBrush and painted them using Substance Painter. They rendered with Arnold.

Vine FX is currently working on War of the Worlds for Fox Networks and Canal+, due for release next year.

Brittany Howard music video sets mood with color and VFX

The latest collaboration between Framestore and director Kim Gehrig is Brittany Howard’s debut solo music video, Stay High, which features a color grade and subtle VFX by the studio. A tribute to the Alabama Shakes lead singer’s late father, the stylized music video stars actor Terry Crews (Brooklyn Nine-Nine, The Expendables) as a man finishing a day’s work and returning home to his family.

Produced by production company Somesuch, the aim of Stay High is to present a natural and emotionally driven story that honors the singer’s father, K.J. Howard. Shot in her hometown of Nashville, the music video features Howard’s family and friends while the singer pops up in several scenes throughout the video as different characters.

The video begins with Howard’s father getting off of work at his factory job. The camera follows him on his drive home, all the while he’s singing “Stay High.” As he drives, we see images of the people and places where Howard grew up. The video ends when her dad pulls into his driveway and is met by his daughters and wife.

“Kim wanted to really highlight the innocence of the video’s story, something I kept in mind while grading the film,” says Simon Bourne, Framestore’s head of creative color, who has graded several films for the director. “The focus needed to always be on Terry, with nothing in his surroundings distracting from that, and the grade needed to reflect that idea.”

Framestore’s creative director Ben Cronin, who was also a compositor on the project along with Nuke compositor Christian Baker, adds, “From a VFX point of view, our job was all about invisible effects that highlighted the beautiful job that Ryley Brown, the film’s DP, did and to complement Kim’s unique vision.”

“We’ve worked with Kim on several commercials and music video projects, and we love collaborating because her films are always visually interesting and she knows we’ll always help achieve the ground-breaking and effortlessly cool work that she does.”


Glassbox’s virtual camera toolset for Unreal, Unity, Maya

Virtual production software company Glassbox Technologies has moved its virtual camera plugin DragonFly out of private beta and into public release. DragonFly offers professional virtual cinematography tools to filmmakers and content creators, allowing users to view character performances and scenes within computer-generated virtual environments in realtime, through the camera’s viewfinder, an external monitor or an iPad.

Available for Unreal Engine, Unity 3D and Autodesk Maya, DragonFly delivers an inclusive virtual cinematography workflow that allows filmmakers and content creators to make and test creative decisions faster and earlier in the process, whittling down production cost on projects of all scopes and sizes.

This off-the-shelf toolkit allows users to create everything from previz to postviz without the need for large teams of operators, costly hardware or proprietary tools. It is platform-agnostic and fits seamlessly into any workflow out of the box. Users can visualize and explore a CG virtual environment, then record, bookmark, create snapshots and replicate real camera movement as seamlessly as conducting a live-action shoot.

“Virtual production poses great potential for creators, but there were no off-the-shelf filming solutions available that worked out of the box,” notes co-founder/CPO Mariana Acuña. “In response, we made DragonFly: a virtual window that allows users to visualize complex sets, environments and performances through a viewfinder. Without the need for a big stage or mocap crew, it brings greater flexibility to the production and post pipeline for films, animation, immersive content, games and realtime VFX.”

The product was developed in collaboration with top Hollywood visualization and production studios, including The Third Floor, to ensure best-in-class results.

“Prior to DragonFly, each studio created its own bespoke virtual production workflow, which is costly and time-consuming per project. DragonFly makes realtime virtual production usable for all creators,” says Evelyn Cover, global R&D manager for The Third Floor. “We’re excited to collaborate with the Glassbox team to develop and test DragonFly in all kinds of production scenarios from previz to post, with astounding success.”

Glassbox’s second virtual production solution, BeeHive — a multi-platform, multi-user tool for collaborative virtual scene syncing, editing and review, currently in beta — is slated to launch later this summer.

DragonFly is now available for purchase or as a free 15-day trial from the Glassbox website. Licensing options include a permanent license costing $750 (including $250 for the first year of support and updates) and a rental option costing $420 a year.

Technicolor opens prepro studio in LA

Technicolor is opening a new studio in Los Angeles dedicated to creating a seamless pipeline for feature projects — from concept art and visualization through virtual production, production and into final VFX.

As new distribution models increase the demand for content, Technicolor Pre-Production will provide the tools, the talent and the space for creatives to collaborate from day one of their project – from helping set the vision at the start of a job to ensuring that the vision carries through to production and VFX. The result is a more efficient filmmaking process.

Technicolor Pre-Production studio is headed by Kerry Shea, an industry veteran with over 20 years of experience. She is no stranger to this work, having held executive positions at Method Studios, The Third Floor, Digital Domain, The Jim Henson Company, DreamWorks Animation and Sony Pictures Imageworks.

Kerry Shea

Credited on more than 60 feature films including The Jungle Book, Pirates of the Caribbean: Dead Men Tell No Tales and Guardians of the Galaxy Vol. 2, Shea has an extensive background in VFX and post production, as well as live action, animatronics and creature effects.

While the Pre-Production studio stands apart from Technicolor’s visual effects studios — MPC Film, Mill Film, MR. X and Technicolor VFX — it can work seamlessly in conjunction with one or any combination of them.

The Technicolor Pre-Production Studio will comprise five key departments:
– The Business Development Department will work with clients, from project budgeting to consulting on VFX workflows, to help plan and prepare projects for a smooth transition into VFX.
– The VFX Supervisors Department will offer creative supervision across all aspects of VFX on client projects, whether delivered by Technicolor’s studios or third-party vendors.
– The Art Department will work with clients to understand their vision – including characters, props, technologies, and environments – creating artwork that delivers on that vision and sets the tone for the rest of the project.
– The Virtual Production Department will partner with filmmakers to bridge the gap between them and VFX through the production pipeline. Working on the ground and on location, the department will deliver a fully integrated pipeline and shooting services with the flexibility of a small, manageable team — allowing critical players in the filmmaking process to collaborate, view and manipulate media assets and scenes across multiple locations as the production process unfolds.
– The Visualization Department will deliver visualizations that will assist in achieving on screen exactly what clients envisioned.

“With the advancements of tools and technologies, such as virtual production, filmmaking has reached an inflection point, one in which storytellers can redefine what is possible on-set and beyond,” says Shea. “I am passionate about the increasing role and influence that the tools and craft of visual effects can have on the production pipeline and the even more important role in creating more streamlined and efficient workflows that create memorable stories.”

SIGGRAPH making-of sessions: Toy Story 4, GoT, more

The SIGGRAPH 2019 Production Sessions program offers attendees a behind-the-scenes look at the making of some of the year’s most impressive VFX films, shows, games and VR projects. The 11 production sessions will be held throughout the conference week of July 28 through August 1 at the Los Angeles Convention Center.

Attendees will hear from the creators behind projects such as Toy Story 4, Game of Thrones, The Lion King and First Man.

Other highlights include:

Swing Into Another Dimension: The Making of Spider-Man: Into the Spider-Verse
This production session will explore the art and innovation behind the creation of the Academy Award-winning Spider-Man: Into the Spider-Verse. The filmmaking team behind the first-ever animated Spider-Man feature film took significant risks to develop an all-new visual style inspired by the graphic look of comic books.

Creating the Immersive World of BioWare’s Anthem
The savage world of Anthem is volatile, lush, expansive and full of unexpected characters. Bringing these aspects to life in a realtime, interactive environment presented a wealth of problems for BioWare’s technical artists and rendering engineers. This retrospective panel will highlight the team’s work, alongside reflections on innovation and the successes and challenges of creating a new IP.

The VFX of Netflix Series
From the tragic tales of orphans to a joint force of super siblings to sinister forces threatening 1980s Indiana, the VFX teams on Netflix series have delivered some of the year’s best visuals. Creatives behind A Series of Unfortunate Events, The Umbrella Academy and Stranger Things will present the work and techniques that brought these worlds and characters into being.

The Making of Marvel Studios’ Avengers: Endgame
The fourth installment in the Avengers saga is the culmination of 22 interconnected films and has drawn audiences to witness the turning point of this epic journey. SIGGRAPH 2019 keynote speaker Victoria Alonso will join Marvel Studios, Digital Domain, ILM and Weta Digital as they discuss how the diverse collection of heroes, environments, and visual effects were assembled into this ultimate, climactic final chapter.

Space Explorers — Filming VR in Microgravity
Felix & Paul Studios, along with collaborators from NASA and the ISS National Lab, share insights from one of the most ambitious VR projects ever undertaken. In this session, the team will discuss the background of how this partnership came to be before diving into the technical challenges of capturing cinematic virtual reality on the ISS.

Production Sessions are open to conference participants with Select Conference, Full Conference or Full Conference Platinum registrations. The Production Gallery can be accessed with an Experiences badge and above.

Fox Sports promotes US women’s World Cup team with VFX-heavy spots

Santa Monica creative studio Jamm worked with Wieden+Kennedy New York on the Fox Sports campaign “All Eyes on US.” Directed by Joseph Kahn out of Supply & Demand, the four spots celebrate the US women’s soccer team as it gears up for the 2019 FIFA Women’s World Cup in June.

The newest 60-second spot, All Eyes on US, features tens of thousands of screaming fans thanks to Jamm’s CG crowd work. On set, Jamm brainstormed with Kahn on how to achieve the immersive effect he was looking for. Much of the on-the-ground footage was shot using wide-angle lenses, which posed a unique set of challenges by revealing the entire environment as well as the close-up action. Through pacing, Jamm created the sense of the game occurring in realtime, with the tempo of the camera keeping step with the team moving the ball downfield.

The 30-second spot Goliath features the Jamm team’s first CG crowd shot, filling the soccer stadium with a roaring crowd. In Goliath, the entire US women’s soccer team runs toward the camera in slow motion. The shot was captured locked off but digitally manipulated via a 3D camera to create a dolly-zoom effect replicating real-life parallax; the altered perspective conveys the unsettling feeling of being an opponent as the team literally runs straight into the camera.
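
To make that move concrete, here is a rough Python sketch of the standard dolly-zoom relationship (my own illustration, not Jamm’s pipeline): as the virtual camera dollies toward the subject plane, the field of view must widen to keep that plane the same size in frame, which is exactly what shifts the background parallax. The subject width and distances are made-up values.

    import math

    def dolly_zoom_fov(subject_width, distance):
        # FOV (in degrees) that keeps subject_width filling the frame at this distance
        return math.degrees(2 * math.atan(subject_width / (2 * distance)))

    subject_width = 10.0                      # meters of field covered at the subject plane
    for d in (20.0, 10.0, 5.0, 2.5):          # virtual camera dollying in
        print(f"distance {d:5.1f} m -> FOV {dolly_zoom_fov(subject_width, d):6.2f} deg")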

On set, Jamm captured an initial Lidar scan of the stadium as a base. From there, they used that scan, along with reference photos taken on set, to build a CG stadium with accurate seating. They also extended the stadium where there were gaps, making it a full 360-degree environment. The stadium seating tools tie in with Jamm’s in-house crowd system (based on Side Effects Houdini), which allowed them to easily direct the performance of the crowd in every shot.

The Warrior focuses on Megan Rapinoe standing on the field in the rain, with a roaring crowd behind her. Whereas CG crowd simulation is typically captured with fast-moving cameras, the stadium crowd remains locked in the background of this sequence. Jamm implemented motion work and elements like confetti to make the large group of characters appear lively without detracting from Rapinoe in the foreground. Because the live-action scenes were shot in the rain, Jamm used water graphing to seamlessly blend the real-world footage and the CG crowd work.

The Finisher centers on Alex Morgan, who earned the nickname because “she’s the last thing they’ll see before it’s too late.” The team ran down the field in slow motion while a cameraman rigged with a Steadicam sprinted backwards through the goal. The footage was then sped up by 600%, giving it a realtime quality as Morgan kicks a perfect strike to the back of the net.
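
As a quick aside on those numbers, assuming “sped up by 600%” means a 6x retime, the frame mapping is simple. A hedged Python sketch, with illustrative frame values rather than the production retime:

    SPEED = 6.0                           # assumed reading of "600%"

    def source_frame(output_frame, speed=SPEED):
        # each output frame pulls a frame six times deeper into the slow-motion capture
        return output_frame * speed

    for f in range(4):
        print(f"output frame {f} <- source frame {source_frame(f):.0f}")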

Jamm used Autodesk Flame to composite the crowds and the CG ball, employing camera projections to rebuild and clean up parts of the environment, refine the skies and add stadium branding. They also used Foundry Nuke and Houdini for 3D.

The edit was via FinalCut and editor Spencer Campbell. The color grade was by Technicolor’s Tom Poole.

Veteran VFX supervisor Lindy De Quattro joins MPC Film

Long-time visual effects supervisor Lindy De Quattro has joined MPC Film in Los Angeles. Over the last two and a half decades, which included 21 years at ILM, De Quattro has worked with directors such as Guillermo Del Toro, Alexander Payne and Brad Bird. She also currently serves on the Executive Committee for the VFX branch of the Academy of Motion Picture Arts and Sciences.

De Quattro’s VFX credits include Iron Man 2, Mission: Impossible — Ghost Protocol, Downsizing and Pacific Rim, for which she won a VES Award for Outstanding Visual Effects. In addition to supervising visual effects teams, she has also provided on-set supervision.

De Quattro says she was attracted to MPC because of “their long history of exceptional high-quality visual effects, but I made the decision to come on board because of their global commitment to inclusion and diversity in the VFX industry. I want to be an active part of the change that I see beginning to happen all around me, and MPC is giving me the opportunity to do just that. They say, ‘If you can see it, you can be it.’ Girls need role models, and women and other underrepresented groups in the industry need mentors. In my new role at MPC I will strive to be both while contributing to MPC’s legacy of outstanding visual effects.”

The studio’s other VFX supervisors include Richard Stammers (Dumbo, The Martian, X-Men: Days of Future Past), Erik Nash (Avengers Assemble, Titanic), Nick Davis (The Dark Knight, Edge of Tomorrow) and Adam Valdez (The Lion King, Maleficent, The Jungle Book).

MPC Film is currently working on The Lion King, Godzilla: King of the Monsters, Detective Pikachu, Call of the Wild and The New Mutants.

Review: Red Giant’s Trapcode Suite 15

By Brady Betzel

We are now comfortably into 2019 and enjoying the Chinese Year of the Pig — or at least I am! So readers, you might remember that with each new year comes a Red Giant Trapcode Suite update. And Red Giant didn’t disappoint with Trapcode Suite 15.

Every year Red Giant adds more amazing features to its already amazing particle generator and emitter toolset, Trapcode Suite, and this year is no different. Trapcode Suite 15 is keeping tools like 3D Stroke, Shine, Starglow, Sound Keys, Lux, Tao, Echospace and Horizon while significantly updating Particular, Form and Mir.

I won’t be covering each plugin in this review, but you can check out what each one does on Red Giant’s website.

Particular 4
The bread and butter of the Trapcode Suite has always been Particular, and Version 4 continues to be a powerhouse. The biggest differences between a true 3D app like Maxon’s Cinema 4D or Autodesk Maya and Adobe After Effects (which is pseudo-3D) are features like true raytraced rendering and particle systems that interact through fluid dynamics. As I alluded to, After Effects isn’t technically a 3D app, but with plugins like Particular you can create pseudo-3D particle systems that affect, and are affected by, other particle emitters in your scene. Trapcode Suite 15, and Particular 4 in particular (all the pun intended), has evolved to another level with the addition of Dynamic Fluids, which lets particle systems with the fluid-physics engine enabled interact with one another and create mind-blowing liquid-like simulations inside of After Effects.
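
To get an intuition for what that coupling means, here is a minimal Python sketch of the general idea: two particle streams splat their velocities into a shared grid (the “fluid”), and the grid pushes back on every particle, so the streams deflect each other. This is my own toy illustration; the grid size, damping and blend weights are invented and are not Trapcode’s actual solver.

    import numpy as np

    GRID, DT, BLEND, DECAY = 32, 1.0 / 24, 0.35, 0.9

    def emit(origin, velocity, n, rng):
        pos = origin + rng.normal(0, 0.01, (n, 2))     # jittered birth positions
        vel = velocity + rng.normal(0, 0.05, (n, 2))   # jittered birth velocities
        return pos, vel

    def step(pos, vel, grid_v):
        cell = np.clip((pos * GRID).astype(int), 0, GRID - 1)
        np.add.at(grid_v, (cell[:, 0], cell[:, 1]), vel * 0.1)  # particles push the fluid
        pushed = grid_v[cell[:, 0], cell[:, 1]]                 # the fluid pushes back
        vel = (1 - BLEND) * vel + BLEND * pushed
        return np.clip(pos + vel * DT, 0.0, 0.999), vel

    rng = np.random.default_rng(1)
    grid_v = np.zeros((GRID, GRID, 2))
    p1, v1 = emit(np.array([0.2, 0.5]), np.array([0.5, 0.0]), 500, rng)    # left stream
    p2, v2 = emit(np.array([0.8, 0.5]), np.array([-0.5, 0.0]), 500, rng)   # right stream
    for _ in range(48):                                  # two seconds at 24fps
        grid_v *= DECAY                                  # crude dissipation
        p1, v1 = step(p1, v1, grid_v)
        p2, v2 = step(p2, v2, grid_v)                    # sharing the grid couples the streams
    print("mean speed after the collision:",
          np.linalg.norm(np.vstack([v1, v2]), axis=1).mean())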

What’s even more impressive is that with the Particular Designer and over 335 presets, you don’t need a master’s degree to make impressive motion graphics. While I love to work in After Effects, I don’t always have eight hours to make a fluid-dynamic particle system bounce off 3D text, or to have two systems interact with each other for a text reveal. This is where Particular 4 really pays for itself. With a little research and tutorial watching, you will be up and rendering within 30 minutes.

When I was using Particular 4, I simply wanted to recreate the Dynamic Fluid interaction I had seen in one of Red Giant’s promos: two emitters crashing into each other in a viscous fluid, then interacting. While it isn’t necessarily easy, anyone with a slightly above-beginner knowledge of After Effects will be able to do this. Apply the Particular plugin to a new solid and open the Particular Designer in Effect Controls. From there you can designate emitter type, motion, particle type, particle shadowing, particle color and dispersion types, as well as add multiple instances of emitters, adjust physics and much more.

The presets for all of these options can be accessed by clicking the “>” symbol in the upper left of the Designer interface. You can access all of the detailed settings and building “Blocks” of each of these categories by clicking the “<” in the same area. With a few hours spent watching tutorials on YouTube, you can be up and running with particle emitters and fluid dynamics. The preset emitters are pretty amazing, including my favorite, the two-emitter fluid dynamic systems that interact with one another.

Form 4
The second plugin in the Trapcode Suite 15 that has been updated is Trapcode Form 4. Form is a plugin that literally creates forms using particles that live forever in a unified 3D space, allowing for interaction. Form 4 adds the updated Designer, which makes particle grids a little more accessible and easier to construct for non-experts. Form 4 also includes the latest Fluid Dynamics update that Particular gained. The Fluid Dynamics engine really adds another level of beauty to Form projects, allowing you to create fluid-like particle grids from the 150 included presets or even your own .obj files.

My favorite settings to tinker with are Swirl and Viscosity. Using both settings in tandem can help create an ooey-gooey liquid particle grid that can interact with other Form systems to build pretty incredible scenes. To test out how .obj models worked within Form, I clicked over to www.sketchfab.com and downloaded an .obj 3D model. If you search for free, downloadable models, you can use them in your projects under Creative Commons licensing, as long as you credit the creator. When in doubt, always read the license; either way, these free models make for great practice assets.

Anyway, Form 4 allows us to import .obj files, including animated .obj sequences, as well as their textures. I found a Day of the Dead-type skull created by JMUHIST, pointed Form at the .obj and its included texture, added a couple of After Effects lights and a camera, and I was in business. Form has a great replicator feature (much like Element 3D). There are a ton of options, including fog distance under visibility and animation properties, and you can even add a null object linked to your model for quick alignment of other elements in the scene.
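
Under the hood, “pointing Form at an .obj” boils down to reading the model’s vertex positions so particles can be scattered on them. Here is a minimal, hypothetical Python reader for just the vertex lines; real .obj files also carry normals, UVs and faces, and skull.obj is a stand-in path:

    def read_obj_vertices(path):
        verts = []
        with open(path) as fh:
            for line in fh:
                if line.startswith("v "):          # vertex position lines only
                    _, x, y, z = line.split()[:4]
                    verts.append((float(x), float(y), float(z)))
        return verts

    verts = read_obj_vertices("skull.obj")         # hypothetical downloaded model
    print(f"{len(verts)} candidate particle positions")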

Mir 3
Up last is Trapcode Mir 3, which is used to create 3D terrains, objects and wireframes in After Effects. In this latest update, Mir has added the ability to import .obj models and textures. Using fractal displacement mapping, you can quickly create some amazing terrains, from mountain-like peaks to alien landscapes, and Mir is a great supplement to plugins like Video Copilot’s Element 3D for adding endless tunnels or terrains to your 3D scenes quickly and easily.
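
Fractal displacement is easy to sketch outside of After Effects, too. The following Python toy (my own illustration, not Mir’s algorithm) sums octaves of smooth value noise at doubling frequency and halving amplitude, producing the kind of heightfield a terrain displacement would use; the sizes, octave count and seed are arbitrary.

    import numpy as np

    def value_noise(size, freq, rng):
        lat = rng.random((freq + 1, freq + 1))           # coarse random lattice
        u = np.linspace(0, freq, size, endpoint=False)
        i, f = u.astype(int), u % 1.0
        f = f * f * (3 - 2 * f)                          # smoothstep easing
        fx, fy = f[None, :], f[:, None]
        a, b = lat[np.ix_(i, i)], lat[np.ix_(i, i + 1)]
        c, d = lat[np.ix_(i + 1, i)], lat[np.ix_(i + 1, i + 1)]
        return (a * (1 - fx) + b * fx) * (1 - fy) + (c * (1 - fx) + d * fx) * fy

    def fractal_heightfield(size=256, octaves=5):
        rng = np.random.default_rng(7)
        height, amp, freq = np.zeros((size, size)), 1.0, 4
        for _ in range(octaves):                         # stack the octaves
            height += amp * value_noise(size, freq, rng)
            amp, freq = amp * 0.5, freq * 2
        return height / height.max()

    terrain = fractal_heightfield()
    print("normalized height range:", terrain.min(), "to", terrain.max())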

And if you don’t own Element 3D, you will really enjoy the particle replication system: use one 3D object, then duplicate, twist, distort and animate multiple instances of it quickly. The best part about all of these Trapcode Suite tools is that they interact with After Effects’ native cameras and lighting, making for a unified animating experience (instead of animating separate camera and lighting rigs like in the old days). Two of my favorite features from the last update are the ability to texture your surfaces with quad- or triangle-based polygons, which can quickly give an 8-bit or low-poly feel, and a second-pass wireframe that adds a grid-like surface to your terrain.

Summing Up
Red Giant’s Trapcode Suite 15 is amazing. If you have a previous version of the Trapcode Suite, you’re in luck: the upgrade is “only” $199. If you need to purchase the full suite, it will cost you $999. Students get a bit of a break at $499.

If you are on the fence about it, go watch Daniel Hashimoto’s Cheap Tricks: Aquaman Underwater Effects tutorial (Part 1 and Part 2). He explains how you can use all of the Red Giant Trapcode Suite effects with other plugins like Video CoPilot’s Element 3D and Red Giant’s Universe and offers up some pro tips when using www.sketchfab.com to find 3D models.

I think I even saw him using Video CoPilot’s FX Console, a free plugin that makes accessing plugins much faster in After Effects. You may have seen his work as @ActionMovieKid on Twitter or @TheActionMovieKid on Instagram. He does some amazing VFX with his kids — he’s a must follow. Red Giant made a power move to get him to make tutorials for them! Anyway, his Aquaman Underwater Effects tutorial takes you step by step through each part of Trapcode Suite 15 in an amazing way. He makes it look a little too easy, but I guess that is a combination of his VFX skills and the Trapcode Suite toolset.

If you are excited about 3D objects, particle systems and fluid dynamics you must check out Trapcode Suite 15 and its latest updates to Particular, Mir and Form.

After I finished the Trapcode Suite 15 review, Red Giant released the Trapcode Suite 15.1 update, which includes Text and Mask Emitters for Form and Particular 4.1, an updated Designer, Shadowlet particle-type matching, Shadowlet softness and 21 additional presets.

This is a free update that can be downloaded from the Red Giant website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

Avengers: Infinity War leads VES Awards with six noms

The Visual Effects Society (VES) has announced the nominees for the 17th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Avengers: Infinity War garners the most feature film nominations with six. Incredibles 2 is the top animated film contender with five nominations, and Lost in Space leads the broadcast field with six nominations.

Nominees in 24 categories were selected by VES members via events hosted by 11 of the organization’s Sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on February 5th at the Beverly Hilton Hotel. As previously announced, the VES Visionary Award will be presented to writer/director/producer and co-creator of Westworld Jonathan Nolan. The VES Award for Creative Excellence will be given to award-winning creators/executive producers/writers/directors David Benioff and D.B. Weiss of Game of Thrones fame. Actor-comedian-author Patton Oswalt will once again host the VES Awards.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

Avengers: Infinity War

Daniel DeLeeuw

Jen Underdahl

Kelly Port

Matt Aitken

Daniel Sudick

 

Christopher Robin

Chris Lawrence

Steve Gaub

Michael Eames

Glenn Melenhorst

Chris Corbould

 

Ready Player One

Roger Guyett

Jennifer Meislohn

David Shirk

Matthew Butler

Neil Corbould

 

Solo: A Star Wars Story

Rob Bredow

Erin Dusseault

Matt Shumway

Patrick Tubach

Dominic Tuohy

 

Welcome to Marwen

Kevin Baillie

Sandra Scott

Seth Hill

Marc Chu

James Paradis

 

Outstanding Supporting Visual Effects in a Photoreal Feature 

12 Strong

Roger Nall

Robert Weaver

Mike Meinardus

 

Bird Box

Marcus Taormina

David Robinson

Mark Bakowski

Sophie Dawes

Mike Meinardus

 

Bohemian Rhapsody

Paul Norris

Tim Field

May Leung

Andrew Simmonds

 

First Man

Paul Lambert

Kevin Elam

Tristan Myles

Ian Hunter

JD Schwalm

 

Outlaw King

Alex Bicknell

Dan Bethell

Greg O’Connor

Stefano Pepin

 

Outstanding Visual Effects in an Animated Feature

Dr. Seuss’ The Grinch

Pierre Leduc

Janet Healy

Bruno Chauffard

Milo Riccarand

 

Incredibles 2

Brad Bird

John Walker

Rick Sayre

Bill Watral

 

Isle of Dogs

Mark Waring

Jeremy Dawson

Tim Ledbury

Lev Kolobov

 

Ralph Breaks the Internet

Scott Kersavage

Bradford Simonsen

Ernest J. Petti

Cory Loftis

 

Spider-Man: Into the Spider-Verse

Joshua Beveridge

Christian Hejnal

Danny Dimian

Bret St. Clair

 

Outstanding Visual Effects in a Photoreal Episode

Altered Carbon; Out of the Past

Everett Burrell

Tony Meagher

Steve Moncur

Christine Lemon

Joel Whist

 

Krypton; The Phantom Zone

Ian Markiewicz

Jennifer Wessner

Niklas Jacobson

Martin Pelletier

 

Lost in Space; Danger, Will Robinson

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Joao Sita

 

The Terror; Go For Broke

Frank Petzold

Lenka Líkařová

Viktor Muller

Pedro Sabrosa

 

Westworld; The Passenger

Jay Worth

Elizabeth Castro

Bruce Branit

Joe Wehmeyer

Michael Lantieri

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Tom Clancy’s Jack Ryan; Pilot

Erik Henry

Matt Robken

Bobo Skipper

Deak Ferrand

Pau Costa

 

The Alienist; The Boy on the Bridge

Kent Houston

Wendy Garfinkle

Steve Murgatroyd

Drew Jones

Paul Stephenson

 

The Deuce; We’re All Beasts

Jim Rider

Steven Weigle

John Bair

Aaron Raff

 

The First; Near and Far

Karen Goulekas

Eddie Bonin

Roland Langschwert

Bryan Godwin

Matthew James Kutcher

 

The Handmaid’s Tale; June

Brendan Taylor

Stephen Lebed

Winston Lee

Leo Bovell

 

Outstanding Visual Effects in a Realtime Project

Age of Sail

John Kahrs

Kevin Dart

Cassidy Curtis

Theresa Latzko

 

Cycles

Jeff Gipson

Nicholas Russell

Lauren Nicole Brown

Jorge E. Ruiz Cano

 

Dr Grordbort’s Invaders

Greg Broadmore

Mhairead Connor

Steve Lambert

Simon Baker

 

God of War

Maximilian Vaughn Ancar

Corey Teblum

Kevin Huynh

Paolo Surricchio

 

Marvel’s Spider-Man

Grant Hollis

Daniel Wang

Seth Faske

Abdul Bezrati

 

Outstanding Visual Effects in a Commercial 

Beyond Good & Evil 2

Maxime Luere

Leon Berelle

Remi Kozyra

Dominique Boidin

 

John Lewis; The Boy and the Piano

Kamen Markov

Philip Whalley

Anthony Bloor

Andy Steele

 

McDonald’s; #ReindeerReady

Ben Cronin

Josh King

Gez Wright

Suzanne Jandu

 

U.S. Marine Corps; A Nation’s Call

Steve Drew

Nick Fraser

Murray Butler

Greg White

Dave Peterson

 

Volkswagen; Born Confident

Carsten Keller

Anandi Peiris

Dan Sanders

Fabian Frank

 

Outstanding Visual Effects in a Special Venue Project

Beautiful Hunan; Flight of the Phoenix

R. Rajeev

Suhit Saha

Arish Fyzee

Unmesh Nimbalkar

 

Childish Gambino’s Pharos

Keith Miller

Alejandro Crawford

Thelvin Cabezas

Jeremy Thompson

 

DreamWorks Theatre Presents Kung Fu Panda

Marc Scott

Doug Cooper

Michael Losure

Alex Timchenko

 

Osheaga Music and Arts Festival

Andre Montambeault

Marie-Josee Paradis

Alyson Lamontagne

David Bishop Noriega

 

Pearl Quest

Eugénie von Tunzelmann

Liz Oliver

Ian Spendloff

Ross Burgess

 

Outstanding Animated Character in a Photoreal Feature

Avengers: Infinity War; Thanos

Jan Philip Cramer

Darren Hendler

Paul Story

Sidney Kombo-Kintombo

 

Christopher Robin; Tigger

Arslan Elver

Kayn Garcia

Laurent Laban

Mariano Mendiburu

 

Jurassic World: Fallen Kingdom; Indoraptor

Jance Rubinchik

Ted Lister

Yannick Gillain

Keith Ribbons

 

Ready Player One; Art3mis

David Shirk

Brian Cantwell

Jung-Seung Hong

Kim Ooi

 

Outstanding Animated Character in an Animated Feature

Dr. Seuss’ The Grinch; The Grinch

David Galante

Francois Boudaille

Olivier Luffin

Yarrow Cheney

 

Incredibles 2; Helen Parr

Michal Makarewicz

Ben Porter

Edgar Rodriguez

Kevin Singleton

 

Ralph Breaks the Internet; Ralphzilla

Dong Joo Byun

Dave K. Komorowski

Justin Sklar

Le Joyce Tong

 

Spider-Man: Into the Spider-Verse; Miles Morales

Marcos Kang

Chad Belteau

Humberto Rosa

Julie Bernier Gosselin

 

Outstanding Animated Character in an Episode or Realtime Project

Cycles; Rae

Jose Luis Gomez Diaz

Edward Everett Robbins III

Jorge E. Ruiz Cano

Jose Luis -Weecho- Velasquez

 

Lost in Space; Humanoid

Chad Shattuck

Paul Zeke

Julia Flanagan

Andrew McCartney

 

Nightflyers; All That We Have Found; Eris

Peter Giliberti

James Chretien

Ryan Cromie

Cesar Dacol Jr.

 

Spider-Man; Doc Ock

Brian Wyser

Henrique Naspolini

Sophie Brennan

William Salyers

 

Outstanding Animated Character in a Commercial

McDonald’s; Bobbi the Reindeer

Gabriela Ruch Salmeron

Joe Henson

Andrew Butler

Joel Best

 

Overkill’s The Walking Dead; Maya

Jonas Ekman

Goran Milic

Jonas Skoog

Henrik Eklundh

 

Peta; Best Friend; Lucky

Bernd Nalbach

Emanuel Fuchs

Sebastian Plank

Christian Leitner

 

Volkswagen; Born Confident; Bam

David Bryan

Chris Welsby

Fabian Frank

Chloe Dawe

 

Outstanding Created Environment in a Photoreal Feature

Ant-Man and the Wasp; Journey to the Quantum Realm

Florian Witzel

Harsh Mistri

Yuri Serizawa

Can Yuksel

 

Aquaman; Atlantis

Quentin Marmier

Aaron Barr

Jeffrey De Guzman

Ziad Shureih

 

Ready Player One; The Shining, Overlook Hotel

Mert Yamak

Stanley Wong

Joana Garrido

Daniel Gagiu

 

Solo: A Star Wars Story; Vandor Planet

Julian Foddy

Christoph Ammann

Clement Gerard

Pontus Albrecht

 

Outstanding Created Environment in an Animated Feature

Dr. Seuss’ The Grinch; Whoville

Loic Rastout

Ludovic Ramiere

Henri Deruer

Nicolas Brack

 

Incredibles 2; Parr House

Christopher M. Burrows

Philip Metschan

Michael Rutter

Joshua West

 

Ralph Breaks the Internet; Social Media District

Benjamin Min Huang

Jon Kim Krummel II

Gina Warr Lawes

Matthias Lechner

 

Spider-Man: Into the Spider-Verse; Graphic New York City

Terry Park

Bret St. Clair

Kimberly Liptrap

Dave Morehead

 

Outstanding Created Environment in an Episode, Commercial, or Realtime Project

Cycles; The House

Michael R.W. Anderson

Jeff Gipson

Jose Luis Gomez Diaz

Edward Everett Robbins III

 

Lost in Space; Pilot; Impact Area

Philip Engström

Kenny Vähäkari

Jason Martin

Martin Bergquist

 

The Deuce; 42nd St

John Bair

Vance Miller

Jose Marin

Steve Sullivan

 

The Handmaid’s Tale; June; Fenway Park

Patrick Zentis

Kevin McGeagh

Leo Bovell

Zachary Dembinski

 

The Man in the High Castle; Reichsmarschall Ceremony

Casi Blume

Michael Eng

Ben McDougal

Sean Myers

 

Outstanding Virtual Cinematography in a Photoreal Project

Aquaman; Third Act Battle

Claus Pedersen

Mohammad Rastkar

Cedric Lo

Ryan McCoy

 

Echo; Time Displacement

Victor Perez

Tomas Tjernberg

Tomas Wall

Marcus Dineen

 

Jurassic World: Fallen Kingdom; Gyrosphere Escape

Pawl Fulker

Matt Perrin

Oscar Faura

David Vickery

 

Ready Player One; New York Race

Daniele Bigi

Edmund Kolloen

Mathieu Vig

Jean-Baptiste Noyau

 

Welcome to Marwen; Town of Marwen

Kim Miles

Matthew Ward

Ryan Beagan

Marc Chu

 

Outstanding Model in a Photoreal or Animated Project 

Avengers: Infinity War; Nidavellir Forge Megastructure

Chad Roen

Ryan Rogers

Jeff Tetzlaff

Ming Pan

 

Incredibles 2; Underminer Vehicle

Neil Blevins

Philip Metschan

Kevin Singleton

 

Mortal Engines; London

Matthew Sandoval

James Ogle

Nick Keller

Sam Tack

 

Ready Player One; DeLorean DMC-12

Giuseppe Laterza

Kim Lindqvist

Mauro Giacomazzo

William Gallyot

 

Solo: A Star Wars Story; Millennium Falcon

Masa Narita

Steve Walton

David Meny

James Clyne

 

Outstanding Effects Simulations in a Photoreal Feature

Avengers: Infinity War; Titan

Gerardo Aguilera

Ashraf Ghoniem

Vasilis Pazionis

Hartwell Durfor

 

Avengers: Infinity War; Wakanda

Florian Witzel

Adam Lee

Miguel Perez Senent

Francisco Rodriguez

 

Fantastic Beasts: The Crimes of Grindelwald

Dominik Kirouac

Chloe Ostiguy

Christian Gaumond

 

Venom

Aharon Bourland

Jordan Walsh

Aleksandar Chalyovski

Federico Frassinelli

 

Outstanding Effects Simulations in an Animated Feature

Dr. Seuss’ The Grinch; Snow, Clouds and Smoke

Eric Carme

Nicolas Brice

Milo Riccarand

 

Incredibles 2

Paul Kanyuk

Tiffany Erickson Klohn

Vincent Serritella

Matthew Kiyoshi Wong

 

Ralph Breaks the Internet; Virus Infection & Destruction

Paul Carman

Henrik Fält

Christopher Hendryx

David Hutchins

 

Smallfoot

Henrik Karlsson

Theo Vandernoot

Martin Furness

Dmitriy Kolesnik

 

Spider-Man: Into the Spider-Verse

Ian Farnsworth

Pav Grochola

Simon Corbaux

Brian D. Casper

 

Outstanding Effects Simulations in an Episode, Commercial, or Realtime Project

Altered Carbon

Philipp Kratzer

Daniel Fernandez

Xavier Lestourneaud

Andrea Rosa

 

Lost in Space; Jupiter is Falling

Denys Shchukin

Heribert Raab

Michael Billette

Jaclyn Stauber

 

Lost in Space; The Get Away

Juri Bryan

Will Elsdale

Hugo Medda

Maxime Marline

 

The Man in the High Castle; Statue of Liberty Destruction

Saber Jlassi

Igor Zanic

Nick Chamberlain

Chris Parks

 

Outstanding Compositing in a Photoreal Feature

Avengers: Infinity War; Titan

Sabine Laimer

Tim Walker

Tobias Wiesner

Massimo Pasquetti

 

First Man

Joel Delle-Vergin

Peter Farkas

Miles Lauridsen

Francesco Dell’Anna

 

Jurassic World: Fallen Kingdom

John Galloway

Enrik Pavdeja

David Nolan

Juan Espigares Enriquez

 

Welcome to Marwen

Woei Lee

Saul Galbiati

Max Besner

Thai-Son Doan

 

Outstanding Compositing in a Photoreal Episode

Altered Carbon

Jean-François Leroux

Reece Sanders

Stephen Bennett

Laraib Atta

 

The Handmaid’s Tale; June

Winston Lee

Gwen Zhang

Xi Luo

Kevin Quatman

 

Lost in Space; Impact; Crash Site Rescue

David Wahlberg

Douglas Roshamn

Sofie Ljunggren

Fredrik Lönn

 

Silicon Valley; Artificial Emotional Intelligence; Fiona

Tim Carras

Michael Eng

Shiying Li

Bill Parker

 

Outstanding Compositing in a Photoreal Commercial

Apple; Unlock

Morten Vinther

Michael Gregory

Gustavo Bellon

Rodrigo Jimenez

 

Apple; Welcome Home

Michael Ralla

Steve Drew

Alejandro Villabon

Peter Timberlake

 

Genesis; G90 Facelift

Neil Alford

Jose Caballero

Joseph Dymond

Greg Spencer

 

John Lewis; The Boy and the Piano

Kamen Markov

Pratyush Paruchuri

Kalle Kohlstrom

Daniel Benjamin

 

Outstanding Visual Effects in a Student Project

Chocolate Man

David Bellenbaum

Aleksandra Todorovic

Jörg Schmidt

Martin Boué

 

Proxima-b

Denis Krez

Tina Vest

Elias Kremer

Lukas Löffler

 

Ratatoskr

Meike Müller

Lena-Carolin Lohfink

Anno Schachner

Lisa Schachner

 

Terra Nova

Thomas Battistetti

Mélanie Geley

Mickael Le Mezo

Guillaume Hoarau

VFX studio Electric Theatre Collective adds three to London team

London visual effects studio Electric Theatre Collective has added three to its production team: Elle Lockhart, Polly Durrance and Antonia Vlasto.

Lockhart brings with her extensive CG experience, joining from Touch Surgery where she ran the Johnson & Johnson account. Prior to that she worked at Analog as a VFX producer where she delivered three global campaigns for Nike. At Electric, she will serve as producer on Martini and Toyota.

Vlasto joins Electric to work with clients such as Mercedes, Tourism Ireland and Tui. She comes from 750MPH where, over a four-year period, she served as producer on Nike, Great Western Railway, VW and Amazon, to name but a few.

At Electric, Polly Durrance will serve as producer on H&M, TK Maxx and Carphone Warehouse. She joins from Unit, where she helped launch their in-house Design Collective and worked with clients such as Lush, Pepsi and Thatchers Cider. Prior to Unit, Durrance was at Big Buoy, where she produced work for Jaguar Land Rover, giffgaff and Red Bull.

Recent projects at the studio, which also has an office in Santa Monica, California, include Tourism Ireland Capture Your Heart and Honda Palindrome.

Main Image: (L-R) Elle Lockhart, Antonia Vlasto and Polly Durrance.

Milk VFX provides 926 shots for YouTube’s Origin series

London’s Milk VFX, known for its visual effects work on Adrift, Annihilation and Altered Carbon, has just completed production on YouTube Premium’s new sci-fi thriller original series, Origin.

Milk created all 926 VFX shots for Origin in 4K, encompassing a wide range of VFX work, in a four-month timeframe. Milk executed rendering entirely in the cloud (via the AWS Cloud Platform), allowing the team to scale alongside its current roster of projects, which includes Amazon’s Good Omens and the feature film Four Kids and It.

VFX supervisor and Milk co-founder Nicolas Hernandez supervised the entire roster of VFX work on Origin. Milk also supervised the VFX shoot on location in South Africa.

“As we created all the VFX for the 10-episode series it was even more important for us to be on set,” says Hernandez. “As such, our VFX supervisor Murray Barber and onset production manager David Jones supervised the Origin VFX shoot, which meant being based at the South Africa shoot location for several months.”

The series is from Left Bank Pictures, Sony Pictures Television and Midnight Radio in association with China International Television Corporation (CiTVC). Created by Mika Watkins, Origin stars Tom Felton and Natalia Tena and will premiere on 14 November on YouTube Premium.

“The intense challenge of delivering and supervising a show on the scale of Origin — 900 4K shots in four months — was not only helped by our recent expansion and the use of the cloud for rendering, but was largely due to the passion and expertise of the Milk Origin team in collaboration with Left Bank Pictures,” says Milk CEO and co-founder Will Cohen.

In terms of tools, Milk used Autodesk Maya, Side Effects Houdini, Foundry’s Nuke and Mari, Shotgun, Photoshop, Deadline for render-farm management and Arnold for rendering, along with a variety of in-house tools. Hardware includes HP Z-series workstations and Nvidia graphics. Storage was Pixitmedia’s PixStor.

The series, from director Paul W.S. Anderson and the producers of The Crown and Lost, follows a group of outsiders who find themselves abandoned on a ship bound for a distant land. Now they must work together for survival, but quickly realize that one of them is far from who they claim to be.

 

Sony Imageworks provides big effects, animation for Warner’s Smallfoot

By Randi Altman

The legend of Bigfoot: a giant, hairy two-legged creature roaming the forests and giving humans just enough of a glimpse to freak them out. Sightings have been happening for centuries with no sign of slowing down — seriously, Google it.

But what if that story was turned around, and it was Bigfoot who was freaked out by a Smallfoot (human)? Well, that is exactly the premise of the new Warner Bros. film Smallfoot, directed by Karey Kirkpatrick. It’s based on the book “Yeti Tracks” by Sergio Pablos.

Karl Herbst

Instead of a human catching a glimpse of the mysterious giant, a yeti named Migo (Channing Tatum) sees a human (James Corden) and tells his entire snow-filled village about the existence of Smallfoot. Of course, no one believes him so he goes on a trek to find this mythical creature and bring him home as proof.

Sony Pictures Imageworks was tasked with all of the animation and visual effects work on the film, while Warner Animation Group did all of the front-end work — adapting the script, creating the production design, editing, directing, producing and more. We reached out to Imageworks VFX supervisor Karl Herbst (Hotel Transylvania 2) to find out more about creating the animation and effects for Smallfoot.

The film has a Looney Tunes-type feel with squash and stretch. Did this provide more freedom or less?
In general, it provided more freedom since it allowed the animation team to really have fun with gags. It also gave them a ton of reference material to pull from and come up with new twists on older ideas. Once out of animation, depending on how far the performance was pushed, other departments — like the character effects team — would have additional work due to all of the exaggerated movements. But all of the extra work was worth it because everyone really loved seeing the characters pushed.

We also found that as the story evolved, Migo’s journey became more emotionally driven; we needed to find a style that also let the audience truly connect with what he was going through. We brought in a lot more subtlety, and a more truthful physicality, to the animation when needed. As a result, we have these incredibly heartfelt performances and moments that would feel right at home in an old Road Runner short, yet it all still feels like part of the same world, with these truly believable characters at the center of it.

Was scale between such large and small characters a challenge?
It was one of the first areas we wanted to tackle, since the look of the yeti fur next to a human was really important to the filmmakers. In the end, we found that the thickness and fidelity of the yeti hair had to be very high so you could see each hair next to the hairs of the humans.

It also meant the rigs for the humans and yetis had to be flexible enough to scale as needed, so in moments where they are very close together they did not feel so disproportionate to each other. Everything in our character pipeline, from animation down to lighting, had to be flexible in dealing with these scale changes. Even things like the subsurface scattering in the skin had dials to deal with moments when Percy, or any human character, was scaled up or down in a shot.

How did you tackle the hair?
We updated a couple of key areas in our hair pipeline, starting with how we build our hair. In the past, we would make curves that looked more like small clumps of hairs. In this case, we made each curve a single strand of hair. To shade this hair in a way that gave artists better control over the look, our development team created a new hair shader that uses true multiple scattering within the hair.

We then extended that hair shading model to add control over the distribution around the hair fiber to model the effect of animal hair, which tends to scatter differently than human hair. This gave artists the ability to create lots of different hair looks, which were not based on human hair, as was the case with our older models.

Was rendering so many furry characters on screen at a time an issue?
Yes. In the past this would have been hard to shade all at once, mostly due to our reliance on opacity to create the soft shadows needed for fur. With the new shading model we were no longer using opacity at all, so the number of rays needed to resolve the hair was lower than before. But we now needed to resolve the aliasing caused by the sheer number of fine hairs (9 million for LeBron James’ Gwangi).

We developed a few other new tools within our version of the Arnold renderer to help with aliasing and render time in general. The first was adaptive sampling, which allowed us to raise the anti-aliasing samples drastically: some pixels would use only a few samples while others would use very high sampling, whereas in the past all pixels got the same number. This focused our render times where we needed them, helping to reduce overall rendering. Our development team also added the ability to pick a render up from its previous point. This meant we could do all of our lighting work at a lower quality level, get creative approval from the filmmakers and then pick up the renders to bring them to full quality without losing the time already spent.
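
For readers curious what that looks like in practice, here is a minimal Python sketch of the general adaptive-sampling scheme Herbst describes: clean pixels stop after a few samples, while noisy ones keep accumulating until the error estimate falls below a threshold. The shade() stand-in and the thresholds are my own illustration, not Imageworks’ modified Arnold.

    import numpy as np

    rng = np.random.default_rng(0)

    def shade(px):
        noise = 0.4 if px > 8 else 0.05      # pretend the right half of the row is furry
        return 0.5 + rng.normal(0.0, noise)  # one camera-ray sample

    def render_pixel(px, min_s=4, max_s=256, tol=0.01):
        samples = [shade(px) for _ in range(min_s)]
        while len(samples) < max_s:
            # stop once the standard error of the mean is small enough
            if np.std(samples, ddof=1) / np.sqrt(len(samples)) < tol:
                break
            samples.append(shade(px))
        return np.mean(samples), len(samples)

    counts = [render_pixel(x)[1] for x in range(16)]
    print("samples per pixel:", counts)      # low counts on the left, high on the right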

What tools were used for the hair simulations specifically, and what tools did you call on in general?
We used Maya and the Nucleus solvers for all of the hair simulations, but developed tools over them to deal with so much hair per character and so many characters on screen at once. The simulation for each character was driven by their design and motion requirements.

The Looney Tunes-inspired design and motion created a challenge around how to keep hair simulations from breaking with all of the quick and stretched motion while being able to have light wind for the emotional subtle moments. We solved all of those requirements by using a high number of control hairs and constraints. Meechee (Zendaya) used 6,000 simulation curves with over 200 constraints, while Migo needed 3,200 curves with around 30 constraints.

Stonekeeper (Common) was the most complex of the characters, with long braided hair on his head, a beard, shaggy arms and a cloak made of stones. He required a cloth simulation pass, a rigid-body simulation for the stones and hair simulated on top of the stones. Our in-house tool, Kami, builds all of the hair at render time and also lets us add procedurals to the hair at that point. We relied on those procedurals to create the many varied hair looks for all of the generic yetis needed to fill the village.
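
Building hair at render time from a sparse set of simulated curves generally means interpolating dense render hairs from nearby guides. Here is a minimal Python sketch of that idea; this is my own toy, not Kami itself, and the counts and noise amplitudes are invented.

    import numpy as np

    rng = np.random.default_rng(3)
    N_GUIDES, N_RENDER, POINTS = 50, 5000, 8

    # guide curves: root positions on a patch, each with POINTS samples along its length
    roots = rng.random((N_GUIDES, 2))
    guides = roots[:, None, :] + rng.normal(0, 0.02, (N_GUIDES, POINTS, 2)).cumsum(axis=1)

    render_roots = rng.random((N_RENDER, 2))
    hairs = np.empty((N_RENDER, POINTS, 2))
    for i, r in enumerate(render_roots):
        d = np.linalg.norm(roots - r, axis=1)
        nearest = np.argsort(d)[:3]                      # three closest guides
        w = 1.0 / (d[nearest] + 1e-6)
        w /= w.sum()                                     # inverse-distance weights
        blended = (w[:, None, None] * guides[nearest]).sum(axis=0)
        # re-root the blended shape at this follicle and add per-hair variation
        hairs[i] = r + (blended - blended[0]) + rng.normal(0, 0.002, (POINTS, 2))

    print(hairs.shape)   # (5000, 8, 2): dense hair interpolated from 50 simulated guides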

How many different types of snow did you have?
We created three different snow systems for environmental effects. The first was a particle simulation of flakes for near-ground detail. The second was volumetric effects to create lots of atmosphere in the backgrounds that had texture and movement. We used this on each of the large sets and then stored those so lighters could pick which parts they wanted in each shot. To also help with artistically driving the look of each shot, our third system was a library of 2D elements that the effects team rendered and could be added during compositing to add details late in shot production.

For ground snow, we had different systems based on the needs of each shot. For shallow footsteps, we displaced the ground surface and added little pieces of geometry for crumble detail around the prints. This could be used in the foreground or background.
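
A shallow footprint like that reduces to a simple heightfield edit: press the surface down under a foot-shaped stamp and raise a small crumble rim around its edge. A hedged Python sketch, with made-up sizes and depths:

    import numpy as np

    snow = np.full((64, 64), 1.0)                          # flat snow, one unit deep

    def stamp_footprint(height, cx, cy, rx, ry, depth=0.3, rim=0.05):
        ys, xs = np.mgrid[0:height.shape[0], 0:height.shape[1]]
        d = ((xs - cx) / rx) ** 2 + ((ys - cy) / ry) ** 2  # normalized ellipse distance
        height[d < 1.0] -= depth                           # press the print down
        height[(d >= 1.0) & (d < 1.6)] += rim              # raised crumble around the edge
        return height

    snow = stamp_footprint(snow, cx=20, cy=32, rx=4, ry=8)
    snow = stamp_footprint(snow, cx=34, cy=36, rx=4, ry=8) # second step
    print("height range:", snow.min(), "to", snow.max())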

For heavy interactions, like tunneling or sliding in the snow, we developed a new tool we called Katyusha. This new system combined rigid body destruction with fluid simulations to achieve all of the different states snow can take in any given interaction. We then rendered these simulations as volumetrics to give the complex lighting look the filmmakers were looking for. The snow, being in essence a cloud, allowed light transport through all of the different layers of geometry and volume that could be present at any given point in a scene. This made it easier for the lighters to give the snow its light look in any given lighting situation.

Was there a particular scene or effect that was extra challenging? If so, what was it and how did you overcome it?
The biggest challenge to the film as a whole was the environments. The story was very fluid, so design and build of the environments came very late in the process. Coupling that with a creative team that liked to find their shots — versus design and build them — meant we needed to be very flexible on how to create sets and do them quickly.

To achieve this, we began by breaking the environments into a subset of source shapes that could be combined in any fashion to build Yeti Mountain, Yeti Village and the surrounding environments. Surfacing artists then created materials that could be applied to any set piece, allowing for quick creative decisions about what was rock, snow and ice, and creating many different looks. All of these materials were created using PatternCreate networks as part of our OSL shaders. With them, we could heavily leverage portable procedural texturing between assets, making location construction quicker, more flexible and easier to dial.

To get the right snow look for all levels of detail needed, we used a combination of textured snow, modeled snow and a simulation of geometric snowfall, which all needed to shade the same. For the simulated snowfall we created a padding system that could be run at any time on an environment giving it a fresh coating of snow. We did this so that filmmakers could modify sets freely in layout and not have to worry about broken snow lines. Doing all of that with modeled snow would have been too time-consuming and costly. This padding system worked not only in organic environments, like Yeti Village, but also in the Human City at the end of the film. The snow you see in the Human City is a combination of this padding system in the foreground and textures in the background.

Tom Cruise in Mission: Impossible — Fallout, directed by Christopher McQuarrie.

Mission: Impossible — Fallout writer/director Christopher McQuarrie

By Iain Blair

It’s hard to believe, but it’s been 22 years since Tom Cruise first launched the Mission: Impossible franchise. Since then, it’s become a global cultural phenomenon that’s grossed more than $2.8 billion, making it one of the most successful series in movie history.

With Mission: Impossible — Fallout, Cruise reprises his role of Impossible Missions Force (IMF) team leader Ethan Hunt for the sixth time. And writer/director/producer Christopher McQuarrie, who directed the series’ previous film Mission: Impossible — Rogue Nation, also returns. That makes him the first filmmaker ever to return to direct a second film in a franchise where one of its signature elements is that there’s been a different director for every movie.

Christopher McQuarrie

In the latest twisty adventure, Hunt and his IMF team (Alec Baldwin, Simon Pegg, Ving Rhames), along with some familiar allies (Rebecca Ferguson, Michelle Monaghan), find themselves in a race against time to stop a nuclear bomb disaster after a mission gone wrong. The film, which also stars Henry Cavill, Angela Bassett, Sean Harris and Vanessa Kirby, features a stellar team behind the camera as well, including director of photography Rob Hardy, production designer Peter Wenham, editor Eddie Hamilton, visual effects supervisor Jody Johnson and composer Lorne Balfe.

In 1995, McQuarrie got his start writing the script for The Usual Suspects, which won him the Best Original Screenplay Oscar. In 2000, he made his directorial debut with The Way of the Gun. Then in 2008 he reteamed with Usual Suspects director Bryan Singer, co-writing the WWII film Valkyrie, starring Tom Cruise. He followed that up with his 2010 script for The Tourist, then two years later, he worked with Cruise again on Jack Reacher, which he wrote and directed.

I recently talked with the director about making the film, dealing with all the visual effects and the workflow.

How did you feel when Tom asked for you to come back and do another MI film?
I thought, “Oh no!” In fact, when he asked me to do Rogue Nation, I was very hesitant because I’d been on the set of Ghost Protocol, and I saw just how complicated and challenging these films are. I was terrified. So after I’d finished Rogue, I said to myself, “I feel really sorry for the poor son-of-a-bitch who does the next one.” After five movies, I didn’t think there was anything left to do, but the joke turned out to be on me!

What’s the secret of its continuing appeal?
First off, Tom himself. He’s always pushing himself and entertaining the audience with stuff they’ve never seen before. Then it’s all about character and story. The emphasis is always on that and the humanity of these characters. On every film, and with the last two we’ve done together, he’s learned how much deeper you can go with that and refined the process. You’re always learning from the audience as well. What they want.

How do you top yourself and make this different from the last one?
To make it different, I replaced my core crew — new DP, new composer and so on — and went for a different visual language. My intention on both films was not to even try to top the previous one. So when we started this I told Tom, “I just want to place somewhere in the Top 6 of Mission: Impossible films. I’m not trying to make the greatest action film ever.”

You say that, but it’s stuffed full of nail-biting car chases and really ambitious action sequences.
(Laughs) Well, at the same time you’re always trying to do something different from the other films in the franchise, so in Rogue I had this idea for a female counterpart for Tom — Ilsa (Rebecca Ferguson) was a more dynamic love interest. I looked at the other five films and realized that the biggest action scene of any of those films had not come in the third act. So it was a chance to create the biggest and most climactic third act — a huge team sequence that involved everyone. That was the big goal. But we didn’t set out to make this giant movie, and it wasn’t till we began editing that we realized just how much action there is.

Women seem to have far larger roles this time out.
That was very intentional from the start. In my earliest talks with Tom, we discussed the need to resolve the Julia (Michelle Monaghan) character and find closure to that story. So we had her and Rebecca, and then Angela Bassett came on board to replace Alec Baldwin’s character at the CIA after he moves to IMF, and it grew from there. I had an idea for the White Widow (Vanessa Kirby) character, and we just stayed open to all possibilities and the idea that these strong women, who own all the scenes they’re in, throw Ethan off balance all the time.

How early did you integrate post into the shoot?
Right at the start, since we had so many visual effects. We also had a major post challenge as Tom broke his ankle doing a rooftop chase stunt in London. So we had to shut down totally for six weeks and re-arrange the whole schedule to accommodate his recovery, and even when he got back on the movie his ankle wasn’t really healed enough.

We then had to shoot a lot of stuff piecemeal, and I knew that, in order to make the release date, we had to start cutting right away once we stopped for those six weeks. But that also gave me a chance to re-evaluate it all, since you don’t really know the film you’ve shot until you get in the edit room, and that let me do course corrections I couldn’t have done otherwise. So I essentially ended up doing re-shoots while still shooting the film. I was able to rewrite the second act, and it also meant that we had a finished cut done just six days after we wrapped. And we were able to test that movie four times and keep fine-tuning it.

Where did you do the post?
All in London, around Soho, and we did the sound at De Lane Lea.

Like Rogue, this was edited by Eddie Hamilton. Was he on the set?
Yes, and he’s invaluable because he’s got a very good eye, is a great storyteller and has a great sense of the continuity. He can also course-correct very quickly and let me know when we need to grab another shot. On Rogue Nation, he also did a lot of 2nd unit stuff, and he has great skills with the crew. We didn’t really have a 2nd unit on this one, which I think is better because it can get really chaotic with one. Basically, I love the edit, and I love being in the editing room and working hand in hand with my editor, shot for shot, and communicating all the time during production. It was a great collaboration.

There’s obviously a huge number of visual effects shots in the film. How many are there?
I’d say well over 3,000, and our VFX supervisor Jody Johnson at Double Negative did an amazing job. DNeg, Lola, One of Us, Bluebolt and Cheap Shot all worked on them. There was a lot of rig removal and cleanup along with the big set pieces.

Mission: Impossible — Fallout

What was the most difficult VFX sequence/shot to do and why?
The big “High Altitude Low Opening,” or HALO sequence, where Tom jumps out of a Boeing Globemaster at 25,000 feet was far and away the most difficult one. We shot part of it at an RAF base in England, but then with Tom’s broken ankle and the changed schedule, we ended up shooting some of it in Abu Dhabi. Then we had to add in the Paris backdrop and the lightning for the storm, and to maintain the reality we had to keep the horizon in the shot. As the actors were falling at 160 MPH toward the Paris skyline, all of those shots had to be tracked by hand. No computer could do it, and that alone took hundreds of people working on it for three months to complete. It was exhausting.

Can you talk about the importance of music and sound to you as a filmmaker?
It’s so vital, and for me it’s always a three-pronged approach — music, sound and silence, and then the combination of all three elements. It’s very important to maintain the franchise aesthetic, but I wanted to have a fresh approach, so I brought in composer Lorne Balfe, and he did a great job.

The DI must have been vital. How did that process help?
We did it at Molinare in London with colorist Asa Shoul, who is just so good. I’m fairly hands on, especially as the DP was off on another project by the time we did the DI, although he worked on it with Asa as well. We had a big job dealing with all the stuff we shot in New Zealand, bringing it up to the other footage. I actually try to get the film as close as possible to what I want on the day, and then use the DI as a way of enhancing and shaping that, but I don’t actually like to manipulate things too much, although we gave all the Paris stuff this sort of hazy, sweaty look and feel which I love.

What’s next?
A very long nap.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Mark Thorley joins Mill Film Australia as MD

Mill Film in Australia, a Technicolor VFX studio, has named Mark Thorley as managing director. His appointment comes in the wake of the February launch of Mill Film in Adelaide, Australia.

Thorley brings with him more than 15 years of executive experience, including time at Lucasfilm Singapore, where he oversaw studio operations and production strategies. Prior to that, Thorley spent nine years as head of production at Animal Logic, in both its Los Angeles and Sydney locations. He has also held senior positions at Screen Queensland and Omnicom.

Throughout his career, Thorley has received credits on numerous blockbuster feature films, including Kong: Skull Island, Rogue One, Jurassic World and Avengers: Age of Ultron. He will drive all aspects of VFX production, client relations and business development for Australia, reporting to the global head of Mill Film, Lauren McCallum.

Milk provides VFX for Adrift, adds new head of production Kat Mann

As it celebrates its fifth anniversary, Oscar-, Emmy- and BAFTA-winning VFX studio Milk has taken an additional floor at its London location on Clipstone Street. This visual effects house has worked on projects such as Annihilation, Altered Carbon and Fantastic Beasts and Where to Find Them.

Milk’s expansion increases its artist capacity to 250, and includes two 4K FilmLight Baselight screening rooms and a dedicated client area. The studio has upgraded its pipeline, with all its rendering requirements (along with additional storage and workstation capacity) now entirely in the cloud, allowing full scalability for its roster of film and TV projects.

Annihilation

Milk has just completed production as the main vendor on STXfilms’ new feature film Adrift, the Baltasar Kormákur-directed true story of survival at sea, starring Shailene Woodley and Sam Claflin. The Milk team created all the major water and storm sequences for the feature, which were rendered entirely in the cloud.

Milk has just begun work on new projects, including Four Kids and It, Dan Films/Kindle Entertainment’s upcoming feature based on Jacqueline Wilson’s modern-day variation on E. Nesbit’s classic 1902 novel Five Children and It, for which the Milk team will create the protagonist CG sand fairy character. Milk is also in production as sole VFX vendor on the six-part TV adaptation of Neil Gaiman and Terry Pratchett’s Good Omens for Amazon/BBC.

In other news, Milk has brought on VFX producer Kat Mann as head of production. She will oversee all aspects of the studio’s production at its London premises and its Cardiff location. Mann has held senior production roles at ILM and Weta Digital, with credits including Jurassic World: Fallen Kingdom, Thor: The Dark World and Avatar. Milk’s former head of production, Clare Norman, has been promoted to business development director.

Milk was founded by a small team of VFX supervisors and producers in June 2013.

Framestore London adds joint heads of CG

Framestore has named Grant Walker and Ahmed Gharraph as joint heads of CG at its London studio. The two will lead the company’s advertising, television and immersive work alongside head of animation Ross Burgess.

Gharraph has returned to Framestore after a two-year stint at ILM, where he was lead FX artist on Star Wars: The Last Jedi, earning a VES nomination for Outstanding Effects Simulations in a Photoreal Feature. His advertising credits as a CG supervisor include Mog’s Christmas Calamity, Sainsbury’s 2015 festive campaign, and Shell V-Power Shapeshifter, directed by Carl Erik Rinsch.

Walker joined Framestore in 2009, and in his time at the company he has worked across film, advertising and television, building a portfolio as a CG artist with campaigns, including Freesat’s VES-nominated Sheldon. He was also instrumental in Framestore’s digital recreation of Audrey Hepburn in Galaxy’s 2013 campaign Chauffeur for AMV BBDO. Most recently, he was BAFTA-nominated for his creature work in the Black Mirror episode, “Playtest.”

Lindsay Seguin upped to EP at NYC’s FuseFX

Visual effects studio FuseFX has promoted Lindsay Seguin to executive producer in the studio’s New York City office. Seguin is now responsible for overseeing all client relationships at the FuseFX New York office, acting as a strategic collaborator for current and future productions spanning television, commercial and film categories. The company also has an office in LA.

Seguin, who first joined FuseFX in 2014, was previously managing producer. During her time with the company, she has worked with a number of high-profile client productions, including The Blacklist, Luke Cage, The Punisher, Iron Fist, Mr. Robot, The Get Down and the feature film American Made.

“Lindsay has played a key role in the growth and success of our New York office, and we’re excited for her to continue to forge partnerships with some of our biggest clients in her new role,” says Joseph Bell, chief operating officer and executive VP of production at FuseFX.

“We have a really close-knit team that enjoys working together on exciting projects,” Seguin added about her experience working at FuseFX. “Our crew is very savvy and hardworking, and they manage to maintain a great work/life balance, even as the studio delivers VFX for some of the most popular shows on television. Our goal is to have a healthy work environment and produce awesome visual effects.”

Seguin is a member of the Visual Effects Society and the Post New York Alliance. Prior to making the transition to television and feature work, her experience was primarily in national broadcast and commercial projects, which included campaigns for Wendy’s, Garnier, and Optimum. She is a graduate of Penn State University with a degree in telecommunications. Born in Toronto, Seguin is a dual citizen of Canada and the United States.

Review: HP’s lower-cost DreamColor Z24x display

By Dariush Derakhshani

So, we all know how important a color-accurate monitor is in making professional-level graphics, right? Right?!? Even at the most basic level, when you're scouring online stores for the perfect watch band for the smart watch you got as a holiday present, you want the orange band you see in the online ad to be what you get when it arrives a few days later. Even if your wife thinks orange doesn't suit you and makes you look like "you're trying too hard."

Especially as a content developer, you want to know that what you're looking at is an accurate representation of the image. Ever walk into a Best Buy and see multiple screens showing the same content, but with wildly different color? You can't have that discrepancy working as a pro, especially in collaboration; you need color accuracy. In my own experience, that position has been filled by HP's 10-bit DreamColor displays for many years now, but not everyone is awash in bitcoins, and a price tag of over $1,200 is sometimes hard to justify, even for a studio professional.

Enter HP's DreamColor Z24x display at half the price, coming in around $550 online. Yes, DreamColor for half the cost. That's pretty significant. For the record, I haven't used a 24-inch monitor since the dark ages, when Lost was the hot TV show. I've been fortunate enough to be running at 27 inches and higher, so there was a little shock when I started using the Z24x HP sent me for review. But this is something I quickly got used to.

With my regular 32-inch 4K display still my primary (so I can fit loads of windows all over the place), I used this DreamColor screen as my secondary display, primarily to check output for my Adobe After Effects comps and Adobe Premiere Pro edits, and to hold my render view window as I develop shaders and lighting in Autodesk Maya. I felt comfortable knowing the images I shared with my colleagues across town would be seen as I intended them, leveling the playing field when working collaboratively (as long as everyone is on the same LUT and color space). Speaking of color spaces, the Z24x hits 100% of sRGB, 99% of AdobeRGB and 96% of DCI P3, which is just slightly under HP's Z27x DreamColor. It is, however, slightly faster, with a 6ms response rate.

The Z24x has a 24-inch IPS panel from LG that exhibits color in 10-bit, like its bigger 27-inch Z27x sibling. This gives you over a billion colors, which I have personally verified by counting them all (that was one long weekend, I can tell you). Unlike the highest-end DreamColor screens, though, the Z24x dithers up from 8-bit to 10-bit (a technique called 8-bit+FRC). This means it's better than an 8-bit color display, for sure, but not quite up to true 10-bit, making it color accurate but not color critical. HP's implementation of dithering is quite good when subjectively compared to my full 10-bit main display. Frankly, a lot of screens that claim 10-bit may actually be 8-bit+FRC anyway!
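
For the curious, the billion-color claim is simple bit-depth arithmetic; here's a quick back-of-envelope sketch (my own math, not HP's spec sheet):

```python
# Bit-depth math: total colors = (2^bits per channel)^3 channels.
for name, bits in (("8-bit", 8), ("10-bit", 10)):
    colors = (2 ** bits) ** 3  # R, G and B channels
    print(f"{name} panel: {colors:,} colors")

# 8-bit panel:  16,777,216 colors
# 10-bit panel: 1,073,741,824 colors -- the "over a billion" figure
```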

While the Z27x gives you 2560×1440, as you'd expect of most 27-inch displays if not full-on 4K, the Z24x sits at a comfortable 1920×1200, just enough for a full 1080p image and a little room for a slider or info bar. Being the res snob that I am, I had wondered if that was just too low, but at 24 inches I don't think you would want a higher resolution, even if you're sitting only 14 inches away from it. And this is a sentiment echoed by the folks at HP, who consulted with many of their professional clients to build this display. That gives a pixel density of about 94PPI, a bit lower than the 109PPI of the Z27x. This density is about the same as a 1080p HD display at 27 inches, so it's still crisp and clean.
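
If you want to check the density math yourself, pixel density is just the diagonal pixel count divided by the diagonal size; a small sketch, using the published resolutions and panel sizes:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 24)))   # 94  -- Z24x
print(round(ppi(2560, 1440, 27)))   # 109 -- Z27x
```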

Viewing angles are good at about 178 degrees, and the screen is matte, with an anti-glare coating, making it easier to stare at without blinking for 10 hours at a clip, as digital artists usually do. This HP's coating was more matte than my primary display's, yet it still gave me richer blacks, which I liked to see.

Connection options are fairly standard, with two DisplayPorts, one HDMI and one dual-link DVI for anyone still living in the past. You also get four USB ports and an analog 3.5mm audio jack if you want to drive some speakers, since you can't do that from your phone anymore (Apple, I'm looking at you).

Summing Up
So while 24 inches is a bit small for my taste in a display, I am seriously impressed by the street price of the Z24x, which allows a lot more pros and semi-pros to get the DreamColor accuracy HP offers at half the price. While I wouldn't recommend color grading a show on the Z24x, this DreamColor does a nice job of bringing a higher level of color confidence at an attractive price. As a secondary display, the Z24x is a nice addition to the workflow of an artist on a budget, or one with a mean, orange-watch-band-hating spouse.


Dariush Derakhshani is a VFX supervisor and educator in Southern California. You can follow his random tweets at @koosh3d.

Young pros with autism contribute to Oscar-nominated VFX films

Exceptional Minds Studio, the LA-based visual effects and animation studio made up of young people on the autism spectrum, earned screen credit on three of the five films nominated for Oscars in the visual effects category — Star Wars: The Last Jedi, Guardians of the Galaxy Vol. 2 and War for the Planet of the Apes.

L-R: Lloyd Hackl, Kenneth Au, Mason Taylor and Patrick Brady.

For Star Wars: The Last Jedi, the artists at Exceptional Minds Studio were contracted to do visual effects cleanup work that involved roto and paint for several shots. “We were awarded 20 shots for this film that included very involved rotoscoping and paint work,” explains Exceptional Minds Studio executive producer Susan Zwerman.

The studio was also hired to create the end titles for Star Wars: The Last Jedi, which involved compositing the text into a star-field background.

For Guardians of the Galaxy Vol. 2, Exceptional Minds provided the typesetting for the end credit crawl. For War for the Planet of the Apes, the studio provided visual effects cleanup on 10 shots; this involved tracking-marker removal using roto and paint.

Exceptional Minds used Foundry’s Nuke for much of their work, in addition to Silhouette and Mocha for After Effects.

Star Wars: The Last Jedi. Courtesy of ILM

Since opening its doors almost four years ago, this small studio has worked on visual effects for more than 50 major motion pictures and television series, including The Good Doctor, Game of Thrones and Doctor Strange.

“The VFX teams we worked with on each of these movies were beyond professional, and we are so thankful that they gave our artists the opportunity to work with them,” says Zwerman, adding that “many of our artists never even dreamed they would be working in this industry.”

An estimated 90 percent of the autism population is underemployed or unemployed, and few training programs exist to prepare young adults with autism for meaningful careers, which is what makes this program so important.

“I couldn’t imagine doing this when I was young,” agreed Patrick Brady, an Exceptional Minds VFX artist.

VFX supervisor Lesley Robson-Foster on Amazon’s Mrs. Maisel

By Randi Altman

If you are one of the many who tend to binge-watch streaming shows, you’ve likely already enjoyed Amazon’s The Marvelous Mrs. Maisel. This new comedy focuses on a young wife and mother living in New York City in 1958, when men worked and women tended to, well, not work.

After her husband leaves her, Mrs. Maisel chooses stand-up comedy over therapy; or you could say stand-up comedy chooses her. The show takes place in a few New York neighborhoods, including the tony Upper West Side, the Garment District and the Village. The storyline brings real-life characters into this fictional world: Midge Maisel studies by listening to Redd Foxx comedy albums, and she also befriends comic Lenny Bruce, who appears in a number of episodes.

Lesley Robson-Foster on set.

The show, created by Amy Sherman-Palladino and Dan Palladino, is colorful and bright and features a significant amount of visual effects — approximately 80 per episode.

We reached out to the show’s VFX supervisor, Lesley Robson-Foster, to find out more.

How early did you get involved in Mrs. Maisel?
Producer Dhana Gilbert brought my producer, Parker Chehak, and me in early to discuss feasibility issues, as this is a period piece, and to see if Amy and Dan liked us! We've been on since the pilot.

What did the creators/showrunners say they needed?
They needed 1958 New York City, weather changes and some very fancy single-shot blending. Also, some fantasy and magic realism.

As you mentioned, this is a period piece, so I’m assuming a lot of your work is based on that.
The big period shots in Season 1 are the Garment District reconstruction. We shot on 19th Street between 5th and 6th Avenues; the brilliant production designer Bill Groom built a third of the street practically, and VFX took care of the rest, such as crowd duplication and CG cars. Then we shot on Park Avenue and had to remove the MetLife building down near Grand Central and knock out anything post-1958.

We also did a major gag with the driving footage. We shot driving plates around the Upper West Side and had a flotilla of period-correct cars with us, but we could not get rid of all the parked cars. My genius design partner on the show, Douglas Purver, created a wall of parked period-correct CG cars and put them over the modern ones. Phosphene then did the compositing.

What other types of effects did you provide?
Amy and Dan — the creators and showrunners — haven’t done many VFX shows, but they are very, very experienced. They write and ask for amazing things that allow me to have great fun. For example, I was asked to make a shot where our heroine is standing inside a subway car, and then the camera comes hurtling backwards through the end of the carriage and then sees the train going away down the tunnel. All we had was a third of a carriage with two and a half walls on set. Douglas Purver made a matte painting of the tunnel, created a CG train and put it all together.

Can you talk about the importance of being on set?
For me being on set is everything. I talk directors out of VFX shots and fixes all day long. If you can get it practically you should get it practically. It’s the best advice you’ll ever give as a VFX supervisor. A trust is built that you will give your best advice, and if you really need to shoot plates and interrupt the flow of the day, then they know it’s important for the finished shot.

Having a good relationship with every department is crucial.

Can you give an example of how being on set might have saved a shot or made a shot stronger?
This is a character-driven show. The directors really like Steadicam and long, long shots following the action. Even though a lot of the effects we want to do really demand motion control, I know I just can’t have it. It would kill the performances and take up too much time and room.

I run around with string and tennis balls to line things up. I watch the monitors carefully and use QTake to make sure things line up within acceptable parameters.

In my experience, you have to have the production's best interests at heart. Dhana Gilbert knows that having a VFX supervisor on the crew, as part of the team, smooths out the season. They really don't want a supervisor who is intermittent and doesn't have the whole picture. I've done several shows with Dhana; she knows my approach to servicing a show with an in-house team.

You shot b-roll for this? What camera did you use, and why?
We used a Blackmagic Ursa Mini Pro. We rented one on The OA for Netflix last year and found it to be really easy to use. We liked that it's self-contained and that we can use the Canon glass from our DSLR kits. It's got a built-in monitor, and it can shoot 4.6K RAW. It cut in just fine with the Alexa Mini for establishing shots and plates. It fits into a single backpack, so we could get a shot at a moment's notice. The user interface on the camera is so intuitive that anyone on the VFX team could pick it up and learn how to get the shot in 30 minutes.

What VFX houses did you employ, and how do you like to work with them?
We keep as much as we can in New York City, of course. Phosphene is our main vendor, and we like Shade and Alkemy X. I like RVX in Iceland, El Ranchito in Spain and Rodeo in Montreal. I also have a host of secret-weapon individuals dotted around the world. For Parker and me, it's always horses for courses. Whom we send the work to depends on the shot.

For each show we build a small in-house team — we do the temps and figure out the design, and shoot plates and elements before shots leave us to go to the vendor.

You’ve worked on many critically acclaimed television series. Television is famous for quick turnarounds. How do you and your team prepare for those tight deadlines?
Television schedules can be relentless. Prep, shoot and post all at the same time. I like it very much as it keeps the wheels of the machine oiled. We work on features in between the series and enjoy that slower process too. It’s all the same skill set and workflow — just different paces.

If you have to offer a production a tip or two about how to make the process go more smoothly, what would it be?
I would say be involved with EVERYTHING. Keep your nose close to the ground. Really familiarize yourself with the scripts — head trouble off at the pass by discussing upcoming events with the relevant person. Be fluid and flexible and engaged!

Jogger moves CD Andy Brown from London to LA

Creative director Andy Brown has moved from Jogger’s London office to its Los Angeles studio. Brown led the development of boutique VFX house Jogger London, including credits for the ADOT PSA Homeless Lights via Ogilvy & Mather, as well as projects for Adidas, Cadbury, Valentino, Glenmorangie, Northwestern Mutual, La-Z-Boy and more. He’s also been involved in post and VFX for short films such as Foot in Mouth, Containment and Daisy as well as movie title sequences (via The Morrison Studio), including Jupiter Ascending, Collide, The Ones Below and Ronaldo.

Brown got his start in the industry at MPC, where he worked for six years, eventually assuming the role of digital online editor. He then went on to senior VFX roles at some of London's post houses before becoming head of VFX at One Post. Following One Post's merger with Rushes, Brown founded his own company, Four Walls, establishing its reputation for creative visual effects and finishing.

Brown oversaw Four Walls' merger with LA's Jogger Studios in 2016. He has since helped connect Jogger's teams in London, New York, Los Angeles, San Francisco and Austin, with high-end VFX, motion graphics and color grading carried out on projects globally.

VFX house Jogger is a sister company of editing house Cut + Run.

Creating CG wildlife for Jumanji: Welcome to the Jungle

If you are familiar with the original Jumanji film from 1995 — about a board game that brings its game jungle, complete with animals and the boy it trapped decades earlier, into the present day — you know how important creatures are to the story. In this new version of the film, the game traps four teens inside its video game jungle, where they struggle to survive among the many creatures, while trying to beat the game.

For Columbia Pictures’ current sequel, Jumanji: Welcome to the Jungle, Montreal-based visual effects house Rodeo FX was called on to create 96 shots, including some of the film’s wildlife. The film stars Dwayne Johnson, Jack Black and Kevin Hart.

“Director Jake Kasdan wanted the creatures to feel cursed, so our team held back from making them too realistic,” explains Rodeo FX VFX supervisor Alexandre Lafortune. “The hippo is a great example of a creature that would have appeared scary if we had made it look real, so we made it bigger and faster and changed the pink flesh in its mouth to black. These changes make the hippo fit in with the comedy.”

The studio’s shots for the film feature a range of creatures, as well as matte paintings and environments. Rodeo FX worked alongside the film’s VFX supervisor, Jerome Chen, to deliver the director’s vision for the star-studded film.

“It was a pleasure to collaborate with Rodeo FX on this film,” says Chen. “I relied on Alexandre Lafortune and his team to help us with sequences requiring full conceptualization and execution from start to finish.”

Chen entrusted Rodeo FX with the hippo and other key creatures, including the black mamba snake that engages Bethany, played by Jack Black, in a staring contest. The snake was created by Rodeo FX based on a puppet used on set by the actors. Rodeo FX made a 3D scan of the prop and brought it to life in CG, making key adjustments to its appearance, including coloring and mouth shape. The VFX studio also delivered shots of a scorpion, a crocodile, a tarantula and a centipede that complement the tone of the film's villain.

In terms of tools, "We used Maya and Houdini (the latter mainly for water effects) as our 3D tools, ZBrush for modeling and Nuke for compositing," reports Lafortune. "The Arnold renderer was used for lighting and shading."

More of Rodeo FX's creature work can be seen in IT, The Legend of Tarzan and Paddington 2.

VES names award nominees

The Visual Effects Society (VES) has announced the nominees for its 16th Annual VES Awards, which recognize visual effects artistry and innovation in film, animation, television, commercials and video games and the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Blade Runner 2049 and War for the Planet of the Apes have tied for the most feature film nominations with seven each. Despicable Me 3 is the top animated film contender with five nominations, and Game of Thrones leads the broadcast field and scores the most nominations overall with 11.

Nominees in 24 categories were selected by VES members via events hosted by 10 of its sections: Australia, the Bay Area, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington. The VES Awards will be held on February 13 at the Beverly Hilton Hotel. The VES Georges Méliès Award will be presented to Academy Award-winning visual effects supervisor Joe Letteri, VES. The VES Lifetime Achievement Award will be presented to producer/writer/director Jon Favreau. Comedian Patton Oswalt will once again host.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

 

Blade Runner 2049

John Nelson

Karen Murphy Mundell

Paul Lambert

Richard Hoover

Gerd Nefzer

 

Guardians of the Galaxy Vol. 2

Christopher Townsend

Damien Carr

Guy Williams

Jonathan Fawkner

Dan Sudick

Kong: Skull Island

Jeff White

Tom Peitzman

Stephen Rosenbaum

Scott Benza

Michael Meinardus

 

Star Wars: The Last Jedi

Ben Morris

Tim Keene

Eddie Pasquarello

Daniel Seddon

Chris Corbould

 

War for the Planet of the Apes

Joe Letteri

Ryan Stafford

Daniel Barrett

Dan Lemmon

Joel Whist

 

Outstanding Supporting Visual Effects in a Photoreal Feature

 

Darkest Hour

Stephane Naze

Warwick Hewitt

Guillaume Terrien

Benjamin Magana

Downsizing

James E. Price

Susan MacLeod

Lindy De Quattro

Stéphane Nazé

 

Dunkirk

Andrew Jackson

Mike Chambers

Andrew Lockley

Alison Wortman

Scott Fisher

 

Mother!

Dan Schrecker

Colleen Bachman

Ben Snow

Wayne Billheimer

Peter Chesney

 

Only the Brave

Eric Barba

Dione Wood

Matthew Lane

Georg Kaltenbrunner

Michael Meinardus

 

Outstanding Visual Effects in an Animated Feature

 

Captain Underpants

David Soren

Mark Swift

Mireille Soria

David Dulac

 

Cars 3

Brian Fee

Kevin Reher

Michael Fong

Jon Reisch

Coco

Lee Unkrich

Darla K. Anderson

David Ryu

Michael K. O’Brien

 

Despicable Me 3

Pierre Coffin

Chris Meledandri

Kyle Balda

Eric Guillon

 

The Lego Batman Movie

Rob Coleman

Amber Naismith

Grant Freckelton

Damien Gray

The Lego Ninjago Movie

Gregory Jowle

Fiona Chilton

Miles Green

Kim Taylor

 

Outstanding Visual Effects in a Photoreal Episode

 

Agents of S.H.I.E.L.D.: Orientation Part 1

Mark Kolpack

Sabrina Arnold

David Rey

Kevin Yuille

Gary D’Amico

 

Game of Thrones: Beyond the Wall

Joe Bauer

Steve Kullback

Chris Baird

David Ramos

Sam Conway

 

Legion: Chapter 1

John Ross

Eddie Bonin

Sebastien Bergeron

Lionel Lim

Paul Benjamin

 

Star Trek: Discovery: The Vulcan Hello

Jason Michael Zimmerman

Aleksandra Kochoska

Ante Dekovic

Mahmoud Rahnama

 

Stranger Things 2: The Gate

Paul Graff

Christina Graff

Seth Hill

Joel Sevilla

Caius the Man

 

Outstanding Supporting Visual Effects in a Photoreal Episode

 

Black Sails: XXIX

Erik Henry

Terron Pratt

Yafei Wu

David Wahlberg

Paul Dimmer

 

Fear the Walking Dead: Sleigh Ride

Peter Crosman

Denise Gayle

Philip Nussbaumer

Martin Pelletier

Frank Ludica

 

Mr. Robot: eps3.4_runtime-err0r.r00

Ariel Altman

Lauren Montuori

John Miller

Luciano DiGeronimo

 

Outlander: Eye of the Storm

Richard Briscoe

Elicia Bessette

Aladino Debert

Filip Orrby

Doug Hardy

 

Taboo: Pilot

Henry Badgett

Tracy McCreary

Nic Birmingham

Simon Rowe

Colin Gorry

 

Vikings: On the Eve

Dominic Remane

Mike Borrett

Ovidiu Cinazan

Paul Wishart

Paul Byrne

 

Outstanding Visual Effects in a Real-Time Project

 

Assassin’s Creed Origins

Raphael Lacoste

Patrick Limoges

Jean-Sebastien Guay

Ulrich Haar

 

Call of Duty: WWII

Joe Salud

Atsushi Seo

Danny Chan

Jeremy Kendall

 

Fortnite: A Hard Day’s Night

Michael Clausen

Gavin Moran

Brian Brecht

Andrew Harris

 

Sonaria

Scot Stafford

Camille Cellucci

Kevin Dart

Theresa Latzko

 

Uncharted: The Lost Legacy

Shaun Escayg

Tate Mosesian

Eben Cook

 

Outstanding Visual Effects in a Commercial

 

Beyond Good and Evil 2

Leon Berelle

Maxime Luère

Dominique Boidin

Remi Kozyra

 

Kia Niro: Hero’s Journey

Robert Sethi

Anastasia von Rahl

Tom Graham

Chris Knight

Dave Peterson

 

Mercedes Benz: King of the Jungle

Simon French

Josh King

Alexia Paterson

Leonardo Costa

 

Monster: Opportunity Roars

Ruben Vandebroek

Clairellen Wallin

Kevin Ives

Kyle Cody

 

Samsung: Do What You Can’t, Ostrich

Diarmid Harrison-Murray

Tomek Zietkiewicz

Amir Bazazi

Martino Madeddu

 

Outstanding Visual Effects in a Special Venue Project

 

Avatar: Flight of Passage

Richard Baneham

Amy Jupiter

David Lester

Thrain Shadbolt

 

Corona: Paraiso Secreto

Adam Grint

Jarrad Vladich

Roberto Costas Fernández

Ed Thomas

Felipe Linares

 

Guardians of the Galaxy: Mission: Breakout!

Jason Bayever

Amy Jupiter

Mike Bain

Alexander Thomas

 

National Geographic Encounter: Ocean Odyssey

Thilo Ewers

John Owens

Gioele Cresce

Mariusz Wesierski

 

Nemo and Friends SeaRider

Anthony Apodaca

Kathy Janus

Brandon Benepe

Nick Lucas

Rick Rothschild

 

Star Wars: Secrets of the Empire

Ben Snow

Judah Graham

Ian Bowie

Curtis Hickman

David Layne

 

Outstanding Animated Character in a Photoreal Feature

 

Blade Runner 2049: Rachael

Axel Akkeson

Stefano Carta

Wesley Chandler

Ian Cooke-Grimes

Kong: Skull Island: Kong

Jakub Pistecky

Chris Havreberg

Karin Cooper

Kris Costa

 

War for the Planet of the Apes: Bad Ape

Eteuati Tema

Aidan Martin

Florian Fernandez

Mathias Larserud

War for the Planet of the Apes: Caesar

Dennis Yoo

Ludovic Chailloleau

Douglas McHale

Tim Forbes

 

Outstanding Animated Character in an Animated Feature

 

Coco: Héctor

Emron Grover

Jonathan Hoffman

Michael Honsel

Guilherme Sauerbronn Jacinto

 

Despicable Me 3: Bratt

Eric Guillon

Bruno Dequier

Julien Soret

Benjamin Fournet

 

The Lego Ninjago Movie: Garma Mecha Man

Arthur Terzis

Wei He

Jean-Marc Ariu

Gibson Radsavanh

 

The Boss Baby: Boss Baby

Alec Baldwin

Carlos Puertolas

Rani Naamani

Joe Moshier

 

The Lego Ninjago Movie: Garmadon

Matthew Everitt

Christian So

Loic Miermont

Fiona Darwin

 

Outstanding Animated Character in an Episode or Real-Time Project

 

Black Mirror: Metalhead

Steven Godfrey

Stafford Lawrence

Andrew Robertson

Iestyn Roberts

 

Game of Thrones: Beyond the Wall – Zombie Polar Bear

Paul Story

Todd Labonte

Matthew Muntean

Nicholas Wilson

 

Game of Thrones: Eastwatch – Drogon Meets Jon

Jonathan Symmonds

Thomas Kutschera

Philipp Winterstein

Andreas Krieg

 

Game of Thrones: The Spoils of War – Drogon Loot Train Attack

Murray Stevenson

Jason Snyman

Jenn Taylor

Florian Friedmann

 

Outstanding Animated Character in a Commercial

 

Beyond Good and Evil 2: Zhou Yuzhu

Dominique Boidin

Maxime Luère

Leon Berelle

Remi Kozyra

 

Mercedes Benz: King of the Jungle

Steve Townrow

Joseph Kane

Greg Martin

Gabriela Ruch Salmeron

 

Netto: The Easter Surprise – Bunny

Alberto Lara

Jorge Montiel

Antoine Mariez

Jon Wood

 

Samsung: Do What You Can’t – Ostrich

David Bryan

Maximilian Mallmann

Tim Van Hussen

Brendan Fagan

 

Outstanding Created Environment in a Photoreal Feature

 

Blade Runner 2049: Los Angeles

Chris McLaughlin

Rhys Salcombe

Seungjin Woo

Francesco Dell’Anna

 

Blade Runner 2049: Trash Mesa

Didier Muanza

Thomas Gillet

Guillaume Mainville

Sylvain Lorgeau

Blade Runner 2049: Vegas

Eric Noel

Arnaud Saibron

Adam Goldstein

Pascal Clement

 

War for the Planet of the Apes: Hidden Fortress

Greg Notzelman

James Shaw

Jay Renner

Gak Gyu Choi

 

War for the Planet of the Apes: Prison Camp

Phillip Leonhardt

Paul Harris

Jeremy Fort

Thomas Lo

 

Outstanding Created Environment in an Animated Feature

 

Cars 3: Abandoned Racetrack

Marlena Fecho

Thidaratana Annee Jonjai

Jose L. Ramos Serrano

Frank Tai

 

Coco: City of the Dead

Michael Frederickson

Jamie Hecker

Jonathan Pytko

Dave Strick

 

Despicable Me 3: Hollywood Destruction

Axelle De Cooman

Pierre Lopes

Milo Riccarand

Nicolas Brack

 

The Lego Ninjago Movie: Ninjago City

Kim Taylor

Angela Ensele

Felicity Coonan

Jean Pascal leBlanc

 

Outstanding Created Environment in an Episode, Commercial or Real-Time Project

 

Assassin’s Creed Origins: City of Memphis

Patrick Limoges

Jean-Sebastien Guay

Mikael Guaveia

Vincent Lombardo

 

Game of Thrones: Beyond the Wall – Frozen Lake

Daniel Villalba

Antonio Lado

José Luis Barreiro

Isaac de la Pompa

 

Game of Thrones: Eastwatch

Patrice Poissant

Deak Ferrand

Dominic Daigle

Gabriel Morin

 

Still Star-Crossed: City

Rafael Solórzano

Isaac de la Pompa

José Luis Barreiro

Óscar Perea

 

Stranger Things 2: The Gate

Saul Galbiati

Michael Maher

Seth Cobb

Kate McFadden

 

Outstanding Virtual Cinematography in a Photoreal Project

 

Beauty and the Beast: Be Our Guest

Shannon Justison

Casey Schatz

Neil Weatherley

Claire Michaud

 

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight

James Baker

Steven Lo

Alvise Avati

Robert Stipp

 

Star Wars: The Last Jedi – Crait Surface Battle

Cameron Nielsen

Albert Cheng

John Levin

Johanes Kurnia

 

Thor: Ragnarok – Valkyrie’s Flashback

Hubert Maston

Arthur Moody

Adam Paschke

Casey Schatz

 

Outstanding Model in a Photoreal or Animated Project

 

Blade Runner 2049: LAPD Headquarters

Alex Funke

Steven Saunders

Joaquin Loyzaga

Chris Menges

 

Despicable Me 3: Dru’s Car

Eric Guillon

François-Xavier Lepeintre

Guillaume Boudeville

Pierre Lopes

 

Life: The ISS

Tom Edwards

Chaitanya Kshirsagar

Satish Kuttan

Paresh Dodia

 

US Marines: Anthem – Monument

Tom Bardwell

Paul Liaw

Adam Dewhirst

 

Outstanding Effects Simulations in a Photoreal Feature

 

Kong: Skull Island

Florent Andorra

Alexis Hall

Raul Essig

Branko Grujcic

 

Only the Brave: Fire & Smoke

Georg Kaltenbrunner

Thomas Bevan

Philipp Zaufel

Himanshu Joshi

 

Star Wars: The Last Jedi – Bombing Run

Peter Kyme

Miguel Perez Senent

Ahmed Gharraph

Billy Copley

Star Wars: The Last Jedi – Mega Destroyer Destruction

Mihai Cioroba

Ryoji Fujita

Jiyong Shin

Dan Finnegan

 

War for the Planet of the Apes

David Caeiro Cebrián

Johnathan Nixon

Chet Leavai

Gary Boyle

 

Outstanding Effects Simulations in an Animated Feature

 

Cars 3

Greg Gladstone

Stephen Marshall

Leon JeongWook Park

Tim Speltz

 

Coco

Kristopher Campbell

Stephen Gustafson

Dave Hale

Keith Klohn

 

Despicable Me 3

Bruno Chauffard

Frank Baradat

Milo Riccarand

Nicolas Brack

Ferdinand

Yaron Canetti

Allan Kadkoy

Danny Speck

Mark Adams

 

The Boss Baby

Mitul Patel

Gaurav Mathur

Venkatesh Kongathi

 

Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project

 

Game of Thrones: Beyond the Wall – Frozen Lake

Manuel Ramírez

Óscar Márquez

Pablo Hernández

David Gacituaga

 

Game of Thrones: The Dragon and the Wolf – Wall Destruction

Thomas Hullin

Dominik Kirouac

Sylvain Nouveau

Nathan Arbuckle

 

Heineken: The Trailblazers

Christian Bohm

Andreu Lucio Archs

Carsten Keller

Steve Oakley

 

Outlander: Eye of the Storm – Stormy Seas

Jason Mortimer

Navin Pinto

Greg Teegarden

Steve Ong

 

Outstanding Compositing in a Photoreal Feature

 

Blade Runner 2049: LAPD Approach and Joy Holograms

Tristan Myles

Miles Lauridsen

Joel Delle-Vergin

Farhad Mohasseb

 

Kong: Skull Island

Nelson Sepulveda

Aaron Brown

Paolo Acri

Shawn Mason

 

Thor: Ragnarok: Bridge Battle

Gavin McKenzie

David Simpson

Owen Carroll

Mark Gostlow

 

War for the Planet of the Apes

Christoph Salzmann

Robin Hollander

Ben Morgan

Ben Warner

 

Outstanding Compositing in a Photoreal Episode

 

Game of Thrones: Beyond the Wall – Frozen Lake

Óscar Perea

Santiago Martos

David Esteve

Michael Crane

 

Game of Thrones: Eastwatch

Thomas Montminy Brodeur

Xavier Fourmond

Reuben Barkataki

Sébastien Raets

 

Game of Thrones: The Spoils of War – Loot Train Attack

Dom Hellier

Thijs Noij

Edwin Holdsworth

Giacomo Matteucci

 

Star Trek: Discovery

Phil Prates

Rex Alerta

John Dinh

Karen Cheng

 

Outstanding Compositing in a Photoreal Commercial

 

Destiny 2: New Legends Will Rise

Alex Unruh

Michael Ralla

Helgi Laxdal

Timothy Gutierrez

 

Nespresso: Comin’ Home

Matt Pascuzzi

Steve Drew

Martin Lazaro

Karch Koon

 

Samsung: Do What You Can’t – Ostrich

Michael Gregory

Andrew Roberts

Gustavo Bellon

Rashabh Ramesh Butani

 

Virgin Media: Delivering Awesome

Jonathan Westley

John Thornton

Milo Paterson

George Cressey

 

Outstanding Visual Effects in a Student Project

 

Creature Pinup

Christian Leitner

Juliane Walther

Kiril Mirkov

Lisa Ecker

 

Hybrids

Florian Brauch

Romain Thirion

Matthieu Pujol

Kim Tailhades

 

Les Pionniers de l’Univers

Clementine Courbin

Matthieu Guevel

Jérôme Van Beneden

Anthony Rege

 

The Endless

Nicolas Lourme

Corentin Gravend

Edouard Calemard

Romaric Vivier

VFX house Kevin adds three industry veterans

Venice, California-based visual effects house Kevin, founded by Tim Davies, Sue Troyan and Darcy Parsons, has beefed up its team even further with the hiring of head of CG Mike Dalzell, VFX supervisor Theo Maniatis and head of technology Carl Loeffler. This three-month-old studio has already worked on spots for Jaguar, Land Rover, Target and Old Spice, and is currently working on a series of commercials for the Super Bowl.

Dalzell brings years of experience as a CG supervisor and lead artist (he started as a 3D generalist before focusing on look development and lighting) at top creative studios including Digital Domain, MPC, Psyop, The Mill, Sony Imageworks and Method. He was instrumental in the look development for the VFX Gold Clio and British Arrow-winning Call of Duty spot Seize Glory, as well as GE's Childlike Imagination. He has also worked on commercials for Nissan, BMW, Lexus, Visa, Cars.com, the Air Force and others. Early on, Dalzell honed his skills on music videos in Toronto, and then on feature films such as Iron Man 3 and The Matrix movies, as well as The Curious Case of Benjamin Button.

Maniatis, a Flame artist and on-set VFX supervisor, has a wide breadth of experience in the US, London and his native Sydney. “Tim [Davies] and I used to work together back in Australia, so reconnecting with him and moving to LA has been a blast.”

Maniatis’s work includes spots for Apple Watch 3 + Apple Music’s Roll (directed by Sam Brown), TAG Heuer’s To Jack (directed by and featuring Patrick Dempsey), Destiny 2’s Rally the Troops and Titanfall 2’s Become One (via Blur Studios), and PlayStation VR’s Batman Arkham and Axe’s Office Love, both directed by Filip Engstrom. Prior to joining Kevin, Maniatis worked with Blur Studios, Psyop, The Mill, Art Jail and Framestore.

Loeffler is creating the studio's production model using the latest Autodesk Flame systems, high-end 3D workstations and render nodes, and putting new networking and storage systems into place. Kevin's new Culver City studio will open its doors in Q1 2018, and Loeffler will guide the current growth in both hardware and software, plan for the future and make sure Kevin's studio is optimized for the needs of production. He has over two decades of experience building out and expanding technology for facilities including MPC and Technicolor.

Image: (L-R) Mike Dalzell, Carl Loeffler and Theo Maniatis.

Storage Roundtable

Production, post, visual effects, VR… you can’t do it without a strong infrastructure. This infrastructure must include storage and products that work hand in hand with it.

This year we spoke to a sampling of those providing storage solutions — of all kinds — for media and entertainment, as well as a storage-agnostic company that helps get your large files from point A to point B safely and quickly.

We gathered questions from real-world users — things that they would ask of these product makers if they were sitting across from them.

Quantum’s Keith Lissak
What kind of storage do you offer, and who is the main user of that storage?
We offer a complete storage ecosystem based around our StorNext shared storage and data management solution, including Xcellis high-performance primary storage, Lattus object storage, and Scalar archive and cloud. Our customers include broadcasters, production companies, post facilities, animation/VFX studios, NCAA and professional sports teams, ad agencies and Fortune 500 companies.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
Xcellis features continuous scalability and can be sized to precisely fit current requirements and scaled to meet future demands simply by adding storage arrays. Capacity and performance can grow independently, and no additional accelerators or controllers are needed to reach petabyte scale.

How many of the people buying your solutions are using them with another cloud-based product (i.e. Microsoft Azure)?
We don’t have exact numbers, but a growing number of our customers are using cloud storage. Our FlexTier cloud-access solution can be used with both public (AWS, Microsoft Azure and Google Cloud) and private (StorageGrid, CleverSafe, Scality) storage.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
We offer a range of StorNext 4K Reference Architecture configurations for handling demanding workflows, including 4K, 8K and VR. Our customers can choose systems with small or large form-factor HDDs, up to an all-flash SSD system with the ability to handle 66 simultaneous 4K streams.
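
To put that stream count in perspective, here is a rough aggregate-bandwidth sketch. The format assumptions are ours, not Quantum's: uncompressed 4K DCI 10-bit RGB DPX, which packs three 10-bit samples into four bytes per pixel, at 24fps.

```python
# Rough sizing for 66 simultaneous uncompressed 4K streams (assumptions above).
width, height = 4096, 2160   # 4K DCI
bytes_per_pixel = 4          # 10-bit RGB DPX packing
fps = 24
streams = 66

per_stream = width * height * bytes_per_pixel * fps  # bytes per second
total = per_stream * streams

print(f"per stream: {per_stream / 1e6:.0f} MB/s")    # ~849 MB/s
print(f"{streams} streams: {total / 1e9:.0f} GB/s")  # ~56 GB/s aggregate
```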

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might users notice when connecting on these different platforms?
StorNext systems are OS-agnostic and can work with all Mac, Windows and Linux clients with no discernible difference.

Zerowait’s Rob Robinson
What kind of storage do you offer, and who is the main user of that storage?
Zerowait's SimplStor storage product line provides storage administrators with the scalable, flexible and reliable on-site storage needed for their growing storage requirements and workloads. SimplStor's platform can be configured to work in Linux or Windows environments, and we have several customers with multiple petabytes in their data centers. SimplStor systems have been used in VFX production for many years, and we also provide solutions for video creation and many other large-data environments.

Additionally, Zerowait specializes in NetApp service, support and upgrades, and we have provided many companies in the media and VFX businesses with off-lease, transferable licensed NetApp storage solutions. Zerowait provides storage hardware, engineering and support for customers that need reliable, big storage. Our engineers support customers with private cloud storage, as well as customers that offer public cloud storage on our storage platforms. We do not provide any public cloud services to our customers.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
Our customers typically need on-site storage for processing speed and security. We have developed many techniques and monitoring solutions that we have incorporated into our service and hardware platforms. Our SimplStor and NetApp customers need storage infrastructures that scale into multiple petabytes and often require GigE, 10GigE or NetApp FC connectivity solutions. For customers whose workloads can't tolerate the bandwidth constraints of the public Internet, Zerowait has the engineering experience to help them get the most out of their on-premises storage.

How many of the people buying your solutions are using them with another cloud-based product (i.e. Microsoft Azure)?
Many of our customers use public cloud solutions for their non-proprietary data storage while using our SimplStor and NetApp hardware and support services for their proprietary, business-critical, high-speed and regulatory storage solutions where data security is required.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
SimplStor’s density and scalability make it perfect for use in HD and higher resolution environments. Our SimplStor platform is flexible and we can accommodate customers with special requests based on their unique workloads.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might users notice when connecting on these different platforms?
Zerowait’s NetApp and SimplStor platforms are compatible with both Linux (NFS) and Windows (CIFS) environments. OS X is supported in some applications. Every customer has a unique infrastructure and set of applications they are running. Customers will see differences in performance, but our flexibility allows us to customize a solution to maximize the throughput to meet workflow requirements.

Signiant’s Mike Nash
What kind of storage works with your solution, and who is the main user or users of that storage?
Signiant's Media Shuttle file transfer solution is storage agnostic, and for nearly 200,000 media pros worldwide it is the primary vehicle for sending and sharing large files. Even though Media Shuttle doesn't provide storage, many users think of their data as "in Media Shuttle." In reality, their files are located in whatever storage their IT department has designated. This might be the company's own on-premises storage, or it could be their AWS or Microsoft Azure cloud storage tenancy. Our users employ a Media Shuttle portal to send and share files; they don't have to think about where the files are stored.

How are you making sure your products are scalable so people can grow either their use or the bandwidth of their networks (or both)?
Media Shuttle is delivered as a cloud-native SaaS solution, so it can be up and running immediately for new customers, and it can scale up and down as demand changes. The servers that power the software are managed by our DevOps team and monitored 24×7 — and the infrastructure is auto-scaling and instantly available. Signiant does not charge for bandwidth, so customers can use our solutions with any size pipe at no additional cost. And while Media Shuttle can scale up to support the needs of the largest media companies, the SaaS delivery model also makes it accessible to even the smallest production and post facilities.

How many of the people buying your solutions are using them with cloud storage (i.e. AWS or Microsoft Azure)?
Cloud adoption within the M&E industry remains uneven, so it's no surprise that we see a mixed picture when we look at the storage choices our customers make. Since we first introduced the cloud storage option, there has been constant month-over-month growth in the number of customers deploying portals with cloud storage. It's not yet at parity with on-prem storage, but the growth trends are clear.

On-premises content storage is very far from going away. We see many Media Shuttle customers taking a hybrid approach, with some portals using cloud storage and others using on-prem storage. It’s also interesting to note that when customers do choose cloud storage, we increasingly see them use both AWS and Azure.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
We can move any size of file. As media files continue to get bigger, the value of our solutions continues to rise. Legacy solutions such as FTP, which lack any file acceleration, will grind things to a halt if 4K, 8K, VR and other huge files need to be moved between locations. And consumer-oriented sharing services like Dropbox and Google Drive become non-starters with these types of files.
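
The underlying problem is easy to quantify: a single TCP flow is capped at roughly its window size divided by the round-trip time, no matter how big the pipe is. A quick sketch with illustrative numbers (ours, not Signiant's internals):

```python
# Single-flow TCP throughput ceiling: window / RTT.
window_bytes = 64 * 1024   # a common default receive window
rtt_seconds = 0.150        # e.g., a 150ms transcontinental round trip

ceiling_bps = (window_bytes / rtt_seconds) * 8
print(f"max single-flow throughput: {ceiling_bps / 1e6:.1f} Mb/s")  # ~3.5 Mb/s

# On a 1Gb/s link that flow fills well under 1% of the pipe, which is why
# accelerated (typically UDP-based) transfer tools exist for large media files.
```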

What platforms do your system connect to (e.g. Mac OS X, Windows, Linux), and what differences might end-users notice when connecting on these different platforms?
Media Shuttle is designed to work with a wide range of platforms. Users simply log in to portals using any web browser. In the background, a native application installed on the user's computer provides the acceleration functionality. This app works with Windows and Mac OS X systems.

On the IT side of things, no installed software is required for portals deployed with cloud storage. To connect Media Shuttle to on-premises storage, the IT team will run Signiant software on a computer in the customer’s network. This server-side software is available for Linux and Windows.

NetApp’s Jason Danielson
What kind of storage do you offer, and who is the main user of that storage?
NetApp has a wide portfolio of storage and data management products and services. We have four fundamentally different storage platforms — block, file, object and converged infrastructure. We use these platforms and our data fabric software to create a myriad of storage solutions that incorporate flash, disk and cloud storage.

1. NetApp E-Series block storage platform is used by leading shared file systems to create robust and high-bandwidth shared production storage systems. Boutique post houses, broadcast news operations and corporate video departments use these solutions for their production tier.
2. NetApp FAS network-attached file storage runs NetApp OnTap. This platform supports many thousands of applications for tens of thousands of customers in virtualized, private cloud and hybrid cloud environments. In media, this platform is designed for extreme random-access performance. It is used for rendering, transcoding, analytics, software development and Internet-of-Things pipelines.
3. NetApp StorageGrid Webscale object store manages content and data for back-up and active archive (or content repository) use cases. It scales to dozens of petabytes, billions of objects and currently 16 sites. Studios and national broadcast networks use this system and are currently moving content from tape robots and archive silos to a more accessible object tier.
4. NetApp SolidFire converged and hyper-converged platforms are used by cloud providers and enterprises running large private clouds that need quality of service across hundreds to thousands of applications. Global media enterprises appreciate the ease of scaling, the simplicity of QoS quota setting and the overall ease of maintenance for the largest-scale deployments.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
The four platforms mentioned above scale up and scale out to support well beyond the largest media operations in the world. So our challenge is not scalability for large environments but appropriate sizing for individual environments. We are careful to design storage and data management solutions that are appropriate to media operations’ individual needs.

How many of the people buying your solutions are using them with another cloud-based product (i.e. Microsoft Azure)?
Seven years ago, NetApp set out on a major initiative to build the data fabric. We are well on the path now with products designed specifically for hybrid cloud (a combination of private cloud and public cloud) workloads. While the uptake in media and entertainment is slower than in other industries, we now have hundreds of customers that use our storage in hybrid cloud workloads, from backup to burst compute.

We help customers who want to stay cloud-agnostic use AWS, Microsoft Azure, IBM Cloud and Google Cloud Platform flexibly, as project and pricing demands. AWS, Microsoft Azure, IBM, Telstra and ASE, along with another hundred or so cloud storage providers, include NetApp storage and data management products in their service offerings.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
For higher-bandwidth, or higher-bitrate, video production we'll generally architect a solution with our E-Series storage under either Quantum StorNext or PixitMedia PixStor. Since 2012, when the NetApp E5400 enabled the mainstream adoption of 4K workflows, the E-Series platform has seen three generations of upgrades, and the controllers are now more than 4x faster. The chassis has remained the same through these upgrades, so some customers have chosen to put the latest controllers into existing chassis to improve bandwidth or to utilize faster network interconnects like 16Gb Fibre Channel. Many post houses continue to use Fibre Channel to the workstation for these higher-bandwidth video formats, while others have chosen to move to Ethernet (40 and 100 Gigabit).

As flash (SSD) prices continue to drop, flash is starting to be used for video production in all-flash arrays or in hybrid configurations. We recently showed our new E570 all-flash array supporting NVMe over Fabrics (NVMe-oF) technology, providing 21GB/s of bandwidth and 1 million IOPs with less than 100µs of latency. This technology is initially targeted at supercomputing use cases, and we will see whether it is adopted over the next couple of years for UHD production workloads.

What platforms do your system connect to (Mac OSx, Windows, Linux, etc.), and what differences might end-users notice when connecting on these different platforms?
NetApp maintains a compatibility matrix table that delineates our support of hundreds of client operating systems and networking devices. Specifically, we support Mac OS X, Windows and various Linux distributions. Bandwidth expectations differ between these three operating systems and Ethernet and Fibre Channel connectivity options, but rather than make a blanket statement about these, we prefer to talk with customers about their specific needs and legacy equipment considerations.

G-Technology’s Greg Crosby
What kind of storage do you offer, and who is the main user of that storage?
Western Digital’s G-Technology products provide high-performing and reliable storage solutions for end-to-end creative workflows, from capture and ingest to transfer and shuttle, all the way to editing and final production.

The G-Technology brand supports a wide range of users for both field and in-studio work, with solutions that span portable handheld drives, often used to back up content on the go, all the way to in-studio drives that offer capacities up to 144TB. We recognize that each creative has their own unique workflow, and some embrace the use of cloud-based products. We are proud to be a companion to those cloud services, whether as a central location to store raw content or as a conduit to feed cloud features and capabilities.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
Our line ranges from small portable and rugged drives to large, multi-bay RAID and NAS solutions, for all aspects of the media and entertainment industry. Integrating the latest interface technology such as USB-C or Thunderbolt 3, our storage solutions will take advantage of the ability to quickly transfer files.

We make it easy to take a ton of storage into the field. The G-Speed Shuttle XL drive is available in capacities up to 96TB, and an optional Pelican case with handle makes it easy to transport in the field and mitigates any concerns about running out of storage. We recently launched the G-Drive mobile SSD R-Series. Being a solid-state drive, it is built to withstand a three-meter (nine-foot) drop and to endure accidental bumps.

How many of the people buying your solutions are using them with another cloud-based product (i.e. Microsoft Azure)?
Many of our customers are using cloud-based solutions to complement their creative workflows. We find that most of our customers use our solutions as primary storage, or to easily transfer and shuttle their content, since the cloud is not an efficient way to move large amounts of data. We see cloud capabilities as a great way to share project files and low-resolution content, collaborate with others on projects, and distribute and share a variety of deliverables.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
Today's camera technology enables capture not only at higher resolutions but also at higher frame rates, with more dynamic imagery. We have solutions that can easily support multi-stream 4K, 8K and VR workflows or multi-layer photo and visual effects projects. G-Technology is well positioned to support these creative workflows as we integrate the latest technologies into our storage solutions. From small portable and rugged SSDs to high-capacity, fast multi-drive RAID solutions with the latest Thunderbolt 3 and USB-C interface technology, we are ready to tackle a variety of creative endeavors.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.), and what differences might users notice when connecting on these different platforms?
Our complete portfolio of external storage solutions work for Mac and PC users alike. With native support for Apple Time Machine, these solutions are formatted for Mac OS out of the box, but can be easily reformatted for Windows users. G-Technology also has a number of strategic partners with technology vendors, including Apple, Atomos, Red Camera, Adobe and Intel.

Panasas’ David Sallak
What kind of storage do you offer, and who is the main user of that storage?
Panasas ActiveStor is an enterprise-class easy-to-deploy parallel scale-out NAS (network-attached storage) that combines Flash and SATA storage with a clustered file system accessed via a high-availability client protocol driver with support for standard protocols.

The ActiveStor storage cluster consists of the ActiveStor Director (ASD-100) control engine, the ActiveStor Hybrid (ASH-100) storage enclosure, the PanFS parallel file system, and the DirectFlow parallel data access protocol for Linux and Mac OS.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
ActiveStor is engineered to scale easily. There are no specific architectural limits for how widely the ActiveStor system can scale out, and adding more workloads and more users is accomplished without system downtime. The latest release of ActiveStor can grow either storage or bandwidth needs in an environment that lets metadata responsiveness, data performance and data capacity scale independently.

For example, we quote capacity and performance numbers for a Panasas storage environment containing 200 ActiveStor Hybrid 100 storage node enclosures with five ActiveStor Director 100 units for file system metadata management. This configuration would result in a single 57PB namespace delivering 360GB/s of aggregate bandwidth and in excess of 2.6M IOPs.
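
Dividing those quoted figures back out gives a feel for the per-enclosure contribution (our arithmetic on the numbers above, not an official Panasas spec):

```python
# Per-enclosure arithmetic implied by the quoted 200-enclosure configuration.
enclosures = 200
total_capacity_pb = 57
total_bandwidth_gbs = 360

print(f"{total_capacity_pb * 1000 / enclosures:.0f} TB per enclosure")  # 285 TB
print(f"{total_bandwidth_gbs / enclosures:.1f} GB/s per enclosure")     # 1.8 GB/s
```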

How many of the people buying your solutions are using them with another cloud-based product (i.e. Microsoft Azure)?
Panasas customers deploy workflows and workloads in ways that are well-suited to consistent on-site performance or availability requirements, while experimenting with remote infrastructure components such as storage and compute provided by cloud vendors. The majority of Panasas customers continue to explore the right ways to leverage cloud-based products in a cost-managed way that avoids surprises.

This means that workflow requirements for file-based storage continue to take precedence when processing real-time video assets, while customers also expect that storage vendors will support the ability to use Panasas in cloud environments where the benefits of a parallel clustered data architecture can exploit the agility of underlying cloud infrastructure without impacting expectations for availability and consistency of performance.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
Panasas ActiveStor is engineered to deliver superior application responsiveness via our DirectFlow parallel protocol for applications working in compressed UHD, 4K and higher-resolution media formats. Compared to traditional file-based protocols such as NFS and SMB, DirectFlow provides better granular I/O feedback to applications, resulting in client application performance that aligns well with the compressed UHD, 4K and other extreme-resolution formats.

For uncompressed data, Panasas ActiveStor is designed to support large-scale rendering of these data formats via distributed compute grids such as render farms. The parallel DirectFlow protocol results in better utilization of CPU resources in render nodes when processing frame-based UHD, 4K and higher-resolution formats, resulting in less wall clock time to produce these formats.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might users notice when connecting on these different platforms?
Panasas ActiveStor supports macOS and Linux with our higher-performance DirectFlow parallel client software. We support all client platforms via NFS or SMB as well.

Users would notice that when connecting to Panasas ActiveStor via DirectFlow, the I/O experience is as if users were working with local media files on internal drives, compared to working with shared storage where normal protocol access may result in the slight delay associated with open network protocols.

Facilis’ Jim McKenna
What kind of storage do you offer, and who is the main user of that storage?
We have always focused on shared storage for the facility. It’s high-speed attached storage and good for anyone who’s cutting HD or 4K. Our workflow and management features really make us different than basic network storage. We have attachment to the cloud through software that uses all the latest APIs.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
Most of our large customers have been with us for several years, and many started pretty small. Our method of scalability is flexible in that you can decide to simply add expansion drives, add another server, or add a head unit that aggregates multiple servers. Each method increases bandwidth as well as capacity.

How many of the people buying your solutions are using them with another cloud-based product (i.e. Microsoft Azure)?
Many customers use cloud, either through a corporate gateway or directly uploaded from the server. Many cloud service providers have ways of accessing the file locations from the facility desktops, so they can treat it like another hard drive. Alternatively, we can schedule, index and manage the uploads and downloads through our software.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
Facilis is known for speed. We still support Fibre Channel when everyone else, it seems, has moved completely to Ethernet, because Fibre Channel provides better speeds for demanding 4K-and-beyond workflows. We can handle UHD playback on 10Gb Ethernet, and up to full-frame 4K DPX at 60p through Fibre Channel on a single server enclosure.
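To put those figures in context, here is a quick back-of-the-envelope calculation, assuming 10-bit RGB DPX (which packs each pixel into four bytes) and ignoring file-header and network-protocol overhead; it is a rough sketch, not vendor math:

```python
# Back-of-the-envelope bandwidth math for uncompressed DPX playback,
# assuming 10-bit RGB DPX packed at 4 bytes per pixel and ignoring
# file-header and network-protocol overhead.

def dpx_rate_gb_per_s(width: int, height: int, fps: int) -> float:
    """Approximate sustained throughput in gigabytes per second."""
    bytes_per_frame = width * height * 4
    return bytes_per_frame * fps / 1e9

print(f"UHD 24p:           {dpx_rate_gb_per_s(3840, 2160, 24):.2f} GB/s")  # ~0.80
print(f"4K full-frame 60p: {dpx_rate_gb_per_s(4096, 2160, 60):.2f} GB/s")  # ~2.12

# 10Gb Ethernet tops out around 1.25 GB/s raw, so UHD playback fits,
# while full-frame 4K DPX at 60p calls for Fibre Channel-class throughput.
```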

What platforms do your systems connect to (e.g. Mac OS X, Windows, Linux, etc.)? And what differences might users notice when connecting on these different platforms?
We have a custom multi-platform shared file system, not NAS (network-attached storage). Even though NAS may be compatible with multiple platforms by using multiple sharing methods, permissions and optimization across platforms are not easily manageable. With Facilis, the same volume, shared one way with one set of permissions, looks and acts native to every OS and even shows up as a local hard disk on the desktop. You can't get any more cross-platform compatible than that.

SwiftStack’s Mario Blandini
What kind of storage do you offer, and who is the main user of that storage?
We offer hybrid cloud storage for media. SwiftStack is 100% software and runs on-premises atop the server hardware you already buy, using local capacity and/or capacity in public cloud buckets. Data is stored in cloud-native format, so there is no need for gateways, which do not scale. Our technology is used by broadcasters for active archive and OTT distribution, by digital animators for distributed transcoding, and by mobile gaming/eSports companies for massive concurrency, among others.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
The SwiftStack software architecture separates access, storage and management, and each function can run together or on separate hardware. Unlike storage hardware, where the mix of bandwidth and capacity is fixed by the ports and drives within, SwiftStack makes it easy to scale the access tier for bandwidth independently of capacity in the storage tier, simply by adding server nodes on the fly. On the storage side, capacity in public cloud buckets scales and is managed within the same single namespace.

How many of the people buying your solutions are using them with another cloud-based product (e.g., Microsoft Azure)?
Objectively, use of capacity in public cloud providers like Amazon Web Services and Google Cloud Platform is still in its early days for many users. Customers in media, however, are on the leading edge of adoption, not only extending their on-premises environments to a public cloud, but also using a second-source strategy across two public clouds. Two years ago adoption was less than 10%; today it is approaching 40%; and by 2020 the 80/20 rule will likely apply. Users actually do not care much how their data is stored, as long as their experience is as good as or better than it was before, and public clouds are great at delivering content to users.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
Arguably, larger assets produced by a growing number of cameras and computers have driven the need to store those assets differently than in the past. A petabyte is the new terabyte in media storage. Banks have many IT admins, whereas media shops have few. SwiftStack offers the same consumption experience as public cloud, which is very different from on-premises solutions of the past. Licensing is based on the amount of data managed, not the total capacity deployed, so you pay as you grow. Whether you save four replicas or use erasure coding at 1.5X overhead, the price is the same.
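To put rough numbers on that pricing model, here is a quick illustrative comparison (the 1 PB figure is hypothetical, chosen only to make the arithmetic concrete):

```python
# Illustrative overhead comparison for 1 PB of managed data (the figure
# is hypothetical). Licensing, as described above, is priced on managed
# data, so both layouts license the same 1.0 PB.

managed_pb = 1.0                  # data under management (what gets licensed)

raw_replicated = managed_pb * 4   # four full replicas -> 4.0 PB of raw disk
raw_erasure = managed_pb * 1.5    # erasure coding at 1.5x -> 1.5 PB of raw disk

print(f"Raw capacity, 4 replicas:     {raw_replicated:.1f} PB")
print(f"Raw capacity, erasure coding: {raw_erasure:.1f} PB")
print(f"Licensed capacity (either):   {managed_pb:.1f} PB")
```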

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might end-users notice when connecting on these different platforms?
The great thing about cloud storage, whether it is on-premises or resides with your favorite IaaS providers like AWS and Google, is that the interface is HTTP. In other words, every smartphone, tablet, Chromebook and computer has an identical user experience. For classic applications on systems that do not support the AWS S3 interface, users see the storage as a mount point or folder in their application, via either NFS or SMB. The best part is that it is all a single namespace: data can come in as file, get transformed via object and be read either way, so the user experience does not need to change even though the data is stored in the most modern way.
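As a minimal sketch of that dual access model (the endpoint URL, credentials, bucket and file names below are placeholders, not SwiftStack-specific values), an S3-compatible write from Python might look like this:

```python
# Minimal sketch: writing an asset over an S3-compatible HTTP interface.
# The endpoint, credentials, bucket and key below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.example.com",  # hypothetical on-prem endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Upload a media file as an object over plain HTTP(S)...
with open("shot_010_v002.mov", "rb") as f:
    s3.put_object(Bucket="dailies", Key="ep101/shot_010_v002.mov", Body=f)

# ...while a legacy application could read the same data through an NFS/SMB
# mount of the same namespace (e.g., /mnt/dailies/ep101/shot_010_v002.mov),
# with no copy or gateway in between.
```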

Dell EMC’s Tom Burns
What kind of storage do you offer, and who is the main user of that storage?
At Dell EMC, we created two storage platforms for the media and entertainment industry: the Isilon scale-out NAS all-flash, hybrid and archive platform, which consolidates and simplifies file-based workflows, and the Dell EMC Elastic Cloud Storage (ECS), a scalable, enterprise-grade private cloud solution that provides extremely high levels of storage efficiency, resiliency and simplicity, designed for both traditional and next-generation workloads.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
In the media industry, change is inevitable. That’s why every Isilon system is built to rapidly and simply adapt by allowing the storage system to scale performance and capacity together, or independently, as more space or processing power is required. This allows you to scale your storage easily as your business needs dictate.

How many of the people buying your solutions are using them with another cloud-based product (e.g., Microsoft Azure)?
Over the past five years, Dell EMC media and entertainment customers have added more than 1.5 exabytes of Isilon and ECS data storage to simplify and accelerate their workflows.

Isilon’s cloud tiering software, CloudPools, provides policy-based automated tiering that lets you seamlessly integrate with cloud solutions as an additional storage tier for the Isilon cluster at your data center. This allows you to address rapid data growth and optimize data center storage resources by using the cloud as a highly economical storage tier with massive storage capacity.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
As technologies that enhance the viewing experience continue to emerge, including higher frame rates and resolutions, uncompressed 4K, UHD, high dynamic range (HDR) and wide color gamut (WCG), underlying storage infrastructures must effectively scale to keep up with expanding performance requirements.

Dell EMC recently launched the sixth generation of the Isilon platform, including our all-flash (F800), which brings the simplicity and scalability of NAS to uncompressed 4K workflows — something that up until now required expensive silos of storage or complex and inefficient push-pull workflows.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc)? And what differences might end-users notice when connecting on these different platforms?
With Dell EMC Isilon, you can streamline your storage infrastructure by consolidating file-based workflows and media assets, eliminating storage silos. Isilon scale-out NAS includes integrated support for a wide range of industry-standard protocols, allowing the major operating systems to connect using the most suitable protocol for optimum performance and feature support. These include IPv4 and IPv6, NFS, SMB, HTTP, FTP, OpenStack Swift-based object access for cloud initiatives, and native Hadoop Distributed File System (HDFS).

The ECS software-defined cloud storage platform provides the ability to store, access and manipulate unstructured data, and it is compatible with existing Amazon S3, OpenStack Swift, EMC CAS and EMC Atmos APIs.

EditShare’s Lee Griffin
What kind of storage do you offer, and who is the main user of that storage?
Our storage platforms are tailored for collaborative media workflows and post production. They combine the advanced EFS (EditShare File System) distributed file system with intelligent load balancing in a scalable, fault-tolerant architecture that offers cost-effective connectivity. Within our shared storage platforms, we take a unique approach to current cloud workflows: because the security and reliability of cloud-based technology still prohibit a full migration to cloud storage for production, EditShare AirFlow uses EFS on-premises storage to provide secure access to media from anywhere in the world over a basic Internet connection. Our main users are creative post houses, broadcasters and large corporate companies.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
Recently, we upgraded all our platforms to EFS and introduced two new single-node platforms, the EFS 200 and 300. These single-node platforms allow users to grow their storage while keeping a single namespace, which eliminates the management of multiple storage volumes. It also enables them to plan better for the future: when their facility requires more storage and bandwidth, they can simply add another node.

How many of the people buying your solutions are using them with another cloud-based product (e.g., Microsoft Azure)?
No production happens in one location, so the ability to move media securely and back it up remains a high priority for our clients. From our Flow media asset management, via its automation module, we offer clients the option to back up their valuable content to places like Amazon S3.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
We have many clients working with UHD content who supply programming to broadcasters, film distributors and online subscription media providers. Our solutions are designed to work effortlessly with high-data-rate content, with bandwidth that expands as more EFS nodes are added to the intelligent storage pool. So, our system is ready and working now for 4K content, and it is future-proofed for even higher data rates.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might end-users notice when connecting on these different platforms?
EditShare supplies native EFS client drivers for all three platforms, allowing clients to pick and choose which platform they want to work on. Whether it's Autodesk Flame for VFX, Resolve for grading or our own Lightworks for editing on Linux, we don't mind. In fact, EFS offers a considerable bandwidth improvement over the existing AFP and SMB protocols when using our EFS drivers. Improved bandwidth and speed on all three platforms makes for happy clients!

And there are no differences when clients connect. We work with all three platforms the same way, offering a unified workflow to all creative machines, whether on Mac, Windows or Linux.

Scale Logic’s Bob Herzan
What kind of storage do you offer, and who is the main user of that storage?
Scale Logic has developed an ecosystem (Genesis Platform) that includes servers, networking, metadata controllers, single and dual-controller RAID products and purpose-built appliances.

We have three different file systems that allow us to use the storage mentioned above to build SAN, NAS, scale-out NAS, object storage and gateways for private and public cloud. We use a combination of disk, tape and flash technology to build tiers of storage that allow us to manage media content efficiently and scale seamlessly as our customers' requirements change over time.

We work with customers that range from small shops to the enterprise and everything in between. We have a global customer base that includes broadcasters, post production, VFX, corporate, sports and houses of worship.

In addition to the Genesis Platform, we have also certified three other tier-1 storage vendors to work under our HyperMDC SAN and scale-out NAS metadata controller (HPE, HDS and NetApp). These partnerships complete our ability to consult with any type of customer looking to deploy a media-centric workflow.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
Great question. It's actually built into the name and culture of our company. When we bring a solution to market, it has to scale seamlessly, and it needs to be logical given the customer's environment. We focus on being able to start small but scale any system into a high-availability solution with little to no downtime. Our solutions can scale independently, whether clients are looking to add capacity, performance or redundancy.

For example, a customer looking to move to 4K uncompressed workflows could add a Genesis Unlimited as a new workspace focused on the 4K workflow, keeping all existing infrastructure in place alongside it, avoiding major adjustments to their facility’s workflow. As more and more projects move to 4K, the Unlimited can scale capacity, performance and the needed HA requirements with zero downtime.

Customers can then start to migrate their content from their legacy storage over to Unlimited, and then repurpose that legacy storage onto the HyperFS file system as second-tier storage. Finally, once we have moved the legacy storage onto the new file system, we are also more than happy to bring the legacy storage and networking hardware under our global support agreements.

How many of the people buying your solutions are using them with another cloud-based product (e.g., Microsoft Azure)?
Cloud adoption continues to ramp up in our industry, and we have many customers using cloud solutions for various aspects of their workflow. As it pertains to content creation, manipulation and long-term archive, though, we have not seen much adoption within our customer base. The economics just do not support the level of performance or capacity our clients demand.

However, private cloud or cloud-like configurations are becoming more mainstream for our larger customers. Working with on-premises storage while replicating offsite for DR (disaster recovery) continues to be the best solution at this point for most of our clients.

How does your system handle UHD, 4K and other higher-than-HD resolutions?
Our solutions are built not only for current resolutions but are completely scalable beyond them. Many of our HD customers are now putting UHD and 4K workspaces on the same equipment we installed three years ago. In addition to 4K, we have been working with several companies in Asia that use our HyperFS file system and Genesis HyperMDC to build 8K workflows for the Olympics.

We have a number of solutions designed to meet our customers' requirements. Some are built with spinning disk, others with all-flash, and still others take a hybrid approach that seamlessly combines the technologies.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might end-users notice when connecting on these different platforms?
All of our solutions are designed to support Windows, Linux and Mac OS. However, how they support the various operating systems depends on the protocol (block or file) we are designing for the facility. If we are building a SAN that is strictly block-level access (8/16/32 Gbps Fibre Channel or 1/10/25/40/100 Gbps iSCSI), we would use our HyperFS file system and universal client drivers across all operating systems. If our clients are also looking for network protocols in addition to the block-level clients, we can support SMB and NFS while allowing access to block and file folders and files at the same time.

For customers that are not looking for block-level access, we focus our design work around our Genesis NX or ZX product lines. Both of these solutions are based on a NAS operating system and simply present themselves with the appropriate protocol over 1/10/25/40 or 100Gb. The Genesis ZX solution is actually a software-defined clustered NAS with enterprise feature sets such as unlimited snapshots, metro clustering and thin provisioning, and it will scale beyond 5 petabytes.

Sonnet Technologies' Greg LaPorte
What kind of storage do you offer, and who is the main user of that storage?
We offer a portable, bus-powered Thunderbolt 3 SSD storage device that fits in your hand. Primary users of this product include video editors and DITs who need a “scratch drive” fast enough to support editing 4K video at 60fps while on location or traveling.

How are you making sure your products are scalable so people can grow either their storage or bandwidth needs (or both)?
The Fusion Thunderbolt 3 PCIe Flash Drive is currently available with 1TB capacity. With data transfer of up to 2,600 MB/s supported, most users will not run out of bandwidth when using this device.

What platforms do your systems connect to (Mac OS X, Windows, Linux, etc.)? And what differences might end-users notice when connecting on these different platforms?
Computers with Thunderbolt 3 ports running macOS Sierra or High Sierra, or Windows 10, are supported. The drive may be formatted to suit the user's needs, either with an OS-specific format such as HFS+ or with a cross-platform format such as exFAT.

Dementia 13: Helping enhance the horror with VFX

By Randi Altman

As scary movies are making a comeback and putting butts in seats, as they say, the timing couldn’t be better for NBC Universal’s remake of Dementia 13, a 1963 horror film directed by Francis Ford Coppola. The 2017 version, directed by Richard LeMay, can be streamed on all major VOD platforms. It focuses on a vengeful ghost, a mysterious murderer and a family with a secret. Jeremy Wanek was the lead VFX artist on Dementia 13, and Wayne Harry Johnson Jr. was the VFX producer. They are from Black Space VFX. We reached out to them with some questions.

Jeremy Wanek

How early did you get involved in Dementia 13?
Johnson: We were involved from the second or third draft of the script. Dan De Filippo, who wrote and produced the film, wanted our feedback immediately in terms of what was possible for VFX in the film. We worked with them through pre-production and even fielded a few questions during production. It is extremely important to start thinking about VFX immediately in any production. That way you can write for it and plan your shoot for it. There is nothing worse than a production hoping it can be fixed by VFX work. So getting us involved right away saves everyone a lot of time and money.

Wanek: During preproduction, it seems incredibly common for filmmakers to underestimate how many effects shots there will be on their films. They forget about the simple/invisible effects while concentrating on the bigger and flashier stuff. I don't blame them; it's nearly impossible to figure everything out ahead of time. There are always unexpected things that come up during production as well. Always.

For example, on Dementia 13 they shot in this really cool castle location, but they found out during production that they couldn't use as many of the practical blood effects as they had intended. They didn't want a bloody mess! So, we were asked to do more digital blood effects and enhancements.

Wayne Harry Johnson, Jr.

Were they open to suggestions from you or did they have a very specific idea of what they wanted?
Johnson: As in every production, there are always elements that are very specifically asked for, but director Richard LeMay is very collaborative. We discussed in great detail the look of all the important effects. And he was very open to suggestions and ideas. This was our second film with Rich. We also did the VFX work on his new film Blood Bound, and it has been a great creative relationship. We can’t wait to work with him again.

Wanek: Yeah, Rich has a vision for sure, but he always gives us creative freedom to explore options and see what we can come up with. I think that’s the best of both worlds.

How many shots did you provide?
Johnson: We did roughly 60 VFX shots for the film, and hopefully the audience won't notice any of them. If we do our jobs correctly, most VFX work is invisible. As in all films, there are little things that get cleaned up or straightened out. VFX isn't just about robots and explosions. A lot of it is about keeping the film looking the best it can by hiding the blemishes that could not be avoided during production.

So, again, it is important for filmmakers to consult with us on their film and to ask questions as they go. We all want the same thing for the film, and that is to make it the best it can be. Sometimes that means painting out a light switch or removing a sign from that beautiful shot of a road.

Wanek: It’s interesting to note how many shots were intended during preproduction and how many we ended up doing in post. I’d say we ended up doing at least twice as many shots, which is not uncommon. There are elements like the smoke on Kathleen, the ghost girl, when it’s hard to know exactly how many times you’re going to cut to a shot of her. Half of the effect shots for the movie involved creating her ghostly appearance.

Ghost girl Kathleen.

Can you describe the type of effects you provided on the show?
Wanek: We did muzzle flashes, wire removal, visible breath from characters in a cold environment, frost that encapsulates windows, digital hands that pull a character off a dock and into water (that included a digital water splash), the Kathleen ghost effect and an assortment of blood effects.

You created a lot of element effects, such as smoke, water, blood, etc. What was the hardest one to create and why?
Wanek: Creating the smoke that blankets Kathleen was the most challenging and time-consuming effect. There were about 30 shots of her in total, and I tackled them myself. With the quick turnaround on the film, that made for some long nights. Every action she performed, and each new camera angle, presented unique challenges. Thankfully, she doesn't move much in most of the shots. But for shots where she picks a gun up from the ground or walks across the room, I had to play with the physics to make the smoke behave more realistically, which takes time.

What tools did you use on this project?
Wanek: We composited in Adobe After Effects, tracked in Mocha AE, used Photoshop to assist in painting out objects/wire removal, and I relied heavily on Red Giant’s Trapcode Particular to create the particle effects — ghostly smoke, some of the blood effects and a digital water splash.

Our artists work remotely, so we stored the shots on Dropbox to easily send them out to other artists on the team, who would then download them to their own hard drives. Reviewing shots was a similar process: using Dropbox and emailing the director a link to stream or download. We kept shot names and progress info organized in a Google spreadsheet. This was great because we could update it live, and everyone was on the same page at all times.
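That spreadsheet approach boils down to a simple shot-status table. As a rough, hypothetical illustration (the shot names, columns and statuses below are invented, not the production's actual sheet), the same bookkeeping can be scripted in a few lines:

```python
# Hypothetical illustration of the shot-status bookkeeping described above,
# kept as a local CSV; shot names, columns and statuses are invented.
import csv

SHOTS_FILE = "dementia13_shots.csv"
FIELDS = ["shot", "artist", "status"]

def update_shot(shot: str, artist: str, status: str) -> None:
    """Rewrite the tracking sheet with a new status for one shot."""
    try:
        with open(SHOTS_FILE, newline="") as f:
            rows = [r for r in csv.DictReader(f) if r["shot"] != shot]
    except FileNotFoundError:
        rows = []
    rows.append({"shot": shot, "artist": artist, "status": status})
    with open(SHOTS_FILE, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

update_shot("kathleen_smoke_010", "jwanek", "sent for director review")
```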

CG hands.

Turnarounds are typically tight. Was that the case with Dementia 13, and if so, how did you make it work?
Johnson: Yes, we had roughly 30 days to complete the VFX work on the film. Tight deadlines can be hard, but we were aware of that when we went into it. What really helps with managing tight deadlines is all the upfront communication between us and the director. By the time we started, we knew exactly what Rich was looking for, so dialing it in was a much easier and faster process. We also previewed early cuts of the film so we could see and anticipate any potential problems ahead of time. Planning and preparation solve most problems, even when time is tight.

So, as I said, having VFX involved from the very beginning is essential. Bring us in early, even when it's just a treatment. We can get a sense of what needs to be done and how long it will take, and we can start estimating budgets. What makes tight deadlines hard is that lots of filmmakers think about VFX last, or very late in the process. Then, when they want it done fast, they have to compromise because the effect may not have been planned right. So, as you can see, we have a theme: call us early on.

Wanek: And as I mentioned earlier, unexpected things happen. The dreaded, “we’ll fix it in post,” is a real thing, unfortunately. Filmmakers need to make sure they have additional VFX budget for those surprises.

What was the most challenging part of the process?
Johnson: Each area can have its own challenges, but making anything move like liquid and look convincing is hard. We worked on some ghostly blood effects in the title sequence of the film that were difficult, but in the end we think they look great. It is a subtle plant letting the audience know there is a bit of supernatural action in this film. Ours is also a virtual company, meaning all of us work remotely, so communication internally and with clients can sometimes be a challenge, but in the end a quick phone call usually solves most problems. Again, more communication and earlier involvement alleviate a lot of issues.

CG blood spurts.

What’s next for you and your studio, and where are you based?
Wanek: We are based in Minneapolis, and just opened a second office in New York City. Wayne, myself and Adam Natrop are partners in the company. We’re currently in post production on a horror comedy zombie/hockey movie, Ahockalypse. It’s wackier than it sounds. It’s a lot of fun and pretty bold!

Wayne wrote and directed the film, and I edited it. We just handed it off to our sound designer and composer, and we are starting work on the VFX. We're hoping to finish before the year is up. We have several projects on the horizon that we can't say anything about yet, but we're excited!

Behind the Title: Framestore director of production & ops Sarah Hiddlestone

NAME: Sarah Hiddlestone

COMPANY: Framestore

CAN YOU DESCRIBE YOUR COMPANY?
Framestore is a BAFTA- and Oscar-winning visual effects studio. We produce visual content for any screen, from films and TV programs to theme park rides, large-scale installations and virtual/augmented/mixed realities.

WHAT’S YOUR JOB TITLE?
Director of Production & Operations

WHAT DOES THAT ENTAIL?
I oversee daily negotiation and communication, and I ensure that the New York office runs smoothly. I focus on creating an environment, studio culture and working process that allow teams to produce high-quality work on time and on budget. My role looks at the bigger picture, ensuring projects run as efficiently as possible. I'm constantly problem-solving and pushing to create the best working environment for our clients and creative talent.

Framestore

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Choosing soap.

WHAT’S YOUR FAVORITE PART OF THE JOB?
My talented production team and our talented artists — they are the life and soul of all the work we produce at Framestore.

WHAT’S YOUR LEAST FAVORITE?
Tantrums.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
The morning. I’m usually one of the first in, and I get a lot done as the office wakes up.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Living as a beach bum in Bali.

WHY DID YOU CHOOSE THIS PROFESSION?
I fell into this profession. I always loved animation, but I studied hospitality management; I thought I wanted to be a chef but hated the hours. Oh, the irony. I worked my way up from a PA, learning everything I know on the job. Along the way I've developed vital, in-depth knowledge of production, VFX, VR and emerging technology processes, and the ability to see Framestore as a global whole rather than at the individual office or project level.

Working in VFX has allowed me to travel the world, live in different cities (Sydney, New York, London) and meet a network of firm friends that span the globe. My VFX family. I am lucky to have worked at Framestore in both the London and NY offices.

Fantastic Beasts experience

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I am behind the scenes on most of the jobs that come out of the NY office. A standout for our New York office was last year's virtual school bus experience, Field Trip to Mars, with Lockheed Martin and McCann. It has gone on to win over 100 awards and truly showed the strength and diversity of our staff. More recently, we worked with multiple Academy Award winner Emmanuel "Chivo" Lubezki to visualize One Night for Absolut and BBH. Our New York office also collaborated with Framestore's film teams in London and Montreal to produce the Fantastic Beasts and Where to Find Them experience.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
My personal all-time favorite is the Chemical Brothers' The Salmon Dance, which I produced while working in Framestore's London office for Dom & Nic at Outsider. I also love The Tale of Three Brothers (an animated storybook within Harry Potter and the Deathly Hallows: Part 1). It is a stunning piece of work.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
There’s just one: my iPhone.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Pilates, boxing, sitting in silence, lots of slow breathing. Thinking “calm blue ocean.”

VFX company Kevin launches in LA

VFX vets Tim Davies, Sue Troyan and Darcy Parsons have partnered to open the Los Angeles-based VFX house, Kevin. The company is currently up and running in a temp studio in Venice, while construction is underway on Kevin’s permanent Culver City location, scheduled to open early next year.

When asked about the name, as none of the partners are actually named Kevin, Davies said, “Well, Kevin is always there for you! He’s your best mate and will always have your back. He’s the kind of guy you want to have a beer with whenever he’s in town. Kevin knows his stuff and works his ass off to make sure you get what you need and then some!” Troyan added, “Kevin is a state of mind.”

Davies is on board as executive creative director, overseeing the collective creative output of the company. Having led teams of artists for over 25 years, he was formerly at Asylum Visual Effects and The Mill as creative director and head of 2D. Among his works are multiple Cannes Gold Lion-winning commercials, including HBO’s “Voyeur” campaign for Jake Scott, Nike Golf’s Ripple for Steve Rogers, Old Spice’s Momsong for Steve Ayson, Old Spice’s Dadsong for Andreas Nilsson, and Old Spice’s Whale and Rocket Car for Steve Rogers.

Troyan will serve as Kevin's senior executive producer, having previously worked on campaigns at The Mill and Method. Parsons, owner and partner of Kevin, has enjoyed a career spanning multiple roles, including producer, VFX producer and executive producer.

Launch projects for Kevin include recent spots for Wieden + Kennedy Portland, The Martin Agency and Spark44.

Main Image: L-R: Darcy Parsons, Sue Troyan, Tim Davies

Zoic Studios adds feature film vet Lou Pecora as VFX supervisor

Academy Award-nominated Lou Pecora has joined Zoic Studios’ Culver City studio as VFX supervisor. Pecora has over two decades of experience in visual effects, working across commercial, feature film and series projects. He comes to Zoic from Digital Domain, where he spent 18 years working on large-scale feature film projects as a visual effects supervisor and compositing supervisor.

Pecora has worked on films including X-Men: Apocalypse, Spider-Man: Homecoming, X-Men: Days of Future Past (his Oscar nomination), Maleficent, Pirates of the Caribbean: At World's End, I, Robot, Transformers: Revenge of the Fallen, Transformers: Dark of the Moon, Star Trek, G.I. Joe: Retaliation, Stealth, The Mummy: Tomb of the Dragon Emperor, How the Grinch Stole Christmas, Flags of Our Fathers and Letters From Iwo Jima, among others.

“There has been a major shift in the television landscape, with a much greater volume of work and substantially higher production values than ever before,” says Pecora. “Zoic has their hands in a diverse range of high-end television projects, and I’m excited to bring my experience in the feature film space to this dynamic sector of the entertainment industry.”

The addition of Pecora comes on the heels of several high-profile projects at Zoic, including work on Darren Aronofsky’s thriller Mother!, Game of Thrones for HBO and Marvel’s The Defenders for Netflix.


postPerspective Impact Award winners from SIGGRAPH 2017

Last April, postPerspective announced the debut of our Impact Awards, celebrating innovative products and technologies for the post production and production industries that will influence the way people work. We are now happy to present our second set of Impact Awards, celebrating the outstanding offerings presented at SIGGRAPH 2017.

Now that the show is over, and our panel of VFX/VR/post pro judges has had time to decompress, dig out and think about what impressed them, we are happy to announce our honorees.

And the winners of the postPerspective Impact Award from SIGGRAPH 2017 are:

  • Faceware Technologies for Faceware Live 2.5
  • Maxon for Cinema 4D R19
  • Nvidia for OptiX 5.0  

“All three of these technologies are very worthy recipients of our first postPerspective Impact Awards from SIGGRAPH,” said Randi Altman, postPerspective’s founder and editor-in-chief. “These awards celebrate companies that define the leading-edge of technology while producing tools that actually make users’ working lives easier and projects better, and our winners certainly fall into that category.

“While SIGGRAPH’s focus is on VFX, animation, VR/AR and the like, the types of gear they have on display vary. Some are suited for graphics and animation, while others have uses that slide into post production. We’ve tapped real-world users in these areas to vote for our Impact Awards, and they have determined what tools might be most impactful to their day-to-day work. That’s what makes our awards so special.”

There were many new technologies and products at SIGGRAPH this year, and while only three won an Impact Award, our judges felt there were other updates people should know about as well.

Blackmagic Design’s Fusion 9 was certainly turning heads and Nvidia’s VRWorks 360 Video was called out as well. Chaos Group also caught our judges attention with V-Ray for Unreal Engine 4.

Stay tuned for future Impact Award winners in the coming months — voted on by users for users — from IBC.

Pixomondo streamlines compute management with Deadline

There’s never a dull moment at Pixomondo, where artists and production teams juggle feature film, TV, theme park and commercial VFX projects between offices in Toronto, Vancouver, Los Angeles, Frankfurt, Stuttgart, Shanghai and Beijing. The Academy- and Emmy-award-winning VFX studio securely manages its on-premises compute resources across its branches and keeps its rendering pipeline running 24/7 utilizing Thinkbox’s Deadline, which it standardized on in 2010.

In recent months, Pixomondo has increasingly been computing workstation tasks on its render farm via Deadline and has moved publishing to Deadline as well. Sebastian Kral, Pixomondo's global head of pipeline, says, "By offloading more to Deadline, we're able to accelerate production. Our artists don't have to wait for publishes to finish before they move on to the next task, and that's really something. Deadline's security is top-notch, which is extremely important for us given the secretive nature of some of our projects."

Kral is particularly fond of Deadline's Python API, which allows his global team to develop custom scripts that minimize the minutiae artists must deal with, resulting in a productivity boon. "Deadline gives us incredible flexibility. The Python API is fast, reliable and more usable than a command-line entry point, so we can script so many things on our own, which is convenient," says Kral. "We can build submission scripts for texture conversions, and create proxy data when a render job is done, so our artists don't have to think about whether or not they need a QT of a composite."
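To give a flavor of what such a submission script can look like, here is a minimal sketch against Deadline's Standalone Python API; the Web Service host, job settings and texture-conversion command below are illustrative assumptions, not Pixomondo's actual pipeline code:

```python
# Minimal sketch of a custom submission script using Deadline's Standalone
# Python API (not Pixomondo's actual pipeline code). The Web Service host,
# port and the texture-conversion command below are illustrative.
from Deadline.DeadlineConnect import DeadlineCon

con = DeadlineCon("deadline-webservice.example.com", 8082)  # hypothetical host

job_info = {
    "Name": "texture_conversion_shot010",
    "UserName": "pipeline",
    "Plugin": "CommandLine",   # run an arbitrary command on the farm
    "Frames": "0",
}
plugin_info = {
    # Hypothetical conversion tool and paths; swap in your own.
    "Executable": "/usr/local/bin/maketx",
    "Arguments": "/shots/010/tex/diffuse.exr -o /shots/010/tex/diffuse.tx",
}

new_job = con.Jobs.SubmitJob(job_info, plugin_info)
print("Submitted:", new_job)
```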

Power Rangers. Images courtesy of Pixomondo.

The ability to set environment variables for renders, or render as a specific user, allows Pixomondo’s artists to send tasks to the farm with an added layer of security. With seven facilities worldwide, and the possibility of new locations based on production needs, Pixomondo has also found Deadline’s ability to enable multi-facility rendering valuable.

“Deadline is packed with a ton of great out-of-the-box features, in addition to the new features that Thinkbox implements in new iterations; we didn’t even need to build our own submission tool, because Deadline’s submission capabilities are so versatile,” Kral notes. “It also has a very user-friendly interface that makes setup quick and painless, which is great for getting new hires up to speed quickly and connecting machines across facilities.”

Pixomondo’s more than 400 digital artists are productive around the clock, taking advantage of alternating time zones at facilities around the world. Nearly every rendering decision at the studio is made with Deadline in mind, as it presents rendering metrics in an intuitive way that allows the team to more accurately estimate project turnaround. “When opening Deadline to monitor a render, it’s always an enjoyable experience because all the information I need is right there at my fingertips,” says Kral. “It provides a meaningful overview of our rendering resource spread. We just log in, test renders, and we have all the information needed to determine how long each task will take using the available machines.”

Lucasfilm and ILM release open source MaterialX library

Lucasfilm and ILM have launched the first open source release of the MaterialX library for computer graphics. MaterialX is an open standard developed by Lucasfilm’s Advanced Development Group and ILM engineers to facilitate the transfer of rich materials and look-development content between applications and renderers.

Originated at Lucasfilm in 2012, MaterialX has been used by ILM on features including Star Wars: The Force Awakens and Rogue One: A Star Wars Story, as well as realtime immersive experiences such as Trials On Tatooine.

Workflows at computer graphics production studios require multiple software tools for different parts of the production pipeline, and shared and outsourced work requires companies to hand off fully look-developed models to other divisions or studios which may use different software packages and rendering systems.

MaterialX addresses the lack of a common, open standard for representing the data values and relationships required to transfer the complete look of a computer graphics model from one application or rendering platform to another, including shading networks, patterns and texturing, complex nested materials and geometric assignments. It provides a schema for describing material networks, shader parameters, texture and material assignments and color-space associations in a precise, application-independent and customizable way.
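To give a feel for the schema, here is a minimal sketch using the MaterialX Python bindings; the node and file names are illustrative, and exact API details can differ between MaterialX releases:

```python
# Minimal sketch of authoring a MaterialX document with the Python bindings.
# Node and file names are illustrative; exact calls vary by MaterialX release.
import MaterialX as mx

doc = mx.createDocument()

# A node graph holding a simple pattern network.
graph = doc.addNodeGraph("NG_example")
image = graph.addNode("image", "base_color_tex", "color3")  # texture lookup
output = graph.addOutput("base_color_out", "color3")
output.setConnectedNode(image)                              # wire node to output

# Validate, then serialize to the interchange file other tools can read.
valid, message = doc.validate()
print("valid:", valid, message)
mx.writeToXmlFile(doc, "example_look.mtlx")
```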

MaterialX is an open source project released under a modified Apache license.

Quick Chat: Filmmaker/DP/VFX artist Mihran Stepanyan

Veteran Armenian artist Mihran Stepanyan has an interesting background. In addition to being a filmmaker and cinematographer, he is also a colorist and visual effects artist. In fact, he won the 2017 Flame Award, which was presented to him during NAB in April.

Let’s find out how his path led to this interesting mix of expertise.

Tell us about your background in VFX.
I studied feature film directing in Armenia from 1997 through 2002. During the process, I also became very interested in being a director of photography. As a self-taught DP, I was shooting all my work, as well as films produced by my classmates and colleagues. This was great experience. Nearly 10 years ago, I started to study VFX because I had some projects that I wanted to do myself. I’ve fallen in love with that world. Some years ago, I started to work in Moscow as a DP and VFX artist for a Comedy Club Production special project. Today, I not only work as a VFX artist but also as a director and cinematographer.

How do your experiences as a VFX artist inform your decisions as a director and cinematographer?
They are closely connected. As a director, you imagine something that you want to see in the end, and you can realize that because you know what you can achieve in production and post. And, as a cinematographer, you know that if problems arise during the shoot, you can correct them in VFX and post. Experience in cinematography also complements VFX artistry, because your understanding of the physics of light and optics helps you create more realistic visuals.

What do you love most about your job?
The infinity of mind, fantasy and feelings. Also, I love how creative teams work. When a project starts, it’s fun to see how the different team members interact with one another and approach various challenges, ultimately coming together to complete the job. The result of that collective team work is interesting as well.

Tell us about some recent projects you’ve worked on.
I’ve worked on Half Moon Bay, If Only Everyone, Carpenter Expecting a Son and Doktor. I also recently worked on a tutorial for FXPHD that’s different from anything I’ve ever done before. It is not only the work of an Autodesk Flame artist or a lecturer, but also gave me a chance to practice English, as my first language is Armenian.

Mihran’s Flame tutorial on FXPHD.

Where do you get your inspiration?
First, nature. There's nothing more perfect to me. And I'm a picturalist, so for various projects I can find inspiration in any kind of art, from cave paintings to pictorial art and music. I'm also inspired by other artists' work, which helps me stay current with the latest VFX developments.

If you had to choose the project that you’re most proud of in your career, what would it be, and why?
I think every artist’s favorite project is his/her last project, or the one he/she is working on right now. Their emotions, feelings and ideas are very fresh and close at the moment. There are always some projects that will stand out more than others. For me, it’s the film Half Moon Bay. I was the DP, post production supervisor and senior VFX artist for the project.

What is your typical end-to-end workflow for a project?
It differs on each project. On some projects, I do everything from story writing to directing and digital intermediate (DI) finishing. On others, I only do editing or color grading.

How did you come to learn Flame?
During my work in Moscow, nearly five years ago, I had the chance to get a closer look at Flame and work on it. I’m a self-taught Flame artist, and since I started using the product it’s become my favorite. Now, I’m back in Armenia working on some feature films and upcoming commercials. I am also a member of Flame and Autodesk Maya Beta testing groups.

How did you teach yourself Flame? What resources did you use?
When I started to learn Flame, there weren’t as many resources and tutorials as we have now. It was really difficult to find training documentation online. In some cases, I got information from YouTube, NAB or IBC presentations. I learned mostly by experimentation, and a lot of trial and error. I continue to learn and experiment with Flame every time I work.

Any tips for using the product?
As for tips, "knowing" the software is not about understanding the tools or shortcuts, but about what you can do with your imagination. You should always experiment to find the shortest and easiest way to get the end result. Also, imagine ahead of time how you can construct your schematic without unnecessary nodes and tools. Exploring Flame is like mixing colors on a painter's palette to get the perfect tone. In the same way, you must imagine what tools you can "mix" together to get the result you want.

Any advice for other artists?
I would advise that you not be afraid of any task or goals, nor fear change. That will make you a more flexible artist who can adapt to every project you work on.

What’s next for you?
I don’t really know what’s next, but I am sure that it is a new beginning for me, and I am very interested where this all takes me tomorrow.

Atomic Fiction hires Marc Chu to lead animation department

Atomic Fiction has welcomed animation expert Marc Chu to lead the studio’s animation efforts across its Oakland and Montreal locations. Chu joins Atomic Fiction from ILM, where he most recently served as animation director, bringing more than 20 years of experience animating and supervising the creation of everything from aliens and spaceships to pirates and superheroes.

Based out of Atomic Fiction’s Oakland office, Chu will oversee animation company-wide and serve as the principal architect of initial studio production, including the expansion of Atomic Fiction’s previs and digital creature offerings. He’s already begun work on The Predator and is ramping up activity on an upcoming Robert Zemeckis feature.

“Atomic Fiction is already well-established and known for its seamless work in environments, so this is an amazing opportunity to be a part of their journey into doing more animation-driven work,” said Chu. “My goal is to help grow an already-strong animation department to the next level, becoming a force that is able to tackle any challenge, notably high-level creature and character work.”

Chu established and built his career at ILM, creating and supervising work for some of the biggest film franchises of the last 20 years. Starting with 2008's Iron Man, he worked to define the characters and animation through its sequel and the first two Avengers films. His extensive credits also include the Star Wars continuations The Force Awakens and Rogue One, as well as the original Pirates of the Caribbean trilogy, which earned Best VFX Oscar nominations and won for Pirates of the Caribbean: Dead Man's Chest.

Chu also has two VES Award wins for his Davy Jones CG character work.

Game of Thrones: VFX associate producer Adam Chazen

With excitement starting to build for the seventh season of HBO’s Game of Thrones, what better time to take a quick look back at last season’s VFX workflow. HBO associate VFX producer Adam Chazen was kind enough to spend some time answering questions after just wrapping Season 7.

Tell us about your background as a VFX associate producer and what led you to Game of Thrones.
I got my first job as a PA at VFX studio Pixomondo. I was there for a few years, working under my current boss Steve Kullback (visual effects producer on Game of Thrones). He took me with him when he moved to work on Yogi Bear, and then on Game of Thrones.

I’ve been with the show since 2011, so this is my sixth year on board. It’s become a real family at this point; lots of people have been on since the pilot.

From shooting to post, what is your role working on Game of Thrones?
As the VFX associate producer, in pre-production mode I assist with organizing our previs and concept work. I help run and manage our VFX database and I schedule reviews with producers, directors and heads of departments.

During production I make sure everyone has what they need on set in order to shoot for the various VFX requirements. Also during production, we start to post the show — I’m in charge of running review sessions with our VFX supervisor Joe Bauer. I make sure that all of his notes get across to the vendors and that the vendors have everything they need to put the shots together.

Season 7 has actually been the longest we’ve stayed on set before going back to LA for post. When in Belfast, it’s all about managing the pre-production and production process, making sure everything gets done correctly to make the later VFX adjustments as streamlined as possible. We’ll have vendors all over the world working on that next step — from Australia to Spain, Vancouver, Montreal, LA, Dublin and beyond. We like to say that the sun never sets on Game of Thrones.

What’s the process for bringing new vendors onto the show?
They could be vendors that we’ve worked with in the past. Other times, we employ vendors that come recommended by other people. We check out industry reels and have studios do testing for us. For example, when we have dragon work we ask around for vendors willing to run dragon animation tests for us. A lot of it is word of mouth. In VFX, you work with the people that you know will do great work.

What’s your biggest challenge in creating Game of Thrones?
We’re doing such complex work that we need to use multiple vendors. This can be a big hurdle. In general, whether it be film or TV, when you have multiple vendors working on the same shot, it becomes a potential issue.

Linking in with cineSync helps. We can have a vendor in Australia and a vendor in Los Angeles both working on the same shot, at exactly the same time. I first started using cineSync while at Pixomondo and found it makes the revision process a lot quicker. We send notes out to vendors, but most of the time it’s easier to get on cineSync, see the same image and draw on it.

Even the simple move of hovering a cursor over the frame can answer a million questions. We have several vendors who don’t use English as their first language, such as those in Spain. In these cases, communication is a lot easier via cineSync. By pointing to a single portion of a single frame, we completely bypass the language barrier. It definitely helps to see an image on screen versus just explaining it.

What is your favorite part of the cineSync toolkit?
We’ve seen a lot of cool updates to cineSync. Specifically, I like the notes section, where you can export a PDF to include whichever frame that note is attributed to.

Honestly, just seeing a cursor move on-screen from someone else’s computer is huge. It makes things so much easier to just point and click. If we’re talking to someone on the phone, trying to tell them about an issue in the upper left hand corner, it’s going to be hard to get our meaning across. cineSync takes away all of the guesswork.

Besides post, we also heavily use cineSync for shoot needs. We shoot the show in Northern Ireland, Iceland, Croatia, Spain and Calgary. With cineSync, we are able to review storyboards, previs, techvis and concepts with the producers, directors, HODs and others, wherever they are in the world. It’s crucial that everyone is on the same page. Being able to look at the same material together helps everyone get what they want from a day on set.

Is there a specific shot, effect or episode you’re particularly proud of?
The Battle of the Bastards — it was a huge episode. Particularly the first half, when Daenerys came in with her dragons at the battle of Meereen, showing those slavers who's boss. Meereen City itself was a large CG creation, which was unusual for Game of Thrones. We usually try to stay away from fully CG environments and like to get as much in-camera as possible.

For example, when the dragon breathes fire, we shot an actual flamethrower. Back in Season 5, we started to pre-animate the dragon, translate that animation to a motion-control rig and attach a flamethrower to it. It moves exactly how the dragon would move, giving us a practical element to use in the shot. CG fire can be done, but it's really tricky. Real is real, so you can't question it.

With multiple vendors working on the sequence, we had Rodeo FX do the environment while Rhythm & Hues did the dragons. We used cineSync a lot, reviewing shots between both vendors in order to point out areas of concern. Then in the second half of the episode, which was the actual Battle of the Bastards, the work was brilliantly done by Australian VFX studio Iloura.

Exceptional Minds: Autistic students learn VFX, work on major feature films

After graduation, these artists have been working on projects for Marvel, Disney, Fox and HBO.

By Randi Altman

With an estimated 1 in 68 children in the US being born with some sort of autism spectrum disorder, according to the Centers for Disease Control’s Autism and Developmental Disabilities Monitoring, I think it’s fair to say that most people have been touched in some way by a child on the spectrum.

As a parent of a teenager with autism, I can attest to the fact that one of our biggest worries, the thing that keeps us up at night, is the question of independence. Will he be able to make a living? Will there be an employer who can see beyond his deficits to his gifts and exploit those gifts in the best possible way?

Enter Exceptional Minds, a school in Los Angeles that teaches young adults with autism how to create visual effects and animation while working as part of a team. This program recognizes how bright these young people are and how focused they can be, surrounds them with the right teachers and behavioral therapists, puts the right tools in their hands and lets them fly.

The school, which also has a VFX and animation studio that employs its graduates, was started in 2011 by a group of parents who have children on the spectrum. “They were looking for work opportunities for their kids, and quickly discovered they couldn’t find any. So they decided to start Exceptional Minds and prepare them for careers in animation and visual effects,” explains Susan Zwerman, the studio executive producer at Exceptional Minds and a long-time VFX producer whose credits include Broken Arrow, Alien Resurrection, Men of Honor, Around the World in 80 Days and The Guardian.

Since the program began, these young people have had the opportunity to work on some very high-profile films and TV programs. Recent credits include Game of Thrones, The Fate of the Furious and Doctor Strange, which was nominated for an Oscar for visual effects this year.

We reached out to Zwerman to find out more about this school, its studio and how they help young people with autism find a path to independence.

The school came first and then the studio?
Yes. We started training them for visual effects and animation and then the conversation turned to, “What do they do when they graduate?” That led to the idea to start a visual effects studio. I came on board two years ago to organize and set it up. It’s located downstairs from the school.

How do you pick who is suitable for the program?
We can only take 10 students each year, and unfortunately, there is a waiting list because we are the only program of its kind anywhere. We have a review process that our educators and teachers have in terms of assessing the student’s ability to be able to work in this area. You know, not everybody can function working on a computer for six or eight hours. There are different levels of the spectrum. So the higher functioning and the medium functioning are more suited for this work, which takes a lot of focus.

Students are vetted by our teachers and behavioral specialists, who take into account the student’s ability, as well as their enthusiasm for visual effects and animation — it’s very intense, and they have to be motivated.

Susie Zwerman (in back row, red hair) with artists in the Exceptional Minds studio.

I know that kids on the spectrum aren't necessarily social butterflies. How do you teach them to work as a team?
Oh, that’s a really good question. We have what’s called our Work Readiness program. They practice interviewing, they practice working as a team, they learn about appearance, attitude, organization and how to problem solve in a work place.

A lot of it is all about working in a team, and developing their social skills. That’s something we really stress in terms of behavioral curriculum.

Can you describe how the school works?
It’s a three-year program. In the first year, they learn about the principles of design and using programs like Adobe’s Flash and Photoshop. In Flash, they study 2D animation and in Photoshop they learn how to do backgrounds for their animation work.

During year two, they learn how to work in a production pipeline. They are given a project that the class works on together, and then they learn how to edit using Adobe Premiere Pro and compositing on Adobe After Effects.

In the third year, they develop their skills in 3D via Autodesk Maya and in compositing with The Foundry's Nuke. They learn the way we work in the studio and our pipeline, while also preparing their portfolios for the workplace. At the end of the three years, each student completes their training with a demo reel and a resume of their work.

Who helps with the reels and resumes?
Their teachers supervise that process and help them with editing and picking the best pieces for their reel. Having a reel is important for many reasons. While many students will work in our studio for a year after graduation, I was able to place some directly into the work environment because their talent was so good… and their reel was so good.

What is the transition like from school to studio?
They graduate in June and we transition many of them to the studio, where they learn about deadlines and get paid for their work. Here, many experience independence for the first time. We do a lot of 2D-type visual effects clean-up work. We give them shots to work on and test them for the first month to see how they are doing. That’s when we decide if they need more training.

The visual effects side of the studio deals with paint work, wire and rod removal and tracker or marker removals — simple composites — plus a lot of rotoscoping and some greenscreen keying. We also do end title credits for the major movies.

We just opened the animation side of the studio in 2016, so it’s still in the beginning stages, but we’re doing 2D animation. We are not a 3D studio… yet! The 2D work we’ve done includes music videos, websites, PowerPoint presentations and some work for the LA Zoo. We are gearing up for major projects.

How many work in the studio?
Right now, we have about 15 artists at workstations in our current studio. Some of them will eventually be placed at outside studios; figuring out how much to expand over the next five years is part of our strategic planning.

Thanks to your VFX background, you have many existing relationships with the major studios. Can you talk about how that has benefitted Exceptional Minds?
We have had so much support from the studios; they really want to help us get work for the artists. We started out with Fox, then Disney and then HBO for television. Marvel Studios is one of our biggest fans. Marvel’s Victoria Alonso is a big supporter, so much so that we gave her our Ed Asner Award last June.

Once we started doing tracker marker removals and end title credits for Marvel, it opened doors. People say, “Well, if you work for Marvel, you could work for us.” She has been instrumental in our success.

What were the Fox and Marvel projects?
Our very first client was Fox and we did tracker removals for Dawn of the Planet of the Apes — that was about three years ago. Marvel happened about two years ago and our first job for them was on Avengers: Age of Ultron.

What are some of the other projects Exceptional Minds has worked on?
We worked on Doctor Strange, providing tracker marker removals and end credits. We worked on Ant-Man, Captain America: Civil War, Pete’s Dragon, Alvin & the Chipmunks: The Road Chip and X-Men: Apocalypse.

Thanks to HBO’s Holly Schiffer we did a lot of Game of Thrones work. She has also been a huge supporter of ours.

It’s remarkable how far you guys have come in a short amount of time. Can you talk about how you ended up at Exceptional Minds?
I used to be a DGA production manager/location manager and then segued into visual effects as a freelance VFX producer for all the major studios. About three years ago, my best friend Yudi Bennett, one of the founders of Exceptional Minds, convinced me to leave my career and come here to help set up the studio. I was also tasked with producing, scheduling and budgeting the work coming into the studio. For me, personally, this has been a spiritual journey. I have had such a good career in the industry, and this is my way of giving back.

So some of these kids move on to other places?
After they have worked in the studio for about a year, or sometimes longer, I look to have them placed at an outside studio. Some of them will stay here at our studio because they may not have the social skills to work on the outside.

Five graduates have been placed so far, and they are working full time at various production studios and visual effects facilities in Los Angeles. We have also had graduates in internships at Cartoon Network and Nickelodeon.

One student is at Marvel, and others are at Stargate Studios, Mr. Wolf and New Edit. To be able to place our artists on the outside is our ultimate goal. We love to place them because it’s sort of life-changing. For example, one of the first students we placed, Kevin, is at Stargate. He moved out of his parents’ apartment, he is traveling by himself to and from the studio, he is getting raises and he is moving up as a rotoscope artist.

What is the tuition like?
Students pay about 50 percent and we fundraise the other 50 percent. We also have scholarships for those who can’t afford it. We have to raise a lot of money to support the efforts of the school and studio.

Do companies donate gear?
When we first started, Adobe donated software. That’s how we were able to fund the school before the studio was up and running. Now we’re on an educational plan with them where we pay the minimum. Autodesk and The Foundry also give us discounts or try to donate licenses to us. In terms of hardware, we have been working with Melrose Mac, who is giving us discounts on computers for the school and studio.


Check out the Exceptional Minds website for more info.

The A-list — Kong: Skull Island director Jordan Vogt-Roberts

By Iain Blair

Plucky explorers! Exotic locations! A giant ape! It can only mean one thing: King Kong is back… again. This time, the new Warner Bros. and Legendary Pictures’ Kong: Skull Island re-imagines the origin of the mythic Kong in an original adventure from director Jordan Vogt-Roberts (The Kings of Summer).

Jordan Vogt-Roberts

With an all-star cast that includes Tom Hiddleston, Samuel L. Jackson, Oscar-winner Brie Larson, John Goodman and John C. Reilly, it follows a diverse team of explorers as they venture deep into an uncharted island in the Pacific — as beautiful as it is treacherous — unaware that they’re crossing into the domain of the mythic Kong.

The legendary Kong was brought to life on a whole new scale by Industrial Light & Magic, with two-time Oscar-winner Stephen Rosenbaum (Avatar, Forrest Gump) serving as visual effects supervisor.

To fully immerse audiences in the mysterious Skull Island, Vogt-Roberts, his cast and filmmaking team shot across three continents over six months, capturing its primordial landscapes on Oahu, Hawaii — where shooting commenced in October 2015 — on Australia’s Gold Coast and, finally, in Vietnam, where production took place across multiple locations, some of which had never before been seen on film. Kong: Skull Island was released worldwide in 2D, 3D and IMAX beginning March 10.

I spoke with Vogt-Roberts about making the film and his love of post.

What’s the eternal appeal of doing a King Kong movie?
He’s King Kong! But the appeal is also this burden, as you’re playing with film history and this cinematic icon of pop culture. Obviously, the 1933 film is this impeccable genre story, and I’m a huge fan of creature features and people like Ray Harryhausen. I liked the idea of taking my love for all that and then giving it my own point of view, my sense of style and my voice.

With just one feature film credit, you certainly jumped in the deep end with this — pun intended — monster production, full of complex moving parts and cutting-edge VFX. How scary was it?
Every movie is scary because I throw myself totally into it. I vanish from the world. If you asked my friends, they would tell you I completely disappear. Whether it’s big or small, any film is daunting in that sense. When I began doing shorts and my own stuff, I did the shooting, the lighting, the editing and so on, and I thrived on all that new knowledge, so even all the complex VFX stuff wasn’t that scary to me. The truly daunting part is that a film like this is two and a half years of your life! It’s a big sacrifice, but I love a big challenge like this.

What were the biggest challenges, and how did you prepare?
How do you make it special — and relevant in 2017? I’m a bit of a masochist when it comes to a challenge, and making the jump to The Kings of Summer really helped train me. But certain things are the same as they always are, such as there never being enough time, money or daylight. Then there are new things on a movie of this size, such as the sheer endurance you need, and things you simply can’t prepare yourself for, like the politics involved, all the logistics and so on. The biggest thing for me was, how do I protect my voice and point of view and make sure my soul is present in the movie when there are so many competing demands? I’m proud of it, because I feel I was able to do that.

How early on did you start integrating post and all the VFX?
Very early on — even before we had the script ready. We had concept artists and began doing previs and discussing all the VFX.

Did you do a lot of previs?
I’m not a huge fan of it. The Third Floor did it, and it’s a great tool for communicating what’s happening and how you’re going to execute it, but there’s also the danger of feeling like you’re already making the movie before you start shooting it. Think of all the great films like Blade Runner and the early Star Wars films, all shot before previs even existed, whereas now it’s very easy to become too reliant on it; you can see a movie sequence where it just feels like you’re watching previs come to life. It’s lost that sense of life and spontaneity. We only did three previs sequences — some only partially — and I really stressed with the crew that it was only a guide.

Where did you do the post?
It was all done at Pivotal in Burbank, and we began cutting as we shot. The sound mix was done at Skywalker and we did our score in London.

Do you like the post process?
I love post. I love all aspects of production, but post is where you write the film again, and where it ceases being what was on the page and what you wanted it to be. Instead you have to embrace what it wants to be and what it needs to be. I love repurposing things, changing things around and having those 3am breakthroughs! If we move this and use that shot instead, then we can cut all that.

You had three editors — Richard Pearson, Bob Murawski and Josh Schaeffer. How did that work?
Rick and Bob ran point, and Rick was the lead. Josh was the editor who had done The Kings of Summer with me, and my shorts. He really understands my montages and comedy. It was so great that Rick and Bob were willing to bring him on, and they’re all very different editors with different skills — and all masters of their craft. They weren’t on set, except for Hawaii. Once we were really globe-trotting, they were in LA cutting.

VFX play a big role. Can you talk about working on them with VFX supervisor Jeff White and ILM, who did the majority of the effects work?
He ran the team there, and they’re all amazing. It was a dream come true for me. They’re so good at taking kernels of ideas and turning them into reality. I was able to do revisions as I got new ideas. Creating Kong was the big one, and it was very tricky because the way he moves isn’t totally realistic. It’s very stylized, and Jeff really tapped into my animé and videogame sensibility for all that. We also used Hybride and Rodeo for some shots.

What was the hardest VFX sequence to do?
The helicopter sequence was very difficult, juggling the geography of it with this 100-foot creature and people spread all over the island. The final battle sequence was also tough. The VFX team and I constantly asked ourselves, “Have we seen this before? Is it derivative? Is it redundant?” The goal was to always keep it fresh and exciting.

Where did you do the DI?
At FotoKem with colorist Dave Cole, who worked on The Lord of the Rings and so many others. I love color, and we did a lot of very unusual stuff for a movie like this, with a lot of saturation.

Did the film turn out the way you hoped?
A movie never quite turns out the way you hope or think it will, but I love the end result and I feel it represents my voice. I’m very proud of what we did.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Recreating history for Netflix’s The Crown

By Randi Altman

If you, like me, binge-watched Netflix’s The Crown, you are now considerably better educated on the English monarchy, have a very different view of Queen Elizabeth, and were impressed with the show’s access to Buckingham Palace.

Well, it turns out they didn’t actually have access to the Palace. This is where London-based visual effects house One of Us came in. While the number of shots provided for the 10-part series varied, the average was 43 per episode.

In addition to Buckingham Palace, One of Us worked on photoreal digital set extensions, crowd replications and environments, including Downing Street and London Airport. The series follows a young Elizabeth, who inherits the crown after her father, King George VI, dies. We see her transition from a vulnerable young married woman to a more mature woman who takes her role as monarch very seriously.

We reached out to One of Us VFX supervisor Ben Turner to find out more.

How early did you join the production?
One of Us was heavily involved during an eight-month pre-production process, until shooting commenced in July 2015.

Ben Turner

Did they have clear vision of what they needed VFX vs. practical?
As we were involved from the pre-production stage, we were able to engage in discussions about how best to approach shooting the scenes with the VFX work in mind. It was important to us and the production that actors interacted with real set pieces and the VFX work would be “thrown away” in the background, not drawing attention to itself.

Were you on set?
I visited all the relevant locations, assisted on set by Jon Pugh, who gathered all the VFX data required. I attended all the recces at these locations and then supervised on the shoot days.

Did you do previs? If so, what software did you use?
We didn’t do much previs in the traditional sense. We did some tech-vis to help us figure out how best to film some things, such as the arrivals at the gates of Buckingham Palace and the Coronation sequence. We also did some concept images to help inform the shoot and design of some scenes. This work was all done in Autodesk Maya, The Foundry’s Nuke and Adobe Photoshop.

Were there any challenges in working in 4K? Did your workflow change at all, and how much of your work currently is in 4K?
Working in 4K didn’t really change our workflow too much. At One of Us, we are used to working on film projects that come in all different shapes and sizes (we recently completed work on Terrence Malick’s Voyage of Time in IMAX 5K), but for The Crown we invested in the infrastructure that enabled us to take it in our stride — larger and faster disks to hold the huge amounts of data, as well as a new 4K monitor to review all the work.
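For a rough sense of why 4K pushes storage this way, here is a quick back-of-envelope sketch. It assumes uncompressed 10-bit DPX plates at 24fps, a common interchange format; the show’s actual formats and data rates aren’t specified here, so the figures are illustrative only:

```python
# Back-of-envelope 4K storage math (illustrative assumptions, not
# The Crown's actual pipeline or delivery spec).
width, height = 4096, 2160                      # 4K frame
bytes_per_pixel = 4                             # 10-bit RGB packs into 32 bits in DPX
frame_bytes = width * height * bytes_per_pixel  # ~35 MB per frame
per_second = frame_bytes * 24                   # ~850 MB/s at 24 fps
per_hour = per_second * 3600                    # ~3 TB per hour of plates
print(f"{frame_bytes / 1e6:.0f} MB/frame, "
      f"{per_second / 1e6:.0f} MB/s, "
      f"{per_hour / 1e12:.1f} TB/hour")
```

At those rates, a multi-episode drama quickly justifies both the bigger disks and the faster ones Turner mentions.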


What were some of your favorite, or most challenging, VFX for the show?
The most challenging work was the kind of shots that many people are already very familiar with. So the Queen’s Coronation, for example, was watched by 20 million people in 1953, and with Buckingham Palace and Downing Street being two of the most famous and recognizable addresses in the world, there wasn’t really anywhere for us to hide!

Some of my favorite shots are the ones where we were recreating real events for which there are amazing archive references, such as the tilt down on the scaffolding at Westminster Abbey on the eve of the Coronation, or the unveiling of the statue of King George VI.


Can you talk about the tools you used, and did you create any proprietary tools during the workflow?
We used Enwaii and Maya for photogrammetry, Photoshop for digital matte painting and Nuke for compositing. For crowd replication we created our own in-house 2.5D tool in Nuke, which was a card generator that gave the artist a choice of crowd elements, letting them choose the costume, angle, resolution and actions required.
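To give a flavor of how a card generator like that might be wired up, here is a minimal hypothetical sketch in Nuke’s Python API. This is not One of Us’s actual in-house tool; the element list, knob choices and jitter values are all assumptions for illustration:

```python
# Hypothetical sketch of a 2.5D crowd card generator in Nuke
# (illustrative only, not One of Us's actual gizmo).
import random
import nuke

def build_crowd(elements, count=50, x_spread=2000, depth=400):
    """Scatter pre-shot crowd plates onto 3D cards feeding a Scene node.

    elements -- list of file paths to crowd plates, e.g. one per
                costume/angle/action combination the artist has chosen.
    """
    scene = nuke.nodes.Scene()
    for i in range(count):
        read = nuke.nodes.Read(file=random.choice(elements))
        card = nuke.nodes.Card()                  # one flat card per person
        card.setInput(0, read)
        # Jitter each card in X and push it back in Z so rows get parallax.
        card['translate'].setValue([
            random.uniform(-x_spread, x_spread),  # left/right placement
            0,                                    # feet stay on the ground plane
            -random.uniform(0, depth),            # distance from camera
        ])
        card['uniform_scale'].setValue(random.uniform(0.9, 1.1))
        scene.setInput(i, card)
    return scene
```

In the production tool, the costume, angle, resolution and action menus Turner describes would effectively decide which plates end up in that element list before the cards are scattered and rendered through a camera.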

What are you working on now?
We are currently hard at work on Season 2 of The Crown, which is going to be even bigger and more ambitious, so watch this space! Recent work also includes King Arthur: Legend Of The Sword (Warner Bros.) and Assassin’s Creed (New Regency).

Swedish post/VFX company Chimney opens in LA

Swedish post company Chimney has opened a Los Angeles facility, its first in the US and one of its 12 offices in eight countries. Founded in Stockholm in 1995, Chimney produces over 6,000 pieces for more than 60 countries each year, averaging 1,000 projects and 10,000 VFX shots. The company, which is privately held by 50 of its artists, is able to offer 24-hour service thanks to its many locations around the world.

When asked why Chimney decided to open an office in LA, founder Henric Larsson said, “It was not the palm trees and beaches that made us open up in LA. We’re film nerds and we want to work with the best talent in the world, and where do we find the top directors, DPs, ADs, CDs and producers if not in the US?”

The Chimney LA crew.

The Chimney LA team was busy from the start, working with Team One to produce two Lexus campaigns, including one that debuted during the Super Bowl. For the Lexus “Man & Machine” Super Bowl spot, they took advantage of the talent at sister facilities in Poland and Sweden.

Chimney also reports that it has signed with Shortlist Mgmt, joining other companies like RSA, Caviar, Tool and No6 Editorial. Charlie McBrearty, founding partner of Shortlist Mgmt, says that Chimney has “been on our radar for quite some time, and we are very excited to be part of their US expansion. Shortlist is no stranger to managing director Jesper Palsson, and we are thrilled to be reunited with him after our past collaboration through Stopp USA.”

Tools used for VFX include Autodesk’s Flame and Maya, The Foundry’s Nuke and Adobe After Effects. Audio is via Avid Pro Tools. Color is done in Digital Vision’s Nucoda. For editing, they call on Avid Media Composer, Apple Final Cut Pro and Adobe Premiere.

Quick Chat: Brent Bonacorso on his Narrow World

Filmmaker Brent Bonacorso has written, directed and created visual effects for The Narrow World, which examines the sudden appearance of a giant alien creature in Los Angeles and the conflicting theories on why it’s there, what its motivations are, and why it seems to ignore all attempts at human interaction. It’s told through the eyes of three people with differing ideas of its true significance. Bonacorso shot on a Red camera with Panavision Primo lenses, along with a bit of Blackmagic Pocket Cinema Camera for random B-roll.

Let’s find out more…

Where did the idea for The Narrow World come from?
I was intrigued by the idea of subverting the traditional alien invasion story and using that as a way to explore how we interpret the world around us, and how our subconscious mind invisibly directs our behavior. The creature in this film becomes a blank canvas onto which the human characters project their innate desires and beliefs — its mysterious nature revealing more about the characters than the actual creature itself.

As with most ideas, it came to me in a flash, a single image that defined the concept. I was riding my bike along the beach in Venice, and suddenly, in my head, I saw a giant kaiju as big as a skyscraper sitting on the sand, gazing out at the sea. Not directly threatening, not exactly friendly either, with a mutual understanding with all the tiny humans around it — we don’t really understand each other at all, and probably never will. Suddenly, I knew why he was here and what it all meant. I quickly sketched the image, and the story followed.

What was the process like bringing the film to life as an independent project?
After I wrote the script, I shot principal photography with producer Thom Fennessey in two stages: first with the actor who plays Raymond Davis (Karim Saleh) and then with the actress playing Emily Field (Julia Cavanaugh).

I called in a lot of favors from my friends and connections here in LA and abroad — the highlight was getting some amazing Primo lenses and equipment from Panavision because they love the work of our cinematographer, Magdalena Górka. Altogether it was about four days of principal photography, a good bit of it guerrilla style, and then shooting lots of B-roll all over the city.

Kacper Sawicki, head of Papaya Films, which represents me for commercial work in Europe, came on board during post production to help bring The Narrow World to completion. Friends of mine in Paris and Luxembourg designed and textured the creature, and I did the lighting and animation in Maxon Cinema 4D and the compositing in Adobe After Effects.

Our editor was the genius Jack Pyland, based in Dallas, who cut on Adobe Premiere. Sound design and color grading (via Digital Vision’s Nucoda) were completed by Polish companies Głośno and Lunapark, respectively. Our composer was Cedie Janson from Australia. So even though this was an indie project, it became an amazing global collaborative effort.

Of course, with any no-budget project like this, patience is key — lack of funds is offset by lots of time, which is free, if sometimes frustrating. Stick with it — directing is generally a war of attrition, and it’s won by the tenacious.

As a director, how did you pull off so much of the VFX work yourself, and what lessons do you have for other directors?
I realized early on in my career as a director that the more you understand about post, and the more you can do yourself, the more you can control the scope of the project from start to finish. If you truly understand the technology and what is possible with what kind of budget and what kind of manpower, it removes a lot of barriers.

I taught myself After Effects and Cinema 4D in graphic design school, and later I figured out how to make those tools work for me in visual effects and to stretch the boundaries of the short films I was making. It has proved invaluable in my career — in the early stages I did most of the visual effects in my work myself. Later on, when I began having VFX companies do the work, my knowledge and understanding of the process enabled me to communicate very efficiently with the artists on my projects.

What other projects do you have on the horizon?
In addition to my usual commercial work, I’m very excited about my first feature project coming up this year through Awesomeness Films and DreamWorks — You Get Me, starring Bella Thorne and Halston Sage.