
A VFX pro on avoiding the storage ‘space trap’

By Adam Stern

Twenty years is an eternity in any technology-dependent industry. Over the course of two-plus decades of visual effects facility ownership, changing standards, demands, capability upgrades and staff expansion have seen my company, Vancouver-based Artifex Studios, usher in several distinct eras of storage, each with its own challenges. As we’ve migrated to bigger and better systems, one lesson we’ve learned has proven critical to all aspects of our workflow.

Adam Stern

In the early days, Artifex used off-the-shelf hard drives and primitive RAIDs for our storage needs, which brought with it slow transfer speeds and far too much downtime when loading gigabytes of data on and off the system. We barely had any centralized storage, and depended on what was essentially a shared network of workstations — which our then-small VFX house could get away with. Even considering where we were then, which was sub-terabyte, this was a messy problem that needed solving.

We took our first steps into multi-TB NAS using off-the-shelf solutions from companies like Buffalo. This helped ease our looming storage-space crunch but brought new issues, including frequent breakdowns that cost us vital time and lost iterations — even with plenty of space. I recall a particular feature film project we had to deliver right before Christmas. It almost killed us. Our NAS crashed and wouldn’t allow us to pull final shots, while throwing endless error messages our way. I found myself frantically hot-wiring spare drives to enable us to deliver to our client. We made it, but barely.

At that point it was clear a change was needed. We started using a solution that Annex Pro — a Vancouver-based VAR we’d been working with for years — helped put into place. Unfortunately, the company behind that system was later acquired and eventually disappeared altogether.

Our senior FX TD, Stanislav Enilenis, who was also handling IT for us back then, worked with Annex to install the new system. According to Stan, “the switch allowed bandwidth for expansion. However, when we would be in high-production mode, bandwidth became an issue. While the system was an overall improvement from our first multi-terabyte NAS, we had issues. The company was bought out, so getting drives became problematic, parts became harder to source and there were system failures. When we hit top capacity, with the then 20-plus staff all grinding, the system would slow to a crawl and our artists spent more time waiting than working.”

Artifex machine room.

As we transitioned from SD to HD, and then to 4K, our delivery requirements increased along with our rendering demands, causing severe bottlenecks in the established setup. We needed a better solution but options were limited. We were potentially looking at a six-figure investment, in a system not geared to M&E.

In 2014, Artifex was working on the TV series Continuum, which had fairly massive 3D requirements on an incredibly tight turnaround. It was time to make a change. After a number of discussions with Annex, we made the decision to move to an offering from a new company called Qumulo, which provided above-and-beyond service, training and setup. When we expanded into our new facility, Qumulo helped properly move the tech. Our new 48TB pipeline flowed freely and offered features we didn’t previously have, and Qumulo were constantly adding new and requested updates.

Laila Arshid, our current IS manager, has found this to be particularly valuable. “In Qumulo’s dashboard I can see realtime analytics of everything in the system. If we have a slowdown, I can track it to specific workstations and address any issues. We can shut that workstation or render-node down or reroute files so the system stays fast.”
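The kind of per-client triage Arshid describes can be sketched in a few lines. This is a hypothetical illustration only — the per-workstation throughput dictionary and the `flag_slow_clients` helper are assumed shapes for the example, not Qumulo’s actual analytics API:

```python
# Hypothetical sketch: flag workstations whose storage throughput has
# collapsed relative to the fleet median. The input format is an assumption,
# not Qumulo's actual API.
from statistics import median

def flag_slow_clients(throughput_mbps, ratio=0.25):
    """Return client names moving data below `ratio` of the fleet median."""
    if not throughput_mbps:
        return []
    med = median(throughput_mbps.values())
    return sorted(
        name for name, mbps in throughput_mbps.items()
        if med > 0 and mbps < med * ratio
    )

stats = {"ws-01": 480.0, "ws-02": 510.0, "render-07": 35.0, "ws-03": 495.0}
print(flag_slow_clients(stats))  # → ['render-07']
```

Once a client is flagged this way, the operator can do exactly what Arshid describes: shut that workstation or render node down, or reroute its files, so the rest of the system stays fast.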

The main lesson we’ve learned throughout every storage system change or upgrade is this: It isn’t just about having a lot of space. That’s an easy trap to fall into, especially today when we’re seeing skyrocketing demands from 4K+ workflows. You can have unlimited storage, but if you can’t utilize it efficiently and at speed, your storage space becomes irrelevant.

In our industry, the number of iterations we can produce has a dramatic impact on the quality of work we’re able to provide, especially with today’s accelerated schedules. One less pass can mean work with less polish, which isn’t acceptable.

Artifex provided VFX for Faster Than Light

Looking forward, we’re researching extended storage on the cloud: an ever-expanding storage pool with the advantages of fast local infrastructure. We currently use GCP for burst rendering with Zync, along with nearline storage, which has been fantastic — but the next step will be integrating these services with our daily production processes. That brings a number of new challenges, including how to combine local and cloud-based rendering and storage in ways that are seamless to our team.

Constantly expanding storage requirements, along with maintaining the best possible speed and efficiency to allow for artist iterations, are the principal drivers for every infrastructure decision at our company — and should be a prime consideration for everyone in our industry.


Adam Stern is the founder of Vancouver, British Columbia’s Artifex. He says the studio’s main goal is to heighten the VFX experience, both artistically and technically, and collaborate globally with filmmakers to tell great stories.

Post production in the cloud

By Adrian Pennington

After years of talk, using the cloud for the full arsenal of post workflows is finally possible today, with huge ramifications for the facilities business.

Rendering frames for visual effects requires an extraordinary amount of compute power, for which VFX studios have historically assigned whole rooms full of servers to act as their renderfarm. As visual quality has escalated, most vendors have either had to limit the scope of their projects or buy or rent new machines on-premises to cope with the extra rendering needed. In recent times this has been upended, as cloud networking has enabled VFX shops to relieve internal bottlenecks by scaling up, and then contracting, at will.

The cloud rendering process has become so established that even this once-groundbreaking capability has evolved to encompass a whole host of post workflows, from previz to transcoding. In the process, the conventional business model for post is being uprooted and reimagined.

“Early on, global facility powerhouses first recognized how access to unlimited compute and remote storage could empower the creative process to reach new heights,” explains Chuck Parker, CEO of Sohonet. “Despite spending millions of dollars on hardware, the demands of working on multiple, increasingly complex projects simultaneously, combined with decreasing timeframes, stretched on-premise facilities to their limits.”

Chuck Parker

Public cloud providers (Amazon Web Services, Google Cloud Platform, Microsoft Azure) changed the game by solving space, time and capacity problems for resource-intensive tasks. “Sohonet Fastlane and Google Compute Engine, for example, enabled MPC to complete The Jungle Book on time and to Oscar-winning standards, thanks to being able to run millions of core hours in the cloud,” notes Parker.

Small- to mid-sized companies followed suit. “They lacked the financial resources and the physical space of larger competitors, and initially found themselves priced out of major studio projects,” says Parker. “But by accessing renderfarms in the cloud they can eliminate the cost and logistics of installing and configuring physical machines. Flexible pricing and the option of preemptible instances mean only paying for the compute power used, further minimizing costs and expanding the scope of possible projects.”
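The pricing logic Parker describes is simple arithmetic. As a back-of-envelope illustration — the core-hour count and hourly rates below are assumptions for the sake of the example, not any provider’s actual prices:

```python
# Back-of-envelope render-farm cost comparison. The hourly rates and the
# core-hour count are illustrative assumptions, not real provider pricing.
def render_cost(core_hours, rate_per_core_hour):
    """Total cost when you pay only for the compute you consume."""
    return core_hours * rate_per_core_hour

CORE_HOURS = 250_000   # assumed rendering load for a mid-sized sequence
ON_DEMAND = 0.04       # $/core-hour, assumed
PREEMPTIBLE = 0.01     # $/core-hour, assumed (preemptible instances are
                       # typically offered at a steep discount)

print(f"on-demand:   ${render_cost(CORE_HOURS, ON_DEMAND):,.0f}")
print(f"preemptible: ${render_cost(CORE_HOURS, PREEMPTIBLE):,.0f}")
```

Under these assumed rates the same job costs $10,000 on demand versus $2,500 on preemptible capacity — the kind of gap that changes which projects a small shop can bid on.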

Milk VFX did just this when rendering the complex sequences on Adrift. Without the extra horsepower, the London-based house could not have bid on the project in the first place.

“The technology has now evolved to a point where any filmmaker with any VFX project or theatrical, TV or spot editorial can call on the cloud to operate at scale when needed — and still stay affordable,” says Parker. “Long anticipated and theorized, the ability to collaborate in realtime with teams in multiple geographic locations is a reality that is altering the post production landscape for enterprises of all sizes.”

Parker says the new post model might look like this. He uses the example of a company headquartered in Berlin — “an innovative company might employ only a dozen managers and project supervisors on its books. They can bid with confidence on jobs of any scale and any timeframe knowing that they can readily rent physical space in any location, anywhere in the world, to flexibly take advantage of tax breaks and populate it with freelance artists: 100 one week, say, 200 in week three, 300 in week five. The only hardware (rental) costs would be thin-client workstations and Wacom tablets, plus software licenses for 3D, roto, compositing and other key tools. With the job complete, the whole infrastructure can be smoothly scaled back.”

The compute costs of spinning up cloud processing and storage can be modelled into client pitches. “But building out and managing such connectivity independently may still require considerable CAPEX — an outlay that might be cost-prohibitive if you only need the infrastructure for short periods,” notes Parker. “Cloud-compute resources are perfect for spikes in workload but, in between those spikes, paying for bandwidth you don’t need will hurt the bottom line.”

Dedicated, “burstable” connectivity speeds of 100Mbit/s up to 50Gbit/s with flexibility, security and reliability are highly desirable attributes for the creative workflow. Price points, as ever, are a motivating concern. Sohonet’s offerings, says Parker, “move your data away from Internet bandwidth, removing network congestion and decreasing the time it takes to transfer your data. With a direct link to the major cloud provider of your choice, customers can be in control of how their data is routed, leading to a more consistent network experience.”
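To see why the link speed matters so much across that 100Mbit/s-to-50Gbit/s range, consider rough wall-clock transfer times. The 4TB payload and the 80% usable-throughput figure below are assumptions for illustration:

```python
# Rough transfer-time estimate for moving footage over a dedicated link.
# Payload size and the 80% usable-throughput factor are assumed values.
def transfer_hours(size_tb, link_gbps, efficiency=0.8):
    """Wall-clock hours to move `size_tb` TB over a `link_gbps` Gbit/s link."""
    bits = size_tb * 8e12                       # 1 TB = 8e12 bits (decimal)
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

for gbps in (0.1, 1, 10, 50):
    print(f"{gbps:>5} Gbit/s: {transfer_hours(4, gbps):7.2f} h")
```

Under these assumptions, 4TB takes over 100 hours at 100Mbit/s but just minutes at 50Gbit/s — which is why burstable dedicated links, rather than general Internet bandwidth, make cloud workflows practical.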

“Direct links into major studios like Pinewood UK open up realtime on-set CGI rendering with live-action photography for virtual production scenarios,” adds Parker. “It is vital that your data transits straight to the cloud and never touches the Internet.”

With file sizes set to continue increasing exponentially over the next few years as 4K and HDR become standard and new immersive media like VR emerge into the mainstream, leveraging the cloud will not only be routine for the highest-budget projects and largest vendors, it will become the new post production paradigm. In the cloud, creative workflows are demystified and democratized.

Video editing and VFX app HitFilm gets an upgrade

FXhome has upgraded its video editing and VFX software app. The new HitFilm Version 11.0 features Surface Studio, a new VFX plugin modeled from Video Copilot’s Damage and Decay and Cinematic Titles tutorials. Based on customer requests, this new VFX tool enables users to create smooth or rough-textured metallic and vitreous surfaces on any text or layer. By dropping a transparent PNG file onto the HitFilm timeline, users can instantly turn text titles into weathered, rusty and worn metallic signs.

HitFilm’s Surface Studio also joins FXhome’s expanding library of VFX plugins, Ignite Pro. This set of plugins is available on Mac and PC platforms, and is compatible with 10 of the most-used host applications, including Adobe Creative Cloud, Apple Final Cut Pro X, Avid, DaVinci Resolve and others.

Last month, FXhome added to its product family with Imerge Pro, a non-destructive RAW image compositor with fully flexible layers and advanced keying for content creators. FXhome is also integrating a number of Imerge Pro plugins with HitFilm, including Exposure, Outer Glow, Inner Glow and Dehaze. New Imerge Pro plugins are tightly integrated with HitFilm V.11.0’s interface ensuring smooth, uninterrupted workflows.

Minimum system requirements for Apple are: macOS 10.13 High Sierra, macOS 10.12 Sierra or OS X 10.11 El Capitan. For Windows: Microsoft Windows 10 (64-bit) or Microsoft Windows 8 (64-bit).

HitFilm 11.0 is available immediately from the FXhome store for $299. FXhome is also celebrating this holiday season with its annual sale. Through December 4, 2018, they are offering a 33% discount when users purchase the FXhome Pro Bundle, which includes HitFilm 11.0, Action, Ignite and Imerge.

Post studio Nomad adds Tokyo location

Creative editorial/VFX/sound design company Nomad has expanded its global footprint with a space in Tokyo, adding to a network that also includes offices in New York, Los Angeles and London. It will be led by managing director Yoshinori Fujisawa and executive producer Masato Midorikawa.

The Tokyo office has three client suites, an assistant support suite, production office and machine room. The tools for post workflow include Adobe Creative Cloud (Premiere, After Effects, Photoshop), Flame, Flame Assist, Avid Pro Tools and other various support tools.

Nomad partner/editor Glenn Martin says the studio often works with creatives who regularly travel between LA and Tokyo. He says Nomad will support the new Tokyo-based group with editors and VFX artists from its other offices whenever larger teams are needed.

“The role of a post production house is quite different between the US and Japan,” say Fujisawa and Midorikawa, jointly. “Although people in Japan are starting to see the value of the Western-style post production model, it has not been properly established here yet. We are able to give our Japanese directors and creatives the ability to collaborate with Nomad’s talented editors and VFX artists, who have great skills in storytelling and satisfying the needs of brands. Nomad has a comprehensive post-production workflow that enables the company to execute global projects. It’s now time for Japan to experience this process and be a part of the future of global advertising.”

Main Image: (L-R) Yoshinori Fujisawa and Masato Midorikawa

MPC Film provides VFX for new Predator movie

From outer space to suburban streets, the hunt comes to Earth in director Shane Black’s reinvention of the Predator series. Now, the universe’s most lethal hunters are stronger, smarter and deadlier than ever before, having genetically upgraded themselves with DNA from other species.

When a young boy accidentally triggers their return to Earth, only a crew of ex-soldiers and an evolutionary biology professor can prevent the end of the human race.


MPC Film’s team, led by VFX supervisors Richard Little and Arundi Asregadoo, created 500 shots for this new Predator movie. The majority of the work required hero animation for the Upgrade Predator; additional work included the Predator dogs, FX simulations and a CG swamp environment.

MPC Film’s character lab department modeled, sculpted and textured the Upgrade and Predator dogs to a high level of detail, allowing the director to have flexibility to view closeups, if needed. The technical animation department applied movement to the muscle system and added the flowing motion to the dreadlocks on all of the film’s hero alien characters, an integral part of the Predator’s aesthetic in the franchise.

MPC’s artists also created photorealistic Predator One and digital double mercenary characters for the film. Sixty-four other assets were created, ranging from stones and sticks for the swamp floor to grenades, a grenade launcher and bombs.

MPC’s artists also worked on the scene where we first meet the Upgrade’s dogs, which are sent out to hunt the Predator. The sequence was shot on a bluescreen stage on location. The digital environments team built a 360-degree baseball field matching the location shoot from reference photography. Creating simple geometry and re-projecting the textures helped create a realistic environment.
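The re-projection technique described here boils down to the pinhole-camera model: for each point on the simple stand-in geometry, find the pixel it lands on in the reference photograph and sample the texture there. A minimal sketch of that projection step — the focal length and principal point are made-up values, not MPC’s actual camera data:

```python
# Minimal pinhole-camera sketch of texture re-projection: project a
# camera-space 3D point into the reference photo to find which pixel to
# sample. Intrinsics here are illustrative, made-up values.
def project(point_3d, focal_px, center_px):
    """Project a camera-space 3D point (x, y, z) to pixel coordinates."""
    x, y, z = point_3d
    u = focal_px * x / z + center_px[0]
    v = focal_px * y / z + center_px[1]
    return (u, v)

# A point one meter right and half a meter up, ten meters from the camera:
print(project((1.0, 0.5, 10.0), focal_px=2000, center_px=(960, 540)))
```

Running this projection over every vertex of the stand-in geometry is, in essence, how a photographed plate gets "wrapped" back onto simple 3D shapes to build a convincing 360-degree environment.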

Once the Upgrade tracks down the “fugitive” Predator the fight begins. To create the scene, MPC used a mixture of the live-action Predator intercut with its full CG Predator. The battle culminates with the Upgrade ripping the head and spine away from the body of the fugitive. This shot was a big challenge for the FX and tech animation team, who also added green Predator blood into the mix, amplifying the gore factor.

In the hunt scene, the misfit heroes trap the Upgrade and set the ultimate hunter alight. This sequence was technically challenging for the animation, lighting and FX team, which worked very closely to create a convincing Upgrade that appeared to be on fire.

For the final battle, MPC Film’s digital environments artists created a full CG swamp where the Upgrade’s ship crash-lands. The team was tasked with matching the set and creating a 360-degree CG set extension with water effects.

A big challenge was how to ensure the Upgrade Predator interacted realistically with the actors and set. The animation, tech animation and FX teams worked hard to make the Upgrade Predator fit seamlessly into this environment.

London design, animation studio Golden Wolf sets up shop in NYC

Animation studio Golden Wolf, headquartered in London, has launched its first stateside location in New York City. The expansion comes on the heels of an alliance with animation/VFX/live-action studio Psyop, a minority investor in the company. Golden Wolf now occupies studio space in SoHo adjacent to Psyop and its sister company Blacklist, which formerly represented Golden Wolf stateside and was instrumental to the relationship.

Among the year’s highlights from Golden Wolf are an integrated campaign for Nike FA18 Phantom (client direct), a spot for the adidas x Parley Run for the Oceans initiative (TBWA Amsterdam) in collaboration with Psyop, and Marshmello’s Fly music video for Disney. Golden Wolf also received an Emmy nomination for its main title sequence for Disney’s Ducktales reboot.

Heading up Golden Wolf’s New York office are two transplants from the London studio, executive producer Dotti Sinnott and art director Sammy Moore. Both joined Golden Wolf in 2015, Sinnott from motion design studio Bigstar, where she was a senior producer, and Moore after a run as a freelance illustrator/designer in London’s agency scene.

Sinnott comments: “Building on the strength of our London team, the Golden Wolf brand will continue to grow and evolve with the fresh perspective of our New York creatives. Our presence on either side of the Atlantic not only brings us closer to existing clients, but also positions us perfectly to build new relationships with New York-based agencies and brands. On top of this, we’re able to use the time difference to our advantage to work on faster turnarounds and across a range of budgets.”

Founded in 2013 by Ingi Erlingsson, the studio’s executive creative director, Golden Wolf is known for youth-oriented work — especially content for social media, entertainment and sports — that blurs the lines of irreverent humor, dynamic action and psychedelia. Erlingsson was once a prolific graffiti artist and, later, illustrator/designer and creative director at U.K.-based design agency ilovedust. Today he inspires Golden Wolf’s creative culture and disruptive style fed in part by a wave of next-gen animation talent coming out of schools such as Gobelins in France and The Animation Workshop in Denmark.

“I’m excited about our affiliation with Psyop, which enjoys an incredible legacy producing industry-leading animated advertising content,” Erlingsson says. “Golden Wolf is the new kid on the block, with bags of enthusiasm and an aim to disrupt the industry with new ideas. The combination of the two studios means that we are able to tackle any challenge, regardless of format or technical approach, with the support of some of the world’s best artists and directors. The relationship allows brands and agencies to have complete confidence in our ability to solve even the biggest challenges.”

Golden Wolf’s initial work out of its New York studio includes spots for Supercell (client direct) and Bulleit Bourbon (Barton F. Graf). Golden Wolf is represented in the US market by Hunky Dory for the East Coast, Baer Brown for the Midwest and In House Reps for the West Coast. Stink represents the studio for Europe.

Main Photo: (L-R) Dotti Sinnott, Ingi Erlingsson and Sammy Moore.

Our SIGGRAPH 2018 video coverage

SIGGRAPH is always a great place to wander around and learn about new and future technology. You can see amazing visual effects reels and learn how the work was created by the artists themselves. You can get demos of new products, and you can immerse yourself in a completely digital environment. In short, SIGGRAPH is educational and fun.

If you weren’t able to make it this year, or attended but couldn’t see it all, we would like to invite you to watch our video coverage from the show.

SIGGRAPH 2018

postPerspective Impact Award winners from SIGGRAPH 2018

postPerspective has announced the winners of our Impact Awards from SIGGRAPH 2018 in Vancouver. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and professionals. It’s working pros who are going to be using new tools — so we let them make the call.

The awards honor innovative products and technologies for the visual effects, post production and production industries that will influence the way people work. They celebrate companies that push the boundaries of technology to produce tools that accelerate artistry and actually make users’ working lives easier.

While SIGGRAPH’s focus is on VFX, animation, VR/AR, AI and the like, the types of gear exhibitors have on display vary. Some products are suited for graphics and animation, while others have uses that slide into post production, which makes these SIGGRAPH Impact Awards doubly interesting.

The winners are as follows:

postPerspective Impact Award — SIGGRAPH 2018 MVP Winner:

They generated a lot of buzz at the show, as well as a lot of votes from our team of judges, so our MVP Impact Award goes to Nvidia for its Quadro RTX raytracing GPU.

postPerspective Impact Awards — SIGGRAPH 2018 Winners:

  • Maxon for its Cinema 4D R20 3D design and animation software.
  • StarVR for its StarVR One headset with integrated eye tracking.

postPerspective Impact Awards — SIGGRAPH 2018 Horizon Winners:

This year we have started a new Impact Award category. Our Horizon Award celebrates the next wave of impactful products being previewed at a particular show. At SIGGRAPH, the winners were:

  • Allegorithmic for its Substance Alchemist tool powered by AI.
  • OTOY and Epic Games for their OctaneRender 2019 integration with Unreal Engine 4.

And while these products and companies didn’t win enough votes for an award, our voters believe they do deserve a mention and your attention: Wrnch, Google Lightfields, Microsoft Mixed Reality Capture and Microsoft Cognitive Services integration with PixStor.


Artifex provides VFX limb removal for Facebook Watch’s Sacred Lies

Vancouver-based VFX house Artifex Studios created CG amputation effects for the lead character in Blumhouse Productions’ new series for Facebook Watch, Sacred Lies. In the show, the lead character, Minnow Bly (Elena Kampouris), emerges after 12 years in the Kevinian cult missing both of her hands. Artifex was called on to remove the actress’ limbs.

VFX supervisor Rob Geddes led Artifex’s team who created the hand/stump transposition, which encompassed 165 shots across the series. This involved detailed paint work to remove the real hands, while Artifex 3D artists simultaneously performed tracking and match move in SynthEyes to align the CG stump assets to the actress’ forearm.

This was followed up with some custom texture and lighting work in Autodesk Maya and Chaos V-Ray to dial in the specific degree of scarring or level of healing on the stumps, depending on each scene’s context in the story. While the main focus of Artifex’s work was on hand removal, the team also created a pair of severed hands for the first episode after rubber prosthetics didn’t pass the eye test. VFX work was run through Side Effects Houdini and composited in Foundry’s Nuke.

“The biggest hurdle for the team during this assignment was working with the actress’ movements and complex performance demands, especially the high level of interaction with her environment, clothing or hair,” says Adam Stern, founder of Artifex. “In one visceral sequence, Rob and his team created the actual severed hands. These were originally shot practically with prosthetics, however the consensus was that the practical hands weren’t working. We fully replaced these with CG hands, which allowed us to dial in the level of decomposition, dirt, blood and torn skin around the cuts. We couldn’t be happier with the results.”

Geddes adds, “One interesting thing we discovered when wrangling the stumps, is that the logical and accurate placement of the wrist bone of the stumps didn’t necessarily feel correct when the hands weren’t there. There was quite a bit of experimentation to keep the ‘hand-less’ arms from looking unnaturally long, or thin.”

For Episode 101, Artifex also created a scene of absolute devastation in a burnt forest, with matte painting and set extensions depicting extensive fire damage that couldn’t safely be achieved on set. Artifex fell back on its experience in environmental VFX creation, tying matte paintings and projections together with ample rotoscope work.

Approximately 20 Artifex artists took part in Sacred Lies, spanning 3D, compositing, matte painting, I/O and production roles.

Watch Artifex founder Adam Stern talk about the show from the floor of SIGGRAPH 2018:

Sony Pictures Post adds three theater-style studios

Sony Pictures Post Production Services has added three theater-style studios inside the Stage 6 facility on the Sony Pictures Studios lot in Culver City. All studios feature mid-size theater environments and include digital projectors and projection screens.

Theater 1 is set up for sound design and mixing with two Avid S6 consoles and immersive Dolby Atmos capabilities, while Theater 3 is geared toward sound design with a single S6. Theater 2 is designed for remote visual effects and color grading review, allowing filmmakers to monitor ongoing post work at other sites without leaving the lot. Additionally, centralized reception and client services facilities have been established to better serve studio sound clients.

Mix Stage 6 and Mix Stage 7 within the sound facility have been upgraded, each featuring two S6 mixing consoles, six Pro Tools digital audio workstations, Christie digital cinema projectors, 24×13 projection screens and a variety of support gear. The stages will be used to mix features and high-end television projects. The new resources add capacity and versatility to the studio’s sound operations.

Sony Pictures Post Production Services now has 11 traditional mix stages, the largest being the Cary Grant Theater, which seats 344. It also has mix stages dedicated to IMAX and home entertainment formats. The department features four sound design suites, 60 sound editorial rooms, three ADR recording studios and three Foley stages. Its Barbra Streisand Scoring Stage is among the largest in the world and can accommodate a full orchestra and choir.

Patrick Ferguson joins MPC LA as VFX supervisor

MPC’s Los Angeles studio has added Patrick Ferguson to its staff as visual effects supervisor. He brings with him experience working in both commercials and feature films.

Ferguson started out in New York and moved to Los Angeles in 2002, and he has since worked at a range of visual effects houses along the West Coast, including The Mission, where he was VFX supervisor, and Method, where he was head of 2D. “No matter where I am in the world or what I’m working on, one thing has remained consistent since I started working in the industry: I still love what I do. I think that’s the most important thing.”

Ferguson has collaborated with directors such as Stacy Wall, Mark Romanek, Melina Matsoukas, Brian Billow and Carl Rinsch, and has worked on campaigns for big global brands, including Nike, Apple, Audi, HP and ESPN.

He has also worked on high-profile films, including Pirates of the Caribbean and Alice in Wonderland, and he was a member of the Academy Award-winning team for The Curious Case of Benjamin Button.

“In this new role at MPC, I hope to bring my varied experience of working on large scale feature films as well as on commercials that have a much quicker turnaround time,” he says. “It’s all about knowing what the correct tools are for the particular job at hand, as every project is unique.”

For Ferguson, there is no substitute for being on set: “Being on set is vital, as that’s when key relationships are forged between the director, the crew, the agency and the entire team. Those shared experiences go a long way in creating a trust that is carried all the way through to end of the project and beyond.”

Breathing life into The Walking Dead with VFX

By Karen Moltenbrey

Zombies used to have a short life span, awakening sometime during October, just in time for Halloween, before once again stumbling back into obscurity for another year. But thanks to the hit series The Walking Dead and its spin-off Fear the Walking Dead, the popularity of these monsters is infectious, turning them — and the shows — into cult phenomena.

The Walking Dead’s rise in popularity started almost immediately with the series’ US debut, on October 31, 2010, on AMC. The storyline began when Rick Grimes, a sheriff’s deputy, awakens from a coma to find the world overrun by zombies. He and other survivors in the Atlanta area then band together to fight off these so-called “walkers,” as well as other tribes of survivors intent on ensuring their own survival in this post-apocalyptic world, no matter the cost.

In mid-2015, the show gave rise to the companion series Fear the Walking Dead. Fear, a prequel to The Walking Dead, takes place at the start of the zombie apocalypse and follows a different set of characters as they struggle to survive on the West Coast.

And the series’ visual effects are, well, to die for. Literally.

Burbank’s Picture Shop began creating the effects for Walk last season and is now in the midst of Season 9. (The studio splits the lion’s share of the work on the show with Goodbye Kansas Studios.) Picture Shop provides visual effects for Fear as well (Season 4, Episodes 1 through 8).

According to Christian Cardona, senior visual effects supervisor at Picture Shop, the crux of the work for both series includes character “kill” effects and environment augmentations. “We do a lot of what we call ‘walker kills.’ What that usually requires is weapon extensions, whereby the weapon gets inserted into the walkers, and then the ensuing wounds. We have to track the wounds onto the practical walkers and then also do blood sims during those kills,” he explains. “That accounts for probably 50 to 60 percent of the work.”

The only way to kill a walker is to damage its brain or destroy its body. Therefore, each episode contains its fair share of bodily damage to the zombies (and, sometimes, to the humans trying to keep them and the adversarial tribes at bay).

Cardona notes that it took the group a few episodes to nail down the blood aesthetic the producers were looking for — the look of the blood as well as how it should flow from a walker’s mouth. “Throughout the season, we’ve definitely zeroed in on that and have really gotten a good system down, where now it takes us just a fraction of the time,” he says.

Nevertheless, the work has to be exact. “The client wants everything to be photoreal; they don’t want it to look like anything was added. With that said, they scrutinize every shot, frame by frame and pixel by pixel, so we definitely have to do our due diligence and make sure our work adheres to their standard,” Cardona says. “And they will art direct every drop of blood. They know what they want, and we make sure we deliver that.”

The Walking Dead

There is indeed consistency in the overall look of the blood and wounds, as well as the walkers themselves, in each of the shows, although there was one scene in Walk in which the walkers were stuck in a toxic sludge. As a result, their skin was paler and saggier. “It was a unique scenario where we could change their appearance and the look of the blood for that episode to illustrate that these walkers were different,” recalls Cardona.

The Walking Dead
In Walk, the majority of the hero walkers — either individuals or those in small groups — are actors with prosthetic makeup or, in some cases, practical models. But when there are more than two dozen or so walkers in a shot, they are CG creations.

Whether a practical or digital walker is used often depends on the action in the scene. Walker kills are prevalent throughout the series, as they are in Fear, and often this involves a decapitation or a scalping, “because in order to kill the walkers, they have to be either shot in the head or stabbed,” Cardona points out. “Oftentimes when that happens, we have to cut from the person with the makeup and prosthetics and replace the entire head digitally before chopping it off for the scene.”

Season 8 Episode 14 featured a practical walker with a broken body strapped to a dolly cart, its head locked in position on the cart. As a form of torture, the cart was wheeled close enough for the walker to bite a victim. Initially, the walker was practical, but Picture Shop artists ended up replacing it with a CG model.

“The client wasn’t really happy with the practical version on set. It looked rubbery, like an animatronic walker, which it was. It needed to be fleshier, a little more real,” says Cardona. “The blood and wounds felt too dry, and the muscles had to contract.”

Fear the Walking Dead

In fact, Cardona describes that sequence as one of the more challenging from Season 8. “It was close to the camera and had to be photoreal. It wasn’t just one shot, either; there were over a dozen shots in the scene, with multiple angles and long takes.”

The team used the practical cart walker’s head in the shots but replaced the body and arms, which also had been locked to the cart, requiring intricate match-moving. “We had to be spot on and tight, so there was a lot of soft tracking as well, since the camera was moving everywhere,” recalls Cardona. “And then we had to get that walker to look photoreal.”

Complicating the scene further, the main character shoots the cart walker, requiring the addition of bullet hits and the resulting wounds.

Zombie Nation
For the upcoming Season 9 (premiering October 7), Picture Shop began using a new system for generating large crowds of walkers. According to Cardona, the animators introduced Golaem’s population tool into the pipeline to help with the mass crowd simulations for the walker herds that will appear during the season. Previously, the artists used the particle system in Autodesk’s Maya for this task, “but we needed something that was more robust, something created specifically to do this kind of effect, especially on a TV schedule and with a TV pipeline and workflow,” he adds.

During this past off-season, the artists began establishing a system that sources the walker assets the group had created for Season 8 and during the off-season in preparation for Season 9. “We were modeling walkers and using some of the walkers from Season 8, and standardizing them all with the same T-pose so we could easily swap out rigs and customize and create a lot of variations with textures, changing their clothes, skin color and hair,” explains Cardona. “So when we have to create walker herds, we can easily get five or six variations from a single walker. We end up with a lot of variations in our herd sims without having to create a brand-new walker every time.”
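The mix-and-match variation scheme Cardona describes can be sketched in a few lines of pipeline-style Python. The asset names and attribute pools below are hypothetical placeholders, not Picture Shop’s actual setup:

```python
import itertools
import random

# Hypothetical attribute pools; a real pipeline would point at actual
# texture and grooming assets rather than string labels.
SKINS = ["pale", "sallow", "mottled"]
CLOTHES = ["jacket", "flannel", "rags"]
HAIR = ["bald", "matted", "short"]

def build_variations(base_walker):
    """Enumerate every skin/clothes/hair combination for one base rig."""
    return [
        {"base": base_walker, "skin": s, "clothes": c, "hair": h}
        for s, c, h in itertools.product(SKINS, CLOTHES, HAIR)
    ]

def populate_herd(base_walkers, herd_size, seed=0):
    """Fill a herd by sampling variations across all base rigs."""
    rng = random.Random(seed)
    table = [v for b in base_walkers for v in build_variations(b)]
    return [rng.choice(table) for _ in range(herd_size)]

print(len(build_variations("walker_A")))  # 27 looks from a single rig
```

Because every base rig shares the same T-pose, any variation can be bound to any rig, which is what makes the swap cheap.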

And make no mistake, there will be more herds of walkers in Season 9 than viewers have seen in previous seasons.

The Walking Dead

Other VFX
On average, Picture Shop created 40 to 50 VFX shots per episode during Season 8 of The Walking Dead, with a typical turnaround time of two to three weeks. In addition to the kills and their associated effects, the group also built set extensions. For instance, most of Seasons 7 and 8 revolved around what is called “the Sanctuary,” an old factory that is now home to The Saviors, with whom Grimes and his group must interact. The first two floors were built practically, and then Picture Shop extended the structure digitally by another 12 stories. At one point, a gun fight ensued, calling for the CG artists to break out all the windows — another visual effect.

“The group didn’t move far from their location in Season 8, so the need for additional set extensions wasn’t as high as it had been in earlier seasons,” adds Cardona.

Season 8 of Walk — which has the tribes fighting each other more than the walkers — did start off with a bang, however. In the first episode of the season, Daryl, Grimes’ trusted lieutenant, shoots a box of explosives, ripping a CG walker in half — work that Cardona describes as “challenging.” In another shot, an RPG blows up a member of another tribe; for it, the actor had to be swapped out for a digital double that had been projection modeled.

Cardona has seen an evolution in the effects Picture Shop is providing for Walk in particular. Some of the more interesting effects will be coming in Season 9, he teases. “We’ve been working on some of the stuff over the summer and have spent time in R&D on one effect in particular — something new that the audience hasn’t seen before,” he says.

In the upcoming season, nature is taking over, and there will be overgrown vegetation on all the buildings and structures. “With that said, we are doing an effect whereby we take a murder of crows and have them swarm similar to the murmuration of starlings, which wheel and dart through the sky in tight, fluid formations,” says Cardona. “This is something you will see through the course of the season.”


The Walking Dead

For this work, the artists built and rigged a crow in Maya and generated various animation cycles, which were cached out and used as a particle simulation within Side Effects’ Houdini.

While creating this effect was not nearly as time-consuming as setting up the crowd simulation in Golaem, “it was something unique, and we had to figure out an approach that gave us the flexibility to art direct and change [the results] quickly,” Cardona notes. “And, it’s an effect that has nothing to do with walkers, but it tells a big part of the story of what is happening to the world around them.”
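Murmuration behavior of this kind is typically driven by a flocking (“boids”) model, in which each bird steers by cohesion, alignment and separation relative to its neighbors. Here is a minimal 2D sketch of that idea in plain Python; the weights and radii are illustrative only, and a production setup would instance the cached wing-flap cycles onto each agent:

```python
import math
import random

class Boid:
    def __init__(self, x, y, vx, vy):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy

def step(boids, radius=50.0, cohesion=0.01, alignment=0.05, separation=0.1):
    for b in boids:
        near = [o for o in boids if o is not b
                and math.hypot(o.x - b.x, o.y - b.y) < radius]
        if not near:
            continue
        n = len(near)
        cx = sum(o.x for o in near) / n    # center of nearby birds
        cy = sum(o.y for o in near) / n
        avx = sum(o.vx for o in near) / n  # average neighbor velocity
        avy = sum(o.vy for o in near) / n
        # Cohesion pulls toward the local center; alignment matches heading.
        b.vx += (cx - b.x) * cohesion + (avx - b.vx) * alignment
        b.vy += (cy - b.y) * cohesion + (avy - b.vy) * alignment
        # Separation pushes away from very close neighbors.
        for o in near:
            d = math.hypot(o.x - b.x, o.y - b.y)
            if 0 < d < radius * 0.3:
                b.vx += (b.x - o.x) / d * separation
                b.vy += (b.y - o.y) / d * separation
    for b in boids:
        b.x += b.vx
        b.y += b.vy

rng = random.Random(1)
flock = [Boid(rng.uniform(0, 100), rng.uniform(0, 100),
              rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(50)]
for _ in range(100):
    step(flock)
```

The same three rules, tuned tighter, give the “tight, fluid formations” of a starling murmuration rather than a loose flock.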

In terms of the other overall effects in Walk and Fear, the Picture Shop artists primarily use Autodesk’s 3ds Max, although Maya is also used, mainly for the Golaem crowd work. Once the sim is complete, the artists cache the results and bake out all the animation within Maya, then export it into Max. Rendering is done in Chaos’ V-Ray for 3ds Max.

In addition, the artists use Pixologic’s ZBrush for a lot of the organic modeling, mostly for the walkers. For effects, the crew usually turns to Houdini.

The effects that Picture Shop delivers for The Walking Dead differ — in terms of the blood and gore — from those for other shows the studio works on, such as Hawaii Five-0 and MacGyver, which call for more traditional VFX, like muzzle flashes and explosions, as well as set extensions. “It’s more hard-surface modeling stuff, whereas Walk, for the most part, is a lot more organic,” Cardona adds. “And the tone is completely different, obviously.”

Monster Mash
Picture Shop performs the same type of work for Fear as it does for Walk — mostly weapon extensions and walker kills. Cardona notes there is a cohesiveness in the effects between the two shows, especially now that the timeline of both stories is nearly the same. Fear started at the beginning of the apocalypse, when the walkers were “fresher, and the blood kills were a little bigger, because the thinking was that there would be more blood present in the walkers at that point,” he explains.

Insofar as the general walker kills are concerned, the actors never really make a physical impact with their stabbing motions, so oftentimes their hands are not in the right positions or the reaction time of the walker is off. In these instances, it is up to the VFX artists to digitally rectify the action and reaction — for instance, separating and repositioning the actor’s arm, stabilizing it, then tracking on the weapon that is placed in their hand, as well as stabilizing the walker, adding the wound, and then adding the weapon extension. This holds true for both series.

“Sometimes this can be challenging because the camera is also moving, so it requires significant roto work, and we have to deconstruct the shot and reconstruct it back again,” says Cardona. “There are plenty of these types of shots we have to do for both Walk and Fear.”
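The stabilize-then-reverse trick implied above can be reduced to its simplest form: subtract the tracked motion so the walker sits still, add the wound or weapon in that stable space, then re-apply the motion. A toy version with pure 2D translation follows; the track data here is invented for illustration:

```python
# Stabilize/destabilize with 2D translation only. track[i] is the tracked
# feature position in frame i; real shots also carry rotation, scale and
# perspective, which planar trackers like Mocha solve for.

def stabilize(points, track):
    """Move per-frame points into the walker's stable space."""
    return [(x - tx, y - ty) for (x, y), (tx, ty) in zip(points, track)]

def destabilize(points, track):
    """Re-apply the tracked motion so fixes follow the walker on screen."""
    return [(x + tx, y + ty) for (x, y), (tx, ty) in zip(points, track)]

track = [(0, 0), (3, 1), (5, 4)]          # per-frame tracked motion
wound = [(10, 10)] * 3                     # painted once, in stable space
moving_wound = destabilize(wound, track)   # [(10, 10), (13, 11), (15, 14)]
```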

For the tracking on this and other work, the studio uses Andersson Technologies’ SynthEyes; for the planar tracking, Boris FX’s Mocha, formerly from Imagineer Systems.

According to Cardona, the overall look of the walkers has changed throughout the course of the Fear seasons. In Walk, they are more decayed, whereas in Fear, they still look like humans for the most part, not as skeletal. But that has now changed, and with a more cohesive look between the walkers comes a more cohesive look for some of the VFX going forward, particularly the blood effects.

However, there is one big difference between the two shows: Walk is still shot on 16mm film, while Fear is shot digitally, so the grain is quite heavy on Walk. “It affects our tracking because there is so much noise. Often we would de-grain [the footage] to do the tracking and then add the grain back in. Also, any time we have a bluescreen shot where we have to pull keys, it’s a problem,” says Cardona.

Indeed, honing the effects on The Walking Dead gives the artists a leg up when it comes to Fear. Another advantage: Picture Shop performs the color and finishing for both shows as well, which can result in some emergency VFX work for the crew, especially during a time crunch. In fact, the colorists at the facility created the custom 16mm grain pattern that the artists now use during the tracking process. It was generated when the client was considering migrating Walk to a digital format but ultimately decided to stay on film.
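The round trip Cardona describes (de-grain, track, then add the grain back) can be sketched abstractly. This toy version treats frames as flat lists of pixel values and uses a three-frame temporal median as the stand-in de-grain; real de-grain tools are far more sophisticated:

```python
import random

def temporal_median(prev, cur, nxt):
    # Median of three frames per pixel knocks down random grain while
    # leaving (mostly) static picture content alone.
    return [sorted(px)[1] for px in zip(prev, cur, nxt)]

def degrain(frames):
    out = [frames[0][:]]
    for i in range(1, len(frames) - 1):
        out.append(temporal_median(frames[i - 1], frames[i], frames[i + 1]))
    out.append(frames[-1][:])
    return out

def extract_grain(original, clean):
    # Grain plate = original minus clean, stored per frame.
    return [[o - c for o, c in zip(of, cf)] for of, cf in zip(original, clean)]

def apply_grain(comped, grain):
    # After tracking/comp work on the clean plate, add the grain back.
    return [[p + g for p, g in zip(pf, gf)] for pf, gf in zip(comped, grain)]

rng = random.Random(42)
frames = [[100 + rng.randint(-10, 10) for _ in range(16)] for _ in range(5)]
clean = degrain(frames)
grain = extract_grain(frames, clean)
# With no comp changes, the round trip reproduces the original exactly.
assert apply_grain(clean, grain) == frames
```

Trackers see the clean plate; delivery gets the grain back, so the 16mm look survives.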

Another plus: The Walking Dead executives are also located in the same building as Picture Shop, several floors up. “They just moved here, and it’s convenient for everyone. We can do spot sessions with VFX producer Jason Sax or showrunner Angela Kang (who recently took over that role from Scott Gimple).”

At Comic-Con San Diego a few weeks ago, the series was a fan favorite, and online there is talk about plans for upcoming seasons. So, it appears these walkers still have a lot of life left in them, if fans — and the digital artists — have their way.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.

Using VFX to bring the new Volkswagen Jetta to life

LA-based studio Jamm provided visual effects for the all-new 2019 Volkswagen Jetta campaign Betta Getta Jetta. Created by Deutsch and produced by ManvsMachine, the series of 12 spots brings the Jetta to life by combining Jamm’s CG design with a color palette inspired by the car’s 10-color ambient lighting system.

“The VW campaign offered up some incredibly fun and intricate challenges. Most notable was the volume of work to complete in a limited amount of time — 12 full-CG spots in just nine weeks, each one unique with its own personality,” says VFX supervisor Andy Boyd.

Collaboration was key to delivering so many spots in such a short span of time. Jamm worked closely with ManvsMachine on every shot. “The team had a very strong creative vision which is crucial in the full 3D world where anything is possible,” explains Boyd.

Jamm employed a variety of techniques for the music-centric campaign, which highlights updated features such as ambient lighting and Beats Audio. The series includes spots titled Remix, Bumper-to-Bumper, Turb-Whoa, Moods, Bass, Rings, Puzzle and App Magnet, along with 15-second teasers, all of which aired on various broadcast, digital and social channels during the World Cup.

For “Remix,” Jamm brought both a 1985 and a 2019 Jetta to life, along with a hybrid mix of the two, adding a cool layer of turntablist VFX, whereas for “Puzzle,” they cut up the car procedurally in Houdini, which allowed the team to change around the slices as needed.

For Bass, Jamm helped bring personality to the car while keeping its movements grounded in reality. Animation supervisor Stew Burris pushed the car’s performance and dialed in the choreography of the dance with ManvsMachine as the Jetta discovered the beat, adding exciting life to the car as it bounced to the bassline and hit the switches on a little three-wheel motion.

We reached out to Jamm’s Boyd to find out more.

How early did Jamm get involved?
We got involved as soon as agency boards were client approved. We worked hand in hand with ManvsMachine to previs each of the spots in order to lay the foundation for our CG team to execute both the agency’s and directors’ vision.

What were the challenges of working on so many spots at once?
The biggest challenge was for editorial to keep up with the volume of previs options we gave them to present to the agency.

Other than Houdini, what tools did they use?
Flame, Nuke and Maya were used as well.

What was your favorite spot of the 12 and why?
Puzzle was our favorite to work on. It was the last of the bunch delivered to Deutsch, which we treated with a more technical approach, slicing up the car like a Rubik’s Cube.


2nd-gen AMD Ryzen Threadripper processors

At the SIGGRAPH show, AMD announced the availability of its 2nd-generation AMD Ryzen Threadripper 2990WX processor with 32 cores and 64 threads. These new AMD Ryzen Threadripper processors are built using 12nm “Zen+” x86 processor architecture. Second-gen AMD Ryzen Threadripper processors support the most I/O and are compatible with existing AMD X399 chipset motherboards via a simple BIOS update, offering builders a broad choice for designing the ultimate high-end desktop or workstation PC.

The 32-core/64-thread Ryzen Threadripper 2990WX and the 24-core/48-thread Ryzen Threadripper 2970WX are purpose-built for prosumers who crave raw compute power to dispatch the heaviest workloads. The 2nd-gen AMD Ryzen Threadripper 2990WX offers up to 53 percent faster multithreaded performance and up to 47 percent more rendering performance for creators than the Intel Core i9-7980XE.

This new AMD Ryzen Threadripper X series comes with higher base and boost clocks for users who need high performance. The 16 cores and 32 threads in the 2950X model offer up to 41 percent more multithreaded performance than the Core i9-7900X.
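Claims like these rest on workloads that split cleanly across cores, and rendering is the classic case, since frames or tiles are independent. A rough Python sketch of the pattern follows; the per-frame work here is just stand-in CPU load, not a real renderer:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame):
    # Stand-in for per-frame render work; purely CPU-bound on purpose.
    acc = 0
    for i in range(100_000):
        acc += (frame * i) % 7
    return frame, acc

def render_sequence(frames, workers=None):
    # One process per hardware thread: a 2990WX exposes 64 of them,
    # so os.cpu_count() would report 64 on that part.
    workers = workers or os.cpu_count()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(render_frame, frames))

if __name__ == "__main__":
    done = render_sequence(range(8))
    print(sorted(done))
```

Because each frame is independent, throughput scales close to linearly with core count until memory bandwidth or I/O becomes the bottleneck.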

Additional performance and value come from:
• AMD StoreMI technology: All X399 platform users will now have free access to AMD StoreMI technology, enabling configured PCs to load files, games and applications from a high-capacity hard drive at SSD-like read speeds.
• Ryzen Master Utility: Like all AMD Ryzen processors, the 2nd-generation AMD Ryzen Threadripper CPUs are fully unlocked. With the updated AMD Ryzen Master Utility, AMD has added new features, such as fast core detection both on die and per CCX; advanced hardware controls; and simple, one-click workload optimizations.
• Precision Boost Overdrive (PBO): A new performance-enhancing feature that allows multithreaded boost limits to be raised by tapping into extra power delivery headroom in premium motherboards.

With a simple BIOS update, all 2nd-generation AMD Ryzen Threadripper CPUs are supported by a full ecosystem of new motherboards and all existing X399 platforms. Designs are available from top motherboard manufacturers, including ASRock, ASUS, Gigabyte and MSI.

The 32-core, 64-thread AMD Ryzen Threadripper 2990WX is available now from global retailers and system integrators. The 16-core, 32-thread AMD Ryzen Threadripper 2950X processor is expected to launch on August 31, and the AMD Ryzen Threadripper 2970WX and 2920X models are slated for launch in October.

NIM 3.0 studio management tool debuts at Siggraph

NIM Labs’ new NIM 3.0 is an all-in-one studio management platform that helps visual effects and post production houses track projects and manage their company. Created by creative directors and studio heads, NIM 3.0 helps studios do more under one roof with a redesigned review tool, grouped bidding and a live link to Adobe Premiere.

What began as an in-house toolset for Ntropic is now the studio management software in-house at Digital Domain, Logan, Taylor James and Intelligent Creatures. With NIM 3.0, studios get access to tools to handle bidding, tracking, versioning, scheduling, reviews, finances and time cards in one place.

The creative review tool has been rewritten from the ground up to make screening and markup of videos, PDFs and still frames even easier. Review capabilities have also been extended to more parts of NIM, so users can do more in a single location. With Review Bins, teams can stay organized using saveable smart filters and grouped elements for later use. Review Bins come with a new Theater View, which allows for a larger viewer experience and the ability to zoom, fit, fill and pan review items during playback. Review items can also be stacked according to version, allowing supervisors and clients to quickly visualize progress.

After years of building NIM Connectors into Maya, Houdini, 3ds Max, Nuke and Flame, NIM Labs is introducing its first direct workflow for Adobe Premiere editors. With NIM 3.0, Adobe Premiere becomes a conform tool, helping VFX, VR and post houses do everything from creating timeline shots to round-tripping elements rendered in other packages, all without leaving the app. Thanks to the new NIM Connector, the entire creative review process can be conducted from within Premiere, creating a seamless experience. Premiere is NIM’s third Adobe connection, following After Effects and Photoshop.

NIM’s bidding system was created to get bids out the door faster using studio-wide templates. In NIM 3.0, new organizational tools let producers define the different sections of their bid using groups, allowing greater flexibility and speed when responding to a large RFP. Through item linking, users will be able to modify multiple line items simultaneously, with the ability to attach further information like images, notes and descriptions when the need arises.

Companies with existing directory systems can now immediately integrate NIM with their users. By accessing NIM 3.0’s on-board security controls, organizations can manage permissions and security groups from Active Directory (AD) or Lightweight Directory Access Protocol (LDAP). Support for multiple domains will add even more authentication controls across networks.

NIM 3.0 will be available in the fall. Active user licenses cost $360 annually or $40 monthly. NIM 2.8 is currently available as a free 30-day trial.

Behind the Title: Jogger Studios’ CD Andy Brown

This veteran creative director can also often be found at the controls of his Flame working on a new spot.

NAME: Andy Brown

COMPANY: Jogger Studios (@joggerstudios)

CAN YOU DESCRIBE YOUR COMPANY?
We are a boutique post house with offices in the US and UK providing visual effects, motion graphics, color grading and finishing. We are partnered with Cut + Run for editorial and get to work with their editors from around the world. I am based in our Jogger Los Angeles office, after having helped found the company in London.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
Overseeing compositing, visual effects and finishing. Looking after staff and clients. Juggling all of these things and anticipating the unexpected.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I’m still working “on the box” every day. Even though my title is creative director, it is the hands-on work that is my first love as far as project collaborations go. Also I get to re-program the phones and crawl under the desks to get the wires looking neater when viewed from the client couch.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The variety, the people and the challenges. Just getting to work on a huge range of creative projects is such a privilege. How many people get to go to work each day looking forward to it?

WHAT’S YOUR LEAST FAVORITE?
The hours, occasionally. It’s more common to have to work without clients nowadays. That definitely makes for more work sometimes, as you might need to create two or three versions of a spot to get approval. If everyone were in the room together, you’d reach a consensus more quickly.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I like the start of the day best, when everyone is coming into the office and we are getting set up for whatever project we are working on. Could be the first coffee of the day that does it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I want to say classic car dealer, but given my actual career path the most likely alternative would be editor.

WHY DID YOU CHOOSE THIS PROFESSION?
There were lots of reasons, when I look at it. It was either the Blue Peter Book of Television (the longest running TV program for kids, courtesy of the BBC) or my visit to the HTV Wales TV station with my dad when I was about 12. We walked around the studios and they were playing out a film to air, grading it live through a telecine. I was really struck by the influence that the colorist was having on what was seen.

I went on to do critical studies of photography, film and television at the Centre for Contemporary Cultural Studies at Birmingham University. Part of that course involved being shown around the Pebble Mill BBC Studios. They were editing a sequence covering a public enquiry into the Handsworth riots in 1985. It just struck me how powerful the editing process was. The story could be told so many different ways, and the editor was playing a really big part in the process.

Those experiences (and an interest in writing) led me to think that television might be a good place to work. I got my first job as a runner at MPC after a friend had advised me how to get a start in the business.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We worked on a couple of spots for Bai recently with Justin Timberlake creating the “brasberry.” We had to make up some graphic animations for the newsroom studio backdrop for the shoot and then animate opening title graphics to look just enough like it was a real news report, but not too much like a real news report.

We do quite a bit of food work, so there’s always some burgers, chicken or sliced veggies that need a bit of love to make them pop.

There’s a nice set extension job starting next week, and we recently finished a job with around 400 final versions, which made for a big old deliverables spreadsheet. There’s so much that we do that no one sees, which is the point if we do it right.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes the job that you are most proud of isn’t necessarily the most amazing thing to look at. I used to work on newspaper commercials back in the UK, and it was all so “last minute.” A story broke, and all of a sudden you had to have a spot ready to go on air with no edit, no footage and only the bare bones of a script. It could be really challenging, but we had to get it done somehow.

But the best thing is seeing something on TV that you’ve worked on. At Jogger Studios, it is primarily commercials, so you get that excitement over and over again. It’s on air for a few weeks and then it’s gone. I like that. I saw two of our spots in a row recently on TV, which I got a kick out of. Still looking for that elusive hat-trick.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The Flame, the Land Rover Series III and, sadly, my glasses.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Just friends and family on Instagram, mainly. Although like most Flame operators, I look at the Flame Learning Channel on YouTube pretty regularly. YouTube also thinks I’m really interested in the Best Fails of 2018 for some reason.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
More often than not it is podcasts. West Wing Weekly, The Adam Buxton Podcast, Short Cuts and Song Exploder. Plus some of the shows on BBC 6 Music, which I really miss.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I go to work every day feeling incredibly lucky to be doing the job that I do, and it’s good to remember that. The 15-minute walk to and from work in Santa Monica usually does it.

Living so close to the beach is fantastic. We can get down to the sand, get the super-brella set up and get in the sea with the bodyboards in about 15 minutes. Then there’s the Malibu Cars & Coffee, which is a great place to start your Sunday.

The Darkest Minds director Jennifer Yuh Nelson

By Iain Blair

Jennifer Yuh Nelson has been an acclaimed — and highly bankable — director in animation for years, thanks to her work on the billion-dollar-grossing Kung Fu Panda franchise.

Now she’s taken on her first live-action film with Fox’s The Darkest Minds. Adapted from the best-selling book by Alexandra Bracken, the first in a YA trilogy, the film stars Amandla Stenberg in the lead as Ruby, along with Harris Dickinson, Miya Cech and Skylan Brooks.

The Darkest Minds also features adults, including Mandy Moore and Bradley Whitford, and revolves around a group of teens who mysteriously develop powerful new abilities and who are then declared a threat by the government and detained. It’s essentially a genre mash-up — a road movie with some sci-fi elements and lots of kinetic action. It was written by Chad Hodge, best known for his work as the creator and executive producer of TNT’s Good Behavior and Fox’s Wayward Pines.

Nelson’s creative team included DP Kramer Morgenthau (Terminator Genisys, Thor: The Dark World), editors Maryann Brandon (Star Wars: The Force Awakens) and Dean Zimmerman (Stranger Things), and visual effects supervisor Björn Mayer (Oblivion). Fox-based 21 Laps’ (Stranger Things, Arrival) Shawn Levy and Dan Levine produced.

I recently spoke with Nelson about making the film.

What sort of film did you set out to make?
To start off with, I wanted a great emotional core, and as this was based on a book, it already had that built in… even in early versions of the script. It had great characters with strong relationships, and I wanted to do some action stuff.

Any big surprises making the move to a major live-action film, or were you pretty prepared in terms of prep thanks to your background in animation?
I was pretty prepared, and the prep’s essentially the same as in animation. But, of course, production is utterly different, along with the experience of being on location. I had a really great crew and a fantastic DP, which helped me a lot. The big difference is suddenly you have the luxury of coverage, which you don’t get in animation. There you need to know exactly what you want, as it’s so expensive to create. Being outside all day on location, and dealing with the elements and crew and cast all at once — that was a big learning curve, but I really loved it. I had a fantastic time!

What were the main technical challenges in pulling it all together?
There were a lot of moving parts, and the main one was probably all the VFX involved. It’s a very reality-based book. It’s not set in outer space, and it’s supposed to look very grounded and seamless with reality. So you have these characters with superpowers that are meant to be very believable, but then we had fire, flamethrowers, 300 extras running around, wind machines and so on. Then there was all the fire we had to add later in post.

How early on did you start integrating post and all the VFX?
Right at the start, and my VFX super Björn Mayer was so smart about it, figuring out ways to get really cool looks. We tried out a ton of visual approaches. Some were done in camera; most were done in post or augmented in post — especially all the fire effects. It was intense reality, not complete reality, that we aimed for, so we had some flexibility.

I assume you did a lot of previs?
Quite a lot, and that was also a big help. We did full-3D previs, like we do in animation, so I was pretty used to that. We also storyboarded a big chunk of the movie, including scenes that normally you wouldn’t have to storyboard because I wanted to make completely sure we were covered on everything.

Didn’t you start off as a storyboard artist?
I did, and my husband’s one too, so I roped him in and we did the whole thing ourselves. It’s just an invaluable tool for showing people what’s going on in a director’s head, and when they’ve seen the pictures they can then offer creative ideas as everyone knows what you’re trying to achieve.

How tough was the shoot?
We shot in Atlanta, and it went smoothly, considering there are always unexpected things. We had freak thunderstorms and a lot of rain that made some sets sink and so on, but it’s how you respond to all that that counts. Everyone was calm and organized.

Where did you post?
Here in LA. We rented some offices near my home and just set up editorial and all our VFX there. It was very convenient.

In a sense, animation is all post, so you must love the post process?
You’re right – animation is like a long-running post for the whole production. I love post because it’s so transformative, and it’s beautiful to see all the VFX get layered in and see the movie suddenly come to life.

Talk about editing this with two editors. How did that work?
Maryann was on the set with us, working as we shot, and then Dean came on later in post, so we had a great team.

What were the big editing challenges?
I think the big one was making all the relationships believable over the course of the film, and so much of it is very subtle. It can come down to just a look or a moment, and we had to carefully plot the gradations and work hard to make it all feel real.

All the VFX play a big role. How many were there?
Well over 2,000 I think, and MPC and Crafty Apes did most of them. I loved working on them with my VFX supervisor. It’s very similar to working with them in animation, which is essentially one big VFX show. So I was very familiar with the process, although integrating them into live action instead of a virtual world is quite different. I loved seeing how it all got integrated so seamlessly.

Can you talk about the importance of sound and music?
It was so important to me, and we had quite a few songs in the film because it’s partly a road trip. There’s the big dance scene where we found a great song and then were able to shoot to the track. We mixed all the sound on the Fox lot.

Where did you do the DI and how important is it to you?
At Technicolor, and I’m pretty involved, although I don’t micro-manage. I’d give notes, and we’d make some stuff pop a bit more and play around with the palette, but basically it went pretty quickly as what we shot already looked really sweet.

Did the film turn out the way you hoped?
It did, and I can’t wait to do another live-action film. I adore animation, but live action’s like this new shiny toy.

You’re that Hollywood rarity — a successful female director. What advice would you give to young women who want to direct?
Do what makes you happy. Don’t do it just because someone says “you can” or “you can’t.” You’ve got to have that personal desire to do this job, and it’s not easy and I don’t expect change to come very quickly to Hollywood. But it is coming.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Weta Digital VFX supervisor Erik Winquist

NAME: Erik Winquist

COMPANY: Wellington, New Zealand’s Weta Digital

CAN YOU DESCRIBE YOUR COMPANY?
We’re currently a collection of about 1,600 ridiculously talented artists and developers down at the bottom of the world who have created some of the most memorable digital characters and visual effects for film over the last couple of decades. We’re named after a giant New Zealand bug.

WHAT’S YOUR JOB TITLE?
Visual Effects Supervisor

WHAT DOES THAT ENTAIL?
Making the director and studio happy without making my crew unhappy. Ensuring that everybody on the shoot has the same goal in mind for a shot before the cameras start rolling is one way to help accomplish both of those goals. Using the strengths and good ideas of everybody on your team is another.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
The amount of problem solving that is required. Every show is completely different from the last. We’re often asked to do something and don’t know how we’re going to accomplish it at the outset. That’s where it’s incredibly important to have a crew full of insanely brilliant people you can bash ideas around with.

HOW DID YOU START YOUR CAREER IN VFX?
I went to school for it. After graduating from the Ringling College of Art and Design with a degree in computer animation, I eventually landed a job as an assistant animator at Pacific Data Images (PDI). The job title was a little misleading, because although my degree was fairly character animation-centric, the first thing I was asked to do at PDI was morphing. I found that I really enjoyed working on the 2D side of things, and that sent me down a path that ultimately got me hired as a compositor at Weta on The Lord of the Rings.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I was hired by PDI in 1998, so I guess that means 20 years now. (Whoa.)

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD? WHAT’S BEEN BAD?
Oh, there’s just been so much great stuff. We’re able to make images now that are completely indistinguishable from reality. Thanks to massive technology advancements over the years, interactivity for artists has gotten way better. We’re sculpting incredible amounts of detail into our models, painting them with giga-pixels worth of texture information, scrubbing our animation in realtime, using hardware-accelerated engines to light our scenes, rendering them with physically-based renderers and compositing with deep images and a 3D workspace.

Of course, all of these efficiency gains get gobbled up pretty quickly by the ever-expanding vision of the directors we work for!

The industry’s technology advancements and flexibility have also perhaps had some downsides. Studios demand ever-shorter post schedules, prep time is reduced, and shots can be less planned out because so much can be decided in post. When the brief is constantly shifting, it’s difficult to deliver the quality that everyone wants. And when the quality isn’t there, suddenly the Internet starts clamoring that “CGI is ruining movies!”

But when a great idea is planned well by a decisive director and executed brilliantly by a visual effects team working in concert with all of the other departments, the movie magic that results is just amazing. And that’s why we’re all here doing what we do.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
There were some films I saw very early on that left a lasting impression: Clash of the Titans, The Empire Strikes Back. Later inspiration came in high school with the TV spots that Pixar was doing prior to Toy Story, and the early computer graphics work that Disney Feature Animation was employing in their films of the early ‘90s.

But the big ones that really set me off around this time were ILM’s work on Jurassic Park, and films like Jim Cameron’s The Abyss and Terminator 2. That’s why it was a particular kick to find myself on set with Jim on Avatar.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Dailies. When I challenge an artist to bring their best, and they come up with an idea that completely surprises me, one way better than what I had imagined or asked for. Those moments are gold. Dailies is pretty much the only chance I have to see a shot for the first time the way an audience member gets to, so I pay a lot of attention to my reaction to that very first impression.

WHAT’S YOUR LEAST FAVORITE?
Getting a shot ripped from our hands by those pesky deadlines before every little thing is perfect. And scheduling meetings. Though, the latter is critically important to make sure that the former doesn’t happen.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
There was a time when I was in grade school where I thought I might like to go into sound effects, which is a really interesting what-if scenario for me to think about. But these days, if I were to hang up my VFX hat, I imagine I would end up doing something photography-related. It’s been a passion for a very long time.

Rampage

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I supervised Weta’s work on Rampage, starring Dwayne Johnson and a very large albino gorilla. Prior to that was War for the Planet of the Apes, Spectral and Dawn of the Planet of the Apes.

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
We had a lot of fun working on Rampage, and I think audiences had a ton of fun watching it. I’m quite proud of what we achieved with Dawn of the Planet of the Apes. But I’m also really fond of what our crew turned out for the Netflix film Spectral. That project gave us the opportunity to explore some VFX-heavy sci-fi imagery and was a really interesting challenge.

WHAT TOOLS DO YOU USE DAY TO DAY?
Most of my day revolves around reviewing work and communicating with my production team and the crew, so it’s our in-house review software, Photoshop and e-mail. But I’m constantly jumping in and out of Maya, and always have a Nuke session open for one thing or another. I’m also never without my camera and am constantly shooting reference photos or video, and have been known to initiate impromptu element shoots at a moment’s notice.

WHERE DO YOU FIND INSPIRATION NOW?
Everywhere. It’s why I always have my camera in my bag.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Scuba diving and sea kayaking are two hobbies that get me out in the water, though that happens far less than I would like. My wife and I recently bought a small rural place north of Wellington. I’ve found going up there doing “farm stuff” on the weekend is a great way to re-calibrate.

Quick Chat: Joyce Cox talks VFX and budgeting

Veteran VFX producer Joyce Cox has a long and impressive list of credits to her name. She got her start producing effects shots for Titanic and from there went on to produce VFX for Harry Potter and the Sorcerer’s Stone, The Dark Knight and Avatar, among many others. Along the way, Cox perfected her process for budgeting VFX for films and became a go-to resource for many major studios. She realized that the practice of budgeting VFX could be done more efficiently if there was a standardized way to track all of the moving parts in the life cycle of a project’s VFX costs.

With a background in the finance industry, combined with extensive VFX production experience, she decided to apply her process and best practices into developing a solution for other filmmakers. That has evolved into a new web-based app called Curó, which targets visual effects budgeting from script to screen. It will be debuting at Siggraph in Vancouver this month.

Ahead of the show, we reached out to find out more about her VFX producer background and her path to becoming the maker of a product designed to make other VFX pros’ lives easier.

You got your big break in visual effects working on the film Titanic. Did you know that it would become such an iconic landmark film for this business while you were in the throes of production?
I recall thinking the rough cut I saw in the early stage was something special, but had no idea it would be such a massive success.

Were there contacts made on that film that helped kickstart your career in visual effects?
Absolutely. It was my introduction into the visual effects community and offered me opportunities to learn the landscape of digital production and develop relationships with many talented, inventive people. Many of them I continued to work with throughout my career as a VFX producer.

Did you face any challenges as a woman working in below-the-line production in those early days of digital VFX?
It is a bit tricky. Visual effects is still a primarily male-dominated arena, and it is a highly competitive environment. I think what helped me navigate the waters is my approach. My focus is always on what is best for the movie.

Was there anyone from those days that you would consider a professional mentor?
Yes. I credit Richard Hollander, a gifted VFX supervisor/producer with exposing me to the technology and methodologies of visual effects; how to conceptualize a VFX project and understand all the moving parts. I worked with Richard on several projects producing the visual effects within digital facilities. Those experiences served me well when I moved to working on the production side, navigating the balance between the creative agenda, the approved studio budgets and the facility resources available.

You’ve worked as a VFX producer on some of the most notable studio effects films of all time, including X-Men 2, The Dark Knight, Avatar and The Jungle Book. Was there a secret to your success or are you just really good at landing top gigs?
I’d say my skills lie more in doing the work than finding the work. I believe I continued to be offered great opportunities because those I’d worked for before understood that I facilitated their goals of making a great movie. And that I remain calm while managing the natural conflicts that arise between creative desire and financial reality.

Describe what a VFX producer does exactly on a film, and what the biggest challenges are of the job.
This is a tough question. During pre-production, working with the director, VFX supervisor and other department heads, the VFX producer breaks down the movie into the digital assets, i.e., creatures, environments, matte paintings, etc., that need to be created, estimates how many visual effects shots are needed to achieve the creative goals, and determines the VFX production crew required to support the project. Since no one knows exactly what will be needed until the movie is shot and edited, it is all theory.

During production, the VFX producer oversees the buildout of the communications, data management and digital production schedule that are critical to success. Also, during production the VFX producer is evaluating what is being shot and tries to forecast potential changes to the budget or schedule.

Starting in production and going through post, the focus is on getting the shots turned over to digital facilities to begin work. This is challenging in that creative or financial changes can delay moving forward with digital production, compressing the window of time within which to complete all the work for release. Once everything is turned over, the focus switches to getting all the shots completed and delivered for the final assembly.

What film did you see that made you want to work in visual effects?
Truthfully, I did not have my sights set on visual effects. I’ve always had a keen interest in movies and wanted to produce them. It was really just a series of unplanned events, and I suppose my skills at managing highly complex processes drew me further into the world of visual effects.

Did having a background in finance help in any particular way when you transitioned into VFX?
Yes, before I entered into production, I spent a few years working in the finance industry. That experience has been quite helpful and perhaps is something that gave me a bit of a leg up in understanding the finances of filmmaking and the ability to keep track of highly volatile budgets.

You pulled out of active production in 2016 to focus on a new company, tell me about Curó.
Because of my background in finance and accounting, one of the first things I noticed when I began working in visual effects was, unlike production and post, the lack of any unified system for budgeting and managing the finances of the process. So, I built an elaborate system of worksheets in Excel that I refined over the years. This design and process served as the basis for Curó’s development.

To this day, the entire visual effects community manages the finances of the process, which can run to tens, if not hundreds, of millions of dollars in spend, with spreadsheets. Add to that the fact that everyone’s document designs are different, which makes collaborating on, interpreting and managing facility bids unwieldy, to say the least.

Why do you think the industry needs Curó, and why is now the right time? 
Visual effects is the fastest-growing segment of the film industry, as demonstrated by the screen credits of VFX-heavy films. The majority of studio projects are these tent-pole films, which use visual effects heavily. The volatility of visual effects finances can be managed more efficiently with Curó, and the language of VFX financial management across the industry would benefit greatly from a unified system.

Who’s been beta testing Curó, and what’s in store for the future, after its Siggraph debut?
We’ve had a variety of beta users over the past year. In addition to Sony and Netflix, a number of freelance VFX producers and supervisors, as well as VFX facilities, have beta access.

The first phase of the Curó release focuses on the VFX producers and studio VFX departments, providing tools for initial breakdown and budgeting of digital and overhead production costs. After Siggraph we will be continuing our development, focusing on vendor bid packaging, bid comparison tools and management of a locked budget throughout production and post, including the accounting reports, change orders, etc.

We are also talking with visual effects facilities about developing a separate but connected module for their internal granular bidding of human and technical resources.

 

Alkemy X joins forces with Quietman, adds CD Megan Oepen

Creative content studio Alkemy X has entered into a joint venture with long-time New York City studio Quietman. In addition, Alkemy X has brought on director/creative director Megan Oepen.

The Quietman deal will see founder and creative director Johnnie Semerad moving the operations of his company into Alkemy X, where both parties will share all creative talent, resources and capabilities.

“Quietman’s reputation of high-end, award-winning work is a tribute to Johnnie’s creative and entrepreneurial spirit,” says Justin B. Wineburgh, Alkemy X president/CEO. “Over the course of two decades, he grew and evolved Quietman from a fledgling VFX boutique into one of the most renowned production companies in advertising and branded content. By joining forces with Alkemy X, we’ll no doubt build on each other’s legacies collectively.”

Semerad co-founded Quietman in 1996 as a Flame-based visual effects company. Since then, it has expanded into the full gamut of production and post production services, producing more than 100 Super Bowl spots, and earning a Cannes Grand Prix, two Emmy Awards and other honors along the way.

“What I’ve learned over the years is that you have to constantly reinvest and reinvent, especially as clients increasingly demand start-to-finish projects,” says Semerad. “Our partnership with Alkemy X will elevate how we serve existing and future clients together, while bolstering our creative and technical resources to reach our potential as commercial filmmakers. The best part of this venture? I’ve always been listed with the Qs, but now, I’m with the As!”

Alkemy X is also teaming up with Oepen, an award-winning creative director and live-action director with 20 years of broadcast, sports and consumer brand campaign experience. Notable clients include Google, the NBA, MLB, PGA, NASCAR, Dove Beauty, Gatorade, Sprite, ESPN, Delta Air Lines, Home Depot, Regal Cinemas, Chick-Fil-A and Yahoo! Sports. Oepen was formerly the executive producer and director for Red Bull’s Non-Live/Long Format Productions group, and headed Under Armour’s Content House. She was also the creator behind Under Armour Originals.

Marvel’s Victoria Alonso to receive HPA’s Charles S. Swartz Award

The Hollywood Professional Association (HPA) has announced that Victoria Alonso, producer and executive VP of production for Marvel Studios, will receive the organization’s 2018 Charles S. Swartz Award at the HPA Awards on November 15. The HPA Awards recognize creative artistry, innovation and engineering excellence, and the Charles S. Swartz Award honors the recipient’s significant impact across diverse aspects of the industry.

A native of Buenos Aires, Alonso moved to the US at the age of 19. She worked her way up through the industry, beginning as a PA and then working four years at the VFX house Digital Domain. She served as VFX producer on a number of films, including Ridley Scott’s Kingdom of Heaven, Tim Burton’s Big Fish, Andrew Adamson’s Shrek and Marvel’s Iron Man. She won the Visual Effects Society (VES) Award for outstanding supporting visual effects/motion picture for Kingdom of Heaven, with two additional shared nominations (best single visual effects, outstanding visual effects/effects-driven motion picture) for Iron Man.

Eventually, she joined Marvel as the company’s EVP of visual effects and post, doubling as co-producer on Iron Man, a role she reprised on Iron Man 2, Thor and Captain America: The First Avenger. In 2011, she advanced to executive producer on the hit The Avengers and has since executive produced Marvel’s Iron Man 3, Captain America: The Winter Soldier and Captain America: Civil War, Thor: The Dark World, Avengers: Age of Ultron, Ant-Man, Guardians of the Galaxy, Doctor Strange, Guardians of the Galaxy Vol. 2, Spider-Man: Homecoming, Thor: Ragnarok, Black Panther, Avengers: Infinity War and most recently, Ant-Man and the Wasp.

She is currently at work on the untitled fourth installment of Avengers and Captain Marvel.

The Charles S. Swartz Award was named after executive Charles Swartz, who had a far-ranging creative and technical career, eventually leading the Entertainment Technology Center at the University of Southern California, a leading industry think tank and research center. The award is given at the discretion of the HPA Awards Committee and the HPA Board of Directors, and is not presented annually.

SIGGRAPH conference chair Roy C. Anthony: VR, AR, AI, VFX, more

By Randi Altman

Next month, SIGGRAPH returns to Vancouver after turns in Los Angeles and Anaheim. This gorgeous city, whose convention center offers a water view, is home to many visual effects studios providing work for film, television and spots.

As usual, SIGGRAPH will host many presentations, showcase artists’ work, display technology and offer a glimpse into what’s on the horizon for this segment of the market.

Roy C. Anthony

Leading up to the show — which takes place August 12-16 — we reached out to Roy C. Anthony, this year’s conference chair. For his day job, Anthony recently joined Ventuz Technology as VP, creative development. There, he leads initiatives to bring Ventuz’s realtime rendering technologies to creators of sets, stages and ProAV installations around the world.

SIGGRAPH is back in Vancouver this year. Can you talk about why it’s important for the industry?
There are 60-plus world-class VFX and animation studios in Vancouver. There are more than 20,000 film and TV jobs, and more than 8,000 VFX and animation jobs in the city.

So, Vancouver’s rich production-centric communities are leading the way in VFX production for television and feature films. They are also busy with new media content, games work and new workflows, including those for AR/VR/mixed reality.

How many exhibitors this year?
The conference and exhibition will play host to over 150 exhibitors on the show floor, showcasing the latest in computer graphics and interactive technologies, products and services. Due to the amount of new technology that has debuted in the computer graphics marketplace over the past year, almost one quarter of this year’s 150 exhibitors will be presenting at SIGGRAPH for the first time.

In addition to the traditional exhibit floor and conferences, what are some of the can’t-miss offerings this year?
We have increased the presence of virtual, augmented and mixed reality projects and experiences — and we are introducing our new Immersive Pavilion in the east convention center, which will be dedicated to this area. We’ve incorporated immersive tech into our computer animation festival with the inclusion of our VR Theater, back for its second year, as well as inviting a special, curated experience with New York University’s Ken Perlin — he’s a legendary computer graphics professor.

We’ll be kicking off the week in a big VR way with a special session following the opening ceremony featuring Ivan Sutherland, considered by many as “the father of computer graphics.” That 50-year retrospective will present the history and innovations that sparked our industry.

We have also brought Syd Mead, a legendary “visual futurist” (Blade Runner, Tron, Star Trek: The Motion Picture, Aliens, Time Cop, Tomorrowland, Blade Runner 2049), who will display an arrangement of his art in a special collection called Progressions. This will be seen within our Production Gallery experience, which also returns for its second year. Progressions will exhibit more than 50 years of artwork by Syd, from his academic years to his most current work.

We will have an amazing array of guest speakers, including those featured within the Business Symposium, which is making a return to SIGGRAPH after an absence of a few years. Among these speakers are people from the Disney Technology Innovation Group, Unity and Georgia Tech.

On Tuesday, August 14, our SIGGRAPH Next series will present a keynote speaker each morning to kick off the day with an inspirational talk. These speakers are Tony DeRose, a senior scientist at Pixar; Daniel Szecket, VP of design for Quantitative Imaging Systems; and Bob Nicoll, dean of Blizzard Academy.

There will be a 25th anniversary showing of the original Jurassic Park movie, hosted by “Spaz” Williams, a digital artist who worked on that film.

Can you talk about this year’s keynote and why he was chosen?
We’re thrilled to have ILM head and senior VP, ECD Rob Bredow deliver the keynote address this year. Rob is all about innovation — pushing through scary new directions while maintaining the leadership of artists and technologists.

Rob is the ultimate modern-day practitioner, a digital VFX supervisor who has been disrupting ‘the way it’s always been done’ to move to new ways. He truly reflects the spirit of ILM, which was founded in 1975 and is just one year younger than SIGGRAPH.

A large part of SIGGRAPH is its slant toward students and education. Can you discuss how this came about and why this is important?
SIGGRAPH supports education in all sub-disciplines of computer graphics and interactive techniques, and it promotes and improves the use of computer graphics in education. Our Education Committee sponsors a broad range of projects, such as curriculum studies, resources for educators and SIGGRAPH conference-related activities.

SIGGRAPH has always been a welcoming and diverse community, one that encourages mentorship, and acknowledges that art inspires science and science enables advances in the arts. SIGGRAPH was built upon a foundation of research and education.

How are the Computer Animation Festival films selected?
The Computer Animation Festival has two programs, the Electronic Theater and the VR Theater. Because of the large volume of submissions for the Electronic Theater (over 400), there is a triage committee for the first phase. The CAF chair then takes the high-scoring pieces to a jury of industry professionals. The jury’s selections then become the Electronic Theater show pieces.

The selections for the VR Theater are made by a smaller panel, comprised mostly of sub-committee members, who watch each film in a VR headset and vote.

Can you talk more about how SIGGRAPH is tackling AR/VR/AI and machine learning?
Since SIGGRAPH 2018 is about the theme of “Generations,” we took a step back to look at how we got where we are today in terms of AR/VR, and where we are going with it. Much of what we know today couldn’t have been possible without the research and creation of Ivan Sutherland’s 1968 head-mounted display. We have a fantastic panel celebrating the 50-year anniversary of his HMD, which is widely considered the first VR HMD.

AI tools are newer, and we created a panel that focuses on trends and the future of AI tools in VFX, called “Future Artificial Intelligence and Deep Learning Tools for VFX.” This panel gains insight from experts embedded in both the AI and VFX industries and gives attendees a look at how different companies plan to further their technology development.

What is the process for making sure that all aspects of the industry are covered in terms of panels?
Every year, new ideas for panels and sessions are submitted by contributors from all over the globe. Those submissions are then reviewed by a jury of industry experts, and it is through this process that panelists and cross-industry coverage are determined.

Each year, the conference chair oversees the program chairs, then each of the program chairs become part of a jury process — this helps to ensure the best program with the most industries represented from across all disciplines.

In the rare case a program committee feels they are missing something key in the industry, they can try to curate a panel in, but we still require that the panel be reviewed by subject matter experts before it is considered for final acceptance.

 

Mark Thorley joins Mill Film Australia as MD

Mill Film in Australia, a Technicolor VFX studio, has named Mark Thorley as managing director. His appointment comes in the wake of the February launch of Mill Film in Adelaide, Australia.

Thorley brings with him more than 15 years of executive experience, working at such studios as Lucasfilm Singapore, where he oversaw studio operations and production strategies. Prior to that, Thorley spent nine years at Animal Logic, at both their Los Angeles and Sydney locations, as head of production. He also held senior positions at Screen Queensland and Omnicom.

Throughout his career, Thorley has received credits on numerous blockbuster feature films, including Kong: Skull Island, Rogue One, Jurassic World and Avengers: Age of Ultron. Thorley will drive all aspects of VFX production, client relations and business development for Australia, reporting into the global head of Mill Film, Lauren McCallum.

Disfiguring The Man in Black for HBO’s Westworld

If you watch HBO’s Westworld, you are familiar with the Man in Black, a once-good guy turned bad. He is ruthless and easy to hate, so when karma caught up to him, audiences were not too upset about it.

Westworld doesn’t shy away from violence. In fact, it has a major role in the series. A recent example of an invisible effect displaying mutilation came during the show’s recent Season Two finale. CVD VFX, a boutique visual effects house based in Vancouver, was called on to create the intricate and gruesome result of what The Man in Black’s hand looked like after being blown to pieces.

During the long-awaited face-off between The Man in Black (Ed Harris) and Dolores Abernathy (Evan Rachel Wood), we see their long-simmering conflict culminate with his pistol pressed against her forehead, cocked and ready to fire. But when he pulls the trigger, the gun backfires and explodes in his hand, sending fingers flying into the sand and leaving horrifyingly bloody stumps.

CVD VFX’s team augmented the on-set footage to bring the moment to life in excruciating detail. Harris’ fingers were wrapped in blue in the original shot, and CVD VFX went to work removing his digits and replacing them with animated stubs, complete with the visceral details of protruding bone and glistening blood. The team used special effects makeup for reference on both blood and lighting, and were able to seamlessly incorporate the practical and digital elements.

The result was impressive, especially considering the short turnaround time that CVD had to create the effect.

“We were brought on a little late in the game, as we had a couple of weeks to turn it around,” explains Chris van Dyck, founder of CVD VFX, who worked with the show’s VFX supervisor, Jay Worth. “Our first task was to provide reference/style frames of what we’d be proposing. It was great to have relatively free rein to propose how the fingers were blown off. Ultimately, we had great direction, and once we put the shots together, everyone was happy pretty quickly.”

CVD used Foundry’s Nuke and Autodesk’s Maya to create the effect.

CVD VFX’s work on Westworld wasn’t the first time they worked with Worth. They previously worked together on Syfy’s The Magicians and Fox’s Wayward Pines.

Behind the Title: Steelhead MD Ted Markovic

NAME: Ted Markovic

COMPANY: LA-based Steelhead

CAN YOU DESCRIBE YOUR COMPANY?
We are a content studio and cross-platform production company. You can walk through our front door with a script and out the back with a piece of content. We produce everything from social to Super Bowl.

WHAT’S YOUR JOB TITLE?
Managing Director

WHAT DOES THAT ENTAIL?
I am responsible for driving the overall culture and financial health of the organization. That includes building strong client relationships, new business development, operational oversight, marketing, recruiting and retaining talent and managing the profits and losses of all departments.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
We all have a wide range of responsibilities and wear many hats. I occasionally find myself replacing the paper towels in the bathrooms because some days that’s what it takes.

WHAT’S YOUR FAVORITE PART OF THE JOB?
We are a very productive group that produces great work. I get a sense of accomplishment almost every day.

WHAT’S YOUR LEAST FAVORITE?
Replacing the paper towels in the bathrooms.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I get a lot more done while everyone else is busy eating their lunch or driving home.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Solving the traffic problem in Los Angeles. I see a lot of opportunities there.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I am a third-generation post production executive, and essentially grew up in a film lab in New York. I suspect the profession chose me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I am currently working on a Volkswagen Tier 2 project where we are shooting six cars over seven days on our stage at Steelhead. We’re incorporating dynamic camera shots of cars on a cyc with kinetic typography, motion graphics and VFX. It’s a great example of how we can do it all under one roof.

We recently worked with Nintendo and Interogate to bring the new Switch games to life in a campaign called Close Call. On set with rams, air mortars, lighting effects and lots of sawed-in-half furniture, we were able to create real weight in-camera to layer with our VFX. We augmented the practical effects with HDR light maps, fire and debris simulations, as well as procedurally generated energy beams, 3D models and 2D compositing to create a synergy between the practical and visual effects that really sells the proximity and sense of danger we were looking to create.

While the coordination of practical and post was no small chore, another interesting challenge we had to overcome was creating the CG weapons to mesh with the live-action plates. We started with low-resolution models directly from the games themselves, converted them and scrubbed in a good layer of detail and refined them to make them photoreal. We also had to conceptualize how some of the more abstract weapons would play with real-world physics.

Another project worth mentioning was a piece we created for Volkswagen called Strange Terrains. The challenge was to create a 360-degree day-to-night timelapse video, something that had never been done before. In order to get this unique footage, we had to build an equally unique rigging system. We partnered with Supply Frame to design and build a custom-milled aluminum head to support four 50.6-megapixel Canon EOS 5DS cameras.

The “holy grail” of timelapse photography is getting the cameras to ramp the exposure over broad light changes. This was especially challenging to capture due to the massive exposure changes in the sky and the harshness of the white salt. After capturing approximately 2,000 frames per camera — 9TB of working storage — we spent countless hours stitching, compositing, computing and rendering to get a fluid final product.
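The exposure-ramping idea can be sketched in a few lines. This is a hypothetical illustration — the function and the sample values are invented for the example, not taken from the actual production pipeline: metered per-frame exposure values (EV) are smoothed so abrupt jumps around sunset read as one continuous ramp.

```python
# Hypothetical sketch: smoothing per-frame exposure values (EV) for a
# day-to-night timelapse ramp. "Holy grail" workflows interpolate between
# metered exposures so the brightness change reads as continuous.

def smooth_ev(raw_ev, window=5):
    """Moving-average smoothing of a list of per-frame EV readings."""
    half = window // 2
    smoothed = []
    for i in range(len(raw_ev)):
        lo = max(0, i - half)          # clamp the window at the clip ends
        hi = min(len(raw_ev), i + half + 1)
        smoothed.append(sum(raw_ev[lo:hi]) / (hi - lo))
    return smoothed

# Example: abrupt metered jumps around sunset get eased into a ramp.
metered = [12.0, 12.0, 11.5, 9.0, 6.0, 5.5, 5.5]
ramped = smooth_ev(metered, window=3)
```

In a real pipeline the smoothed values would drive per-frame exposure compensation before stitching, but the principle — average out metering noise while preserving the overall day-to-night trend — is the same.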

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
About eight years ago, I created a video for my parents’ 40th wedding anniversary. My mother still cries when she watches it.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The wheel is a pretty essential piece of technology that I’m not sure I could live without. My smartphone, as expected, and my Sleepwell device for apnea. That device changed my life.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
I can work listening to anything but reggae.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Exercise.

Eli Rotholz joins Alkemy X as VP of biz dev

Creative content company Alkemy X has added Eli Rotholz as VP of business development. He will be based in the company’s New York headquarters.

Rotholz brings more than 12 years of sales/business development, strategy and production experience, having begun his career as an independent sales rep for Ziegler/Jakubowicz and Moustache NYC. From there, he worked in his first in-house position at Click3X, where he built and managed a diverse roster of directorial talent, as well as the company’s first integrated production offering focusing on live-action, VFX/design/animation and editorial.

Rotholz then founded Honor Society Films. He later joined Hone Production, a brand-direct-focused production company and consultancy, as director of business development/content EP.

“Very few companies in the industry can boast the strong directorial roster and VFX capabilities as Alkemy X,” says Rotholz. “In addition to the amazing entertainment work that Alkemy does, there’s definitely a trend in high-end ‘package’ productions where one company can do both live-action shoots with their directors, as well as editorial and VFX.”

The Orville VFX supervisor on mixing practical and visual effects

By Barry Goch

What do you get when you mix Family Guy and Ted creator Seth MacFarlane with science fiction? The most dysfunctional spaceship in the galaxy, that’s what. What is the Fox series The Orville? Well, it’s more Galaxy Quest/Spaceballs than it is Star Trek/Star Wars.

Set 400 years in the future, the show follows the spaceship Orville, captained by MacFarlane’s Ed Mercer, who has to work alongside his ex-wife as they wing their way through space on a science mission. As you might imagine with a show that is set in space, The Orville features a large number of visual and practical effects shots, including real and CG models of The Orville.

Luke McDonald

We reached out to the show’s VFX supervisor Luke McDonald to find out more.

How did the practical model of The Orville come about?
Jon Favreau was directing the pilot, and he and Seth MacFarlane had been kidding around about doing a practical model of The Orville. I jumped at the chance. In this day and age, a visual effects supervisor shooting models is an unheard-of thing to do, but something I was absolutely thrilled about.

Favreau’s visual effects supervisor is Rob Legato. I have worked with Rob on many projects, including Martin Scorsese’s Aviator, Shine a Light and Shutter Island, so I was very familiar with how Rob works. The only other chance that I had had to shoot models was with Rob during Shutter Island and Aviator, so in a sense, whenever Rob Legato shows up it’s model time (laughs). It’s so amazing because it’s just something that the industry shies away from, but given the opportunity it was absolutely fantastic.

Who built the practical model of The Orville?
Glenn Derry made it. He’s worked with Rob Legato on a few things, including Aviator. Glenn is kind of fantastic. He basically does motion control, models and motion capture. Glenn would also look at all the camera moves and all the previz that we did to make sure the camera moves were not doing something that the motion control rig could not do.

How were you able to seamlessly blend the practical model and the CG version of The Orville?
Once we had the design for The Orville, we would then previz out the ships flying by camera, doing whatever, and work out these specific moves. Any move that was too technical for the motion control rig, we would do a CG link-up instead — meaning that it would go from model to a CG ship or vice versa — to get the exact camera move that we wanted. We basically shot all of the miniatures of The Orville at three frames a second. It was kind of like shooting in slow-mo with the motion control rig, and we did about 16 passes per shot — lights on, lights off, key light, fill light, back light, ambient, etc. So, when we got all the passes back, we composited them just like we would any kind of full CG shot.

From the model shoot, we ended up with about 25 individual shots of The Orville. It’s a very time-consuming process, but it’s very rewarding because of how many times you’re going to have to reuse these elements to achieve completely new shots, even though it’s from the same original motion control shoot.
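Combining many separately shot lighting passes works because light adds linearly, so each pass can be summed in comp with its own gain — letting artists rebalance key, fill and ambient long after the shoot. A minimal sketch, with invented names and toy one-dimensional “scanlines” standing in for real image data:

```python
# Hypothetical sketch: lighting passes shot (or rendered) separately are
# combined additively in linear light, each scaled by a per-pass gain so
# the comp can rebalance the lighting without a reshoot.

def combine_passes(passes, gains):
    """Sum per-pixel values of several light passes, each scaled by a gain."""
    assert len(passes) == len(gains)
    width = len(passes[0])
    out = [0.0] * width
    for p, g in zip(passes, gains):
        for i in range(width):
            out[i] += p[i] * g
    return out

# Three toy "scanlines" standing in for key, fill and ambient passes.
key     = [0.80, 0.40, 0.10]
fill    = [0.20, 0.30, 0.20]
ambient = [0.05, 0.05, 0.05]

# Dial the fill down to half strength in the comp, no reshoot needed.
beauty = combine_passes([key, fill, ambient], gains=[1.0, 0.5, 1.0])
```

This is exactly why a multi-pass motion control shoot is so reusable: the same elements can be re-weighted per shot to produce lighting setups that were never captured as a single exposure.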

How did the shots of The Orville evolve over the length of the season?
We started to get into more dynamic things, such as big space battles and specific action patterning, where it really wasn’t feasible to continue shooting the model itself. But now we have a complete match for our CG version of The Orville that we can use for our big space battles, where the ship’s flying and whipping around. I need to emphasize that previz on this project was very crucial.

The Orville is a science vessel, but when it needs to throw down and fight, it has the capabilities to be quite maneuverable — it can barrel roll, flip and power slide around to get itself in position to get the best shot off. Seth was responding to these hybrid-type ship-to-ship shots and The Orville moving through space in a unique way when it’s in battle.
There was never a playbook. It was always, “Let’s explore, let’s figure out, and let’s see where we fit in this universe. Do we fit into the traditional Star Trek-y stuff, or do we fit into the Star Wars-type stuff?” I’m so pleased that we fit into this really unique world.

How was working with Seth MacFarlane?
Working with Seth has been absolutely amazing. He’s such a dedicated storyteller, even down to the most minute things. He’s such an encyclopedia of sci-fi knowledge, be it Star Trek, Star Wars, Battlestar Galactica or the old-school Buck Rogers and Flash Gordon. All of them are part of his creative repertoire. It’s very rare that he makes a reference that I don’t get, because I’m exactly the same way about sci-fi.

How different is creating VFX for TV versus film?
TV is not that new to me, but for the last 10 years I’ve been doing film work for Bad Robot and JJ Abrams. It was a strange awakening coming to TV, but it wasn’t horrifying. I had to approach things in a different way, especially from a budget standpoint.

Rachel Matchett brought on to lead Technicolor Visual Effects

Technicolor has hired Rachel Matchett to head the post production group’s newly formed VFX brand, Technicolor Visual Effects. Working side-by-side within the same facilities where post services are offered, Technicolor Visual Effects is expanding to a global offering with an integrated pipeline. Technicolor is growing its namesake VFX team apart from the company’s other visual effects brands: MPC, The Mill, Mr. X and Mikros.

A full-service creative VFX house with local studios in Los Angeles, Toronto and London, Technicolor Visual Effects’ recent credits include the feature films Avengers: Infinity War, Black Panther, Paddington 2, and episodic series such as This Is Us, Anne With an E and Black Mirror.

Matchett joins Technicolor from her long-tenured position at MPC Film. Her background at MPC London includes nearly a decade of senior management positions at the studio. She most recently served as MPC London’s global head of production. In that role, her divisions at MPC Film oversaw and carried out visual effects on a number of films each year, including director Jon Favreau’s Academy Award-winning The Jungle Book and the critically acclaimed Blade Runner 2049.

“Technicolor Visual Effects is emerging from its position as one of the industry’s best-kept secrets. While continuing to support clients who do color finishing with us, we are excited to work with storytellers from script to screen,” says Matchett. “Having been at the heart of MPC Film’s rapid growth over the past decade, I feel that there is a great opportunity for Technicolor’s future role in VFX to forge a new path within the industry.”

Behind the Title: Weta’s Head of Tech & Research Luca Fascione

NAME: Luca Fascione

COMPANY: Wellington, New Zealand’s Weta Digital

WHAT’S YOUR JOB TITLE?
Senior Head of Technology and Research

WHAT DOES THAT ENTAIL?
In my role, I lead the activities of Weta Digital that provide software technology to the studio and our partners. There are various groups that form technology and research: Production Engineering oversees the studio’s pipeline and infrastructure software, Software Engineering oversees our large plug-ins such as our hair system (Barbershop/Wig), our tree growth system (Lumberjack/Totara) and our environment construction system (Scenic Designer), to name a few.

Two more departments make up the technology and research group: Rendering Research and Simulation Research. These departments oversee our proprietary renderer, Manuka, and our physical simulation system, Synapse. Both groups have a strong applied research focus; as well as producing software, they are often involved in the publication of scientific papers.

HOW DID YOU GET INTO THIS BUSINESS?
Cinema and computers have been favorites of mine (as well as music) since I was a little kid. We used to play a game when I was maybe 12 or so where we would watch about five seconds of a random movie on TV, turn it off, and recite the title. I was very good at that.

A couple of my friends and I watched all the movies we could find, from arthouse European material to more commercial, mainstream content. When it came time to find a job, I thought finding a way to merge my passion for cinema and my interest in computers into one would be great, if I could.

HOW LONG HAVE YOU BEEN WORKING IN THIS INDUSTRY?
I started at Weta Digital in 2004. Before that I was part of the crew working on the animated feature Valiant, where I started in 2002. I guess this would make it 15 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD, WHAT’S BEEN BAD?
Everything got bigger, especially the content we want to work with relative to the machines we use to achieve our goals. As much as technology has improved, our ability to drive the hardware extremely hard has grown faster, creating a need for technically creative, innovative solutions to our scaling problems.

Graphics is running out of “easy problems” that one can solve drawing inspiration from other fields of science, and it’s sometimes the case that our research has outpaced the advancements of similar problems in other fields, such as medicine, physics or engineering. At the same time, especially since the recent move toward deep learning and “big data” problems, the top brains in the field are all drawn away from graphics, making it harder than it used to be to get great talent.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I work in VFX because of Jurassic Park. Although I must also recognize Young Sherlock Holmes and Terminator 2, which also played a big role in this space. During my career in VFX, King Kong and Avatar have been life-shaping experiences.

DID YOU GO TO FILM SCHOOL?
Not at all, I studied Mathematics in Rome, Italy. All I know about movies is due to personal study work. Back in those days nobody taught computer graphics at this level for VFX. The closest were degrees in engineering schools that maybe had a course or two in graphics. Things have changed massively since then in this area.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The variety. I run into a lot of extremely interesting problems, and I like being able to help people find good ways to solve them.

WHAT’S YOUR LEAST FAVORITE?
A role like mine necessarily entails having to have many difficult conversations with crew. I am extremely pleased to say the majority of these result in opportunities for growth and deepening of our mutual understandings. I love working with our crew, they’re great people and I do learn a lot every day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I like my job, I don’t often think about doing something else. But I have on occasion wondered what it would be like to build guitars for a living.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The latest War for the Planet of the Apes movie has been a fantastic achievement for the studio. The Technology and Research group has contributed a fair bit of software to the initiative, from our forest system Totara to a new lighting pipeline called PhysLight, a piece of work I was personally involved in and that I am particularly proud of.

During our work on The Jungle Book, we helped the production by reshaping our instancing system to address the dense forests in the movie. Great advancements in our destruction systems were also developed for Rampage.

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
It turns out three of my early projects played a role of some importance in the making of Avatar: The facial solver, the sub-surface scattering system and PantaRay (our Spherical Harmonics occlusion system). After that, I’m extremely proud of my work on Manuka, Weta Digital’s in-house renderer.

WHERE DO YOU FIND INSPIRATION NOW?
All around me, it’s the people, listening to their experiences, problems and wishes. That’s how our job is done.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I play guitar and I build audio amplifiers. I have two daughters in primary school that are a lot of fun and a little boy just joined our family last December. I do take the occasional picture as well.

Chimney opens in New York City, hires team of post vets

Chimney, an independent content company specializing in film, television, spots and digital media, has opened a new facility in New York City. For over 20 years, the group has been producing and posting campaigns for brands, such as Ikea, Audi, H&M, Chanel, Nike, HP, UBS and more. Chimney was also the post partner for the feature films Chappaquiddick, Her, Atomic Blonde and Tinker Tailor Soldier Spy.

With this New York opening, Chimney now has 14 offices worldwide. Founded in Stockholm in 1995, the company opened its first US studio in Los Angeles last year. In addition to Stockholm, New York and LA, Chimney also has facilities in Singapore, Copenhagen, Berlin and Sydney, among other cities.

“Launching in New York is a benchmark long in the making, and the ultimate expression of our philosophy of ‘boutique-thinking with global power,’” says Henric Larsson, Chimney founder and COO. “Having a meaningful presence in all of the world’s economic centers with diverse cultural perspectives means we can create and execute at the highest level in partnership with our clients.”

The New York opening supports Chimney’s mission to connect its global talent and resources, effectively operating as a 24-hour, full-service content partner to brand, entertainment and agency clients, no matter where they are in the world.

Chimney has signed on several industry vets to spearhead the New York office. Leading the US presence is CEO North America Marcelo Gandola. His previous roles include COO at Harbor Picture Company; EVP at Hogarth; SVP of creative services at Deluxe Entertainment Services Group; and VP of operations at Company 3.

Colorist and director Lez Rudge serves as Chimney’s head of color North America. He is a former partner and senior colorist at Nice Shoes in New York. He has worked alongside Spike Lee and Darren Aronofsky, and on major brand campaigns for Maybelline, Revlon, NHL, Jeep, Humira, Spectrum and Budweiser.

Managing director Ed Rilli will spearhead the day-to-day logistics of the New York office. As the former head of production of Nice Shoes, his resume includes producing major campaigns for such brands as NFL, Ford, Jagermeister and Chase.

Sam O’Hare, chief creative officer and lead VFX artist, will oversee the VFX team. Bringing experience in live-action directing, VFX supervision, still photography and architecture, O’Hare’s interdisciplinary background makes him well suited for photorealistic CGI production.

In addition, Chimney has brought on cinematographer and colorist Vincent Taylor, who joins from MPC Shanghai, where he worked with brands such as Coca-Cola, Porsche, New Balance, Airbnb, BMW, Nike and L’Oréal.

The 6,000-square-foot office will feature Blackmagic Resolve color rooms, Autodesk Flame suites and a VFX bullpen, as well as multiple edit rooms, a DI theater and a Dolby Atmos mix stage through a joint venture with Gigantic Studios.

Main Image: (L-R) Ed Rilli, Sam O’Hare, Marcelo Gandola and Lez Rudge.

Framestore London adds joint heads of CG

Framestore has named Grant Walker and Ahmed Gharraph as joint heads of CG at its London studio. The two will lead the company’s advertising, television and immersive work alongside head of animation Ross Burgess.

Gharraph has returned to Framestore after a two-year stint at ILM, where he was lead FX artist on Star Wars: The Last Jedi, receiving a VES nomination for Outstanding Effects Simulations in a Photoreal Feature. His credits on the advertising side as CG supervisor include Mog’s Christmas Calamity, which was Sainsbury’s 2015 festive campaign, and Shell V-Power Shapeshifter, directed by Carl Erik Rinsch.

Walker joined Framestore in 2009, and in his time at the company he has worked across film, advertising and television, building a portfolio as a CG artist with campaigns, including Freesat’s VES-nominated Sheldon. He was also instrumental in Framestore’s digital recreation of Audrey Hepburn in Galaxy’s 2013 campaign Chauffeur for AMV BBDO. Most recently, he was BAFTA-nominated for his creature work in the Black Mirror episode, “Playtest.”

Lauren McCallum to head Mill Film’s new Montreal studio

Mill Film will open a new facility in Montréal, Québec with operations starting this summer. The announcement follows the February launch of Mill Film in Adelaide, Australia.

Mill Film, a Technicolor studio that won an Academy Award for best visual effects for the movie Gladiator in 2001, will focus on the needs of streaming and episodic content — in addition to long-form film, which is the domain of existing Technicolor VFX brands, including MPC Film and Mr. X.

Global head of Mill Film Lauren McCallum will head the new studio. Throughout her career, McCallum has been known for leading creative talent on features like Blade Runner 2049 and Wonder Woman, as well as her work on the 2017 Oscar-winning The Jungle Book.

A specialist in VFX management, McCallum will oversee all aspects of production along with driving operations and strategy. A 10-year VFX veteran, McCallum was most recently head of production at MPC Film, and prior to that was at London’s Framestore and Prime Focus World.

Behind the Title: Versus Partner/CD Justin Barnes

NAME: Justin Barnes

COMPANY: Versus (@vs_nyc)

CAN YOU DESCRIBE YOUR COMPANY?
We are “versus” the traditional model of a creative studio. Our approach is design driven and full service. We handle everything from live action to post production, animation and VFX. We often see projects from concept through delivery.

WHAT’S YOUR JOB TITLE?
Partner and Creative Director

WHAT DOES THAT ENTAIL?
I handle the creative side of Versus. From pitching to ideation, thought leadership and working closely with our editors, animators, artists and clients to make our creative — and our clients’ creative vision — the best it can be.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
There’s a lot of business and politics that you have to deal with being a creative.

Adidas

WHAT’S YOUR FAVORITE PART OF THE JOB?
Every day is different, full of new challenges and the opportunity to come up with new ideas and make really great work.

WHAT’S YOUR LEAST FAVORITE?
When I have to deal with the business side of things more than the creative side.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
For me, it’s very late at night; the only time I can work with no distractions.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Anything in the creative world.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
It’s been a natural progression for me to be where I am. Working with creative and talented people in an industry with unlimited possibilities has always seemed like a perfect fit.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
– Re-brand of The Washington Post
– Animated content series for the NCAA
– CG campaign for Zyrtec
– Live-action content for Adidas and Alltimers collaboration

Zyrtec

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I am proud of all the projects we do, but the ones that stick out the most are the projects with the biggest challenges that we have pulled together and made look amazing. That seems like every project these days.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My laptop, my phone and Uber.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I can’t live without Pinterest. It’s a place to capture the huge streams of inspiration that come at us each day.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
We have music playing in the office 24/7, everything from hip-hop to classical. We love it all. When I am writing for a pitch, I need a little more concentration. I’ll throw on my headphones and put on something that I can get lost in.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Working on personal projects is big in helping de-stress. Also time at my weekend house in Connecticut.

Lindsay Seguin upped to EP at NYC’s FuseFX

Visual effects studio FuseFX has promoted Lindsay Seguin to executive producer in the studio’s New York City office. Seguin is now responsible for overseeing all client relationships at the FuseFX New York office, acting as a strategic collaborator for current and future productions spanning television, commercial and film categories. The company also has an office in LA.

Seguin, who first joined FuseFX in 2014, was previously managing producer. During her time with the company, she has worked with a number of high-profile client productions, including The Blacklist, Luke Cage, The Punisher, Iron Fist, Mr. Robot, The Get Down and the feature film American Made.

“Lindsay has played a key role in the growth and success of our New York office, and we’re excited for her to continue to forge partnerships with some of our biggest clients in her new role,” says Joseph Bell, chief operating officer and executive VP of production at FuseFX.

“We have a really close-knit team that enjoys working together on exciting projects,” Seguin added about her experience working at FuseFX. “Our crew is very savvy and hardworking, and they manage to maintain a great work/life balance, even as the studio delivers VFX for some of the most popular shows on television. Our goal is to have a healthy work environment and produce awesome visual effects.”

Seguin is a member of the Visual Effects Society and the Post New York Alliance. Prior to making the transition to television and feature work, her experience was primarily in national broadcast and commercial projects, which included campaigns for Wendy’s, Garnier, and Optimum. She is a graduate of Penn State University with a degree in telecommunications. Born in Toronto, Seguin is a dual citizen of Canada and the United States.

Creative editorial and post boutique Hiatus opens in Detroit

Hiatus, a full-service, post production studio with in-house creative editorial, original music composition and motion graphics departments, has opened in Detroit. Their creative content offerings cover categories such as documentary, narrative, conceptual, music videos and advertising media for all video platforms.

Led by founder/senior editor Shane Patrick Ford, the new company includes executive producer/partner Catherine Pink and executive producer Joshua Magee, who joins Hiatus from the animation studio Lunar North. Additional talent includes editor Josh Beebe, composer/editor David Chapdelaine and animator James Naugle.

The roots of Hiatus began with The Factory, a music venue founded by Ford while he was still in college. It provided a venue for local Detroit musicians to play, as well as touring bands. Ford, along with a small group of creatives, then formed The Work – a production company focused on commercial and advertising projects. For Ford, the launch of Hiatus is an opportunity to focus solely on his editorial projects and to expand his creative reach and that of his team nationally.

Leading up to the launch of Hiatus, the team has worked on projects for brands such as Sony, Ford Motor Company, Acura and Bush’s, as well as recent music videos for Lord Huron, Parquet Courts and the Wombats.

The Hiatus team is also putting the finishing touches on the company’s first original feature film Dare to Struggle, Dare to Win. The film uncovers a Detroit Police decoy unit named STRESS and the efforts made to restore civil order in 1970s post-rebellion Detroit. Dare to Struggle, Dare to Win makes its debut at the Indy Film Festival on Sunday April 29th and Tuesday May 1st in Indianapolis, before it hits the film festival circuit.

“Launching Hiatus was a natural evolution for me,” says Ford. “It was time to give my creative team even more opportunities, to expand our network and to collaborate with people across the country that I’ve made great connections with. As the post team evolved within The Work, we outgrew the original role it played within a production company. We began to develop our own team, culture, offerings and our own processes. With the launch of Hiatus, we are poised to better serve the visual arts community, to continue to grow and to be recognized for the talented creative team we are.”

“Instead of having a post house stacked with people, we’d prefer to stay small and choose the right personal fit for each project when it comes to color, VFX and heavy finishing,” explains Hiatus EP Catherine Pink. “We have a network of like-minded artists that we can call on, so each project gets the right creative attention and touch it deserves. Also, the lower overhead allows us to remain nimble and work with a variety of budget needs and all kinds of clients.”

Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. Here is the trailer. If you like what you see, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond.

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex it would be to create this idea I was writing in the script. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed and later additional financing came in (during post production of the film), so I wanted to ensure everything was mapped out technically, as there were no “fix it in post” scenarios in this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and other casting to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel — with each of those opportunities the script was tweaked to make full use of the location opportunities we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film — and has been of my career since my short films — so they came on as one of the executive producers on the film. No one had ever shot a full feature film using just Blackmagic cameras. We also used a Resolve pipeline through to delivery. So The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think is the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. This made the workflow as simple as exporting my Resolve file and handing the material over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked down on that first before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film Origin Unknown (starring Katee Sackhoff from Battlestar Galactica, The Flash) was green-lit and that had its own set of challenges. We can talk more about that when the film is released theatrically and VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP and more contracts. The advice I got from other filmmakers was that the right distributor is a big part of how your film will be released, and to me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, and therefore I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn't have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal due to the documentary narrative. We did first try to do it all in-camera using a practical suit, but it wasn't achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity to enable object tracking to work as accurately as possible. We also shot multiple reference passes from all angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 was human consciousness in a synthetic shell, breathing didn't make sense, and we compensated by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

For a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn't — in fact, the only stage used was for the "white room" astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give it a surreal feeling. We used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

End of the Line director Jessica Sanders

By Randi Altman

After watching End of the Line, I found myself thinking about the short film's content… a lot. Based on a short story by Aimee Bender, director Jessica Sanders' version starts off with a man walking into a pet store, looking into what the audience assumes is a birdcage and walking out not with a bird but a little man in a cage.

We see how Big Man (Stranger Things’ Brett Gelman) tries to take care of Little Man (Big Bang Theory‘s Simon Helberg) and then we see his frustration when the clearly well-read and intelligent Little Man tells the story of how he was taken from his family and put in a cage. Big Man’s behavior becomes increasingly disturbing, leading him to torture Little Man.

We reached out to director Sanders — who has an Oscar nomination thanks to her short documentary, Sing! — to talk about making the film, which is part of Refinery29 and TNT’s Shatterbox Anthology, a short film series dedicated to supporting the voices of female filmmakers.

Let’s start with cameras. What did you shoot on, and how involved in that process are you?
We shot on the Alexa Mini with Panavision primo lenses. I like to go over lenses/looks with my DP, but defer to what the DP wants to shoot on. For this project, I worked with ultra-talented DP Brett Pawlak.

How long was the shoot, and how much pre-production did you do? I’m assuming a good amount considering the VFX shots?
The film, although short (14 minutes), was essentially a feature in terms of preparation and the production scope/crew size, shooting for six days. We had about two months of intense prep leading up to the shoot, from location scouting to the art department. For example, we built a 30-foot penis and a 30-foot cage. The VFX approach was an intensive collaboration between VFX supervisor Eva Flodstrom, my DP Brett, production designer Justin Trask, producer Louise Shore and myself.

We had 67 VFX shots, so I storyboarded the film early on and then photoboarded each shot when we had our locations. We had a specific VFX/production approach to execute each shot from a mix of practicals (building the giant cage), to strictly greenscreen (i.e., when the little man is on a calculator). It was a highly involved and collaborative process.

Was your VFX supervisor on set?
Yes. Eva was highly involved from the beginning for all of prep, and on set she was instrumental. We worked closely with a DIT video assist so we could do a rough VFX comp of each shot while we were shooting. After production, it took about four months to finish post and visual effects.

I wanted to work with Eva, as she’s a pro, having worked on Star Wars and Star Trek (also, there are very few female VFX supervisors). Our approach/philosophy to VFX was similar — inspired by Michel Gondry’s and Spike Jonze’s work in which the VFX feels human, warm and practical, integral to the story and characters, never distracting.

Can you talk about the challenges of directing a project with VFX?
I had never done a VFX-heavy film before, and creatively, as a director, I wanted to challenge myself. I had a blast and want to do more films with VFX after this experience. Because I surrounded myself with top artists who had VFX experience, it was a totally enjoyable experience. We had a lot of fun making this film!

This was likely a hard story to tell. As the viewer you think it’s going to be a sweet story about a guy and his bird, but then…
I read Aimee Bender’s short story End of the Line in her book Willful Creatures in 2005 and have been passionate about it since then. The story takes the audience on an emotional rollercoaster. It’s funny, dark, explores themes of loneliness, desire and abuse of power, in a short amount of time. There are a lot of tonal shifts, and I worked closely with screenwriter Joanne Giger to achieve this balance.

How did you as a director set out to balance the humor, the sadness, the kinda disturbing stuff, etc.?
I played the film visually and tonally very grounded (i.e., the rule of this world is that the big-person and tiny-person worlds live side by side), and from that I could push the humor and darkness. In the performances, I wanted there to be an emotional truth to what the characters are experiencing that feels human and real, despite the fantastical setting. So I played with a lot of the mix of feelings within this very grounded surreal world.

The color is sort of muted in an old-timey kind of way. Can you talk about what you wanted from the color and mood?
I’m very sensitive to color and attention to detail. We wanted the film to feel timeless, although it is contemporary. Costume designer Shirley Kurata is amazing with color blocking and visual storytelling with color. Because Big Man’s world is more depressed and lonely, his tones are gray, the house is dark wood. As Big Man gains power, he wears more color. My DP has a very naturalistic approach with his lighting, so I wanted everything to feel very natural.

When we colored the film later in post, the approach was to do very little to the film, as it was captured in-camera. Production designer Justin Trask is a genius — from how he designed and built the giant penis (to feel less naturalistic) to the details of Little Man’s cage (his furniture, the giant bread crumb on a coin). We had a lot of fun exploring all the miniature props and details in this film.

How did you work with your editors? What did they cut on?
Because of the VFX, we edited on Adobe Premiere. I worked with editor Stephen Berger, who helped shape the film and did an amazing job doing the rough VFX comps in the edit. He is great with music and brought musical inspirations, which led to composer Pedro Bromfman’s entire saxophone score. Pedro is a big composer from Brazil and did my last documentary March of the Living. Editor Claudia Castello is incredible with performance, building the emotional arc of each character. She edited Fruitvale Station, Creed and was an editor on Black Panther. It was a great collaborative experience.

You had a lot of women on the crew. It seems like you went out of your way to find female talent. Why is this so important to you and the industry in general?
As a woman and a woman of Asian descent (I'm half Chinese), it's important to me to be surrounded by a diverse group of collaborators and to hire with as much gender equality as possible. I love working with talented women and supporting women. The world is a diverse place. It's important to me to have different perspectives reflected in filmmaking and representation. There is huge inequality in the hiring practices in Hollywood (4% of Hollywood feature films were directed by women last year), so it's critical to hire talented, qualified women.

Do you think things are getting better for females in the industry, especially in the more technical jobs?
I’ve always hired female cinematographers, editors and worked with Eva Flodstrom for VFX. With my friend/colleague Rachel Morrison, who is the first female cinematographer nominated for an Oscar, I hope things are changing for women with more visibility and dialogue. Change can only happen by actually hiring talented women (like director Ryan Coogler (Black Panther) who works with female cinematographers and editors).

You’ve directed both narrative and documentary projects. Do you have a preference, or do you enjoy switching back and forth?
This film marks a new creative chapter and narrative direction in my work. I love my background in documentaries, but I am fully focused on narrative filmmaking at the moment.

How was Sundance for you and the film?
Sundance was an incredible experience and platform for the film. We were IndieWire’s Top 10 Must See Films. My creative team came out, including actors Simon Helberg and Vivian Bang. It was a blast!

Digital locations for Scandal/How to Get Away With Murder crossover

If you like your Thursday night television served up with a little Scandal and How to Get Away With Murder, then you likely loved the recent crossover episodes that paired the two shows' leading ladies. VFX Legion, which has a brick-and-mortar office in LA but artists all over the world, was called on to create a mix of photorealistic CG environments and other effects that made it possible for the shows' actors to appear in a variety of digital surroundings, including iconic locations in Washington, DC.

VFX Legion has handled all of the visual effects for both shows for almost three years, and is slated to work on the next season of Murder (this is Scandal’s last season). Over the years, the Shondaland Productions have tasked the company with creating high shot counts for almost 100 episodes, each matching the overall look of a single show. However, the crossover episodes required visual effects that blended with two series that use different tools and each have their own look, presenting a more complex set of challenges.

For instance, Scandal is shot on an Arri Alexa camera, and How to Get Away With Murder on a Sony F55, at different color temps and under varying lighting conditions. DP preferences and available equipment required each environment to be shot twice, once with greenscreens for Scandal and then again using bluescreens for Murder.

The replication of the Supreme Court Building is central to the storyline. Building its exterior facade and the interiors of the courtroom and rotunda digitally from the ground up was the most complex visual effects work created for the episodes.

The process began during preproduction with VFX supervisor Matthew T. Lynn working closely with the client to get a full understanding of their vision. He collaborated with VFX Legion head of production, Nate Smalley, production manager Andrew Turner and coordinators Matt Noren and Lexi Sloan on streamlining workflow and crafting a plan that aligned with the shows’ budgets, schedules, and resources. Lynn spent several weeks on R&D, previs and mockups. Legion’s end-to-end approach was presented to the staffs of both shows during combined VFX meetings, and a plan was finalized.

A rough 3D model of the set was constructed from hundreds of reference photographs stitched together using Agisoft Photoscan and photogrammetry. HDRI panoramas and 360-degree multiple exposure photographs of the set were used to match the 3D lighting with the live-action footage. CG modeling and texturing artist Trevor Harder then added the fine details and created the finished 3D model.

CG supervisor Rommél S. Calderon headed up the team of modeling, texturing, tracking, layout and lighting artists that created Washington, DC’s Supreme Court Building from scratch.

“The computer-generated model of the exterior of the building was a beast, and scheduling was a huge job in itself,” explains Calderon. “Meticulous planning, resource management, constant communication with clients and spot-on supervision were crucial to combining the large volume of shots without causing a bottleneck in VFX Legion’s digital pipeline.”

Ken Bishop, VFX Legion’s lead modeler, ran into some interesting issues while working with footage of the lead characters Olivia Pope and Annalise Keating filmed on the concrete steps of LA’s City Hall. Since the Supreme Court’s staircase is marble, Bishop did a considerable amount of work on the texture, keeping the marble porous enough to blend with the concrete in this key shot.

Compositing supervisor Dan Short led his team through the process of merging the practical photography with renders created with Redshift and then seamlessly composited all of the shots using Foundry’s Nuke.


Netflix’s Altered Carbon: the look, the feel, the post

By Randi Altman

Netflix’s Altered Carbon is a new sci-fi series set in a dystopian future where people are immortal thanks to something called “stacks,” which contain their entire essence — their personalities, their memories, everything. The one setback is that unless you are a Meth (one of the rich and powerful), you need to buy a “sleeve” (a body) for your stack, and it might not have any resemblance to your former self. It could be a different color, a different sex, a different age, a different everything. You have to take what you can get.

Based on a 2002 novel by Richard K. Morgan, it stars Swedish actor Joel Kinnaman.

Jill Bogdanowicz

We reached out to the show’s colorist, Jill Bogdanowicz, as well as post producer Allen Marshall Palmer to find out more about the show’s varied and distinctive looks.

The look has a very Blade Runner-type feel. Was that in homage to the films?
Bogdanowicz: The creators wanted a film noir look. Blade Runner is the same genre, but the show isn’t specifically an homage to Blade Runner.

Palmer: I’ll leave that for fans to dissect.

Jill, can you talk about your process? What tools did you use?
Bogdanowicz: I designed a LUT to create that film noir look before shooting. I actually provided a few options, and they chose my favorite one and used it throughout. After they shot everything and I had all 10 episodes in my bay, I got familiar with the content, wrapped my head around the story and came up with ideas to tell that story with color.

The show covers many different times and places so scenes needed to be treated visually to show audiences where the story is and what’s happened. I colored both HDR (Dolby Vision) and SDR passes using DaVinci Resolve.

I worked very closely with both DPs — Martin Ahlgren and Neville Kidd — in pre-timing the show, and they gave me a nice idea of what they were looking for so I had a great starting point. They were very close knit. The entire team on this project was an absolute pleasure, and it was a great creative collaboration, which comes through in the final product of the show.

The show is shot and posted like a feature and has a feature feel. Was that part of your marching orders?
Bogdanowicz: I’m primarily a features colorist, so I’m very familiar with the film noir look and heavy VFX, and that’s one reason I was included on this project. It was right up my alley.

Palmer: We approached Altered Carbon as a 10-part feature rather than a television series. I coined the term “feature episodic entertainment,” which describes what we were aspiring to — destination viewing instead of something merely disposable. In a world with so many viewing options, we wanted to command the viewer’s full attention, and fans are rewarded for that attention.

We were very concerned about how images, especially VFX, were going to look in HDR so we had weekly VFX approval sessions with Jill, our mastering colorist, in her color timing bay.

Executive producers and studio along with the VFX and post teams were able to sit together — adjusting color corrections if needed before giving final approval on shots. This gave us really good technical and creative quality control. Despite our initial concerns about VFX shots in HDR, we found that with vendors like Double Negative and Milk with their robust 16-bit EXR pipelines we weren’t “breaking” VFX shots when color correcting for HDR.

How did the VFX affect the workflow?
Bogdanowicz: Because I was brought on so early, the LUT I created was shared with the VFX vendors so they had a good estimation of the show’s contrast. That really helped them visualize the look of the show so that the look of the shots was pretty darn close by the time I got them in my bay.

Was there a favorite scene or scenes?
Bogdanowicz: There are so many spectacular moments, but the emotional core for me is in episode 104 when we see the beginning of the Kovacs and Quell love story in the past and how that love gives Kovacs the strength to survive in the present day.

Palmer: That’s a tough question! There are so many, it’s hard to choose. I think the episode that really jumps out is the one in which Joel Kinnaman’s character is being tortured and the content skips back and forth in time, changes and alternates between VR and reality. It was fun to create a different visual language for each space.

Can you talk about challenges in the process and how you overcame them?
Bogdanowicz: The show features a lot of VFX and they all need to look as real as possible, so I had to make sure they felt part of the worlds. Fortunately, VFX supervisor Everett Burrell and his team are amazing and the VFX is top notch. Coming up with different ideas and collaborating with producers James Middleton and Laeta Kalogridis on those ideas was a really fun creative challenge. I used the Sapphire VFX plugin for Resolve to heavily treat and texture VR looks in different ways.

Palmer: In addition to the data management challenges on the picture side, we were dealing with mixing in Dolby Atmos. It was very easy to get distracted with how great the Atmos mix sounds — the downmixes generally translated very well, but monitoring in 5.1 and 2.0 did reveal some small details that we wanted to adjust. Generally, we're very happy with how both the picture and sound are translating into viewers' homes.

Dolby Vision HDR is great at taking what's in the color bay into the home viewing environment, but there are still so many variables in viewing set-ups that you can still end up chasing your own tail. It was great to see behind the scenes how dedicated Netflix is to providing the best picture and sound quality through the service.

The look of the AI hotel was so warm. I wanted to live there. Can you talk about that look?
Bogdanowicz: The AI hotel look was mostly done in design and lighting. I saw the warm practical lights and rich details in the architecture and throughout the hotel and ran with it. I just aimed to keep the look filmic and inviting.

What about the look of where the wealthy people lived?
Bogdanowicz: The Meth houses are above the clouds, so we kept the look very clean and cool with a lot of true whites and elegant color separation.

Seems like there were a few different looks within the show?
Bogdanowicz: The same LUT for the film noir look is used throughout the show, but the VR looks are very different. I used Sapphire to come up with different concepts and textures for the different VR looks, from the rich quality of the high-end VR to the cheap VR found underneath a noodle bar.

Allen, can you walk us through the workflow from production to post?
Palmer: With the exception of specialty shots, the show was photographed on Alexa 65 — mostly in 5K mode, but occasionally in 6.5K and 4K for certain lenses. The camera is beautiful and a large part of the show's cinematic look, but it generates a lot of data (about 1.9TB/hour for 5K), so this was the first challenge. The camera dictates using the Codex Vault system, and Encore Vancouver was up to the task of handling this material. We wanted to get the amount of data down for post, so we generated 4096×2304 ProRes 4444XQ "mezzanine" files, which we used for almost all of the show assembly and VFX pulls.
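That 1.9TB/hour figure is easy to sanity-check with back-of-the-envelope math. A minimal sketch, assuming uncompressed 12-bit raw at 24fps and a 5120×2880 "5K" frame — the exact sensor mode, bit depth and frame rate are assumptions for illustration, not details stated in the article:

```python
def tb_per_hour(width_px, height_px, bits_per_pixel, fps):
    """Rough uncompressed camera data rate in decimal terabytes per hour.

    Assumptions (not from the article): packed raw samples with no
    container overhead or compression.
    """
    bytes_per_frame = width_px * height_px * bits_per_pixel / 8
    bytes_per_hour = bytes_per_frame * fps * 3600
    return bytes_per_hour / 1e12

# Hypothetical Alexa 65 5K mode: 5120x2880, 12-bit samples, 24 fps.
rate = tb_per_hour(5120, 2880, 12, 24)
print(f"{rate:.2f} TB/hour")  # prints "1.91 TB/hour"
```

Under those assumptions the math lands almost exactly on the quoted ~1.9TB/hour, which shows why a lower-data-rate ProRes mezzanine pass was worth generating for editorial and VFX pulls.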

During production and post, all of our 4K files were kept online at Efilm using their Portal system. This allowed us fast, automated access to the material, so we could quickly do VFX pulls, manage color, generate 16-bit EXR frames and send those off to VFX vendors. We knew that time saved there was going to give us more time on the back end to work creatively on the shots, so the Portal was a very valuable tool.

How many VFX shots did you average per episode? Seems like a ton, especially with the AI characters. Who provided those and what were those turnarounds like?
Palmer: There were around 2,300 visual effects shots during this season — probably fewer than most people would think, because we built a large Bay City street inside a former newspaper printing facility outside of Vancouver. The shot turnaround varied depending on the complexity and where we were in the schedule. We were lucky that something like episode 1's "limo ride" sequence was started very early on, because it gave us a lot of time to refine our first grand views of Bay City. Our VFX supervisor Everett Burrell and VFX producer Tony Meagher were able to get us out in front of a lot of challenges, like the amount of 3D work in the last two episodes, by starting that work early on, since we knew from the script and prep phase that we would need those shots.

Review: HP’s lower-cost DreamColor Z24x display

By Dariush Derakhshani

So, we all know how important a color-accurate monitor is in making professional-level graphics, right? Right?!? Even at the most basic level, when you’re stalking online for the perfect watch band for your holiday present of a smart watch, you want the orange band you see in the online ad to be what you get when it arrives a few days later. Even if your wife thinks orange doesn’t suit you, and makes you look like “you’re trying too hard.”

Especially as a content developer, you want to know that what you're looking at is an accurate representation of the image. Ever walk into a Best Buy and see multiple screens showing the same content, but with wildly different color? You can't have that discrepancy working as a pro, especially in collaboration; you need color accuracy. In my own experience, that position has been filled by HP's 10-bit DreamColor displays for many years now, but not everyone is awash in bitcoins, and a price tag of over $1,200 is sometimes hard to justify, even for a studio professional.

Enter HP’s DreamColor Z24x display at half the price, coming in around $550 online. Yes, DreamColor for half the cost. That’s pretty significant. For the record, I haven’t used a 24-inch monitor since the dark ages; when Lost was the hot TV show. I’ve been fortunate enough to be running at 27-inch and higher, so there was a little shock when I started using the Z24x HP sent me for review. But this is something I quickly got used to.

With my regular 32-inch 4K display still my primary — so I can fit loads of windows all over the place — I used this DreamColor screen as my secondary display, primarily to check output for my Adobe After Effects comps, Adobe Premiere Pro edits and to hold my render view window as I develop shaders and lighting in Autodesk Maya. I felt comfortable knowing the images I shared with my colleagues across town would be seen as I intended them, leveling the playing field when working collaboratively (as long as everyone is on the same LUT and color space). Speaking of color spaces, the Z24x hits 100% of sRGB, 99% of AdobeRGB and 96% of DCI P3, which is just slightly under HP’s Z27x DreamColor. It is, however, slightly faster, with a 6ms response rate.

The Z24x has a 24-inch IPS panel from LG that exhibits color in 10-bit, like its bigger 27-inch Z27x sibling. This gives you over a billion colors, which I have personally verified by counting them all — that was one long weekend, I can tell you. Unlike the highest-end DreamColor screens, though, the Z24x dithers up from 8-bit to 10-bit (called 8-bit+FRC). This means it’s better than an 8-bit color display, for sure, but not quite up to real 10-bit, making it color accurate but not color critical. HP’s implementation of dithering is quite good when subjectively compared to my full 10-bit main display. Frankly, a lot of screens that claim 10-bit may actually be 8-bit+FRC anyway!
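For the curious, FRC (frame rate control) works roughly like this: the panel flickers a pixel between two adjacent 8-bit levels across successive frames, so your eye averages them into an in-between shade. Here’s a minimal Python sketch of the idea — my own illustration of the general technique, not HP’s actual algorithm:

```python
def frc_frame_sequence(level_10bit, frames=4):
    """Approximate one 10-bit level (0-1023) with a sequence of 8-bit frames.

    Each 10-bit level maps to base*4 + remainder; the remainder (0-3)
    decides on how many of the 4 frames the next-higher 8-bit level is shown.
    """
    base, remainder = divmod(level_10bit, 4)
    # Show the higher 8-bit level on `remainder` of the frames, clamped to 255.
    return [min(base + (1 if i < remainder else 0), 255) for i in range(frames)]

# 10-bit level 513 falls between 8-bit levels 128 and 129:
seq = frc_frame_sequence(513)          # [129, 128, 128, 128]
avg = sum(seq) / len(seq)              # 128.25 — exactly 513/4
```

Averaged over time, the flicker lands on the in-between value, which is why FRC looks convincing on a good implementation but can shimmer on a bad one.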

While the Z27x gives you 2560×1440, as you’d expect of most 27-inch displays if not full-on 4K, the Z24x sits at a comfortable 1920×1200, just enough for a full 1080p image and a little room for a slider or info bar. Being the res snob that I am, I had wondered if that was just too low, but at 24 inches I don’t think you would want a higher resolution, even if you’re sitting only 14 inches away from it. And this is a sentiment echoed by the folks at HP, who consulted with many of their professional clients to build this display. That gives a pixel density of about 94 PPI, a bit lower than the 109 PPI of the Z27x. This density is about the same as a 1080p HD display at 27 inches, so it’s still crisp and clean.
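Those PPI figures are just the diagonal pixel count divided by the diagonal size in inches. A quick Python check (the function name is mine) reproduces both numbers:

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_in

ppi_z24x = pixel_density(1920, 1200, 24)   # ~94 PPI
ppi_z27x = pixel_density(2560, 1440, 27)   # ~109 PPI
```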

Viewing angles are good at about 178 degrees, and the screen is matte, with an anti-glare coating, making it easier to stare at without blinking for 10 hours at a clip, as digital artists usually do. Compared to my primary display, this HP’s coating was more matte and still gave me a richer black in comparison, which I liked to see.

Connection options are fairly standard with two DisplayPorts, one HDMI, and one DVI dual link for anyone still living in the past. You also get four USB ports and an analog 3.5mm audio jack if you want to drive some speakers, since you can’t from your phone anymore (Apple, I’m looking at you).

Summing Up
So while 24 inches is a bit small for my tastes for a display, I am seriously impressed at the street price of the Z24x, which lets a lot more pros and semi-pros get the DreamColor accuracy HP offers at half the price. While I wouldn’t recommend color grading a show on the Z24x, this DreamColor does a nice job of bringing a higher level of color confidence at an attractive price. As a secondary display, the Z24x is a nice addition to the workflow of a budget-minded artist — or one with a mean, orange-watch-band-hating spouse.


Dariush Derakhshani is a VFX supervisor and educator in Southern California. You can follow his random tweets at @koosh3d.

Kathrin Lausch joins Uppercut as EP

New York post shop Uppercut has added Kathrin Lausch as executive producer. Lausch has over two decades of experience as an executive producer for top production and post production companies such as MPC, Ntropic, B-Reel, Nice Shoes, Partizan and Compass Films, among others. She has led shops on the front lines at the outset of digital, branded content, reality television and brand-direct production.

“I joined Uppercut after being very impressed with Micah Scarpelli’s clear understanding of the advertising market, its ongoing changes and his proactive approach to offer his services accordingly,” explains Lausch. “The new advertising landscape is offering up opportunities for boutique shops like Uppercut, and interesting conversations and relationships can come out of having a clear and focused offering. It was important to me to be part of a team that embraces change and thrives on being a part of it.”

Half French and half German by birth, Lausch followed dual pursuits in law and art in NYC before finding her way to the world of production. She launched Passport Films, which later became Compass Films. After selling the company, she followed the rise of the digital advertising marketplace, landing with B-Reel. She made the shift to post production, further embracing the new digital landscape as executive producer at Nice Shoes and Ntropic before landing as head of new business at MPC.

Oscar-winner Jeff White is now CD at ILM Vancouver

Oscar-winning visual effects supervisor Jeff White has been named creative director of Industrial Light & Magic’s Vancouver studio. A 16-year ILM veteran, White will work directly with ILM Vancouver executive in charge Randal Shore.

Recently, the Academy of Motion Picture Arts and Sciences honored White and three colleagues (Jason Smith, Rachel Rose and Mike Jutan) with a Technical Achievement Award for the design of ILM’s procedural rigging system, Block Party. He is also nominated for an Academy Award for Visual Effects for his contribution to Kong: Skull Island.

White joined Industrial Light & Magic in 2002 as a creature technical director, working on a variety of films, including the Academy Award-winning Pirates of the Caribbean: Dead Man’s Chest, as well as War of the Worlds and Star Wars: Episode III: Revenge of the Sith.

In 2012, White served as the ILM VFX supervisor on Marvel’s The Avengers, directed by Joss Whedon, and earned both Oscar and BAFTA nominations for his visual effects work. He also received the Hollywood Film Award for visual effects for the work. White was also a VFX supervisor on Duncan Jones’ 2016 sci-fi offering, Warcraft, based on the well-known video game World of Warcraft by Blizzard Entertainment.

Says White, “Having worked with many of the artists here in Vancouver on a number of films, including Kong: Skull Island, I know firsthand the amazing artistic and technical talent we have to offer and I couldn’t be more excited to share what I know and collaborate with them on all manner of projects.”

Initially conceived as a satellite office when it opened in 2012, ILM’s Vancouver studio became a permanent fixture in the company’s operation in 2014. In 2017, the studio nearly doubled in size, adding a second building adjacent to its original location in the Gastown district. The studio has spearheaded ILM’s work on such films as Valerian and the City of a Thousand Planets, Only the Brave and most recently, Ryan Coogler’s Black Panther and Ava DuVernay’s A Wrinkle in Time.

Oscar-winner Jordan Peele on directing Get Out

By Iain Blair

Get Out, the feature film debut of comedian-turned-director Jordan Peele, is chock full of shocks and surprises. This multi-layered horror film also shocked a lot of people in the industry when it went on to gross over a quarter of a billion dollars — on a $4.5 million budget — making it one of the most profitable films in Hollywood history. But those shocks are nothing compared to the ones Peele and his movie generated when it scooped up four major Oscar nominations, including Best Picture and Best Director (in a very strong Best Director year, Peele beat out the likes of Steven Spielberg, Ridley Scott and Martin McDonagh). He won for Best Original Screenplay.

The writer/director honed his cinematic skills on the Comedy Central sketch show Key and Peele, which quickly became a television and Internet sensation, earning 12 Primetime Emmy Award nominations and over 900 million online hits. For his first film, which stars Daniel Kaluuya, Allison Williams, Catherine Keener and Bradley Whitford, he assembled a stellar group of collaborators, including director of photography Toby Oliver (Insidious: Chapter 4), production designer Rusty Smith (Meet the Fockers), editor Gregory Plotkin (the Paranormal Activity series), costume designer Nadine Haders (Into the Badlands) and composer Michael Abels.

With the huge critical and commercial success of Get Out, Peele has now joined the big leagues. I recently caught up with Peele who talked about the Oscars, making the film, and his love of post.

This is your directorial movie debut, and it’s not only Oscar-nominated for Best Picture but also for Best Director. Are you still pinching yourself?
Oh yeah, 100 percent! It’s not something I feel I’ll ever get used to. It’s way beyond any expectations I had.

You were also Oscar-nominated for Best Original Screenplay, making you only the third person ever — after Warren Beatty and James L. Brooks — to score that along with Best Director and Best Picture nods for a debut film. You realize it’s all downhill from here?
(Laughs) Yeah, I might as well quit making movies now while I’m still ahead, because I’m in big trouble. And that’s pretty ironic as the best award and reward for making my first movie is the fact that I get to make another.

You’re only the fifth African-American filmmaker to earn a Best Director nom and none have won. Is change coming fast enough?
I think change should have come a long time ago, but at least now we see some real progress, with such directors as Ryan Coogler, Ava DuVernay, Gary Gray, Barry Jenkins and Dee Rees. It’s this new class of amazing black directors, and people have worked very hard to get to this point, and it’s thanks to all the work of previous filmmakers. What’s blossoming in the industry now is very beautiful, so I’m very hopeful for the future.

When it comes to the Oscars, horror and comedy are two genres that don’t seem to get much respect. Why do you think that is?
I think it’s because they’re genres that are typically focused on getting a monetary return, so they get put in that box and are seen as lightweight and movies that are not art — even though there are many examples of elevated horror and elevated comedy that are extremely artistic films. So there’s that stigma. And if people don’t like horror, they just don’t like it, so it’s not a genre that you can expect everyone to want to see, unlike comedy. Everyone pretty much loves comedy, but when people tell me they don’t like horror, I tell them to seek it out, that it won’t scare them that much, and that it might surprise them.

Did you write this thinking, “I want to direct it too?”
No, I never planned to direct it, but then about half-way through writing it I realized I was the only person who could actually direct it. I feel that being both the writer and director is easier than not doing both, because they’re done at separate times, so you don’t have to overlap, and then later if you want to change something on set, you know that you’re not missing or mistaking what the writer intended.

What sort of film did you set out to make, because it’s not just a straightforward horror film, is it?
No. I wanted to make a film I’d never seen before. It’s been called many things, and I myself have called it both a horror film and a social thriller. I was aiming at the genre somewhere between Rosemary’s Baby and Scream, so it’s about a lot of things — the way America deals with race and the idea that racism itself is a monster, and that we can’t neglect abuses and just stand by while atrocities happen. So I tried to incorporate a lot of layers and make something people would want to see more than once.

How did you prepare for directing your first film? It’s got to be pretty daunting.
It’s actually terrifying since you don’t know what you don’t know. I talked to everyone I could — Edgar Wright, Ben Affleck, Leigh Whannell, Peter Atencio who did our show and Keanu, and any other director I could — to try and prepare as much as possible.

How was the shoot?
We shot in Mobile, Alabama, and it was probably the most fun I’ve ever had working on anything. It was so hard and so intense. I was very prepared, but then you also have to be open to adapting and making changes, and too much preparation can work against you if you’re not careful.

Do you like the post process?
I absolutely loved it, and one big reason is because after so long just imagining what the film might look like, all of a sudden you have all the pieces of the puzzle in front of you, and you’re finally making the film. Post teaches you so much about what the film is meant to be and what it wants to be.

Where did you edit and post this?
At Blumhouse in LA.

Tell us about working with editor Gregory Plotkin, who cut most of the Paranormal Activity franchise for Paramount and Blumhouse Productions.
He’s a very accomplished editor and a real horror fan like me, so we bonded immediately over that. He could break down the script and all my influences from Hitchcock and Kubrick to Spielberg and Jonathan Demme. He did his pass and then I came in and did my director’s pass, and then we went over it all with a fine-tooth comb, tightening scenes up and so on and focusing on pace and timing, which are crucial in horror and comedy.

Is it true you shot multiple endings for the film? How did you decide on the right one?
We actually shot two, and the first one was not a happy one. When we edited it all together we realized it wasn’t working for an audience. They thought it was a downer, and then I realized it needed a hero and a happy ending instead, so that after going through all the stress, the audience could come out happy. So we asked for more money and went off and did a reshoot of the ending, which added another layer and worked far better.

Sound and music are so important in horror. Can you talk about that?
I look at it as at least half the movie since you can scare audiences so much with just clever sound design. I paid a lot of attention to it during the writing process, and then once we got into post it all became a very meticulous process. We were careful not to overdo all the sound design. We did it all at Wildfire, and they are such pros and were up for trying anything. They really understood my vision.

Can you talk about the VFX?
Ingenuity Studios did them and the big one was creating “The Sunken Place,” and it was tricky to do it as we didn’t have a bearing on this world apart from what I’d originally imagined. There was no up or down. Should the camera be fixed or floating? In the end, we shot Daniel Kaluuya against a black background on cables, and then Gregory played around in the Avid a lot, resizing the image. Then we added some CG stuff to give it that sort of underwater feel. We had a bunch of other shots, like the car hitting the deer and the father being impaled on the deer horns, which was all CGI.

Who was the colorist, and where did you do the DI?
It was all at Blumhouse, with Aidan Stanford, and I was pretty involved. It was tricky, and you can quickly go overboard with color, but the DP, Toby, did such a great job on the shoot that we mainly just tried to match his original color and not push it too far.

I assume you can’t wait to direct again?
Oh yeah! There’s nothing more fun. It’s the biggest artistic collaboration I can imagine, with all these moving parts, and I loved every minute of it.

What’s next?
I’m working on another screenplay, which I’ll direct for Universal. I just love Hitchcockian thrillers, so I’m staying in the same genre and zone.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Trio from Reel FX and Shilo team on VFX/live-action studio

The commercial division of digital studio Reel FX has teamed up with Shilo founder/executive creative director Jose Sebastian Gomez to launch strategic creative group ATK PLN. Emmy Award-winner Gomez will lead the creative vision for the studio, joined by former Digital Domain HOP Jim Riche as executive producer, with overall strategy led by Reel FX’s David Bates as managing director. The trio will draw from their combined expertise across VFX, design, production, interactive media, branding and marketing to offer in-house services from concept to final delivery.

ATK PLN will work across design, animation and live action. The team has already created work for AT&T, Fox Racing and MADD — all a fusion of live action and VFX. ATK PLN creatives will work between its new Hollywood studio, Montreal and Dallas locations. ATK PLN will also partner with sister companies Flight School and Reel FX Animation.

AT&T

In terms of tools, the company uses a lot of the traditional apps like Flame, Maya, Nuke and Houdini. “Our biggest push at the moment is into GPU rendering,” reports Riche. “We have had great success with Octane from Otoy, and it is a second pipeline in our systems working alongside Arnold. Octane is a faster render system and is fantastic on hard surface models.”

Riche continues, “When I joined David Bates at Reel FX almost two years ago we created a vision to elevate the company to the next level, challenging the status quo of the advertising community to offer a new, unique approach to creative problem solving. Bringing on Jose Gomez and his creative vision to our team at ATK PLN is allowing us to turn our ideas into reality. I am excited about how this forward-thinking team will continue to evolve with the changing market.”

The 16th annual VES Award winners

The Visual Effects Society (VES) celebrated artists and their work at the 16th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Seven-time host comedian Patton Oswalt presided over more than 1,000 guests at the Beverly Hilton. War for the Planet of the Apes was named photoreal feature film winner, earning four awards. Coco was named top animated film, also earning four awards. Game of Thrones was named best photoreal episode and garnered five awards — the most wins of the night. Samsung Do What You Can’t: Ostrich won top honors in the commercial field, scoring three awards. These top four contenders collectively garnered 16 of the 24 awards for outstanding visual effects.

President of Marvel Studios Kevin Feige presented the VES Lifetime Achievement Award to producer/writer/director Jon Favreau. Academy Award-winning producer Jon Landau presented the Georges Méliès Award to Academy Award-winning visual effects master Joe Letteri, VES. Awards presenters included fan-favorite Mark Hamill, Coco director Lee Unkrich, War for the Planet of the Apes director Matt Reeves, Academy Award-nominee Diane Warren, Jaime Camil, Dan Stevens, Elizabeth Henstridge, Sydelle Noel, Katy Mixon and Gabriel “Fluffy” Iglesias.

Here is a list of the winners:

Outstanding Visual Effects in a Photoreal Feature

War for the Planet of the Apes

Joe Letteri

Ryan Stafford

Daniel Barrett

Dan Lemmon

Joel Whist

 

Outstanding Supporting Visual Effects in a Photoreal Feature

Dunkirk

Andrew Jackson

Mike Chambers

Andrew Lockley

Alison Wortman

Scott Fisher

 

Outstanding Visual Effects in an Animated Feature

Coco

Lee Unkrich

Darla K. Anderson

David Ryu

Michael K. O’Brien

 

Outstanding Visual Effects in a Photoreal Episode

Game of Thrones: Beyond the Wall

Joe Bauer

Steve Kullback

Chris Baird

David Ramos

Sam Conway

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Black Sails: XXIX

Erik Henry

Terron Pratt

Yafei Wu

David Wahlberg

Paul Dimmer

 

Outstanding Visual Effects in a Real-Time Project

Assassin’s Creed Origins

Raphael Lacoste

Patrick Limoges

Jean-Sebastien Guay

Ulrich Haar

 

Outstanding Visual Effects in a Commercial

Samsung Do What You Can’t: Ostrich

Diarmid Harrison-Murray

Tomek Zietkiewicz

Amir Bazazi

Martino Madeddu

 

Outstanding Visual Effects in a Special Venue Project

Avatar: Flight of Passage

Richard Baneham

Amy Jupiter

David Lester

Thrain Shadbolt

 

Outstanding Animated Character in a Photoreal Feature

War for the Planet of the Apes: Caesar

Dennis Yoo

Ludovic Chailloleau

Douglas McHale

Tim Forbes

 

Outstanding Animated Character in an Animated Feature

Coco: Héctor

Emron Grover

Jonathan Hoffman

Michael Honsel

Guilherme Sauerbronn Jacinto

 

Outstanding Animated Character in an Episode or Real-Time Project

Game of Thrones The Spoils of War: Drogon Loot Train Attack

Murray Stevenson

Jason Snyman

Jenn Taylor

Florian Friedmann

 

Outstanding Animated Character in a Commercial

Samsung Do What You Can’t: Ostrich

David Bryan

Maximilian Mallmann

Tim Van Hussen

Brendan Fagan

 

Outstanding Created Environment in a Photoreal Feature

Blade Runner 2049: Los Angeles

Chris McLaughlin

Rhys Salcombe

Seungjin Woo

Francesco Dell’Anna

 

Outstanding Created Environment in an Animated Feature

Coco: City of the Dead

Michael Frederickson

Jamie Hecker

Jonathan Pytko

Dave Strick

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

Game of Thrones: Beyond the Wall; Frozen Lake

Daniel Villalba

Antonio Lado

José Luis Barreiro

Isaac de la Pompa

 

Outstanding Virtual Cinematography in a Photoreal Project

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight

James Baker

Steven Lo

Alvise Avati

Robert Stipp

 

Outstanding Model in a Photoreal or Animated Project

Blade Runner 2049: LAPD Headquarters

Alex Funke

Steven Saunders

Joaquin Loyzaga

Chris Menges

 

Outstanding Effects Simulations in a Photoreal Feature

War for the Planet of the Apes

David Caeiro Cebrián

Johnathan Nixon

Chet Leavai

Gary Boyle

 

Outstanding Effects Simulations in an Animated Feature

Coco

Kristopher Campbell

Stephen Gustafson

Dave Hale

Keith Klohn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project 

Game of Thrones: The Dragon and the Wolf; Wall Destruction

Thomas Hullin

Dominik Kirouac

Sylvain Nouveau

Nathan Arbuckle

  

Outstanding Compositing in a Photoreal Feature

War for the Planet of the Apes

Christoph Salzmann

Robin Hollander

Ben Warner

Beck Veitch

 

Outstanding Compositing in a Photoreal Episode

Game of Thrones The Spoils of War: Loot Train Attack

Dom Hellier

Thijs Noij

Edwin Holdsworth

Giacomo Matteucci

 

Outstanding Compositing in a Photoreal Commercial

Samsung Do What You Can’t: Ostrich

Michael Gregory

Andrew Roberts

Gustavo Bellon

Rashabh Ramesh Butani

 

Outstanding Visual Effects in a Student Project

Hybrids

Florian Brauch

Romain Thirion

Matthieu Pujol

Kim Tailhades 

 

Cinesite VFX supervisor Stephane Paris: 860 shots for The Commuter

By Randi Altman

The Commuter once again shows how badass Liam Neeson can be under very stressful circumstances. This time, Neeson plays a mild-mannered commuter named Michael who gets pushed too far by a seemingly benign but not-very-nice Vera Farmiga.

For this Jaume Collet-Serra-directed Lionsgate film, Cinesite’s London and Montreal locations combined to provide over 800 visual effects shots. The studio’s VFX supervisor, Stephane Paris, worked hand in hand with The Commuter’s overall VFX supervisor Steve Begg.

Stephane Paris

The visual effects shots vary, from CG commuters to Neeson’s outfits changing during his daily commute to fog and smog to the climactic huge train crash sequence. Cinesite’s work on the film included a little bit of everything. For more, we reached out to Paris…

How early did Cinesite get involved in The Commuter?
We were involved before principal photography began. I was then on set at Pinewood Studios, just outside London, for about six weeks alongside Steve. They had set up two stages. The first was a single train carriage adapted and dressed to look like multiple carriages — this was used to film all the main action onboard the train. The carriage was surrounded by bluescreen and shot on a hydraulic system to give realistic shake and movement. In one notable shot, the camera pulls back through the entire length of the train, through the carriage walls. A camera rig was set up on the roof and programmed to repeat the same pullback move through each iteration of the carriage — this was subsequently stitched together by the VFX team.

How did you work with the film’s VFX supervisor, Steve Begg?
Cinesite had worked with Steve previously on productions such as Spectre, Skyfall and Inkheart. Having created effects with him for the Bond films, he was confident that Cinesite could create the required high-quality invisible effects for the action-heavy sequences. We interacted with Steve mainly. The client’s approach was to concentrate on the action, performances and story during production, so we lit and filmed the bluescreens carefully, ensuring reflections were minimized and the bluescreens were secure in order to allow creative freedom to Jaume during filming. We were confident that by using this approach we would have what we needed for the visual effects at a later stage.

You guys were the main house on the film, providing a whopping 860 visual effects shots. What was your turnaround like? How did you work for the review and approval process?
Yes, Cinesite was the lead vendor, and in total we worked on The Commuter for about a year, beginning with principal photography in August 2016 and delivering in August 2017. Both our London and Montreal studios worked together on the film. We have worked together previously, notably on San Andreas and more recently on Independence Day: Resurgence, so I had experience of working across both locations. Most of the full CG heavy shots were completed in London, while the environments, some of the full CG shots and 2D backgrounds were completed in Montreal, which also completed the train station sequence that appears early in the film.

My time was split fairly evenly between both locations, so I would spend two to three weeks in London followed by the same amount of time in Montreal. Steve never needed to visit the Montreal studio, but he was very hands-on and involved throughout. He visited our London studio at least twice a week, where we used the RV system to review both the London and Montreal work.

Can you describe the types of shots you guys provided?
We delivered over 860, from train carriage composites right through to entirely CG shots for the spectacular climactic train crash sequence. The crash required the construction of a two-kilometer-long environment asset complete with station, forest, tracks and industrial detritus. Effects were key, with flying gravel, breaking and deforming tracks, exploding sleepers, fog, dust, smoke and fire, in addition to the damaged train carriages. Other shots required a realistic Neeson digi-double to perform stunts.

The teams also created shots near the film’s opening that demonstrate the repetition of Michael’s daily commute. In a poignant shot at Grand Central Station multiple iterations of Michael’s journey are shown simultaneously, with the crowds gradually accelerating around him while his pace remains measured. His outfit changes, and the mood lighting changes to show the passing of the seasons around him.

The shot was achieved with a combination of multiple motion control passes, creation of the iconic station environment using photogrammetry and, ultimately, by creating the crowd of fellow commuters in CG for the latter part of the shot (a seamless transition was required between the live-action passes and the CG people).

Did you do previs? If so, what tools did you use?
No. London’s Nvizible handled all the initial previs for the train crash. Steve Begg blocked everything out and then sent it to Jaume for feedback initially, but the final train crash layout was done by our team with Jaume at Cinesite.

What did you use tool-wise for the VFX?
Houdini’s RBD particle and fluid simulation processes were mainly used, with some Autodesk Maya for falling pieces of train. Simulated destruction of the train was also created using Houdini, with some internal set-up.

What was the most challenging scene or scenes you worked on? 
The challenge was, strangely enough, more about finding proper references that would fit our action movie requirements. Footage of derailing trains is difficult to find, and when you do find it, you quickly notice that train carriages are not designed to tear and break the way you would like them to in an action movie. Naturally, they are constructed to be safe, with lots of energy-absorption compartments, and equipped with auto-triggering safety mechanisms.

Putting reality aside, we devised a visually exciting and dangerous movie train crash for Jaume, complete with lots of metal crumbling, shattering windows and multiple large-scale impact explosions.

As a result, the crew had to ensure they were maintaining the destruction continuity across the sequence of shots as the train progressively derails and crashes. A high number of re-simulations were applied to the train and environment destruction whenever there was a change to one of these in a shot earlier in the sequence. Devising efficient workflows using in-house tools to streamline this where possible was key in order to deliver a large number of effects-heavy destruction shots whilst maintaining accurate continuity and remaining responsive to the clients’ notes during the show.

Quick Chat: ArsenalCreative’s new VFX supervisor Mike Wynd

VFX supervisor Mike Wynd has joined ArsenalCreative from MPC, where he spent eight years in a similar role. Over the years, Wynd has worked on many high-profile projects for directors such as Rupert Sanders, Noam Murro and Adam Berg. He has also won a number of industry awards, including a Silver Clio and a Gold British Arrow, as well as a VES Award nomination.

Wynd started his career in Melbourne, Australia, working for Computer Pictures before landing at Images Post in Auckland, New Zealand. Eight years later, he headed back to Australia to serve as head of 3D at Garner MacLennan Design, where he worked on many high-end animations and effects, including the first Lord of the Rings movie. After that studio was bought out, Wynd joined Digital Pictures. Next, he assisted in establishing a new 3D/design team at FSM. After that he relocated to Los Angeles, where he worked for Moving Pixels. Later, he took on the role of VFX supervisor for MPC.

We reached out to Wynd to ask him a few questions about being a VFX supervisor:

What drew you to VFX supervision?
The thing I enjoy most about VFX supervision is the problem solving. From how best to shoot what we require to seamlessly integrating our effects, through to the actual approach and tools that we’ll employ in post production. We’ve always got a finite amount of time and money with which to produce our work and a little bit of alternative thinking can go a long way to achieve higher quality and more efficient results.

How early do you like being brought onto a project?
I’d prefer to be brought in on a project from day one. Especially on a complex VFX project, being involved alongside production means that we, as a team, can troubleshoot many aspects of the job that, in the long run, will mean savings in cost and time as well as higher-quality results. It also gives time for relationships to form between VFX and production, so that on the shoot the VFX team is seen as an asset rather than a hindrance.

Do you go on set? Why is that so important?
I do go on set… a lot! I have been very lucky over the years to travel to some incredible locations all over the world. It’s so important because this is where the foundations are laid for a successful job. Being able to see how and why footage is shot the way it is goes a long way toward finding solutions to post issues.

Actually seeing the environment of a scene can offer clues that may help in significantly reducing any issues that may arise once the footage is back in the studio. And, of course, there’s the nuts and bolts of capturing set information, along with color and lighting references critical to the project. And probably the most important reason to be on-set is to act as the conduit connecting production and post. The two parties often act so separately from one another, yet each is only doing half the job.

Have you worked on anything at ArsenalCreative yet?
It’s early days for me at ArsenalCreative, but thus far I’ve worked on a Chevy presentation for the motor shows and a series of pod shots for Lexus.

If you had one piece of advice for someone about to embark on a project that involves VFX, what would it be?
Ha! Get VFX involved from day one!

VFX supervisor Lesley Robson-Foster on Amazon’s Mrs. Maisel

By Randi Altman

If you are one of the many who tend to binge-watch streaming shows, you’ve likely already enjoyed Amazon’s The Marvelous Mrs. Maisel. This new comedy focuses on a young wife and mother living in New York City in 1958, when men worked and women tended to, well, not work.

After her husband leaves her, Mrs. Maisel chooses stand-up comedy over therapy — or you could say stand-up comedy chooses her. The show takes place in a few New York neighborhoods, including the tony Upper West Side, the Garment District and the Village. The storyline brings real-life characters into this fictional world — Midge Maisel studies by listening to Redd Foxx comedy albums, and she also befriends comic Lenny Bruce, who appears in a number of episodes.

Lesley Robson-Foster on set.

The show, created by Amy Sherman-Palladino and Dan Palladino, is colorful and bright and features a significant amount of visual effects — approximately 80 per episode.

We reached out to the show’s VFX supervisor, Lesley Robson-Foster, to find out more.

How early did you get involved in Mrs. Maisel?
The producer Dhana Gilbert brought my producer Parker Chehak and me in early to discuss feasibility issues, as this is a period piece, and to see if Amy and Dan liked us! We’ve been on since the pilot.

What did the creators/showrunners say they needed?
They needed 1958 New York City, weather changes and some very fancy single-shot blending. Also, some fantasy and magic realism.

As you mentioned, this is a period piece, so I’m assuming a lot of your work is based on that.
The big period shots in Season 1 are the Garment District reconstruction. We shot on 19th Street between 5th and 6th — the brilliant production designer Bill Groom did 1/3 of the street practically and VFX took care of the rest, such as crowd duplication and CG cars and crowds. Then we shot on Park Avenue and had to remove the Met Life building down near Grand Central, and knock out anything post-1958.

We also did a major gag with the driving footage. We shot driving plates around the Upper West Side and had a flotilla of period-correct cars with us, but could not get rid of all the parked cars. My genius design partner on the show, Douglas Purver, created a wall of parked period CG cars and put them over the modern ones. Phosphene then did the compositing.

What other types of effects did you provide?
Amy and Dan — the creators and showrunners — haven’t done many VFX shows, but they are very, very experienced. They write and ask for amazing things that allow me to have great fun. For example, I was asked to make a shot where our heroine is standing inside a subway car, and then the camera comes hurtling backwards through the end of the carriage and then sees the train going away down the tunnel. All we had was a third of a carriage with two and a half walls on set. Douglas Purver made a matte painting of the tunnel, created a CG train and put it all together.

Can you talk about the importance of being on set?
For me being on set is everything. I talk directors out of VFX shots and fixes all day long. If you can get it practically you should get it practically. It’s the best advice you’ll ever give as a VFX supervisor. A trust is built that you will give your best advice, and if you really need to shoot plates and interrupt the flow of the day, then they know it’s important for the finished shot.

Having a good relationship with every department is crucial.

Can you give an example of how being on set might have saved a shot or made a shot stronger?
This is a character-driven show. The directors really like Steadicam and long, long shots following the action. Even though a lot of the effects we want to do really demand motion control, I know I just can’t have it. It would kill the performances and take up too much time and room.

I run around with string and tennis balls to line things up. I watch the monitors carefully and use QTake to make sure things line up within acceptable parameters.

In my experience you have to have the production’s best interests at heart. Dhana Gilbert knows that a VFX supervisor on the crew and as part of the team smooths out the season. They really don’t want a supervisor who is intermittent and doesn’t have the whole picture. I’ve done several shows with Dhana; she knows my idea of how to service a show with an in-house team.

You shot b-roll for this? What camera did you use, and why?
We used a Blackmagic Ursa Mini Pro. We rented one on The OA for Netflix last year and found it to be really easy to use. We liked that it’s self-contained and we can use the Canon glass from our DSLR kits. It’s got a built-in monitor and it can shoot RAW 4.6K. It cut in just fine with the Alexa Mini for establishing shots and plates. It fits into a single backpack so we could get a shot at a moment’s notice. The user interface on the camera is so intuitive that anyone on the VFX team could pick it up and learn how to get the shot in 30 minutes.

What VFX houses did you employ, and how do you like to work with them?
We keep as much as we can in New York City, of course. Phosphene is our main vendor, and we like Shade and Alkemy X. I like RVX in Iceland, El Ranchito in Spain and Rodeo in Montreal. I also have a host of secret weapon individuals dotted around the world. For Parker and me, it’s always horses for courses. Whom we send the work to depends on the shot.

For each show we build a small in-house team — we do the temps and figure out the design, and shoot plates and elements before shots leave us to go to the vendor.

You’ve worked on many critically acclaimed television series. Television is famous for quick turnarounds. How do you and your team prepare for those tight deadlines?
Television schedules can be relentless. Prep, shoot and post all at the same time. I like it very much as it keeps the wheels of the machine oiled. We work on features in between the series and enjoy that slower process too. It’s all the same skill set and workflow — just different paces.

If you have to offer a production a tip or two about how to make the process go more smoothly, what would it be?
I would say be involved with EVERYTHING. Keep your nose close to the ground. Really familiarize yourself with the scripts — head trouble off at the pass by discussing upcoming events with the relevant person. Be fluid and flexible and engaged!

Sci-Tech Award winners named

The 2018 Sci-Tech Awards (Academy of Motion Picture Arts and Sciences) have been bestowed upon 34 individuals and one company representing 10 scientific and technical achievements. Each recipient will be honored at the annual Scientific and Technical Awards Presentation on February 10 at the Beverly Wilshire in Beverly Hills.

“This year we are happy to honor a very international group of technologists for their innovative and outstanding accomplishments,” says Ray Feeney, Academy Award recipient and chair of the Scientific and Technical Awards Committee. “These individuals have significantly contributed to the ongoing evolution of motion pictures and their efforts continue to empower the creativity of our industry.”

Technical Achievement Award Winners (Academy Certificates)

Honorees: Jason Smith and Jeff White for the original design, and to Rachel Rose and Mike Jutan for the architecture and engineering of the BlockParty procedural rigging system at Industrial Light & Magic.

BlockParty streamlines the rigging process through a comprehensive connection framework, a unique graphical user interface and volumetric rig transfer. This has enabled ILM to build richly detailed and unique creatures while greatly improving artist productivity.

Honorees: Joe Mancewicz, Matt Derksen and Hans Rijpkema for the design, architecture and implementation of the Rhythm & Hues Construction Kit rigging system.

This toolset provides a new approach to character rigging that features topological independence, continuously editable rigs and deformation workflows with shape-preserving surface relaxation, enabling 15 years of improvements to production efficiency and animation quality.

Honorees: Alex Powell for the design and engineering and to Jason Reisig for the interaction design, and to Martin Watt and Alex Wells for the high-performance execution engine of the Premo character animation system at DreamWorks Animation.

Premo enables animators to pose full-resolution characters in representative shot context, significantly increasing their productivity.

Honorees: Rob Jensen for the foundational design and continued development and to Thomas Hahn for the animation toolset and to George ElKoura, Adam Woodbury and Dirk Van Gelder for the high-performance execution engine of the Presto Animation System at Pixar Animation Studios.

Presto allows artists to work interactively in scene context with full-resolution geometric models and sophisticated rig controls, and has significantly increased the productivity of character animators at Pixar.

Scientific and Engineering Award Winners (Academy Plaques)

Honorees: John Coyle, Brad Hurndell, Vikas Sathaye and Shane Buckham for the concept, design, engineering and implementation of the Shotover K1 camera system.

This six-axis stabilized aerial camera mount, with its enhanced ability to frame shots while looking straight down, enables greater creativity while allowing pilots to fly more effectively and safely.

Honorees: Jeff Lait, Mark Tucker, Cristin Barghiel and John Lynch for their contributions to the design and architecture of Side Effects Software’s Houdini visual effects and animation system.

Houdini’s dynamics framework and workflow management tools have helped it become the industry standard for bringing natural phenomena, destruction and other digital effects to the screen.

Honorees: Bill Spitzak and Jonathan Egstad for the visionary design, development and stewardship of Foundry’s Nuke compositing system.

Built for production at Digital Domain, Nuke is used across the motion picture industry, enabling novel and sophisticated workflows at an unprecedented scale.

Honorees: Abigail Brady, Jon Wadelton and Jerry Huxtable for their significant contributions to the architecture and extensibility of Foundry’s Nuke compositing system.

Expanded as a commercial product at The Foundry, Nuke is a comprehensive, versatile and stable system that has established itself as the backbone of compositing and image processing pipelines across the motion picture industry.

Honorees: Leonard Chapman for the overall concept, design and development, to Stanislav Gorbatov for the electronic system design, and to David Gasparian and Souhail Issa for the mechanical design and integration of the Hydrascope telescoping camera crane systems.

With its fully waterproof construction, the Hydrascope has advanced crane technology and versatility by enabling precise long-travel multi-axis camera movement in, out of and through fresh or salt water.

Academy Award of Merit (Oscar statuette)

Honorees: Mark Elendt and Side Effects Software for the creation and development of the Houdini visual effects and animation system.

With more than twenty years of continual innovation, Houdini has delivered the power of procedural methods to visual effects artists, making it the industry standard for bringing natural phenomena, destruction and other digital effects to the screen.

Gordon E. Sawyer Award (Oscar statuette)

Honoree: Jonathan Erland, visual effects technologist

Presented to an individual in the motion picture industry whose technological contributions have brought credit to the industry.

All images courtesy of A.M.P.A.S.

Jogger moves CD Andy Brown from London to LA

Creative director Andy Brown has moved from Jogger’s London office to its Los Angeles studio. Brown led the development of boutique VFX house Jogger London, including credits for the ADOT PSA Homeless Lights via Ogilvy & Mather, as well as projects for Adidas, Cadbury, Valentino, Glenmorangie, Northwestern Mutual, La-Z-Boy and more. He’s also been involved in post and VFX for short films such as Foot in Mouth, Containment and Daisy as well as movie title sequences (via The Morrison Studio), including Jupiter Ascending, Collide, The Ones Below and Ronaldo.

Brown got his start in the industry at MPC, where he worked for six years, eventually assuming the role of digital online editor. He then went on to work in senior VFX roles at some of London’s post houses, before becoming head of VFX at One Post. Following One Post’s merger with Rushes, Brown founded his own company, Four Walls, establishing the company’s reputation for creative visual effects and finishing.

Brown oversaw Four Walls’ merger with LA’s Jogger Studios in 2016. He has since helped form interconnections with Jogger’s teams in London, New York, Los Angeles, San Francisco and Austin, with high-end VFX, motion graphics and color grading carried out on projects globally.

VFX house Jogger is a sister company of editing house Cut + Run.