Tag Archives: VFX

Behind the Title: We Are Royale CD Chad Howitt

NAME: Chad Howitt

COMPANY: We Are Royale in Los Angeles

CAN YOU DESCRIBE YOUR COMPANY?
We Are Royale is a creative and digital agency looking to partner with brands to create unique experiences across platforms. In the end, we make pretty things for lots of different applications, depending on the creative problem our clients are looking to solve. We’re a full-service production studio that directs live-action commercials, creates full 3D worlds, designs 2D character spots, and develops immersive AR and VR experiences.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
Anything and everything needed to get the job done. On the service side, I’ll work directly with clients and agencies to address their wide variety of needs. So whether that’s creating an idea from scratch or curating a look around an already developed script, I try to figure out how we can help.

Then in-house, I’ll work with our talented team of art directors, CG supervisors and producers to help execute those ideas.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
While the title stays the same, the responsibilities vary by location, person and company culture. So don’t think there’s a hard-and-fast rule about what a creative director is and does.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Seeing a finished project out in the world knowing the hard work the team put in to get it there. Whether it’s on TV, in a space as a part of an installation or online as a part of a pre-roll… it’s a proud moment whenever I see it in the wild.

WHAT’S YOUR LEAST FAVORITE?
Seeing the results of a job we lost or had to pass on knowing that the creative we were planning will never see the light of day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d probably be in the video game industry, but that wasn’t really a feasible career path back then.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
As a little kid, I was obsessed with drawing and computers. So merging those into a profession always seemed like the most natural course. That said, as an LA native, working on film sets just seemed like what out-of-towners wanted to do. So I never saw that coming.

Under Armour spot for its UA HOVR running line

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’ve wrapped a few projects with Under Armour, a trio of spots for NASDAQ, and a promo for Billy Bob Thornton’s series on Amazon called Goliath.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’d probably be the first project I worked on at We Are Royale, which was an Under Armour spot for its UA HOVR running shoe line. It allowed me to merge live-action, CG and beautiful type design.

NAME THREE THINGS YOU CAN’T LIVE WITHOUT.
Fire, indoor plumbing and animal husbandry

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
The last social media I had was MySpace, unless you count LinkedIn…which you really shouldn’t.

CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Some of my current go-to tracks are “We Were So Young” by Hammock, “Galang” by Vijay Iyer Trio, “Enormous” by Llgl Tndr, “Almost Had to Start a Fight” by Parquet Courts and “Pray” by Jungle.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I stress eat. Cake, cookies and pizza make most problems go away. Diabetes could become a new problem, but that’s tomorrow’s problem.

Artifex provides VFX for Jordan Peele’s Weird City

Vancouver-based VFX house Artifex Studios was the primary visual effects vendor for Weird City, Oscar-winner Jordan Peele’s first foray into scripted OTT content. The dystopian sci-fi/comedy Weird City — from Peele and Charlie Sanders — premieres on YouTube Premium on February 13. A first trailer has been released, featuring a variety of Artifex’s visual effects work.

Artifex’s CG team created the trailer’s opening aerial shots of the futuristic city. Artifex was also tasked with the video/holographic screens, user interfaces, graphics, icons and other surfaces that the characters interact with.

Artifex’s team, led by VFX supervisor Rob Geddes, provided 250 visual effects shots in all, including Awkwafina’s and Yvette Nicole Brown’s outfit swapping (our main image), LeVar Burton’s tube traveling and a number of additional environment shots.

Artifex called on Autodesk Maya, V-Ray, Foundry’s Nuke and Adobe Photoshop, along with a mix of Dell, HP and generic PC workstations and Dell and HP render nodes. They also used Side Effects Houdini for procedural generation of the “below the line” buildings in the opening city shot. Qumulo was called on for storage.

 

VFX editor Warren Mazutinec on life, work and Altered Carbon

By Jeremy Presner

Long-time assistant editor Warren Mazutinec’s love for film began when he saw Star Wars as an eight-year-old in Edmonton, Alberta. Unlike many other Lucas-heads, however, this one got to live out his dream grinding away in cutting rooms from Vancouver to LA, working with some of the biggest editors in the galaxy.

We met back in 1998 when he assisted me on the editing of the Martin Sheen “classic” Voyage of Terror. We remain friends to this day. One of Warren’s more recent projects was Netflix’s VFX-heavy Altered Carbon, which got a lot of love from critics and audiences alike.

My old friend, who is now based in Vancouver, has an interesting story to tell, moving from assistant editor to VFX editor working on films like Underworld 4, Tomorrowland, Elysium and Chappie, so I threw some questions at him. Enjoy!

Warren Mazutinec

How did you get into the business?
I always wanted to work in the entertainment industry, but that was hard to find in Alberta. No film school-type programs were even offered, so I took the closest thing at a local college: audiovisual communications. While there, I studied photography, audio and video, but nothing like actual filmmaking. After that I attended Vancouver Film School. After film school, and with the help of some good friends, I got an opportunity to be a trainee at Shavick Entertainment.

What was it like working at a “film factory” that cranked out five to six pictures a year?
It was fun, but the product ultimately became intolerable. Movies for nine-year-olds can only be so interesting… especially low-budget ones.

What do your parents think of your career option?
Being from Alberta, everyone thought it wasn’t a real job — just a Hollywood dream. It took some convincing; my dad still tells me to look for work between gigs.

How did you learn Avid? Were you self-taught?
I was handed the manual by a post supervisor on day one. I never read it. I just asked questions and played around on any machine available. So I did have a lot of help, but I also went into work during my free time and on weekends to sit and learn what I needed to do.

Over the years I’ve been lucky enough to have cool people to work with and to learn with and from. I did six movies before I had an email address, more before I even owned a computer.

As media strayed away from film into digital, how did your role change in the cutting room? How did you refine your techniques with a changing workflow?
My first non-film movie was Underworld 4. It was shot with a Red One camera. I pretty much lied and said I knew how to deal with it. There was no difference really; just had to say goodbye to lab rolls, Keykode, etc. It was also a 3D stereo project, so that was a pickle, but not too hard to figure out.

How did you figure out the 3D stereo post?
It was basically learning to do everything twice. During production we really only played back in 3D for the novelty. I think most shows are 3D-ified in post. I’m not sure though, I’ve only done the one.

Do you think VR/AR will be something you work with in the future?
Yes, I want to be involved in VR at some point. It’s going to be big. Even just doing sound design would be cool. I think it’s the next step, and I want in.

Who are some of your favorite filmmakers?
David Lynch is my number one, by far. I love his work in all forms. A real treasure for sure. David Fincher is great too. Scorsese, Christopher Nolan. There are so many great filmmakers working right now.

Is post in your world constantly changing, or have things more or less leveled off?
Both. But usually someone has dailies figured out, so Avid is pretty much the same. We cut in DNxHD 115 or DNxHD 36, so nothing like 4K-type stuff. Conform at the end is always fun, but there are tests we do at the start to figure it all out. We are rarely treading in new water.

What was it like transitioning to VFX editor? What tools did you need to learn to do that role?
FileMaker. And Jesus, son, I didn’t learn it. It’s a tough beast but it can do a lot. I managed to wrangle it to do what I was asked for, but it’s a hugely powerful piece of software. I picked up a few things on Tomorrowland and went from there.

I like the pace of the VFX editor. It’s different than assisting and is a nice change. I’d like to do more of it. I’d like to learn and use After Effects more. On the film I was VFX editor for, I was able to just use the Avid, as it wasn’t that complex. Mostly set extensions, etc.

How many VFX shot revisions would a typical shot go through on Elysium?
On Elysium, the shot version numbers got quite high, but part of that would be internal versioning by the vendor. Director Neill Blomkamp is a VFX guy himself, so he was pretty involved and knew what he wanted. The robots kept looking cooler and cooler as the show went on. Same for Chappie. That robot was almost perfect, but it took a while to get there.

You’ve worked with a vast array of editors, including Walter Murch, Lee Smith, Julian Clarke, Nancy Richardson and Bill Steinkamp. Can you talk about that, and have any of them let you cut material?
I’ll assemble scenes if asked to, just to help the editor out so he isn’t starting from scratch. If I get bored, I start cutting scenes as well. On Altered Carbon, when Julian (Clarke) was busy with Episodes 2 and 3, I’d try to at least string together a scene or two for Episode 8. Not fine-cutting, mind you, just laying out the framework.

Walter asked a lot of us — the workload was massive. Lee Smith didn’t ask for much. Everyone asks for scene cards that they never use, ha!

Walter hadn’t worked on the Avid for five years or so prior to Tomorrowland, so there was a lot of him walking out of his room asking, “How do I?” It was funny because a lot of the time I knew what he was asking, but I had to actually do it on my machine since it’s so second nature.

What is Walter Murch like in the cutting room? Was learning his organizational process something you carried over into future cutting rooms?
I was a bit intimidated prior to meeting him. He’s awesome though. We got along great and worked well together. There was Walter, a VFX editor and four assistants. We all shared in the process. Of course, Walter’s workflow is unlike any other so it was a huge adjustment, but within a few weeks we were a well-oiled machine.

I’d come in at 6:30am to get dailies sorted and would usually finish around lunch. Then we’d screen in our theater and make notes, all of us. I really enjoyed screening the dailies that way. Then he would go into his room and do his thing. I really wish all films followed his workflow. As tough as it is, it all makes sense and nothing gets lost.

I have seen photos with the colored boxes and triangles on the wall. What does all that mean, and how often was that board updated?
Ha. That’s Walter’s own version of scene cards. It makes way better sense. The colors and shapes mean a particular thing — the longer the card the longer the scene. He did all that himself, said it helps him see the picture. I would peek into his room and watch him do this. He seemed so happy doing it, like a little kid.

Do you always add descriptions and metadata to your shots in Avid Media Composer?
We add everything possible. Usually there is a codebook the studios want, so we generate that with FileMaker on almost all the bigger shows. Walter’s is the same just way bigger and better. It made the VFX database look like a toy.

What is your workflow for managing/organizing footage?
A lot of times you have to follow someone else’s procedure, but if left to my own devices I try to make it the simplest it can be so anyone can figure out what was done.

How do you organize your timeline?
It’s specific to the editor, but I like to use as many audio tracks as possible and as few video tracks as possible, but when it’s a VFX-heavy show, that isn’t possible due to stacking various shot versions.

What did you learn from Lee Smith and Julian Clarke?
Lee Smith is a suuuuuper nice guy. He always had great stories from past films and he’s a very good editor. I’m glad he got the Oscar for Dunkirk, he’s done a lot of great work.

Julian is also great to work with. I’ve worked with him on Elysium, Chappie and Altered Carbon. He likes to cut with a lot of sound, so it’s fun to work with him. I love cutting sound, and on Altered Carbon we had over 60 tracks. It was an alternating stereo setup and we used all the tracks possible.

Altered Carbon

It was such a fun world to create sound for. Everything that could make a sound we put in. We also invented signature sounds for the tech we hoped they’d use in the final. And they did for some things.

Was that a 5.1 temp mix?? Have you ever done one?
No. I want to do a 5.1 Avid mix. Looks fun.

What was the schedule like on Altered Carbon? How was that different than some of the features you’ve worked on?
It was six-day weeks and 12 hours a day. Usually one week per month I’d trade off with the 2nd assistant and she’d let me have an actual weekend. It was a bit of a grind. I worked on Episodes 2, 3 and 8, and the schedules for those were tight, but somehow we got through it all. We had a great team up here for Vancouver’s editorial, and they were also cutting in LA. It was pretty much non-stop editing the whole way through.

How involved was Netflix in terms of the notes process? Were you working with the same editors on the episodes you assisted?
Yes, all episodes were with Julian. First it went through Skydance notes, then Netflix. Skydance usually had more as they were the first to see the cuts. There were many versions for sure.

What was it like working with Neill Blomkamp?
It was awesome. He makes cool films, and it’s great to see footage like that. I love shooting guns, explosions, swords and swearing. I beat him in ping-pong once. I danced around in victory and he demanded we play again. I retired. One of the best environments I’ve ever worked in. Elysium was my favorite gig.

What’s the largest your crew has gotten in post?
Usually one or two editors, up to four assistants, a PA, a post super — so eight or nine, depending.

Do you prefer working with a large team or do you like smaller films?
I like the larger team. It can all be pretty overwhelming, and the more people there are to help out, the easier it is to get through. The more the merrier!

Altered Carbon

How do you handle long-ass days?
Long days aren’t bad when you have something to do. On Altered Carbon I kept a skateboard in my car for those times. I just skated around the studio waiting for a text. Recently I purchased a Onewheel (a one-wheeled skateboard) and plan to use it to commute to work as much as possible.

How do you navigate the politics of a cutting room?
Politics can be tricky. I usually try to keep out of things unless I’m asked, but I do like to have a sit down or a discussion of what’s going on privately with the editor or post super. I like to be aware of what’s coming, so the rest of us are ready.

Do you prefer features to TV?
It doesn’t matter anymore because the good filmmakers work in both mediums. It used to be that features were one thing and TV was another, with less complex stories. Now that’s different and at times it’s the opposite. Features usually pay more, but again that’s changing. I still think features are where it’s at, but that’s just vanity talking.

Sometimes your project posts in Vancouver but moves to LA for finishing. Why? Does it ever come back?
Mostly I think it’s because that’s where the director/producers/studio lives. After it’s shot everyone just goes back home. Home is usually LA or NY. I wish they’d stay here.

How long do you think you’ll continue being an AE? Until you retire? What age do you think that’ll be?
No idea; I just want to keep working on projects that excite me.

Would you ever want to be an editor or do you think you’d like to pivot to VFX, or are you happy where you are?
I only hope to keep learning and doing more. I like the VFX editing, I like assisting and I like being creative. As far as cutting goes, I’d like to get on a cool series as a junior editor or at least start doing a few scenes to get better. I just want to keep advancing, I’d love to do some VR stuff.

What’s next for you project wise?
I’m on a Disney Show called Timmy Failure. I can’t say anything more at this point.

What advice do you have for other assistant editors trying to come up?
It’s going to take a lot longer than you think to become good at the job. Being the only assistant does not make you a qualified first assistant. It took me 10 years to get there. Also you never stop learning, so always be open to another approach. Everyone does things differently. With Murch on Tomorrowland, it was a whole new way of doing things that I had never seen before, so it was interesting to learn, although it was very intimidating at the start.


Jeremy Presner is an Emmy-nominated film and television editor residing in New York City. Twenty years ago, Warren was the AE on his first film. Since then, Presner has cut such diverse projects as Carrie, Stargate Atlantis, Love & Hip Hop and Breaking Amish.

VFX studio Electric Theatre Collective adds three to London team

London visual effects studio Electric Theatre Collective has added three to its production team: Elle Lockhart, Polly Durrance and Antonia Vlasto.

Lockhart brings with her extensive CG experience, joining from Touch Surgery where she ran the Johnson & Johnson account. Prior to that she worked at Analog as a VFX producer where she delivered three global campaigns for Nike. At Electric, she will serve as producer on Martini and Toyota.

Vlasto joins Electric, working on clients such as Mercedes, Tourism Ireland and Tui. She joins from 750MPH where, over a four-year period, she served as producer on Nike, Great Western Railway, VW and Amazon to name but a few.

At Electric, Polly Durrance will serve as producer on H&M, TK Maxx and Carphone Warehouse. She joins from Unit, where she helped launch their in-house Design Collective and worked with clients such as Lush, Pepsi and Thatchers Cider. Prior to Unit, Polly was at Big Buoy, where she produced work for Jaguar Land Rover, giffgaff and Red Bull.

Recent projects at the studio, which also has an office in Santa Monica, California, include Tourism Ireland’s “Capture Your Heart” and Honda’s “Palindrome.”

Main Image: (L-R) Elle Lockhart, Antonia Vlasto and Polly Durrance.

Rodeo VFX supe Arnaud Brisebois on the Fantastic Beasts sequel

By Randi Altman

Fantastic Beasts: The Crimes of Grindelwald, directed by David Yates and written by J.K. Rowling, is a sequel to 2016’s Fantastic Beasts and Where to Find Them. It follows Newt Scamander (Eddie Redmayne) and a young Albus Dumbledore (Jude Law) as they attempt to take down the dark wizard Gellert Grindelwald (Johnny Depp).

Arnaud Brisebois

As you can imagine, the film features a load of visual effects, and once again the team at Rodeo FX was called on to help. Their work included establishing the period in which the film is set and helping with the history of the Obscurus, Credence Barebone, and more.

Rodeo FX visual effects supervisor Arnaud Brisebois and team worked with the film’s VFX supervisors — Tim Burke and Christian Manz — to create digital environments, including detailed recreations of Paris in the 1920s and iconic wizarding locations like the Ministry of Magic.

Beyond these settings, the Montreal-based Brisebois was also in charge of creating the set pieces of the Obscurus’ destructive powers and a scene depicting its backstory. In all, they produced approximately 200 shots over a dozen sequences. While Brisebois visited the film’s set in Leavesden to get a better feel of the practical environments, he was not involved in principal photography.

Let’s find out more…

How early did you get involved, and how much input did you have?
Rodeo got involved in May 2017, at the time mainly working on pre-production creatures, design and concept art. I had a few calls with the film’s VFX supervisors, Tim Burke and Christian Manz, to discuss creatures and the main creative directions for us to play with. From there we tried various ideas.
At that moment in pre-production, the essence of what the creatures were was clear, but their visual representation could really swing between extremes. That was the time to invent, study and propose directions for design.

Can you talk about creating the Ministry of Magic, which was partially practical, yes?
Correct, the London Ministry of Magic was indeed partially built practically. The partial set in this case was a simple curved corridor with a ceramic-tiled wall. We still had to build the whole environment in CG in order to directly extend that practical set, but, most importantly, we extended the environment itself, with its immense circular atrium filled with thousands of busy offices.

For this build, we were provided with original Harry Potter set plans from production designer Stuart Craig, as well as plan revisions meant specifically for The Crimes of Grindelwald. We also had access to LIDAR scans and cross-polarized photography from areas of the Harry Potter tour in Leavesden, which was extremely useful.

Every single architectural element was precisely built as an individual unit, and each unit was composed of individual pieces. The single office variants were procedurally laid out on a flat grid over the set plan elevations and then wrapped as a cylinder using an expression.

The use of a procedural approach for this asset allowed for faster turnarounds and for changes to be made, even in the 11th hour. A crowd library was built to populate the offices and various areas of the Ministry, helping give it life and support the sense of scale.
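The flat-grid-to-cylinder wrap described above can be sketched in a few lines. This is a hypothetical illustration in Python, not Rodeo’s actual expression (which lived in their production pipeline); the function name, unit dimensions and return format are invented for the example:

```python
import math

def wrap_grid_to_cylinder(cols, rows, radius, unit_width, unit_height):
    """Lay office units out on a flat grid, then wrap that grid around a cylinder.

    Each unit's flat x position becomes an angle around the atrium (by arc
    length along the circumference); its row becomes elevation. Returns a list
    of (x, y, z, rot_y_degrees) placements, with each unit facing the axis.
    """
    placements = []
    circumference = 2 * math.pi * radius
    for row in range(rows):
        for col in range(cols):
            flat_x = col * unit_width
            # Arc length along the cylinder wall maps to an angle.
            angle = (flat_x / circumference) * 2 * math.pi
            x = radius * math.cos(angle)
            z = radius * math.sin(angle)
            y = row * unit_height  # floor elevation
            rot_y = math.degrees(angle)  # orient the unit toward the axis
            placements.append((x, y, z, rot_y))
    return placements

# Example: 8 columns of offices on 3 floors around a 10-unit-radius atrium.
units = wrap_grid_to_cylinder(cols=8, rows=3, radius=10.0,
                              unit_width=2.0, unit_height=3.0)
print(len(units))  # 24 placements
```

Because the layout stays a simple flat grid until the final wrap, changing the radius, floor count or unit spacing regenerates the whole atrium, which is what makes 11th-hour changes cheap in a setup like this.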

So you were able to use assets from previous films?
What really links these movies together is production designer Stuart Craig. This is definitely his world, at least in visual terms. Also, as with all the Potter films, there are a large number of references and guidelines available for inspiration. This world has its own mythology, history and visual language. One does not need to look for long before finding a hint, something to link or ground a new effect in the wizarding world.

What about the scenes involving the Obscurus? Was any of the destruction it caused practical?
Apart from a few fans blowing a bit of wind on the actors, all destruction was full-frontal CG. A complex model of Irma’s house was built with precise architectural details required for its destruction. We also built a wide library of high-resolution hero debris, which was scattered on points and simulated for the very close-up shots. In the end, only the actors were preserved from live photography.

What was the most challenging sequence you worked on?
It was definitely Irma’s death. This sequence involved a wide variety of effects — ranging from cloth and RBD levitation and tearing cloth to huge RBD simulations and, of course, the Obscurus itself, which is a very abstract and complex cloth setup driving FLIP simulations. The challenge also came from shot values, which meant everything we built or simulated had to hold up for tight close-ups as well as wide shots.

Can you talk about the tools you used for VFX, management and review and approval?
All our tracking and review is done in Autodesk Shotgun. Artists worked up versions that they would then submit for dailies. All these submissions got in front of me at one point or another, and I then reviewed them and entered notes and directives to guide artists in the right direction.
For a project the size of The Crimes of Grindelwald, over the course of 10 months, I reviewed and commented on approximately 6,000 versions for about 500 assets and 200 shots.

We are working mainly on a Maya-based pipeline, using it for modeling, rigging and shading. ZBrush is of course our main tool for organic modeling. We mostly use Mari and Substance Designer for textures. FX and CFX are handled in Houdini, and our lighting pipeline is Katana-based, using Arnold as the renderer. Our compositing pipeline is Nuke, with a little use of Flame/Flare for very specific cases. We also have proprietary tools that help us boost the potential of these great software packages and offer custom solutions.

How did the workflow differ on this film from previous films?
It didn’t really differ. Working with the same team and the same crew, it really just felt like a continuation of our collaboration. These films are great to work on, not only because of their subject matter, but also thanks to the terrific people involved.

VFX Supervision: The Coens’ Western The Ballad of Buster Scruggs

By Randi Altman

The writing and directing duo of Joel and Ethan Coen have taken on the American Western with their new Netflix film, The Ballad of Buster Scruggs. This offering features six different vignettes that follow outlaws and settlers on the American frontier.

It stars the Coen brothers’ favorite Tim Blake Nelson as Buster, along with Liam Neeson, James Franco, Brendan Gleeson and many other familiar faces, even Tom Waits! It’s got dark humor and a ton of Coen quirkiness.

Alex Lemke (middle) on set with the Coen brothers.

For their visual effects needs, the filmmakers turned to New York-based East Side Effects co-founders and VFX supervisors Alexander Lemke and Michael Huber to help make things look authentic.

We reached out to visual effects supervisors Lemke and Huber to find out more about their process on the film and how they worked with these acclaimed filmmakers. East Side Effects created two-thirds of the visual effects in-house, while other houses, such as The Mill and Method, provided shots as well.

How many VFX shots were there in total?
Alexander Lemke: In the end, 704 shots had digital effects in them. This has to be a new record for the Coens. Joel at one point jokingly called it their “Marvel movie.”

How early did you get involved? Can you talk about that process?
Michael Huber: Alex and myself were first approached in January 2017 and had our first meetings shortly thereafter. We went through the script with the Coens and designed what we call a “VFX bible,” which outlined how we thought certain effects could be achieved. We then started collecting references from other films or real-life footage.

Did you do previs? 
Lemke: The Coens have been doing movies for so long in their own way that previs never really became an issue. For the Indian battles, we tried to interest them in the Ncam virtual camera system in combination with pre-generated assets, but that is not their way of doing a film.

The whole project was storyboarded by J. Todd Anderson, who has been their go-to storyboard guy since Raising Arizona. These storyboards gave a pretty good indication of what to expect, but there were still a lot of changes due to the nature of the project, such as weather and shooting with animals.

What were some of the challenges of the process and can you talk about creating the digital characters that were needed?
Huber: Every story had its own challenge, ranging from straightforward paintouts and continuity fixes to CG animals and complex head replacements using motion control technology. In order to keep the work as close to the directors as possible, we assembled a group of artists to serve as an extended in-house team, creating the majority of shots while also acting as a hub for external vendor work.

In addition, a color workflow using ACES and FilmLight Baselight was established to match VFX shots seamlessly to the dailies look established by cinematographer Bruno Delbonnel and senior colorist Peter Doyle. All VFX pulls were handled in-house.

Lemke: The Coens like to keep things in-camera as much as possible, so animals like the owl in “All Gold Canyon” or the dog in “Gal” were real. Very early on it was clear that some horse falls wouldn’t be possible as a practical stunt, so Joel and Ethan had a reel compiled with various digital horse stunts — including the “Battle of the Bastards” from Game of Thrones, which was done by Iloura (now Method). We liked that so much that we decided to just go for it and reach out to these guys, and we were thrilled when we got them on board for this. They did the “dog-hole” horse falls in “The Gal Who Got Rattled” segment, as well as the carriage horses in “Mortal Remains.”

Huber: For the deer in “All Gold Canyon,” the long-time plan was to shoot a real deer against bluescreen, but it became clear that we might not get the very specific actions Joel and Ethan wanted to see. They were constantly referring to the opening of Shane, which has this great shot of the titular character appearing through the antlers of a deer. So, it became more and more clear it would have to be a digital solution, and we were very happy to get The Mill in New York to work on that for us. Eventually, they would also handle all the other critters in the opening sequence.

Can you talk about Meal Ticket’s “artist” character, who is missing limbs?
Lemke: The “Wingless Thrush” — as he is referred to on a poster in the film — was a combined effort of the art department, special effects, costume design, VFX and, of course, actor Harry Melling’s incredible stamina. He was performing this poetry while standing in a hole in the ground with his hands behind his back, and went for it take after take, sometimes in the freezing cold.

Huber: It was clear that 98% of all shots would involve painting out his arms and legs, so SFX supervisor Steve Cremin had to devise a way to cut holes into the set and his chair to make it appear he was resting on his stumps. Our costume designer, Mary Zophres, had the great idea of having him wear a regular shirt with the long sleeves just folded up, which helped with hiding his arms. He wasn’t wearing any blue garment, just black, which helped avoid unnecessary color spill on the set.

Alex was on set to make sure we would shoot clean plates after each setup. Luckily, the Coen brothers’ approach to these shots was really focusing on Harry’s performance in long locked-off takes, so we didn’t have to deal with a lot of camera motion. We also helped Harry’s look by warping his shoulders closer to his body in some shots.

Was there a particular scene with this character that was most challenging or that you are most proud of?
Lemke: While most of the paintout shots were pretty straightforward — we just had to deal with the sheer number of shots and edit changes — the most challenging part was when Liam Neeson carries Harry in a backpack up the stairs in a brothel. He then puts him on the ground and eventually turns him away from the “action” that is about to happen.

We talked about different approaches early on. At some point, a rig was considered to help with him being carried up the stairs, but this would have meant an enormous amount of paint work, not to mention the setup time on a very tight shooting schedule. A CG head might have worked for the stairs, but for the long close-up shots of Harry — both over a minute long, with only very subtle facial expressions — it would have been cost prohibitive and maybe not successful in the end. So a head replacement seemed like the best solution, which comes with its own set of problems: in our case, shooting a head element of Harry that would match exactly what the dummy on Liam’s back and on the ground was doing in the production plates.

We came up with a very elaborate setup, where we would track the backpack and a dummy in the live-action photography in 3D Equalizer. We then reengineered this data into Kuper move files that would drive a motion control/motion base combo.

Basically, Harry would sit on a computerized motion base that would do the turning motion so he could react to being pushed around. This happened while the motion control camera would take care of all the translations. This also meant our DP Bruno had to create animated lighting for the staircase shot to make the head element really sit in the plate.

We worked with Pacific Motion for the motion control; Mike Leben was our operator. Nic Nicholson of NAC Effects took care of the motion base. Special thanks go out to Christoph Gaudl for his camera and object tracking, Stefan Galleithner for taking on the task of converting all that data into something the camera and base would understand, and Kelly Chang and Mike Viscione for on-set Maya support.

Of course, you only get an element that works 80% of the way — the rest was laborious compositing work. Since we pushed the motion base to its speed limits on the staircase shot, we actually had to shoot it at half speed and then speed it up in post. This meant a lot of warping/tracking was needed to make sure there was no slippage.
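The retime itself is simple arithmetic — a hypothetical sketch of the frame remap involved, not East Side’s actual pipeline code:

```python
def retime_map(out_frame: int, speed: float = 2.0) -> float:
    """Map an output frame to its source frame for a constant-speed retime.

    Shooting at half speed and speeding up 2x in post means output frame
    n samples source frame n * 2. Non-integer results would need frame
    blending or optical-flow interpolation to avoid judder.
    """
    return out_frame * speed

# One second of output at 24fps consumes two seconds of half-speed source.
source_frames = [retime_map(f) for f in range(24)]
```

The warping/tracking pass then corrects any residual slippage between the retimed element and the plate.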

Michael Huber

The dummy we used for the live-action photography didn’t have any breathing movement in it, so we used parts of Harry’s bluescreen plates as a guideline of how his chest should move. These tricky tasks were expertly performed mainly by Danica Parry, Euna Kho and Sabrina Tenore.

Can you talk about how valuable it is being on set?
Huber: It is just valuable to be on set when the call sheet calls for a greenscreen while we really need a bluescreen! But joking aside, Joel and Ethan were very happy to have someone there all the time during the main shoot in case something came up, which happened a lot because we were shooting outdoors so much and were dependent on the weather.

For the opening shot of Buster riding through Monument Valley, they had a very specific view in mind — something they had seen in a picture on the Internet. Through Google Maps and research, Alex was able to find the exact location where that picture was taken. So, on a weekend when we weren’t shooting, he packed up his family and drove up to the Valley to shoot photographs that would serve as the basis for the matte painting for the first shot of the film — instead of going there with a whole crew.

Another instance where being on set helped was the scene with Tom Waits in the tree — the backgrounds for these bluescreen shots were a mixture of B-camera footage and Alex’s location photography from Colorado. The same goes for the owl tree backgrounds.

What tools did East Side use on the film?
Huber: For software we called on Foundry Nuke (X and Studio), Boris FX Mocha Pro and Side Effects Houdini. For hardware we used HP and Supermicro workstations running Linux. We also used proprietary tools, such as Houdini digital assets for blood simulations.

We were using Autodesk Shotgun with a proprietary connection to Nuke that handled all our artist interaction and versioning, including automatically applying the correct Baselight grade when creating a version. This also allowed us to use the RV-Shotgun integration for reviewing.
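Zoic’s Shotgun-to-Nuke bridge is proprietary, but the general pattern — creating a Version record that carries a reference to its grade — can be sketched in plain Python. The field names below are illustrative assumptions, not Zoic’s actual schema; a real integration would pass a payload like this to the shotgun_api3 client:

```python
def build_version_payload(project_id, shot_id, movie_path, baselight_grade):
    """Assemble the metadata a review Version might carry.

    'sg_baselight_grade' is a hypothetical custom field standing in for
    whatever the proprietary bridge uses to attach the correct grade.
    """
    return {
        "project": {"type": "Project", "id": project_id},
        "entity": {"type": "Shot", "id": shot_id},
        "code": movie_path.rsplit("/", 1)[-1],  # version name from filename
        "sg_path_to_movie": movie_path,
        "sg_baselight_grade": baselight_grade,
    }

payload = build_version_payload(
    42, 1001, "/jobs/show/sh010/sh010_comp_v003.mov", "sh010_grade_v002"
)
# A real pipeline would then call sg.create("Version", payload) and let
# the RV-Shotgun integration pick the Version up for review.
```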

Can you talk about the turnaround times and deadlines?
Lemke: Working on a Coen brothers film means you don’t have a lot of things you normally have to deal with — studio screenings, trailers, and such. At the same time, they insisted on working through the stories chronologically, so that meant that the later segments would come in late in the schedule. But, it is always a great experience working with filmmakers who have a clear vision and know what they are doing.

Asahi beer spot gets the VFX treatment

A collaboration between The Monkeys Melbourne, In The Thicket and Alt, a newly released Asahi campaign takes viewers on a journey through landscapes built around surreal Japanese iconography. Watch Asahi Super Dry — Enter Asahi here.

From script to shoot — a huge operation that took place at Sydney’s Fox Studios — director Marco Prestini and his executive producer Genevieve Triquet (from production house In The Thicket) brought on the VFX team at Alt to help realize the creative vision.

The VFX team at Alt (which has offices in Sydney, Melbourne and Los Angeles) worked with Prestini to help design and build the complex “one shot” look, with everything from robotic geishas to a gigantic CG squid in the mix, alongside a seamless blend of CG set extensions and beautifully shot live-action plates.

“VFX supervisor Dave Edwards and the team at Alt, together with my EP Genevieve, have been there since the very beginning, and their creative input and expertise were key every step of the way,” explains Prestini. “Everything we did on set was the result of weeks of endless back and forth on technical previz, a process that required pretty much everyone’s input on a daily basis and that was incredibly inspiring for me to be part of.”

Dave Edwards, VFX supervisor at Alt, shares: “Production designer Michael Iacono designed sets in 3D, with five huge sets built for the shoot. The team then worked out camera speeds and timings based on these five sets and seven plates. DP Stefan Duscio would suggest rigs and mounts, which our team was then able to test in previs to see if they would work with the set. During previs, we worked out that we couldn’t get the resolution and the required frame rate to shoot the high-frame-rate samurais, so we had to use the Alexa LF. Of course, that also helped Marco, who wanted minimal lens distortion, as it allowed a wide field of view without the distortion of normal anamorphic lenses.”

One complex scene involves a character battling a gigantic underwater squid, which was done via a process known as “dry for wet” — a film technique in which smoke, colored filters and/or lighting effects are used to simulate a character being underwater while filming on a dry stage. The team at Alt did a rough animation of the squid to help drive the actions of the talent and the stunt team on the day, before spending the final weeks perfecting the look of the photoreal monster.

In terms of tools, for concept design/matte painting Alt used Adobe Photoshop while previs/modeling/texturing/animation was done in Autodesk Maya. All of the effects/lighting/look development was via Side Effects Houdini; the compositing pipeline was built around Foundry Nuke; final online was completed in Autodesk Flame; and for graphics, they used Adobe After Effects.
The final edit was done by The Butchery.

Here is the VFX breakdown:

Enter Asahi – VFX Breakdown from altvfx on Vimeo.

Storage for VFX Studios

By Karen Moltenbrey

Visual effects are dazzling — inviting eye candy, if you will. But when you mention the term “storage,” the wide eyes may turn into a stifled yawn from viewers of the amazing content. Not so for the makers of that content.

They know that the key to a successful project rests within the reliability of their storage solutions. Here, we look at two visual effects studios — both top players in television and feature film effects — as they discuss how data storage enables them to excel at their craft.

Zoic Studios
A Culver City-based visual effects facility, with shops in Vancouver and New York, Zoic Studios has been crafting visual effects for a host of television series since its founding in 2002, starting with Firefly. In addition to a full plate of episodics, Zoic also counts numerous feature films and spots to its credits.

Saker Klippsten

According to Saker Klippsten, CTO, the facility has used a range of storage solutions over the past 16 years from BlueArc (before it was acquired by Hitachi), DataDirect Networks and others, but now uses Dell EMC’s Isilon cluster file storage system for its current needs. “We’ve been a fan of theirs for quite a long time now. I think we were customer number two,” he says, “back when they were trying to break into the media and entertainment sector.”

Locally, the studio uses Intel NVMe drives for its workstations. NVMe, or non-volatile memory express, is an open logical device interface specification for accessing flash storage attached via the PCI Express (PCIe) bus. Previously, Zoic had been using Samsung SSDs — 1TB and 2TB EVO drives — but in the past year and a half it began migrating to NVMe on the local workstations.

Zoic transitioned to the Isilon system in 2004-2005 because of the heavy usage its renderfarm was getting. “Renderfarms work 24/7 and don’t take breaks. Our storage was getting really beat up, and people were starting to complain that it was slow accessing the file system and affecting playback of their footage and media,” explains Klippsten. “We needed to find something that could scale out horizontally.”

At the time, however, file-level storage was pretty much all that was available — “you were limited to this sort of vertical pool of storage,” says Klippsten. “You might have a lot of storage behind it, but you were still limited at the spigot, at the top end. You couldn’t get the data out fast enough.” But Isilon broke through that barrier by creating a cluster storage system that scaled horizontally, “so we could balance our load, our render nodes and our artists across a number of machines, and access and update in parallel at the same time,” he adds.

Klippsten believes that solution was a big breakthrough for a lot of users; nevertheless, it took some time for others to get onboard. “In the media and entertainment industry, everyone seemed to be locked into BlueArc or NetApp,” he notes. Not so with Zoic.

Fairly recently, some new players have come onto the market, including Qumulo, touted as a “next-generation NAS company” built around advanced, distributed software running on commodity hardware. “That’s another storage platform that we have looked at and tested,” says Klippsten, adding that Zoic even has a number of nodes from the vendor.

There are other open-source options out there as well. Recently, Red Hat began offering Gluster Storage, an open, software-defined storage platform for physical, virtual and cloud environments. “And now with NVMe, it’s eliminating a lot of these problems as well,” Klippsten says.

Back when Zoic selected Isilon, there were a number of major issues that affected the studio’s decision making. As Klippsten notes, they had just opened the Vancouver office and were transferring data back and forth. “How do we back up that data? How do we protect it? Storage snapshot technology didn’t really exist at the time,” he says. But, Isilon had a number of features that the studio liked, including SyncIQ, software for asynchronous replication of data. “It could push data between different Isilon clusters from a block level, in a more automated fashion. It was very convenient. It offered a lot of parameters, such as moving data by time of day and access frequency.”

SyncIQ enabled the studio to archive the data. And for dealing with interim changes, such as a mistakenly deleted file, Zoic found Isilon’s SnapshotIQ ideal for fast data recovery. Moreover, Isilon was one of the first to support Aspera, right on the Isilon cluster. “You didn’t have to run it on a separate machine. It was a huge benefit because we transfer a lot of secure, encrypted data between us and a lot of our clients,” notes Klippsten.

Netflix’s The Chilling Adventures of Sabrina

Within the pipeline, Zoic’s storage system sits at the core. It is used immediately as the studio ingests the media, whether it is downloaded or transferred from hard drives – terabytes upon terabytes of data. The data is then cleaned up and distributed to project folders for tasks assigned to the various artists. In essence, it acts as a holding tank for the main production storage as an artist begins working on those specific shots, Klippsten explains.

Aside from using the storage at the floor level, the studio also employs it at the archive level, for data recovery as well as material that might not be accessed for weeks. “We have sort of a tiered level of storage — high-performance and deep-archival storage,” he says.

And the system is invaluable, as Zoic is handling 400 to 500 shots a week. If you multiply that by the number of revisions and versions that take place during that time frame, it adds up to hundreds of terabytes weekly. “Per day, we transfer between LA, Vancouver and New York somewhere around 20TB to 30TB,” he estimates. “That number increases quite a bit because we do a lot of cloud rendering. So, we’re pushing a lot of data up to Google and back for cloud rendering, and all of that hits our Isilon storage.”
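For scale, 20TB to 30TB a day works out to a surprisingly modest sustained line rate — some back-of-envelope arithmetic, assuming decimal terabytes spread evenly over 24 hours:

```python
def sustained_gbps(tb_per_day: float) -> float:
    """Average rate in Gbit/s needed to move tb_per_day in 24 hours."""
    bits = tb_per_day * 1e12 * 8       # decimal TB -> bits
    return bits / (24 * 3600) / 1e9    # bits per second -> Gbit/s

low, high = sustained_gbps(20), sustained_gbps(30)
# Roughly 1.9 to 2.8 Gbit/s averaged across the day -- though real
# traffic is bursty, so peak demand is far higher.
```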

When Zoic was founded, it originally saw itself as a visual effects company, but at the end of the day, Klippsten says they’re really a technology company that makes pretty pictures. “We push data and move it around to its limits. We’re constantly coming up with new, creative ideas, trying to find partners that can help provide solutions collaboratively if we cannot create them ourselves. The shot cost is constantly being squeezed by studios, which want these shots done faster and cheaper. So, we have to make sure our artists are working faster, too.”

The Chilling Adventures of Sabrina

Recently, Zoic has been working on a TV project involving a good deal of water simulations and other sims in general — which rapidly generate a tremendous amount of data. Then the data is transferred between the LA and Vancouver facilities. Having storage capable of handling that was unheard of three years ago, Klippsten says. However, Zoic has managed to do so using Isilon along with some off-the-shelf Supermicro storage with NVMe drives, enabling its dynamics department to tackle this and other projects. “When doing full simulation, you need to get that sim in front of the clients as soon as possible so they can comment on it. Simulations take a long time — we’re doing 26GB/sec, which is crazy. It’s close to something in the high-performance computing realm.”

With all that considered, it is hardly surprising to hear Klippsten say that Zoic could not function without a solid storage solution. “It’s funny. When people talk about storage, they are always saying they don’t have enough of it. Even when you have a lot of storage, it’s always running at 99 percent full, and they wonder why you can’t just go out to Best Buy and purchase another hard drive. It doesn’t work that way!”

Milk VFX
Founded just five years ago, Milk VFX is an independent visual effects facility in the UK with locations in London and Cardiff, Wales. While Milk VFX may be young, it was founded by experienced and award-winning VFX supervisors and producers. And the awards have continued, including an Oscar (Ex Machina), an Emmy (Sherlock) and three BAFTAs, as the studio creates innovative and complex work for high-end television and feature films.

Benoit Leveau

With so much precious data, and a lot of it, the studio has to ensure that its work is secure and the storage system is keeping pace with the staff using it. When the studio was set up, it installed Pixit Media’s PixStor, a parallel file system with limitless storage, for its central storage solution. And, it has been growing with the company ever since. (Milk uses almost no local storage, except for media playback.)

“It was a carefully chosen solution due to its enterprise-level performance,” says Benoit Leveau, head of pipeline at Milk, about the decision to select PixStor. “It allowed us to expand when setting up our second studio in Cardiff and our rendering solutions in the cloud.”

When Milk was shopping for a storage offering while opening the studio, four things were at the forefront of its mind: speed, scalability, performance and reliability. Those were the qualities the group wanted from its storage system — exactly the same four demands its projects make.

“A final image requires gigabytes, sometimes terabytes, of data in the form of detailed models, high-resolution textures, animation files, particles and effects caches and so forth,” says Leveau. “We need to be able to review 4K image sequences in real time, so it’s really essential for daily operation.”

This year alone, Milk has completed a number of high-end visual effects sequences for feature films such as Adrift, serving as the principal vendor on this true story about a young couple lost at sea during one of the most catastrophic hurricanes in recorded history. The Milk team created all the major water and storm sequences, including bespoke 100-foot waves, all of which were rendered entirely in the cloud.

As Leveau points out, one of the shots in the film was more than 60TB, as it required complex ocean simulations. “We computed the ocean simulations on our local renderfarm, but the rendering was done in the cloud, and with this setup, we were able to access the data from everywhere almost transparently for the artists,” he explains.

Adrift

The studio also recently completed work on the blockbuster Fantastic Beasts sequel, The Crimes of Grindelwald.

For television, the studio created visual effects for an episode of the Netflix Altered Carbon sci-fi series, where people can live forever, as they digitally store their consciousness (stacks) and then download themselves into new bodies (sleeves). For the episode, the Milk crew created forest fires and the aftermath, as well as an alien planet and escape ship. For Origin, an action-thriller, the team generated 926 VFX shots in 4K for the 10-part series, spanning a wide range of work. Milk is also serving as the VFX vendor for Good Omens, a six-part horror/fantasy/drama series.

“For Origin, all the data had to be online for the duration of the four-month project. At the same time, we commenced work as the sole VFX vendor on the BBC/Amazon Good Omens series, which is now rapidly filling up our PixStor, hence the importance of scalability!” says Leveau.

Main Image: Origin via Milk VFX


Karen Moltenbrey is a veteran VFX and post writer.

Shotgun 8: cloud-based Shotgun Create for studio artists, new UI, more

Autodesk Shotgun 8 is the newest version of the company’s cloud-based review and production tracking software. Targeted to busy creative teams, Shotgun 8 aims to improve the artist and reviewer experience with updates that reduce interruptions to creative flow and facilitate the feedback loop.

Shotgun 8 includes the first public version of Shotgun Create, a new part of Shotgun geared specifically toward creatives working at studios. As a cloud-connected desktop experience, Shotgun Create makes it easier for artists and reviewers to see tasks demanding their attention while providing a collaborative environment to review media and exchange feedback. Linked to the same updated information stored in Shotgun, artists are able to work with high-resolution, locally stored shots in addition to media streamed from the cloud. Also included in this first version of Shotgun Create are built-in annotation tools, new prioritized task views, a centralized view of task history, and tighter integration with creative tools.

Shotgun 8 also has a new web interface for greater clarity and consistency. For individuals who work predominantly in a dark environment, this release also makes it easier to switch between light and dark themes with a new button in the header.

Shotgun 8 is currently available and priced at $30 per account/per month with support or $50 per account/per month with advanced support.

A VFX pro on avoiding the storage ‘space trap’

By Adam Stern

Twenty years is an eternity in any technology-dependent industry. Over the course of two-plus decades of visual effects facility ownership, changing standards, demands, capability upgrades and staff expansion have seen my company, Vancouver-based Artifex Studios, usher in several distinct eras of storage, each with its own challenges. As we’ve migrated to bigger and better systems, one lesson we’ve learned has proven critical to all aspects of our workflow.

Adam Stern

In the early days, Artifex used off-the-shelf hard drives and primitive RAIDs for our storage needs, which brought with it slow transfer speeds and far too much downtime when loading gigabytes of data on and off the system. We barely had any centralized storage, and depended on what was essentially a shared network of workstations — which our then-small VFX house could get away with. Even considering where we were then, which was sub-terabyte, this was a messy problem that needed solving.

We took our first steps into multi-TB NAS using off-the-shelf solutions from companies like Buffalo. This helped our looming storage space crunch but brought new issues, including frequent breakdowns that cost us vital time and lost iterations — even with plenty of space. I recall a particular feature film project we had to deliver right before Christmas. It almost killed us. Our NAS crashed and wouldn’t allow us to pull final shots, while throwing endless error messages our way. I found myself frantically hot-wiring spare drives to enable us to deliver to our client. We made it, but barely.

At that point it was clear a change was needed. We started using a solution that Annex Pro — a Vancouver-based VAR we’d been working with for years — helped put into place. That company was bought and then went away completely.

Our senior FX TD, Stanislav Enilenis, who was also handling IT for us back then, worked with Annex to install the new system. According to Stan, “The switch allowed bandwidth for expansion. However, when we were in high-production mode, bandwidth became an issue. While the system was an overall improvement over our first multi-terabyte NAS, we had issues. The company was bought out, so getting drives became problematic, parts became harder to source and there were system failures. When we hit top capacity with the then 20-plus staff all grinding, the system would slow to a crawl and our artists spent more time waiting than working.”

Artifex machine room.

As we transitioned from SD to HD, and then to 4K, our delivery requirements increased along with our rendering demands, causing severe bottlenecks in the established setup. We needed a better solution, but options were limited. We were potentially looking at a six-figure investment in a system not geared toward M&E.

In 2014, Artifex was working on the TV series Continuum, which had fairly massive 3D requirements on an incredibly tight turnaround. It was time to make a change. After a number of discussions with Annex, we made the decision to move to an offering from a new company called Qumulo, which provided above-and-beyond service, training and setup. When we expanded into our new facility, Qumulo helped properly move the tech. Our new 48TB pipeline flowed freely and offered features we didn’t previously have, and Qumulo was constantly adding new and requested features.

Laila Arshid, our current IS manager, has found this to be particularly valuable. “In Qumulo’s dashboard I can see realtime analytics of everything in the system. If we have a slowdown, I can track it to specific workstations and address any issues. We can shut that workstation or render-node down or reroute files so the system stays fast.”

The main lesson we’ve learned throughout every storage system change or upgrade is this: It isn’t just about having a lot of space. That’s an easy trap to fall into, especially today when we’re seeing skyrocketing demands from 4K+ workflows. You can have unlimited storage, but if you can’t utilize it efficiently and at speed, your storage space becomes irrelevant.

In our industry, the number of iterations we can produce has a dramatic impact on the quality of work we’re able to provide, especially with today’s accelerated schedules. One less pass can mean work with less polish, which isn’t acceptable.

Artifex provided VFX for Faster Than Light

Looking forward, we’re researching extended storage on the cloud: an ever-expanding storage pool with the advantages of fast local infrastructure. We currently use GCP for burst rendering with Zync, along with nearline storage, which has been fantastic — but the next step will be integrating these services with our daily production processes. That brings a number of new challenges, including how to combine local and cloud-based rendering and storage in ways that are seamless to our team.

Constantly expanding storage requirements, along with maintaining the best possible speed and efficiency to allow for artist iterations, are the principal drivers for every infrastructure decision at our company — and should be a prime consideration for everyone in our industry.


Adam Stern is the founder of Vancouver, British Columbia’s Artifex. He says the studio’s main goal is to heighten the VFX experience, both artistically and technically, and collaborate globally with filmmakers to tell great stories.

Post production in the cloud

By Adrian Pennington

After being talked about for years, using the cloud for the full arsenal of post workflows is now possible, with huge ramifications for the facilities business.

Rendering frames for visual effects requires an extraordinary amount of compute power for which VFX studios have historically assigned whole rooms full of servers to act as their renderfarm. As visual quality has escalated, most vendors have either had to limit the scope of their projects or buy or rent new machines on-premises to cope with the extra rendering needed. In recent times this has been upended as cloud networking has enabled VFX shops to relieve internal bottlenecks to scale, and then contract, at will.

The cloud rendering process has become so established that even this once groundbreaking capability has evolved to encompass a whole host of post workflows from previz to transcoding. In doing so, the conventional business model for post is being uprooted and reimagined.

“Early on, global facility powerhouses first recognized how access to unlimited compute and remote storage could empower the creative process to reach new heights,” explains Chuck Parker, CEO of Sohonet. “Despite spending millions of dollars on hardware, the demands of working on multiple, increasingly complex projects simultaneously, combined with decreasing timeframes, stretched on-premises facilities to their limits.”

Chuck Parker

Public cloud providers (Amazon Web Services, Google Cloud Platform, Microsoft Azure) changed the game by solving space, time and capacity problems for resource-intensive tasks. “Sohonet Fastlane and Google Compute Engine, for example, enabled MPC to complete The Jungle Book on time and to Oscar-winning standards, thanks to being able to run millions of core hours in the cloud,” notes Parker.

Small- to mid-sized companies followed suit. “They lacked the financial resources and the physical space of larger competitors, and initially found themselves priced out of major studio projects,” says Parker. “But by accessing renderfarms in the cloud they can eliminate the cost and logistics of installing and configuring physical machines. Flexible pricing and the option of preemptible instances mean only paying for the compute power used, further minimizing costs and expanding the scope of possible projects.”
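The economics Parker describes are easy to model. A minimal sketch — the prices and discount below are placeholder assumptions, not any provider’s actual rates:

```python
def render_cost(core_hours: float, price_per_core_hour: float,
                preemptible_discount: float = 0.0) -> float:
    """Cost of a cloud render job, optionally at a preemptible discount.

    Preemptible/spot instances can be reclaimed by the provider at any
    time, which render work tolerates well: a killed frame just requeues.
    """
    return core_hours * price_per_core_hour * (1.0 - preemptible_discount)

on_demand = render_cost(100_000, 0.01)         # 100k core-hours at an assumed $0.01/core-hour
preemptible = render_cost(100_000, 0.01, 0.7)  # the same job at an assumed 70% discount
```

At those assumed rates the same job drops from four figures to three — the kind of gap that decides whether a small shop can bid on a project at all.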

Milk VFX did just this when rendering the complex sequences on Adrift. Without the extra horsepower, the London-based house could not have bid on the project in the first place.

“The technology has now evolved to a point where any filmmaker with any VFX project or theatrical, TV or spot editorial can call on the cloud to operate at scale when needed — and still stay affordable,” says Parker. “Long anticipated and theorized, the ability to collaborate in realtime with teams in multiple geographic locations is a reality that is altering the post production landscape for enterprises of all sizes.”

Parker says the new post model might look like this. He uses the example of a company headquartered in Berlin — “an innovative company might employ only a dozen managers and project supervisors on its books. They can bid with confidence on jobs of any scale and any timeframe knowing that they can readily rent physical space in any location, anywhere in the world, to flexibly take advantage of tax breaks and populate it with freelance artists: 100 one week, say, 200 in week three, 300 in week five. The only hardware (rental) costs would be thin-client workstations and Wacom tablets, plus software licenses for 3D, roto, compositing and other key tools. With the job complete, the whole infrastructure can be smoothly scaled back.”

The compute costs of spinning up cloud processing and storage can be modeled into client pitches. “But building out and managing such connectivity independently may still require considerable CAPEX — an outlay that might be cost-prohibitive if you only need the infrastructure for short periods,” notes Parker. “Cloud-compute resources are perfect for spikes in workload but, in between those spikes, paying for bandwidth you don’t need will hurt the bottom line.”

Dedicated, “burstable” connectivity speeds of 100Mbit/s up to 50Gbit/s with flexibility, security and reliability are highly desirable attributes for the creative workflow. Price points, as ever, are a motivating concern. Parker’s offerings “move your data away from Internet bandwidth, removing network congestion and decreasing the time it takes to transfer your data. With a direct link to the major cloud provider of your choice, customers can be in control of how their data is routed, leading to a more consistent network experience.
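Those link speeds translate directly into turnaround time. A quick sketch of raw transfer time, ignoring protocol overhead and congestion:

```python
def transfer_hours(size_tb: float, link_gbps: float) -> float:
    """Hours to move size_tb (decimal TB) over a link of link_gbps."""
    bits = size_tb * 1e12 * 8
    return bits / (link_gbps * 1e9) / 3600

slow = transfer_hours(10, 0.1)  # 10TB over a 100Mbit/s link
fast = transfer_hours(10, 50)   # the same 10TB over a 50Gbit/s link
# About nine days versus under half an hour for the same material.
```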

“Direct links into major studios like Pinewood UK open up realtime on-set CGI rendering with live-action photography for virtual production scenarios,” adds Parker. “It is vital that your data transits straight to the cloud and never touches the Internet.”

With file sizes set to increase exponentially over the next few years as 4K and HDR become standard and new immersive media like VR emerge into the mainstream, leveraging the cloud will not only be routine for the highest-budget projects and largest vendors, it will become the new post production paradigm. In the cloud, creative workflows are demystified and democratized.

Video editing and VFX app HitFilm gets an upgrade

FXhome has upgraded its video editing and VFX software app. The new HitFilm Version 11.0 features Surface Studio, a new VFX plugin modeled on Video Copilot’s Damage and Decay and Cinematic Titles tutorials. Based on customer requests, this new VFX tool enables users to create smooth or rough-textured metallic and vitreous surfaces on any text or layer. By dropping a clear PNG file onto the HitFilm timeline, users can instantly turn text titles into weathered, rusty and worn metallic signs.

HitFilm’s Surface Studio also joins FXhome’s expanding library of VFX plugins, Ignite Pro. This set of plugins is available on Mac and PC platforms and is compatible with 10 of the most-used host applications, including Adobe Creative Cloud, Apple Final Cut Pro X, Avid, DaVinci Resolve and others.

Last month, FXhome added to its product family with Imerge Pro, a non-destructive RAW image compositor with fully flexible layers and advanced keying for content creators. FXhome is also integrating a number of Imerge Pro plugins with HitFilm, including Exposure, Outer Glow, Inner Glow and Dehaze. New Imerge Pro plugins are tightly integrated with HitFilm V.11.0’s interface ensuring smooth, uninterrupted workflows.

Minimum system requirements for Apple are: macOS 10.13 High Sierra, macOS 10.12 Sierra or OS X 10.11 El Capitan. For Windows: Microsoft Windows 10 (64-bit) or Microsoft Windows 8 (64-bit).

HitFilm 11.0 is available immediately from the FXhome store for $299. FXhome is also celebrating this holiday season with its annual sale. Through December 4, 2018, they are offering a 33% discount when users purchase the FXhome Pro Bundle, which includes HitFilm 11.0, Action, Ignite and Imerge.

Post studio Nomad adds Tokyo location

Creative editorial/VFX/sound design company Nomad has expanded its global footprint with a space in Tokyo, adding to a network that also includes offices in New York, Los Angeles and London. It will be led by managing director Yoshinori Fujisawa and executive producer Masato Midorikawa.

The Tokyo office has three client suites, an assistant support suite, a production office and a machine room. Tools for the post workflow include Adobe Creative Cloud (Premiere, After Effects, Photoshop), Flame, Flame Assist, Avid Pro Tools and various other support tools.

Nomad partner/editor Glenn Martin says the studio often works with creatives who regularly travel between LA and Tokyo. He says Nomad will support the new Tokyo-based group with editors and VFX artists from its other offices whenever larger teams are needed.

“The role of a post production house is quite different between the US and Japan,” says Fujisawa and Midorikawa, jointly. “Although people in Japan are starting to see the value of the Western-style post production model, it has not been properly established here yet. We are able to give our Japanese directors and creatives the ability to collaborate with Nomad’s talented editors and VFX artists, who have great skills in storytelling and satisfying the needs of brands. Nomad has a comprehensive post-production workflow that enables the company to execute global projects. It’s now time for Japan to experience this process and be a part of the future of global advertising.”

Main Image: (L-R) Yoshinori Fujisawa and Masato Midorikawa

MPC Film provides VFX for new Predator movie

From outer space to suburban streets, the hunt comes to Earth in director Shane Black’s reinvention of the Predator series. Now, the universe’s most lethal hunters are stronger, smarter and deadlier than ever before, having genetically upgraded themselves with DNA from other species.

When a young boy accidentally triggers their return to Earth, only a crew of ex-soldiers and an evolutionary biology professor can prevent the end of the human race.

 

MPC Film’s team, led by VFX supervisors Richard Little and Arundi Asregadoo, created 500 shots for the new Predator movie. The majority of the work involved hero animation for the Upgrade Predator; additional work included the Predator dogs, FX simulations and a CG swamp environment.

MPC Film’s character lab department modeled, sculpted and textured the Upgrade and the Predator dogs to a high level of detail, giving the director the flexibility to view close-ups if needed. The technical animation department applied movement to the muscle system and added the flowing motion of the dreadlocks on all of the film’s hero alien characters, an integral part of the Predator aesthetic in the franchise.

MPC’s artists also created photorealistic Predator One and digital double mercenary characters for the film. Sixty-four other assets were created, ranging from stones and sticks for the swamp floor to grenades, a grenade launcher and bombs.

MPC’s artists also worked on the scene where we first meet the Upgrade’s dogs, which are sent out to hunt the Predator. The sequence was shot on a bluescreen stage on location. The digital environments team built a 360-degree baseball field matching the location from reference photography. Building simple geometry and re-projecting the textures onto it helped create a realistic environment.

Once the Upgrade tracks down the “fugitive” Predator the fight begins. To create the scene, MPC used a mixture of the live-action Predator intercut with its full CG Predator. The battle culminates with the Upgrade ripping the head and spine away from the body of the fugitive. This shot was a big challenge for the FX and tech animation team, who also added green Predator blood into the mix, amplifying the gore factor.

In the hunt scene, the misfit heroes trap the Upgrade and set the ultimate hunter alight. This sequence was technically challenging for the animation, lighting and FX team, which worked very closely to create a convincing Upgrade that appeared to be on fire.

For the final battle, MPC Film’s digital environments artists created a full CG swamp where the Upgrade’s ship crash-lands. The team was tasked with matching the set and creating a 360-degree CG set extension with water effects.

A big challenge was how to ensure the Upgrade Predator interacted realistically with the actors and set. The animation, tech animation and FX teams worked hard to make the Upgrade Predator fit seamlessly into this environment.

London design, animation studio Golden Wolf sets up shop in NYC

Animation studio Golden Wolf, headquartered in London, has launched its first stateside location in New York City. The expansion comes on the heels of an alliance with animation/VFX/live-action studio Psyop, a minority investor in the company. Golden Wolf now occupies studio space in SoHo adjacent to Psyop and its sister company Blacklist, which formerly represented Golden Wolf stateside and was instrumental to the relationship.

Among the year’s highlights from Golden Wolf are an integrated campaign for Nike FA18 Phantom (client direct), a spot for the adidas x Parley Run for the Oceans initiative (TBWA Amsterdam) in collaboration with Psyop, and Marshmello’s Fly music video for Disney. Golden Wolf also received an Emmy nomination for its main title sequence for Disney’s DuckTales reboot.

Heading up Golden Wolf’s New York office are two transplants from the London studio, executive producer Dotti Sinnott and art director Sammy Moore. Both joined Golden Wolf in 2015, Sinnott from motion design studio Bigstar, where she was a senior producer, and Moore after a run as a freelance illustrator/designer in London’s agency scene.

Sinnott comments: “Building on the strength of our London team, the Golden Wolf brand will continue to grow and evolve with the fresh perspective of our New York creatives. Our presence on either side of the Atlantic not only brings us closer to existing clients, but also positions us perfectly to build new relationships with New York-based agencies and brands. On top of this, we’re able to use the time difference to our advantage to work on faster turnarounds and across a range of budgets.”

Founded in 2013 by Ingi Erlingsson, the studio’s executive creative director, Golden Wolf is known for youth-oriented work — especially content for social media, entertainment and sports — that blurs the lines of irreverent humor, dynamic action and psychedelia. Erlingsson was once a prolific graffiti artist and, later, illustrator/designer and creative director at U.K.-based design agency ilovedust. Today he inspires Golden Wolf’s creative culture and disruptive style fed in part by a wave of next-gen animation talent coming out of schools such as Gobelins in France and The Animation Workshop in Denmark.

“I’m excited about our affiliation with Psyop, which enjoys an incredible legacy producing industry-leading animated advertising content,” Erlingsson says. “Golden Wolf is the new kid on the block, with bags of enthusiasm and an aim to disrupt the industry with new ideas. The combination of the two studios means that we are able to tackle any challenge, regardless of format or technical approach, with the support of some of the world’s best artists and directors. The relationship allows brands and agencies to have complete confidence in our ability to solve even the biggest challenges.”

Golden Wolf’s initial work out of its New York studio includes spots for Supercell (client direct) and Bulleit Bourbon (Barton F. Graf). Golden Wolf is represented in the US market by Hunky Dory for the East Coast, Baer Brown for the Midwest and In House Reps for the West Coast. Stink represents the studio for Europe.

Main Photo: (L-R) Dotti Sinnott, Ingi Erlingsson and Sammy Moore.

Our SIGGRAPH 2018 video coverage

SIGGRAPH is always a great place to wander around and learn about new and future technology. You can see amazing visual effects reels and learn how the work was created from the artists themselves. You can get demos of new products, and you can immerse yourself in a completely digital environment. In short, SIGGRAPH is educational and fun.

If you weren’t able to make it this year, or attended but couldn’t see it all, we would like to invite you to watch our video coverage from the show.


postPerspective Impact Award winners from SIGGRAPH 2018

postPerspective has announced the winners of our Impact Awards from SIGGRAPH 2018 in Vancouver. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and professionals. It’s working pros who are going to be using new tools — so we let them make the call.

The awards honor innovative products and technologies for the visual effects, post production and production industries that will influence the way people work. They celebrate companies that push the boundaries of technology to produce tools that accelerate artistry and actually make users’ working lives easier.

While SIGGRAPH’s focus is on VFX, animation, VR/AR, AI and the like, the types of gear on display vary. Some products are suited for graphics and animation, while others have uses that slide into post production, which makes these SIGGRAPH Impact Awards doubly interesting.

The winners are as follows:

postPerspective Impact Award — SIGGRAPH 2018 MVP Winner:

They generated a lot of buzz at the show, as well as a lot of votes from our team of judges, so our MVP Impact Award goes to Nvidia for its Quadro RTX raytracing GPU.

postPerspective Impact Awards — SIGGRAPH 2018 Winners:

  • Maxon for its Cinema 4D R20 3D design and animation software.
  • StarVR for its StarVR One headset with integrated eye tracking.

postPerspective Impact Awards — SIGGRAPH 2018 Horizon Winners:

This year we have started a new Impact Award category. Our Horizon Award celebrates the next wave of impactful products being previewed at a particular show. At SIGGRAPH, the winners were:

  • Allegorithmic for its Substance Alchemist tool powered by AI.
  • OTOY and Epic Games for their OctaneRender 2019 integration with Unreal Engine 4.

And while these products and companies didn’t win enough votes for an award, our voters believe they do deserve a mention and your attention: Wrnch, Google Lightfields, Microsoft Mixed Reality Capture and Microsoft Cognitive Services integration with PixStor.

 

Artifex provides VFX limb removal for Facebook Watch’s Sacred Lies

Vancouver-based VFX house Artifex Studios created CG amputation effects for the lead character in Blumhouse Productions’ new series for Facebook Watch, Sacred Lies. In the show, the lead character, Minnow Bly (Elena Kampouris), emerges after 12 years in the Kevinian cult missing both of her hands. Artifex was called on to remove the actress’ limbs.

VFX supervisor Rob Geddes led the Artifex team that created the hand/stump transposition, which encompassed 165 shots across the series. This involved detailed paint work to remove the real hands, while Artifex 3D artists simultaneously performed tracking and matchmoving in SynthEyes to align the CG stump assets to the actress’ forearms.

This was followed up with some custom texture and lighting work in Autodesk Maya and Chaos V-Ray to dial in the specific degree of scarring or level of healing on the stumps, depending on each scene’s context in the story. While the main focus of Artifex’s work was on hand removal, the team also created a pair of severed hands for the first episode after rubber prosthetics didn’t pass the eye test. VFX work was run through Side Effects Houdini and composited in Foundry’s Nuke.

“The biggest hurdle for the team during this assignment was working with the actress’ movements and complex performance demands, especially the high level of interaction with her environment, clothing or hair,” says Adam Stern, founder of Artifex. “In one visceral sequence, Rob and his team created the actual severed hands. These were originally shot practically with prosthetics, however the consensus was that the practical hands weren’t working. We fully replaced these with CG hands, which allowed us to dial in the level of decomposition, dirt, blood and torn skin around the cuts. We couldn’t be happier with the results.”

Geddes adds, “One interesting thing we discovered when wrangling the stumps, is that the logical and accurate placement of the wrist bone of the stumps didn’t necessarily feel correct when the hands weren’t there. There was quite a bit of experimentation to keep the ‘hand-less’ arms from looking unnaturally long, or thin.”

Artifex also created a scene of absolute devastation in a burnt forest for Episode 101, using matte painting and set extension to depict extensive fire damage that couldn’t safely be achieved on set. The studio drew on its experience in environmental VFX, tying matte paintings and projections together with ample rotoscope work.

Approximately 20 Artifex artists took part in Sacred Lies across 3D, compositing, matte painting, I/O and production staff.

Watch Artifex founder Adam Stern talk about the show from the floor of SIGGRAPH 2018:

Sony Pictures Post adds three theater-style studios

Sony Pictures Post Production Services has added three theater-style studios inside the Stage 6 facility on the Sony Pictures Studios lot in Culver City. All studios feature mid-size theater environments and include digital projectors and projection screens.

Theater 1 is set up for sound design and mixing with two Avid S6 consoles and immersive Dolby Atmos capabilities, while Theater 3 is geared toward sound design with a single S6. Theater 2 is designed for remote visual effects and color grading review, allowing filmmakers to monitor ongoing post work at other sites without leaving the lot. Additionally, centralized reception and client services facilities have been established to better serve studio sound clients.

Mix Stage 6 and Mix Stage 7 within the sound facility have been upgraded, each featuring two S6 mixing consoles, six Pro Tools digital audio workstations, Christie digital cinema projectors, 24 X 13 projection screens and a variety of support gear. The stages will be used to mix features and high-end television projects. The new resources add capacity and versatility to the studio’s sound operations.

Sony Pictures Post Production Services now has 11 traditional mix stages, the largest being the Cary Grant Theater, which seats 344. It also has mix stages dedicated to IMAX and home entertainment formats. The department features four sound design suites, 60 sound editorial rooms, three ADR recording studios and three Foley stages. Its Barbra Streisand Scoring Stage is among the largest in the world and can accommodate a full orchestra and choir.

Patrick Ferguson joins MPC LA as VFX supervisor

MPC’s Los Angeles studio has added Patrick Ferguson to its staff as visual effects supervisor. He brings with him experience working in both commercials and feature films.

Ferguson started out in New York and moved to Los Angeles in 2002, and he has since worked at a range of visual effects houses along the West Coast, including The Mission, where he was VFX supervisor, and Method, where he was head of 2D. “No matter where I am in the world or what I’m working on, one thing has remained consistent since I started working in the industry: I still love what I do. I think that’s the most important thing.”

Ferguson has collaborated with directors such as Stacy Wall, Mark Romanek, Melina Matsoukas, Brian Billow and Carl Rinsch, and has worked on campaigns for big global brands, including Nike, Apple, Audi, HP and ESPN.

He has also worked on high-profile films, including Pirates of the Caribbean and Alice in Wonderland, and he was a member of the Academy Award-winning team for The Curious Case of Benjamin Button.

“In this new role at MPC, I hope to bring my varied experience of working on large scale feature films as well as on commercials that have a much quicker turnaround time,” he says. “It’s all about knowing what the correct tools are for the particular job at hand, as every project is unique.”

For Ferguson, there is no substitute for being on set: “Being on set is vital, as that’s when key relationships are forged between the director, the crew, the agency and the entire team. Those shared experiences go a long way in creating a trust that is carried all the way through to end of the project and beyond.”

Breathing life into The Walking Dead with VFX

By Karen Moltenbrey

Zombies used to have a short life span, awakening sometime during October, just in time for Halloween, before once again stumbling back into obscurity for another year. But thanks to the hit series The Walking Dead and its spin-off Fear the Walking Dead, the popularity of these monsters is infectious, turning them — and the shows — into cult phenomena.

The Walking Dead’s rise in popularity started almost immediately with the series’ US debut on October 31, 2010, on AMC. The storyline begins when Rick Grimes, a sheriff’s deputy, awakens from a coma to find the world overrun by zombies. He and other survivors in the Atlanta area then band together to fight off these so-called “walkers,” as well as rival tribes of survivors intent on ensuring their own survival in this post-apocalyptic world, no matter the cost.

In mid-2015, the show gave rise to the companion series Fear the Walking Dead. Fear, a prequel to The Walking Dead, takes place at the start of the zombie apocalypse and follows a different set of characters as they struggle to survive on the West Coast.

And the series’ visual effects are, well, to die for. Literally.

Burbank’s Picture Shop began creating the effects for Walk last season and is now in the midst of Season 9. (The studio splits the lion’s share of the work on the show with Goodbye Kansas Studios.) Picture Shop provides visual effects for Fear as well (Season 4, Episodes 1 through 8).

According to Christian Cardona, senior visual effects supervisor at Picture Shop, the crux of the work for both series includes character “kill” effects and environment augmentations. “We do a lot of what we call ‘walker kills.’ What that usually requires is weapon extensions, whereby the weapon gets inserted into the walkers, and then the ensuing wounds. We have to track the wounds onto the practical walkers and then also do blood sims during those kills,” he explains. “That accounts for probably 50 to 60 percent of the work.”

The only way to kill a walker is to damage its brain or destroy its body. Therefore, each episode contains its fair share of bodily damage to the zombies (and, sometimes, to the humans trying to keep them and the adversarial tribes at bay).

Cardona notes that it took the group a few episodes to nail down the blood aesthetic the producers were looking for — the look of the blood as well as how it should flow from a walker’s mouth. “Throughout the season, we’ve definitely zeroed in on that and have really gotten a good system down, where now it takes us just a fraction of the time,” he says.

Nevertheless, the work has to be exact. “The client wants everything to be photoreal; they don’t want it to look like anything was added. With that said, they scrutinize every shot, frame by frame and pixel by pixel, so we definitely have to do our due diligence and make sure our work adheres to their standard,” Cardona says. “And they will art direct every drop of blood. They know what they want, and we make sure we deliver that.”

The Walking Dead

There is indeed consistency in the overall look of the blood and wounds, as well as the walkers themselves, in each of the shows, although there was one scene in Walk in which the walkers were stuck in a toxic sludge. As a result, their skin was paler and saggier. “It was a unique scenario where we could change their appearance and the look of the blood for that episode to illustrate that these walkers were different,” recalls Cardona.

The Walking Dead
In Walk, the majority of the hero walkers — either individuals or those in small groups — are actors with prosthetic makeup or, in some cases, practical models. But when there are more than two dozen or so walkers in a shot, they are CG creations.

Whether a practical or digital walker is used for the show often depends on the action in the scene. Walker kills are prevalent throughout the series, as they are in Fear, and often this involves a decapitation or a scalping, “because in order to kill the walkers, they have to be either shot in the head or stabbed,” Cardona points out. “Oftentimes when that happens, we have to cut from the person with the makeup and prosthetics and replace the entire head digitally before chopping it off for the scene.”

Season 8 Episode 14 featured a practical walker with a broken body strapped to a dolly cart, its head locked in position on the cart. As a form of torture, the cart was wheeled close enough for the walker to bite a victim. Initially, the walker was practical, but Picture Shop artists ended up replacing it with a CG model.

“The client wasn’t really happy with the practical version on set. It looked rubbery, like an animatronic walker, which it was. It needed to be fleshier, a little more real,” says Cardona. “The blood and wounds felt too dry, and the muscles had to contract.”

Fear the Walking Dead

In fact, Cardona describes that sequence as one of the more challenging from Season 8. “It was close to the camera and had to be photoreal. It wasn’t just one shot, either; there were over a dozen shots in the scene, with multiple angles and long takes.”

The team used the practical cart walker’s head in the shots but replaced the body and arms, which also had been locked to the cart, requiring intricate match-moving. “We had to be spot on and tight, so there was a lot of soft tracking as well, since the camera was moving everywhere,” recalls Cardona. “And then we had to get that walker to look photoreal.”

Compounding this scene even more was the fact that the main character shoots the cart walker, requiring the addition of bullet hits and resulting wounds.

Zombie Nation
For the upcoming Season 9 (premiering October 7), Picture Shop began using a new system for generating large crowds of walkers. According to Cardona, the animators introduced Golaem’s population tool into the pipeline to help with the mass crowd simulations for the walker herds that will appear during the season. Previously, the artists used the particle system in Autodesk’s Maya for this task, “but we needed something that was more robust, something created specifically to do this kind of effect, especially on a TV schedule and with a TV pipeline and workflow,” he adds.

During this past off-season, the artists began establishing a system that would source the walker assets the group had created for Season 8 and in the off-season to prepare for Season 9. “We were modeling walkers and using some of the walkers from Season 8, and standardizing them all with the same T-pose so we could easily swap out rigs and customize and create a lot of variations with textures, changing their clothes, skin color and hair,” explains Cardona. “So when we have to create walker herds, we can easily get five or six variations from a single walker. We end up with a lot of variations in our herd sims without having to create a brand-new walker every time.”
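The payoff of that standardization is combinatorial: once every base mesh shares the same T-pose and rig interface, skins and outfits can be mixed freely. A hypothetical sketch of the idea in plain Python (the function and asset names are invented for illustration, not Picture Shop’s actual tooling):

```python
import itertools

def walker_variants(base_models, skins, outfits):
    """Every combination of a standardized (same T-pose) base mesh,
    skin texture and outfit becomes a distinct crowd agent."""
    return [
        {"model": m, "skin": s, "outfit": o}
        for m, s, o in itertools.product(base_models, skins, outfits)
    ]

# one base walker, two skins, three outfits -> six distinct agents
variants = walker_variants(["walker_a"],
                           ["skin_pale", "skin_dry"],
                           ["coat", "rags", "uniform"])
```

A herd built from even a handful of base walkers this way yields dozens of visually distinct agents without modeling a new character each time.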

And make no mistake, there will be more herds of walkers in Season 9 than viewers have seen in previous seasons.

The Walking Dead

Other VFX
On average, Picture Shop created 40 to 50 VFX shots per episode during Season 8 of The Walking Dead, with a typical turnaround time of two to three weeks. In addition to the kills and their associated effects, the group also built set extensions. For instance, most of Season 8 (and 7) revolved around what is called “the Sanctuary,” an old factory that is now home to The Saviors, with whom Grimes and his group must interact. The first two floors were practically built, and then Picture Shop extended the structure digitally by another 12 stories. At one point, a gun fight ensues, which called for the CG artists to break out all the windows — another visual effect.

“The group didn’t move far from their location in Season 8, so the need for additional set extensions wasn’t as high as it had been in earlier seasons,” adds Cardona.

Season 8 of Walk — which has the tribes fighting each other more so than the walkers — did start off with a bang, however. In the first episode of the season, Daryl, Grimes’ trusted lieutenant, shoots a box of explosives, ripping a CG walker in half — work that Cardona describes as “challenging.” In another shot, an RPG blows up another tribe member; in it, the actor had to be swapped out for a digital double that had been projection modeled.

Cardona has seen an evolution in the effects Picture Shop is providing for Walk in particular. Some of the more interesting effects will be coming in Season 9, he teases. “We’ve been working on some of the stuff over the summer and have spent time in R&D on one effect in particular — something new that the audience hasn’t seen before,” he says.

In the upcoming season, nature is taking over, and there will be overgrown vegetation on all the buildings and structures. “With that said, we are doing an effect whereby we take a murder of crows and have them swarm similar to the murmuration of starlings, which wheel and dart through the sky in tight, fluid formations,” says Cardona. “This is something you will see through the course of the season.”

 

The Walking Dead

For this work, the artists built and rigged a crow in Maya and generated various animation cycles, which were cached out and used as a particle simulation within Side Effects’ Houdini.

While creating this effect was not nearly as time-consuming as setting up the crowd simulation in Golaem, “it was something unique, and we had to figure out an approach that gave us the flexibility to art direct and change [the results] quickly,” Cardona notes. “And, it’s an effect that has nothing to do with walkers, but it tells a big part of the story of what is happening to the world around them.”
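Murmuration-style motion of this kind is classically approximated with the three “boids” steering rules: cohesion, alignment and separation. The following is a minimal, illustrative 2D sketch in plain Python — a general-approach assumption for clarity, not Picture Shop’s actual Maya/Houdini setup:

```python
import random

def simulate_flock(n_birds=50, n_steps=100, seed=7):
    """Toy 2D boids: each bird steers toward the flock's average position
    (cohesion), matches the average velocity (alignment) and pushes away
    from birds closer than 2 units (separation)."""
    random.seed(seed)
    pos = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(n_birds)]
    vel = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(n_birds)]
    for _ in range(n_steps):
        cx = sum(p[0] for p in pos) / n_birds   # flock center
        cy = sum(p[1] for p in pos) / n_birds
        avx = sum(v[0] for v in vel) / n_birds  # average heading
        avy = sum(v[1] for v in vel) / n_birds
        for i in range(n_birds):
            # cohesion + alignment, with small gains to keep motion fluid
            vel[i][0] += (cx - pos[i][0]) * 0.01 + (avx - vel[i][0]) * 0.05
            vel[i][1] += (cy - pos[i][1]) * 0.01 + (avy - vel[i][1]) * 0.05
            # separation: push away from any bird closer than 2 units
            for j in range(n_birds):
                dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
                if i != j and dx * dx + dy * dy < 4.0:
                    vel[i][0] += dx * 0.05
                    vel[i][1] += dy * 0.05
        for i in range(n_birds):
            pos[i][0] += vel[i][0]
            pos[i][1] += vel[i][1]
    return pos, vel
```

In a production pipeline, each simulated point would then drive one of the cached crow animation cycles rather than a drawn dot, which is what gives the swarm its art-directable, fluid formations.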

In terms of the other overall effects in Walk and Fear, the Picture Shop artists use primarily Autodesk’s 3ds Max, although Maya is also used, albeit mainly for the Golaem crowd work. Yet once the sim is complete, the artists cache the results and bake out all the animation within Maya, then export it into Max. Rendering is done in Chaos’ V-Ray for 3ds Max.

In addition, the artists use Pixologic’s ZBrush for a lot of the organic modeling, mostly for the walkers. For effects, the crew usually turns to Houdini.

The effects that Picture Shop delivers for The Walking Dead are unique — in terms of the blood and gore — from other shows the studio works on, such as Hawaii Five-0 and MacGyver, which call for more traditional VFX, like muzzle flashes and explosions, as well as set extensions. “It’s more hard-surface modeling stuff, whereas Walk, for the most part, is a lot more organic,” Cardona adds. “And the tone is completely different, obviously.”

Monster Mash
Picture Shop performs the same type of work for Fear as it does for Walk — mostly weapon extensions and walker kills. Cardona notes there is a cohesiveness in the effects between the two shows, especially now that the timeline of both stories is nearly the same. Fear started at the beginning of the apocalypse, when the walkers were “fresher, and the blood kills were a little bigger, because the thinking was that there would be more blood present in the walkers at that point,” he explains.

Insofar as the general walker kills are concerned, the actors never really make a physical impact with their stabbing motions, so oftentimes their hands are not in the right positions or the reaction time of the walker is off. In these instances, it is up to the VFX artists to digitally rectify the action and reaction — for instance, separating and repositioning the actor’s arm, stabilizing it, then tracking on the weapon that is placed in their hand, as well as stabilizing the walker, adding the wound, and then adding the weapon extension. This holds true for both series.

“Sometimes this can be challenging because the camera is also moving, so it requires significant roto work, and we have to deconstruct the shot and reconstruct it back again,” says Cardona. “There are plenty of these types of shots we have to do for both Walk and Fear.”

For the tracking on this and other work, the studio uses Andersson Technologies’ SynthEyes; for the planar tracking, Boris FX’s Mocha, formerly from Imagineer Systems.

According to Cardona, the overall look of the walkers has changed throughout the course of the Fear seasons. In Walk, they are more decayed, whereas in Fear they still look mostly human, not as skeletal. That has now changed, and a more cohesive look between the walkers has brought a more cohesive look to some of the VFX, particularly the blood effects, going forward.

However, there is one big difference between the two shows. Walk is still shot on film, 16mm, while Fear is shot digitally, so the graininess is quite heavy with Walk. “It affects our tracking because there is so much noise. Often we would de-grain [the footage] to do the tracking, and then add the grain back in. Also, any time we have a bluescreen shot where we have to pull keys, it’s a problem,” says Cardona.
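The de-grain/re-grain round trip Cardona describes can be shown with a toy 1D example — a stand-in for real denoise tools, not the studio’s pipeline. Subtracting a smoothed copy of the frame isolates the grain layer, and because the split is exact, the grain adds back losslessly after tracking and comp work on the clean plate:

```python
def split_grain(frame, radius=1):
    """Box-blur the frame to approximate a clean plate; the residual
    (frame - clean) is the grain layer to re-apply later."""
    clean = []
    for i in range(len(frame)):
        window = frame[max(0, i - radius): i + radius + 1]
        clean.append(sum(window) / len(window))
    grain = [f - c for f, c in zip(frame, clean)]
    return clean, grain

# round trip: clean + grain reconstructs the original exactly
frame = [0.2, 0.9, 0.1, 0.8, 0.3]
clean, grain = split_grain(frame)
restored = [c + g for c, g in zip(clean, grain)]
```

Trackers lock onto the stable features of the clean plate instead of the frame-to-frame noise, which is why the detour pays off despite the extra step.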

Indeed, honing the effects on The Walking Dead gives the artists a leg up when it comes to Fear. Another advantage: Picture Shop performs the color and finishing for both shows, as well, which can result in some emergency VFX work for the crew, especially during a time crunch. In fact, the colorists at the facility created the custom 16mm grain pattern that the artists use now during the tracking process. It was generated by the colorists when the client was considering migrating Walk to digital format but then decided to retain the current structure.

Another plus: The Walking Dead executives are also located in the same building as Picture Shop, several floors up. “They just moved here, and it’s convenient for everyone. We can do spot sessions with VFX producer Jason Sax or showrunner Angela Kang (who recently took over that role from Scott Gimple).”

At Comic-Con San Diego a few weeks ago, the series was a fan favorite, and online there is talk about plans for upcoming seasons. So, it appears these walkers still have a lot of life left in them, if fans — and the digital artists — have their way.


Karen Moltenbrey is a longtime writer and editor in the CG and post industries.

Using VFX to bring the new Volkswagen Jetta to life

LA-based studio Jamm provided visual effects for the all-new 2019 Volkswagen Jetta campaign Betta Getta Jetta. Created by Deutsch and produced by ManvsMachine, the series of 12 spots brings the Jetta to life by combining Jamm’s CG design with a color palette inspired by the car’s 10-color ambient lighting system.

“The VW campaign offered up some incredibly fun and intricate challenges. Most notable was the volume of work to complete in a limited amount of time — 12 full-CG spots in just nine weeks, each one unique with its own personality,” says VFX supervisor Andy Boyd.

Collaboration was key to delivering so many spots in such a short span of time. Jamm worked closely with ManvsMachine on every shot. “The team had a very strong creative vision which is crucial in the full 3D world where anything is possible,” explains Boyd.

Jamm employed a variety of techniques for the music-centric campaign, which highlights updated features such as ambient lighting and Beats Audio. The series includes spots titled Remix, Bumper-to-Bumper, Turb-Whoa, Moods, Bass, Rings, Puzzle and App Magnet, along with 15-second teasers, all of which aired on various broadcast, digital and social channels during the World Cup.

For “Remix,” Jamm brought both a 1985 and a 2019 Jetta to life, along with a hybrid mix of the two, adding a cool layer of turntablist VFX, whereas for “Puzzle,” they cut up the car procedurally in Houdini​, which allowed the team to change around the slices as needed.
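The procedural slicing Boyd describes typically means binning geometry into slabs along an axis so the slices can be reordered independently. Here is a minimal plain-Python sketch of that idea — a hypothetical stand-in, since the article doesn’t detail the actual Houdini network:

```python
# Toy sketch of procedural slicing: assign each point of a model to a
# slab along the X axis, so whole slices can be shuffled independently.
# (Hypothetical stand-in for the Houdini setup described in the article.)

def slice_index(x, x_min, x_max, num_slices):
    """Map an x coordinate to a slice number in [0, num_slices)."""
    t = (x - x_min) / (x_max - x_min)
    return min(int(t * num_slices), num_slices - 1)

def reorder_slices(points, num_slices, new_order, x_min=0.0, x_max=1.0):
    """Move each (x, y, z) point into the slab position given by new_order."""
    width = (x_max - x_min) / num_slices
    out = []
    for x, y, z in points:
        idx = slice_index(x, x_min, x_max, num_slices)
        offset = x - (x_min + idx * width)           # position within its slab
        new_x = x_min + new_order[idx] * width + offset
        out.append((new_x, y, z))
    return out
```

With `new_order = [3, 1, 2, 0]`, for example, the first and last slices swap places while everything inside each slice stays intact — which is what makes it cheap to “change around the slices as needed.”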

For “Bass,” Jamm helped bring personality to the car while keeping its movements grounded in reality. Animation supervisor Stew Burris pushed the car’s performance and dialed in the choreography of the dance with ManvsMachine as the Jetta discovered the beat, adding exciting life to the car as it bounced to the bassline and hit the switches on a little three-wheel motion.

We reached out to Jamm’s Boyd to find out more.

How early did Jamm get involved?
We got involved as soon as agency boards were client approved. We worked hand in hand with ManvsMachine to previs each of the spots in order to lay the foundation for our CG team to execute both the agency’s and the director’s vision.

What were the challenges of working on so many spots at once?
The biggest challenge was for editorial to keep up with the volume of previs options we gave them to present to agency.

Other than Houdini, what tools did they use?
Flame, Nuke and Maya were used as well.

What was your favorite spot of the 12 and why?
Puzzle was our favorite to work on. It was the last of the bunch delivered to Deutsch, and we treated it with a more technical approach, slicing up the car like a Rubik’s Cube.


2nd-gen AMD Ryzen Threadripper processors

At the SIGGRAPH show, AMD announced the availability of its 2nd-generation AMD Ryzen Threadripper 2990WX processor with 32 cores and 64 threads. These new AMD Ryzen Threadripper processors are built using 12nm “Zen+” x86 processor architecture. Second-gen AMD Ryzen Threadripper processors support the most I/O and are compatible with existing AMD X399 chipset motherboards via a simple BIOS update, offering builders a broad choice for designing the ultimate high-end desktop or workstation PC.

The 32-core/64-thread Ryzen Threadripper 2990WX and the 24-core/48-thread Ryzen Threadripper 2970WX are purpose-built for prosumers who crave raw compute power to dispatch the heaviest workloads. The 2nd-gen AMD Ryzen Threadripper 2990WX offers up to 53 percent faster multithreaded performance and up to 47 percent more rendering performance for creators than the Core i9-7980XE.

This new AMD Ryzen Threadripper X series comes with higher base and boost clocks for users who need high performance. The 16 cores and 32 threads in the 2950X model offer up to 41 percent more multithreaded performance than the Core i9-7900X.
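The quoted percentages are AMD’s own benchmark figures, but the general shape of multicore scaling follows Amdahl’s law: the serial fraction of a workload caps how much extra cores can buy. A toy illustration (illustrative only, not a model of any quoted benchmark):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's-law speedup: the serial part (1 - p) never gets faster,
    so it bounds the benefit of adding cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A render that is 95% parallel: ~12.5x on 32 cores vs. ~9.1x on 16,
# so doubling cores yields well under a 2x gain once serial work dominates.
```

This is why rendering and simulation — which are close to fully parallel — benefit most from a 32-core part, while lightly threaded tasks lean on the higher boost clocks instead.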

Additional performance and value come from:
• AMD StoreMI technology: All X399 platform users will now have free access to AMD StoreMI technology, enabling configured PCs to load files, games and applications from a high-capacity hard drive at SSD-like read speeds.
• Ryzen Master Utility: Like all AMD Ryzen processors, the 2nd-generation AMD Ryzen Threadripper CPUs are fully unlocked. With the updated AMD Ryzen Master Utility, AMD has added new features, such as fast core detection both on die and per CCX; advanced hardware controls; and simple, one-click workload optimizations.
• Precision Boost Overdrive (PBO): A new performance-enhancing feature that allows multithreaded boost limits to be raised by tapping into extra power delivery headroom in premium motherboards.

With a simple BIOS update, all 2nd-generation AMD Ryzen Threadripper CPUs are supported by a full ecosystem of new motherboards and all existing X399 platforms. Designs are available from top motherboard manufacturers, including ASRock, ASUS, Gigabyte and MSI.

The 32-core, 64-thread AMD Ryzen Threadripper 2990WX is available now from global retailers and system integrators. The 16-core, 32-thread AMD Ryzen Threadripper 2950X processor is expected to launch on August 31, and the AMD Ryzen Threadripper 2970WX and 2920X models are slated for launch in October.

NIM 3.0 studio management tool debuts at Siggraph

NIM Labs’ new NIM 3.0 is an all-in-one studio management platform that helps visual effects and post production houses track projects and manage their company. Created by creative directors and studio heads, NIM 3.0 helps studios do more under one roof with a redesigned review tool, grouped bidding and a live link to Adobe Premiere.

What began as an in-house toolset for Ntropic is now the studio management software in-house at Digital Domain, Logan, Taylor James and Intelligent Creatures. With NIM 3.0, studios get access to tools to handle bidding, tracking, versioning, scheduling, reviews, finances and time cards in one place.

The creative review tool has been rewritten from the ground up to make screening and markup of videos, PDFs and still frames even easier. Review capabilities have also been extended to more parts of NIM, so users can do more in a single location. With Review Bins, teams can stay organized using saveable smart filters and grouped elements for later use. Review Bins come with a new Theater View, which allows for a larger viewer experience and the ability to zoom, fit, fill and pan review items during playback. Review items can also be stacked according to version, allowing supervisors and clients to quickly visualize progress.

After years of building NIM Connectors into Maya, Houdini, 3ds Max, Nuke and Flame, NIM Labs is introducing its first direct workflow for Adobe Premiere editors. With NIM 3.0, Adobe Premiere becomes a conform tool, helping VFX, VR and post houses do everything from creating timeline shots to round-tripping elements rendered in other packages, all without leaving the app. Thanks to the new NIM Connector, the entire creative review process can be conducted from within Premiere, creating a seamless experience. Premiere is NIM’s third Adobe connection, following After Effects and Photoshop.

NIM’s bidding system was created to get bids out the door faster using studio-wide templates. In NIM 3.0, new organizational tools let producers define the different sections of their bid using groups, allowing greater flexibility and speed when responding to a large RFP. Through item linking, users will be able to modify multiple line items simultaneously, with the ability to attach further information like images, notes and descriptions when the need arises.

Companies with existing directory systems can now immediately integrate NIM with their users. By accessing NIM 3.0’s on-board security controls, organizations can manage permissions and security groups from Active Directory (AD) or Lightweight Directory Access Protocol (LDAP). Support for multiple domains will add even more authentication controls across networks.

NIM 3.0 will be available in the fall. Active user licenses cost $360 annually or $40 monthly. NIM 2.8 is currently available as a free 30-day trial.

Behind the Title: Jogger Studios’ CD Andy Brown

This veteran creative director can also often be found at the controls of his Flame working on a new spot.

NAME: Andy Brown

COMPANY: Jogger Studios (@joggerstudios)

CAN YOU DESCRIBE YOUR COMPANY?
We are a boutique post house with offices in the US and UK providing visual effects, motion graphics, color grading and finishing. We are partnered with Cut + Run for editorial and get to work with their editors from around the world. I am based in our Jogger Los Angeles office, after having helped found the company in London.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
Overseeing compositing, visual effects and finishing. Looking after staff and clients. Juggling all of these things and anticipating the unexpected.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I’m still working “on the box” every day. Even though my title is creative director, it is the hands-on work that is my first love as far as project collaborations go. Also I get to re-program the phones and crawl under the desks to get the wires looking neater when viewed from the client couch.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The variety, the people and the challenges. Just getting to work on a huge range of creative projects is such a privilege. How many people get to go to work each day looking forward to it?

WHAT’S YOUR LEAST FAVORITE?
The hours, occasionally. It’s more common to have to work without clients nowadays. That definitely makes for more work sometimes, as you might need to create two or three versions of a spot to get approval. If everyone is in the room together, you reach a consensus more quickly.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I like the start of the day best, when everyone is coming into the office and we are getting set up for whatever project we are working on. Could be the first coffee of the day that does it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I want to say classic car dealer, but given my actual career path the most likely alternative would be editor.

WHY DID YOU CHOOSE THIS PROFESSION?
There were lots of reasons, when I look at it. It was either the Blue Peter Book of Television (the longest running TV program for kids, courtesy of the BBC) or my visit to the HTV Wales TV station with my dad when I was about 12. We walked around the studios and they were playing out a film to air, grading it live through a telecine. I was really struck by the influence that the colorist was having on what was seen.

I went on to do critical work on photography, film and television at the Centre for Contemporary Cultural Studies at Birmingham University. Part of that course involved being shown around the Pebble Mill BBC Studios. They were editing a sequence covering a public enquiry into the Handsworth riots in 1985. It just struck me how powerful the editing process was. The story could be told so many different ways, and the editor was playing a really big part in the process.

Those experiences (and an interest in writing) led me to think that television might be a good place to work. I got my first job as a runner at MPC after a friend had advised me how to get a start in the business.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We worked on a couple of spots for Bai recently with Justin Timberlake creating the “brasberry.” We had to make up some graphic animations for the newsroom studio backdrop for the shoot and then animate opening title graphics to look just enough like it was a real news report, but not too much like a real news report.

We do quite a bit of food work, so there’s always some burgers, chicken or sliced veggies that need a bit of love to make them pop.

There’s a nice set extension job starting next week, and we recently finished a job with around 400 final versions, which made for a big old deliverables spreadsheet. There’s so much that we do that no one sees, which is the point if we do it right.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes the job that you are most proud of isn’t necessarily the most amazing thing to look at. I used to work on newspaper commercials back in the UK, and it was all so “last minute.” A story broke, and all of a sudden you had to have a spot ready to go on air with no edit, no footage and only the bare bones of a script. It could be really challenging, but we had to get it done somehow.

But the best thing is seeing something on TV that you’ve worked on. At Jogger Studios, it is primarily commercials, so you get that excitement over and over again. It’s on air for a few weeks and then it’s gone. I like that. I saw two of our spots in a row recently on TV, which I got a kick out of. Still looking for that elusive hat-trick.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The Flame, the Land Rover Series III and, sadly, my glasses.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Just friends and family on Instagram, mainly. Although like most Flame operators, I look at the Flame Learning Channel on YouTube pretty regularly. YouTube also thinks I’m really interested in the Best Fails of 2018 for some reason.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
More often than not it is podcasts. West Wing Weekly, The Adam Buxton Podcast, Short Cuts and Song Exploder. Plus some of the shows on BBC 6 Music, which I really miss.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I go to work every day feeling incredibly lucky to be doing the job that I do, and it’s good to remember that. The 15-minute walk to and from work in Santa Monica usually does it.

Living so close to the beach is fantastic. We can get down to the sand, get the super-brella set up and get in the sea with the bodyboards in about 15 minutes. Then there’s the Malibu Cars & Coffee, which is a great place to start your Sunday.

The Darkest Minds director Jennifer Yuh Nelson

By Iain Blair

Jennifer Yuh Nelson has been an acclaimed — and highly bankable — director in animation for years, thanks to her work on the billion-dollar-grossing Kung Fu Panda franchise.

Now she’s taken on her first live-action film with Fox’s The Darkest Minds. Adapted from the best-selling book by Alexandra Bracken, the first in a YA trilogy, the film stars Amandla Stenberg in the lead as Ruby, along with Harris Dickinson, Miya Cech and Skylan Brooks.

The Darkest Minds also features adults, including Mandy Moore and Bradley Whitford, and revolves around a group of teens who mysteriously develop powerful new abilities and who are then declared a threat by the government and detained. It’s essentially a genre mash-up — a road movie with some sci-fi elements and lots of kinetic action. It was written by Chad Hodge, best known for his work as the creator and executive producer of TNT’s Good Behavior and Fox’s Wayward Pines.

Nelson’s creative team included DP Kramer Morgenthau (Terminator Genisys, Thor: The Dark World), editors Maryann Brandon (Star Wars: The Force Awakens) and Dean Zimmerman (Stranger Things), and visual effects supervisor Björn Mayer (Oblivion). Fox-based 21 Laps’ (Stranger Things, Arrival) Shawn Levy and Dan Levine produced.

I recently spoke with Nelson about making the film.

What sort of film did you set out to make?
To start off with, I wanted a great emotional core, and as this was based off a book, it already had that built in… even in early versions of the script. It had great characters with strong relationships, and I wanted to do some action stuff.

Any big surprises making the move to a major live-action film, or were you pretty prepared in terms of prep thanks to your background in animation?
I was pretty prepared, and the prep’s essentially the same as in animation. But, of course, production is utterly different, along with the experience of being on location. I had a really great crew and a fantastic DP, which helped me a lot. The big difference is suddenly you have the luxury of coverage, which you don’t get in animation. There you need to know exactly what you want, as it’s so expensive to create. Being outside all day on location, and dealing with the elements and crew and cast all at once — that was a big learning curve, but I really loved it. I had a fantastic time!

What were the main technical challenges in pulling it all together?
There were a lot of moving parts, and the main one was probably all the VFX involved. It’s a very reality-based book. It’s not set in outer space, and it’s supposed to look very grounded and seamless with reality. So you have these characters with superpowers that are meant to be very believable, but then we had fire, flamethrowers, 300 extras running around, wind machines and so on. Then there was all the fire work we had to add later in post.

How early on did you start integrating post and all the VFX?
Right at the start, and my VFX super Björn Mayer was so smart about figuring out ways to get really cool looks. We tried out a ton of visual approaches. Some were done in camera, most were done in post or augmented in post – especially all the fire effects. It was intense reality, not complete reality, that we aimed for, so we had some flexibility.

I assume you did a lot of previs?
Quite a lot, and that was also a big help. We did full-3D previs, like we do in animation, so I was pretty used to that. We also storyboarded a big chunk of the movie, including scenes that normally you wouldn’t have to storyboard because I wanted to make completely sure we were covered on everything.

Didn’t you start off as a storyboard artist?
I did, and my husband’s one too, so I roped him in and we did the whole thing ourselves. It’s just an invaluable tool for showing people what’s going on in a director’s head, and when they’ve seen the pictures they can then offer creative ideas as everyone knows what you’re trying to achieve.

How tough was the shoot?
We shot in Atlanta, and it went smoothly considering there’s always unexpected things. We had freak thunderstorms and a lot of rain that made some sets sink and so on, but it’s how you respond to all that that counts. Everyone was calm and organized.

Where did you post?
Here in LA. We rented some offices near my home and just set up editorial and all our VFX there. It was very convenient.

In a sense, animation is all post, so you must love the post process?
You’re right – animation is like a long-running post for the whole production. I love post because it’s so transformative, and it’s beautiful to see all the VFX get layered in and see the movie suddenly come to life.

Talk about editing this with two editors. How did that work?
Maryann was on the set with us, working as we shot, and then Dean came on later in post, so we had a great team.

What were the big editing challenges?
I think the big one was making all the relationships believable over the course of the film, and so much of it is very subtle. It can come down to just a look or a moment, and we had to carefully plot the gradations and work hard to make it all feel real.

All the VFX play a big role. How many were there?
Well over 2,000 I think, and MPC and Crafty Apes did most of them. I loved working on them with my VFX supervisor. It’s very similar to working with them in animation, which is essentially one big VFX show. So I was very familiar with the process, although integrating them into live action instead of a virtual world is quite different. I loved seeing how it all got integrated so seamlessly.

Can you talk about the importance of sound and music?
It was so important to me, and we had quite a few songs in the film because it’s partly a road trip. There’s the big dance scene where we found a great song and then were able to shoot to the track. We mixed all the sound on the Fox lot.

Where did you do the DI and how important is it to you?
At Technicolor, and I’m pretty involved, although I don’t micro-manage. I’d give notes, and we’d make some stuff pop a bit more and play around with the palette, but basically it went pretty quickly as what we shot already looked really sweet.

Did the film turn out the way you hoped?
It did, and I can’t wait to do another live-action film. I adore animation, but live action’s like this new shiny toy.

You’re that Hollywood rarity — a successful female director. What advice would you give to young women who want to direct?
Do what makes you happy. Don’t do it just because someone says “you can” or “you can’t.” You’ve got to have that personal desire to do this job, and it’s not easy and I don’t expect change to come very quickly to Hollywood. But it is coming.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Weta Digital VFX supervisor Erik Winquist

NAME: Erik Winquist

COMPANY: Wellington, New Zealand’s Weta Digital

CAN YOU DESCRIBE YOUR COMPANY?
We’re currently a collection of about 1,600 ridiculously talented artists and developers down at the bottom of the world who have created some of the most memorable digital characters and visual effects for film over the last couple of decades. We’re named after a giant New Zealand bug.

WHAT’S YOUR JOB TITLE?
Visual Effects Supervisor

WHAT DOES THAT ENTAIL?
Making the director and studio happy without making my crew unhappy. Ensuring that everybody on the shoot has the same goal in mind for a shot before the cameras start rolling is one way to help accomplish both of those goals. Using the strengths and good ideas of everybody on your team is another.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
The amount of problem solving that is required. Every show is completely different from the last. We’re often asked to do something and don’t know how we’re going to accomplish it at the outset. That’s where it’s incredibly important to have a crew full of insanely brilliant people you can bash ideas around with.

HOW DID YOU START YOUR CAREER IN VFX?
I went to school for it. After graduating from the Ringling College of Art and Design with a degree in computer animation, I eventually landed a job as an assistant animator at Pacific Data Images (PDI). The job title was a little misleading, because although my degree was fairly character animation-centric, the first thing I was asked to do at PDI was morphing. I found that I really enjoyed working on the 2D side of things, and that sent me down a path that ultimately got me hired as a compositor at Weta on The Lord of the Rings.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I was hired by PDI in 1998, so I guess that means 20 years now. (Whoa.)

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD? WHAT’S BEEN BAD?
Oh, there’s just been so much great stuff. We’re able to make images now that are completely indistinguishable from reality. Thanks to massive technology advancements over the years, interactivity for artists has gotten way better. We’re sculpting incredible amounts of detail into our models, painting them with giga-pixels worth of texture information, scrubbing our animation in realtime, using hardware-accelerated engines to light our scenes, rendering them with physically-based renderers and compositing with deep images and a 3D workspace.

Of course, all of these efficiency gains get gobbled up pretty quickly by the ever-expanding vision of the directors we work for!

The industry’s technology advancements and flexibility have also perhaps had some downsides. Studios demand increasingly shorter post schedules, prep time is reduced, shots can be less planned out because so much can be decided in post. When the brief is constantly shifting, it’s difficult to deliver the quality that everyone wants. And when the quality isn’t there, suddenly the Internet starts clamoring that “CGI is ruining movies!”

But when a great idea is planned well by a decisive director and executed brilliantly by a visual effects team working in concert with all of the other departments, the movie magic that results is just amazing. And that’s why we’re all here doing what we do.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
There were some films I saw very early on that left a lasting impression: Clash of the Titans, The Empire Strikes Back. Later inspiration came in high school with the TV spots that Pixar was doing prior to Toy Story, and the early computer graphics work that Disney Feature Animation was employing in their films of the early ‘90s.

But the big ones that really set me off around this time were ILM’s work on Jurassic Park, and films like Jim Cameron’s The Abyss and Terminator 2. That’s why it was a particular kick to find myself on set with Jim on Avatar.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Dailies. When I challenge an artist to bring their best, and they come up with an idea that completely surprises me; that is way better than what I had imagined or asked for. Those moments are gold. Dailies is pretty much the only chance I have to see a shot for the first time like an audience member gets to, so I pay a lot of attention to my reaction to that very first impression.

WHAT’S YOUR LEAST FAVORITE?
Getting a shot ripped from our hands by those pesky deadlines before every little thing is perfect. And scheduling meetings. Though, the latter is critically important to make sure that the former doesn’t happen.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
There was a time when I was in grade school where I thought I might like to go into sound effects, which is a really interesting what-if scenario for me to think about. But these days, if I were to hang up my VFX hat, I imagine I would end up doing something photography-related. It’s been a passion for a very long time.

Rampage

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I supervised Weta’s work on Rampage, starring Dwayne Johnson and a very large albino gorilla. Prior to that was War for the Planet of the Apes, Spectral and Dawn of the Planet of the Apes.

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
We had a lot of fun working on Rampage, and I think audiences had a ton of fun watching it. I’m quite proud of what we achieved with Dawn of the Planet of the Apes. But I’m also really fond of what our crew turned out for the Netflix film Spectral. That project gave us the opportunity to explore some VFX-heavy sci-fi imagery and was a really interesting challenge.

WHAT TOOLS DO YOU USE DAY TO DAY?
Most of my day revolves around reviewing work and communicating with my production team and the crew, so it’s our in-house review software, Photoshop and e-mail. But I’m constantly jumping in and out of Maya, and always have a Nuke session open for one thing or another. I’m also never without my camera and am constantly shooting reference photos or video, and have been known to initiate impromptu element shoots at a moment’s notice.

WHERE DO YOU FIND INSPIRATION NOW?
Everywhere. It’s why I always have my camera in my bag.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Scuba diving and sea kayaking are two hobbies that get me out in the water, though that happens far less than I would like. My wife and I recently bought a small rural place north of Wellington. I’ve found going up there doing “farm stuff” on the weekend is a great way to re-calibrate.

Quick Chat: Joyce Cox talks VFX and budgeting

Veteran VFX producer Joyce Cox has a long and impressive list of credits to her name. She got her start producing effects shots for Titanic and from there went on to produce VFX for Harry Potter and the Sorcerer’s Stone, The Dark Knight and Avatar, among many others. Along the way, Cox perfected her process for budgeting VFX for films and became a go-to resource for many major studios. She realized that the practice of budgeting VFX could be done more efficiently if there was a standardized way to track all of the moving parts in the life cycle of a project’s VFX costs.

With a background in the finance industry, combined with extensive VFX production experience, she decided to apply her process and best practices into developing a solution for other filmmakers. That has evolved into a new web-based app called Curó, which targets visual effects budgeting from script to screen. It will be debuting at Siggraph in Vancouver this month.

Ahead of the show we reached out to find out more about her VFX producer background and her path to becoming the maker of a product designed to make other VFX pros’ lives easier.

You got your big break in visual effects working on the film Titanic. Did you know that it would become such an iconic landmark film for this business while you were in the throes of production?
I recall thinking the rough cut I saw in the early stage was something special, but had no idea it would be such a massive success.

Were there contacts made on that film that helped kickstart your career in visual effects?
Absolutely. It was my introduction into the visual effects community and offered me opportunities to learn the landscape of digital production and develop relationships with many talented, inventive people. Many of them I continued to work with throughout my career as a VFX producer.

Did you face any challenges as a woman working in below-the-line production in those early days of digital VFX?
It is a bit tricky. Visual effects is still a primarily male-dominated arena, and it is a highly competitive environment. I think what helped me navigate the waters is my approach. My focus is always on what is best for the movie.

Was there anyone from those days that you would consider a professional mentor?
Yes. I credit Richard Hollander, a gifted VFX supervisor/producer, with exposing me to the technology and methodologies of visual effects: how to conceptualize a VFX project and understand all the moving parts. I worked with Richard on several projects producing the visual effects within digital facilities. Those experiences served me well when I moved to working on the production side, navigating the balance between the creative agenda, the approved studio budgets and the facility resources available.

You’ve worked as a VFX producer on some of the most notable studio effects films of all time, including X-Men 2, The Dark Knight, Avatar and The Jungle Book. Was there a secret to your success or are you just really good at landing top gigs?
I’d say my skills lie more in doing the work than finding the work. I believe I continued to be offered great opportunities because those I’d worked for before understood that I facilitated their goals of making a great movie. And that I remain calm while managing the natural conflicts that arise between creative desire and financial reality.

Describe what a VFX producer does exactly on a film, and what the biggest challenges are of the job.
This is a tough question. During pre-production, working with the director, VFX supervisor and other department heads, the VFX producer breaks down the movie into the digital assets, i.e., creatures, environments, matte paintings, etc., that need to be created, estimates how many visual effects shots are needed to achieve the creative goals, and determines the VFX production crew required to support the project. Since no one knows exactly what will be needed until the movie is shot and edited, it is all theory.

During production, the VFX producer oversees the buildout of the communications, data management and digital production schedule that are critical to success. Also, during production the VFX producer is evaluating what is being shot and tries to forecast potential changes to the budget or schedule.

Starting in production and going through post, the focus is on getting the shots turned over to digital facilities to begin work. This is challenging in that creative or financial changes can delay moving forward with digital production, compressing the window of time within which to complete all the work for release. Once everything is turned over, the focus switches to getting all the shots completed and delivered for the final assembly.

What film did you see that made you want to work in visual effects?
Truthfully, I did not have my sights set on visual effects. I’ve always had a keen interest in movies and wanted to produce them. It was really just a series of unplanned events, and I suppose my skills at managing highly complex processes drew me further into the world of visual effects.

Did having a background in finance help in any particular way when you transitioned into VFX?
Yes, before I entered into production, I spent a few years working in the finance industry. That experience has been quite helpful and perhaps is something that gave me a bit of a leg up in understanding the finances of filmmaking and the ability to keep track of highly volatile budgets.

You pulled out of active production in 2016 to focus on a new company. Tell me about Curó.
Because of my background in finance and accounting, one of the first things I noticed when I began working in visual effects was the lack of any unified system for budgeting and managing the finances of the process, unlike production and post. So, I built an elaborate system of worksheets in Excel that I refined over the years. That design and process served as the basis for Curó’s development.

To this day, the entire visual effects community manages its finances, which can amount to tens, if not hundreds, of millions of dollars in spend, with spreadsheets. Add to that the fact that everyone’s document designs are different, which makes collaborating on, interpreting and managing facility bids unwieldy, to say the least.

Why do you think the industry needs Curó, and why is now the right time? 
Visual effects is the fastest growing segment of the film industry, demonstrated in the screen credits of VFX-heavy films. The majority of studio projects are these tent-pole films, which heavily use visual effects. The volatility of visual effects finances can be managed more efficiently with Curó and the language of VFX financial management across the industry would benefit greatly from a unified system.

Who’s been beta testing Curó, and what’s in store for the future, after its Siggraph debut?
We’ve had a variety of beta users over the past year. In addition to Sony and Netflix, a number of freelance VFX producers and supervisors, as well as VFX facilities, have beta access.

The first phase of the Curó release focuses on the VFX producers and studio VFX departments, providing tools for initial breakdown and budgeting of digital and overhead production costs. After Siggraph we will be continuing our development, focusing on vendor bid packaging, bid comparison tools and management of a locked budget throughout production and post, including the accounting reports, change orders, etc.

We are also talking with visual effects facilities about developing a separate but connected module for their internal granular bidding of human and technical resources.

 

Alkemy X joins forces with Quietman, adds CD Megan Oepen

Creative content studio Alkemy X has entered into a joint venture with long-time New York City studio Quietman. In addition, Alkemy X has brought on director/creative director Megan Oepen.

The Quietman deal will see founder and creative director Johnnie Semerad moving the operations of his company into Alkemy X, where both parties will share all creative talent, resources and capabilities.

“Quietman’s reputation of high-end, award-winning work is a tribute to Johnnie’s creative and entrepreneurial spirit,” says Justin B. Wineburgh, Alkemy X president/CEO. “Over the course of two decades, he grew and evolved Quietman from a fledgling VFX boutique into one of the most renowned production companies in advertising and branded content. By joining forces with Alkemy X, we’ll no doubt build on each other’s legacies collectively.”

Semerad co-founded Quietman in 1996 as a Flame-based visual effects company. Since then, it has expanded into the full gamut of production and post production services, producing more than 100 Super Bowl spots, and earning a Cannes Grand Prix, two Emmy Awards and other honors along the way.

“What I’ve learned over the years is that you have to constantly reinvest and reinvent, especially as clients increasingly demand start-to-finish projects,” says Semerad. “Our partnership with Alkemy X will elevate how we serve existing and future clients together, while bolstering our creative and technical resources to reach our potential as commercial filmmakers. The best part of this venture? I’ve always been listed with the Qs, but now, I’m with the As!”

Alkemy X is also teaming up with Oepen, an award-winning creative director and live-action director with 20 years of broadcast, sports and consumer brand campaign experience. Notable clients include Google, the NBA, MLB, PGA, NASCAR, Dove Beauty, Gatorade, Sprite, ESPN, Delta Air Lines, Home Depot, Regal Cinemas, Chick-Fil-A and Yahoo! Sports. Oepen was formerly the executive producer and director for Red Bull’s Non-Live/Long Format Productions group, and headed Under Armour’s Content House. She was also the creator behind Under Armour Originals.

Marvel’s Victoria Alonso to receive HPA’s Charles S. Swartz Award

The Hollywood Professional Association (HPA) has announced that Victoria Alonso, producer and executive VP of production for Marvel Studios, will receive the organization’s 2018 Charles S. Swartz Award at the HPA Awards on November 15. The HPA Awards recognize creative artistry, innovation and engineering excellence, and the Charles S. Swartz Award honors the recipient’s significant impact across diverse aspects of the industry.

A native of Buenos Aires, Alonso moved to the US at the age of 19. She worked her way up through the industry, beginning as a PA and then working four years at the VFX house Digital Domain. She served as VFX producer on a number of films, including Ridley Scott’s Kingdom of Heaven, Tim Burton’s Big Fish, Andrew Adamson’s Shrek and Marvel’s Iron Man. She won the Visual Effects Society (VES) Award for outstanding supporting visual effects/motion picture for Kingdom of Heaven, with two additional shared nominations (best single visual effects, outstanding visual effects/effects-driven motion picture) for Iron Man.

Eventually, she joined Marvel as the company’s EVP of visual effects and post, doubling as co-producer on Iron Man, a role she reprised on Iron Man 2, Thor and Captain America: The First Avenger. In 2011, she advanced to executive producer on the hit The Avengers and has since executive produced Marvel’s Iron Man 3, Captain America: The Winter Soldier and Captain America: Civil War, Thor: The Dark World, Avengers: Age of Ultron, Ant-Man, Guardians of the Galaxy, Doctor Strange, Guardians of the Galaxy Vol. 2, Spider-Man: Homecoming, Thor: Ragnarok, Black Panther, Avengers: Infinity War and most recently, Ant-Man and the Wasp.

She is currently at work on the untitled fourth installment of Avengers and Captain Marvel.

The Charles S. Swartz Award was named after executive Charles Swartz, who had a far ranging creative and technical career, eventually leading the Entertainment Technology Center at the University of Southern California, a leading industry think tank and research center. The Charles S. Swartz Award is awarded at the discretion of the HPA Awards Committee and the HPA Board of Directors, and is not given annually.

SIGGRAPH conference chair Roy C. Anthony: VR, AR, AI, VFX, more

By Randi Altman

Next month, SIGGRAPH returns to Vancouver after turns in Los Angeles and Anaheim. This gorgeous city, whose convention center offers a water view, is home to many visual effects studios providing work for film, television and spots.

As usual, SIGGRAPH will host many presentations, showcase artists’ work, display technology and offer a glimpse into what’s on the horizon for this segment of the market.

Roy C. Anthony

Leading up to the show — which takes place August 12-16 — we reached out to Roy C. Anthony, this year’s conference chair. For his day job, Anthony recently joined Ventuz Technology as VP, creative development. There, he leads initiatives to bring Ventuz’s realtime rendering technologies to creators of sets, stages and ProAV installations around the world.

SIGGRAPH is back in Vancouver this year. Can you talk about why it’s important for the industry?
There are 60-plus world-class VFX and animation studios in Vancouver. There are more than 20,000 film and TV jobs, and more than 8,000 VFX and animation jobs in the city.

So, Vancouver’s rich production-centric communities are leading the way in film and VFX production for television and onscreen films. They are also busy with new media content, games work and new workflows, including those for AR/VR/mixed reality.

How many exhibitors this year?
The conference and exhibition will play host to over 150 exhibitors on the show floor, showcasing the latest in computer graphics and interactive technologies, products and services. Due to the amount of new technology that has debuted in the computer graphics marketplace over the past year, almost one quarter of this year’s 150 exhibitors will be presenting at SIGGRAPH for the first time.

In addition to the traditional exhibit floor and conferences, what are some of the can’t-miss offerings this year?
We have increased the presence of virtual, augmented and mixed reality projects and experiences — and we are introducing our new Immersive Pavilion in the east convention center, which will be dedicated to this area. We’ve incorporated immersive tech into our computer animation festival with the inclusion of our VR Theater, back for its second year, as well as inviting a special, curated experience with New York University’s Ken Perlin — he’s a legendary computer graphics professor.

We’ll be kicking off the week in a big VR way with a special session following the opening ceremony featuring Ivan Sutherland, considered by many as “the father of computer graphics.” That 50-year retrospective will present the history and innovations that sparked our industry.

We have also brought Syd Mead, a legendary “visual futurist” (Blade Runner, Tron, Star Trek: The Motion Picture, Aliens, Time Cop, Tomorrowland, Blade Runner 2049), who will display an arrangement of his art in a special collection called Progressions. This will be seen within our Production Gallery experience, which also returns for its second year. Progressions will exhibit more than 50 years of artwork by Syd, from his academic years to his most current work.

We will have an amazing array of guest speakers, including those featured within the Business Symposium, which is making a return to SIGGRAPH after an absence of a few years. Among these speakers are people from the Disney Technology Innovation Group, Unity and Georgia Tech.

On Tuesday, August 14, our SIGGRAPH Next series will present a keynote speaker each morning to kick off the day with an inspirational talk. These speakers are Tony Derose, a senior scientist from Pixar; Daniel Szecket, VP of design for Quantitative Imaging Systems; and Bob Nicoll, dean of Blizzard Academy.

There will be a 25th anniversary showing of the original Jurassic Park movie, being hosted by “Spaz” Williams, a digital artist who worked on that film.

Can you talk about this year’s keynote and why he was chosen?
We’re thrilled to have ILM head and senior VP, ECD Rob Bredow deliver the keynote address this year. Rob is all about innovation — pushing through scary new directions while maintaining the leadership of artists and technologists.

Rob is the ultimate modern-day practitioner, a digital VFX supervisor who has been disrupting ‘the way it’s always been done’ to move to new ways. He truly reflects the spirit of ILM, which was founded in 1975 and is just one year younger than SIGGRAPH.

A large part of SIGGRAPH is its slant toward students and education. Can you discuss how this came about and why this is important?
SIGGRAPH supports education in all sub-disciplines of computer graphics and interactive techniques, and it promotes and improves the use of computer graphics in education. Our Education Committee sponsors a broad range of projects, such as curriculum studies, resources for educators and SIGGRAPH conference-related activities.

SIGGRAPH has always been a welcoming and diverse community, one that encourages mentorship, and acknowledges that art inspires science and science enables advances in the arts. SIGGRAPH was built upon a foundation of research and education.

How are the Computer Animation Festival films selected?
The Computer Animation Festival has two programs, the Electronic Theater and the VR Theater. Because of the large volume of submissions for the Electronic Theater (over 400), a triage committee handles the first phase. The CAF chair then takes the high-scoring pieces to a jury of industry professionals. The jury’s selections then become the Electronic Theater show pieces.

The selections for the VR Theater are made by a smaller panel comprised mostly of sub-committee members that watch each film in a VR headset and vote.

Can you talk more about how SIGGRAPH is tackling AR/VR/AI and machine learning?
Since SIGGRAPH 2018 is about the theme of “Generations,” we took a step back to look at how we got where we are today in terms of AR/VR, and where we are going with it. Much of what we know today wouldn’t have been possible without the research and creation of Ivan Sutherland’s 1968 head-mounted display. We have a fantastic panel celebrating the 50th anniversary of his HMD, which is widely considered the first VR HMD.

AI tools are newer, and we created a panel that focuses on trends and the future of AI tools in VFX, called “Future Artificial Intelligence and Deep Learning Tools for VFX.” This panel gains insight from experts embedded in both the AI and VFX industries and gives attendees a look at how different companies plan to further their technology development.

What is the process for making sure that all aspects of the industry are covered in terms of panels?
Every year, new ideas for panels and sessions are submitted by contributors from all over the globe. Those submissions are then reviewed by a jury of industry experts, and it is through this process that panelists and cross-industry coverage are determined.

Each year, the conference chair oversees the program chairs; then each of the program chairs becomes part of a jury process. This helps ensure the best program with the most industries represented across all disciplines.

In the rare case a program committee feels they are missing something key in the industry, they can try to curate a panel in, but we still require that that panel be reviewed by subject matter experts before it would be considered for final acceptance.

 

Mark Thorley joins Mill Film Australia as MD

Mill Film in Australia, a Technicolor VFX studio, has named Mark Thorley as managing director. His appointment comes in the wake of the February launch of Mill Film in Adelaide, Australia.

Thorley brings with him more than 15 years of executive experience, working at such studios as Lucasfilm Singapore, where he oversaw studio operations and production strategies. Prior to that, Thorley spent nine years at Animal Logic, at both its Los Angeles and Sydney locations, as head of production. He also held senior positions at Screen Queensland and Omnicom.

Throughout his career, Thorley has received credits on numerous blockbuster feature films, including Kong: Skull Island, Rogue One, Jurassic World and Avengers: Age of Ultron. Thorley will drive all aspects of VFX production, client relations and business development for Australia, reporting into the global head of Mill Film, Lauren McCallum.

Disfiguring The Man in Black for HBO’s Westworld

If you watch HBO’s Westworld, you are familiar with the once-good guy turned bad guy, The Man in Black. He is ruthless and easy to hate, so when karma caught up to him, audiences were not too upset about it.

Westworld doesn’t shy away from violence. In fact, it has a major role in the series. A recent example of an invisible effect displaying mutilation came during the show’s recent Season Two finale. CVD VFX, a boutique visual effects house based in Vancouver, was called on to create the intricate and gruesome result of what The Man in Black’s hand looked like after being blown to pieces.

During the long-awaited face-off between The Man in Black (Ed Harris) and Dolores Abernathy (Evan Rachel Wood), we see their long-simmering conflict culminate with his pistol pressed against her forehead, cocked and ready to fire. But when he pulls the trigger, the gun backfires and explodes in his hand, sending fingers flying into the sand and leaving horrifyingly bloody stumps.

CVD VFX’s team augmented the on-set footage to bring the moment to life in excruciating detail. Harris’ fingers were wrapped in blue in the original shot, and CVD VFX went to work removing his digits and replacing them with animated stubs, complete with the visceral details of protruding bone and glistening blood. The team used special effects makeup for reference on both blood and lighting, and were able to seamlessly incorporate the practical and digital elements.

The result was impressive, especially considering the short turnaround time that CVD had to create the effect.

“We were brought on a little late in the game as we had a couple weeks to turn it around,” explains Chris van Dyck, founder of CVD VFX, who worked with the show’s VFX supervisor, Jay Worth. “Our first task was to provide reference/style frames of what we’d be proposing. It was great to have relatively free rein to propose how the fingers were blown off. Ultimately, we had great direction and once we put the shots together, everyone was happy pretty quickly.”

CVD used Foundry’s Nuke and Autodesk’s Maya to create the effect.

CVD VFX’s work on Westworld wasn’t the first time they worked with Worth. They previously worked together on Syfy’s The Magicians and Fox’s Wayward Pines.

Behind the Title: Steelhead MD Ted Markovic

NAME: Ted Markovic

COMPANY: LA-based Steelhead

CAN YOU DESCRIBE YOUR COMPANY?
We are a content studio and cross-platform production company. You can walk through our front door with a script and out the back with a piece of content. We produce everything from social to Super Bowl.

WHAT’S YOUR JOB TITLE?
Managing Director

WHAT DOES THAT ENTAIL?
I am responsible for driving the overall culture and financial health of the organization. That includes building strong client relationships, new business development, operational oversight, marketing, recruiting and retaining talent and managing the profits and losses of all departments.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
We all have a wide range of responsibilities and wear many hats. I occasionally find myself replacing the paper towels in the bathrooms because some days that’s what it takes.

WHAT’S YOUR FAVORITE PART OF THE JOB?
We are a very productive group that produces great work. I get a sense of accomplishment almost every day.

WHAT’S YOUR LEAST FAVORITE?
Replacing the paper towels in the bathrooms.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I get a lot more done while everyone else is busy eating their lunch or driving home.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Solving the traffic problem in Los Angeles. I see a lot of opportunities there.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I am a third-generation post production executive, and essentially grew up in a film lab in New York. I suspect the profession chose me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I am currently working on a Volkswagen Tier 2 project where we are shooting six cars over seven days on our stage at Steelhead. We’re incorporating dynamic camera shots of cars on a cyc with kinetic typography, motion graphics and VFX. It’s a great example of how we can do it all under one roof.

We recently worked with Nintendo and Interogate to bring the new Switch games to life in a campaign called Close Call. On set with rams, air mortars, lighting effects and lots of sawed-in-half furniture, we were able to create real weight in-camera to layer with our VFX. We augmented the practical effects with HDR light maps, fire and debris simulations, as well as procedurally generated energy beams, 3D models and 2D compositing to create a synergy between the practical and visual effects that really sells the proximity and sense of danger we were looking to create.

While the coordination of practical and post was no small chore, another interesting challenge we had to overcome was creating the CG weapons to mesh with the live-action plates. We started with low-resolution models directly from the games themselves, converted them and scrubbed in a good layer of detail and refined them to make them photoreal. We also had to conceptualize how some of the more abstract weapons would play with real-world physics.

Another project worth mentioning was a piece we created for Volkswagen called Strange Terrains. The challenge was to create 360-degree timelapse video from day to night, something that had never been done before. In order to get this unique footage, we had to build an equally unique rigging system. We partnered with Supply Frame to design and build a custom-milled aluminum head to support four 50.6-megapixel Canon EOS 5DS cameras.

The “holy grail” of timelapse photography is getting the cameras to ramp the exposure over broad light changes. This was especially challenging to capture due to the massive exposure changes in the sky and the harshness of the white salt. After capturing approximately 2,000 frames per camera — 9TB of working storage — we spent countless hours stitching, compositing, computing and rendering to get a fluid final product.
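In post, exposure ramping of this sort is typically finished with a deflickering step: fit a smooth curve through each frame's measured brightness, then scale each frame toward that curve. The sketch below is only an illustration of that general idea (the function names, moving-average window and per-frame gain approach are assumptions, not Steelhead's actual pipeline):

```python
# Hypothetical deflicker sketch for a timelapse exposure ramp.
# Each frame is reduced to one number: its mean luminance.
def smooth_exposures(luminances, window=5):
    """Moving-average smooth of per-frame mean luminance values."""
    n = len(luminances)
    smoothed = []
    for i in range(n):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        smoothed.append(sum(luminances[lo:hi]) / (hi - lo))
    return smoothed

def ramp_gains(luminances, window=5):
    """Per-frame gain that pulls each frame onto the smoothed curve,
    so abrupt exposure steps become a gradual ramp."""
    target = smooth_exposures(luminances, window)
    return [t / max(l, 1e-6) for t, l in zip(target, luminances)]
```

A frame brighter than its neighbors gets a gain below 1.0, a darker frame a gain above 1.0, which is what turns a stepped exposure sequence into a fluid ramp.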

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
About eight years ago, I created a video for my parents’ 40th wedding anniversary. My mother still cries when she watches it.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The wheel is a pretty essential piece of technology that I’m not sure I could live without. My smartphone, as expected, as well as my Sleepwell device for apnea. That device changed my life.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
I can work listening to anything but reggae.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Exercise.

Eli Rotholz joins Alkemy X as VP of biz dev

Creative content company Alkemy X has added Eli Rotholz as VP of business development. He will be based in the company’s New York headquarters.

Rotholz brings more than 12 years of sales/business development, strategy and production experience, having begun his career as an independent sales rep for Ziegler/Jakubowicz and Moustache NYC. From there, he worked in his first in-house position at Click3X, where he built and managed a diverse roster of directorial talent, as well as the company’s first integrated production offering focusing on live-action, VFX/design/animation and editorial.

Rotholz then founded Honor Society Films. He later joined Hone Production, a brand-direct-focused production company and consultancy, as director of business development/content EP.

“Very few companies in the industry can boast as strong a directorial roster and VFX capabilities as Alkemy X,” says Rotholz. “In addition to the amazing entertainment work that Alkemy does, there’s definitely a trend in high-end ‘package’ productions where one company can do both live-action shoots with their directors, as well as editorial and VFX.”

The Orville VFX supervisor on mixing practical and visual effects

By Barry Goch

What do you get when you mix Family Guy and Ted creator Seth MacFarlane with science fiction? The most dysfunctional spaceship in the galaxy, that’s what. What is the Fox series The Orville? Well, it’s more Galaxy Quest/Spaceballs than it is Star Trek/Star Wars.

Set 400 years in the future, The Orville is a spaceship captained by MacFarlane’s Ed Mercer, who has to work alongside his ex-wife as they wing their way through space on a science mission. As you might imagine with a show that is set in space, The Orville features a large amount of visual and practical effects shots, including real and CG models of The Orville.

Luke McDonald

We reached out to the show’s VFX supervisor Luke McDonald to find out more.

How did the practical model of The Orville come about?
Jon Favreau was directing the pilot, and he and Seth MacFarlane had been kidding around about doing a practical model of The Orville. I jumped at the chance. In this day and age, visual effects supervisors shooting models is an unheard-of thing to do, but something I was absolutely thrilled about.

Favreau’s visual effects supervisor is Rob Legato. I have worked with Rob on many projects, including Martin Scorsese’s Aviator, Shine a Light and Shutter Island, so I was very familiar with how Rob works. The only other chance that I had had to shoot models was with Rob during Shutter Island and Aviator, so in a sense, whenever Rob Legato shows up it’s model time (he laughs). It’s so amazing because it’s just something that the industry shies away from, but given the opportunity it was absolutely fantastic.

Who built the practical model of The Orville?
Glenn Derry made it. He’s worked with Rob Legato on a few things, including Aviator. Glenn is kind of fantastic. He basically does motion control, models and motion capture. Glenn would also look at all the camera moves and all the previz that we did to make sure the camera moves were not doing something that the motion control rig could not do.

How were you able to seamlessly blend the practical model and the CG version of The Orville?
Once we had the design for The Orville, we would then previz out the ships flying by camera, doing whatever, and work out these specific moves. Any move that was too technical for the motion control rig, we would do a CG link-up instead — meaning that it would go from model to a CG ship or vice versa — to get the exact camera move that we wanted. We basically shot all of the miniatures of The Orville at three frames a second. It was kind of like shooting in slow-mo with the motion control rig, and we did about 16 passes per shot — lights on, lights off, key light, fill light, back light, ambient, etc. So, when we got all the passes back, we composited them just like we would any kind of full CG shot.
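Compositing separately shot lighting passes works because light is additive: each pass can be summed into the final image with its own gain, letting the compositor relight the shot after the fact. The toy sketch below illustrates only that principle; the pass names and gains are made up, and real work would be done on full image buffers in a compositor like Nuke, not pixel lists:

```python
# Toy sketch of additive light-pass compositing. Each "pass" is an
# image reduced here to a flat list of pixel values; the comp is a
# per-pixel weighted sum of all passes.
def comp_light_passes(passes, gains):
    """Sum per-pass pixel values, scaling each pass by its gain
    (default gain 1.0 if a pass has no override)."""
    width = len(next(iter(passes.values())))
    out = [0.0] * width
    for name, pixels in passes.items():
        g = gains.get(name, 1.0)
        for i, p in enumerate(pixels):
            out[i] += g * p
    return out

# Illustrative three-pixel passes from a hypothetical model shoot:
passes = {
    "key": [0.5, 0.4, 0.1],
    "fill": [0.2, 0.2, 0.2],
    "back": [0.0, 0.1, 0.3],
}
# Dial the key light down to half strength in comp:
result = comp_light_passes(passes, {"key": 0.5})
```

This additivity is why shooting many isolated light passes is so reusable: the same elements can be rebalanced into completely different lighting setups without reshooting.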

From the model shoot, we ended up with about 25 individual shots of The Orville. It’s a very time-consuming process, but it’s very rewarding because of how many times you’re going to have to reuse these elements to achieve completely new shots, even though it’s from the same original motion control shoot.

How did the shots of The Orville evolve over the length of the season?
We started to get into more dynamic things, such as big space battles and specific action patterns, where it really wasn’t feasible to continue shooting the model itself. But now we have a complete match for our CG version of The Orville that we can use for our big space battles, where the ship’s flying and whipping around. I need to emphasize that previz on this project was very crucial.

The Orville is a science vessel, but when it needs to throw down and fight, it has the capabilities to be quite maneuverable — it can barrel roll, flip and power slide around to get itself in position to get the best shot off. Seth was responding to these hybrid-type ship-to-ship shots and The Orville moving through space in a unique way when it’s in battle.

There was never a playbook. It was always, “Let’s explore, let’s figure out, and let’s see where we fit in this universe. Do we fit into the traditional Star Trek-y stuff, or do we fit into the Star Wars-type stuff?” I’m so pleased that we fit into this really unique world.

How was working with Seth MacFarlane?
Working with Seth has been absolutely amazing. He’s such a dedicated storyteller, even down to the most minute things. He’s such an encyclopedia of sci-fi knowledge, be it Star Trek, Star Wars, Battlestar Galactica or the old-school Buck Rogers and Flash Gordon. All of them are part of his creative repertoire. It’s very rare that he makes a reference that I don’t get, because I’m exactly the same way about sci-fi.

How different is creating VFX for TV versus film?
TV is not that new to me, but for the last 10 years I’ve been doing film work for Bad Robot and JJ Abrams. It was a strange awakening coming to TV, but it wasn’t horrifying. I had to approach things in a different way, especially from a budget standpoint.

Rachel Matchett brought on to lead Technicolor Visual Effects

Technicolor has hired Rachel Matchett to head the post production group’s newly formed VFX brand, Technicolor Visual Effects. Working side-by-side within the same facilities where post services are offered, Technicolor Visual Effects is expanding to a global offering with an integrated pipeline. Technicolor is growing its namesake VFX team apart from the company’s other visual effects brands: MPC, The Mill, Mr. X and Mikros.

A full-service creative VFX house with local studios in Los Angeles, Toronto and London, Technicolor Visual Effects’ recent credits include the feature films Avengers: Infinity War, Black Panther and Paddington 2, and episodic series such as This is Us, Anne With an E and Black Mirror.

Matchett joins Technicolor from her long-tenured position at MPC Film. Her background at MPC London includes nearly a decade of senior management positions at the studio. She most recently served as MPC London’s global head of production. In that role, her divisions at MPC Film oversaw and carried out visual effects on a number of films each year, including director Jon Favreau’s Academy Award-winning The Jungle Book and the critically acclaimed Blade Runner 2049.

“Technicolor Visual Effects is emerging from its position as one of the industry’s best-kept secrets. While continuing to support clients who do color finishing with us, we are excited to work with storytellers from script to screen,” says Matchett. “Having been at the heart of MPC Film’s rapid growth over the past decade, I feel that there is a great opportunity for Technicolor’s future role in VFX to forge a new path within the industry.”

Behind the Title: Weta’s Head of Tech & Research Luca Fascione

NAME: Luca Fascione

COMPANY: Wellington, New Zealand’s Weta Digital

WHAT’S YOUR JOB TITLE?
Senior Head of Technology and Research

WHAT DOES THAT ENTAIL?
In my role, I lead the activities of Weta Digital that provide software technology to the studio and our partners. There are various groups that form technology and research: Production Engineering oversees the studio’s pipeline and infrastructure software, Software Engineering oversees our large plug-ins such as our hair system (Barbershop/Wig), our tree growth system (Lumberjack/Totara) and our environment construction system (Scenic Designer), to name a few.

Two more departments make up the technology and research group: Rendering Research and Simulation Research. These departments oversee our proprietary renderer, Manuka, and our physical simulation system, Synapse. Both groups have a strong applied research focus; as well as producing software, they are often involved in the publication of scientific papers.

HOW DID YOU GET INTO THIS BUSINESS?
Cinema and computers have been favorites of mine (as well as music) since I was a little kid. We used to play a game when I was maybe 12 or so where we would watch about five seconds of a random movie on TV, turn it off, and recite the title. I was very good at that.

A couple of my friends and I watched all the movies we could find, from arthouse European material to more commercial, mainstream content. When it came time to find a job, I thought finding a way to merge my passion for cinema and my interest in computers into one would be great, if I could.

HOW LONG HAVE YOU BEEN WORKING IN THIS INDUSTRY?
I started at Weta Digital in 2004. Before that, I was part of the crew working on the feature animation movie Valiant, which I started in 2002. I guess this would make it 15 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? WHAT’S BEEN GOOD, WHAT’S BEEN BAD?
Everything got bigger. More so the content we want to work with in relation to the machines we want to use to achieve our goals. As much as technology has improved, our ability to use it to drive the hardware extremely hard has grown faster, creating a need for technically creative, innovative solutions to our scaling problems.

Graphics is running out of “easy problems” that one can solve drawing inspiration from other fields of science, and it’s sometimes the case that our research has outpaced the advancements of similar problems in other fields, such as medicine, physics or engineering. At the same time, especially since the recent move toward deep learning and “big data” problems, the top brains in the field are all drawn away from graphics, making it harder than it used to be to get great talent.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I work in VFX because of Jurassic Park, although I must also recognize Young Sherlock Holmes and Terminator 2, which played a big role in this space. During my career in VFX, King Kong and Avatar have been life-shaping experiences.

DID YOU GO TO FILM SCHOOL?
Not at all; I studied mathematics in Rome, Italy. All I know about movies comes from personal study. Back in those days, nobody taught computer graphics at this level for VFX. The closest were degrees at engineering schools that maybe offered a course or two in graphics. Things have changed massively since then in this area.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The variety. I run into a lot of extremely interesting problems, and I like being able to help people find good ways to solve them.

WHAT’S YOUR LEAST FAVORITE?
A role like mine necessarily entails many difficult conversations with crew. I am extremely pleased to say the majority of these result in opportunities for growth and a deepening of our mutual understanding. I love working with our crew; they’re great people and I learn a lot every day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I like my job, so I don’t often think about doing something else. But I have on occasion wondered what it would be like to build guitars for a living.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The latest War for the Planet of the Apes movie has been a fantastic achievement for the studio. The Technology and Research group has contributed a fair bit of software to the initiative, from our forest system Totara to a new lighting pipeline called PhysLight, a piece of work I was personally involved in and that I am particularly proud of.

During our work on The Jungle Book, we helped the production by reshaping our instancing system to address the dense forests in the movie. Great advancements in our destruction systems were also developed for Rampage.

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
It turns out three of my early projects played a role of some importance in the making of Avatar: The facial solver, the sub-surface scattering system and PantaRay (our Spherical Harmonics occlusion system). After that, I’m extremely proud of my work on Manuka, Weta Digital’s in-house renderer.

WHERE DO YOU FIND INSPIRATION NOW?
All around me, it’s the people, listening to their experiences, problems and wishes. That’s how our job is done.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I play guitar and I build audio amplifiers. I have two daughters in primary school that are a lot of fun and a little boy just joined our family last December. I do take the occasional picture as well.

Chimney opens in New York City, hires team of post vets

Chimney, an independent content company specializing in film, television, spots and digital media, has opened a new facility in New York City. For over 20 years, the group has been producing and posting campaigns for brands such as Ikea, Audi, H&M, Chanel, Nike, HP, UBS and more. Chimney was also the post partner for the feature films Chappaquiddick, Her, Atomic Blonde and Tinker Tailor Soldier Spy.

With this New York opening, Chimney now has 14 offices worldwide. Founded in Stockholm in 1995, the company opened its first US studio in Los Angeles last year. In addition to Stockholm, New York and LA, Chimney also has facilities in Singapore, Copenhagen, Berlin and Sydney, among other cities.

“Launching in New York is a benchmark long in the making, and the ultimate expression of our philosophy of ‘boutique-thinking with global power,’” says Henric Larsson, Chimney founder and COO. “Having a meaningful presence in all of the world’s economic centers with diverse cultural perspectives means we can create and execute at the highest level in partnership with our clients.”

The New York opening supports Chimney’s mission to connect its global talent and resources, effectively operating as a 24-hour, full-service content partner to brand, entertainment and agency clients, no matter where they are in the world.

Chimney has signed on several industry vets to spearhead the New York office. Leading the US presence is CEO North America Marcelo Gandola. His previous roles include COO at Harbor Picture Company; EVP at Hogarth; SVP of creative services at Deluxe Entertainment Services Group; and VP of operations at Company 3.

Colorist and director Lez Rudge serves as Chimney’s head of color North America. He is a former partner and senior colorist at Nice Shoes in New York. He has worked alongside Spike Lee and Darren Aronofsky, and on major brand campaigns for Maybelline, Revlon, NHL, Jeep, Humira, Spectrum and Budweiser.

Managing director Ed Rilli will spearhead the day-to-day logistics of the New York office. As the former head of production of Nice Shoes, his resume includes producing major campaigns for such brands as NFL, Ford, Jagermeister and Chase.

Sam O’Hare, chief creative officer and lead VFX artist, will oversee the VFX team. Bringing experience in live-action directing, VFX supervision, still photography and architecture, O’Hare’s interdisciplinary background makes him well suited for photorealistic CGI production.

In addition, Chimney has brought on cinematographer and colorist Vincent Taylor, who joins from MPC Shanghai, where he worked with brands such as Coca-Cola, Porsche, New Balance, Airbnb, BMW, Nike and L’Oréal.

The 6,000-square-foot office will feature Blackmagic Resolve color rooms, Autodesk Flame suites and a VFX bullpen, as well as multiple edit rooms, a DI theater and a Dolby Atmos mix stage through a joint venture with Gigantic Studios.

Main Image: (L-R) Ed Rilli, Sam O’Hare, Marcelo Gandola and Lez Rudge.

Framestore London adds joint heads of CG

Framestore has named Grant Walker and Ahmed Gharraph as joint heads of CG at its London studio. The two will lead the company’s advertising, television and immersive work alongside head of animation Ross Burgess.

Gharraph has returned to Framestore after a two-year stint at ILM, where he was lead FX artist on Star Wars: The Last Jedi, receiving a VES nomination for Outstanding Effects Simulations in a Photoreal Feature. His advertising credits as CG supervisor include Mog’s Christmas Calamity, Sainsbury’s 2015 festive campaign, and Shell V-Power Shapeshifter, directed by Carl Erik Rinsch.

Walker joined Framestore in 2009, and in his time at the company he has worked across film, advertising and television, building a portfolio as a CG artist with campaigns, including Freesat’s VES-nominated Sheldon. He was also instrumental in Framestore’s digital recreation of Audrey Hepburn in Galaxy’s 2013 campaign Chauffeur for AMV BBDO. Most recently, he was BAFTA-nominated for his creature work in the Black Mirror episode, “Playtest.”

Lauren McCallum to head Mill Film’s new Montreal studio

Mill Film will open a new facility in Montréal, Québec with operations starting this summer. The announcement follows the February launch of Mill Film in Adelaide, Australia.

Mill Film, a Technicolor studio that won an Academy Award for best visual effects for the movie Gladiator in 2001, will focus on the needs of streaming and episodic content — in addition to long-form film, which is the domain of existing Technicolor VFX brands, including MPC Film and Mr. X.

Global head of Mill Film Lauren McCallum will head the new studio. Throughout her career, McCallum has been known for leading creative talent on features like Blade Runner 2049 and Wonder Woman, as well as her work on the 2017 Oscar-winning The Jungle Book.

A specialist in VFX management, McCallum will oversee all aspects of production along with driving operations and strategy. A 10-year VFX veteran, McCallum was most recently head of production at MPC Film, and prior to that was at London’s Framestore and Prime Focus World.

Behind the Title: Versus Partner/CD Justin Barnes

NAME: Justin Barnes

COMPANY: Versus (@vs_nyc)

CAN YOU DESCRIBE YOUR COMPANY?
We are “versus” the traditional model of a creative studio. Our approach is design driven and full service. We handle everything from live action to post production, animation and VFX. We often see projects from concept through delivery.

WHAT’S YOUR JOB TITLE?
Partner and Creative Director

WHAT DOES THAT ENTAIL?
I handle the creative side of Versus. From pitching to ideation, thought leadership and working closely with our editors, animators, artists and clients to make our creative — and our clients’ creative vision — the best it can be.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
There’s a lot of business and politics that you have to deal with being a creative.

Adidas

WHAT’S YOUR FAVORITE PART OF THE JOB?
Every day is different, full of new challenges and the opportunity to come up with new ideas and make really great work.

WHAT’S YOUR LEAST FAVORITE?
When I have to deal with the business side of things more than the creative side.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
For me, it’s very late at night; the only time I can work with no distractions.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Anything in the creative world.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
It’s been a natural progression for me to be where I am. Working with creative and talented people in an industry with unlimited possibilities has always seemed like a perfect fit.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
– Re-brand of The Washington Post
– Animated content series for the NCAA
– CG campaign for Zyrtec
– Live-action content for Adidas and Alltimers collaboration

Zyrtec

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I am proud of all the projects we do, but the ones that stick out the most are the projects with the biggest challenges that we have pulled together and made look amazing. That seems like every project these days.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My laptop, my phone and Uber.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I can’t live without Pinterest. It’s a place to capture the huge streams of inspiration that come at us each day.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
We have music playing in the office 24/7, everything from hip-hop to classical. We love it all. When I am writing for a pitch, I need a little more concentration. I’ll throw on my headphones and put on something that I can get lost in.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Working on personal projects is big in helping de-stress. Also time at my weekend house in Connecticut.

Lindsay Seguin upped to EP at NYC’s FuseFX

Visual effects studio FuseFX has promoted Lindsay Seguin to executive producer in the studio’s New York City office. Seguin is now responsible for overseeing all client relationships at the FuseFX New York office, acting as a strategic collaborator for current and future productions spanning television, commercial and film categories. The company also has an office in LA.

Seguin, who first joined FuseFX in 2014, was previously managing producer. During her time with the company, she has worked with a number of high-profile client productions, including The Blacklist, Luke Cage, The Punisher, Iron Fist, Mr. Robot, The Get Down and the feature film American Made.

“Lindsay has played a key role in the growth and success of our New York office, and we’re excited for her to continue to forge partnerships with some of our biggest clients in her new role,” says Joseph Bell, chief operating officer and executive VP of production at FuseFX.

“We have a really close-knit team that enjoys working together on exciting projects,” Seguin added about her experience working at FuseFX. “Our crew is very savvy and hardworking, and they manage to maintain a great work/life balance, even as the studio delivers VFX for some of the most popular shows on television. Our goal is to have a healthy work environment and produce awesome visual effects.”

Seguin is a member of the Visual Effects Society and the Post New York Alliance. Prior to making the transition to television and feature work, her experience was primarily in national broadcast and commercial projects, which included campaigns for Wendy’s, Garnier, and Optimum. She is a graduate of Penn State University with a degree in telecommunications. Born in Toronto, Seguin is a dual citizen of Canada and the United States.

Creative editorial and post boutique Hiatus opens in Detroit

Hiatus, a full-service, post production studio with in-house creative editorial, original music composition and motion graphics departments, has opened in Detroit. Their creative content offerings cover categories such as documentary, narrative, conceptual, music videos and advertising media for all video platforms.

Led by founder/senior editor Shane Patrick Ford, the new company includes executive producer/partner Catherine Pink and executive producer Joshua Magee, who joins Hiatus from the animation studio Lunar North. Additional talent includes editor Josh Beebe, composer/editor David Chapdelaine and animator James Naugle.

The roots of Hiatus go back to The Factory, a music venue founded by Ford while he was still in college. It gave local Detroit musicians, as well as touring bands, a place to play. Ford, along with a small group of creatives, then formed The Work – a production company focused on commercial and advertising projects. For Ford, the launch of Hiatus is an opportunity to focus solely on his editorial projects and to expand his creative reach, and that of his team, nationally.

Leading up to the launch of Hiatus, the team has worked on projects for brands such as Sony, Ford Motor Company, Acura and Bush’s, as well as recent music videos for Lord Huron, Parquet Courts and the Wombats.

The Hiatus team is also putting the finishing touches on the company’s first original feature film Dare to Struggle, Dare to Win. The film uncovers a Detroit Police decoy unit named STRESS and the efforts made to restore civil order in 1970s post-rebellion Detroit. Dare to Struggle, Dare to Win makes its debut at the Indy Film Festival on Sunday April 29th and Tuesday May 1st in Indianapolis, before it hits the film festival circuit.

“Launching Hiatus was a natural evolution for me,” says Ford. “It was time to give my creative team even more opportunities, to expand our network and to collaborate with people across the country that I’ve made great connections with. As the post team evolved within The Work, we outgrew the original role it played within a production company. We began to develop our own team, culture, offerings and our own processes. With the launch of Hiatus, we are poised to better serve the visual arts community, to continue to grow and to be recognized for the talented creative team we are.”

“Instead of having a post house stacked with people, we’d prefer to stay small and choose the right personal fit for each project when it comes to color, VFX and heavy finishing,” explains Hiatus EP Catherine Pink. “We have a network of like-minded artists that we can call on, so each project gets the right creative attention and touch it deserves. Also, the lower overhead allows us to remain nimble and work with a variety of budget needs and all kinds of clients.”

Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. Here is the trailer. If you like what you see, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond…

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex an idea would be to create. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically. There were no “fix it in post” scenarios on this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep, we actually shot a test scene to see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and other casting to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel. With each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film, and has been of my career since my short films, so they came on as one of the executive producers on the film. No one had ever shot a full feature film using just the Blackmagic cameras. We also used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think is the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. This made the workflow as simple as exporting my Resolve file and then the material hand-over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more — this would not have been possible without the support of my VFX team who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked down on that first before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film Origin Unknown (starring Katee Sackhoff from Battlestar Galactica, The Flash) was green-lit and that had its own set of challenges. We can talk more about that when the film is released theatrically and VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP and more contracts. The advice I got from other filmmakers is that the right distributor plays a big part in how your film will be released. To me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal due to the documentary narrative. We first tried to do it all in-camera using a built suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly with practical effects would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity, to enable object tracking to work as accurately as possible. We also shot reference footage from all angles to help with match moving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made tiny movements due to breathing in close-up shots. Because our Human 2.0 was a human consciousness in a synthetic shell, breathing didn’t make sense, so we tried to compensate by freezing the image or doing some stabilization, which proved nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

For a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t — in fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give a surreal feeling. We used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

End of the Line director Jessica Sanders

By Randi Altman

After watching End of the Line, I found myself thinking about the short film’s content… a lot. Based on a short story by Aimee Bender, director Jessica Sanders’ version starts off with a man walking into a pet store, looking into what the audience assumes is a birdcage and walking out not with a bird but with a little man in a cage.

We see how Big Man (Stranger Things’ Brett Gelman) tries to take care of Little Man (Big Bang Theory‘s Simon Helberg) and then we see his frustration when the clearly well-read and intelligent Little Man tells the story of how he was taken from his family and put in a cage. Big Man’s behavior becomes increasingly disturbing, leading him to torture Little Man.

We reached out to director Sanders — who has an Oscar nomination thanks to her short documentary, Sing! — to talk about making the film, which is part of Refinery29 and TNT’s Shatterbox Anthology, a short film series dedicated to supporting the voices of female filmmakers.

Let’s start with cameras. What did you shoot on, and how involved in that process are you?
We shot on the Alexa Mini with Panavision primo lenses. I like to go over lenses/looks with my DP, but defer to what the DP wants to shoot on. For this project, I worked with ultra-talented DP Brett Pawlak.

How long was the shoot, and how much pre-production did you do? I’m assuming a good amount considering the VFX shots?
The film, although short (14 minutes), was essentially a feature in terms of preparation, production scope and crew size; we shot for six days. We had about two months of intense prep leading up to the shoot, from location scouting to the art department builds. For example, we built a 30-foot penis and a 30-foot cage. The VFX approach was an intensive collaboration between VFX supervisor Eva Flodstrom, my DP Brett, production designer Justin Trask, producer Louise Shore and myself.

We had 67 VFX shots, so I storyboarded the film early on and then photoboarded each shot when we had our locations. We had a specific VFX/production approach to execute each shot, from a mix of practicals (building the giant cage) to strictly greenscreen (i.e., when the little man is on a calculator). It was a highly involved and collaborative process.

Was your VFX supervisor on set?
Yes. Eva was highly involved from the beginning for all of prep, and on set she was instrumental. We worked closely with a DIT video assist so we could do a rough VFX comp of each shot while we were shooting. After production, it took about four months to finish post and visual effects.

I wanted to work with Eva, as she’s a pro, having worked on Star Wars and Star Trek (also, there are very few female VFX supervisors). Our approach/philosophy to VFX was similar — inspired by Michel Gondry’s and Spike Jonze’s work in which the VFX feels human, warm and practical, integral to the story and characters, never distracting.

Can you talk about the challenges of directing a project with VFX?
I had never done a VFX-heavy film before, and creatively, as a director, I wanted to challenge myself. I had a blast and want to do more films with VFX after this experience. Because I surrounded myself with top artists who had VFX experience, it was a totally enjoyable experience. We had a lot of fun making this film!

This was likely a hard story to tell. As the viewer you think it’s going to be a sweet story about a guy and his bird, but then…
I read Aimee Bender’s short story End of the Line in her book Willful Creatures in 2005 and have been passionate about it since then. The story takes the audience on an emotional rollercoaster. It’s funny, dark, explores themes of loneliness, desire and abuse of power, in a short amount of time. There are a lot of tonal shifts, and I worked closely with screenwriter Joanne Giger to achieve this balance.

How did you as a director set out to balance the humor, the sadness, the kinda disturbing stuff, etc.?
I played the film visually and tonally very grounded (i.e., the rule of this world is that a big-person world and a tiny-person world live side by side), and from that I could push the humor and darkness. In the performances, I wanted there to be an emotional truth to what the characters are experiencing that feels human and real, despite the fantastical setting. So I played with that mix of feelings within this very grounded, surreal world.

The color is sort of muted in an old-timey kind of way. Can you talk about what you wanted from the color and mood?
I’m very sensitive to color and attention to detail. We wanted the film to feel timeless, although it is contemporary. Costume designer Shirley Kurata is amazing with color blocking and visual storytelling with color. Because Big Man’s world is more depressed and lonely, his tones are gray, the house is dark wood. As Big Man gains power, he wears more color. My DP has a very naturalistic approach with his lighting, so I wanted everything to feel very natural.

When we colored the film later in post, the approach was to do very little to the film, as it was captured in-camera. Production designer Justin Trask is a genius — from how he designed and built the giant penis (to feel less naturalistic) to the details of Little Man’s cage (his furniture, the giant bread crumb on a coin). We had a lot of fun exploring all the miniature props and details in this film.

How did you work with your editors? What did they cut on?
Because of the VFX, we edited on Adobe Premiere. I worked with editor Stephen Berger, who helped shape the film and did an amazing job doing the rough VFX comps in the edit. He is great with music and brought musical inspirations, which led to composer Pedro Bromfman’s entire saxophone score. Pedro is a big composer from Brazil and did my last documentary March of the Living. Editor Claudia Castello is incredible with performance, building the emotional arc of each character. She edited Fruitvale Station, Creed and was an editor on Black Panther. It was a great collaborative experience.

You had a lot of women on the crew. It seems like you went out of your way to find female talent. Why is this so important to you and the industry in general?
As a woman and a woman of Asian descent (I’m half Chinese), it’s important to me to be surrounded by a diverse group of collaborators and to hire with as much gender equality as possible. I love working with talented women and supporting women. The world is a diverse place. It’s important to me to have different perspectives reflected in filmmaking and representation. There is huge inequality in Hollywood’s hiring practices (4% of Hollywood feature films were directed by women last year), so it’s critical to hire talented, qualified women.

Do you think things are getting better for females in the industry, especially in the more technical jobs?
I’ve always hired female cinematographers and editors, and worked with Eva Flodstrom for VFX. With my friend/colleague Rachel Morrison becoming the first female cinematographer nominated for an Oscar, I hope things are changing for women, with more visibility and dialogue. Change can only happen by actually hiring talented women, like director Ryan Coogler (Black Panther), who works with female cinematographers and editors.

You’ve directed both narrative and documentary projects. Do you have a preference, or do you enjoy switching back and forth?
This film marks a new creative chapter and narrative direction in my work. I love my background in documentaries, but I am fully focused on narrative filmmaking at the moment.

How was Sundance for you and the film?
Sundance was an incredible experience and platform for the film. We were IndieWire’s Top 10 Must See Films. My creative team came out, including actors Simon Helberg and Vivian Bang. It was a blast!

Digital locations for Scandal/How to Get Away With Murder crossover

If you like your Thursday night television served up with a little Scandal and How to Get Away With Murder, then you likely loved the recent crossover episodes that paired the two shows’ leading ladies. VFX Legion, which has a brick-and-mortar office in LA but artists all over the world, was called on to create a mix of photorealistic CG environments and other effects that made it possible for the shows’ actors to appear in a variety of digital surroundings, including iconic locations in Washington, DC.

VFX Legion has handled all of the visual effects for both shows for almost three years, and is slated to work on the next season of Murder (this is Scandal’s last season). Over the years, the Shondaland productions have tasked the company with creating high shot counts for almost 100 episodes, each matching the overall look of a single show. However, the crossover episodes required visual effects that blended with two series that use different tools and each have their own look, presenting a more complex set of challenges.

For instance, Scandal is shot on an Arri Alexa camera, and How to Get Away With Murder on a Sony F55, at different color temps and under varying lighting conditions. DP preferences and available equipment required each environment to be shot twice, once with greenscreens for Scandal and then again using bluescreens for Murder.

The replication of the Supreme Court Building is central to the storyline. Building its exterior facade and interiors of the courtroom and rotunda digitally from the ground up were the most complex visual effects created for the episodes.

The process began during preproduction with VFX supervisor Matthew T. Lynn working closely with the client to get a full understanding of their vision. He collaborated with VFX Legion head of production, Nate Smalley, production manager Andrew Turner and coordinators Matt Noren and Lexi Sloan on streamlining workflow and crafting a plan that aligned with the shows’ budgets, schedules, and resources. Lynn spent several weeks on R&D, previs and mockups. Legion’s end-to-end approach was presented to the staffs of both shows during combined VFX meetings, and a plan was finalized.

A rough 3D model of the set was constructed from hundreds of reference photographs stitched together using Agisoft PhotoScan and photogrammetry. HDRI panoramas and 360-degree multiple-exposure photographs of the set were used to match the 3D lighting with the live-action footage. CG modeling and texturing artist Trevor Harder then added the fine details and created the finished 3D model.

CG supervisor Rommél S. Calderon headed up the team of modeling, texturing, tracking, layout and lighting artists that created Washington, DC’s Supreme Court Building from scratch.

“The computer-generated model of the exterior of the building was a beast, and scheduling was a huge job in itself,” explains Calderon. “Meticulous planning, resource management, constant communication with clients and spot-on supervision were crucial to combining the large volume of shots without causing a bottleneck in VFX Legion’s digital pipeline.”

Ken Bishop, VFX Legion’s lead modeler, ran into some interesting issues while working with footage of the lead characters Olivia Pope and Annalise Keating filmed on the concrete steps of LA’s City Hall. Since the Supreme Court’s staircase is marble, Bishop did a considerable amount of work on the texture, keeping the marble porous enough to blend with the concrete in this key shot.

Compositing supervisor Dan Short led his team through the process of merging the practical photography with renders created with Redshift and then seamlessly composited all of the shots using Foundry’s Nuke.

See their breakdown of the shots here:

Netflix’s Altered Carbon: the look, the feel, the post

By Randi Altman

Netflix’s Altered Carbon is a new sci-fi series set in a dystopian future where people are immortal thanks to something called “stacks,” which contain their entire essence — their personalities, their memories, everything. The one setback is that unless you are a Meth (one of the rich and powerful), you need to buy a “sleeve” (a body) for your stack, and it might not have any resemblance to your former self. It could be a different color, a different sex, a different age, a different everything. You have to take what you can get.

Based on a 2002 novel by Richard K. Morgan, it stars Swedish actor Joel Kinnaman.

Jill Bogdanowicz

We reached out to the show’s colorist, Jill Bogdanowicz, as well as post producer Allen Marshall Palmer to find out more about the show’s varied and distinctive looks.

The look has a very Blade Runner-type feel. Was that in homage to the films?
Bogdanowicz: The creators wanted a film noir look. Blade Runner is the same genre, but the show isn’t specifically an homage to Blade Runner.

Palmer: I’ll leave that for fans to dissect.

Jill, can you talk about your process? What tools did you use?
Bogdanowicz: I designed a LUT to create that film noir look before shooting. I actually provided a few options, and they chose my favorite one and used it throughout. After they shot everything and I had all 10 episodes in my bay, I got familiar with the content, wrapped my head around the story and came up with ideas to tell that story with color.

The show covers many different times and places so scenes needed to be treated visually to show audiences where the story is and what’s happened. I colored both HDR (Dolby Vision) and SDR passes using DaVinci Resolve.

I worked very closely with both DPs — Martin Ahlgren and Neville Kidd — in pre-timing the show, and they gave me a nice idea of what they were looking for so I had a great starting point. They were very close knit. The entire team on this project was an absolute pleasure, and it was a great creative collaboration, which comes through in the final product of the show.

The show is shot and posted like a feature and has a feature feel. Was that part of your marching orders?
Bogdanowicz: I’m primarily a features colorist, so I’m very familiar with the film noir look and heavy VFX, and that’s one reason I was included on this project. It was right up my alley.

Palmer: We approached Altered Carbon as a 10-part feature rather than a television series. I coined the term “feature episodic entertainment,” which describes what we were aspiring to — destination viewing instead of something merely disposable. In a world with so many viewing options, we wanted to command the viewer’s full attention, and fans are rewarded for that attention.

We were very concerned about how images, especially VFX, were going to look in HDR so we had weekly VFX approval sessions with Jill, our mastering colorist, in her color timing bay.

Executive producers and studio along with the VFX and post teams were able to sit together — adjusting color corrections if needed before giving final approval on shots. This gave us really good technical and creative quality control. Despite our initial concerns about VFX shots in HDR, we found that with vendors like Double Negative and Milk with their robust 16-bit EXR pipelines we weren’t “breaking” VFX shots when color correcting for HDR.

How did the VFX affect the workflow?
Bogdanowicz: Because I was brought on so early, the LUT I created was shared with the VFX vendors so they had a good estimation of the show’s contrast. That really helped them visualize the look of the show so that the look of the shots was pretty darn close by the time I got them in my bay.

Was there a favorite scene or scenes?
Bogdanowicz: There are so many spectacular moments, but the emotional core for me is in episode 104 when we see the beginning of the Kovacs and Quell love story in the past and how that love gives Kovacs the strength to survive in the present day.

Palmer: That’s a tough question! There are so many, it’s hard to choose. I think the episode that really jumps out is the one in which Joel Kinnaman’s character is being tortured and the content skips back and forth in time, changes and alternates between VR and reality. It was fun to create a different visual language for each space.

Can you talk about challenges in the process and how you overcame them?
Bogdanowicz: The show features a lot of VFX and they all need to look as real as possible, so I had to make sure they felt part of the worlds. Fortunately, VFX supervisor Everett Burrell and his team are amazing and the VFX is top notch. Coming up with different ideas and collaborating with producers James Middleton and Laeta Kalogridis on those ideas was a really fun creative challenge. I used the Sapphire VFX plugin for Resolve to heavily treat and texture VR looks in different ways.

Palmer: In addition to the data management challenges on the picture side, we were dealing with mixing in Dolby Atmos. It was very easy to get distracted with how great the Atmos mix sounds — the downmixes generally translated very well, but monitoring in 5.1 and 2.0 did reveal some small details that we wanted to adjust. Generally, we’re very happy with how both the picture and sound are translating into viewers’ homes.

Dolby Vision HDR is great at taking what’s in the color bay into the home viewing environment, but there are still so many variables in viewing set-ups that you can still end up chasing your own tail. It was great to see behind the scenes how dedicated Netflix is to providing the best picture and sound quality through the service.

The look of the AI hotel was so warm. I wanted to live there. Can you talk about that look?
Bogdanowicz: The AI hotel look was mostly done in design and lighting. I saw the warm practical lights and rich details in the architecture and throughout the hotel and ran with it. I just aimed to keep the look filmic and inviting.

What about the look of where the wealthy people lived?
Bogdanowicz: The Meth houses are above the clouds, so we kept the look very clean and cool with a lot of true whites and elegant color separation.

Seems like there were a few different looks within the show?
Bogdanowicz: The same LUT for the film noir look is used throughout the show, but the VR looks are very different. I used Sapphire to come up with different concepts and textures for the different VR looks, from rich quality of the high-end VR to the cheap VR found underneath a noodle bar.

Allen, can you walk us through the workflow from production to post?
Palmer: With the exception of specialty shots, the show was photographed on Alexa 65 — mostly in 5K mode, but occasionally in 6.5K and 4K for certain lenses. The camera is beautiful and a large part of the show’s cinematic look, but it generates a lot of data (about 1.9TB/hour for 5K) so this was the first challenge. The camera dictates using the Codex Vault system, and Encore Vancouver was up to the task for handling this material. We wanted to get the amount of data down for post, so we generated 4096×2304 ProRes 4444XQ “mezzanine” files, which we used for almost all of the show assembly and VFX pulls.
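The ~1.9TB/hour figure quoted above checks out with simple arithmetic. As a rough sketch (the article doesn't state the exact parameters, so the 5120x2880 resolution, 12-bit raw depth and 24fps are assumptions):

```python
# Back-of-envelope check of the ~1.9TB/hour data rate for Alexa 65 5K.
# Assumed, not stated in the article: 5120x2880 frame, 12-bit raw
# (1.5 bytes per pixel), 24 frames per second.
width, height = 5120, 2880
bytes_per_pixel = 12 / 8
fps = 24

bytes_per_second = width * height * bytes_per_pixel * fps
tb_per_hour = bytes_per_second * 3600 / 1e12
print(f"{tb_per_hour:.2f} TB/hour")  # ~1.91
```

Under those assumptions the math lands almost exactly on the quoted number, which also makes clear why a 4K ProRes 4444XQ mezzanine was such a relief for the conform and VFX-pull workflow.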

During production and post, all of our 4K files were kept online at Efilm using their portal system. This allowed us fast, automated access to the material so we could quickly do VFX pulls, manage color, generate 16-bit EXR frames and send those off to VFX vendors. We knew that time saved there was going to give us more time on the back end to work creatively on the shots so the Portal was a very valuable tool.

How many VFX shots did you average per episode? Seems like a ton, especially with the AI characters. Who provided those and what were those turnarounds like?
Palmer: There were around 2,300 visual effects shots during this season — probably less than most people would think because we built a large Bay City street inside a former newspaper printing facility outside of Vancouver. The shot turnaround varied depending on the complexity and where we were in the schedule. We were lucky that something like episode 1’s “limo ride” sequence was started very early on because it gave us a lot of time to refine our first grand views of Bay City. Our VFX supervisor Everett Burrell and VFX producer Tony Meagher were able to get us out in front of a lot of challenges like the amount of 3D work in the last two episodes by starting that work early on since we knew we would need those shots from the script and prep phase.

Review: HP’s lower-cost DreamColor Z24x display

By Dariush Derakhshani

So, we all know how important a color-accurate monitor is in making professional-level graphics, right? Right?!? Even at the most basic level, when you’re stalking online for the perfect watch band for your holiday present of a smart watch, you want the orange band you see in the online ad to be what you get when it arrives a few days later. Even if your wife thinks orange doesn’t suit you, and makes you look like “you’re trying too hard.”

Especially as a content developer, you want to know that what you’re looking at is an accurate representation of the image. Ever walk into a Best Buy and see multiple screens showing the same content but with wildly different color? You can’t have that discrepancy working as a pro, especially in collaboration; you need color accuracy. In my own experience, that position has been filled by HP’s 10-bit DreamColor displays for many years now, but not everyone is awash in bitcoins, and a price tag of over $1,200 is sometimes hard to justify, even for a studio professional.

Enter HP’s DreamColor Z24x display at half the price, coming in around $550 online. Yes, DreamColor for half the cost. That’s pretty significant. For the record, I haven’t used a 24-inch monitor since the dark ages, when Lost was the hot TV show. I’ve been fortunate enough to be running at 27 inches and higher, so there was a little shock when I started using the Z24x HP sent me for review. But this is something I quickly got used to.

With my regular 32-inch 4K display still my primary — so I can fit loads of windows all over the place — I used this DreamColor screen as my secondary display, primarily to check output for my Adobe After Effects comps and Adobe Premiere Pro edits, and to hold my render view window as I develop shaders and lighting in Autodesk Maya. I felt comfortable knowing the images I shared with my colleagues across town would be seen as I intended them, leveling the playing field when working collaboratively (as long as everyone is on the same LUT and color space). Speaking of color spaces, the Z24x hits 100% of sRGB, 99% of AdobeRGB and 96% of DCI-P3, which is just slightly under HP’s Z27x DreamColor. It is, however, slightly faster, with a 6ms response rate.

The Z24x has a 24-inch IPS panel from LG that exhibits color in 10-bit, like its bigger 27-inch Z27x sibling. This gives you over a billion colors, which I have personally verified by counting them all — that was one long weekend, I can tell you. Unlike the highest-end DreamColor screens, though, the Z24x dithers up from 8-bit to 10-bit (called 8-bit+FRC). This means it’s better than an 8-bit color display, for sure, but not quite up to real 10-bit, making it color accurate but not color critical. HP’s implementation of dithering is quite good, when subjectively compared to my full 10-bit main display. Frankly, a lot of screens that claim 10-bit may actually be 8-bit+FRC anyway!
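Counting jokes aside, the "over a billion colors" claim is just bit arithmetic: 10 bits per channel across three channels. A quick sanity check:

```python
# 10-bit color: each of R, G, B has 2**10 = 1024 levels.
bits_per_channel = 10
levels = 2 ** bits_per_channel
colors = levels ** 3  # R x G x B combinations

print(f"{colors:,}")  # 1,073,741,824 -- just over a billion
# Compare 8-bit: 256**3 = 16,777,216 (~16.8 million colors)
```

The same math shows why true 10-bit matters for grading: 64x more distinct values per gradient than 8-bit, which is what FRC dithering approximates temporally rather than displays natively.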

While the Z27x gives you 2560×1440, as you’d expect of most 27-inch displays if not full-on 4K, the Z24x is at a comfortable 1920×1200, just enough for a full 1080p image and a little room for a slider or info bar. Being the res snob that I am, I had wondered if that was just too low, but at 24 inches I don’t think you would want a higher resolution, even if you’re sitting only 14 inches away from it. And this is a sentiment echoed by the folks at HP, who consulted with many of their professional clients to build this display. That gives a pixel density of about 94PPI, a bit lower than the 109PPI of the Z27x. This density is about the same as a 1080p HD display at 27 inches, so it’s still crisp and clean.
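Those PPI figures follow from the standard formula: diagonal pixel count divided by diagonal screen size. A small sketch reproduces both numbers:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal resolution over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(1920, 1200, 24)))  # Z24x -> 94
print(round(ppi(2560, 1440, 27)))  # Z27x -> 109
```

The same function confirms the "about the same as 1080p at 27 inches" comparison: 1920×1080 on a 27-inch diagonal works out to roughly 82PPI, in the same crisp-at-arm's-length ballpark.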

Viewing angles are good at about 178 degrees, and the screen is matte, with an anti-glare coating, making it easier to stare at without blinking for 10 hours at a clip, as digital artists usually do. Compared to my primary display, this HP’s coating was more matte and still gave me a richer black in comparison, which I liked to see.

Connection options are fairly standard with two DisplayPorts, one HDMI, and one DVI dual link for anyone still living in the past. You also get four USB ports and an analog 3.5mm audio jack if you want to drive some speakers, since you can’t from your phone anymore (Apple, I’m looking at you).

Summing Up
So while 24 inches is a bit small for my tastes in a display, I am seriously impressed at the street price of the Z24x, allowing a lot more pros and semi-pros to get the DreamColor accuracy HP offers at half the price. While I wouldn’t recommend color grading a show on the Z24x, this DreamColor does a nice job of bringing a higher level of color confidence at an attractive price. As a secondary display, the Z24x is a nice addition to the workflow of an artist with budget in mind — or one with a mean, orange-watch-band-hating spouse.


Dariush Derakhshani is a VFX supervisor and educator in Southern California. You can follow his random tweets at @koosh3d.

Kathrin Lausch joins Uppercut as EP

New York post shop Uppercut has added Kathrin Lausch as executive producer. Lausch has over two decades of experience as an executive producer for top production and post production companies such as MPC, Ntropic, B-Reel, Nice Shoes, Partizan and Compass Films, among others. She has led shops on the front lines at the outset of digital, branded content, reality television and brand-direct production.

“I joined Uppercut after being very impressed with Micah Scarpelli’s clear understanding of the advertising market, its ongoing changes and his proactive approach to offer his services accordingly,” explains Lausch. “The new advertising landscape is offering up opportunities for boutique shops like Uppercut, and interesting conversations and relationships can come out of having a clear and focused offering. It was important to me to be part of a team that embraces change and thrives on being a part of it.”

Half French, half German-born, Lausch followed dual pursuits in law and art in NYC before finding her way to the world of production. She launched Passport Films, which later became Compass Films. After selling the company, she followed the onset of the digital advertising marketplace, landing with B-Reel. She made the shift to post production, further embracing the new digital landscape as executive producer at Nice Shoes and Ntropic before landing as head of new business at MPC.