
Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. Here is the trailer. If you like what you see, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond…

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex an idea I was writing would be to create. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically — there were no “fix it in post” scenarios on this film. I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and cast to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel — with each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact that my DP, Adam Batchelor — who had shot Sync with me, as well as the proof-of-concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out. More importantly, he was able to get cinematic imagery out of those cameras.

Blackmagic was very supportive of the film, and has been of my career since my short films, so they came on as one of the executive producers. No one had ever shot a full feature film using just Blackmagic cameras, and we used a Resolve pipeline right through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups stemmed from the fact that we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.
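Resolve generates those proxies internally, but for readers who want to approximate the same mezzanine/proxy idea outside Resolve, here is a minimal sketch that batch-transcodes half-resolution ProRes 4444 proxies with ffmpeg. The directory layout, scale factor and settings are illustrative assumptions, not HaZ’s actual setup.

```python
# Hypothetical sketch: batch-generate half-resolution ProRes 4444 proxies
# with ffmpeg, approximating the proxy step Resolve performs internally.
# Paths and the half-res choice are assumptions, not the production setup.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("footage/originals")
PROXY_DIR = Path("footage/proxies")
PROXY_DIR.mkdir(parents=True, exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=iw/2:ih/2",   # half-resolution proxy
        "-c:v", "prores_ks",        # ffmpeg's ProRes encoder
        "-profile:v", "4444",       # keep the ProRes 4444 profile
        "-c:a", "copy",             # pass audio through untouched
        str(PROXY_DIR / clip.name),
    ], check=True)
```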

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come on board to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I believe is the first HDR treatment done to an indie film.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was possible because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14, which made the handover as simple as exporting my Resolve project for Max, who would load it up, relink the material and work from there.

Max kept everything photographic, like a documentary, but with a slight cinematic flair. The big challenge was matching the various sources of material, from the different Blackmagic cameras (URSA Mini Pro, the Production Camera and the Pocket Cinema Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled the Human 2.0 scenes in the transcode bay, with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in the wake-up lab, and Charles Wilcocks was the Human 2.0 CG supervisor, who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with, because I am hands-on and I know how long things take. But on the flip side that had its advantages: they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and we made sure to lock that down before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff of Battlestar Galactica and The Flash), was greenlit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew had worked on my short films, including some of the key cast, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of other paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle: getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP and more contracts. The advice I got from other filmmakers is that the right distributor is a big part of how your film will be released. To me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging, because they had to look photoreal to fit the documentary narrative. We first tried to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity so that object tracking would work as accurately as possible. We also shot reference footage from multiple angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 was a human consciousness in a synthetic shell, breathing didn’t make sense, so we started compensating by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to have used some form of greenscreen or bluescreen, but we didn’t. In fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There, an actor wearing an astronaut suit stood in a bright photography room lit with overexposed lighting to give a surreal feeling, and we used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

V-Ray GPU is Chaos Group’s new GPU rendering architecture

Chaos Group has redesigned its V-Ray RT product. The new V-Ray GPU rendering architecture, according to the company, effectively doubles the speed of production rendering for film, broadcast and design artists. It represents a redesign of V-Ray’s kernel structure aimed at delivering both high speed and accuracy.

Chaos Group has renamed V-Ray RT to V-Ray GPU, wanting to establish the latter as a professional production renderer capable of supporting volumetrics, advanced shading and other smart tech coming down the road.

Current internal tests have V-Ray GPU running 80 percent faster on Nvidia’s Titan V, a big gain from previous benchmarks on the Titan Xp, and up to 10-15x faster than an Intel Core i7-7700K, with the same high level of accuracy across interactive and production renders. (For its testing, Chaos Group uses a battery of production scenes to benchmark each release.)

“V-Ray GPU might be the biggest speed leap we’ve ever made,” says Blagovest Taskov, V-Ray GPU lead developer at Chaos Group. “Redesigning V-Ray GPU to be modular makes it much easier for us to exploit the latest GPU architectures and to add functionality without impacting performance. With our expanded feature set, V-Ray GPU can be used in many more production scenarios, from big-budget films to data-heavy architecture projects, while providing more speed than ever before.”

Representing over two years of dedicated R&D, V-Ray GPU builds on nine years of GPU-driven development in V-Ray. New gains for production artists include:

• Volume Rendering – Fog, smoke and fire can be rendered with the speed of V-Ray GPU. It’s compatible with V-Ray Volume Grid, which supports OpenVDB, Field3D and Phoenix FD volume caches.
• Adaptive Dome Light – Cleaner image-based lighting is now faster and even more accurate.
• V-Ray Denoising – Offering GPU-accelerated denoising across render elements and animations.
• Nvidia AI Denoiser – Fast, real-time denoising based on Nvidia OptiX AI-accelerated denoising technology.
• Interface Support – Instant filtering of GPU-supported features lets artists know what’s available in V-Ray GPU (starting within 3ds Max).

V-Ray GPU will be made available as part of the next update to the V-Ray Next for 3ds Max beta.


Digital locations for Scandal/How to Get Away With Murder crossover

If you like your Thursday night television served up with a little Scandal and How to Get Away With Murder, then you likely loved the recent crossover episodes that paired the two shows’ leading ladies. VFX Legion, which has a brick-and-mortar office in LA but artists all over the world, was called on to create a mix of photorealistic CG environments and other effects that made it possible for the shows’ actors to appear in a variety of digital surroundings, including iconic locations in Washington, DC.

VFX Legion has handled all of the visual effects for both shows for almost three years and is slated to work on the next season of Murder (this is Scandal’s last season). Over the years, the Shondaland productions have tasked the company with creating high shot counts for almost 100 episodes, each matching the overall look of a single show. The crossover episodes, however, required visual effects that blended with two series that use different tools and each have their own look, presenting a more complex set of challenges.

For instance, Scandal is shot on an Arri Alexa camera, and How to Get Away With Murder on a Sony F55, at different color temps and under varying lighting conditions. DP preferences and available equipment required each environment to be shot twice, once with greenscreens for Scandal and then again using bluescreens for Murder.

The replication of the Supreme Court Building is central to the storyline. Building its exterior facade and the interiors of the courtroom and rotunda digitally from the ground up was the most complex visual effects work created for the episodes.

The process began during preproduction with VFX supervisor Matthew T. Lynn working closely with the client to get a full understanding of their vision. He collaborated with VFX Legion head of production, Nate Smalley, production manager Andrew Turner and coordinators Matt Noren and Lexi Sloan on streamlining workflow and crafting a plan that aligned with the shows’ budgets, schedules, and resources. Lynn spent several weeks on R&D, previs and mockups. Legion’s end-to-end approach was presented to the staffs of both shows during combined VFX meetings, and a plan was finalized.

A rough 3D model of the set was constructed via photogrammetry from hundreds of reference photographs stitched together in Agisoft PhotoScan. HDRI panoramas and 360-degree multiple-exposure photographs of the set were used to match the 3D lighting with the live-action footage. CG modeling and texturing artist Trevor Harder then added the fine details and created the finished 3D model.
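Legion’s pipeline runs through PhotoScan, but the underlying photogrammetry step — matching features across overlapping photos, recovering camera poses and triangulating 3D points — can be illustrated with a minimal two-view sketch using OpenCV. The file names and camera intrinsics here are hypothetical placeholders, not production data.

```python
# Minimal two-view photogrammetry sketch (illustrative only, not
# Legion's PhotoScan pipeline): match features between two reference
# photos, recover the relative camera pose and triangulate sparse
# 3D points. The intrinsics matrix K is an assumed placeholder.
import cv2
import numpy as np

img1 = cv2.imread("ref_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("ref_002.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[2400.0, 0, 960], [0, 2400.0, 540], [0, 0, 1]])

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Keep only matches that pass Lowe's ratio test
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Essential matrix -> relative pose -> sparse point cloud
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T  # seed of a rough 3D model
print(f"{len(points3d)} sparse points triangulated")
```

A production tool repeats this across hundreds of photos, bundle-adjusts the camera poses and densifies the sparse cloud into a mesh; the two-view case just shows the core math.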

CG supervisor Rommél S. Calderon headed up the team of modeling, texturing, tracking, layout and lighting artists that created Washington, DC’s Supreme Court Building from scratch.

“The computer-generated model of the exterior of the building was a beast, and scheduling was a huge job in itself,” explains Calderon. “Meticulous planning, resource management, constant communication with clients and spot-on supervision were crucial to combining the large volume of shots without causing a bottleneck in VFX Legion’s digital pipeline.”

Ken Bishop, VFX Legion’s lead modeler, ran into some interesting issues while working with footage of the lead characters Olivia Pope and Annalise Keating filmed on the concrete steps of LA’s City Hall. Since the Supreme Court’s staircase is marble, Bishop did a considerable amount of work on the texture, keeping the marble porous enough to blend with the concrete in this key shot.

Compositing supervisor Dan Short led his team through the process of merging the practical photography with renders created with Redshift and then seamlessly composited all of the shots using Foundry’s Nuke.



Netflix’s Altered Carbon: the look, the feel, the post

By Randi Altman

Netflix’s Altered Carbon is a new sci-fi series set in a dystopian future where people are immortal thanks to something called “stacks,” which contain their entire essence — their personalities, their memories, everything. The one setback is that unless you are a Meth (one of the rich and powerful), you need to buy a “sleeve” (a body) for your stack, and it might not have any resemblance to your former self. It could be a different color, a different sex, a different age, a different everything. You have to take what you can get.

Based on a 2002 novel by Richard K. Morgan, it stars Swedish actor Joel Kinnaman.

Jill Bogdanowicz

We reached out to the show’s colorist, Jill Bogdanowicz, as well as post producer Allen Marshall Palmer to find out more about the show’s varied and distinctive looks.

The look has a very Blade Runner-type feel. Was that in homage to the films?
Bogdanowicz: The creators wanted a film noir look. Blade Runner is the same genre, but the show isn’t specifically an homage to Blade Runner.

Palmer: I’ll leave that for fans to dissect.

Jill, can you talk about your process? What tools did you use?
Bogdanowicz: I designed a LUT to create that film noir look before shooting. I actually provided a few options, and they chose my favorite one and used it throughout. After they shot everything and I had all 10 episodes in my bay, I got familiar with the content, wrapped my head around the story and came up with ideas to tell that story with color.

The show covers many different times and places so scenes needed to be treated visually to show audiences where the story is and what’s happened. I colored both HDR (Dolby Vision) and SDR passes using DaVinci Resolve.

I worked very closely with both DPs — Martin Ahlgren and Neville Kidd — in pre-timing the show, and they gave me a good idea of what they were looking for, so I had a great starting point. They were very close-knit. The entire team on this project was an absolute pleasure, and it was a great creative collaboration, which comes through in the final product.

The show is shot and posted like a feature and has a feature feel. Was that part of your marching orders?
Bogdanowicz: I’m primarily a features colorist, so I’m very familiar with the film noir look and heavy VFX, and that’s one reason I was included on this project. It was right up my alley.

Palmer: We approached Altered Carbon as a 10-part feature rather than a television series. I coined the term “feature episodic entertainment,” which describes what we were aspiring to — destination viewing instead of something merely disposable. In a world with so many viewing options, we wanted to command the viewer’s full attention, and fans are rewarded for that attention.

We were very concerned about how images, especially VFX, were going to look in HDR so we had weekly VFX approval sessions with Jill, our mastering colorist, in her color timing bay.

The executive producers and the studio, along with the VFX and post teams, were able to sit together, adjusting color corrections if needed before giving final approval on shots. This gave us really good technical and creative quality control. Despite our initial concerns about VFX shots in HDR, we found that with vendors like Double Negative and Milk, with their robust 16-bit EXR pipelines, we weren’t “breaking” VFX shots when color correcting for HDR.

How did the VFX affect the workflow?
Bogdanowicz: Because I was brought on so early, the LUT I created was shared with the VFX vendors, so they had a good sense of the show’s contrast. That really helped them visualize the look of the show, and the shots were pretty darn close by the time I got them in my bay.

Was there a favorite scene or scenes?
Bogdanowicz: There are so many spectacular moments, but the emotional core for me is in episode 104 when we see the beginning of the Kovacs and Quell love story in the past and how that love gives Kovacs the strength to survive in the present day.

Palmer: That’s a tough question! There are so many, it’s hard to choose. I think the episode that really jumps out is the one in which Joel Kinnaman’s character is being tortured and the content skips back and forth in time, changes and alternates between VR and reality. It was fun to create a different visual language for each space.

Can you talk about challenges in the process and how you overcame them?
Bogdanowicz: The show features a lot of VFX, and they all need to look as real as possible, so I had to make sure they felt part of the show’s worlds. Fortunately, VFX supervisor Everett Burrell and his team are amazing, and the VFX are top-notch. Coming up with different ideas and collaborating with producers James Middleton and Laeta Kalogridis on those ideas was a really fun creative challenge. I used the Sapphire plugin for Resolve to heavily treat and texture the VR looks in different ways.

Palmer: In addition to the data management challenges on the picture side, we were dealing with mixing in Dolby Atmos. It was very easy to get distracted by how great the Atmos mix sounds — the downmixes generally translated very well, but monitoring in 5.1 and 2.0 did reveal some small details that we wanted to adjust. Generally, we’re very happy with how both the picture and sound are translating into viewers’ homes.

Dolby Vision HDR is great at taking what’s in the color bay into the home viewing environment, but there are still so many variables in viewing setups that you can end up chasing your own tail. It was great to see behind the scenes how dedicated Netflix is to providing the best picture and sound quality through the service.

The look of the AI hotel was so warm. I wanted to live there. Can you talk about that look?
Bogdanowicz: The AI hotel look was mostly done in design and lighting. I saw the warm practical lights and rich details in the architecture and throughout the hotel and ran with it. I just aimed to keep the look filmic and inviting.

What about the look of where the wealthy people lived?
Bogdanowicz: The Meth houses are above the clouds, so we kept the look very clean and cool with a lot of true whites and elegant color separation.

Seems like there were a few different looks within the show?
Bogdanowicz: The same LUT for the film noir look is used throughout the show, but the VR looks are very different. I used Sapphire to come up with different concepts and textures for the different VR looks, from the rich quality of the high-end VR to the cheap VR found underneath a noodle bar.

Allen, can you walk us through the workflow from production to post?
Palmer: With the exception of specialty shots, the show was photographed on the Alexa 65 — mostly in 5K mode, but occasionally in 6.5K and 4K for certain lenses. The camera is beautiful and a large part of the show’s cinematic look, but it generates a lot of data (about 1.9TB/hour for 5K), so this was the first challenge. The camera dictates using the Codex Vault system, and Encore Vancouver was up to the task of handling this material. We wanted to get the amount of data down for post, so we generated 4096×2304 ProRes 4444 XQ “mezzanine” files, which we used for almost all of the show assembly and VFX pulls.
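To put that 1.9TB/hour figure in perspective, a quick back-of-the-envelope calculation shows why a mezzanine tier matters; the shoot-day and camera assumptions below are hypothetical, not production numbers.

```python
# Back-of-the-envelope storage math for the Alexa 65 rate quoted above
# (~1.9 TB/hour in 5K). The rolling hours, camera count and shoot days
# are hypothetical assumptions, not figures from the production.
TB_PER_HOUR = 1.9

rolling_hours_per_day = 4    # assumed actual recording time per camera
cameras = 2                  # assumed
shoot_days = 100             # assumed for a 10-episode season

per_day = TB_PER_HOUR * rolling_hours_per_day * cameras
season = per_day * shoot_days
print(f"~{per_day:.1f} TB/day, ~{season:,.0f} TB of raw negative")
# -> ~15.2 TB/day, ~1,520 TB for the season: hence the ProRes mezzanine.
```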

During production and post, all of our 4K files were kept online at Efilm using their Portal system. This allowed us fast, automated access to the material so we could quickly do VFX pulls, manage color, generate 16-bit EXR frames and send those off to VFX vendors. We knew that time saved there was going to give us more time on the back end to work creatively on the shots, so the Portal was a very valuable tool.

How many VFX shots did you average per episode? Seems like a ton, especially with the AI characters. Who provided those and what were those turnarounds like?
Palmer: There were around 2,300 visual effects shots this season — probably fewer than most people would think, because we built a large Bay City street inside a former newspaper printing facility outside Vancouver. The shot turnaround varied depending on the complexity and where we were in the schedule. We were lucky that something like episode 1’s “limo ride” sequence was started very early on, because it gave us a lot of time to refine our first grand views of Bay City. Our VFX supervisor Everett Burrell and VFX producer Tony Meagher got us out in front of a lot of challenges, like the amount of 3D work in the last two episodes, by starting that work early, since we knew from the script and prep phase that we would need those shots.


Behind the Title: Freefolk MD Justine White

NAME: Justine White

COMPANY: Freefolk

CAN YOU DESCRIBE YOUR COMPANY?
We are a high-end visual effects and post production company in our 15th year. We are independently owned and work hard at providing a creative environment and culture where talent can flourish. We work across commercials, TV, film and content in London and New York.

WHAT’S YOUR JOB TITLE?
Managing Director.

Freefolk’s longform team in London.

WHAT DOES THAT ENTAIL?
As MD, I have an overall view of the whole company across London and New York, so I drive everything from business development, strategy and PR to keeping tabs on projects going through the company. Day to day I have regular meetings with heads of department, PR, HR, finance and technology internally, and externally I am always in touch with clients, suppliers and industry bodies — I am always on the lookout for new talent.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
The role of MD is so broad; there are some things you are obliged to do running a company, but it’s the other aspects of my job that make it my own version of being an MD.

Working in films and commercials means I’m surrounded by creativity and innovation, so I tend to approach the business from a creative angle. An example is when we had a vacant top floor and we turned it into a pop-up yoga studio. This went down so well with clients and staff alike that I’d love to be able to offer it permanently.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The great thing about project-based work is you are always working with different people on new ideas. We’ve worked alongside some of the best directing talent in the world, award-winning creatives and producers. If they are trying to push boundaries, it challenges us to do so too. If I’m not being challenged I get bored quite easily.

WHAT’S YOUR LEAST FAVORITE?
Admin!!!!!

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Definitely the morning – PC (post coffee).

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
An international superstar DJ, of course.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I was introduced to the Quantel Paintbox graphics system at a young age during work experience… even though I was truly impressed by what was happening in the suite, afterwards I told my parents quite explicitly that, “I would never work in advertising painting out women’s wrinkles.” Now cut to the current day…

The Alienist

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We’ve recently carried out VFX on a new TV series called The Alienist for TNT. I’m a fan of the book, so it’s really exciting to be working on it! It looks beautiful and we’re so proud of the work we’ve done.

Cadbury Mum, a TVC directed by Frederic Planchon, almost brings you to tears; it is so well done and quite a lovely film. I also love the new Zelle commercials, starring Hamilton actor Daveed Diggs. Great scripts, and the grade our New York colorist Paul Harrison did is stunning.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
A commercial we did for mobile network O2 featured some very sweet goslings. Whenever I show it to people, they don’t even realize the goslings are all CGI. Not only that, it fulfilled a long-time goal for one of our artists, who loves creature work, and it won a host of awards for VFX.

On a personal level, setting up our New York office was a major achievement for me.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
A good set of headphones to listen to my music to survive the commute across London to work.

I hate to admit, but my iPhone, which should really be called anything but a phone. Does anyone actually talk anymore?

I know this is cheating, but all VFX software and hardware, because if it didn’t exist I wouldn’t be making a living!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
WhatsApp!!! My mummy group was a lifeline during my maternity leave. We also have a work group called the Freefolk Daily Review, which started as a burger review group and morphed into food reviews and just about anything.

Instagram was a revelation when it first launched. There was a real focus on quality photographic images… seeing the world through everyday people’s immediate images has opened our minds.

Workplace is Facebook’s ESN (enterprise social network) — we use it at Freefolk to share links, show work we’ve done or liked across the two sites and to discuss work in various groups. The idea is to reduce internal email traffic and focus more efficiently.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
When I was a junior Flame artist often working nights, I’d listen to Air, Tosca, Massive Attack, Morcheeba, Thievery Corporation, etc., kind of chilled, but with a slow beat.

If I was doing that job now it would be Mixcloud shows all the way. Fleetmac Wood, Greg Wilson, Bill Brewster and Glitterbox DJ mixes.

Nowadays I’m on a production floor where you’re flitting from email to calls to meetings, so “radio-friendly” music works well. Also, ‘90s pop seems to be on trend in the office right now, but it changes daily depending on who’s on the controls.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Bananagrams!!! It’s a team-based word game, like Scrabble, that we have sitting around in the office. It gives you a quick, high-speed competitive rush, usually ending in laughter or screaming. I highly recommend every office gets one!


Review: HP’s lower-cost DreamColor Z24x display

By Dariush Derakhshani

So, we all know how important a color-accurate monitor is in making professional-level graphics, right? Right?!? Even at the most basic level, when you’re stalking online for the perfect watch band for your holiday present of a smart watch, you want the orange band you see in the online ad to be what you get when it arrives a few days later. Even if your wife thinks orange doesn’t suit you, and makes you look like “you’re trying too hard.”

Especially as a content developer, you want to know that what you’re looking at is an accurate representation of the image. Ever walk into a Best Buy and see multiple screens showing the same content, but with wildly differing color? You can’t have that discrepancy working as a pro, especially in collaboration; you need color accuracy. In my own experience, that position has been filled by HP’s 10-bit DreamColor displays for many years now, but not everyone is awash in bitcoins, and a price tag of over $1,200 is sometimes hard to justify, even for a studio professional.

Enter HP’s DreamColor Z24x display at half the price, coming in at around $550 online. Yes, DreamColor for half the cost. That’s pretty significant. For the record, I haven’t used a 24-inch monitor since the dark ages, when Lost was the hot TV show. I’ve been fortunate enough to be running 27-inch and higher, so there was a little shock when I started using the Z24x HP sent me for review. But it’s something I quickly got used to.

With my regular 32-inch 4K display still my primary — so I can fit loads of windows all over the place — I used this DreamColor screen as my secondary display, primarily to check output for my Adobe After Effects comps and Adobe Premiere Pro edits, and to hold my render view window as I develop shaders and lighting in Autodesk Maya. I felt comfortable knowing the images I shared with my colleagues across town would be seen as I intended them, leveling the playing field when working collaboratively (as long as everyone is on the same LUT and color space). Speaking of color spaces, the Z24x hits 100% of sRGB, 99% of AdobeRGB and 96% of DCI-P3, which is just slightly under HP’s Z27x DreamColor. It is, however, slightly faster, with a 6ms response rate.

The Z24x has a 24-inch IPS panel from LG that exhibits color in 10-bit, like its bigger 27-inch Z27x sibling. This gives you over a billion colors, which I have personally verified by counting them all — that was one long weekend, I can tell you. Unlike the highest-end DreamColor screens, though, the Z24x dithers up from 8-bit to 10-bit (called 8-bit+FRC). This means it’s better than an 8-bit color display, for sure, but not quite up to real 10-bit, making it color accurate but not color critical. HP’s implementation of dithering is quite good when subjectively compared to my full 10-bit main display. Frankly, a lot of screens that claim 10-bit may actually be 8-bit+FRC anyway!
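FRC (frame rate control) earns that verdict by flickering each pixel between the two nearest 8-bit levels so that the time-average lands on the 10-bit target. A toy simulation of the idea (not HP’s actual algorithm) looks like this:

```python
# Toy simulation of 8-bit+FRC temporal dithering (not HP's algorithm):
# flicker between the two nearest 8-bit levels so the average over a
# few frames lands on the 10-bit target code value.
import numpy as np

def frc_frames(value_10bit: int, n_frames: int = 4) -> np.ndarray:
    base = value_10bit // 4          # nearest lower 8-bit level
    remainder = value_10bit % 4      # fraction toward the next level
    frames = np.full(n_frames, base)
    frames[:remainder] += 1          # raise that fraction of the frames
    return frames

target = 517                         # arbitrary 10-bit code value
frames = frc_frames(target)
print(frames, "->", frames.mean() * 4)   # [130 129 129 129] -> 517.0
```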

While the Z27x gives you 2560×1440, as you’d expect of most 27-inch displays if not full-on 4K, the Z24x sits at a comfortable 1920×1200, just enough for a full 1080p image and a little room for a slider or info bar. Being the res snob that I am, I had wondered if that was just too low, but at 24 inches I don’t think you would want a higher resolution, even if you’re sitting only 14 inches away from it. And this is a sentiment echoed by the folks at HP, who consulted with many of their professional clients to build this display. That gives a pixel density of about 94PPI, a bit lower than the 109PPI of the Z27x. This density is about the same as a 1080p HD display at 27 inches, so it’s still crisp and clean.
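Those density figures fall straight out of the geometry: pixels per inch is simply the diagonal resolution divided by the diagonal size. A quick check of the review’s numbers:

```python
# PPI = diagonal resolution in pixels / diagonal size in inches,
# checked against the panel figures quoted in the review.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

print(f"Z24x: {ppi(1920, 1200, 24):.0f} PPI")  # ~94
print(f"Z27x: {ppi(2560, 1440, 27):.0f} PPI")  # ~109
```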

Viewing angles are good at about 178 degrees, and the screen is matte, with an anti-glare coating, making it easier to stare at without blinking for 10 hours at a clip, as digital artists usually do. Compared to my primary display, this HP’s coating is more matte, yet it still gave me richer blacks, which I liked to see.

Connection options are fairly standard with two DisplayPorts, one HDMI, and one DVI dual link for anyone still living in the past. You also get four USB ports and an analog 3.5mm audio jack if you want to drive some speakers, since you can’t from your phone anymore (Apple, I’m looking at you).

Summing Up
So while 24 inches is a bit small for my taste in a display, I am seriously impressed by the street price of the Z24x, which allows a lot more pros and semi-pros to get the DreamColor accuracy HP offers at half the price. While I wouldn’t recommend color grading a show on the Z24x, this DreamColor does a nice job of bringing a higher level of color confidence at an attractive price. As a secondary display, the Z24x is a nice addition to the workflow of an artist with budget in mind — or one who has a mean, orange-watch-band-hating spouse.


Dariush Derakhshani is a VFX supervisor and educator in Southern California. You can follow his random tweets at @koosh3d.


Kathrin Lausch joins Uppercut as EP

New York post shop Uppercut has added Kathrin Lausch as executive producer. Lausch has over two decades of experience as an executive producer for top production and post production companies such as MPC, Ntropic, B-Reel, Nice Shoes, Partizan and Compass Films. She has led shops on the front lines at the outset of digital, branded content, reality television and brand-direct production.

“I joined Uppercut after being very impressed with Micah Scarpelli’s clear understanding of the advertising market, its ongoing changes and his proactive approach to offer his services accordingly,” explains Lausch. “The new advertising landscape is offering up opportunities for boutique shops like Uppercut, and interesting conversations and relationships can come out of having a clear and focused offering. It was important to me to be part of a team that embraces change and thrives on being a part of it.”

Half French and half German, Lausch followed dual pursuits in law and art in NYC before finding her way into the world of production. She launched Passport Films, which later became Compass Films. After selling the company, she followed the rise of the digital advertising marketplace, landing at B-Reel. She then made the shift to post production, further embracing the new digital landscape as an executive producer at Nice Shoes and Ntropic before landing as head of new business at MPC.


Oscar-winner Jeff White is now CD at ILM Vancouver

Oscar-winning visual effects supervisor Jeff White has been named creative director of Industrial Light & Magic’s Vancouver studio. A 16-year ILM veteran, White will work directly with ILM Vancouver executive in charge Randal Shore.

Recently, the Academy of Motion Picture Arts and Sciences honored White and three colleagues (Jason Smith, Rachel Rose and Mike Jutan) with a Technical Achievement Award for his original design of ILM’s procedural rigging system, Block Party. He is also nominated for an Academy Award for visual effects for his contribution to Kong: Skull Island.

White joined Industrial Light & Magic in 2002 as a creature technical director, working on a variety of films, including the Academy Award-winning Pirates of the Caribbean: Dead Man’s Chest, as well as War of the Worlds and Star Wars: Episode III: Revenge of the Sith.

In 2012, White served as the ILM VFX supervisor on Marvel’s The Avengers, directed by Joss Whedon, and earned both Oscar and BAFTA nominations for his visual effects work. He also received the Hollywood Film Award for visual effects for the work. White was also a VFX supervisor on Duncan Jones’ 2016 sci-fi offering, Warcraft, based on the well-known video game World of Warcraft by Blizzard Entertainment.

Says White, “Having worked with many of the artists here in Vancouver on a number of films, including Kong: Skull Island, I know firsthand the amazing artistic and technical talent we have to offer and I couldn’t be more excited to share what I know and collaborate with them on all manner of projects.”

Initially conceived as a satellite office when it opened in 2012, ILM’s Vancouver studio became a permanent fixture in the company’s operation in 2014. In 2017, the studio nearly doubled in size, adding a second building adjacent to its original location in the Gastown district. The studio has spearheaded ILM’s work on such films as Valerian and the City of a Thousand Planets, Only the Brave and most recently, Ryan Coogler’s Black Panther and Ava DuVernay’s A Wrinkle in Time.


Trio from Reel FX and Shilo team on VFX/live-action studio

The commercial division of digital studio Reel FX has teamed up with Shilo founder/executive creative director Jose Sebastian Gomez to launch strategic creative group ATK PLN. Emmy Award-winner Gomez will lead the creative vision for the studio, joined by former Digital Domain head of production Jim Riche as executive producer, with overall strategy led by Reel FX’s David Bates as managing director. The trio will draw on their combined expertise across VFX, design, production, interactive media, branding and marketing to offer in-house services from concept to final delivery.

ATK PLN will work across design, animation and live action. The team has already created work for AT&T, Fox Racing and MADD — all a fusion of live action and VFX. ATK PLN creatives will work between its new Hollywood studio, Montreal and Dallas locations. ATK PLN will also partner with sister companies Flight School and Reel FX Animation.

AT&T

In terms of tools, the company uses a lot of the traditional apps like Flame, Maya, Nuke and Houdini. “Our biggest push at the moment is into GPU rendering,” reports Riche. “We have had great success with Octane from Otoy, and it is a second pipeline in our systems working alongside Arnold. Octane is a faster render system and is fantastic on hard surface models.”

Riche continues, “When I joined David Bates at Reel FX almost two years ago we created a vision to elevate the company to the next level, challenging the status quo of the advertising community to offer a new, unique approach to creative problem solving. Bringing on Jose Gomez and his creative vision to our team at ATK PLN is allowing us to turn our ideas into reality. I am excited about how this forward-thinking team will continue to evolve with the changing market.”

The 16th annual VES Award winners

The Visual Effects Society (VES) celebrated artists and their work at the 16th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Seven-time host comedian Patton Oswalt presided over more than 1,000 guests at the Beverly Hilton. War for the Planet of the Apes was named photoreal feature film winner, earning four awards. Coco was named top animated film, also earning four awards. Game of Thrones was named best photoreal episode and garnered five awards — the most wins of the night. Samsung Do What You Can’t: Ostrich won top honors in the commercial field, scoring three awards. These top four contenders collectively garnered 16 of the 24 awards for outstanding visual effects.

President of Marvel Studios Kevin Feige presented the VES Lifetime Achievement Award to producer/writer/director Jon Favreau. Academy Award-winning producer Jon Landau presented the Georges Méliès Award to Academy Award-winning visual effects master Joe Letteri, VES. Awards presenters included fan-favorite Mark Hamill, Coco director Lee Unkrich, War for the Planet of the Apes director Matt Reeves, Academy Award-nominee Diane Warren, Jaime Camil, Dan Stevens, Elizabeth Henstridge, Sydelle Noel, Katy Mixon and Gabriel “Fluffy” Iglesias.

Here is a list of the winners:

Outstanding Visual Effects in a Photoreal Feature

War for the Planet of the Apes

Joe Letteri

Ryan Stafford

Daniel Barrett

Dan Lemmon

Joel Whist

 

Outstanding Supporting Visual Effects in a Photoreal Feature

Dunkirk

Andrew Jackson

Mike Chambers

Andrew Lockley

Alison Wortman

Scott Fisher

 

Outstanding Visual Effects in an Animated Feature

Coco

Lee Unkrich

Darla K. Anderson

David Ryu

Michael K. O’Brien

 

Outstanding Visual Effects in a Photoreal Episode

Game of Thrones: Beyond the Wall

Joe Bauer

Steve Kullback

Chris Baird

David Ramos

Sam Conway

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Black Sails: XXIX

Erik Henry

Terron Pratt

Yafei Wu

David Wahlberg

Paul Dimmer

 

Outstanding Visual Effects in a Real-Time Project

Assassin’s Creed Origins

Raphael Lacoste

Patrick Limoges

Jean-Sebastien Guay

Ulrich Haar

 

Outstanding Visual Effects in a Commercial

Samsung Do What You Can’t: Ostrich

Diarmid Harrison-Murray

Tomek Zietkiewicz

Amir Bazazi

Martino Madeddu

 

 

Outstanding Visual Effects in a Special Venue Project

Avatar: Flight of Passage

Richard Baneham

Amy Jupiter

David Lester

Thrain Shadbolt

 

Outstanding Animated Character in a Photoreal Feature

War for the Planet of the Apes: Caesar

Dennis Yoo

Ludovic Chailloleau

Douglas McHale

Tim Forbes

 

Outstanding Animated Character in an Animated Feature

Coco: Héctor

Emron Grover

Jonathan Hoffman

Michael Honsel

Guilherme Sauerbronn Jacinto

 

Outstanding Animated Character in an Episode or Real-Time Project

Game of Thrones: The Spoils of War; Drogon Loot Train Attack

Murray Stevenson

Jason Snyman

Jenn Taylor

Florian Friedmann

 

Outstanding Animated Character in a Commercial

Samsung Do What You Can’t: Ostrich

David Bryan

Maximilian Mallmann

Tim Van Hussen

Brendan Fagan

 

Outstanding Created Environment in a Photoreal Feature

Blade Runner 2049: Los Angeles

Chris McLaughlin

Rhys Salcombe

Seungjin Woo

Francesco Dell’Anna

 

Outstanding Created Environment in an Animated Feature

Coco: City of the Dead

Michael Frederickson

Jamie Hecker

Jonathan Pytko

Dave Strick

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

Game of Thrones: Beyond the Wall; Frozen Lake

Daniel Villalba

Antonio Lado

José Luis Barreiro

Isaac de la Pompa

 

Outstanding Virtual Cinematography in a Photoreal Project

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight

James Baker

Steven Lo

Alvise Avati

Robert Stipp

 

Outstanding Model in a Photoreal or Animated Project

Blade Runner 2049: LAPD Headquarters

Alex Funke

Steven Saunders

Joaquin Loyzaga

Chris Menges

 

Outstanding Effects Simulations in a Photoreal Feature

War for the Planet of the Apes

David Caeiro Cebrián

Johnathan Nixon

Chet Leavai

Gary Boyle

 

Outstanding Effects Simulations in an Animated Feature

Coco

Kristopher Campbell

Stephen Gustafson

Dave Hale

Keith Klohn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project 

Game of Thrones: The Dragon and the Wolf; Wall Destruction

Thomas Hullin

Dominik Kirouac

Sylvain Nouveau

Nathan Arbuckle

  

Outstanding Compositing in a Photoreal Feature

War for the Planet of the Apes

Christoph Salzmann

Robin Hollander

Ben Warner

Beck Veitch

 

Outstanding Compositing in a Photoreal Episode

Game of Thrones: The Spoils of War; Loot Train Attack

Dom Hellier

Thijs Noij

Edwin Holdsworth

Giacomo Matteucci

 

Outstanding Compositing in a Photoreal Commercial

Samsung Do What You Can’t: Ostrich

Michael Gregory

Andrew Roberts

Gustavo Bellon

Rashabh Ramesh Butani

 

Outstanding Visual Effects in a Student Project

Hybrids

Florian Brauch

Romain Thirion

Matthieu Pujol

Kim Tailhades