

Redshift integrates Cinema 4D noises, nodes and more

Maxon and Redshift Rendering Technologies have released Redshift 3.0.12, which has native support for Cinema 4D noises and deeper integration with Cinema 4D, including the option to define materials using Cinema 4D’s native node-based material system.

Cinema 4D’s noise effects have long been in demand among users of other 3D packages because of their flexibility, efficiency and look. Native support in Redshift means that users of other DCC applications can now access Cinema 4D noises by using Redshift as their rendering solution. Procedural noise allows artists to easily add surface detail and randomness to otherwise perfect surfaces. Cinema 4D offers 32 different types of noise and countless variations based on settings. Because the noises are evaluated procedurally rather than stored as textures, native support means Redshift can conserve GPU memory while delivering high-quality rendered results.
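Maxon’s actual noise implementations are proprietary, but for readers curious what “procedural noise” means in practice, here is a minimal value-noise sketch in Python. It is purely illustrative, not Cinema 4D’s or Redshift’s code, and every function name is invented for the example:

```python
import math

def hash01(ix, iy, seed=0):
    # Deterministic pseudo-random value in [0, 1) from integer lattice coords.
    # (Hypothetical hash; real renderers use their own, carefully tuned hashes.)
    n = ix * 374761393 + iy * 668265263 + seed * 1442695041
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) & 0xFFFFFFFF) / 2**32

def smoothstep(t):
    # Cubic easing so the interpolation has no visible grid-line creases.
    return t * t * (3.0 - 2.0 * t)

def value_noise(x, y, seed=0):
    # Bilinearly interpolate random values at the four surrounding lattice points.
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    u, v = smoothstep(fx), smoothstep(fy)
    a = hash01(ix, iy, seed)
    b = hash01(ix + 1, iy, seed)
    c = hash01(ix, iy + 1, seed)
    d = hash01(ix + 1, iy + 1, seed)
    top = a + (b - a) * u
    bottom = c + (d - c) * u
    return top + (bottom - top) * v

def fractal_noise(x, y, octaves=4, seed=0):
    # Sum several octaves at doubling frequency and halving amplitude,
    # then normalize back into [0, 1) -- the classic "fractal" variation.
    total, amplitude, frequency, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amplitude * value_noise(x * frequency, y * frequency, seed)
        norm += amplitude
        amplitude *= 0.5
        frequency *= 2.0
    return total / norm
```

Because each value is computed on demand from coordinates and a seed, nothing has to be stored as a texture in GPU memory, which is the memory advantage the article refers to.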

Redshift 3.0.12 gives content creators deeper integration of Redshift within Cinema 4D. Redshift materials can now be defined using Cinema 4D’s nodal material framework, introduced in Release 20. Redshift materials can also use the Node Space system introduced in Release 21, which combines the native nodes of multiple render engines into a single material. Redshift is the first renderer to take advantage of the new API in Cinema 4D to implement its own Node Spaces. Users can also now use any Cinema 4D view panel as a Redshift IPR (interactive preview render) window, making it easier to work within compact layouts and interact with a scene while developing materials and lighting.

Redshift 3.0.12 is immediately available from the Redshift website.

Maxon acquired Redshift in April 2019.

London’s Freefolk beefs up VFX team

Soho-based visual effects studio Freefolk has expanded its staff to meet growing demand for its commercials and longform work. As part of the uptick, Freefolk promoted Cheryl Payne from senior producer to head of commercial production, brought on Laura Rickets as senior producer and added 2D artist Bradley Cocksedge to the commercials VFX team.

Payne, who has been with Freefolk since the early days, has worked on some of the studio’s biggest commercials, including Warburtons for Engine, Peloton for Dark Horses and Cadburys for VCCP.

Rickets comes to Freefolk with over 18 years of production experience at some of the biggest VFX houses in London, including Framestore, The Mill and Smoke & Mirrors, as well as on the agency side for McCann. Since joining the team, Rickets has VFX-produced work on the I’m A Celebrity IDs, a set of seven technically challenging, CG-heavy spots for the new series of the show, as well as ads for the Rugby World Cup and Who Wants to Be a Millionaire?.

Cocksedge is a recent graduate who joins from Framestore, where he was working as an intern on Fantastic Beasts: The Crimes of Grindelwald. While in school at the University of Hertfordshire, he interned at Freefolk and is happy to be back in a full-time position.

“We’ve had an exciting year and have worked on some really stand-out commercials, like TransPennine for Engine and the beautiful spot for The Guardian we completed with Uncommon, so we felt it was time to add to the Freefolk family,” says Fi Kilroe, Freefolk’s co-managing director/executive producer.

Main Image: (L-R) Cheryl Payne, Laura Rickets and Bradley Cocksedge


Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.


Carbon New York grows with three industry vets

Carbon in New York has grown with two senior hires — executive producer Nick Haynes and head of CG Frank Grecco — and the relocation of existing ECD Liam Chapple, who joins from the Chicago office.

Chapple joined Carbon in 2016, moving from Mainframe in London to open Carbon’s Chicago facility, where he brought in clients such as Porsche, Lululemon, Jeep, McDonald’s and Facebook. Says Chapple, “I’ve always looked to the studios, designers and directors in New York as the high bar, and now I welcome the opportunity to pitch against them. There is an amazing pool of talent in New York, and the city’s energy is a magnet for artists and creatives of all ilk. I can’t wait to dive into this and look forward to expanding upon our amazing team of artists and really making an impression in such a competitive and creative market.”

Chapple recently wrapped direction and VFX on films for Teflon and American Express (Ogilvy) and multiple live-action projects for Lululemon. The most recent shoot, conceived and directed by Chapple, was a series of eight live-action films focusing on Lululemon’s brand ambassadors and its new flagship store in Chicago.

Haynes joins Carbon from his former role as an EP at MPC, bringing over 20 years of experience earned at The Mill, MPC and Absolute. Haynes recently wrapped the launch film for the Google Pixel phone and the Chromebook, as well as an epic trailer for Monolith’s Middle Earth: Shadow of War game, combining photoreal CGI elements with live action shot on the frozen Black Sea in Ukraine. Says Haynes, “We want to be there at the inception of the creative and help steer it — ideally, lead it — and be there the whole way through the process, from concept and shoot to delivery. Over the years, whether working for the world’s most creative agencies or directly with prestigious clients like Google, Guinness and IBM, I aim to be as close to the project as possible from the outset, allowing my team to add genuine value that will garner the best result for everyone involved.”

Grecco joins Carbon from Method Studios, where he most recently led projects for Google, Target, Microsoft, Netflix and Marvel’s Deadpool 2.  With a wide range of experience from Emmy-nominated television title sequences to feature films and Super Bowl commercials, Grecco looks forward to helping Carbon continue to push its visuals beyond the high bar that has already been set.

In addition to New York and Chicago, Carbon has a studio in Los Angeles.

Main Image: (L-R) Frank Grecco, Liam Chapple, Nick Haynes


Behind the Title: Compadre’s Jessica Garcia-Scharer

NAME: Jessica Garcia-Scharer

COMPANY: Culver City’s Compadre

CAN YOU DESCRIBE YOUR COMPANY?
We are a creative marketing agency. We make strategically informed branding and creative — and then help to get it out to the world in memorable ways. And we use strategy, design, planning and technology to do it.

WHAT’S YOUR JOB TITLE?
Head of Production

WHAT DOES THAT ENTAIL?
Head of production means different things at different companies. I’m the three-ring binder with the special zip pack that helps to hold everything together in an organized manner. Everything from hearing and understanding client needs, creating proposals, managing budget projections/actuals/contracts, getting in the right talent for the job, all the way to making sure that everyone in-house is happy, balanced and supported.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably the proposals and planning charts. I’m also “Snack Mom!”

WHAT’S YOUR FAVORITE PART OF THE JOB?
Snack Mom. Ha! My favorite part of the job is being part of a team and bringing something to the table that is useful. I like when my team feels like everything is being handled.

WHAT’S YOUR LEAST FAVORITE?
If and when there isn’t enough quiet time to get into the paperwork zone.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
At work: When I get in early and no one is in yet. I get the most work done during that time. Also lunch. I try to make it a point now to get out to lunch and take co-workers with me. It’s nice to be able to break up the day and be regular people for an hour.

Non-work-related: When the sun is just coming up and it’s still a little brisk outside, but the air is fresh and the birds are starting to wake up and chirp. Also, when the sun is starting to descend and it’s still a little warm as the cool ocean breeze starts to come in. The birds are starting to wind down after a hard day of being a bird, and families are coming together to make dinner and talk about their days (well… on the weekend anyway). I am obviously very lucky, and I know that. There are many that don’t get to experience that, and I think of them during that time as well.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
It depends on if I were independently wealthy or not, and where I had been previously. Before going to college, I wanted to be a VFX make-up artist, a marine biologist working with dolphins or a park ranger in Yosemite.

If I were independently wealthy, I would complete a painting collection and put up an art show, start a female/those-who-identify-as-female agency, open up a vegan restaurant and be a hardcore animal activist.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I wish people thought about their careers as more than one path. I have many paths, and I don’t think I’m done just yet. You never know where life will take you from one day to the next, so it’s important to live for today.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
CNN 2020 Election promo package, ESPN 40th Anniversary and another that is pretty neat and a big puzzle to figure out, but I can’t tell you just yet…

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I technically work on everything, so they’re all my babies, and I’m proud of all of them for different reasons. Most, if not all, of the projects that we work on start out with a complex puzzle to solve. I work with the team to figure it out and present the solution to the client. That is where I thrive, and those documents are what I’m most proud of as far as my own personal accomplishments and physical contributions to the company.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Water filtration systems, giant greenhouses and air conditioning will be vital because of global warming.

For work, it would be really hard to function without my mobile phone, laptop and headphones.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Mainly Instagram and Facebook. Facebook is where I learn about events/concerts/protests coming up, keep tabs on people’s birthdays, weddings, babies and share my thoughts on factory farming. Instagram is mindless eye candy for the most part, but I do love how close I feel to certain communities there.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Usually binaural beats (for focus and clarity) and new age relaxation; but if I’m organizing and cleaning up, then The Cure, Bowie, Duran Duran, Radiohead and Bel Canto.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
As I mentioned before, it’s important to take a lunch break and bond with co-workers and old friends. Taking a step away and remembering that I am a human being living a life that needs to be enjoyed is key to a happy work-life balance. We aren’t saving lives here; we are making fun things for fun people, so as long as you have the systems and resources in place, the stress is the excitement of making things that exceed expectations.

But if I do let things get to me, the best de-stressor is getting home and into my PJs and snuggling up with my family and animals… drowning myself in the escape of love. Oh, and dark chocolate (vegan, of course).


A post engineer’s thoughts on Adobe MAX, new offerings

By Mike McCarthy

Last week, I had the opportunity to attend Adobe’s MAX conference at the LA Convention Center. Adobe showed me, and 15,000 of my closest friends, the newest updates to pretty much all of its Creative Cloud applications, as well as a number of interesting upcoming developments. From a post production perspective, the most significant pieces of news are the releases of Premiere Pro 14 and After Effects 17 (a.k.a. the 2020 releases of those Creative Cloud apps).

The main show ran from Monday to Wednesday, with a number of pre-show seminars and activities the preceding weekend. My experience started with a screening of the new Terminator: Dark Fate film at LA Live, followed by a Q&A with the director and post team. The new Terminator was edited in Premiere Pro, with project assets shared among a large team of editors and assistants, and with extensive use of After Effects, Adobe’s newly acquired Substance app and various other tools in the Creative Cloud.

The post team extolled the improvements in shared project support and project opening times since their last Premiere endeavor on the first Deadpool movie. Visual effects editor Jon Carr shared how they used the integration between Premiere and After Effects to facilitate rapid generation of temporary “postvis” effects. This helped the editors tell the story while they were waiting on the VFX teams to finish generating the final CGI characters and renders.

MAX
The conference itself kicked off with a keynote presentation of all of Adobe’s new developments and releases. The 150-minute presentation covered all aspects of the company’s extensive line of applications. “Creativity for All” is Adobe’s primary message, and the keynote focused on the tension between creativity and time: Adobe is trying to improve its products in ways that give users more time to be creative.

The three prongs of that approach for this iteration of updates were:
– Faster, more powerful, more reliable — fixing time-wasting bugs, improving hardware use.
– Create anywhere, anytime, with anyone — adding functionality via the iPad, and shared Libraries for collaboration.
– Explore new frontiers — specifically in 3D, with Adobe’s Dimension, Substance and Aero.

Education is also an important focus for Adobe, with 15 million copies of CC in use in education around the world. Adobe is also creating a platform for CC users to stream their working process, directly from within the applications, to viewers who want to learn from them. That will probably integrate with the new expanded Creative Cloud app released last month. Adobe has also released an integration that lets Office apps access assets in CC Libraries.

The first application updates they showed off were in Photoshop. They have made the new locked-aspect-ratio scaling a toggle-able behavior, improved the warp tool and improved navigation of deep layer stacks by showing which layers affect particular parts of an image. But the biggest improvement is AI-based object selection, which creates detailed masks from simple box selections or rough lassos. Illustrator now has GPU acceleration, improving performance on larger documents, and a path-simplifying tool to reduce the number of anchor points.

They released Photoshop for the iPad and announced that Illustrator will follow that path as well. Fresco is headed in the other direction and is now available on Windows. It is currently limited to Microsoft Surface products, but I look forward to trying it out on my ZBook-X2 at some point. Adobe XD has new features and, as I learned in one of the later sessions, is apparently the best way to move complex Illustrator files into After Effects.

Premiere
Premiere Pro 14 has a number of new features, the most significant being AI-driven auto reframe, which automatically converts an edited project into other aspect ratios for various deliverables. While 16×9 is obviously the standard size, certain web platforms are optimized for square or tall videos. The feature can also reframe content from 2.35:1 to 16×9 or 4×3, which are frequent delivery requirements for the feature films I work on. My favorite aspect of this new functionality is that the user retains complete control over the results.

Unlike other automated features such as warp stabilizer, which offer only an on/off choice for applying their results, auto reframe generates motion-effect keyframes that can be further edited and customized by the user once the initial AI pass is complete. It also has a nesting option that retains existing framing choices by creating a new single-layer source sequence. I can envision this being useful for a number of other workflow processes, such as preparing for external color grading or texturing passes.
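Adobe has not published how auto reframe works internally, but the core idea of turning a tracked subject position into editable position keyframes can be sketched roughly as follows. This is a hypothetical illustration with invented names, not Adobe’s implementation:

```python
def reframe_keyframes(subject_centers, src_w, src_h, target_aspect):
    """For each frame, compute the left edge of a crop window that keeps the
    tracked subject centered, clamped so the window stays inside the source.

    subject_centers: list of (frame, subject_center_x) pairs, e.g. from an
    AI pass that tracks the point of interest across the shot.
    Returns (frame, crop_left) keyframes an editor could then hand-adjust.
    """
    # Crop window uses the full source height; its width follows the target aspect.
    crop_w = src_h * target_aspect
    keys = []
    for frame, cx in subject_centers:
        left = min(max(cx - crop_w / 2.0, 0.0), src_w - crop_w)
        keys.append((frame, left))
    return keys

# Example: reframing a 16x9 (1920x1080) edit to a square (1:1) deliverable.
keys = reframe_keyframes([(0, 960), (1, 100), (2, 1900)], 1920, 1080, 1.0)
```

The point of emitting plain keyframes, rather than a baked result, is exactly what the paragraph above describes: the editor can open the motion effect afterward and nudge any value the automatic pass got wrong.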

They also added better support for multi-channel audio workflows and effects, improved playback performance for many popular video formats, better HDR export options and a variety of changes to make the motion graphics tools more flexible and efficient for users who use them extensively. They also increased the range of values available for clip playback speed and volume, and added support for new camera formats and derivations.

The brains behind After Effects have focused on improving playback and performance for this release and have made some significant improvements in that regard. The other big feature that actually may make a difference is content-aware fill for video. This was sneak previewed at MAX last year and first implemented in the NAB 2019 release of After Effects, but it should be greatly refined and improved in this version since it’s now twice as fast.

They also greatly improved support for OpenEXR frame sequences, especially those with multiple render-pass channels. The channels can be labeled, and After Effects creates a video contact sheet for viewing all the layers in thumbnail form. EXR playback performance is supposed to be greatly improved as well.

Character Animator is now at 3.0, and they have added keyframing of all editable values, trigger-able reposition “cameras” and trigger-able audio effects, among other new features. And Adobe Rush now supports publishing directly to TikTok.

Content Authenticity Initiative
Outside of individual applications, Adobe has launched the Content Authenticity Initiative in partnership with the NY Times and Twitter. It aims to fight fake news and restore consumer confidence in media. Its three main goals are trust, attribution and authenticity: presenting end users with who created an image, and whether it was edited or altered and, if so, in what ways. Seemingly at odds with that, Adobe also released a new mobile app that edits images upon capture, using AI-powered “lenses” for highly stylized looks, even providing a live view.

This opening keynote was followed by a selection of over 200 different labs and sessions available over the next three days. I attended a couple of sessions focused on After Effects, as that is a program I know I don’t use to its full capacity. (Does anyone, really?)

Partners
A variety of other partner companies were showing off their products in the community pavilion. HP was pushing 3D printing and digital manufacturing tools that integrate with Photoshop and Illustrator. Dell has a new 27-inch color-accurate monitor with a built-in colorimeter, presumably to compete with HP’s top-end DreamColor displays. Asus also has some new HDR monitors that are Dolby Vision-compatible. One is designed to be portable and is as thin and lightweight as a laptop screen. I have always wondered why that wasn’t a standard approach for desktop displays.

Keynotes
Tuesday opened with a keynote presentation from a number of artists of different types, speaking or being interviewed. Jason Levine’s talk with M. Night Shyamalan was my favorite part, even though thrillers aren’t really my cup of tea. Later, I was able to sit down and talk with Patrick Palmer, Adobe’s Premiere Pro product manager about where Premiere is headed and the challenges of developing HDR creation tools when there is no unified set of standards for final delivery. But I am looking forward to being able to view my work in HDR while I am editing at some point in the future.

One of the highlights of MAX is the 90-minute Sneaks session on Tuesday night, where comedian John Mulaney “helped” a number of Adobe researchers demonstrate new media technologies they are working on. These will eventually improve audio quality, automate animation, analyze photographic authenticity and many other tasks once they are refined into final products at some point in the future.

This was only my second time attending MAX, and with Premiere Rush being released last year, video production was a key part of that show. This year, without that factor, it was much more apparent to me that I was an engineer attending an event catering to designers. Not that this is bad, but I mention it here because it is good to have a better idea of what you are stepping into when you are making decisions about whether to invest in attending a particular event.

Adobe focuses MAX on artists and creatives as opposed to engineers and developers, who have other events that are more focused on their interests and needs. I suppose that is understandable since it is not branded Creative Cloud for nothing. But it is always good to connect with the people who develop the tools I use, and the others who use them with me, which is a big part of what Adobe MAX is all about.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


Behind the Title: Sarofsky EP Steven Anderson

This EP’s responsibilities run the gamut “from managing our production staff to treating clients to an amazing dinner.”

Company: Chicago’s Sarofsky

Can you describe your company?
We like to describe ourselves as a design-driven production company. I like to think of us as that but so much more. We can be a one-stop shop for everything from concept through finish, or we can partner with a variety of other companies and just be one piece of the puzzle. It’s like ordering from a Chinese menu — you get to pick what items you want.

What’s your job title, and what does the job entail?
I’m executive producer, and that means different things at different companies and industries. Here at Sarofsky, I am responsible for things that run the gamut from managing our production staff to treating clients to an amazing dinner.


What would surprise people the most about what falls under that title?
I also run payroll, and I am damn good at it.

How has the VFX industry changed in the time you’ve been working?
It used to be that when you told someone, “This is going to take some time to execute,” that’s what it meant. But now, everyone wants everything two hours ago. On the flip side, the technology we now have access to has streamlined the production process and provided us with some terrific new tools.

Why do you like being on set for shoots? What are the benefits?
I always like being on set whenever I can because decisions are being made that are going to affect the rest of the production paradigm. It’s also a good opportunity to bond with clients and, sometimes, get some kick-ass homemade guacamole.

Did a particular film inspire you along this path in entertainment?
I have been around this business for quite a while, and one of the reasons I got into it was my love of film and filmmaking. I can’t say that one particular film inspired me to do this, but I remember being a young kid and my dad taking me to see The Towering Inferno in the movie theater. I was blown away.

What’s your favorite part of the job?
Choosing a spectacular bottle of wine for a favorite client and watching their face when they taste it. My least favorite has to be chasing down clients for past due invoices. It gets old very quickly.

What is your most productive time of the day?
It’s 6:30am with my first cup of coffee sitting at my kitchen counter before the day comes at me. I get a lot of good thinking and writing done in those early morning hours.

Original Bomb Pop via agency VMLY&R

If you didn’t have this job, what would you be doing instead?
I would own a combo bookstore/wine shop where people could come and enjoy two of my favorite things.

Why did you choose this profession?
I would say this profession chose me. I studied to be an actor and made my living at it for several years, but due to some family issues, I ended up taking a break for a few years. When I came back, I went for a job interview at FCB and the rest is history. I made the move from agency producing to post executive producer five years ago and have not looked back since.

Can you briefly explain one or more ways Sarofsky is addressing the issue of workplace diversity in its business?
We are a smallish women-owned business, and I am a gay man; diversity is part of our DNA. We always look out for the best talent but also try to ensure we are providing opportunities for people who may not have access to them. For example, one of our amazing summer interns came to us through a program called Kaleidoscope 4 Kids, and we all benefited from the experience.

Name some recent projects you have worked on, which are you most proud of, and why?
My first week here at EP, we went to LA for the friends and family screening of Guardians of the Galaxy, and I thought, what an amazing company I work for! Marvel Studios is a terrific production partner, and I would say there is something special about so many of our clients because they keep coming back. I do have a soft spot for our main title for Animal Kingdom just because I am a big Ellen Barkin fan.


Name three pieces of technology you can’t live without.
I’d be remiss if I didn’t say my MacBook and iPhone, but I also wouldn’t want to live without my cooking thermometer, as I’ve learned how to make sourdough bread this year, and it’s essential.

What social media channels do you follow?
I am a big fan of Instagram; it’s just visual eye candy and provides a nice break during the day. I don’t really partake in much else unless you count NPR. They occupy most of my day.

Do you listen to music while you work? Care to share your favorite music to work to?
I go in waves. Sometimes I do but then I won’t listen to anything for weeks. But I recently enjoyed listening to “Ladies and Gentleman: The Best of George Michael.” It was great to listen to an entire album, a rare treat.

What do you do to de-stress from it all?
I get up early and either walk or do some type of exercise to set the tone for the day. It’s also so important to unplug; my partner and I love to travel, so we do that as often as we can. All that and a 2006 Chateau Margaux usually washes away the day in two delicious sips.


Behind the Title: Title Designer Nina Saxon

For 40 years, Nina Saxon has been a pioneer in movie title design, and she remains one of the few women working in this part of the industry.

NAME: Nina Saxon

COMPANY: Nina Saxon Design

CAN YOU DESCRIBE YOUR COMPANY?
We design main and end titles for film and television as well as branding for still and moving images.

WHAT’S YOUR JOB TITLE?
Title Designer

WHAT DOES THAT ENTAIL?
Creating a moving introduction that, like a book’s cover, introduces the film. Or it might be simple type over picture. It also involves watching a film and showing the director samples or storyboards of what I think should be used.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
That I’m one of only a few women in this field and have worked for 40 years, hiring others to help me only if necessary.

WHAT’S YOUR FAVORITE PART OF THE JOB?
When my project is done and I get to see my finished work up on the screen.

WHAT’S YOUR LEAST FAVORITE?
Waiting to be paid.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Morning

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d probably be a psychologist.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
In 1975, I was in the film department at UCLA and decided I was determined to work in the film business.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The upcoming documentary on Paul McCartney called Here, There and Everywhere, and upcoming entertainment industry corporate logos that will be revealed in October. In the past, I did the movie Salt with Angelina Jolie and the movie Flight with Denzel Washington.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Working on the main title open for Forrest Gump.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My iPad, iPhone and computer

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I exercise a lot, five to six days a week; drink a nice glass of wine; try to get enough sleep; listen to music while meditating before sleep; and make sure I know what I need to do the next day before I go to bed.


Ziva VFX 1.7 helps simplify CG character creation


Ziva Dynamics has introduced Ziva VFX 1.7, designed to make CG character creation easier thanks to the introduction of Art Directable Rest Shapes (ADRS). The tool allows artists to make characters conform to any shape without losing their dynamic properties, opening a faster path to cartoons and digi-doubles.

Users can now adjust a character’s silhouette with simple sculpting tools. Once the goal shape is established, Ziva VFX morphs the character to match it, maintaining all of the dynamics embedded before the change. ADRS works with any shape, however unnatural or precise, removing both complex setups and time-intensive corrective work.

The Art Directable Rest Shapes feature has been in development for over a year and was created in collaboration with several major VFX and feature animation studios. According to Ziva, while outputs and art styles differed, each group essentially requested the same thing: extreme accuracy and more control without compromising the dynamics that sell a final shot.

For feature animation characters not based on humans or nature, ADRS can rapidly alter and exaggerate key characteristics, allowing artists to be expressive and creative without losing the power of secondary physics. For live-action films, where the use of digi-doubles and other photorealistic characters is growing, ADRS can minimize the setup process when teams want to quickly tweak a silhouette or make muscles fire in multiple ways during a shot.

According to Josh diCarlo, head of rigging at Sony Pictures Imageworks, “Our creature team is really looking forward to the potential of Art Directable Rest Shapes to augment our facial and shot-work pipelines by adding quality while reducing effort. Ziva VFX 1.7 holds the potential to shave weeks of work off of both processes while simultaneously increasing the quality of the end results.”

To use Art Directable Rest Shapes, artists must duplicate a tissue mesh, sculpt their new shape onto the duplicate and add the new geometry as a Rest Shape over select frames. This process will intuitively morph the character, creating a smooth, novel deformation that adheres to any artistic direction a creative team can think up. On top of ADRS, Ziva VFX 1.7 will also include a new zRBFWarp feature, which can warp NURBS surfaces, curves and meshes.
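Conceptually, the morph described above is a per-vertex blend from the rest pose toward the sculpted target, ramped over the keyed frames. A minimal generic sketch of that idea in Python (illustrative only; this is not Ziva’s actual solver, and all function names here are hypothetical):

```python
def blend_vertices(rest, target, weight):
    """Linearly blend each rest-pose vertex toward its target by weight (0..1).

    rest and target are parallel lists of (x, y, z) tuples.
    """
    return [
        tuple(r + weight * (t - r) for r, t in zip(rv, tv))
        for rv, tv in zip(rest, target)
    ]

def ramp_weight(start_frame, end_frame, frame):
    """Ramp the blend weight from 0 to 1 across the keyed frame range."""
    t = (frame - start_frame) / float(end_frame - start_frame)
    return min(max(t, 0.0), 1.0)
```

In a real setup the dynamics solver would then run on top of the blended rest state each frame, which is what preserves the secondary physics Ziva emphasizes.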

A free 60-day trial is available from the Ziva Dynamics website. Ziva VFX 1.7 is available now as an Autodesk Maya plugin for Windows and Linux users. Ziva VFX 1.7 can be purchased in monthly or yearly installments, depending on user type.

According to Michael Smit, chief commercial officer at Ziva Dynamics, “Ziva is working towards a new platform that will more easily allow us to deploy the software into other software packages, operating systems, and different network architectures. As an example we are currently working on our integrations into iOS and Unreal, both of which have already been used in limited release for production settings. We’re hopeful that once we launch the new platform commercially there will be an opportunity to deploy tools for macOS users.”

Flavor adds Joshua Studebaker as CG supervisor

Creative production house Flavor has added CG supervisor Joshua Studebaker to its Los Angeles studio. For more than eight years, Studebaker has been a freelance CG artist in LA, specializing in design, animation, dynamics, lighting/shading and compositing via Maya, Cinema 4D, V-Ray/Octane, Nuke and After Effects.

A frequent collaborator with Flavor and its brand and agency partners, Studebaker has also worked with Alma Mater, Arsenal FX, Brand New School, Buck, Greenhaus GFX, Imaginary Forces and We Are Royale in the past five years alone. In his new role with Flavor, Studebaker oversees visual effects and 3D services across the company’s global operations. Flavor’s Chicago, Los Angeles and Detroit studios offer color grading, VFX and picture finishing using tools like Autodesk Lustre and Flame Premium.

Flavor creative director Jason Cook also has a long history of working with Studebaker and deep respect for his talent. “What I love most about Josh is that he is both technical and a really amazing artist and designer. Adding him is a huge boon to the Flavor family, instantly elevating our production capabilities tenfold.”

Flavor has always emphasized creativity as a key ingredient, and according to Studebaker, that’s what attracted him. “I see Flavor as a place to grow my creative and design skills, as well as help bring more standardization to our process in house,” he explained. “My vision is to help Flavor become more agile and more efficient and to do our best work together.”

Boris FX beefs up film VFX arsenal, buys SilhouetteFX, Digital Film Tools

Boris FX, a provider of integrated VFX and workflow solutions for video and film, has bought SilhouetteFX (SFX) and Digital Film Tools (DFT). The two companies have a long history of developing tools used on Hollywood blockbusters and experience collaborating with top VFX studios, including Weta Digital, Framestore, Technicolor and Deluxe.

This is the third acquisition by Boris FX in recent years — following Imagineer Systems (2014) and GenArts (2016) — and builds upon the company’s editing, visual effects, and motion graphics solutions used by post pros working in film and television. Silhouette and Digital Film Tools join Boris FX’s tools Sapphire, Continuum and Mocha Pro.

Silhouette’s groundbreaking non-destructive paint and advanced rotoscoping technology was recognized earlier this year with a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences. It first gained prominence after Weta Digital used the rotoscoping tools on King Kong (2005). Now a full-fledged GPU-accelerated node-based compositing app, Silhouette features over 100 VFX nodes and integrated Boris FX Mocha planar tracking. Over the last 15 years, feature film artists have used Silhouette on films including Avatar (2009), The Hobbit (2012), Wonder Woman (2017), Avengers: Endgame (2019) and Fast & Furious Presents: Hobbs & Shaw (2019).

Avengers: Endgame courtesy of Marvel

Digital Film Tools (DFT) emerged as an offshoot of an LA-based motion picture visual effects facility whose work included hundreds of feature films, commercials and television shows.

The Digital Film Tools portfolio includes standalone applications as well as professional plug-in collections for filmmakers, editors, colorists and photographers. The products offer hundreds of realistic filters for optical camera simulation, specialized lenses, film stocks and grain, lens flares, optical lab processes, color correction, keying and compositing, as well as natural light and photographic effects. DFT plug-ins support Adobe’s Photoshop, Lightroom, After Effects and Premiere Pro; Apple’s Final Cut Pro X and Motion; Avid’s Media Composer; and OFX hosts, including Foundry Nuke and Blackmagic DaVinci Resolve.

“This acquisition is a natural next step in our continued growth strategy and singular focus on delivering the most powerful VFX tools and plug-ins to the content creation market,” says Boris Yamnitsky, CEO/founder of Boris FX. “Silhouette fits perfectly into our product line with superior paint and advanced roto tools that complement Mocha’s core strength in planar tracking and object removal. Rotoscoping, paint, digital makeup and stereo conversion are some of the most time-consuming, labor-intensive aspects of feature film post. Sharing technology and tools across all our products will make Silhouette even stronger as the leader in these tasks. Furthermore, we are very excited to be working with such an accomplished team [at DFT] and look forward to collaborating on new product offerings for photography, film and video.”

Silhouette founders Marco Paolini, Paul Miller and Peter Moyer will continue in their current leadership roles and partner with the Mocha product development team to collaborate on delivering next-generation tools. “By joining forces with Boris FX, we are not only dramatically expanding our team’s capabilities, but we are also joining a group of like-minded film industry pros to provide the best solutions and support to our customers,” says Paolini, product designer. “The Mocha planar tracking option we currently license is extremely popular with Silhouette paint and roto artists, and more recently through OFX, we’ve added support for Sapphire plug-ins. Working together under the Boris FX umbrella is our next logical step, and we are excited to add new features and continue advancing Silhouette for our user base.”

Both Silhouette and the Digital Film Tools plug-ins will continue to be developed and sold under the Boris FX brand. Silhouette will adopt the Boris FX commitment to agile development with annual releases, annual support and subscription options.

Main Image: Silhouette

Roger and Big Machine merge, operate as Roger

Creative agency Roger and full-service production company Big Machine have merged — a move that will expand the creative capabilities for their respective agency, brand and entertainment clients. The studios will retain the Roger name and operate at Roger’s newly renovated facility in Los Angeles.

The combined management team includes CD Terence Lee, CD Dane Macbeth, EP Josh Libitsky, director Steve Petersen, CD Ken Carlson and Sean Owolo, who focuses on business development.

Roger now offers expanded talent and resources for projects that require branding, design, animation, VFX, VR/AR, live action and content development. Roger uses Adobe Creative Cloud for most of its workflows. The tools vary from project to project, but outside of the Adobe suite, the studio also uses Maxon Cinema 4D, Autodesk Maya, Blackmagic DaVinci Resolve and Foundry Nuke.

Since the merger, the studio is already embarking on a number of projects, including major creative campaigns for Disney and Sony Pictures.

Roger’s new 6,500-square-foot studio includes four private offices, three editing suites, two conference rooms, an empty shooting space for greenscreen work, a kitchen and a lounge.

Behind the Title: MPC Senior Compositor Ruairi Twohig

After studying hand-drawn animation, this artist found his way to visual effects.

NAME: NYC-based Ruairi Twohig

COMPANY: Moving Picture Company (MPC)

CAN YOU DESCRIBE YOUR COMPANY?
MPC is a global creative and visual effects studio with locations in London, Los Angeles, New York, Shanghai, Paris, Bangalore and Amsterdam. We work with clients and brands across a range of different industries, handling everything from original ideas through to finished production.

WHAT’S YOUR JOB TITLE?
I work as a 2D lead/senior compositor.

Cadillac

WHAT DOES THAT ENTAIL?
The tasks and responsibilities can vary depending on the project. My involvement with a project can begin before there’s even a script or storyboard, and we need to estimate how much VFX will be involved and how long it will take. As the project develops and the direction becomes clearer, with scripts and storyboards and concept art, we refine this estimate and schedule and work with our clients to plan the shoot and make sure we have all the information and assets we need.

Once the commercial is shot and we have an edit, the bulk of the post work begins. This can involve anything from compositing fully CG environments, dragons or spaceships to beauty and product/pack-shot touch-ups or rig removal. So my role involves a combination of overall project management and planning, but I also get into the detailed shot work and ultimately deliver the final picture. The majority of the work I do can require a large team of people with different specializations, and those are usually the projects I find the most fun and rewarding due to the collaborative nature of the work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think the variety of the work would surprise most people unfamiliar with the industry. In a single day, I could be working on two or three completely different commercials with completely different challenges while also bidding future projects or reviewing prep work in the early stages of a current project.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I’ve been working in the industry for over 10 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING?
The VFX industry is always changing. I find it exciting to see how quickly the technology is advancing and becoming more widely accessible, cost-effective and faster.

I still find it hard to comprehend the idea of using optical printers for VFX back in the day … before my time. Some of the most interesting areas for me at the moment are the developments in realtime rendering from engines such as Unreal and Unity, and the implementation of AI/machine learning tools that might be able to automate some of the more time-consuming tasks in the future.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I remember when I was 13, my older brother — who was studying architecture at the time — introduced me to 3ds Max, and I started playing around with some very simple modeling and rendering.

I would buy these monthly magazines like 3D World, which came with demo discs for different software and some CG animation compilations. One of the issues included the short CG film Fallen Art by Tomek Baginski. At the time I was mostly familiar with Pixar’s feature animation work like Toy Story and A Bug’s Life, so watching this short film created using similar techniques but with such a dark, mature tone and story really blew me away. It was this film that inspired me to pursue animation and, ultimately, visual effects.

DID YOU GO TO FILM SCHOOL?
I studied traditional hand-drawn animation at the Dun Laoghaire Institute of Art, Design and Technology in Dublin. This was a really fun course in which we spent the first two years focusing on the craft of animation and the fundamental principles of art and design, followed by another two years in which we had a lot of freedom to make our own films. It was during these final two years of experimentation that I started to move away from traditional animation and focus more on learning CG and VFX.

I really owe a lot to my tutors, who were really supportive during that time. I also had the opportunity to learn from visiting animation masters such as Andreas Deja, Eric Goldberg and John Canemaker. Although on the surface the work I do as a compositor is very different to animation, understanding those fundamental principles has really helped my compositing work; any additional disciplines or skills you develop in your career that require an eye for detail and aesthetics will always make you a better overall artist.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Even after 10 years in the industry, I still get satisfaction from the problem-solving aspect of the job, even on the smaller tasks. I love getting involved on the more creative projects, where I have the freedom to develop the “look” of the commercial/film. But, day to day, it’s really the team-based nature of the work that keeps me going. Working with other artists, producers, directors and clients to make a project look great is what I find really enjoyable.

WHAT’S YOUR LEAST FAVORITE?
Sometimes even if everything is planned and scheduled accordingly, a little hiccup along the way can easily impact a project, especially on jobs where you might only have a limited amount of time to get the work done. So it’s always important to work in such a way that allows you to adapt to sudden changes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I used to draw all day, every day as a kid. I still sketch occasionally, but maybe I would have pursued a more traditional fine art or illustration career if I hadn’t found VFX.

Tiffany & Co.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Over the past year, I’ve worked on projects for clients such as Facebook, Adidas, Samsung and Verizon. I also worked on the Tiffany & Co. campaign “Believe in Dreams” directed by Francis Lawrence, as well as the company’s holiday campaign directed by Mark Romanek.

I also worked on Cadillac’s “Rise Above” campaign for the 2019 Oscars, which was challenging since we had to deliver four spots within a short timeframe. But it was a fun project. There was also the Michelob Ultra Robots Super Bowl spot earlier this year. That was an interesting project, as the work was completed between our LA, New York and London studios.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Last year, I had the chance to work with my friend and director Sofia Astrom on the music video for the song “Bone Dry” by Eels. It was an interesting project since I’d never done visual effects for a stop-motion animation before. This had its own challenges, and the style of the piece was very different compared to what I’m used to working on day to day. It had a much more handmade feel to it, and the visual effects design had to reflect that, which was such a change to the work I usually do in commercials, which generally leans more toward photorealistic visual effects work.

WHAT TOOLS DO YOU USE DAY TO DAY?
I mostly work with Foundry Nuke for shot compositing. When leading a job that requires a broad overview of the project and timeline management/editorial tasks, I use Nuke Studio or Autodesk Flame, depending on the requirements of the project. I also use ftrack daily for project management.

WHERE DO YOU FIND INSPIRATION NOW?
I follow a lot of incredibly talented concept artists and photographers/filmmakers on Instagram. Viewing these images/videos on a tiny phone doesn’t always do justice to the work, but the platform is so active that it’s a great resource for inspiration and finding new artists.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to run and cycle around the city when I can. During the week it can be easy to get stuck in a routine of sitting in front of a screen, so getting out and about is a much-needed break for me.

An artist’s view of SIGGRAPH 2019

By Andy Brown

While I’ve been lucky enough to visit NAB and IBC several times over the years, this was my first SIGGRAPH. Of course, there are similarities. There are lots of booths, lots of demos, lots of branded T-shirts, lots of pairs of black jeans and a lot of beards. I fit right in. I know we’re not all the same, but we certainly looked like it. (The stats regarding women and diversity in VFX are pretty poor, but that’s another topic.)

Andy Brown

You spend your whole career in one industry and I guess you all start to look more and more like each other. That’s partly the problem for the people selling stuff at SIGGRAPH.

There were plenty of compositing demos from all sorts of software vendors. (Blackmagic was running a hands-on class for 20 people at a time.) I’m a Flame artist, so I think that Autodesk’s offering is best, obviously. Everyone’s compositing tool can play back large files and color correct, composite, edit, track and deliver, so in the midst of a buzzy trade show, the differences feel far fewer than the similarities.

Mocap
Take the world of tracking and motion capture as another example. There were more booths demonstrating tracking and motion capture than anything in the main hall, and all that tech came in different shapes and sizes and an interesting mix of hardware and software.

The motion capture solution required for a Hollywood movie isn’t the same as the one to create a live avatar on your phone, however. That’s where it gets interesting. There are solutions that can capture and translate the movement of everything from your fingers to your entire body using hardware from an iPhone X to a full 360-camera array. Some solutions used tracking ball markers, some used strips in the bodysuit and some used tiny proximity sensors, but the results were all really impressive.

Vicon

Vicon

Some tracking solution companies had different versions of their software and hardware. If you don’t need all of the cameras and all of the accuracy, then there’s a basic version for you. But if you need everything to be perfectly tracked in real time, then go for the full-on pro version with all the bells and whistles. I had a go at live-animating a monkey using just my hands, and apart from ending with him licking a banana in a highly inappropriate manner, I think it worked pretty well.

AR/VR
AR and VR were everywhere, too. You couldn’t throw a peanut across the room without hitting someone wearing a VR headset. They’d probably be able to bat it away whilst thinking they were Joe Root or Max Muncy (I had to Google him), with the real peanut being replaced with a red or white leather projectile. Haptic feedback made a few appearances, too, so expect to be able to feel those virtual objects very soon. Some of the biggest queues were at the North stand, where the company was showing glasses that looked like the ones everyone was wearing already (like mine, obviously), except that they incorporated a head-up display. I have mixed feelings about this. Google Glass didn’t last very long for a reason, although I don’t think North’s glasses have a camera in them, which makes things feel a bit more comfortable.

Nvidia

Data
One of the central themes for me was data, data and even more data. Whether you are interested in how to capture it, store it, unravel it, play it back or distribute it, there was a stand for you. This mass of data was being managed by really intelligent components and software. I was expecting to be writing all about artificial intelligence and machine learning from the show, and it’s true that there was a lot of software that used machine learning and deep neural networks to create things that looked really cool. Environments created using simple tools looked fabulously realistic because of deep learning. Basic pen strokes could be translated into beautiful pictures because of the power of neural networks. But most of that machine learning is in the background; it’s just doing the work that needs to be done to create the images, lighting and physical reactions that go to make up convincing and realistic images.

The Experience Hall
The Experience Hall was really great because no one was trying to sell me anything. It felt much more like an art gallery than a trade show. There were long waits for some of the exhibits (although not for the golf swing improver that I tried), and it was all really fascinating. I didn’t want to take part in the experiment that recorded your retina scan and made some art out of it, because, well, you know, it’s my retina scan. I also felt a little reluctant to check out the booth that made light-based animated artwork derived from your date of birth, time of birth and location of birth. But maybe all of these worries are because I’ve just finished watching the Netflix documentary The Great Hack. I can’t help but think that a better source of the data might be something a little less sinister.

The walls of posters back in the main hall described research projects that hadn’t yet made it into full production and gave more insight into what the future might bring. It was all about refinement, creating better algorithms, creating more realistic results. These uses of deep learning and virtual reality were applied to subjects as diverse as translating verbal descriptions into character design, virtual reality therapy for post-stroke patients, relighting portraits and haptic feedback anesthesia training for dental students. The range of the projects was wide. Yet everyone started from the same place, analyzing vast datasets to give more useful results. That brings me back to where I started. We’re all the same, but we’re all different.

Main Image Credit: Mike Tosti


Andy Brown is a Flame artist and creative director of Jogger Studios, a visual effects studio with offices in Los Angeles, New York, San Francisco and London.

Maxon intros Cinema 4D R21, consolidates versions into one offering

By Brady Betzel

At SIGGRAPH 2019, Maxon introduced the next release of its graphics software, Cinema 4D R21. Maxon also announced a subscription-based pricing structure as well as a very welcomed consolidation of its Cinema 4D versions into a single version, aptly titled Cinema 4D.

That’s right, no more Studio, Broadcast or BodyPaint. It all comes in one package at one price, and that pricing will now be subscription-based — but don’t worry, the online anxiety over this change seems to have been misplaced.

The cost of Cinema 4D R21 has been substantially reduced, kicking off what Maxon is calling its “3D for the Real World” initiative. Maxon wants it to be the tool you choose for your graphics needs.

If you plan on upgrading every year or two, the new subscription-based model seems to be a great deal:

– Cinema 4D subscription paid annually: $59.99/month
– Cinema 4D subscription paid monthly: $94.99/month
– Cinema 4D subscription with Redshift paid annually: $81.99/month
– Cinema 4D subscription with Redshift paid monthly: $116.99/month
– Cinema 4D perpetual pricing: $3,495 (upgradeable)
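If you’re weighing those options, the arithmetic is easy to check. A quick sketch using the list prices above (illustrative only; actual upgrade and crossgrade pricing may differ):

```python
# Compare Cinema 4D R21 pricing options (USD, list prices as announced).
ANNUAL_PLAN = 59.99 * 12    # subscription billed annually, per year
MONTHLY_PLAN = 94.99 * 12   # subscription billed monthly, per year
PERPETUAL = 3495.00         # one-time perpetual license

annual_total = round(ANNUAL_PLAN, 2)    # 719.88 per year
monthly_total = round(MONTHLY_PLAN, 2)  # 1139.88 per year
years_to_match = PERPETUAL / ANNUAL_PLAN  # roughly 4.9 years of annual billing
```

In other words, paying annually saves about $420 a year over monthly billing, and the perpetual license only pulls ahead of the annual subscription after roughly five years, which is why the subscription looks like a good deal if you upgrade every year or two.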

Maxon did mention that if you have previously purchased Cinema 4D, there will be subscription-based upgrade/crossgrade deals coming.

The Updates
Cinema 4D R21 includes some great updates that will be welcomed by many users, both new and experienced. The new Field Force dynamics object allows the use of dynamic forces in modeling and animation within the MoGraph toolset. Caps and bevels have an all-new system that not only allows the extrusion of 3D logos and text effects but also means caps and bevels are integrated on all spline-based objects.

Furthering Cinema 4D’s integration with third-party apps, there is an all-new Mixamo Control rig allowing you to easily control any Mixamo characters. (If you haven’t checked out the models from Mixamo, you should. It’s a great way to find character rigs fast.)

An all-new Intel Open Image Denoise integration has been added to R21 in what seems like part of a rendering revolution for Cinema 4D. From the acquisition of Redshift to this integration, Maxon is expanding its third-party reach and doesn’t seem scared.

There is a new Node Space, which shows what materials are compatible with chosen render engines, as well as a new API available to third-party developers that allows them to integrate render engines with the new material node system. R21 has overall speed and efficiency improvements, with Cinema 4D supporting the latest processor optimizations from both Intel and AMD.

All this being said, my favorite update — or map toward the future — was actually announced last week. Unreal Engine added Cinema 4D .c4d file support via the Datasmith plugin, which is featured in the free Unreal Studio beta.

Today, Maxon is also announcing its integration with yet another game engine: Unity. In my opinion, the future lies in this mix of real-time rendering alongside real-world television and film production as well as gaming. With Cinema 4D, Maxon is bringing all sides to the table with a mix of 3D modeling, motion-graphics-building support, motion tracking, integration with third-party apps like Adobe After Effects via Cineware, and now integration with real-time game engines like Unreal Engine. Now I just have to learn it all.

Cinema 4D R21 will be available on both macOS and Windows on Tuesday, Sept. 3. In the meantime, watch out for some great SIGGRAPH presentations, including one from my favorite, Mike Winkelmann, better known as Beeple. You can find some past presentations on how he uses Cinema 4D to cover his “Everydays.”

Virtual Production Field Guide: Fox VFX Lab’s Glenn Derry

Just ahead of SIGGRAPH, Epic Games has published a resource guide called “The Virtual Production Field Guide” — a comprehensive look at how virtual production impacts filmmakers, from directors to the art department to stunt coordinators to VFX teams and more. The guide is workflow-agnostic.

The use of realtime game engine technology has the potential to impact every aspect of traditional filmmaking, and the trend is increasingly being used in productions ranging from films like Avengers: Endgame and the upcoming Artemis Fowl to TV series like Game of Thrones.

The Virtual Production Field Guide offers an in-depth look at different types of techniques from creating and integrating high-quality CG elements live on set to virtual location scouting to using photoreal LED walls for in-camera VFX. It provides firsthand insights from award-winning professionals who have used these techniques – including directors Kenneth Branagh and Wes Ball, producers Connie Kennedy and Ryan Stafford, cinematographers Bill Pope and Haris Zambarloukos, VFX supervisors Ben Grossmann and Sam Nicholson, virtual production supervisors Kaya Jabar and Glenn Derry, editor Dan Lebental, previs supervisor Felix Jorge, stunt coordinators Guy and Harrison Norris, production designer Alex McDowell, and grip Kim Heath.

As mentioned, the guide is dense with information, so we decided to run an excerpt to give you an idea of what it covers.

Glenn Derry

Here is an interview with Glenn Derry, founder and VP of visual effects at Fox VFX Lab, which offers a variety of virtual production services with a focus on performance capture. Derry is known for his work as a virtual production supervisor on projects like Avatar, Real Steel and The Jungle Book.

Let’s find out more.

How has performance capture evolved since projects such as The Polar Express?
In those earlier eras, there was no realtime visualization during capture. You captured everything as a standalone piece, and then you did what they called the director layout. After the fact, you would assemble the animation sequences from the captured motion data. Today, we’ve got a combo platter where we’re able to visualize in realtime.
When we bring a cinematographer in, he can start lining up shots with another device called the hybrid camera. It’s a tracked reference camera that he can handhold. I can immediately toggle between an Unreal overview or a camera view of that scene. The earlier process was minimal in terms of aesthetics. We did everything we could in MotionBuilder, and we made it look as good as it could. Now we can make a lot more mission-critical decisions earlier in the process because the aesthetics of the renders look a lot better.

What are some additional uses for performance capture?
Sometimes we’re working with a pitch piece, where the studio is deciding whether they want to make a movie at all. We use the capture stage to generate what the director has in mind tonally and how the project could feel. We could do either a short little pitch piece or, for something like Call of the Wild, we created 20 minutes and three key scenes from the film to show the studio we could make it work.

The second the movie gets greenlit, we flip over into preproduction. Now we’re breaking down the full script and working with the art department to create concept art. Then we build the movie’s world out around those concepts.

We have our team doing environmental builds based on sketches. Or in some cases, the concept artists themselves are in Unreal Engine doing the environments. Then our virtual art department (VAD) cleans those up and optimizes them for realtime.

Are the artists modeling directly in Unreal Engine?
The artists model in Maya, Modo, 3ds Max, etc. — we’re not particular about the application as long as the output is FBX. The look development, which is where the texturing happens, is all done within Unreal. We’ll also have artists working in Substance Painter and it will auto-update in Unreal. We have to keep track of assets through the entire process, all the way through to the last visual effects vendor.

How do you handle the level of detail decimation so realtime assets can be reused for visual effects?
The same way we would work on AAA games. We begin with high-resolution detail and then use combinations of texture maps, normal maps and bump maps. That allows us to get high-texture detail without a huge polygon count. There are also some amazing LOD [level of detail] tools built into Unreal, which enable us to take a high-resolution asset and derive something that looks pretty much identical unless you’re right next to it, but runs at a much higher frame rate.
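The LOD idea Derry describes, swapping lighter meshes in as an asset recedes from camera, can be pictured as a simple distance-threshold lookup. A generic sketch (this is not Unreal’s actual LOD system, and the threshold values are made up for illustration):

```python
def pick_lod(distance, thresholds):
    """Return an LOD index for a given camera distance.

    thresholds are ascending switch distances, e.g. [10, 50, 200]:
    LOD0 inside 10 units, LOD1 out to 50, LOD2 out to 200,
    and the cheapest mesh (LOD3) beyond that.
    """
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)
```

Production engines typically switch on projected screen size rather than raw distance, but the principle is the same: the high-resolution asset is only paid for when the camera is close enough to see the difference.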

Do you find there’s a learning curve for crew members more accustomed to traditional production?
We’re the team productions come to do realtime on live-action sets. That’s pretty much all we do. That said, it requires prep, and if you want it to look great, you have to make decisions. If you were going to shoot rear projection back in the 1940s or Terminator 2 with large rear projection systems, you still had to have all that material pre-shot to make it work.
It’s the same concept in realtime virtual production. If you want to see it look great in Unreal live on the day, you can’t just show up and decide. You have to pre-build that world and figure out how it’s going to integrate.

The visual effects team and the virtual production team have to be involved from day one. They can’t just be brought in at the last minute. And that’s a significant change for producers and productions in general. It’s not that it’s a tough nut to swallow, it’s just a very different methodology.

How does the cinematographer collaborate with performance capture?
There are two schools of thought: one is to work live with camera operators, shooting the tangible part of the action that’s going on, as the camera is an actor in the scene as much as any of the people are. You can choreograph it all out live if you’ve got the performers and the suits. The other version of it is treated more like a stage play. Then you come back and do all the camera coverage later. I’ve seen DPs like Bill Pope and Caleb Deschanel pick this right up.

How is the experience for actors working in suits and a capture volume?
One of the harder problems we deal with is eye lines. How do we assist the actors so that they’re immersed in this, and they don’t just look around at a bunch of gray box material on a set. On any modern visual effects movie, you’re going to be standing in front of a 50-foot-tall bluescreen at some point.

Performance capture is in some ways more actor-centric than a traditional set because there aren’t all the other distractions in a volume, such as complex lighting and camera setup time. The director gets to focus on the actors. The challenge is getting the actors to interact with something unseen. We’ll project pieces of the set on the walls and use lasers for eye lines. The quality of the HMDs today is also excellent for showing the actors what they would be seeing.

How do you see performance capture tools evolving?
I think a lot of the stuff we’re prototyping today will soon be available to consumers, home content creators, YouTubers, etc. A lot of what Epic develops also gets released in the engine. Money won’t be the driver in terms of being able to use the tools, your creative vision will be.

My teenage son uses Unreal Engine to storyboard. He knows how to do fly-throughs and use the little camera tools we built — he’s all over it. As it becomes easier to create photorealistic visual effects in realtime with a smaller team and at very high fidelity, the movie business will change dramatically.

Something that used to cost $10 million to produce might be a million or less. It’s not going to take away from artists; you still need them. But you won’t necessarily need these behemoth post companies because you’ll be able to do a lot more yourself. It’s just like desktop video — what used to take hundreds of thousands of dollars’ worth of Flame artists, you can now do yourself in After Effects.

Do you see new opportunities arising as a result of this democratization?
Yes, there are a lot of opportunities. High-quality, good-looking CG assets are still expensive to produce and expensive to make look great. There are already stock sites like TurboSquid and CGTrader where you can purchase beautiful assets economically.

But for the final assembly and coalescing of environments and characters, there’s still a lot of need for talented people to do it effectively. I can see companies emerging out of that necessity. We spend a lot of time talking about assets because they’re the core of everything we do. You need a set to shoot on and you need compelling characters, which is why actors won’t go away.

What’s happening today isn’t even the tip of the iceberg. There are going to be 50 more big technological breakthroughs along the way. There’s tons of new content being created for Apple, Netflix, Amazon, Disney+, etc. And they’re all going to leverage virtual production.
What’s changing is previs’ role and methodology in the overall scheme of production.
While you might have previously conceived of previs as focused on the pre-production phase of a project and less integral to production, that conception shifts with a realtime engine. Previs is also typically a hands-off collaboration: in a traditional pipeline, a previs artist receives creative notes and art direction, then goes off to create animation and presents it back to the creatives later for feedback.

In the realtime model, because the assets are directly malleable and rendering time is not a limiting factor, creatives can be much more directly and interactively involved in the process. This leads to higher levels of agency and creative satisfaction for all involved. This also means that instead of working with just a supervisor you might be interacting with the director, editor and cinematographer to design sequences and shots earlier in the project. They’re often right in the room with you as you edit the previs sequence and watch the results together in realtime.

Previs imagery has continued to increase in visual fidelity, which means a closer relationship between previs and final-pixel image quality. When the assets you develop as a previs artist are of sufficient quality, they may form the basis of final models for visual effects. The line between previs and final will continue to blur.

The efficiency of modeling assets only once is evident to all involved. By spending the time early in the project to create models of a very high quality, post begins at the outset of a project. Instead of waiting until the final phase of post to deliver the higher-quality models, the production has those assets from the beginning. And the models can also be fed into ancillary areas such as marketing, games, toys and more.

Meet the Artist: The Mill’s Anne Trotman

Anne Trotman is a senior Flame artist and VFX supervisor at The Mill in New York. She specializes in beauty and fashion work but gets to work on a variety of other projects as well.

A graduate of King’s College London, Trotman took on what she calls “a lot of very random temp jobs” before finally joining London’s Blue Post Production as a runner.

“In those days a runner did a lot of ‘actual’ running around Soho, dropping off tapes and picking up lunches,” she says, admitting she was also sent out for extra green for color bars and warm sake at midnight. After being promoted to the machine room, she spent her time assisting all the areas of the company, including telecine grading, offline, online, VFX and audio. “This gave me a strong understanding of the post production process as a whole.”

Trotman then joined Prime Focus London, where the 2D VFX teams from Blue, Clear Post Production, The Hive and VTR were combined into a single team. She moved into film compositing, where she headed up the 2D team as a senior Flame operator, overseeing projects, including shot allocation and VFX reviews. She then joined SFG-Technicolor’s commercials facility in Shanghai, and after a year in China she joined The Mill in New York, where she is today.

We reached out to Trotman to find out more about The Mill, a technology and visual effects studio, how she works and some recent projects. Enjoy.

Bumble

Can you talk about some recent high-profile projects you’ve completed?
The most recent high-profile project I’ve worked on was Bumble’s Super Bowl 2019 spot, which was its first commercial ever. Because Bumble is a female-founded company, it was important for this project to celebrate female artists and empowerment, something I strongly support. I was thrilled to lead an all-female team for this project. The agency creatives and producers were all female, and so was almost the whole post team, including the editor, colorist and all the VFX artists.

How did you first learn Flame, and how has your use of it evolved over the years?
I had been assisting artists working on a Quantel Editbox at Blue. They then installed a Flame and hired a female artist who had worked on Gladiator. That’s when I knew I had found my calling. Working with technical equipment was very attractive to me, and in those days it was a dark art, and you had to work in a company to get your hands on one. I worked nights doing a lot of conforming and rotoscoping. I also started doing small jobs for clients I knew well. I remember assisting on an Adele pop video, which is where my love of beauty started.

When I first started using Flame, the whole job was usually completed by one artist. These days, jobs are much bigger, and with so many versions for social media, much of my day is spent coordinating the team of artists. Workshare and remote artists are becoming a big part of our industry, so communicating with artists all over the world to bring everything together into the final film has become a big part of my job.

In addition to Flame, what other tools are used in your workflow?
Post production has changed so much in the past five years. My job is not just to press buttons on a Flame to get a commercial on television anymore; that’s only a small part. My job is to help the director and/or the agency position a brand and connect it with the consumer.

My workflow usually starts with bidding an agency or a director’s brief. Sometimes they need tests to sell an idea to a client. I might supervise a previs artist on Maxon Cinema 4D to help them achieve the director’s vision. I attend most of the shoots, which gives me insight into the project while assessing the client’s goals and vision. I can take Flame on a laptop to my shoots to do tests for the director to help explain how certain shots will look after post. This process is so helpful all around: it lets me check that what we are shooting is correct and helps the client understand the director’s vision.

At The Mill, I work closely with the colorists who work on FilmLight Baselight before completing the work on Flame. All the artists at The Mill use Flame and Foundry Nuke, although my Flame skills are 100% better than my Nuke skills.

What are the most fulfilling aspects of the work you do?
I’m lucky to work with many directors and agency creatives that I now call friends. It still gives me a thrill when I’m able to interpret the vision of the creative or director to create the best work possible and convey the message of the brand.

I also love working with the next generation of artists. I especially love being able to work alongside the young female talent at The Mill. This is the first company I’ve worked at where I’ve not been “the one and only female Flame artist.”

At The Mill NY, we currently have 11 full-time female 2D artists on our team, which has a 30/70 female-to-male ratio. There’s still a way to go to get to 50/50, so if I can inspire another female intern or runner who is thinking of becoming a VFX artist or colorist, then it’s a good day. Helping the cycle continue for female artists is so important to me.

What is the greatest challenge you’ve faced in your career?
Moving to Shanghai. Not only did I have the challenge of the language barrier to overcome but also the culture — from having lunch at noon to working with clients from a completely different background than mine. I had to learn all I could about the Chinese culture to help me connect with my clients.

Covergirl with Issa Rae

Out of all of the projects you’ve worked on, which one are you the most proud of?
There are many, but one that stands out is the Covergirl brand relaunch (2018) for director Matt Lambert at Prettybird. As an artist working on high-profile beauty brands, what they stand for is very important to me. I know every young girl will want to use makeup to make themselves feel great, but it’s so important to make sure young women are using it for the right reason. The new tagline “I am what I make-up” — together with a very diverse group of female ambassadors — was such a positive message to put out into the world.

There was also 28 Weeks Later, a feature film from director Juan Carlos Fresnadillo. My first time working on a feature was an amazing experience. I got to make lifelong friends working on this project. My technical abilities as an artist grew so much that year, from learning the patience needed to work on the same shot for two months to discovering the technical difficulties in compositing fire to be able to blow up parts of London. Such fun!

Finally, there was also a spot for the Target Summer 2019 campaign. It was directed by Whitelabel’s Lacey, who I collaborate with on a lot of projects. Tristan Sheridan was the DP, and the agency was Mother NY.

Target Summer Campaign

What advice do you have for a young professional trying to break into the industry?
Try everything. Don’t get pigeonholed into one area of the industry too early on. Learn about every part of the post process; it will be so helpful to you as you progress through your career.

I was lucky my first boss in the industry (Dave Cadle) was patient and gave me time to find out what I wanted to focus on. I try to be a positive mentor to the young runners and interns at The Mill, especially the young women. I was so lucky to have had female role models throughout my career, from the person that employed me to the first person that started training me on Flame. I know how important it is to see someone like you in a role you are thinking of pursuing.

Outside of work, how do you enjoy spending your free time?
I travel as much as I can. I love learning about new cultures; it keeps me grounded. I live in New York City, which is a bubble, and if you stay here too long, you start to forget what the real world looks like. I also try to give back when I can. I’ve been helping a director friend of mine with some films focusing on the issue of female homelessness around the world. We collaborated on some lovely films about women in LA and are currently working on some London-based ones.

You can find out more here.

Anne Trotman Image: Photo by Olivia Burke

Conductor boosts its cloud rendering with Amazon EC2

Conductor Technologies’ cloud rendering platform will now support Amazon Web Services (AWS) and Amazon Elastic Compute Cloud (Amazon EC2), bringing the virtual compute resources of AWS to Conductor customers. This new capability will provide content production studios working in visual effects, animation and immersive media access to new, secure, powerful resources that will allow them — according to the company — to quickly and economically scale render capacity. Amazon EC2 instances, including cost-effective Spot Instances, are expected to be available via Conductor this summer.

“Our goal has always been to ensure that Conductor users can easily access reliable, secure instances on a massive scale. AWS has the largest and most geographically diverse compute, and the AWS Thinkbox team, which is highly experienced in all facets of high-volume rendering, is dedicated to M&E content production, so working with them was a natural fit,” says Conductor CEO Mac Moore. “We’ve already been running hundreds of thousands of simultaneous cores through Conductor, and with AWS as our preferred cloud provider, I expect we’ll be over the million simultaneous core mark in no time.”

Simple to deploy and highly scalable, Conductor is equally effective as an off-the-shelf solution or customized to a studio’s needs through its API. Conductor’s intuitive UI and accessible analytics provide a wealth of insightful data for keeping studio budgets on track. Apps supported by Conductor include Autodesk Maya and Arnold; Foundry’s Nuke, Cara VR, Katana, Modo and Ocula; Chaos Group’s V-Ray; Pixar’s RenderMan; Isotropix’s Clarisse; Golaem; Ephere’s Ornatrix; Yeti; and Miarmy. Additional software and plug-in support is in progress and may be available upon request.

Some background on Conductor: it’s a secure cloud-based platform that enables VFX, VR/AR and animation studios to seamlessly offload rendering and simulation workloads to the public cloud. As the only rendering service that is scalable to meet the exact needs of even the largest studios, Conductor easily integrates into existing workflows, features an open architecture for customization, provides data insights and can implement controls over usage to ensure budgets and timelines stay on track.

Technicolor opens prepro studio in LA

Technicolor is opening a new studio in Los Angeles dedicated to creating a seamless pipeline for feature projects — from concept art and visualization through virtual production, production and into final VFX.

As new distribution models increase the demand for content, Technicolor Pre-Production will provide the tools, the talent and the space for creatives to collaborate from day one of their project – from helping set the vision at the start of a job to ensuring that the vision carries through to production and VFX. The result is a more efficient filmmaking process.

Technicolor Pre-Production studio is headed by Kerry Shea, an industry veteran with over 20 years of experience. She is no stranger to this work, having held executive positions at Method Studios, The Third Floor, Digital Domain, The Jim Henson Company, DreamWorks Animation and Sony Pictures Imageworks.

Kerry Shea

Credited on more than 60 feature films including The Jungle Book, Pirates of the Caribbean: Dead Men Tell No Tales and Guardians of the Galaxy Vol. 2, Shea has an extensive background in VFX and post production, as well as live action, animatronics and creature effects.

While the Pre-Production studio stands apart from Technicolor’s visual effects studios — MPC Film, Mill Film, MR. X and Technicolor VFX — it can work seamlessly in conjunction with one or any combination of them.

The Technicolor Pre-Production Studio will comprise five key departments:
– The Business Development Department will work with clients, from project budgeting to consulting on VFX workflows, to help plan and prepare projects for a smooth transition into VFX.
– The VFX Supervisors Department will offer creative supervision across all aspects of VFX on client projects, whether delivered by Technicolor’s studios or third-party vendors.
– The Art Department will work with clients to understand their vision – including characters, props, technologies, and environments – creating artwork that delivers on that vision and sets the tone for the rest of the project.
– The Virtual Production Department will partner with filmmakers to bridge the gap between them and VFX through the production pipeline. Working on the ground and on location, the department will deliver a fully integrated pipeline and shooting services with the flexibility of a small, manageable team — allowing critical players in the filmmaking process to collaborate, view and manipulate media assets and scenes across multiple locations as the production process unfolds.
– The Visualization Department will deliver visualizations that will assist in achieving on screen exactly what clients envisioned.

“With the advancements of tools and technologies, such as virtual production, filmmaking has reached an inflection point, one in which storytellers can redefine what is possible on-set and beyond,” says Shea. “I am passionate about the increasing role and influence that the tools and craft of visual effects can have on the production pipeline and the even more important role in creating more streamlined and efficient workflows that create memorable stories.”

EP Nick Litwinko leads Nice Shoes’ new long-form VFX arm

NYC-based creative studio Nice Shoes has hired executive producer Nick Litwinko to lead its new film and episodic VFX division. Litwinko, who has built his career on bringing a serial entrepreneur’s approach to the development of creative studios, will grow the division, recruiting talent to bring a boutique, collaborative approach to visual effects for long-form entertainment projects, with a focus on feature film and episodic work.

Since coming on board with Nice Shoes, Litwinko and his team already have three long-form projects underway and will continue working to sign on new talent.

Litwinko launched his career at MTV during the height of its popularity, working as a senior producer for MTV Promos/Animation before stepping up as executive producer/director for MTV Commercials. After a decade-long tenure, he launched his own company, Rogue Creative, where he served dual roles as EP and director and oversaw a wide range of animated, live-action and VFX-driven branded campaigns. He was later named senior producer for Psyop New York before launching the New York office of Blind. He moved on to join First Avenue Machine as executive producer/head of production, and was then recruited by Shooters Inc. as managing director, building the company’s NYC offices and leading its strategic rebrand to Alkemy X.

Behind the Title: Neko founder Lirit Rosenzweig Topaz

NAME: Lirit Rosenzweig Topaz

COMPANY: Burbank’s Neko Productions

CAN YOU DESCRIBE YOUR COMPANY?
We are an animation studio working on games, TV, film, digital, AR, VR and promotional projects in a variety of styles, including super-cartoony and hyper-realistic CG and 2D. We believe in producing the best product for the budget, and giving our clients and partners peace of mind.

WHAT’S YOUR JOB TITLE?
Founder/Executive Producer

WHAT DOES THAT ENTAIL?
I established the company and built it from scratch. I am the face of the company and the force behind it. I am in touch with our clients and potential clients to make sure all are getting the best service possible.

Dr. Ruth doc

I am a part of the hiring process, making sure our team meets the standards of creativity, communication ability, responsibility and humanness. It is important for me to make sure all of our team members are great human beings, as well as being amazing and talented artists. I oversee all projects and make sure the machine is working smoothly to everyone’s satisfaction.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I am always looking at the big picture, but at the micro level as well. I need to be aware of so many smaller details, making sure everything is running smoothly for both sides, employees and clients.

WHAT HAVE YOU LEARNED OVER THE YEARS ABOUT RUNNING A BUSINESS?
I have learned that it is a roller coaster and one should enjoy the ride, and that one day doesn’t look like another day. I learned that if you are true to yourself, stick to your objectives and listen to your inner voice while doing a great job, things will work out. I always remember we are all human beings; you can succeed as a business person and have people and clients love working with you at the same time.

A LOT OF IT MUST BE ABOUT TRYING TO KEEP EMPLOYEES AND CLIENTS HAPPY. HOW DO YOU BALANCE THAT?
For sure! That is the key for everything. When employees are happy, they give their heart and soul. As a result, the workplace becomes a place they appreciate, not just a place they need to go to earn a living. Happy clients mean that you did your job well. I balance it by checking in with my team to make sure all is well by asking them to share with me any concerns they may have. At the end of the day, when the team is happy, they do a good job, and that results in satisfied clients.

WHAT’S YOUR FAVORITE PART OF THE JOB?
It is important to me that everybody comes to work with a smile on their face and that we are a united team with the goal of creating great projects. That usually results in thinking outside the box and looking for ways to be efficient, push the envelope and keep creativity at the highest level. I also enjoy working on projects similar to ones we’ve done in the past while taking on projects and styles we haven’t done before.

Dr. Ruth doc

I like the fact that I am a woman running a company. Being a woman allows me to juggle well, be on top of a few things at the same time and still be caring and loving.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I have two. One is the beginning of the day when I know I have a full day ahead of me to create work, influence, achieve and do many things. Two is the evening, when I am back home with my family.

CAN YOU NAME SOME RECENT CLIENTS?
Sega, Wayforward and the recent Ask Dr. Ruth documentary.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My iPhone, my iPad and my computer.

Bipolar Studio gives flight to Uber Air campaign

Tech company Uber has announced its latest transportation offering: an aerial ride-sharing service. With plans to begin service within cities as soon as 2023, Uber Elevate launched its marketing campaign today at the Uber Elevate Summit. Uber Elevate picked LA-based creative and production boutique Bipolar Studio to create the integrated campaign, which includes a centerpiece film, an experiential VR installation, print stills and social content.

The campaign’s centerpiece film, Airborne, includes stunning imagery that is 100 percent CGI. Beginning on an aerial mass transit platform at Mission Bay HQ, the flight travels across the city of San Francisco, above landmarks like the Chase Center, where the Warriors play, and the Transamerica Pyramid. Lights blink on the ground below, and buildings appear as silhouettes in the far background. The Uber Air flight lands in Santa Clara on Uber’s skytower with a total travel time of 18 minutes, compared to an hour or more driving through rush hour traffic. Multi-floor docking will allow Uber Air to land up to 1,000 eVTOLs (those futuristic-looking vehicles that hover, take off and land vertically) per hour.

At the Uber Elevate Summit, attendees had the opportunity to experience a full flight inside a built-out physical cabin via a high-fidelity four-minute VR installation. After the Summit, the installation will travel to Uber events globally. Still images and social media content will further extend the campaign’s reach.

Uber Elevate’s head of design, John Badalamenti, explains, “We worked shoulder-to-shoulder with Bipolar Studio to create an entirely photoreal VR flight experience, detailed at a high level of accuracy from the physics of flight and advanced flight patterns, down to the dust on the windows. This work represents a powerful milestone in communicating our vision through immersive storytelling and creates a foundation for design iteration that advances our perspective on the rider experience. Bipolar took things a step beyond that as well, creating Airborne, our centerpiece 2D film, enabling future Uber Air passengers to take in the breadth and novelty of the journey outside the cabin from the perspective of the skyways.”

Bipolar developed a bespoke AI-fueled pipeline that could capture, manage and process miles and miles of actual data, then faithfully mirror the real terrain, buildings, traffic and scattered people in cities. They then reused the all-digital assets, which gave them full freedom to digitally scout the city for locations for Airborne. When shooting the spot, they were able to place the CG camera anywhere in the digital city to capture the aircraft, just as with live-action production. This gave the team a lot of room to play.

For the animation work, they built a new system through Side Effects Houdini where the flight of the vehicle wasn’t animated but rather simulated with real physics. The team coded a custom plugin to be able to punch in a number for the speed of the aircraft, its weight, and direction, then have AI do everything else. This allowed them to see it turn on the flight path, respond to wind turbulence and even oscillate when taking off. It also allowed them to easily iterate, change variables and get precise dynamics. They could then watch the simulations play out and see everything in realtime.
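The simulated-rather-than-keyframed approach can be illustrated with a toy version. Bipolar's actual Houdini plugin and its parameters aren't public, so all names, gains and the drag model below are assumptions; the point is that the craft's motion emerges from forces integrated each frame rather than being animated by hand:

```python
# Toy sketch of simulated (not keyframed) flight: integrate simple forces
# each frame so easing into a commanded speed falls out of the physics.
# All constants here are illustrative, not the plugin's real parameters.

def simulate(target_speed, mass, frames, dt=1 / 24, thrust_gain=50.0, drag=0.3):
    """Euler-integrate forward speed toward `target_speed`; returns the
    per-frame speeds (24 fps timestep)."""
    v, speeds = 0.0, []
    for _ in range(frames):
        force = thrust_gain * (target_speed - v) - drag * v  # thrust vs. drag
        v += (force / mass) * dt
        speeds.append(v)
    return speeds

# A 500 kg craft commanded to 40 m/s: speed ramps up smoothly and approaches
# the target asymptotically instead of snapping to it.
speeds = simulate(target_speed=40.0, mass=500.0, frames=240)
assert speeds[0] > 0 and speeds[-1] < 40.0
```

Changing a variable such as mass or thrust gain and re-running is exactly the kind of cheap iteration the team describes: no animation curves to redo, just new dynamics to watch in realtime.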

City Buildings
To bring this to life, Bipolar had to entirely digitize San Francisco. They spent a lot of time creating a pipeline and built the entire city with miles and miles of actual data that matched the terrain and buildings precisely. They then detailed the buildings and used AI to generate moving traffic — and even people, if you can spot them — to fill the streets. Some of the areas required a LIDAR scan for rebuilding. The end result is an incredibly detailed digital recreation of San Francisco. Each of the houses is a full model with windows, walls and doors. Each of the lights in the distance is a car. Even Alcatraz is there. They took the same approach to Santa Clara.

Data Management
Bipolar rendered out 32-bit EXRs in 4K, with each frame having multiple layers for maximum control by the client in the comp stage. That gave them a ton of data and a huge number of raw files to deal with. Thankfully, it wasn’t the studio’s first time dealing with massive amounts of data; its internal infrastructure is already set up to handle a high volume of data being worked on simultaneously. In certain cases, they were also able to use the SSDs on their servers to render comps and pre-comps faster.
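Some back-of-the-envelope arithmetic shows why those frames add up quickly. Only the 4K/32-bit figures come from the article; the resolution (4096×2160), channel layout, layer count and spot length below are illustrative assumptions:

```python
# Rough storage math for uncompressed 32-bit float EXR layers:
# width x height x channels x 4 bytes per sample.

def layer_bytes(width=4096, height=2160, channels=4, bytes_per_sample=4):
    return width * height * channels * bytes_per_sample

per_layer_mb = layer_bytes() / 2**20
print(round(per_layer_mb))  # 135 MB per RGBA layer before compression

# A hypothetical 90-second piece at 24 fps with, say, 6 render layers per frame:
frames = 90 * 24
total_tb = frames * 6 * layer_bytes() / 2**40
print(round(total_tb, 1))  # on the order of a couple of terabytes
```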

Review: Red Giant’s VFX Suite plugins

By Brady Betzel

If you have ever watched After Effects tutorials, you are bound to have seen the people who make up Red Giant. There is Aharon Rabinowitz, who you might mistake for a professional voiceover talent; Seth Worley, who can combine a pithy sense of humor and over-the-top creativity seamlessly; and my latest man-crush Daniel Hashimoto, better known as “Action Movie Dad” of Action Movie Kid.

In these videos, these talented pros show off some amazing things they created using Red Giant’s plugin offerings, such as the Trapcode Suite, the Magic Bullet Suite, Universe and others.

Now, Red Giant is trying to improve your visual effects workflow even further with the new VFX Suite for Adobe After Effects (although some work in Adobe Premiere as well).

The new VFX Suite is a compositing-focused toolkit that will complement many aspects of your work, from greenscreen keying to motion graphics compositing with tools such as Video Copilot’s Element 3D. Whether you want to seamlessly composite light and atmospheric fog with fewer pre-composites, add a reflection to an object easily or even just have a better greenscreen keyer, the VFX Suite will help.

The VFX Suite includes Supercomp, Primatte Keyer 6, King Pin Tracker, Spot Clone Tracker, Optical Glow, Chromatic Displacement, Knoll Light Factory 3.1, Shadow and Reflection. The VFX Suite is priced at $999 unless you qualify for the academic discount, which means you can get it for $499.

In this review, I will go over each of the plugins within the VFX Suite. Up first will be Primatte Keyer 6.

Overall, I love Red Giant’s GUIs. They seem to be a little more intuitive, allowing me to work more “creatively” as opposed to spending time figuring out technical issues.

I asked Red Giant what makes VFX Suite so powerful and Rabinowitz, head of marketing for Red Giant and general post production wizard, shared this: “Red Giant has been helping VFX artists solve compositing challenges for over 15 years. For VFX Suite, we looked at those challenges with fresh eyes and built new tools to solve them with new technologies. Most of these tools are built entirely from scratch. In the case of Primatte Keyer, we further enhanced the UI and sped it up dramatically with GPU acceleration. Primatte Keyer 6 becomes even more powerful when you combine the keying results with Supercomp, which quickly turns your keyed footage into beautifully comped footage.”

Primatte Keyer 6
Primatte is a chromakey/single-color keying technology used in tons of movies and television shows. I got familiar with Primatte when BorisFX included it in its Continuum suite of plugins. Once I used Primatte and learned the intricacies of extracting detail from hair and even just using their auto-analyze function, I never looked back. On occasion, Primatte needs a little help from others, like Keylight, but I can usually pull easy and tough keys all within one or two instances of Primatte.

If you haven’t used Primatte before, you essentially pick your key color by drawing a line or rectangle around the color, adjust the detail and opacity of the matte, and — boom — you’re done. With Primatte 6 you now also get Core Matte, a new feature that draws an inside mask automatically while allowing you to refine the edges — this is a real time-saver when doing hundreds of interview greenscreen keys, especially when someone decides to wear a reflective necklace or piece of jewelry that usually requires an extra mask and tracking. Primatte 6 also adds GPU optimization, gaining even more preview and rendering speed than previous versions.
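Primatte's actual algorithm is proprietary, but the general idea behind single-color keying can be sketched as a color-distance-to-alpha mapping: pixels close to the picked key color go transparent, with a soft ramp at the edges. All the numbers below are illustrative:

```python
# Bare-bones sketch of single-color keying (not Primatte's algorithm):
# map each pixel's distance from the key color to matte opacity.

def key_alpha(pixel, key=(0, 255, 0), tolerance=90.0, softness=60.0):
    """Return matte alpha in [0, 1]: 0 where the pixel matches the key
    color, ramping to 1 (fully foreground) beyond tolerance + softness."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    if dist <= tolerance:
        return 0.0                        # solid key color: transparent
    if dist >= tolerance + softness:
        return 1.0                        # clearly foreground: opaque
    return (dist - tolerance) / softness  # soft edge (hair, motion blur)

print(key_alpha((10, 250, 12)))   # near-pure greenscreen pixel: 0.0
print(key_alpha((200, 60, 40)))   # skin-tone pixel: 1.0
```

Real keyers refine this in many ways (spill suppression, separate matte and edge controls, per-region masks like Core Matte), but the distance-and-ramp core is the part you interact with when you adjust tolerance and softness.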

Supercomp
If you are an editor like me — who knows enough to be dangerous when compositing and working within After Effects — sometimes you just want (or need) a simpler interface without having to figure out all the expressions, layer order, effects and compositing modes to get something to look right. And if you are an Avid Media Composer user, you might have encountered the Paint Effect Tool, which is one of those one-for-all plugins. You can paint, sharpen, blur and much more from inside one tool, much like Supercomp. Think of the Supercomp interface as a Colorista or Magic Bullet Looks-type interface, where you can work with composite effects such as fog, glow, lights, matte chokers, edge blend and more inside of one interface with much less pre-composing.

The effects are all GPU-accelerated and are context-aware. Supercomp is a great tool to use with your results from the Primatte Keyer, adding in atmosphere and light wraps quickly and easily inside one plugin instead of multiple.

King Pin Tracker and Spot Clone Tracker
As an online editor, I am often tasked with sign replacements, paint-outs of crew or cameras in shots, as well as other clean-ups. If I can’t accomplish what I want with Boris FX Continuum while using Mocha inside of Media Composer or Blackmagic’s DaVinci Resolve, I will jump over to After Effects and try my hand there. I don’t practice as much corner pinning as I would like, so I often forget the intricacies when tracking in Mocha and copying Corner Pin or Transform Data to After Effects. This is where the new King Pin Tracker can ease any difficulties, especially when you’re corner pinning relatively simple objects but still need to keyframe positions or perform a planar track without using multiple plugins or applications.

The Spot Clone Tracker is exactly what it says it is. Much like Resolve’s Patch Replace, Spot Clone Tracker allows you to track one area while replacing that same area with another area from the screen. In addition, Spot Clone Tracker has options to flip vertical, flip horizontal, add noise, and adjust brightness and color values. For such a seemingly simple tool, the Spot Clone Tracker is the dark horse in this race. You’d be surprised how many clone and paint tools don’t have adjustments, like flipping and flopping or brightness changes. This is a great tool for quick dead-pixel fixes and painting out GoPros when you don’t need to mask anything out. (Although there is an option to “Respect Alpha.”)

Optical Glow and Knoll Light Factory 3.1
Have you ever been in an editing session that needed police lights amplified or a nice glow on some text but the stock plugins just couldn’t get it right? Optical Glow will solve this problem. In another amazing, simple-yet-powerful Red Giant plugin, Optical Glow can be applied and gamma-adjusted for video, log and linear levels right off the bat.

From there you can pick an inner tint, outer tint and overall glow color via the Colorize tool and set the vibrance. I really love the Falloff, Highlight Rolloff, and Highlights Only functions, which allow you to fine-tune the glow and control just how much of the image it affects. It’s so simple that it is hard to mess up, but the results speak for themselves and render out quicker than with other glow plugins I use.

Knoll Light Factory has been newly GPU-accelerated in Version 3.1 to decrease render times when using its more than 200 presets or when customizing your own lens flares. Optical Glow and Knoll Light Factory really complement each other.

Chromatic Displacement
Since watching an Andrew Kramer tutorial covering displacement, I’ve always wanted to make a video that showed huge seismic blasts but didn’t really want to put the time into properly making chromatic displacement. Lucky for me, Red Giant has introduced Chromatic Displacement! Whether you want to make rain drops appear on the camera lens or add a seismic blast from a phaser, Chromatic Displacement will allow you to offset your background with a glass-, mirror- or even heatwave-like appearance quickly. Simply choose the layer you want to displace from and adjust parameters such as displacement amount, spread and spread chroma, and whether you want to render using the CPU or GPU.

Shadow and Reflection
Red Giant packs Shadow and Reflection plugins into the VFX Suite as well. The Shadow plugin not only makes it easy to create shadows in front of or behind an object based on alpha channel or brightness, but, best of all, it gives you an easy way to identify the point where the shadow should bend. The Shadow Bend option lets you identify where the bend exists, what color the bend axis should be, the type of seam and the seam size, and even allows for motion blur.

The Reflection plugin is very similar to the Shadow plugin and produces quick and awesome reflections without any After Effects wizardry. Just like Shadow, the Reflection plugin allows you to identify a bend. Plus, you can adjust the softness of the reflection quickly and easily.

Summing Up
In the end, Red Giant always delivers great and useful plugins. VFX Suite is no different, and the only downside some might point to is the cost. While $999 is expensive, if compositing is a large portion of your business, the efficiency you gain might outweigh the cost.

Much like Shooter Suite does for online editors, Trapcode Suite does for VFX masters and Universe does for jacks of all trades, VFX Suite will take all of your ideas and help them blend seamlessly into your work.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Quick Chat: Sinking Ship’s Matt Bishop on live-action/CG series

By Randi Altman

Toronto’s Sinking Ship Entertainment is a production, distribution and interactive company specializing in children’s live-action and CGI-blended programming. The company has 13 Daytime Emmys and a variety of other international awards on its proverbial mantel. Sinking Ship has over 175 employees across all its divisions, including its VFX and interactive studio.

Matt Bishop

Needless to say, the company has a lot going on. We decided to reach out to Matt Bishop, founding partner at Sinking Ship, to find out more.

Sinking Ship produces, creates visual effects and posts its own content, but are you also open to outside projects?
Yes, we do work in co-production with other companies or contract our post production services to shows that are looking for cutting-edge VFX.

Have you always created your own content?
Sinking Ship has developed a number of shows and feature films, as well as worked in co-production with production companies around the world.

What came first, your post or your production services? Or were they introduced in tandem?
Both sides of the company evolved together as a way to push our creative visions. We started acquiring equipment on our first series in 2004, and we always look for new ways to push the technology.

Can you mention some of your most recent projects?
Some of our current projects include Dino Dana (Season 4), Dino Dana: The Movie, Endlings and Odd Squad Mobile Unit.

What is your typical path getting content from set to post?
We have been working with Red cameras for years, and we were the first company in Canada to shoot in 4K over a decade ago. We shoot a lot of content, so we create backups in the field before the media is sent to the studio.

Dino Dana

You work with a lot of data. How do you manage and keep all of that secure?
Backups, lots of backups. We use a massive LTO-7 tape robot, and we have over 2PB of backup storage on top of that. We recently added Qumulo to our workflow to ensure the most secure method possible.

What do you use for your VFX work? What about your other post tools?
We use a wide range of software, but our main tools in our creature department are Pixologic ZBrush and Foundry Mari, with all animation happening inside Autodesk Maya.

We also have a large renderfarm to handle the volume of shots, and our render engine of choice is Arnold, which is now an Autodesk product. In post we use an Adobe Creative Cloud pipeline with 4K HDR color grading happening in DaVinci Resolve. Qumulo is going to be a welcome addition as we continue to grow and our outputs become more complex.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Axis provides 1,000 VFX shots for the TV series Happy!

UK-based animation and visual effects house Axis Studios has delivered 1,000 shots across 10 episodes on the second series of the UCP-produced hit Syfy show Happy!.

Based on Grant Morrison and Darick Robertson’s graphic novel, Happy! follows alcoholic ex-cop turned hitman Nick Sax (Christopher Meloni), who teams up with imaginary unicorn Happy (voiced by Patton Oswalt). In the second season, the action moves from Christmastime to “the biggest holiday rebranding of all time” and a plot to “make Easter great again,” courtesy of last season’s malevolent child-kidnapper, Sonny Shine (Christopher Fitzgerald).

Axis Studios, working across its three creative sites in Glasgow, Bristol, and London, collaborated with executive producer and director Brian Taylor and showrunner Patrick Macmanus to raise the bar on the animation of the fully CG character. The studio also worked on a host of supporting characters, including a “chain-smoking man-baby,” a gimp-like Easter Bunny and even a Jeff Goldblum-shaped cloud. Alongside the extensive animation work, the team’s VFX workload greatly increased from the first season — including two additional episodes, creature work, matte painting, cloud simulations, asset building and extensive effects and clean-up work.

Building on the success of the first season, the 100-person team of artists further developed the animation of the lead character, Happy, improving the rig, giving him more nuanced emotions and continually working to integrate him more convincingly into the real-world environments.

RPS editors talk workflow, creativity and Michelob Ultra’s Robots

By Randi Altman

Rock Paper Scissors (RPS) is a veteran editing house specializing in commercials, music videos and feature films. Founded by Oscar-winning editor Angus Wall (The Social Network, The Girl With the Dragon Tattoo), RPS has a New York office as well as a main Santa Monica location that it shares with sister companies A52, Elastic and Jax.

We recently reached out to RPS editor Biff Butler and his assistant editor Alyssa Oh (both Adobe Premiere users) to find out about how they work, their editing philosophy and their collaboration on the Michelob Ultra Robots spot that premiered during this year’s Super Bowl.

Let’s find out more about their process…

Rock Paper Scissors, Santa Monica

What does your job entail?
Biff Butler: Simply put, someone hands us footage (and a script) and we make something out of it. The job is to act as cheerleader for those who have been carrying the weight of a project for weeks, maybe months, and have just emerged from a potentially arduous shoot.

Their job is to then sell the work that we do to their clients, so I must hold onto and protect their vision, maintaining that initial enthusiasm they had. If the agency has written the menu, and the client has ordered the meal, then a director is the farmer and the editor the cook.

I frequently must remind myself that although I might have been hired because of my taste, I am still responsible for feeding others. Being of service to someone else’s creative vision is the name of the game.

What’s your workflow like?
Alyssa Oh: At the start of the project, I receive the footage from production and organize it to Biff’s specs. Once it’s organized, I pass it off and he watches all the footage and assembles an edit. Once we get deeper into the project, he may seek my help in other aspects of the edit, including sound design, pulling music, creating graphics, temporary visual effects and creating animations. At the end of the project, I prep the edits for finishing color, mix, and conform.

What would surprise people about being an editor?
Oh: When I started, I associated editorial with “footage.” It surprised me that, aside from editing, we play a large part in decision-making for music and developing sound design.

Butler: I’ve heard the editor described as the final writer in the process. A script can be written and rewritten, but a lot happens in the edit room once shots are on a screen. The reality of seeing what actually fits within the allotted time that the format allows for can shape decisions as can the ever-evolving needs of the client in question. Another aspect we get involved with is the music — it’s often the final ingredient to be considered, despite how important a role it plays.

Robots

What do you enjoy the most about your job?
Oh: By far, my favorite part is the people that I work with. We spend so much time together; I think it’s important to not just get along, but to also develop close relationships. I’m so grateful to work with people who I look forward to spending the day with.

At RPS, I’ve gained so many great friendships over the years and learned a lot from everyone around me — not just in the aspect of editorial, but also from the people at companies that work alongside us — A52, Elastic and Jax.

Butler: At the risk of sounding corny, what turns me on most is collaboration and connection with other creative talents. It’s a stark contrast to the beginning of the job, which I also very much adore — when it’s just me and my laptop, watching footage and judging shots.

Usually we get a couple days to put something together on our own, which can be a peaceful time of exploration and discovery. This is when I get to formulate my own opinions and points of view on the material, which is good to establish but also is something I must be ready to let go of… or at least be flexible with. Once the team gets involved in the room — be it the agency or the director — the real work begins.

As I said before, being of service to those who have trusted me with their footage and ideas is truly an honorable endeavor. And it’s not just those who hire us, but also talents we get to join forces with on the audio/music side, effects, etc. On second thought, the free supply of sparkly water we have on tap is probably my favorite part. It’s all pretty great.

What’s the hardest part of the job?
Oh: For me, the hardest part of our job is the “peaks and valleys.” In other words, we don’t have a set schedule, and with each project, our work hours will vary.

Robots

Butler: I could complain about the late nights or long weekends or unpredictable schedules, but those are just a result of being employed, so I count myself fortunate that I even get to moan about that stuff. Perhaps one of the trickiest parts is in dealing with egos, both theirs and mine.

Inevitably, I serve as mediator between a creative agency and the director they hired, and the client who is paying for this whole project. Throw my own developing sense of ownership into the mix, and there’s a silly heap of egos to manage. It’s a joy, but not everyone can be fully satisfied all the time.

If you couldn’t edit for a living, what would you do?
Oh: I think I would definitely be working in a creative field or doing something that’s hands-on (I still hope to own a pottery studio someday). I’ve always had a fondness for teaching and working with kids, so perhaps I’d do something in the teaching field.

Butler: I would be pursuing a career in directing commercials and documentaries.

Did you know from a young age that you would be involved in this industry?
Oh: In all honesty, I didn’t know that this would be my path. Originally, I wanted to go into broadcast, specifically sports broadcasting. I had an interest in television production since high school and learned a bit about editing along the way.

However, I had applied to work at RPS as a production assistant shortly after graduating and quickly gained interest in editing and never looked back!

Butler: I vividly recall seeing the movie Se7en in the cinema and being shell-shocked by the opening title sequence. The feeling I was left with was so raw and unfiltered, I remember thinking, “That is what I want to do.” I wasn’t even 100 percent sure what that was. I knew I wanted to put things together! It wasn’t even so much a mission to tell stories, but to evoke emotion — although storytelling is most often the way to get there.

Robots

At the same time, I was a kid who grew up under the spell of some very effective marketing campaigns — from Nike, Jordan, Gatorade — and knew that advertising was a field I would be interested in exploring when it came time to find a real job.

As luck would have it, in 2005 I found myself living in Los Angeles after the rock band I was in broke up, and I walked over to a nearby office an old friend of mine had worked at, looking for a job. She’d told me it was a place where editors worked. Turns out, that place was where many of my favorite ads were edited, and it was founded by the guy who put together that Se7en title sequence. That place was Rock Paper Scissors, and it’s been my home ever since.

Can you guys talk about the Michelob Ultra Robots spot that first aired during the Super Bowl earlier this year? What was the process like?
Butler: The process involved a lot of trust, as we were all looking at frames that didn’t have any of the robots in — they were still being created in CG — so when presenting edits, we would have words floating on screen reading “Robot Here” or “Robot Runs Faster Now.”

It says a lot about the agency in that it could hold the client’s hand through our rough edit and have them buy off on what looked like a fairly empty edit. Working with director Dante Ariola at the start of the edit helped to establish the correct rhythm and intention of what would need to be conveyed in each shot. Holding on to those early decisions was paramount, although we clearly had enough human performances to rest our hats on, too.

Was there a particular cut that was more challenging than the others?
Butler: The final shot of the spot was a battle I lost. I’m happy with the work, especially the quality of human reactions shown throughout. I’m also keen on the spot’s simplicity. However, I had a different view of how the final shot would play out — a closer shot would have depicted more emotion and yearning in the robot’s face, whereas where we landed left the robot feeling more defeated — but you can’t win them all.

Robots

Did you feel extra stress knowing that the Michelob spot would air during the Super Bowl?
Butler: Not at all. I like knowing that people will see the work and having a firm airdate reduces the likelihood that a client can hem and haw until the wheels fall off. Thankfully there wasn’t enough time for much to go wrong!

You’ve already talked about doing more than just editing. What are you often asked to do in addition to just editing?
Butler: Editors are really also music supervisors. There can be a strategy to it, also knowing when to present a track you really want to sell through. But really, it’s that level of trust between myself and the team that can lead to some good discoveries. As I mentioned before, we are often tasked with just providing a safe and nurturing environment for people to create.

Truly, anybody can sit and hit copy and paste all day. I think it’s my job to hold on to that initial seed or idea or vision, and protect it through the final stages of post production. This includes ensuring the color correction, finishing and sound mix all reflect intentions established days or weeks ahead when we were still fresh enough in our thinking to be acting on instinct.

I believe that as creative professionals, we are who we are because of our instincts, but as a job drags on and on, we are forced to act more with our heads than our hearts. There is a stamina that is required, making sure that what ends up on the TV is representative of what was initially coming out of that instinctual artistic expression.

Does your editing hat change depending on the type of project you are cutting?
Butler: No, not really. An edit is an edit. All sessions should involve laughter and seriousness and focus and moments to unwind and goof off. Perhaps the format will determine the metaphorical hat, or to be more specific, the tempo.

Selecting shots for a 30- or 60-second commercial is very different than chasing moments for a documentary or long-form narrative. I’ll often remind myself to literally breathe slower when I know a shot needs to be long, and the efficiency with which I am telling a story is of less importance than the need to be absorbed in a moment.

Can you name some of your favorite technology?
Oh: My iPhone and all the apps that come with it; my Kindle, which allows me to be as indecisive as I want when it comes to picking a book and traveling; my laptop; and noise-cancelling headphones!

Butler: The carbonation of water, wireless earphones and tiny solid-state hard drives.

Zoic in growth mode, adds VFX supervisor Wanstreet, ups Overstrom

VFX house Zoic Studios has made changes to its creative team, adding VFX supervisor Chad Wanstreet to its Culver City studio and promoting Nate Overstrom to creative director in its New York studio.

Wanstreet has nearly 15 years of experience in visual effects, working across series, feature film, commercial and video game projects. He comes to Zoic from FuseFX, where he worked on television series including NBC’s Timeless, Amazon Prime’s The Tick, Marvel’s Agents of S.H.I.E.L.D. for ABC and Starz’s Emmy-winning series Black Sails.

Overstrom has spent over 15 years of his career with Zoic, working across the Culver City and New York City studios, earning two Emmy nominations and working on top series including Banshee, Maniac and Iron Fist. He is currently the VFX supervisor on Cinemax’s Warrior.

The growth of the creative department is accompanied by the promotion of several Zoic lead artists to VFX supervisors, with Andrew Bardusk, Matt Bramante, Tim Hanson and Billy Spradlin stepping up to lead teams on a wide range of episodic work. Bardusk just wrapped Season 4 of DC’s Legends of Tomorrow, Bramante just wrapped Noah Hawley’s upcoming feature film Lucy in the Sky, Hanson just completed Season 2 of Marvel’s Cloak & Dagger, and Spradlin just wrapped Season 7 of CW’s Arrow.

This news comes on the heels of a busy start of the year for Zoic across all divisions, including the recent announcement of the company’s second development deal — optioning New York Times best-selling author Michael Johnston’s fantasy novel Soleri for feature film and television adaptation. Zoic also added Daniel Cohen as executive producer, episodic and series in New York City, and Lauren F. Ellis as executive producer, episodic and series in Culver City.

Main Image Caption: (L-R) Chad Wanstreet and Nate Overstrom

UK’s Jellyfish adds virtual animation studio and Kevin Spruce

London-based visual effects and animation studio Jellyfish Pictures is opening a new virtual animation facility in Sheffield. The new site is the company’s fifth studio in the UK, in addition to its established studios in Fitzrovia, Central London, and in Brixton and Oval, South London. This addition is no surprise considering Jellyfish created one of Europe’s first virtual VFX studios back in 2017.

With no hardware housed onsite, Jellyfish Pictures’ Sheffield studio — situated in the city center within the Cooper Project Complex — will operate in a completely PC-over-IP environment. With all technology and pipeline housed in a centrally-based co-location, the studio is able to virtualize its distributed workstations through Teradici’s remote visualization solution, allowing for total flexibility and scalability.

The Sheffield site will sit on the same logical LAN as the other four studios, providing access to the company’s software-defined storage (SDS) from Pixit Media, enabling remote collaboration and support for flexible working practices. With the rest of Jellyfish Pictures’ studios all TPN-accredited, the Sheffield studio will follow in their footsteps, using Pixit Media’s container solution within PixStor 5.

The innovative studio will be headed up by Jellyfish Pictures’ newest appointment, animation director Kevin Spruce. With a career spanning over 30 years, Spruce joins Jellyfish from Framestore, where he oversaw a team of 120 as the company’s head of animation. During his time at Framestore, Spruce worked as animation supervisor on feature films such as Fantastic Beasts and Where to Find Them, The Legend of Tarzan and Guardians of the Galaxy. Prior to his 17-year stint at Framestore, Spruce held positions at Canadian animation company Bardel Entertainment and Spielberg-helmed feature animation studio Amblimation.

Jellyfish Pictures’ northern presence will start off with a small team of animators working on the company’s original animation projects, with a view to expanding the team and taking on a large feature animation project by the end of the year.

“We have multiple projects coming up that will demand crewing up with the very best talent very quickly,” reports Phil Dobree, CEO of Jellyfish Pictures. “Casting off the constraints of infrastructure, which traditionally has been the industry’s way of working, means we are not limited to the London talent pool and can easily scale up in a more efficient and economical way than ever before. We all know London, and more specifically Soho, is an expensive place to play, both for employees working here and for the companies operating here. Technology is enabling us to expand our horizon across the UK and beyond, as well as offer talent a way out of living in the big city.”

For Spruce, the move made perfect sense: “After 30 years working in and around Soho, it was time for me to move north and settle in Sheffield to achieve a better work-life balance with family. After speaking with Phil, I was excited to discover he was interested in expanding his remote operation beyond London. With what technology can offer now, the next logical step is to bring the work to people rather than always expecting them to move south.

“As animation director for Jellyfish Pictures Sheffield, it’s my intention to recruit a creative team here to strengthen the company’s capacity to handle the expanding slate of work currently in-house and beyond. I am very excited to be part of this new venture north with Jellyfish. It’s a vision of how creative companies can grow in new ways and access talent pools farther afield.”


Amazon’s Good Omens: VFX supervisor Jean-Claude Deguara

By Randi Altman

Good versus evil. It’s a story that’s been told time and time again, but Amazon’s Good Omens turns that trope on its head a bit. With Armageddon approaching, two unlikely heroes and centuries-long frenemies — an angel (Michael Sheen) and demon (David Tennant) — team up to try to fight off the end of the world. Think buddy movie, but with the fate of the world at stake.

In addition to Tennant and Sheen, the Good Omens cast is enviable — featuring Jon Hamm, Michael McKean, Benedict Cumberbatch and Nick Offerman, just to name a few. The series is based on the 1990 book by Terry Pratchett and Neil Gaiman.

Jean-Claude Deguara

As you can imagine, this six-part end-of-days story features a variety of visual effects, from creatures to environments to particle effects and fire. London’s Milk was called on to provide 650 visual effects shots, and its co-founder Jean-Claude Deguara supervised them all.

He was also able to talk directly with Gaiman, which he says was a huge help. “Having access to Neil Gaiman as the author of Good Omens was just brilliant, as it meant we were able to ask detailed questions to get a more detailed brief when creating the VFX and receive such insightful creative feedback on our work. There was never a question that couldn’t be answered. You don’t often get that level of detail when you’re developing the VFX.”

Let’s find out more about Deguara’s process and the shots in the show as he walks us through his collaboration and creating some very distinctive characters.

Can you talk about how early you got involved on Good Omens?
We were involved right at the beginning, pre-script. It’s always the best scenario for VFX to be involved at the start, to maximize planning time. We spent time with director Douglas Mackinnon, breaking down all six scripts to plan the VFX methodology — working out and refining how to best use VFX to support the storytelling. In fact, we stuck to most of what we envisioned and we continued to work closely with him throughout the project.

How did getting involved when you did help the process?
With the sheer volume and variety of work — 650 shots, a five-month post production turnaround and a crew of 60 — the planning and development time in preproduction was essential. The incredibly wide range of work spanned multiple creatures, environments and effects work.

Having constant access to Neil as author and showrunner was brilliant as we could ask for clarification and more details from him directly when creating the VFX and receive immediate creative feedback. And it was invaluable to have Douglas working with us to translate Neil’s vision in words onto the screen and plan out what was workable. It also meant I was able to show them concepts the team were developing back in the studio while we were on set in South Africa. It was a very collaborative process.

It was important to have strong crew across all VFX disciplines as they worked together on multiple sequences at the same time. So you’re starting in tracking on one, in effects on another and compositing and finishing everything off on another. It was a big logistical challenge, but certainly the kind that we relish and are well versed in at Milk.

Did you do previs? If so, how did that help and what did you use?
We only used previs to work out how to technically achieve certain shots or to sell an idea to Douglas and Neil. It was generally very simple, using grayscale animation with basic geometry. We used it to do a quick layout of how to rescale the dog to be a bigger hellhound, for example.

You were on set supervising… can you talk about how that helped?
It was a fast-moving production with multiple locations in the UK over about six months, followed by three months in South Africa. It was crucial for the volume and variety of VFX work required on Good Omens that I was across all the planning and execution of filming for our shots.

Being on set allowed me to help solve various problems as we went along. I could also show Neil and Douglas various concepts that were being developed back in the studio, so that we could move forward more quickly with creative development of the key sequences, particularly the challenging ones such as Satan and the Bentley.

What were the crucial things to ensure during the shoot?
Making sure all the preparation was done meticulously for each shot — given the large volume and variety of the environments and sets. I worked very closely with Douglas on the shoot so we could have discussions to problem-solve where needed and find creative solutions.

Can you point to an example?
We had multiple options for shots involving the Bentley, so our advance planning and discussions with Douglas involved pulling out all the car sequences in the series scripts and creating a “mini script” specifically for the Bentley. This enabled us to plan which assets (the real car, the art department’s interior car shell or the CG car) were required and when.

You provided 650 VFX shots. Can you describe the types of effects?
We created everything from creatures (Satan exploding up out of the ground; a kraken; the hellhound; a demon and a snake) to environments (heaven, a penthouse with views of major world landmarks, and a busy Soho street); feathered wings for Michael Sheen’s angel Aziraphale and David Tennant’s demon Crowley; and a CG Bentley in which Tennant’s Crowley hurtles around London.

We also had a large effects team working on a whole range of effects over the six episodes — from setting the M25 and the Bentley on fire to a flaming sword to a call center filled with maggots to a sequence in which Crowley (Tennant) travels through the internet at high speed.

Despite the fantasy nature of the subject matter, it was important to Gaiman that the CG elements did not stand out too much. We needed to ensure the worlds and characters were always kept grounded in reality. A good example is how we approached heaven and hell. These key locations are essentially based around an office block. Nothing too fantastical, but they are, as you would expect, completely different and deliberately so.

Hell is the basement, which was shot in a disused abattoir in South Africa, whilst heaven is a full CG environment located in the penthouse with a panoramic view over a cityscape featuring landmarks such as the Eiffel Tower, The Shard and the Pyramids.

You created many CG creatures. Can you talk about the challenges of that and how you accomplished them?
Many of the main VFX features, such as Satan (voiced by Benedict Cumberbatch), appear only once in the six-part series as the story moves swiftly toward the apocalypse. So we had to strike a careful balance between delivering impact yet ensuring they were immediately recognizable and grounded in reality. Given our fast five-month post turnaround, we had our key teams working concurrently on creatures such as a kraken; the hellhound; a small, portly demon called Usher who meets his demise in a bath of holy water; and the infamous snake in the Garden of Eden.

We have incorporated Ziva VFX into our pipeline, which ensured our rigging and modeling teams maximized the development and build phases in the timeframe. For example, the muscle, fat and skin simulations are all solved on the renderfarm; the animators can publish a scene and then review the creature effects in dailies the next day.

We use our proprietary software CreatureTools for rigging all our creatures. It is a modular rigging package that allows us to build animation rigs for previs or blocking very quickly, while we build our deformation, muscle and fat rigs in Ziva VFX. It means the animators can start work quickly, and there is a lot of consistency between the rigs.

Can you talk about the kraken?
The kraken pays homage to Ray Harryhausen and his work on Clash of the Titans. Our team worked to create the immense scale of the kraken and take water simulations to the next level. The top half of the kraken body comes up out of the water and we used a complex ocean/water simulation system that was originally developed for our ocean work on the feature film Adrift.

Can you dig in a bit more about Satan?
Near the climax of Good Omens, Aziraphale, Crowley and Adam witness the arrival of Satan. In the early development phase, we were briefed to highlight Satan’s enormous size (about 400 feet) without making him too comical. He needed to have instant impact given that he appears on screen for just this one long sequence and we don’t see him again.

Our first concept was pretty scary, but Neil wanted him simpler and more immediately recognizable. Our concept artist created a horned crown, which along with his large, muscled, red body delivered the look Neil had envisioned.

We built the basic model, and when Cumberbatch was cast, the modeling team introduced some of his facial characteristics into Satan's FACS-based blend shape set. Video reference of the actor's voice performance, captured on a camera phone, helped inform the final keyframe animation. The final Satan was a full Ziva VFX build, complete with skeleton, muscles, fat and skin. The team pointed the muscle scene and fat scene at an Alembic cache of the skeleton, so that they ended up with a blended mesh of Satan with all the muscle detail on it.

We then did another skin pass on the face to add extra wrinkles and loosen things up. A key challenge for our animation team — led by Joe Tarrant — lay in animating a creature of the immense scale of Satan. They needed to ensure the balance and timing of his movements felt absolutely realistic.

Our effects team — led by James Reid — layered multiple effects simulations to shatter the airfield tarmac and generate clouds of smoke and dust, optimizing setups so that only those particles visible on camera were simulated. The challenge was maintaining a focus on the enormous size and impact of Satan while still showing the explosion of the concrete, smoke and rubble as he emerges.

Extrapolating from live-action plates shot at an airbase, the VFX team built a CG environment and inserted live action of the performers into otherwise fully digital shots of the gigantic red-skinned devil bursting out of the ground.

And the hellhound?
Beelzebub (Anna Maxwell Martin) sends the antichrist (a boy named Adam) a giant hellhound. By giving the beast a scary name, Adam will set Armageddon in motion. In reality, Adam just wants a loveable pet, so he transforms the hellhound into a miniature hound called, simply, Dog.

A Great Dane performed as the hellhound, photographed in a forest location while a grip kept pace with a small square of bluescreen. The Milk team tracked the live action and performed a digital head and neck replacement. Sam Lucas modeled the head in Autodesk Maya, matching the real dog’s anatomy before stretching its features into grotesquery. A final round of sculpting followed in Pixologic ZBrush, with artists refining 40-odd blend shapes for facial expression.

Once our rigging team got the first iteration of the blend shapes, they passed the asset off to animation for feedback. They then added an extra level of tweaking around the lips. In the creature effects phase, they used Ziva VFX to add soft body jiggle around the bottom of the lips and jowls.
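The blend-shape setup described above comes down to a standard formula: each sculpted target contributes a weighted delta from the rest pose. A minimal NumPy sketch of that idea (the three-vertex mesh and the shape names here are toy stand-ins, not Milk's actual rig):

```python
import numpy as np

def apply_blend_shapes(base, targets, weights):
    """Deform a base mesh by a weighted sum of blend-shape deltas.

    base:    (N, 3) array of rest-pose vertex positions
    targets: list of (N, 3) sculpted shapes, one per expression
    weights: floats in [0, 1], one per target
    """
    result = base.astype(float).copy()
    for target, w in zip(targets, weights):
        # Add only the difference between the sculpt and the rest pose
        result += w * (np.asarray(target, dtype=float) - base)
    return result

# Two toy "expressions" on a three-vertex mesh
base  = np.zeros((3, 3))
smile = np.array([[0.0,  1.0, 0.0]] * 3)   # raises all vertices in Y
jaw   = np.array([[0.0, -2.0, 0.0]] * 3)   # lowers all vertices in Y

posed = apply_blend_shapes(base, [smile, jaw], [0.5, 0.1])
# Each vertex moves by 0.5*1.0 + 0.1*(-2.0) = 0.3 in Y
```

With 40-odd targets on a production face, the animator effectively drives that weight vector over time, and corrective work like the lip tweaks mentioned above adds further targets on top.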

What about creating the demon Usher?
One of our favorite characters was the small, rotund, quirky demon creature called Usher. He is a fully rigged CG character. Our team took a fully concepted image and adapted it to the performance and physicality of the actor. To get the weight of Usher's rotund body, the rigging team — led by Neil Roche — used Ziva VFX to run a soft body simulation on the fatty parts of the creature, which gave him a realistic jiggle. They then added a skin simulation using Ziva's cloth solver to give an extra layer of wrinkling across Usher's skin. Finally, they used nCloth in Maya to simulate his sash and medals.

Was one more challenging/rewarding than the others?
Satan, because of his huge scale and the integrated effects.

Out of all of the effects, can you talk about your favorite?
The CG Bentley without a doubt! The digital Bentley featured in scenes showing the car tearing around London and the countryside at 90 miles per hour. Ultimately, Crowley drives through hell fire on the M25, it catches fire and burns continuously as he heads toward the site of Armageddon. The production located a real Bentley 3.5 Derby Coupe Thrupp & Maberly 1934, which we photo scanned and modeled in intricate detail. We introduced subtle imperfections to the body panels, ensuring the CG Bentley had the same handcrafted appearance as the real thing and would hold up in full-screen shots, including continuous transitions from the street through a window to the actors in an interior replica car.

In order to get the high speed required, we shot plates on location from multiple cameras, including on a motorbike to achieve the high-speed bursts. Later, production filled the car with smoke and our effects team added CG fire and burning textures to the exterior of our CG car, which intensified as he continued his journey.

You’ve talked about the tight post turnaround? How did you show the client shots for approval?
Given the volume and wide range of work required, we worked on multiple sequences concurrently to maximize the short post window — and aligned our teams when they were working on similar types of shots.

We had constant access to Neil and Douglas throughout the post period, which was crucial for approvals and feedback as we developed key assets and delivered key sequences. Neil and Douglas would visit Milk regularly for reviews toward delivery of the project.

What tools did you use for the VFX?
Amazon (AWS) for cloud rendering, Ziva for creature rigging, Maya, Nuke, Houdini for effects and Arnold for rendering.

What haven’t I asked that is important to touch on?
Our work on Soho, where Michael Sheen's Aziraphale has his bookshop. Production designer Michael Ralph created a set based on Soho's Berwick Street, comprising a two-block street exterior constructed up to the top of the first story, with the complete bookshop — inside and out — standing on the corner.

Four 20-x-20-foot mobile greenscreens helped our environment team complete the upper levels of the buildings and extend the road into the far distance. We photo scanned both the set and the original Berwick Street location, combining the reference to build digital assets capturing the district’s unique flavor for scenes during both day and nighttime.


Before and After: Soho

Mackinnon wanted crowds of people moving around constantly, so on shooting days crowds of extras thronged the main section of street and a steady stream of vehicles turned in from a junction part way down. Areas outside this central zone remained empty, enabling us to drop in digital people and traffic without having to do takeovers from live-action performers and cars. Milk had a 1,000-frame cycle of cars and people that it dropped into every scene. We kept the real cars always pulling in round the corner and devised it so there was always a bit of gridlock going on at the back.

And finally, we relished the opportunity to bring to life Neil Gaiman and Douglas Mackinnon’s awesome apocalyptic vision for Good Omens. It’s not often you get to create VFX in a comedy context. For example, the stuff inside the antichrist’s head: whatever he thinks of becomes reality. However, for a 12-year-old child, this means reality is rather offbeat.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Behind the Title: Ntropic Flame artist Amanda Amalfi

NAME: Amanda Amalfi

COMPANY: Ntropic (@ntropic)

CAN YOU DESCRIBE YOUR COMPANY?
Ntropic is a content creator producing work for commercials, music videos and feature films as well as crafting experiential and interactive VR and AR media. We have offices in San Francisco, Los Angeles, New York City and London. Some of the services we provide include design, VFX, animation, editing, color grading and finishing.

WHAT’S YOUR JOB TITLE?
Senior Flame Artist

WHAT DOES THAT ENTAIL?
Being a senior Flame artist involves a variety of tasks that span the duration of a project, from communicating with directors, agencies and production teams, to helping plan out any visual effects that might be in a project (often also acting as VFX supervisor on set), to the actual post process of the job.

Amanda worked on this lipstick branding video for the makeup brand Morphe.

It involves client and team management (as you are often also the 2D lead on a project) and calls for a thorough working knowledge of the Flame itself, both in timeline management and that little thing called compositing. The compositing could cross multiple disciplines — greenscreen keying, 3D compositing, set extension and beauty cleanup to name a few. And it helps greatly to have a good eye for color and to be extremely detail-oriented.

WHAT MIGHT SURPRISE PEOPLE ABOUT YOUR ROLE?
How much it entails. Since this is usually a position that exists in a commercial house, we don’t have as many specialties as there would be in the film world.

WHAT’S YOUR FAVORITE PART OF THE JOB?
First is the artwork. I like that we get to work intimately with the client in the room to set looks. It’s often a very challenging position to be in — having to create something immediately — but the challenge is something that can be very fun and rewarding. Second, I enjoy being the overarching VFX eye on the project; being involved from the outset and seeing the project through to delivery.

WHAT’S YOUR LEAST FAVORITE?
We’re often meeting tight deadlines, so the hours can be unpredictable. But the best work happens when the project team and clients are all in it together until the last minute.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
The evening. I’ve never been a morning person so I generally like the time right before we leave for the day, when most of the office is wrapping up and it gets a bit quieter.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Probably a tactile art form. Sometimes I have the urge to create something that is tangible, not viewed through an electronic device — a painting or a ceramic vase, something like that.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I loved films that were animated and/or used 3D elements growing up and wanted to know how they were made. So I decided to go to a college that had a computer art program with connections in the industry and was able to get my first job as a Flame assistant in between my junior and senior years of college.

ANA Airlines

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Most recently I worked on a campaign for ANA Airlines. It was a fun, creative challenge on set and in post production. Before that I worked on a very interesting project for Facebook’s F8 conference featuring its AR functionality and helped create a lipstick branding video for the makeup brand Morphe.

IS THERE A PROJECT THAT YOU ARE MOST PROUD OF?
I worked on a spot for Vaseline that was a "through the ages" concept, and we had to create looks that would read as the 1880s, 1900, the 1940s, the 1970s and the present day, in locations that varied from the Arctic to the building of the Brooklyn Bridge to a boxing ring. To start, we sent the digitally shot footage, with our 3D and comps, to a printing house and had it printed and re-digitized. This worked perfectly for the '70s-era look. Then we did additional work to age it further for the other eras — though my favorite was the Arctic turn-of-the-century look.

NAME SOME TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Flame… first and foremost. It really is the most inclusive software — I can grade, track, comp, paint and deliver all in one program. My monitors, a 4K Eizo and a color-calibrated broadcast monitor, are also essential.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Mostly Instagram.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? 
I generally have music on with clients, so I will put on some relaxing music. If I’m not with clients, I listen to podcasts. I love How Did This Get Made and Conan O’Brien Needs a Friend.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Hiking and cooking are two great de-stressors for me. I love being in nature and working out and then going home and making a delicious meal.

NYC’s The-Artery expands to larger space in Chelsea

The-Artery has expanded and moved into a new 7,500-square-foot space in Manhattan’s Chelsea neighborhood. Founded by chief creative officer Vico Sharabani, The-Artery will use this extra space while providing visual effects, post supervision, offline editorial, live action and experience design and development across multiple platforms.

According to Sharabani, the new space is not only a response to the studio’s growth, but allows The-Artery to foster better collaboration and reinforce its relationships with clients and creative partners. “As a creative studio, we recognize how important it is for our artists, producers and clients to be working in a space that is comfortable and supportive of our creative process,” he says. “The extraordinary layout of this new space, the size, the lighting and even our location, allows us to provide our clients with key capabilities and plays an important part in promoting our mission moving forward.”

Recent The-Artery projects include 2018's VR-enabled production for Mercedes-Benz, work on Under Armour's "Rush" campaign and Beyoncé's Coachella documentary, Homecoming.

They have also worked on feature films like Netflix’s Beasts of No Nation, Wes Anderson’s Oscar-winning Grand Budapest Hotel and the crime caper Ocean’s 8.

The-Artery’s new studio features a variety of software including Flame, Houdini, Cinema 4D, 3ds Max, Maya, the Adobe Creative Cloud suite of tools, Avid Media Composer, Shotgun for review and approval and more.

The-Artery features a veteran team of talented artists and creative collaborators, including a recent addition — editor and former Mad River Post owner Michael Elliot. "Whether they are agencies, commercial and film directors or studios, our clients always work directly with our creative directors and artists, collaborating closely throughout a project," says Sharabani.

Main Image: Vico Sharabani (far right) and team in their new space.

Phosphene’s visual effects for Showtime’s Escape at Dannemora

By Randi Altman

The Showtime limited series Escape at Dannemora is based on the true story of two inmates (David Sweat and Richard Matt) who escape from an Upstate New York prison. They were aided by Tilly, a female prison employee whose husband also worked at Clinton Correctional Facility. She helped run the tailor shop where the two men worked and had an intimate relationship with both.

Matt Griffin

As we approach Emmy season, we thought it was a good time to reach out to the studio that provided visual effects for the Ben Stiller-directed miniseries, which was nominated for a Golden Globe for best television limited series or movie. Escape at Dannemora stars Patricia Arquette, Benicio Del Toro and Paul Dano.

New York City-based Phosphene was called on to create a variety of visual effects, including turning five different locations into the Clinton Correctional Facility, the maximum security prison where the escape took place. The series was also nominated for an Emmy for Outstanding Visual Effects in a Supporting Role.

We recently spoke with VFX producer Matt Griffin and VFX supervisor Djuna Wahlrab to find out more.

How early did you guys get involved in the project? Were there already set plans for the types of VFX needed? How much input did Phosphene have?
Matt Griffin: There were key sequences that were discussed with us very early on. The most crucial among them were Sweat’s Run, which was a nine-minute “oner” that opened Episode 5; the gruesome death scene of Broome County Sheriff’s Deputy Kevin Tarsia and an ambitious crane shot that revealed the North Yard in the prison.

Djuna Wahlrab

What were the needs of the filmmakers and how did your studio fill that need?
Were you on set supervising?
Griffin: Ben Stiller and the writers had a very clear vision for these challenging sequences, and therefore had a very realistic understanding of how ambitious the VFX would be. They got us involved right at the start so we could be as collaborative as possible with production in preparing the methodology for execution.

In that same spirit, they had us supervise the majority of the shoot, which positioned us to be involved as the natural shifts and adjustments of production arose day to day. It was amazing to be creative problem solvers with the whole team and not just reacting to what happened once in post.

I know that creating the prison was a big part — taking pieces of a few different prisons to make one?
Djuna Wahlrab: Clinton Correctional is a functioning prison, so we couldn't shoot the whole series within its premises; instead we filmed in five different locations. We shot at a decommissioned prison in Pittsburgh; the prison's tailor shop was staged in an old warehouse in Brooklyn; and the Honor Block (where our characters were housed) and parts of the prison bowels were built on a stage in Queens. The remaining spaces under the prison were shot in an active water treatment plant in Yonkers, New York. Working closely with production designer Mark Ricker, we tackled the continuity across all these locations.

The upper courts overlook the town.

We knew the main guard tower visible from the outside of Clinton Correctional was crucial, so we always planned to carry that through to Pittsburgh. Scenes taking place just inside the prison wall were also shot in Pittsburgh; since that prison is not as long as Clinton, we extended the depth of those shots.

While the surrounding mountainside terrain is on beautiful display from the North Yard, it's also felt from the ground among the buildings within the prison. When looking down the length of the streets, you can see the sloping side of the mountain just over the wall. These scenes were filmed in Pittsburgh, where what you actually see beyond those walls is a bustling, hilly city with water towers, electric lines and highways, so we had to replace those views to match the real location.

Can you talk about the shot that had David Sweat crawling through pipes in the basement of the prison?
Wahlrab: For what we call Sweat’s Run — because we were creating a “oner” out of 17 discrete pieces — preproduction was crucial. The previs went far beyond a compositional guide. Using blueprints from three different locations and plans for the eventual stage set, orthographic views were created with extremely detailed planning for camera rigging and hand-off points. Drawing on this early presentation, Matt Pebler and the camera department custom-built many of the rigs required for our constricted spaces and meticulous overlapping sections.

The previs was a common language for all departments at the start, but as each piece of the run was filmed, the previs was updated with completed runs and the requirements would shift. Shooting one piece of the run would instantly lock in requirements for the other connecting pieces, and we’d have to determine a more precise plan moving forward from that point. It took a high level of collaboration and flexibility from all departments to constantly narrow the margin for what level of precision was required from everyone.

Sweat preparing for escape.

Can you talk about the scene where Sweat runs over the sheriff’s deputy Tarsia?
Wahlrab: Special effects had built a rig for a partial car that would be safe to “run over” a stunt man. A shell of a vehicle was suspended from an arm off a rigged tactical truck, so that they moved in parallel. Sweat’s stunt car floated a few feet off the ground. The shell had a roof, windows, a windshield, a hood and a driver’s seat. Below that the sides, grill and wheels of the car were constructed of a soft foam. The stunt man for Tarsia was rigged with wires so they could control his drag beneath the car.

In this way, we were able to get the broad strokes of the stunt in-camera. Though the car needed to be almost completely replaced with CG, its structure provided a first reference for the environmental re-lighting needed for the scene. The impact moment was a particular challenge because, of course, the foam grill completely gave way to Tarsia's body. We had to simulate the cracking of the bumper and the stamp of the blood from Tarsia's wounds. We also had to reimagine how Tarsia's body would have moved with this rigid impact.

Tarsia’s death: Replaced stunt car, added blood and re-animated the victim.

For Tarsia himself, in addition to augmenting the chosen take, we used alt takes from the shoot for various parts of the body to recreate a Tarsia with more appropriate physical reactions to the trauma we were simulating. There was also a considerable amount of hand painting this animation to help it all mesh together. We added blood on the wheels, smoke and animated pieces of the broken bumper, all of which helped to ground Tarsia in the space.

You also made the characters look younger. Can you talk about what tools you used for this particular effect?
Wahlrab: Our goal was to support this jump in time, but not distract by going too far. Early on, we did tests where we really studied the face of each actor. From this research, we determined targeted areas for augmentation, and the approach really ended up being quite tailored for each character.

We broke down the individual regions of the face. First, we targeted wrinkles with tailored defocusing. Second, we reshaped recessed portions of the face, mostly with selective grading. In some cases, we retextured the skin on top of this work. At the end of all of this, we had to reintegrate this into the grainy 16mm footage.
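The first of those steps, targeted wrinkle softening, amounts to blending a defocused copy of the plate back in only where a matte says to. A toy NumPy sketch of that matte-driven blend (the function names and the crude roll-based blur are illustrative stand-ins; the real work would be done with a compositing package's blur and keymix tools):

```python
import numpy as np

def box_blur(img, radius=1):
    """Crude box blur built from shifted copies (wraps at image edges)."""
    acc = np.zeros_like(img, dtype=float)
    taps = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            taps += 1
    return acc / taps

def selective_defocus(img, matte, radius=1):
    """Mix the blurred copy back in only where the matte is non-zero."""
    blurred = box_blur(img, radius)
    return matte * blurred + (1.0 - matte) * img

img = np.zeros((6, 6))
img[2, 2] = 9.0              # a hard "wrinkle" highlight
matte = np.zeros((6, 6))
matte[:, :3] = 1.0           # soften only the left half of the frame

out = selective_defocus(img, matte, radius=1)
# Inside the matte the spike is averaged over a 3x3 window (9.0 -> 1.0);
# outside the matte the plate passes through untouched.
```

The same matte-then-blend pattern covers the other two steps too: swap the blur for a grade inside the matte, and you have the selective reshaping grade described above.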

Can you talk about all the tools you used?
Griffin: At Phosphene, we use Foundry Nuke Studio and Autodesk 3ds Max. For additional support, we rely on Mocha Pro, 3DEqualizer and PFTrack, among many others.


Added snow, cook fire smoke and inmates to upper tier.

Any other VFX sequences that you can talk about?
Wahlrab: As with any project, weather continuity was a challenge. Our prison was represented by five locations, but it took many more than that to fill out the lives of Tilly and Lyle beyond their workplace. Because we shot a few scenes early on with snow, we were locked into that reality in every single location going forward. The special FX team would give us practical snow in the areas with the most interaction, and we were charged with filling out much of the middle and background. For the most part, we relied on photography, building custom digital matte paintings for each shot. We spent a lot of time upstate in the winter, so I found myself pulling off the road in random places in search of different kinds of snow coverage. It became an obsession, figuring out the best way to shoot the same patch of snow from enough angles to cover my needs for different shots, at different times of day, not entirely knowing where we’d need to use it.

What was the most challenging shots?
Wahlrab: Probably the most challenging location to shoot was the North Yard within the prison. Clinton Correctional is a real prison in Dannemora, New York. It's about 20 miles south of the Canadian border, set into the side of this hill in what is really a beautiful part of the country. This was the inmates' outdoor space, divided into terraces overlooking the whole town of Dannemora and the valley beyond. Though the production value of shooting in an active prison was amazing, it also presented quite a few logistical challenges. For safety (ours as well as the prisoners'), the gear allowed in was quite restricted. Many of the tools I rely on had to be left behind. Then, load-in required a military-grade inspection by the COs, who examined every piece of our equipment before it could enter or exit. The crew was afforded no special privileges for entering the prison, and we were shuffled through the standard intake. It was time-consuming, and very much limited how long we'd be able to shoot that day once inside.


Before and After: Cooking fires in the upper courts.

Production did the math and balanced the crew and cast load-in with the coverage required. We had 150 background extras for the yard, but in reality, the average number of inmates, even on the coldest of days, was 300. Also, we needed the yard to have snow on the ground for continuity. Unfortunately it was an unseasonably warm day, and after the first few hours, the special effects snow that was painstakingly created and placed during the night was completely melted. Special effects was also charged with creating cook fire for the stoves in each court, but they could only bring in so much fuel. Our challenge was clear — fill out the background inmate population, add snow and cook fire smoke… everywhere.

The biggest challenge in this location was the shot Ben conceived to reveal the enormity of the North Yard. It was this massive crane shot that begins at the lowest part of the yard and pans to the upper courts, then slowly pulls out and cranes up to reveal the entire outdoor space. It's really a beautiful way to introduce us to the North Yard, revealing one terraced level at a time until you have the whole space in view. It's one of my favorite moments in the show.

Some shots outside the prison involved set extensions.

There’s this subtext about the North Yard and its influence on Sweat and Matt. Out in the yard, the inmates have a bit more autonomy. With good behavior, they have some ownership over the courts and are given the opportunity to curate these spaces. Some garden, many cook meals, and our characters draw and paint. For those lucky enough to be in the upper courts, they have this beautiful view beyond the walls of the prison, and you can almost forget you are locked up.

I think we’re meant to wonder, was it this autonomy or this daily reminder of the outside world beyond the prison walls that fueled their intense devotion to the escape? This location is a huge story piece, and I don’t think it would have been possible to truly render the scale of it all without the support of visual effects.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Sydney’s Fin creates CG robot for Netflix film I Am Mother

Fin Design + Effects, an Australian-based post production house with studios in Melbourne and Sydney, brings its VFX and visual storytelling expertise to the upcoming Netflix film I Am Mother. Directed by Grant Sputore, the post-apocalyptic film stars Hilary Swank, Rose Byrne and Clara Rugaard.

In I Am Mother, a teenage girl (Rugaard) is raised underground by the robot “Mother” (voiced by Byrne), designed to repopulate the earth following an extinction event. But their unique bond is threatened when an inexplicable stranger (Swank) arrives with alarming news.

Working closely with the director, Fin Design’s Sydney office built a CG version of the AI robot Mother to be used interchangeably with the practical robot suit built by New Zealand’s Weta Workshop. Fin was involved from the early stages of the process to help develop the look of Mother, completing extensive design work and testing, which then fed back into the practical suit.

In total, Fin produced over 220 VFX shots, including the creation of a menacing droid army as well as general enhancements to the environments and bunker where this post-apocalyptic story takes place.

According to Fin Australia’s managing director, Chris Spry, “Grant was keen on creating an homage of sorts to old-school science-fiction films and embracing practical filmmaking techniques, so we worked with him to formulate the best approach that would still achieve the wow factor — seamlessly combining CG and practical effects. We created an exact CG copy of the suit, visualizing high-action moments such as running, or big stunt scenes that the suit couldn’t perform in real life, which ultimately accounted for around 80 shots.”

Director Sputore on working with Fin: "They offer suggestions and bust expectations. In particular, they delivered visual effects magic with our CG Mother, one minute having her thunder down bunker corridors and in the next moment speed-folding intricate origami creations. For the most part, the robot at the center of our film was achieved practically. But in those handful of moments where a practical solution wasn't possible, it was paramount that the audience not be bumped from the film by a sudden transition to a VFX version of one of our central characters. In the end, even I can't tell which shots of Mother are CG and which are practical, and, crucially, neither can the audience."

To create the CG replica, the Fin team paid meticulous attention to detail, ensuring the material, shaders and textures perfectly matched photographs and laser scans of the practical suit. The real challenge, however, was in interpreting the nuances of the movements.

“Precision was key,” explains VFX supervisor Jonathan Dearing. “There are many shots cutting rapidly between the real suit and CG suit, so any inconsistencies would be under a spotlight. It wasn’t just about creating a perfect CG replica but also interpreting the limitations of the suit. CG can actually depict a more seamless movement, but to make it truly identical, we needed to mimic the body language and nuances of the actor in the suit [Luke Hawker]. We did a character study of Luke and rigged it to build a CG version of the suit that could mimic him precisely.”

Fin finessed its robust automation pipeline for this project. Built to ensure greater efficiency, the system allows animators to push their work through lighting and comp at the click of a button. For example, if a shot didn’t have a specific light rig made for it, animators could automatically apply a generic light rig that suits the whole film. This tightly controlled system meant that Fin could have one lighter and one animator working on 200 shots without compromising on quality.

The studio used Autodesk Maya, Side Effects Houdini, Foundry Nuke and Redshift on this project.

I Am Mother premiered at the 2019 Sundance Film Festival and is set to stream on Netflix on June 7.

Wacom updates its Intuos Pro Small tablet

Wacom has introduced a new Small model of its Intuos Pro pen and touch tablet to its advanced line of creative pen tablets. The new Intuos Pro Small joins the Intuos Pro Medium and Intuos Pro Large sizes already available.

Featuring Wacom’s precise Pro Pen 2 technology with 8,192 levels of pen pressure, tilt sensitivity, natural pen-on-paper feel and battery-free performance, artists can now choose the size — small, medium or large — that best fits their way of working.

The new small size features the same tablet working area as the previous model of the Intuos Pro Small and targets on-the-go creatives whose Wacom tablet and PC or Mac laptop are always nearby. The space-saving tablet’s small footprint, wireless connectivity and battery-free pen technology that never needs charging make working anywhere easy.

Built for pros, all three sizes of Intuos Pro tablets feature a TouchRing and ExpressKeys (six on the Small, eight on the Medium and Large) for creating customized shortcuts to speed up the creative workflow. In addition, incorporating both pen and touch on the tablet lets users explore new ways to navigate and makes the whole creative experience more interactive. The slim tablets also feature a durable anodized aluminum back case and come with a desktop pen stand containing 10 replacement pen nibs.

The Wacom Pro Pen 2 features Wacom’s most advanced creative pen technology to date, with four times the pressure sensitivity of the original Pro Pen. Its 8,192 levels of pressure, tilt recognition and lag-free tracking effectively emulate working with traditional media, offering a natural drawing experience. Additionally, the pen’s customizable side switch allows easy access to commonly used shortcuts, greatly speeding production.

Wacom offers two helpful accessory pens (purchased separately). The Pro Pen 3D features a third button, which can be set to perform typical 3D tasks such as tumbling objects in commonly used 3D creative apps. The newly released Pro Pen Slim supports some artists’ ergonomic preference for a slimmer pen with a more pencil-like feel. Both are compatible with the Intuos Pro family and can help customize and speed the creative experience.

Intuos Pro is Bluetooth-enabled and compatible with Macs and PCs. All three sizes come with the Wacom Pro Pen 2, pen stand and feature ExpressKeys, TouchRing and multi-touch gesture control. The Intuos Pro Small ($249.95), Intuos Pro Medium ($379.95) and Intuos Pro Large ($499.95) are available now.

2 Chainz’s 2 Dolla Bill gets VFX from Timber

Santa Monica’s Timber, known for its VMA-winning work on Kendrick Lamar’s “Humble,” provided visual effects and post production for “2 Dolla Bill,” the latest music video from 2 Chainz, featuring E-40 and Lil Wayne.

The video begins with a group of people in a living room with the artist singing, “I’m rare” while holding a steak. It transitions to a poker game where the song continues with “I’m rare, like a two dollar bill.” We then see a two dollar bill with Thomas Jefferson singing the phrase as well. The video takes us back to the living room, the poker game, an operating room, a kitchen and other random locations.

Artists at collaborating company Kevin provided 2D visual effects for the music video, including the scene with the third eye.

According to Timber creative director/partner Kevin Lau, “The main challenge for this project was the schedule. It was a quick turnaround initially, so it was great to be able to work in tandem with offline to get ahead of the schedule. This also allowed us to work closely with the director and implement some of his requests to enhance the video after it was shot.”

Timber got involved early on in the project and was on set while they shot the piece. The studio called on Autodesk Flame for clean-up, compositing and enhancement work, as well as the animation of the talking money.

Lau was happy Timber got the chance to be on set. “It was very useful to have a VFX supervisor on set for this project, given the schedule and scope of work. We were able to flag any concerns/issues right away so they didn’t become bigger problems in post.”

Arcade Edit’s Geoff Hounsell edited the piece. Daniel de Vue from A52 provided the color grade.


Marvel Studios’ Victoria Alonso to keynote SIGGRAPH 2019

Marvel Studios executive VP of production Victoria Alonso has been named keynote speaker for SIGGRAPH 2019, which will run from July 28 through August 1 in downtown Los Angeles. Registration is now open. The annual SIGGRAPH conference is a melting pot for researchers, artists and technologists, among other professionals.

“Victoria is the ultimate symbol of where the computer graphics industry is headed and a true visionary for inclusivity,” says SIGGRAPH 2019 conference chair Mikki Rose. “Her outlook reflects the future I envision for computer graphics and for SIGGRAPH. I am thrilled to have her keynote this summer’s conference and cannot wait to hear more of her story.”

One of the few women in Hollywood to hold such a prominent title, Alonso has long been admired for her dedication to the industry, which has earned her multiple awards and honors, including the 2015 New York Women in Film & Television Muse Award for Outstanding Vision and Achievement, the Advanced Imaging Society’s Harold Lloyd Award (she was its first female recipient) and the 2017 VES Visionary Award (another female first). A native of Buenos Aires, she began her career in visual effects, including a four-year stint at Digital Domain.

Alonso’s film credits include productions such as Ridley Scott’s Kingdom of Heaven, Tim Burton’s Big Fish, Andrew Adamson’s Shrek, and numerous Marvel titles — Iron Man, Iron Man 2, Thor, Captain America: The First Avenger, Iron Man 3, Captain America: The Winter Soldier, Captain America: Civil War, Thor: The Dark World, Avengers: Age of Ultron, Ant-Man, Guardians of the Galaxy, Doctor Strange, Guardians of the Galaxy Vol. 2, Spider-Man: Homecoming, Thor: Ragnarok, Black Panther, Avengers: Infinity War, Ant-Man and the Wasp and, most recently, Captain Marvel.

“I’ve been attending SIGGRAPH since before there was a line at the ladies’ room,” says Alonso. “I’m very much looking forward to having a candid conversation about the state of visual effects, diversity and representation in our industry.”

She adds, “At Marvel Studios, we have always tried to push boundaries with both our storytelling and our visual effects. Bringing our work to SIGGRAPH each year offers us the opportunity to help shape the future of filmmaking.”

The 2019 keynote session will be presented as a fireside chat, allowing attendees the opportunity to hear Alonso discuss her life and career in an intimate setting.

Review: Maxon Cinema 4D Release 20

By Brady Betzel

Last August, Maxon released Cinema 4D Release 20. From the new node-based Material Editor to the all-new console used to debug and develop scripts, Maxon has really upped the ante.

At the recent NAB show, Maxon announced that it had acquired Redshift Rendering Technologies, maker of the Redshift rendering engine. This acquisition will hopefully bring an industry-standard GPU-based rendering engine into Cinema 4D R20’s workflow and speed up rendering. For now, the same licensing fees apply to Redshift as before the acquisition: a node-locked license is $500 and a floating license is $600.

Digging In
The first update to Cinema 4D R20 that I want to touch on is the new node-based Material Editor. If you are familiar with node-based applications like Blackmagic’s DaVinci Resolve or Foundry’s Nuke, then you have seen how nodes work. I love nodes because they let you layer up effects and drive one attribute with another; in Cinema 4D R20’s case, anything from diffusion to camera distance. There are over 150 nodes inside the Material Editor to build textures with.

One small change I noticed inside the updated Material Editor is the new gradient settings. When working with gradient knots, you can now select multiple knots at once, then right-click to double the selected knots, invert them, choose different knot interpolations (including stepped, smooth, cubic, linear and blend) and even distribute the knots to clean up your pattern. A really nice and convenient update to gradient workflows.

In Cinema 4D R20, not only can you add new nodes from the search menu, but you can also click the node dots in the Basic properties window and route nodes through there. When you are happy with your materials made in the node editor, you can save them as assets in the scene file or even compress them in a .zip file to share with others.

In a related update category, Cinema 4D Release 20 has introduced the Uber Material. In simple terms (and I mean real simple), the Uber Material is a node-based material that differs from standard or physical materials because it can be edited inside the Attribute Manager or Material Editor while retaining the properties available in the Node Editor.

The Camera Tracking and 2D Camera View modes have also been updated. While Camera Tracking has been improved, the new 2D Camera View mode combines the old Film Move and Film Zoom modes, adding the ability to use standard shortcuts to move around a scene instead of messing with the Film Offset or Focal Length in the Camera Object properties. For someone like me who isn’t a certified pro in Cinema 4D, these little shortcuts really make me feel at home, much more like the apps I’m used to, such as Mocha Pro or After Effects. Maxon has also improved the 2D tracking algorithm for much tighter tracks and added virtual keyframes, which are an extreme help when you don’t have time for minute adjustments.

Volume Modeling
What seems to be one of the largest updates in Cinema 4D R20 is the addition of Volume Modeling with the OpenVDB-based Volume Builder. According to www.openvdb.org, “OpenVDB is an Academy Award-winning C++ library comprising a hierarchical data structure and a suite of tools for the efficient manipulation of sparse, time-varying, volumetric data discretized on three-dimensional grids,” developed by Ken Museth at DreamWorks Animation. It uses 3D pixels called voxels instead of polygons. When using the Volume Builder you can combine multiple polygon and primitive objects using Boolean operations: Union, Subtract or Intersect. Furthermore you can smooth your volume using multiple techniques, including one that made me do some extra Google work: Laplacian Flow.
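To make the voxel idea concrete, here is a tiny conceptual sketch in plain NumPy. This is not Cinema 4D or OpenVDB code, just an illustration of what voxel-based Booleans amount to: a voxel grid is a 3D array of occupancy values, and Union, Subtract and Intersect map onto element-wise logic over those arrays.

```python
import numpy as np

def sphere_voxels(center, grid=32, radius=10):
    """Rasterize a sphere into a boolean voxel occupancy grid."""
    z, y, x = np.indices((grid, grid, grid))
    dist = np.sqrt((x - center[0]) ** 2 + (y - center[1]) ** 2 + (z - center[2]) ** 2)
    return dist <= radius

# Two overlapping spheres, like two primitives fed into the Volume Builder.
a = sphere_voxels((12, 16, 16))
b = sphere_voxels((20, 16, 16))

union = a | b        # Union: voxels inside either object
subtract = a & ~b    # Subtract: voxels in A but not in B
intersect = a & b    # Intersect: voxels inside both objects
```

Real OpenVDB grids are sparse and hierarchical rather than dense arrays, which is what makes them efficient at production resolutions, but the Boolean logic is the same idea.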

Fields
When going down the voxel rabbit hole in Cinema 4D R20, you will run into another new update: Fields. Prior to R20, you would use Effectors to affect an object’s strength values, stacking and animating multiple Effectors to achieve different results. In Cinema 4D R20, under the Falloff tab you will now see a Fields list along with the types of Field objects to choose from.

Imagine a MoGraph object whose opacity you want controlled by a box object moving through it, while a capsule poking through physically modifies it as well. You can combine these different Field objects using compositing functions in the Fields list. In addition, you can animate or alter these new Fields straight away in the Objects window.
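As a rough mental model (plain NumPy, not the actual Cinema 4D API, and with simplified 1D stand-ins for the field shapes): each Field contributes a 0–1 strength per sample point, and the Fields list composites those strengths much like blend modes in a layer stack.

```python
import numpy as np

points = np.linspace(0.0, 10.0, 50)  # sample positions along one axis

# A "box" field: full strength between x=2 and x=5, zero elsewhere.
box_field = ((points >= 2.0) & (points <= 5.0)).astype(float)

# A "spherical" falloff centered at x=6, fading linearly over radius 3.
sphere_field = np.clip(1.0 - np.abs(points - 6.0) / 3.0, 0.0, 1.0)

# Compositing the two fields, analogous to blending modes in the Fields list.
combined_max = np.maximum(box_field, sphere_field)  # Max: strongest field wins
combined_mul = box_field * sphere_field             # Multiply: both must be strong
```

Swapping the compositing function changes how the box and capsule in the example above would fight over, or reinforce, the MoGraph object's opacity.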

Summing Up
Cinema 4D Release 20 has some amazing updates that will greatly improve the efficiency and quality of your work. From tracking updates to Fields, there are plenty of exciting tools to dive into. And if you are reading this as an After Effects user who isn’t sure about Cinema 4D, now is the time to dive in. Once you learn the basics, whether from YouTube tutorials or classes at www.cineversity.com, you will immediately see an increase in the quality of your work.

Combining Adobe After Effects, Element 3D and Cinema 4D R20 is the ultimate in 3D motion graphics and 2D compositing, accessible to almost everyone. And I didn’t even touch on the dozens of other updates in Cinema 4D R20, like the multitude of ProRender updates, FBX import/export options, new node materials and CAD import support for Catia, IGES, JT, SolidWorks and STEP formats. Check out Cinema 4D Release 20’s newest features on YouTube and on Maxon’s website.

And, finally, I think it’s safe to assume that Maxon’s acquisition of the Redshift renderer promises a bright future for Cinema 4D users.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

NAB 2019: Maxon acquires Redshift Rendering Technologies

Maxon, maker of Cinema 4D, has purchased Redshift Rendering Technologies, developer of the Redshift rendering engine. Redshift is a flexible GPU-accelerated renderer targeting high-end production, offering an extensive suite of features that makes rendering complicated 3D projects faster. Redshift is available as a plugin for Maxon’s Cinema 4D and other industry-standard 3D applications.

“Rendering can be the most time-consuming and demanding aspect of 3D content creation,” said David McGavran, CEO of Maxon. “Redshift’s speed and efficiency combined with Cinema 4D’s responsive workflow make it a perfect match for our portfolio.”

“We’ve always admired Maxon and the Cinema 4D community, and are thrilled to be a part of it,” said Nicolas Burtnyk, co-founder/CEO, Redshift. “We are looking forward to working closely with Maxon, collaborating on seamless integration of Redshift into Cinema 4D and continuing to push the boundaries of what’s possible with production-ready GPU rendering.”

Redshift is used by post companies, including Technicolor, Digital Domain, Encore Hollywood and Blizzard. Redshift has been used for VFX and motion graphics on projects such as Black Panther, Aquaman, Captain Marvel, Rampage, American Gods, Gotham, The Expanse and more.

Adobe’s new Content-Aware fill in AE is magic, plus other CC updates

By Brady Betzel

NAB is just under a week away, and we are here to share some of Adobe’s latest Creative Cloud offerings. There are a few updates worth mentioning, such as a Freeform Project panel in Premiere Pro, AI-driven Auto Ducking for ambience in Audition and the addition of a Twitch extension for Character Animator. But, in my opinion, the After Effects updates are what this year’s release will be remembered for.


Content Aware: Here is the before and after. Our main image is the mask.

There is a new expression editor in After Effects, so those of us who are old pseudo-website designers can now feel at home with highlighting, line numbers and more. There are also performance improvements, such as faster project loading times and new deBayering support for Metal on macOS. But the first-prize ribbon goes to Content-Aware Fill for video, powered by Adobe Sensei, the company’s AI technology. It’s one of those voodoo features that will blow you away when you use it. If you have ever used Mocha Pro by Boris FX, then you have seen a similar tool, known as Object Removal. Essentially, you draw around the object you want to remove, such as a camera shadow or boom mic, hit the magic button, and the object is removed with a new background in its place. This will save users hours of manual work.

Freeform Project panel in Premiere.

Here are some details on other new features:

● Freeform Project panel in Premiere Pro — Arrange assets visually and save layouts for shot selects, production tasks, brainstorming story ideas and assembly edits.
● Rulers and Guides — Work with familiar Adobe design tools inside Premiere Pro, making it easier to align titling, animate effects and ensure consistency across deliverables.
● Punch and Roll in Audition — The new feature provides efficient production workflows in both Waveform and Multitrack for longform recording, including voiceover and audiobook creation.
● Twitch live-streaming triggers with the Character Animator extension — Surprise viewers during livestream performances as audiences engage with characters in realtime through on-the-fly costume changes, impromptu dance moves and signature gestures and poses — a new way to interact and even monetize using Bits to trigger actions.
● Auto Ducking for ambient sound in Audition and Premiere Pro — Also powered by Adobe Sensei, Auto Ducking now allows dynamic adjustments of ambient sound against spoken dialog. Keyframed adjustments can be manually fine-tuned to retain creative control over a mix.
● Adobe Stock — Now offers 10 million professional-quality, curated, royalty-free HD and 4K video clips and Motion Graphics templates from leading agencies and independent editors, for editorial content, establishing shots or filling gaps in a project.
● Premiere Rush — Introduced late last year, Rush offers a mobile-to-desktop workflow integrated with Premiere Pro for on-the-go editing and video assembly. Built-in camera functionality helps you shoot pro-quality video on your mobile devices.

The new features for Adobe Creative Cloud are now available with the latest version of Creative Cloud.

Behind the Title: Nice Shoes animator Yandong Dino Qiu

This artist/designer has taken to sketching people on the subway to keep his skills fresh and mind relaxed.

NAME: Yandong Dino Qiu

COMPANY: New York’s Nice Shoes

CAN YOU DESCRIBE YOUR COMPANY?
Nice Shoes is a full-service creative studio. We offer design, animation, VFX, editing, color grading and VR/AR, working with agencies, brands and filmmakers to help realize their creative vision.

WHAT’S YOUR JOB TITLE?
Designer/Animator

WHAT DOES THAT ENTAIL?
Helping our clients to explore different looks in the pre-production stage, while aiding them in getting as close as possible to the final look of the spot. There’s a lot of exploration and trial and error as we try to deliver beautiful still frames that inform the look of the moving piece.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Not so much for the title, but for myself, design and animation can be quite broad. People may assume you’re only 2D, but it also involves a lot of other skill sets such as 3D lighting and rendering. It’s pretty close to a generalist role that requires you to know nearly every software as well as to turn things around very quickly.

WHAT TOOLS DO YOU USE?
Photoshop, After Effects, Illustrator, InDesign — the full Adobe Creative Suite — and Maxon Cinema 4D.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Pitch and exploration. At that stage, all possibilities are open. The job is alive… like a baby. You’re seeing it form and helping to make new life. Before this, you have no idea what it’s going to look like. After this phase, everyone has an idea. It’s very challenging, exciting and rewarding.

WHAT’S YOUR LEAST FAVORITE?
Revisions. Especially toward the end of a project. Everything is set up. One little change will affect everything else.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
2:15pm. It’s right after lunch. You know you have the whole afternoon. The sun is bright. The mood is light. It’s not too late for anything.

Sketching on the subway.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be a Manga artist.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
La Mer. Frontline. Friskies. I’ve also been drawing during my commute every day, sketching the people I see on the subway. I try to post every week on Instagram. I think it’s important for artists to keep to a routine. I started this at the beginning of 2019, and there have been about 50 drawings already. Artists need to keep their pen sharp all the time. By doing these sketches, I’m not only benefiting my drawing skills, but I’m improving my observation of shapes and compositions, which is extremely valuable for work. Being able to break down shapes and components is a key principle of design, and honing that skill helps me in responding to client briefs.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
TED-Ed What Is Time? We had a lot of freedom in figuring out how to animate Einstein’s theories in a fun and engaging way. I worked with our creative director Harry Dorrington to establish the look and then with our CG team to ensure that the feel we established in the style frames was implemented throughout the piece.

TED-Ed What Is Time?

The film was extremely well received. There was a lot of excitement at Nice Shoes when it premiered, and TED-Ed’s audience seemed to respond really warmly as well. It’s rare to see so much positivity in the YouTube comments.

NAME SOME TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My Wacom tablet for drawing and my iPad for reading.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I take time and draw for myself. I love that drawing and creating is such a huge part of my job, but it can get stressful and tiring only creating for others. I’m proud of that work, but when I can draw something that makes me personally happy, any stress or exhaustion from the work day just melts away.

VFX supervisor Christoph Schröer joins NYC’s Artjail

New York City-based VFX house Artjail has added Christoph Schröer as VFX supervisor. Previously a VFX supervisor/senior compositor at The Mill, Schröer brings over a decade of experience to his new role at Artjail. His work has been featured in spots for Mercedes-Benz, Visa, Volkswagen, Samsung, BMW, Hennessy and Cartier.

Combining his computer technology expertise and a passion for graffiti design, Schröer applied his degree in Computer and Media Sciences to begin his career in VFX. He started off working at visual effects studios in Germany and Switzerland where he collaborated with a variety of European auto clients. His credits from his tenure in the European market include lead compositor for multiple Mercedes-Benz spots, two global Volkswagen campaign launches and BMW’s “Rev Up Your Family.”

In 2016, Schröer made the move to New York to take on a role as senior compositor and VFX supervisor at The Mill. There, he teamed with directors such as Tarsem Singh and Derek Cianfrance, and worked on campaigns for Hennessy, Nissan Altima, Samsung, Cartier and Visa.

Autodesk Arnold 5.3 with Arnold GPU in public beta

Autodesk has made Arnold 5.3 with Arnold GPU available as a public beta. The release provides artists with GPU rendering for a subset of Arnold’s features, plus the flexibility to choose between rendering on the CPU or GPU without changing renderers.

From look development to lighting, support for GPU acceleration brings greater interactivity and speed to artist workflows, helping reduce iteration and review cycles. Arnold 5.3 also adds new functionality to help maximize performance and give artists more control over their rendering processes, including updates to adaptive sampling, a new version of the Randomwalk SSS mode and improved Operator UX.

Arnold GPU rendering makes it easier for artists and small studios to iterate quickly in a fast working environment and scale rendering capacity to accommodate project demands. From within the standard Arnold interface, users can switch between rendering on the CPU and GPU with a single click. Arnold GPU currently supports features such as arbitrary shading networks, SSS, hair, atmospherics, instancing, and procedurals. Arnold GPU is based on the Nvidia OptiX framework and is optimized to leverage Nvidia RTX technology.
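For context, the same device switch is also exposed outside the interface, for example from Arnold's kick command-line renderer. A sketch of how that might look (the `options.render_device` parameter name reflects the Arnold 5.3 beta documentation; treat the exact flag as an assumption if your build differs):

```
# Render a scene on the GPU; set the same option to CPU to switch back.
kick -set options.render_device GPU scene.ass
```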

New feature summary:
— Major improvements to quality and performance for adaptive sampling, helping to reduce render times without jeopardizing final image quality
— Improved version of Randomwalk SSS mode for more realistic shading
— Enhanced usability for Standard Surface, giving users more control
— Improvements to the Operator framework
— Better sampling of Skydome lights, reducing direct illumination noise
— Updates to support for MaterialX, allowing users to save a shading network as a MaterialX look

Arnold 5.3 with Arnold GPU in public beta will be available March 20 as a standalone subscription or with a collection of end-to-end creative tools within the Autodesk Media & Entertainment Collection. You can also try Arnold GPU with a free 30-day trial of Arnold. Arnold GPU is available in all supported plug-ins for Autodesk Maya, Autodesk 3ds Max, Houdini, Cinema 4D and Katana.

Sandbox VR partners with Vicon on Amber Sky 2088 experience

VR gaming company Sandbox VR has partnered with Vicon, using its motion capture tools to create next-generation immersive experiences. With Vicon’s motion capture cameras and its location-based VR (LBVR) software Evoke, the Hong Kong-based Sandbox VR is working to transport up to six people at a time into the Amber Sky 2088 experience, which takes place in a future where the fate of humanity hangs in the balance.

Sandbox VR’s adventures resemble movies where the players become the characters. With two proprietary AAA-quality games already in operation across Sandbox VR’s seven locations, for its third title, Amber Sky 2088, a new motion capture solution was needed. In the futuristic game, users step into the role of androids, granting players abilities far beyond the average human while still scaling the game to their actual movements. To accurately convey that for multiple users in a free-roam environment, precision tracking and flexible scalability were vital. For that, Sandbox VR turned to Vicon.

Set in the twilight of the 21st century, Amber Sky 2088 takes players to a futuristic version of Hong Kong, then through the clouds to the edge of space to fight off an alien invasion. Android abilities allow players to react with incredible strength and move at speeds fast enough to dodge bullets. And while the in-game action is furious, participants in the real world — equipped with VR headsets — freely roam an open environment as Vicon LBVR motion capture cameras track their movement.

Vicon’s motion capture cameras record every player movement, then send the data to its Evoke software, a solution introduced last year as part of its LBVR platform, Origin. Vicon’s solution offers precise tracking while also animating player motion in realtime, creating a seamless in-game experience. Automatic recalibration also makes the experience easier than ever to operate despite its complex nature, and the system’s scalability means fewer cameras can capture more movement, making it cost-effective for large-scale expansion.

Since its founding in 2016, Sandbox VR has been creating interactive experiences by combining motion capture technology with virtual reality. After opening its first location in Hong Kong in 2017, the company has since expanded to seven locations across Asia and North America, with six new sites on the way. Each 30- to 60-minute experience is created in-house by Sandbox VR, and each can accommodate up to six players at a time.

The recent partnership with Vicon is the first step in Sandbox VR’s expansion plans that will see it open over 40 experience rooms across 12 new locations around the world by the end of the year. In considering its plans to build and operate new locations, the VR makers chose to start with five systems from Vicon, in part because of the company’s collaborative nature.

Behind the Title: Gentleman Scholar MD/EP Jo Arghiris

LA-based Jo Arghiris embraces the creativity of the job and enjoys “pulling treatments together with our directors. It’s always such a fun, collaborative process.” Find out more…

Name: Jo Arghiris

Company: Gentleman Scholar (@gentscholar)

Can You Describe Your Company?
Gentleman Scholar is a creative production studio, drawn together by a love of design and an eagerness to push boundaries.  Since launching in Los Angeles in 2010, and expanding to New York in 2016, we have evolved within the disciplines of live-action production, digital exploration, print and VR. At our very core, we are a band of passionate artists and fearless makers.

The biggest thing that struck me when I joined Scholar was everyone’s willingness to roll up their sleeves and give it a go. There are so many creative people working across both our studios, it’s quite amazing what we can achieve when we put our collective minds to it. In fact, it’s really hard to put us in a category or to define what we do on a day-to-day basis. But if I had to sum it up in just one word, our company feels like “home”; there’s no place quite like it.

What’s Your Job Title?
Managing Director/EP Los Angeles

What Does That Entail?
Truth be told, it’s evolving all the time. In its purest form, my job entails having top-line involvement on everything going on in the LA studio, both from operational and new business POVs. I face inwards and outwards. I mentor and I project. I lead and I follow. But the main thing I want to mention is that I couldn’t do my job without all these incredible people by my side. It really does take a village, every single day.

What Would Surprise People the Most About What Falls Under That Title?
Not so much “surprising” but certainly different from other roles, is that my job is never done (or at least it shouldn’t be). I never go home with all my to-do’s ticked off. The deck is constantly shuffled and re-dealt. This fluidity can be off-putting to some people who like to have a clear idea of what they need to achieve on any given day. But I really like to work that way, as it keeps my mind nimble and fresh.

What’s Your Favorite Part of the Job?
Learning new things and expanding my mind. I like to see our teams push themselves in this way, too. It’s incredibly satisfying watching folks overcome challenges and grow into their roles. Also, I obviously love winning work, especially if it’s an intense pitch process. I’m a creative person and I really enjoy pulling treatments together with our directors. It’s always such a fun, collaborative process.

What’s Your Least Favorite?
Well, I guess the 24/7 availability thing that we’ve all become accustomed to and are all guilty of. It’s so, so important for us to have boundaries. If I’m emailing the team late at night or on the weekend, I will write in the subject line, “For the Morning” or “For Monday.” I sometimes need to get stuff set up in advance, but I absolutely do not expect a response at 10pm on a Sunday night. To do your best work, it’s essential that you have a healthy work/life balance.

What is Your Favorite Time of the Day?
As clichéd as it may sound, I love to get up before anyone else and sit, in silence, with a cup of coffee. I’m a one-a-day kind of girl, so it’s pretty sacred to me. Weekdays or weekends, I have so much going on, I need to set my day up in these few solitary moments. I am not a night person at all and can usually be found fast asleep on the sofa sometime around 9pm each night. Equally favorite is when my kids get up and we do “huggle” time together, before the day takes us away on our separate journeys.

Bleacher Report

Can you Name Some Recent Projects?
Gentleman Scholar worked on a big Acura TLX campaign, which is probably one of my all-time favorites. Other fun projects include Legends Club for Timberland, Upwork “Hey World!” campaign from Duncan Channon, the Sponsor Reel for the 2018 AICP Show and Bleacher Report’s Sports Alphabet.

If You Didn’t Have This Job, What Would You be Doing Instead?
I love photography, writing and traveling. So if I could do it all again, I’d be some kind of travel writer/photographer combo or a journalist or something. My brother actually does just that, and I’m super-proud of his choices. To stand behind your own creative point of view takes skill and dedication.

How Did You Know This Would Be Your Path?
The road has been long, and it has carried me from London to New York to Los Angeles. I originally started in post production and VFX, where I got a taste for creative problem-solving. The jump from this world to a creative production studio like Scholar was perfectly timed and I relished the learning curve that came with it. I think it’s quite hard to have a defined “path” these days.

My advice to anyone getting into our industry right now would be to understand that knowledge and education are powerful tools, so go out of your way to harness them. And never stand still; always keep pushing yourself.

Name Three Pieces of Technology You Can’t Live Without.
My AirPods — so happy to not have that charging/listening conflict with my iPhone anymore; all the apps that allow me to streamline my life and get shit done any time of day, no matter what, no matter where; and my electric toothbrush is pretty high up there too. Can I have one more? Not "tech" per se, but my super-cute mini hair straightener, which makes my bangs look on point, even after working out!

What Social Media Channels Do You Follow?
Well, I like Instagram mostly. Do you count Pinterest? I love a Pinterest board. I have many of those. And I read Twitter, but I don’t Tweet too much. To be honest, I’m pretty lame on social media, and all my accounts are private. But I realize they are such important tools in our industry so I use them on an as-needed basis. Also, it’s something I need to consider soon for my kids, who are obsessed with watching random, “how-to” videos online and periodically ask me, “Are you going to put that on YouTube?” So I need to keep on top of it, not just for work, but also for them. It will be their world very soon.

Do You Listen to Music While You Work? Care to Share Your Favorite Music to Work to?
Yes, I have a Sonos setup in my office. I listen to a lot of playlists — found ones and the random ones that your streaming services build for you. Earlier this morning I had an album called blkswn by Smino playing. Right now I'm listening to a band called Pronoun. They were on a playlist Nylon Studios released called "All the Brooklyn Bands You Should Be Listening To."

My drive home is all about the podcast. I'm trying to educate myself more on American history at the moment. I'm also tempted to get into Babbel and learn French. With all the hours I spend in the car, I'm pretty sure I would be fluent in no time!

What Do You Do to De-stress From it All?
So many things! I literally never stop. Hot yoga, spinning, hiking, mountain biking, cooking and thinking of new projects for my house. Road tripping, camping and exploring new places with my family and friends. Taking photographs and doing art projects with my kids. My all-time favorite thing to do is hit the beach for the day, winter and summer. I find it one of the most restorative places on Earth. I’m so happy to call LA my home. It suits me down to the ground!

Autodesk cloud-enabled tools now work with BeBop post platform

Autodesk has enabled use of its software in the cloud — including 3ds Max, Arnold, Flame and Maya — and BeBop Technology will deploy the tools on its cloud-based post platform. The BeBop platform enables processing-heavy post projects, such as visual effects and editing, in the cloud on powerful and highly secure virtualized desktops. Creatives can process, render, manage and deliver media files from anywhere on BeBop using any computer and an Internet connection as slow as 20Mbps.

The ongoing deployment of Autodesk software on the BeBop platform mirrors the ways BeBop and Adobe work closely together to optimize the experience of Adobe Creative Cloud subscribers. Adobe applications have been available natively on BeBop since April 2018.

Autodesk software users will now also gain access to BeBop Rocket Uploader, which enables ingestion of large media files at incredibly high speeds for a predictable monthly fee with no volume limits. Additionally, BeBop Over the Shoulder (OTS) enables secure and affordable remote collaboration, review and approval sessions in real-time. BeBop runs on all of the major public clouds, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.

Behind the Title: Carousel’s Head of VFX/CD Jeff Spangler

This creative has been an artist for as long as he can remember. "I've always loved the process of creation and can't imagine any career where I'm not making something," he says.

Name: Jeff Spangler

Company: NYC’s Carousel

Can you describe your company?
Carousel is a “creative collective” that was a response to this rapidly changing industry we all know and love. Our offerings range from agency creative services to editorial, design, animation (including both motion design and CGI), retouching, color correction, compositing, music licensing, content creation, and pretty much everything that falls between.

We have created a flexible workflow that covers everything from concept to execution (and delivery), while also allowing for clients whose needs are less all-encompassing to step on or off at any point in the process. That’s just one of the reasons we called ourselves Carousel — our clients have the freedom to climb on board for as much of the ride as they desire. And with the different disciplines all living under the same roof, we find that a lot of the inefficiencies and miscommunications that can get in the way of achieving the best possible result are eliminated.

What’s your job title?
Head of VFX/Creative Director

What does that entail?
That’s a really good question. There is the industry standard definition of that title as it applies to most companies. But it’s quite different if you are talking about a collective that combines creative with post production, animation and design. So for me, the dual role of CD and head of VFX works in a couple of ways. Where we have the opportunity to work with agencies, I am able to bring my experience and talents as a VFX lead to bear, communicating with the agency creatives and ensuring that the different Carousel artists involved are all able to collaborate and communicate effectively to get the work done.

Alternatively, when we work direct-to-client, I get involved much earlier in the process, collaborating with the Carousel creative directors to conceptualize and pitch new ideas, design brand elements, visualize concept art, storyboard and write copy or even work with strategists to help hone the direction and target of a campaign.

That’s the true strength of Carousel — getting creatives from different backgrounds involved early on in the process where their experience and talent can make a much bigger impact in the long run. Most importantly, my role is not about dictating direction as much as it is about guiding and allowing for people’s talents to shine. You have to give artists the room to flourish if you really want to serve your clients and are serious about getting them something more than what they expected.

What would surprise people the most about what falls under that title?
I think that there is this misconception that it’s one creative sitting in a room that comes up with the “Big Idea” and he or she just dictates that idea to everyone. My experience is that any good idea started out as a lot of different ideas that were merged, pruned, refined and polished until they began to resemble something truly great.

Then after 24 hours, you look at that idea again and tear it apart because all of the flaws have started to show and you realize it still needs to be pummeled into shape. That process is generally a collaboration within a group of talented people who all look at the world very differently.

What tools do you use?
Anything that I can get my hands on (and my brain wrapped around). My foundation is as a traditional artist and animator and I find that those core skills are really the strength behind what I do everyday. I started out after college as a broadcast designer and later transitioned into a Flame artist where I spent many years working as a beauty retouch artist and motion designer.

These days, I primarily use Adobe Creative Suite as my role has become more creative in nature. I use Photoshop for digital painting and concept art, Illustrator for design and InDesign for layouts and decks. I also have a lot of experience in After Effects and Autodesk Maya and will use those tools for any animation or CGI that requires me to be hands-on, even if just to communicate the initial concept or design.

What’s your favorite part of the job?
Coming up with new ideas at the very start. At that point, the gloves are off and everything is possible.

What’s your least favorite?
Navigating politics within the industry that can sometimes get in the way of people doing their best work.

What is your favorite time of the day?
I’m definitely more of a night person. But if I had to choose a favorite time of day, it would be early morning — before everything has really started and there’s still a ton of anticipation and potential.

If you didn’t have this job, what would you be doing instead?
Working as a full-time concept artist. Or a logo designer. While I frequently have the opportunity to do both of those things in my role at Carousel, they are, for me, the most rewarding expression of being creative.

A&E’s Scraps

How early on did you know this would be your path?
I’ve been an artist for as long as I can remember and never really had any desire (or ability) to set it aside. I’ve always loved the process of creation and can’t imagine any career where I’m not “making” something.

Can you name some recents projects you have worked on?
We are wrapping up Season 2 of an A&E food show titled Scraps that has allowed us to flex our animation muscles. We’ve also been doing some in-store work with Victoria’s Secret for some of their flagship stores that has been amazing in terms of collaboration and results.

What is the project that you are most proud of?
It’s always hard to pick a favorite and my answer would probably change if you asked me more than once. But I recently had the opportunity to work with an up-and-coming eSports company to develop their logo. Collaborating with their CD, we landed on a design and aesthetic that makes me smile every time I see it out there. The client has taken that initial work and continues to surprise me with the way they use it across print, social media, swag, etc. Seeing their ability to be creative and flexible with what I designed is just validation that I did a good job. That makes me proud.

Name pieces of technology you can’t live without.
My iPad Pro. It’s my portable sketch tablet and presentation device that also makes for a damn good movie player during long commutes.

What do you do to de-stress from it all?
Muay Thai. Don’t get me wrong. I’m no serious martial artist and have never had the time to dedicate myself properly. But working out by punching and kicking a heavy bag can be very cathartic.

Review: Boris FX’s Continuum and Mocha Pro 2019

By Brady Betzel

I realize I might sound like a broken record, but if you are looking for the best plugin to help with object removals or masking, you should seriously consider the Mocha Pro plugin. And if you work inside of Avid Media Composer, you should also seriously consider Boris Continuum and/or Sapphire, which can use the power of Mocha.

As an online editor, I consistently use Continuum along with Mocha for tight blur and mask tracking. If you use After Effects, there is even a whittled-down version of Mocha built in for free. For those pros who don’t want to deal with Mocha inside of an app, it also comes as a standalone software solution where you can copy and paste tracking data between apps or even export the masks, object removals or insertions as self-contained files.

The latest releases of Continuum and Mocha Pro 2019 continue the evolution of Boris FX’s role in post production image restoration, keying and general VFX plugins, at least inside of NLEs like Media Composer and Adobe Premiere.

Mocha Pro

As an online editor I am always calling on Continuum for its great Chroma Key Studio, Flicker Fixer and blurring. Because Mocha is built into Continuum, I am able to quickly track (backwards and forwards) difficult shapes and even erase shapes that the built-in Media Composer tools simply can’t do. But if you are lucky enough to own Mocha Pro, you also get access to some amazing tools that go beyond planar tracking — such as automated object removal, object insertion, stabilizing and much more.

Boris FX’s latest updates to Continuum and Mocha Pro go even further than what I’ve already mentioned and have resulted in new version naming: this round we are at 2019 (think of it as Version 12). Boris FX has also created the new Application Manager, which makes it a little easier to find the latest downloads. This really helps when jumping between machines and you need to quickly activate and deactivate licenses.

Boris Continuum 2019
I often get offline edits with effects from a variety of plugins — lens flares, random edits, light flashes, whip transitions and many more — so I need Continuum to be compatible with offline clients. I also need to use it for image repair and compositing.

In this latest version of Continuum, Boris FX has not only kept plugins like Primatte Studio but also brought back Particle Illusion and updated Mocha and Title Studio. Overall, Continuum and Mocha Pro 2019 feel a lot snappier when applying and rendering effects, probably because of the overall GPU-acceleration improvements.

Particle Illusion has been brought back from the brink of death in Continuum 2019 as a 64-bit, keyframe-able particle emitter system that can even be tracked and masked with Mocha. In this revamp of Particle Illusion there is an updated interface, realtime GPU-based particle generation, an expanded and improved emitter library (complete with motion-blur-enabled particle systems) and even a standalone app for designing systems to be used in the host app — you cannot render systems inside of the standalone app.

While Particle Illusion is a part of the entire Continuum toolset that works with OFX apps like Blackmagic’s DaVinci Resolve, Media Composer, After Effects and Premiere, it seems to work best in applications like After Effects, which can handle composites simply and naturally. Inside the Particle Illusion interface you can find all of the pre-built emitters. If you only have a handful, make sure you download additional emitters, which you can find in the Boris FX App Manager.

Particle Illusion: Before and After

I had a hard time seeing my footage in a Media Composer timeline inside of Particle Illusion, but I could still pick my emitter, change specs like life and opacity, exit out and apply to my footage. I used Mocha to track some fire from Particle Illusion to a dumpster I had filmed. Once I dialed in the emitter, I launched Mocha and tracked the dumpster.

The first time I went into Mocha I didn’t see the preset tracks for the emitter or the world in which the emitter lives. The second time I launched Mocha, I saw track points. From there you can track where you want your emitter to track and be placed. Once you are done and happy with your track, jump back to your timeline where it should be reflected. In Media Composer I noticed that I had to go to the Mocha options and change the option from Mocha Shape to no shape. Essentially, the Mocha shape will act like a matte and cut off anything outside the matte.

If you are inside of After Effects, most parameters can now be keyframed and parented (aka pick-whipped) natively in the timeline. The Particle Illusion plugin is a quick, easy and good-looking tool to add sparks, Milky Way-like star trails or even fireworks to any scene. Check out @SurfacedStudio’s tutorial on Particle Illusion to get a good sense of how it works in Adobe Premiere Pro.

Continuum Title Studio
When inside of Media Composer (prior to the latest release 2018.12), there were very few ways to create titles that were higher resolution than HD (1920×1080) — the New Blue Titler was the only other option if you wanted to stay within Media Composer.

Title Studio within Media Composer

At first, the Continuum Title Studio interface appeared to be a mildly updated Boris Red interface — and I am allergic to the Boris Red interface. Some of the icons for the keyframing and the way properties are adjusted look similar and threw me off. I tried really hard to jump into Title Studio and love it, but I never really got comfortable with it.

On the flip side, there are hundreds of presets that could help build quick titles that render a lot faster than New Blue Titler did. In some of the presets I noticed the text was placed outside of 16×9 Title Safety, which is odd since that is kind of a long-standing rule in television. In the author’s defense, they are within Action Safety, but still.

If you need a quick way to make 4K titles, Title Studio might be what you want. The updated Title Studio includes realtime playback using the GPU instead of the CPU, new materials, new shaders and external monitoring support using Blackmagic hardware (AJA will be coming at some point). There are some great presets, including pre-built slates, lower thirds, kinetic text and even progress bars.

If you don’t have Mocha Pro, Continuum can still access and use Mocha to track shapes and masks. Almost every plugin can access Mocha and can track objects quickly and easily.

That brings me to the newly updated Mocha, which has some new features that are extremely helpful, including a Magnetic Spline tool, prebuilt geometric shapes and more.

Mocha Pro 2019
If you loved the previous version of Mocha, you are really going to love Mocha Pro 2019. Not only do you get the Magnetic Spline tool, pre-built geometric shapes, the Essentials interface and high-resolution display support, but Boris FX has rewritten the Remove Module code to use GPU video hardware. This increases render speeds about four to five times. In addition, there is no longer a separate Mocha VR software suite. All of the VR tools are included inside of Mocha Pro 2019.

If you are unfamiliar with what Mocha is, then I have a treat for you. Mocha is a standalone planar tracking app as well as a native plugin that works with Media Composer, Premiere and After Effects, or through OFX in Blackmagic’s Fusion, Foundry’s Nuke, Vegas Pro and Hitfilm.

Mocha tracking

In addition (and unofficially) it will work with Blackmagic DaVinci Resolve by way of importing the Mocha masks through Fusion. While I prefer to use After Effects for my work, importing Mocha masks is relatively painless. You can watch colorist Dan Harvey run through the process of importing Mocha masks to Resolve through Fusion, here.

But really, Mocha is a planar tracker, which means it tracks multiple points in a defined area and works best on flat surfaces, or at least segmented surfaces: the side of a face, ear, nose, mouth and forehead tracked separately instead of all at once. From blurs to mattes, Mocha sticks to objects like glue and can be a great asset for an online editor or colorist.
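In concrete terms, a planar tracker estimates a homography (a 3x3 perspective transform) that maps the tracked plane from one frame to the next; corner pins, insert transforms and matte warps all fall out of that matrix. A minimal NumPy sketch of the underlying math, using the classic direct linear transform (illustrative only, not Mocha's actual implementation):

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: fit a 3x3 homography mapping src -> dst.
    src, dst: (N, 2) arrays of corresponding points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def apply_homography(H, pts):
    """Map (N, 2) points through H using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]

# A unit square "tracked" to a skewed quad on the next frame.
src = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
dst = np.array([[10, 10], [20, 12], [22, 24], [9, 21]], dtype=float)
H = fit_homography(src, dst)
print(np.allclose(apply_homography(H, src), dst))  # True
```

A real tracker fits this transform to hundreds of texture points per frame and rejects outliers, but the output is the same kind of matrix, which is why a Mocha track can drive blurs, mattes and inserts alike.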

If you have read any of my plugin reviews you probably are sick of me spouting off about Mocha, saying how it is probably the best plugin ever made. But really, it is amazing — especially when incorporated with plugins like Continuum and Sapphire. Also, thanks to the latest Media Composer with Symphony option you can incorporate the new Color Correction shapes with Mocha Pro to increase the effectiveness of your secondary color corrections.

Mocha Pro Remove module

So how fast is Mocha Pro 2019’s Remove Module these days? Well, it used to be a very slow process, taking lots of time to calculate an object’s removal. With the latest Mocha Pro 2019 release, including improved GPU support, the render time has been cut down tremendously. In my estimation, I would say three to four times the speed (that’s on the safe side). In Mocha Pro 2019 removal jobs that take under 30 seconds would have taken four to five minutes in previous versions. It’s quite a big improvement in render times.

There are a few changes in the new Mocha Pro, including interface changes and some amazing tool additions. There is a new drop-down tab that offers different workflow views once you are inside of Mocha: Essentials, Classic, Big Picture and Roto. I really wish the Essentials view was out when I first started using Mocha, because it gives you the basic tools you need to get a roto job done and nothing more.

For instance, just giving access to the track motion objects (Translation, Scale, Rotate, Skew and Perspective) with big shiny buttons helps to eliminate my need to watch YouTube videos on how to navigate the Mocha interface. However, if, like me, you are more than just a beginner, the Classic interface is still available and one I reach for most often — it’s literally the old interface. Big Picture hides the tools and gives you the most screen real estate for your roto work. My favorite after Classic is Roto. The Roto interface shows just the project window and the classic top toolbar. It’s the best of both worlds.

Mocha Pro 2019 Essentials Interface

Beyond the interface changes are some additional tools that will speed up any roto work, answering one of the longest-running user requests. I imagine the feature Boris FX gets asked for most in Mocha is the addition of basic shapes, such as rectangles and circles. In my work, I am often drawing rectangles around license plates or circles around faces with X-splines, so why not eliminate a few clicks and have that done already? Mocha now has elliptical and rectangular shapes ready to go in both X-splines and B-splines with one click.

I use Continuum and Mocha hand in hand. Inside of Media Composer I will use tools like Gaussian Blur or Remover, which typically need tracking and roto shapes created. Once I apply the Continuum effect, I launch Mocha from the Effect Editor and bam, I am inside Mocha. From here I track the objects I want to affect, as well as any objects I don’t want to affect (think of it like an erase track).

Summing Up
I can save tons of time and also improve the effectiveness of my work exponentially when working in Continuum 2019 and Mocha Pro 2019. It’s amazing how much more intuitive Mocha is to track with instead of the built-in Media Composer and Symphony trackers.

In the end, I can’t say enough great things about Continuum and especially Mocha Pro. Mocha saves me tons of time in my VFX and image restoration work. From removing camera people behind the main cast in the wilderness to blurring faces and license plates, using Mocha in tandem with Continuum is a match made in post production heaven.

Rendering in Continuum and Mocha Pro 2019 is a lot faster than in previous versions, really giving me a leg up on efficiency. Time is money, right? On top of that, using Mocha Pro’s magic Remove Module takes my image restoration work to the next level, separating me from other online editors who use standard paint and tracking tools.

In Continuum, Primatte Studio gives me a leg up on greenscreen keys with its exceptional ability to auto-analyze a scene and perform 80% of the keying work before I dial in the details. Whenever anyone asks me what tools I couldn’t live without, I say Mocha, without a doubt.

If you want a real Mocha Pro education, you need to watch all of Mary Poplin’s tutorials. You can find them on YouTube. Check out this one on how to track and replace a logo using Mocha Pro 2019 in Adobe After Effects. You can also find great videos at Borisfx.com.

Mocha point parameter tracking

I always feel like there are tons of tools inside of the Mocha Pro toolset that go unused simply because I don’t know about them. One I recently learned about in a Surfaced Studio tutorial was the Quick Stabilize function. It essentially stabilizes the video around the object you are tracking, allowing you to more easily rotoscope your object with it sitting still instead of moving all over the screen. It’s an amazing feature that I just didn’t know about.

As I was finishing up this review I saw that Boris FX came out with a training series, which I will be checking out. One thing I always wanted was a top-down set of tutorials like the ones on Mocha’s YouTube page but organized and sent along with practical footage to practice with.

You can check out Curious Turtle’s “More Than The Essentials: Mocha in After Effects” on their website where I found more Mocha training. There is even a great search parameter called Getting Started on BorisFX.com. Definitely check them out. You can never learn enough Mocha!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Left Field Labs ECD Yann Caloghiris

NAME: Yann Caloghiris

COMPANY: Left Field Labs (@LeftFieldLabs)

CAN YOU DESCRIBE YOUR COMPANY?
Left Field Labs is a Venice, California-based creative agency dedicated to applying creativity to emerging technologies. We create experiences at the intersection of strategy, design and code for our clients, who include Google, Uber, Discovery and Estée Lauder.

But it’s how we go about our business that has shaped who we have become. Over the past 10 years, we have consciously moved away from the traditional agency model and have grown by deepening our expertise, sourcing exceptional talent and, most importantly, fostering a “lab-like” creative culture of collaboration and experimentation.

WHAT’S YOUR JOB TITLE?
Executive Creative Director

WHAT DOES THAT ENTAIL?
My role is to drive the creative vision across our client accounts, as well as our own ventures. In practice, that can mean anything from providing insights for ongoing work to proposing creative strategies to running ideation workshops. Ultimately, it’s whatever it takes to help the team flourish and push the envelope of our creative work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably that I learn more now than I did at the beginning of my career. When I started, I imagined that the executive CD roles were occupied by seasoned industry veterans, who had seen and done it all, and would provide tried and tested direction.

Today, I think that cliché is out of touch with what’s required from agency culture and where the industry is going. Sure, some aspects of the role remain unchanged — such as being a supportive team lead or appreciating the value of great copy — but the pace of change is such that the role often requires both the ability to leverage past experience and accept that sometimes a new paradigm is emerging and assumptions need to be adjusted.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Working with the team, and the excitement that comes from workshopping the big ideas that will anchor the experiences we create.

WHAT’S YOUR LEAST FAVORITE?
The administrative parts of a creative business are not always the most fulfilling. Thankfully, tasks like timesheeting, expense reporting and invoicing are becoming less exhaustive thanks to better predictive tools and machine learning.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
The early hours of the morning, usually when inspiration strikes — when we haven’t had to deal with the unexpected day-to-day challenges that come with managing a busy design studio.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d probably be somewhere at the cross-section between an artist, like my mum was, and an engineer like my dad. There is nothing more satisfying than to apply art to an engineering challenge or vice versa.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I went to school in France, and there wasn’t much room for anything other than school and homework. When I got my Baccalaureate, I decided that from that point onward, whatever I did, it would be fun, deeply engaging and at a place where being creative was an asset.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently partnered with ad agency RK Venture to craft a VR experience for the New Mexico Department of Transportation’s ongoing ENDWI campaign, which immerses viewers into a real-life drunk-driving scenario.

ENDWI

To best communicate and tell the human side of this story, we turned to rapid breakthroughs within volumetric capture and 3D scanning. Working with Microsoft’s Mixed Reality Capture Studio, we were able to bring every detail of an actor’s performance to life with volumetric performance capture in a way that previous techniques could not.

Bringing a real actor’s performance into a virtual experience is a game changer because of the emotional connection it creates. For ENDWI, the combination of rich immersion with compelling non-linear storytelling proved to affect the participants at a visceral level — with the goal of changing behavior further down the road.

Throughout this past year, we partnered with the VMware Cloud Marketing Team to create a one-of-a-kind immersive booth experience for VMworld Las Vegas 2018 and Barcelona 2018 called Cloud City. VMware’s cloud offering needed a distinct presence to foster a deeper understanding and greater connectivity between brand, product and customers stepping into the cloud.

Cloud City

Our solution was Cloud City, a destination merging future-forward architecture, light, texture, sound and interactions with VMware Cloud experts to give consumers a window into how the cloud, and more specifically how VMware Cloud, can be an essential solution for them. VMworld is the brand’s premier engagement, where hands-on learning helped showcase its cloud offerings. Cloud City garnered 4,000-plus demos, which led to a 20% lead conversion in 10 days.

Finally, for Google, we designed and built a platform for the hosting of online events anywhere in the world: Google Gather. For its first release, teams across Google, including Android, Cloud and Education, used Google Gather to reach and convert potential customers across the globe. With hundreds of events to date, the platform now reaches enterprise decision-makers at massive scale, spanning far beyond what has been possible with traditional event marketing, management and hosting.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Recently, a friend and I shot and edited a fun video homage to the original technology boom-town: Detroit, Michigan. It features two cultural icons from the region, an original big block ‘60s muscle car and some gritty electro beats. My four-year-old son thinks it’s the coolest thing he’s ever seen. It’s going to be hard for me to top that.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Human flight, the Internet and our baby monitor!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram, Twitter, Medium and LinkedIn.

CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Where to start?! Music has always played an important part of my creative process, and the joy I derive from what we do. I have day-long playlists curated around what I’m trying to achieve during that time. Being able to influence how I feel when working on a brief is essential — it helps set me in the right mindset.

Sometimes, it might be film scores when working on visuals, jazz to design a workshop schedule or techno to dial-up productivity when doing expenses.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Spend time with my kids. They remind me that there is a simple and unpretentious way to look at life.

Efilm’s Natasha Leonnet: Grading Spider-Man: Into the Spider-Verse

By Randi Altman

Sony Pictures’ Spider-Man: Into the Spider-Verse is not your typical Spider-Man film… in so many ways. The most obvious is the movie’s look, which was designed to make the viewer feel they are walking inside a comic book. This tale, which blends CGI with 2D hand-drawn animation and comic book textures, focuses on a Brooklyn teen who is bitten by a radioactive spider on the subway and soon develops special powers.

Natasha Leonnet

When he meets Peter Parker, he realizes he’s not alone in the Spider-Verse. It was co-directed by Peter Ramsey, Robert Persichetti Jr. and Rodney Rothman and produced by Phil Lord and Chris Miller, the pair behind 21 Jump Street and The Lego Movie.

Efilm senior colorist Natasha Leonnet provided the color finish for the film, which was nominated for an Oscar in the Best Animated Feature category. We reached out to find out more.

How early were you brought on the film?
I had worked on Angry Birds with visual effects supervisor Danny Dimian, which is how I was brought onto the film. It was a few months before we started color correction. Also, there was no LUT for the film. They used the ACES workflow, developed by the Academy in collaboration with Efilm’s VP of technology, Joachim “JZ” Zell.

Can you talk about the kind of look they were after and what it took to achieve that look?
They wanted to achieve a comic book look. If you look at the edges of characters or objects in comic books, you actually see artifacts of the color printing from the early days of comic book publishing — the CMYK dyes wouldn’t all land on the same line — which creates a layered look, along with the comic book dots and the expression lines on faces, as if you’re drawing a comic book.

For example, if someone gets hurt you put actual slashes on their face. For me it was a huge education about the comic book art form. Justin Thompson, the art director, in particular is so knowledgeable about the history of comic books. I was so inspired I just bought my first comic book. Also, with the overall look, the light is painting color everywhere the way it does in life.

You worked closely with Justin, VFX supervisor Danny Dimian and art director Dean Gordon. What was that process like?
They were incredible. It was usually a group of us working together during the color sessions — a real exercise in collaboration. They were all so open to each other’s opinions and constantly discussing every change in order to make certain that the change best served the film. There was no idea that was more important than another idea. Everyone listened to each other’s ideas.

Had you worked on an animated film previously? What are the challenges and benefits of working with animation?
I’ve been lucky enough to do all of Blue Sky Studios’ color finishes so far, except for the first Ice Age. One of the special aspects of working on animated films is that you’re often working with people who are fine-art painters. As a result, they bring in a different background and way of analyzing the images. That’s really special. They often focus on the interplay of different hues.

In the case of Spider-Man: Into the Spider-Verse, they also wanted to bring a certain naturalism to the color experience. With this particular film, they made very bold choices with their use of color finishing. They used an aspect of color correctors that shifts all of the hues and colors — something usually reserved for music videos. They completely embraced it. They were basically using color finishing to augment the story and refine their hues, especially time of day and progression of the day or night. They used it as their extra lighting step.

Can you talk about your typical process? Did that differ because of the animated content?
My process actually does not differ when I’m color finishing animated content. Continuity is always at the forefront, even in animation. I use the color corrector as a creative tool on every project.

How would you describe the look of the film?
The film embodies the vivid and magical colors that I always observed in childhood but never saw reflected on the screen. The film is very color intense. It’s as if you’re stepping inside a comic book illustrator’s mind. It’s a mind-meld with how they’re imagining things.

What system did you use for color and why?
I used Resolve on this project, as it was the system that the clients were most familiar with.

Any favorite parts of the process?
My favorite part is from start to finish. It was all magical on this film.

What was your path to being a colorist?
My parents loved going to the cinema. They didn’t believe in babysitters, so they took me to everything. They were big fans of the French new wave movement and films that offered unconventional ways of depicting the human experience. As a result, I got to see some pretty unusual films. I got to see how passionate my parents were about these films and their stories and unusual way of telling them, and it sparked something in me. I think I can give my parents full credit for my career.

I studied non-narrative experimental filmmaking in college even though ultimately my real passion was narrative film. I started as a runner in the Czech Republic, which is where I’d made my thesis film for my BA degree. From there I worked my way up and met a colorist (Biggi Klier) who really inspired me. I was hooked and lucky enough to study with her and another mentor of mine in Munich, Germany.

How do you prefer a director and DP describe a look?
Every single person I’ve worked with works differently, and that’s what makes it so fun and exciting, but also challenging. Every person communicates about color differently, and our vocabulary for color is so limited; therein lies the challenge.

Where do you find inspiration?
From both the natural world and the world of films. I live in a place that faces east, and I get up every morning to watch the sunrise, and the color palette is always different. It’s beautiful and inspiring. The winter palettes in particular are gorgeous, with reds and oranges that don’t exist in summer sunrises.

Avengers: Infinity War leads VES Awards with six noms

The Visual Effects Society (VES) has announced the nominees for the 17th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Avengers: Infinity War garners the most feature film nominations with six. Incredibles 2 is the top animated film contender with five nominations, and Lost in Space leads the broadcast field with six nominations.

Nominees in 24 categories were selected by VES members via events hosted by 11 of the organization’s Sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on February 5th at the Beverly Hilton Hotel. As previously announced, the VES Visionary Award will be presented to writer/director/producer and co-creator of Westworld Jonathan Nolan. The VES Award for Creative Excellence will be given to award-winning creators/executive producers/writers/directors David Benioff and D.B. Weiss of Game of Thrones fame. Actor-comedian-author Patton Oswalt will once again host the VES Awards.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

Avengers: Infinity War

Daniel DeLeeuw

Jen Underdahl

Kelly Port

Matt Aitken

Daniel Sudick

 

Christopher Robin

Christopher Robin

Chris Lawrence

Steve Gaub

Michael Eames

Glenn Melenhorst

Chris Corbould

 

Ready Player One

Roger Guyett

Jennifer Meislohn

David Shirk

Matthew Butler

Neil Corbould

 

Solo: A Star Wars Story

Rob Bredow

Erin Dusseault

Matt Shumway

Patrick Tubach

Dominic Tuohy

 

Welcome to Marwen

Kevin Baillie

Sandra Scott

Seth Hill

Marc Chu

James Paradis

 

Outstanding Supporting Visual Effects in a Photoreal Feature 

12 Strong

Roger Nall

Robert Weaver

Mike Meinardus

 

Bird Box

Marcus Taormina

David Robinson

Mark Bakowski

Sophie Dawes

Mike Meinardus

 

Bohemian Rhapsody

Paul Norris

Tim Field

May Leung

Andrew Simmonds

 

First Man

Paul Lambert

Kevin Elam

Tristan Myles

Ian Hunter

JD Schwalm

 

Outlaw King

Alex Bicknell

Dan Bethell

Greg O’Connor

Stefano Pepin

 

Outstanding Visual Effects in an Animated Feature

Dr. Seuss’ The Grinch

Pierre Leduc

Janet Healy

Bruno Chauffard

Milo Riccarand

 

Incredibles 2

Brad Bird

John Walker

Rick Sayre

Bill Watral

 

Isle of Dogs

Mark Waring

Jeremy Dawson

Tim Ledbury

Lev Kolobov

 

Ralph Breaks the Internet

Scott Kersavage

Bradford Simonsen

Ernest J. Petti

Cory Loftis

 

Spider-Man: Into the Spider-Verse

Joshua Beveridge

Christian Hejnal

Danny Dimian

Bret St. Clair

 

Outstanding Visual Effects in a Photoreal Episode

Altered Carbon; Out of the Past

Everett Burrell

Tony Meagher

Steve Moncur

Christine Lemon

Joel Whist

 

Krypton; The Phantom Zone

Ian Markiewicz

Jennifer Wessner

Niklas Jacobson

Martin Pelletier

 

Lost in Space; Danger, Will Robinson

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Joao Sita

 

The Terror; Go For Broke

Frank Petzold

Lenka Líkařová

Viktor Muller

Pedro Sabrosa

 

Westworld; The Passenger

Jay Worth

Elizabeth Castro

Bruce Branit

Joe Wehmeyer

Michael Lantieri

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Tom Clancy’s Jack Ryan; Pilot

Erik Henry

Matt Robken

Bobo Skipper

Deak Ferrand

Pau Costa

 

The Alienist; The Boy on the Bridge

Kent Houston

Wendy Garfinkle

Steve Murgatroyd

Drew Jones

Paul Stephenson

 

The Deuce; We’re All Beasts

Jim Rider

Steven Weigle

John Bair

Aaron Raff

 

The First; Near and Far

Karen Goulekas

Eddie Bonin

Roland Langschwert

Bryan Godwin

Matthew James Kutcher

 

The Handmaid’s Tale; June

Brendan Taylor

Stephen Lebed

Winston Lee

Leo Bovell

 

Outstanding Visual Effects in a Realtime Project

Age of Sail

John Kahrs

Kevin Dart

Cassidy Curtis

Theresa Latzko

 

Cycles

Jeff Gipson

Nicholas Russell

Lauren Nicole Brown

Jorge E. Ruiz Cano

 

Dr Grordbort’s Invaders

Greg Broadmore

Mhairead Connor

Steve Lambert

Simon Baker

 

God of War

Maximilian Vaughn Ancar

Corey Teblum

Kevin Huynh

Paolo Surricchio

 

Marvel’s Spider-Man

Grant Hollis

Daniel Wang

Seth Faske

Abdul Bezrati

 

Outstanding Visual Effects in a Commercial 

Beyond Good & Evil 2

Maxime Luere

Leon Berelle

Remi Kozyra

Dominique Boidin

 

John Lewis; The Boy and the Piano

Kamen Markov

Philip Whalley

Anthony Bloor

Andy Steele

 

McDonald’s; #ReindeerReady

Ben Cronin

Josh King

Gez Wright

Suzanne Jandu

 

U.S. Marine Corps; A Nation’s Call

Steve Drew

Nick Fraser

Murray Butler

Greg White

Dave Peterson

 

Volkswagen; Born Confident

Carsten Keller

Anandi Peiris

Dan Sanders

Fabian Frank

 

Outstanding Visual Effects in a Special Venue Project

Beautiful Hunan; Flight of the Phoenix

R. Rajeev

Suhit Saha

Arish Fyzee

Unmesh Nimbalkar

 

Childish Gambino’s Pharos

Keith Miller

Alejandro Crawford

Thelvin Cabezas

Jeremy Thompson

 

DreamWorks Theatre Presents Kung Fu Panda

Marc Scott

Doug Cooper

Michael Losure

Alex Timchenko

 

Osheaga Music and Arts Festival

Andre Montambeault

Marie-Josee Paradis

Alyson Lamontagne

David Bishop Noriega

 

Pearl Quest

Eugénie von Tunzelmann

Liz Oliver

Ian Spendloff

Ross Burgess

 

Outstanding Animated Character in a Photoreal Feature

Avengers: Infinity War; Thanos

Jan Philip Cramer

Darren Hendler

Paul Story

Sidney Kombo-Kintombo

 

Christopher Robin; Tigger

Arslan Elver

Kayn Garcia

Laurent Laban

Mariano Mendiburu

 

Jurassic World: Fallen Kingdom; Indoraptor

Jance Rubinchik

Ted Lister

Yannick Gillain

Keith Ribbons

 

Ready Player One; Art3mis

David Shirk

Brian Cantwell

Jung-Seung Hong

Kim Ooi

 

Outstanding Animated Character in an Animated Feature

Dr. Seuss’ The Grinch; The Grinch

David Galante

Francois Boudaille

Olivier Luffin

Yarrow Cheney

 

Incredibles 2; Helen Parr

Michal Makarewicz

Ben Porter

Edgar Rodriguez

Kevin Singleton

 

Ralph Breaks the Internet; Ralphzilla

Dong Joo Byun

Dave K. Komorowski

Justin Sklar

Le Joyce Tong

 

Spider-Man: Into the Spider-Verse; Miles Morales

Marcos Kang

Chad Belteau

Humberto Rosa

Julie Bernier Gosselin

 

Outstanding Animated Character in an Episode or Realtime Project

Cycles; Rae

Jose Luis Gomez Diaz

Edward Everett Robbins III

Jorge E. Ruiz Cano

Jose Luis -Weecho- Velasquez

 

Lost in Space; Humanoid

Chad Shattuck

Paul Zeke

Julia Flanagan

Andrew McCartney

 

Nightflyers; All That We Have Found; Eris

Peter Giliberti

James Chretien

Ryan Cromie

Cesar Dacol Jr.

 

Spider-Man; Doc Ock

Brian Wyser

Henrique Naspolini

Sophie Brennan

William Salyers

 

Outstanding Animated Character in a Commercial

McDonald’s; Bobbi the Reindeer

Gabriela Ruch Salmeron

Joe Henson

Andrew Butler

Joel Best

 

Overkill’s The Walking Dead; Maya

Jonas Ekman

Goran Milic

Jonas Skoog

Henrik Eklundh

 

Peta; Best Friend; Lucky

Bernd Nalbach

Emanuel Fuchs

Sebastian Plank

Christian Leitner

 

Volkswagen; Born Confident; Bam

David Bryan

Chris Welsby

Fabian Frank

Chloe Dawe

 

Outstanding Created Environment in a Photoreal Feature

Ant-Man and the Wasp; Journey to the Quantum Realm

Florian Witzel

Harsh Mistri

Yuri Serizawa

Can Yuksel

 

Aquaman; Atlantis

Quentin Marmier

Aaron Barr

Jeffrey De Guzman

Ziad Shureih

 

Ready Player One; The Shining, Overlook Hotel

Mert Yamak

Stanley Wong

Joana Garrido

Daniel Gagiu

 

Solo: A Star Wars Story; Vandor Planet

Julian Foddy

Christoph Ammann

Clement Gerard

Pontus Albrecht

 

Outstanding Created Environment in an Animated Feature

Dr. Seuss’ The Grinch; Whoville

Loic Rastout

Ludovic Ramiere

Henri Deruer

Nicolas Brack

 

Incredibles 2; Parr House

Christopher M. Burrows

Philip Metschan

Michael Rutter

Joshua West

 

Ralph Breaks the Internet; Social Media District

Benjamin Min Huang

Jon Kim Krummel II

Gina Warr Lawes

Matthias Lechner

 

Spider-Man: Into the Spider-Verse; Graphic New York City

Terry Park

Bret St. Clair

Kimberly Liptrap

Dave Morehead

 

Outstanding Created Environment in an Episode, Commercial, or Realtime Project

Cycles; The House

Michael R.W. Anderson

Jeff Gipson

Jose Luis Gomez Diaz

Edward Everett Robbins III

 

Lost in Space; Pilot; Impact Area

Philip Engström

Kenny Vähäkari

Jason Martin

Martin Bergquist

 

The Deuce; 42nd St

John Bair

Vance Miller

Jose Marin

Steve Sullivan

 

The Handmaid’s Tale; June; Fenway Park

Patrick Zentis

Kevin McGeagh

Leo Bovell

Zachary Dembinski

 

The Man in the High Castle; Reichsmarschall Ceremony

Casi Blume

Michael Eng

Ben McDougal

Sean Myers

 

Outstanding Virtual Cinematography in a Photoreal Project

Aquaman; Third Act Battle

Claus Pedersen

Mohammad Rastkar

Cedric Lo

Ryan McCoy

 

Echo; Time Displacement

Victor Perez

Tomas Tjernberg

Tomas Wall

Marcus Dineen

 

Jurassic World: Fallen Kingdom; Gyrosphere Escape

Pawl Fulker

Matt Perrin

Oscar Faura

David Vickery

 

Ready Player One; New York Race

Daniele Bigi

Edmund Kolloen

Mathieu Vig

Jean-Baptiste Noyau

 

Welcome to Marwen; Town of Marwen

Kim Miles

Matthew Ward

Ryan Beagan

Marc Chu

 

Outstanding Model in a Photoreal or Animated Project 

Avengers: Infinity War; Nidavellir Forge Megastructure

Chad Roen

Ryan Rogers

Jeff Tetzlaff

Ming Pan

 

Incredibles 2; Underminer Vehicle

Neil Blevins

Philip Metschan

Kevin Singleton

 

Mortal Engines; London

Matthew Sandoval

James Ogle

Nick Keller

Sam Tack

 

Ready Player One; DeLorean DMC-12

Giuseppe Laterza

Kim Lindqvist

Mauro Giacomazzo

William Gallyot

 

Solo: A Star Wars Story; Millennium Falcon

Masa Narita

Steve Walton

David Meny

James Clyne

 

Outstanding Effects Simulations in a Photoreal Feature

Avengers: Infinity War; Titan

Gerardo Aguilera

Ashraf Ghoniem

Vasilis Pazionis

Hartwell Durfor

 

Avengers: Infinity War; Wakanda

Florian Witzel

Adam Lee

Miguel Perez Senent

Francisco Rodriguez

 

Fantastic Beasts: The Crimes of Grindelwald

Dominik Kirouac

Chloe Ostiguy

Christian Gaumond

 

Venom

Aharon Bourland

Jordan Walsh

Aleksandar Chalyovski

Federico Frassinelli

 

Outstanding Effects Simulations in an Animated Feature

Dr. Seuss’ The Grinch; Snow, Clouds and Smoke

Eric Carme

Nicolas Brice

Milo Riccarand

 

Incredibles 2

Paul Kanyuk

Tiffany Erickson Klohn

Vincent Serritella

Matthew Kiyoshi Wong

 

Ralph Breaks the Internet; Virus Infection & Destruction

Paul Carman

Henrik Fält

Christopher Hendryx

David Hutchins

 

Smallfoot

Henrik Karlsson

Theo Vandernoot

Martin Furness

Dmitriy Kolesnik

 

Spider-Man: Into the Spider-Verse

Ian Farnsworth

Pav Grochola

Simon Corbaux

Brian D. Casper

 

Outstanding Effects Simulations in an Episode, Commercial, or Realtime Project

Altered Carbon

Philipp Kratzer

Daniel Fernandez

Xavier Lestourneaud

Andrea Rosa

 

Lost in Space; Jupiter is Falling

Denys Shchukin

Heribert Raab

Michael Billette

Jaclyn Stauber

 

Lost in Space; The Get Away

Juri Bryan

Will Elsdale

Hugo Medda

Maxime Marline

 

The Man in the High Castle; Statue of Liberty Destruction

Saber Jlassi

Igor Zanic

Nick Chamberlain

Chris Parks

 

Outstanding Compositing in a Photoreal Feature

Avengers: Infinity War; Titan

Sabine Laimer

Tim Walker

Tobias Wiesner

Massimo Pasquetti

 

First Man

Joel Delle-Vergin

Peter Farkas

Miles Lauridsen

Francesco Dell’Anna

 

Jurassic World: Fallen Kingdom

John Galloway

Enrik Pavdeja

David Nolan

Juan Espigares Enriquez

 

Welcome to Marwen

Woei Lee

Saul Galbiati

Max Besner

Thai-Son Doan

 

Outstanding Compositing in a Photoreal Episode

Altered Carbon

Jean-François Leroux

Reece Sanders

Stephen Bennett

Laraib Atta

 

The Handmaid’s Tale; June

Winston Lee

Gwen Zhang

Xi Luo

Kevin Quatman

 

Lost in Space; Impact; Crash Site Rescue

David Wahlberg

Douglas Roshamn

Sofie Ljunggren

Fredrik Lönn

 

Silicon Valley; Artificial Emotional Intelligence; Fiona

Tim Carras

Michael Eng

Shiying Li

Bill Parker

 

Outstanding Compositing in a Photoreal Commercial

Apple; Unlock

Morten Vinther

Michael Gregory

Gustavo Bellon

Rodrigo Jimenez

 

Apple; Welcome Home

Michael Ralla

Steve Drew

Alejandro Villabon

Peter Timberlake

 

Genesis; G90 Facelift

Neil Alford

Jose Caballero

Joseph Dymond

Greg Spencer

 

John Lewis; The Boy and the Piano

Kamen Markov

Pratyush Paruchuri

Kalle Kohlstrom

Daniel Benjamin

 

Outstanding Visual Effects in a Student Project

Chocolate Man

David Bellenbaum

Aleksandra Todorovic

Jörg Schmidt

Martin Boué

 

Proxima-b

Denis Krez

Tina Vest

Elias Kremer

Lukas Löffler

 

Ratatoskr

Meike Müller

Lena-Carolin Lohfink

Anno Schachner

Lisa Schachner

 

Terra Nova

Thomas Battistetti

Mélanie Geley

Mickael Le Mezo

Guillaume Hoarau