
Category Archives: VFX

Autodesk’s 3ds Max 2021 now available

Autodesk has introduced 3ds Max 2021, a new version of its 3D modeling, animation and rendering software. This latest release offers new tools designed to give 3D artists the ability to work across design visualization and media and entertainment with a fully scriptable baking experience, simpler install process, viewport and rendering improvements, and integrated Python 3 support, among other enhancements.

Highlights include:
• New texture baking experience supports physically based rendering (PBR), overrides and OSL workflows and provides an intuitive new tool set.
• Updated installer allows users to get up and running quickly and easily.
• Integrated support for Python 3 and an improved pymxs API ensure developers and technical artists can better customize pipelines (see the short script sketch after this list).
• Native integration with the Arnold Renderer V6.0 offers a high-end rendering experience out of the box, while included scripts efficiently convert V-Ray and Corona files to the Physical Material for added flexibility.
• Performance enhancements simplify the use of PBR workflows across renderers, including with realtime game engines; provide direct access to high-fidelity viewports; improve the OSL user experience; significantly accelerate file I/O; and enhance control over modeling with a new weighted normals modifier.
• Tool set advancements to SketchUp import, Substance, ProSound and FBX streamline the creation and movement of high-quality 3D assets.
• New plugin interop and improvements – from support for AMG and OSL shaders to scene converter extensions – allow for a broader range of plugins to easily hook into 3ds Max while also simplifying plugin development and installation.
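
For a sense of what the Python 3 and pymxs updates mean in practice, here is a minimal sketch of the API's flavor. It runs only inside 3ds Max's bundled Python interpreter, and the object name is purely illustrative:

    from pymxs import runtime as rt  # pymxs ships with 3ds Max

    # Create a primitive through the MAXScript runtime and rename it
    teapot = rt.Teapot(radius=10.0)
    teapot.name = "demo_teapot"

    # Ask the runtime which renderer is currently active
    print(rt.classOf(rt.renderers.current))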

Early 3ds Max 2021 adopter Eloi Andaluz Fullà, a freelance VFX artist on beta, reported, “The revamped viewport, IBL controls and persistent Ambient Occlusion speed up the client review process because I can easily share high-quality viewport images without having to wait for renders. The new bake to texture update is also a huge time-saver because we can easily modify multiple parameters at once, while other updates simplify day-to-day tasks.”

3ds Max 2021 is now available as a stand-alone subscription or with the Autodesk Media & Entertainment Collection.

Workstations Roundtable

By Randi Altman

In our Workstations Special Edition, we spoke to pros working in offline editing, visual effects and finishing about what they need technically in order to keep creating. Here in our Workstations Roundtable, we reached out to both users and those who make computers and related tools; all talk about what they need from their workstations to get the job done.

The Foundation’s Director of Engineering, John Stevens 

John Stevens

Located just across the street from the Warner Bros. lot, The Foundation provides post production picture services and workflows in HD, 2K, 4K, UHD, HDR10 and Dolby Vision HDR. They work on many episodic shows, including Black-ish, Grown-ish, Curb Your Enthusiasm and American Soul.

Do you typically buy off the shelf or custom? Both?
Both. It depends on the primary application the system will be running. Typically, we buy off-the-shelf systems that have the CPU and memory configurations we are looking for.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
There is no defined time frame. We look at every system manufacturer’s offerings, compare specs and request demo systems for testing after we have narrowed the field to a few systems.

How important is the GPU to your work?
The GPU is extremely important, as almost every application uses the GPU to allow for faster processing. A lot of applications allow for multiple GPUs, so I look for systems that will support them.

Curb Your Enthusiasm

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
What is the primary application that the system is being purchased for? Does the software vendor have a list of certified configurations? Is the application well-threaded, meaning, can the application make efficient use of multiple cores, or does a higher core clock rate make the application perform faster? How many PCI slots are available? What is the power supply capability? What’s the reputation and experience of the manufacturer?

Do you feel mobile workstations are just as powerful for your work as desktops these days?
No. Mobile systems are limited in expandability.

 

Puget Systems’ Solutions Research & Development, Matt Bach

Based in Auburn, Washington, Puget Systems specializes in high-performance, custom-built computers for media and entertainment.

Matt Bach

What is your definition of a workstation? We know there are a few definitions out there in the world.
While many people tend to focus on the hardware to define what a workstation is, to us it is really whether or not the computer is able to effectively allow you to get your work done. In order to do so, it has to be not only fast but reliable. In the past, you had to purchase very expensive “workstation-class” hardware to get the proper balance of performance and stability, but these days it is more about getting the right brands and models of parts to complement your workflow than just throwing money at the problem.

For users looking to buy a computer but are torn between off-the-shelf and building their own, what would you tell them?
The first thing I would clarify is that there are vastly different kinds of “off-the-shelf” computers. There are the systems you get from a big box store, where you have a handful of choices but no real customization options. Then there are systems from companies like us, where each system is tailor-made to match what applications you use and what you do in those applications. The sticker price on these kinds of systems might appear to be a bit higher, but in reality — because it is the right hardware for you — the actual performance you get per dollar tends to be quite a bit better.

Of course, you can build a system yourself, and in fact, many of our customers used to do exactly that. But when you are a professional trying to get your work done, most people don’t want to spend their time keeping up on the latest hardware, figuring out what exact components they should use and troubleshooting any issues that come up. Time spent fiddling with your computer is time that you could spend getting your job done. Working with a company like us that understands what it is you are doing — and how to quickly get you back up and running — can easily offset any cost of building your own system.

What questions would you suggest pros ask before deciding on the right computer for their work?
This could easily be an entire post all its own, and this is the reason why we highly encourage every customer to talk to one of our consultants — if not on the phone, then at least by email. The right configuration depends on a huge number of factors that are never quite the same from one person to the next. It includes what applications you use and what you do in those applications. For example, if you are a video editor, what resolution, fps and codec do you tend to work with? Do you do any multicam work? What about VFX or motion graphics?

Depending on what applications you use, it is often also the case that you will run into times when you have opposing “optimal” hardware. A program like After Effects prefers CPUs with high per-core performance, while Premiere Pro can benefit from a CPU with more cores. That means there is no single “best” option if you use both of those applications, so it comes down to determining which application is more likely to benefit from more performance in your own personal workflow.

This really only scratches the surface, however. There is also the need to make sure the system supports your existing peripherals (Thunderbolt, 10G networking, etc.), the physical size of the system and upgradability. Not to mention the quality of support from the system manufacturer.

How do you decide on what components to include in your systems … GPUs, for example?
We actually have an entire department (Puget Labs) that is dedicated to this exact question. Not only does hardware change very quickly, but software is constantly evolving as well. A few years back, developers were working on making their applications multi-threaded. Now, much of that dev time has switched over to GPU acceleration. And in the very near future, we expect work in AI and machine learning to be a major focus.

Keeping up with these trends — and how each individual application is keeping up with them — takes a lot of work. We do a huge amount of internal testing that we make available to the public to determine exactly how individual applications benefit from things like more CPU cores, more powerful GPUs or faster storage.

Can you talk about warranties and support? What do you offer?
As for support and warranty, our systems come with lifetime tech support and a one- to three-year parts warranty. What makes us most different from big box stores is that we understand your workflow. We do not want your tech support experience to be finger-pointing between Adobe, Microsoft and Puget Systems. Our goal is to get you up and running, regardless of the root cause, and often that means we need to be creative and work with you individually on the best solution to the problem.

 

Goldcrest Post’s Technical Director, Barbary Ahmed

Barbary Ahmed

Goldcrest Post New York, located in the heart of the bustling Meatpacking District, is a full-service post facility offering offline editing and picture and sound finishing. Recent credits include The Laundromat, Godfather of Harlem, Russian Doll, High Flying Bird, Her Smell, Sorry to Bother You, Billions and Unsane.

Do you typically buy off the shelf or custom? Both?
We do both. But for most cases, we do custom builds because color grading workstations need more power, more GPUs and a lot of I/O options.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
This is technically a long research process. We depend on our trusty vendors, and it also depends on pricing, the availability of items and how quickly we need them.

How important is the GPU to your work?
For color grading and visual effects using applications such as Autodesk’s Maya and Flame, Blackmagic’s Resolve and Adobe’s Premiere, a high-end workstation provides a smoother and faster workflow. 4K/UHD media and above can tax a computer, so having access to a top-of-the-line machine is key for us.

The importance of GPUs is that the video software mentioned above is now able to offload much of the heavy lifting onto the GPU (or even several GPUs), leaving the CPU free to do its job of delegating tasks, applications, APIs, hardware processes, I/O device requests and so on. The CPU just makes sure all the basic tasks run in harmony, while the GPU takes care of crunching the more complex and intensive computation needed by the application. That division of labor matters for all but the most basic video work, and certainly for any form of 4K.

What are the questions you ask yourself before buying a new system? And what do you do with your older systems?
There are many questions to ask here: Is this system scalable? Can we upgrade it in the future? What real change will it bring to our workflow? What are others in my industry using? Does my team like it? These are the kind of questions we start with for any job.

In terms of what to do with older systems, there are a couple things that we think about: Can we use it as a secondary system? Can we donate it? Can we turn it into an experimental box? Can we recycle it? These are the kind of questions we ask ourselves.

Do you feel mobile workstations are just as powerful for your work as desktops these days? Especially now, with the coronavirus shutdowns?
During these unprecedented times, it seems that mobile workstations are the only way to keep up with our clients’ needs. But we were innovative about it; we established the capability to conduct most picture and sound post production work remotely. Colorists, conform editors and other staff are now able to work from home or a remote site and connect to the facility’s central storage and main desktop workstations via remote collaboration software.

This allows Goldcrest to ensure theatrical and television projects remain on track while allowing clients to oversee work in as normal a manner as possible under current circumstances.

 

Dell’s M&E Strategist, Client Solutions, Matt Allard

Matt Allard

Dell Technologies helps users create, manage and deliver media through a complete and scalable IT infrastructure, including workstations, monitors, servers, shared storage, switches, virtualization solutions and more, paired with support and services.

What is Dell’s definition of a workstation? We know there are a few definitions.
One of the most important definitions is the International Data Corporation’s (IDC) definition that assesses the overall market for workstations. This definition includes several important elements:

1. Workstations should be highly configurable and include workstation-grade components, including:
a. Workstation-grade CPUs (like Intel Xeon processors)
b. Professional and discrete GPUs, like those in the Nvidia Quadro line and AMD Radeon Pro line
c. Support for ECC memory

2. Workstations must be certified with commonly used professional ISV software, like that from Adobe, Autodesk, Avid, Blackmagic and others.

3. IDC requires a brand that is dedicated and known for workstations.

Beyond the IDC’s requirements, we understand that workstation customers are seeking the utmost in performance and reliability to run the software they use every day. We feel that workstation-grade components and Dell Precision’s engineering deliver that environment. Reliability can also include the security and manageability that large enterprises expect, and our designs provide the hooks that allow IT to manage and maintain workstations across a large studio or media enterprise. Consumer PCs rarely include these commercial-grade IT capabilities.

Additionally, software and technology (such as the Dell Precision Optimizer, our Reliable Memory Technology, Dell Client Command Suite) can extend the performance, reliability and manageability on top of the hardware components in the system.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
It’s a common misconception that a computer is just a sum of its parts. It can be better to deal with a vendor that has the supply chain volume and market presence to have advantageous access during times like these, when supply constraints exist on popular CPUs and GPUs. Additionally, most professional ISV software is not qualified or certified on a set of off-the-shelf components, but on specific vendor PC models. If users want absolute confidence that their software will run optimally, using a certified/qualified platform is the best choice. Warranties are also important, but more on that in a bit.

What questions would you suggest pros ask before deciding on the right computer for their work?
The first step is to be clear about the nature of the work you do as a pro and which media and entertainment software applications you use. Your working resolution has a large bearing on the ideal configuration for the workstation. We try to make deciding easier with Dell’s Precision Workstation Advisor, which provides pros an easy way to find configuration choices based on our certification testing and interaction with our ISV partners.

Do you think we are at a time when mobile workstations are as powerful as desktops?
The reality is that it is not challenging to build a desktop configuration that is more powerful than the most powerful mobile workstation. For instance, Dell Precision fixed workstations support configurations with multiple CPUs and GPUs, and those actually require beefier power supplies, more slots and thermal designs that need more physical space than in a reasonably sized mobile.

A more appropriate question might be: Can a mobile workstation be an effective tool for M&E professionals who need to be on the road or on a shoot? And the answer to that is a resounding yes.

How do you decide on what components to include in your systems … GPUs, for example?
As mentioned above, workstations tend to be highly configurable, often with multiple options for CPUs, GPUs and other components. We work to stay at the forefront of our suppliers’ roadmap offerings and to provide a variety of options so customers can choose the right price/performance configuration for their needs. This is where clear guidance on certified systems for the ISV software a customer is using makes selecting the right configuration easier.

Can you talk about warranties and support?
An advantage of dealing with a Tier 1 workstation vendor like Dell is that pros can pick the right warranty and support level for their business, from basic hardware warranty to our ProSupport with aggressive availability and response times. All Dell Precision fixed workstations come with a three-year Dell Limited Hardware warranty, and users can opt for as many as five years. Precision mobile workstations come with a one-year warranty (except 7000 series mobile, which has three years standard), and users can opt for as many as five years’ warranty with ProSupport.

 

Performance Post’s Owner/President, Fausto Sanchez

Fausto Sanchez

Burbank’s independently owned Performance Post focuses on broadcast television work. It works with Disney, Warner Bros. and NBCUniversal. Credits include TV versions of the Guardians of the Galaxy franchise, as well as SD-to-UHD upconversion and frame rate conversions for HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

Do you typically buy off the shelf or custom? Both?
We look to the major suppliers like HP, Dell and Apple for off-the-shelf products. We also have purchased custom workstations, and we build our own.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
If we have done our homework well, our workstations can last for three to five years. This timeline is becoming shorter, though, with new technologies such as higher core count and clock speed.

In evaluating our needs, first we look at the community for best practices. We look to see what has been successful for others. I love that we can get that info and stories here on postPerspective! We look at what the main suppliers are providing. These are great if you have a lot of extra cash. For many of us, the market is always demanding and squeezing everything it can. We are no different. We have bought both preconfigured systems from the primary manufacturers as well as custom systems.

HBO’s At the Heart of Gold: Inside the USA Gymnastics Scandal.

How important is the GPU to your work?
In our editorial workflows — Avid Media Composer, Adobe Premiere, Blackmagic Resolve (for editing) — GPU use is not a big deal because these applications currently do not rely on the GPU much for basic editing. Mostly, you select the card best suited to your applications. Nvidia has been the mainstay for a long time, but AMD has gained great support, especially in the new Mac Pro workstation.

For color work or encoding, the GPU selection becomes critical. Currently, we are using the Nvidia Titan series GPUs for some of our heaviest processor-intensive workflows.

What are the questions you ask yourself before buying new systems? And what do you do with your older systems?
When buying a new system, obviously the first questions are: What is it for? Can we expand it, and by how much? What kind of support is there? These questions become key, especially if you decide to build your own custom workstation. Our old systems are often repurposed for other work; many can function in other duties for years.

Do you feel mobile workstations are just as powerful for your work as desktops these days?
We have had our eye on mobile workstations for some time. Many are extremely powerful and can find a good place for a specific purpose. There can be a few problems in this setup: additional monitor capabilities, external GPU, external mass storage connectivity. For a lot of work, mobile workstations make sense; if I do not have to connect a lot of peripherals and can work mostly self-contained or cloud-based, these can be great. In many cases you quickly learn that the keyboard, screen and battery life are not conducive to a long-term workflow. For the right workflow though, these can be great. They’re just not for us right now.

 

AMD’s Director of VFX/Media & Entertainment, James Knight

James Knight

AMD provides Threadripper and Epyc CPUs that accelerate workflows in M&E.

How does AMD describe a workstation?
Some companies have different definitions of what makes a workstation. Essentially, AMD thinks of workstations as a combination of powerful CPUs and GPUs that enable professionals to create, produce, analyze, design, visualize, simulate and investigate without having to compromise on power or workload performance to achieve their desired results. In the specific case of media and entertainment, AMD designs and tests products aligned with the workstation ecosystem to enable professionals to do much more within the same deadlines. We are giving them more time to create.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
Ultimately, professionals need to choose the best solution to meet their creative goals. We work closely with major OEMs to provide them with the best we have to offer for the market. For example, 64-core Threadripper has certainly been recognized by workstation manufacturers. System builders can offer these new CPUs to achieve great results.

What questions should pros ask before purchasing a workstation, in order to make sure they are getting the right workstation for their needs?
I typically ask professionals to focus on their pain points and how they want the new workstation to resolve those issues. More often than not, they tell me they want more time to create and to try various renderings. With an optimized workstation matched with an optimal balance of powerful CPUs and reliable GPUs, pros can achieve the results they demand over and over.

What trends have you seen happening in this space over the last couple of years?
As memory technology improves and larger models of higher resolution are created, I’ve seen user expectations increase dramatically, as has their desire to work easily with these files. The demand for reliable tools for creating, editing and producing content has been constantly growing. For example, in the case of movie mastering and encoding, AMD’s 32-core and 64-core Threadripper CPUs have exceeded expectations when working with these large files.

PFX’s Partner/VFX Supervisor, Jan Rybar

Jan Rybar

PFX is a Czech-based company focused on animation, post and visual effects. They work on international projects ranging from short films to commercials, TV series and feature films. The 110-member team works from its studios in Prague.

How often do you upgrade your workstations, and what process do you go through in finding the right one?
We upgrade the workstations themselves maybe every two or three years. We try to select good quality vendors and robust specs so we won’t be forced to replace workstations too often.

Do you also build your own workstations and renderfarms?
Not really — we have a vendor we like and buy all the hardware there. A long time ago, we found out that the reliability of HP and their Z line of workstations is what we need. So 99% of our workstations and blade renderfarms are HP.

How do your needs as a VFX house differ from a traditional post house?
It blends together a lot — it’s more about what the traditional post house specializes in. If it’s focused on animation or film, then the needs are quite similar, which means they are based more on CPU power. Lately, as we have been involved more and more in realtime engine-based workflows, state-of-the-art GPU technology is crucial. The Last Whale Singer teaser we did was created with the help of the latest GeForce RTX 2080 Ti hardware. This allowed us to work both efficiently and with the desired quality (raytracing).

Can you walk us through your typical workflow and how your workstations and their components play a part?
The workflow is quite similar to any other production: design/concept, sculpting, modeling, rigging, layout, animation, lighting/effects, rendering, compositing, color grading, etc.

The main question these days is whether the project runs in a classic animation pipeline, on a realtime engine pipeline or a hybrid. Based on this, we change our approach and adapt it to the technology. For example, when Telescope Animation works on a scene in Unreal, it requires different technology compared to a team that’s working in Maya/Houdini.

PNY’s Nvidia Quadro Product Marketing Manager, Carl Flygare

Carl Flygare

Nvidia’s Quadro RTX-powered workstations, featuring Nvidia Turing GPU architecture, allow for realtime raytracing, AI and advanced graphics capabilities for visualization pros. PNY is Nvidia’s Quadro channel partner throughout North America, Latin America, Europe and India.

How does PNY describe a workstation? Some folks have different definitions of what makes a workstation.
The traditional definition of the term comes from CAD, a system optimized for computer-aided design: a professional CPU (e.g., Xeon, Ryzen), generous DRAM capacity with ECC (error correction code), a significant amount of mass storage, a graphics board capable of running the range of pro applications a given workflow requires, and a power supply and system enclosure sufficient to handle all of the above. Markets and use cases also matter.

Contemporary M&E requires realtime cinematic-quality rendering in application viewports, with an AI denoising assist. Retiming video (e.g., from 30 fps to 120 fps) for a slow-motion effect can be done by AI, with results essentially indistinguishable from a slow-motion session on the set. A data scientist would see things differently: GPU Tensor TFLOPS enable rapid model training to meet inference accuracy requirements, GPU memory capacity holds extremely large datasets, and a CPU/GPU combination offers a balanced architectural approach to performance. With so many different markets and needs, practically speaking, a workstation is a system that allows a professional to do their best work in the least amount of time. Have the hardware address that need, and you’ve got a workstation.

For users looking to buy a computer but are torn between off the shelf and building their own, what would you tell them?
As Henry Ford famously said about the Model T: “Any customer can have a car painted any color that he wants so long as it is black.” That is the off-the-shelf approach to acquiring a workstation. Large Tier 1 OEMs offer extensive product lines and daunting Configure to Order options, but ultimately, all offer similar classes of systems. Off-the-shelf is easy; once you successfully navigate the product line and specifications maze, you order a product, and a box arrives. But building your own system is not for the faint-hearted. Pick up CPU data sheets from Intel or AMD — you can read them for days.

The same applies to GPUs. System memory is easier, but mass storage offers a dizzying array of options. HDD (hard disk drive) or SSD (solid state drive)? RAID (and if so, what kind) or no RAID? How much power supply capacity is required for stable performance? A built-from-scratch workstation can result in a dream system, but with a system of one (or a few), how well will critical applications run on it? What if an essential workflow component doesn’t behave correctly? In many instances this will leave you on your own. Do you want to buy a system to perform the work you went into business to do, or do you want to spend time maintaining a system you need to do your work?

A middle path is available. A vibrant, lithe, agile and market-solutions knowledge-based system builder community exists. Vendors like Boxx Technologies, Exxact, Rave Computer, Silverdraft Supercomputing and @Xi Computer (among others) come to mind. These companies specialize in workstations (as defined by any of the definitions discussed earlier), have deep vertical knowledge, react quickly to technological advances that provide a performance and productivity edge, and vigorously support what they sell.

What questions would you suggest pros ask before deciding on the right computer for their work?
Where is their current system lacking? How are these deficits affecting creativity and productivity? What use cases does a new system need to perform well? What other parts of my employment environment do I need to interact with, and what do they expect me to provide? These top-line questions transition to many others. What is the model or scene size I need to be able to fit into GPU memory to benefit from full GPU performance acceleration? Will marketing show up in my office or cubicle and ask for a photorealistic render even though a project is early in the design stage? Will a client want to interact with and request changes by using VR? Is a component of singular significance — the GPU — certified and supported by the ISVs that my workflow is built around? Answer these questions first, and you’ll find the remainder of the process goes much more easily. Use case first, last and always!

You guys have a relationship with Nvidia and your system-builder partners use their Nvidia GPUs in their workstations. Can you talk about that?
PNY is Nvidia’s sole authorized channel partner for Nvidia Quadro products throughout North America, Latin America, Europe, the Middle East, Africa and India. Every Quadro board is designed, tested and built by Nvidia, whether it comes from PNY, Dell, HP or Lenovo. The difference is that PNY supports Quadro in any system brand. Tier 1 OEMs only support a Quadro board’s “slot win” in systems they build. This makes PNY a much better choice for GPU upgrades — a great way to extend the life of existing workstations — or when looking for suppliers that can deliver the technical support required for a wonderful out-of-box experience with a new system. That’s true whether the workstation is custom-built or purchased through a PNY partner that specializes in delivering turnkey systems built for professionals.

Can you talk about warranties and support? What do you offer?
PNY offers support for Nvidia in any system brand. We have dedicated Nvidia Quadro technical support reps available by phone or email. PNY never asks for a credit card number before offering product or technical support. We also have full access to Nvidia product and technical specialists should escalation be necessary – and direct access to the same Nvidia bug reporting system used by Nvidia employees around the world.

Finally, what trends do you see in the workstation market currently?
First the good: Nvidia Quadro RTX has enabled a workstation renaissance. It’s driving innovation for design, visualization and data science professionals across all major market segments. An entirely new class of product — the data science workstation — has been developed. Quadro RTX in the data centers and virtual GPU technology can bring the benefits of Quadro RTX to many users while protecting essential intellectual property. This trend toward workstation specialization by use case offers buyers more choices that better fit their specific criteria. Workstations — however defined — have never been more relevant or central to creative pros across the globe. Another good trend is the advent of true mobile workstations and notebooks, including thin and light systems, with up to Quadro RTX 5000 class GPUs.

The bad? With choice comes confusion. So many to choose from. Which best meets my needs? Companies with large IT staff can navigate this maze, but what about small and medium businesses? They can find the expertise necessary to make the right choice with PNY’s extensive portfolio of systems builders. For that matter, enterprises can find solutions built from the chassis up to support a given use case. Workstations are better than ever before and purchasing one can be easier than ever as well.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


Maxon to live-stream NAB news and artist presentation

With the Las Vegas NAB Show now cancelled, Maxon will be hosting a virtual NAB presence on C4DLive.com featuring a lineup of working artists. Starting on Monday, April 20, and running through Thursday, April 23, these pros — who were originally slated to appear in Las Vegas — will share production tips, techniques and inspiration and talk about working with Maxon’s Cinema 4D, Red Giant and Redshift product lines.

For over a decade, Maxon has supplemented its physical booth presence with live-streaming presentations. This has allowed show attendees, and those unable to attend events in person, to benefit from demos, technology updates and interaction with the guest artists in real time. First up will be CEO Dave McGavran, who will talk about Maxon’s latest news and recent merger with Red Giant.

In terms of artists, Penelope Nederlander will break down her latest end credit animation for Birds of Prey; filmmaker Seth Worley will walk through some of the visual effects shots from his latest short film, Darker Colors; Doug Appleton will share the creative process behind creating the technology for Spider-Man: Far From Home; Jonathan Winbush will demo importing C4D scenes into Unreal Engine for rendering or VR/AR output; and Veronica Falconieri Hays will share how she builds cellular landscapes and molecular structures in order to convey complex scientific stories.

The line-up of artists also includes Mike “Beeple” Winkelmann, Stu Maschwitz, EJ Hassenfratz, Chris Schmidt, Angie Feret, Kelcey Steele, Daniel “Hashi” Hashimoto, Dan Pierse, Andy Needham, Saida Saetgareeva and many more.

Additional presenters’ info and a live streaming schedule will be available at C4DLive.com.

Main Image: (L-R) Saida Saetgareeva and Penelope Nederlander


Workstations and Visual Effects

By Karen Moltenbrey

A couple of decades or so ago, studios needed the exceptional power of a machine offered by the likes of SGI for complex visual effects. Non-specialized PCs simply were not cut out for this type of work. But then a sea change occurred, and suddenly those big blue and purple boxes were being replaced with options in the form of workstations from companies such as Sun, DEC, HP, IBM and others, which offered users tremendous value for something that could conveniently fit on a desktop.

Those hardware companies began to duke it out, leading to the demise of some and the rise of others. But the big winners in this early war for 3D content creators’ business were the users. With a price point that was affordable, these workstations were embraced by facilities big and small, leading to an explosion of 3D content.

Here, we look at two VFX facilities that have taken different approaches when it comes to selecting workstations for their digital artists. NYC’s Bonfire, a boutique studio, uses a range of Boxx and custom-built machines, along with iMacs. Meanwhile, Digital Domain, an Oscar-winning VFX powerhouse, recently set up a new site in Montreal outfitted with Dell workstations.

Dave Dimeola

Bonfire
Bonfire is a relative newcomer to the industry, founded three years ago by Flame artist Brendan O’Neil, who teamed up with Dave Dimeola to get the venture off the ground. Their goal was to create a boutique-style base for those working in post production. The front end would comprise a beautiful, comfortable space where Autodesk Flame and motion graphics artists could work and interact with clients in suites within a townhouse setting, while the back end would consist of a cloud-based pipeline.

“We figured that if we combined these two things — the traditional boutique shop with the client experience and the power of a cloud-based pipeline — we’d have something,” says Dimeola, whose prior experience in leveraging the cloud proved invaluable in this regard.

Soon after, Peter Corbett, who had sold his Click 3X creative digital studio in 2018, agreed to be part of Bonfire’s advisory board. Believing Dimeola and O’Neil were on to something, Corbett came aboard as a partner and brought “some essential talent” into the company as well. Currently, Bonfire has 11 people on staff, with great talent across the gamut of production and post — from CG and creative directing to producing and more. One of the first key people who Corbett brought in was managing director Jason Mayo.

And thanks to the company’s unconventional production pipeline, it is able to expand its services with remote teams as needed. (Currently, Bonfire has more than 3,000 vetted artists in its network, with a core group of around 150 who are constantly on rotation for work.)

“It’s a game-changer in the current climate,” Dimeola says of the company’s setup. The group is performing traditional post work for advertising agencies and direct to client. “We’re doing content, commercials, social content and brand films,” says Dimeola, “anything that requires storytelling and visual communication and is design-centric.” One of the studio’s key offerings is now color grading, handled by colorist Dario Bigi. In terms of visual effects, Dimeola notes that Bonfire can indeed make animals talk or blow things up, although the VFX work Bonfire does “is more seamless, artful, abstract and weird. We get into all those facets of creation.”

Bonfire has approximately 10 workstations at its location in New York and is expanding into the second floor. The company just ordered a new set of customized PCs with Nvidia GeForce RTX 2070 Super graphics cards and some new ultra-powerful Apple iMacs, which will be used for motion graphics work and editing. The core software running on the machines includes the major industry packages: Autodesk’s Maya, Maxon’s Cinema 4D, Side Effects’ Houdini, Autodesk’s Flame, Foundry’s Nuke and Adobe’s Creative Suite, in addition to Thinkbox’s Krakatoa for particle rendering and Autodesk’s 3ds Max for architectural work. Cinema 4D and the Adobe software will run on the new iMacs, while the more render-intensive projects within Maya, Houdini and Nuke will run on the PC network.

As Dimeola points out, workstations have taken some interesting turns in the past two to three years, and Bonfire has embraced the move from CPU-based systems to ones that are GPU-based. As such, the company’s PC workstations — a mix of Boxx and custom-built machines — pair an AMD Threadripper 2950X CPU and significant memory with a pair of powerful Asus GeForce RTX 2080 Ti video cards. He attributes Bonfire’s move in processing to the changing needs of CGI rendering and lighting, which are increasingly relying on GPU power.

“We still use CPU power, however, because we feel some of the best lighting is still being done in the CPU with software like [Autodesk’s] Arnold,” Dimeola contends. “But we’re flexible enough to be using GPU-based lighting, like Otoy’s OctaneRender and Maxon’s Redshift for jobs that need to look great but also move very quickly through the pipeline. Some shops own one way of rendering, but we really keep a flexible pipeline so we can pivot and render just about any way we want based on the creative, the look, the schedule. It has to be very flexible in order for us to be efficient.”

Media lives on the SAN, a communal server that is partitioned into segments: one for CGI, another for editing (Flame) and a third for color (Blackmagic DaVinci Resolve). “We partitioned a cloud section for the server, which allows us to have complete control over how we sync media with an external team,” explains Dimeola. “That’s a big part of how we share, collaborate and move assets quickly with a team inside and outside, and how we scale for projects. This is unique for Bonfire. It is the innovative part of the post production that I don’t think any other shops are really talking about at this time.”

In addition to the local machines and software, Bonfire also runs applications on virtual machines in the cloud. The key, says Dimeola, is knowing how to create harmony between the internal and external infrastructures. The backbone is built on Amazon Web Services (AWS) and Google Cloud Platform (GCP) and functions much the same way as its internal pipeline does. A proprietary project tracker built by Bonfire enables the teams to manage shots and assets; it also has an array of services and tools that help the staff efficiently manage projects that vary in complexity and scale.
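
Bonfire’s tracker and sync tooling are proprietary, but the general idea of pushing finished media to a shared cloud bucket so an external team can pull it can be sketched in a few lines of Python. This is a minimal illustration, not Bonfire’s code; the bucket name, project names and paths are hypothetical, and it assumes AWS credentials are already configured:

    import boto3  # AWS SDK for Python

    BUCKET = "example-post-shared-media"  # hypothetical shared bucket
    s3 = boto3.client("s3")

    def push_render(local_path: str, project: str, shot: str) -> None:
        """Upload a finished render so remote artists can pull it."""
        key = f"{project}/{shot}/{local_path.rsplit('/', 1)[-1]}"
        s3.upload_file(local_path, BUCKET, key)

    push_render("renders/shot010_v003.exr", "example_project", "shot010")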

Brooklyn Nets

“There’s no single piece of our pipeline that’s so innovative; rather, it’s the way that we’ve configured it between our talent and infrastructure,” says Dimeola, noting that in addition to being able to take on big projects, the company is able to get its clients what they need in real time and make complete changes literally overnight. Dimeola recalls a recent project for Google requiring intensive CGI fluid simulations. The team sat with the client one day to work out the direction and was able to post fresh edits, which included rendered shots, for the client the very next morning. “[In a traditional setup], that never would have been possible,” he points out.

However, getting the best deal on the cloud services requires additional work. “We play that market like the stock market, where we’re finding the best deals and configurations based on our needs at the time,” Dimeola says, and the result is an exponential increase in workflow. “You can ramp up a team and be rendering and working 24/7 because you’re using people in different time zones, and you’re never down a machine for rendering and working.”

Best of all, the setup goes unnoticed by the customer. “The client doesn’t feel or see anything different,” says Dimeola. That is, with one exception: “a dramatic change in the cost of doing the work, particularly if they are requiring a lot of speed.”

Digital Domain Montreal
A longtime creative and technological force in the visual effects industry, Digital Domain has crafted a range of work spanning feature films, video games, commercials and virtual reality experiences. With global headquarters in LA, plus locations in Vancouver, Beijing, Shanghai and elsewhere around the globe, the studio has been the driving force behind many memorable and cutting-edge projects, including the Oscar-winning The Curious Case of Benjamin Button and more. In fact, Digital Domain is known for its technological prowess within visual effects, particularly in the area of realistic digital humans, recently recreating a photoreal 3D version of Martin Luther King Jr. for a groundbreaking immersive project.

Michael Quan

A year ago, Digital Domain expanded its North American footprint by opening an office in Montreal, which celebrated its grand opening this past February. The office has approximately 100 employees, with plans to expand in the future. Most of the work produced by Digital Domain is shared by its five worldwide studios, and that trend will continue with Digital Domain Montreal, particularly with the LA and Vancouver offices; it also will tackle regional projects, focusing mostly on features and episodic content.

Setting up the Montreal site’s infrastructure fell to Digital Domain’s internal IT department, including senior systems engineer Michael Quan, who helped outfit the facility with the same classes of machines that the Los Angeles and Vancouver locations use: the Dell Precision R7920 and R7910 rack workstation PCs. “All the studios share common configuration specifications,” he notes. “Having a common specification makes it tremendously easy to move resources around when necessary.”

In fact, the majority of the machines were purchased around the third quarter of 2019. Prior to that, the location was started up with resources from the company’s other sites, and since they all use a common configuration, doing so did not present an issue.

Quan notes that the studio is able to cover all aspects of typical VFX production, such as modeling, rigging, animation, lighting, rotoscoping, texture painting, compositing and so forth, using the machines. And with some additional hardware, the office can also leverage those workstations for dailies review, he adds. As for the software, Digital Domain runs the typical programs: Autodesk’s Maya, Foundry’s Mari and Nuke, Chaos’ V-Ray, Maxon’s Redshift, Adobe’s Photoshop and so on, in addition to proprietary software.

Terminator: Dark Fate

As Quan points out, Digital Domain has specific requirements for its workstations, aside from the general CPU, RAM and hard drive specs. The machines must be able to handle the GPUs required by Digital Domain along with additional support devices. While that might seem obvious, once such a requirement comes into play, it reduces the number of systems available for evaluation. Furthermore, the workstations must be rack-mountable and of a “reasonable” size (2U) to fit within the data center as opposed to deskside. Also, since the workstations are deployed in the data center, they must be manageable remotely.

“Preferably, it is commodity hardware, meaning using a vendor that is stable, has a relatively large market share and isn’t using some exotic technology,” Quan says, “so if necessary, we could source from secondary markets.” Unfortunately, the company learned this the hard way in the past by using a vendor that implemented custom power and management hardware; the vendor exited the market, leaving the studio without an option for repair and no secondary market to source defective parts.

Just how long Digital Domain retains its workstations depends on their performance effectiveness: If an artist can no longer work due to a resource inefficiency, Quan says, then a new round of hardware specification is initialized.

“The workstations we use are multi-processor-based, have a relatively high amount of memory and are capable of running the higher-performing professional GPUs that our work requires,” he says. “These days, ‘workstations’ could mean what would normally be called gaming rigs but with more memory, a top-end GPU and a high-clock-speed single processor. It just depends on what software will be used and the hardware configuration that is optimized for that.”

Lost in Space, Season 2

As Quan points out, graphics workstations have evolved to where they have the same capabilities as some low- to mid-class servers. “For example, the Dell R7910/R7920 that we are using definitely could be used as servers, since they share the same hardware capability as their server class,” he says. “It used to be that if you wanted performance, you might have to sacrifice manageability and footprint. Now there are systems deployed with one, eight and 10 GPU configurations in a relatively small footprint, which is a fully remotely manageable system in one of our data centers.” He predicts that workstations are evolving to a point where they will just be a specification. “In the near future, it will just be an abstract for us. Gone will be the days of one workstation equating to one physical system.”

According to Quan, the Montreal studio is still ramping up and has several major projects on the horizon, including feature films from Marvel, Sony, 20th Century Studios and more. Some of Digital Domain’s more recent work includes Avengers: Endgame, Lost in Space (Season 2), Terminator: Dark Fate and several others. Globally, its New Media and Digital Humans groups are doing incredible things, he notes, and the Ads/Games Group is producing some exceptional work as well.

“The workstations at Digital Domain have constantly evolved. We went from generic white boxes to Tier 1 systems, back to white boxes, and now to a more sophisticated Tier 1 data center-targeted ecosystem. With the evolutionary steps we are taking, we are iterating to a more efficient management of these resources,” Quan says. “One of the great advantages of having the workstations placed in a remote location is the security aspects. And on a more human level, the reduction of the fan noises and the beeps all those workstations would have created in the artist locations is notable.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.


VFX studio One Of Us adds CTO Benoit Leveau

Veteran post technologist Benoit Leveau has joined London’s One of Us as CTO. The studio, which is in its 16th year, employs 200 VFX artists.

Leveau, who joins One of Us from Milk VFX, has been in the industry for 18 years, starting out in his native France before moving to MPC in London. He then joined Prime Focus, integrating the company’s Vancouver and Mumbai pipelines with London. In 2013, he joined Milk in its opening year as head of pipeline. He helped to build that department and later led the development of Milk’s cloud rendering system.

The studio, which depends on what it calls “the efficient use of existing technology and the timely adoption of new technology,” says Leveau’s knowledge and experience will ensure that “their artists’ creativity has the technical foundation which allows it to flourish.”


Maxon plugin allows for integration of Cinema 4D assets into Unity


Maxon is now a Unity Technologies Verified Solutions Partner and is distributing a plugin for Unity called Cineware by Maxon. The new plugin provides developers and creatives with seamless integration of Cinema 4D assets into Unity. Artists can easily create models and animations in Cinema 4D for use in realtime 3D (RT3D), interactive 2D, 3D, VR and AR experiences. The Cineware by Maxon plugin is now available free of charge on the Unity Asset Store.

The plugin is compatible with Cinema 4D Release 21, the latest version of the software, and Unity’s latest release, 2019.3. The plugin does not require a license of Cinema 4D as long as Cinema 4D scenes have been “Saved for Cineware.” By default, imported assets will appear relative to the asset folder or imported asset. The plugin also supports user-defined folder hierarchies.

Cineware by Maxon currently supports the following.

Geometry:
• Vertex Position, Normals, UV, Skinning Weight, Color
• Skin and Binding Rig
• Pose Morphs as Blend Shapes
• Lightmap UV2 Generation on Import

Materials:
• PBR Reflectance Channel Materials conversion
• Albedo/Metal/Rough
• Normal Map
• Bump Map
• Emission

Animated Materials:
• Color including Transparency
• Metalness
• Roughness
• Emission Intensity, Color
• Alpha Cutout Threshold

Lighting:
• Spot, Directional, Point
• Animated properties supported:
• Cone
• Intensity
• Color

Cameras:
• Animated properties
• Field of View (FOV)

Main Image: Courtesy of Cornelius Dämmrich


Director Vincent Lin discusses colorful Seagram’s Escapes spot

By Randi Altman

Valiant Pictures, a New York-based production house, recently produced a commercial spot featuring The Bachelor/Bachelorette host Chris Harrison promoting Seagram’s Escapes and its line of alcohol-based fruit drinks. A new addition to the product line is Tropical Rosé, which was co-developed by Harrison and contains natural passion fruit, dragon fruit and rosé flavors.

Valiant’s Vincent Lin directed the piece, which features Harrison in a tropical-looking room — brightened with sunny pinks and yellows thanks to NYC’s Nice Shoes — describing the rosé and signing off with the Seagram’s Escapes brand slogan, “Keep it colorful!”

Here, director Lin and his DP, Alexander Chinnici, talk about the project’s conception, shoot and post.

How early did you get involved? Did Valiant act as the creative agency on this spot?
Valiant has a long-standing history with the Seagram’s Escapes brand team, and we were fortunate enough to have the opportunity to brainstorm a few ideas with them early on for their launch of Seagram’s Escapes Tropical Rosé with Chris Harrison. The creative concept was developed by Valiant’s in-house creative agency, headed by creative directors Nicole Zizila and Steven Zizila, and me. Seagram’s was very instrumental in the creative for the project, and we collaborated to make sure it felt fresh and new — like an elevated evolution of their “Keep It Colorful” campaign rather than a replacement.

Clearly, it’s meant to have a tropical vibe. Was it shot greenscreen?
We had considered shooting this greenscreen, which would have opened up some interesting options but also posed some challenges. What was important for this campaign creatively was to seamlessly take Chris Harrison to the magical world of Seagram’s Escapes Tropical Rosé. A practical approach was chosen so it didn’t feel too “out of this world,” and the live action still felt real and relatable. We had considered putting Chris in a tropical location — either in greenscreen or on location — but we really wanted to play to Chris’ personality and strengths and have him lead us to this world, rather than throw him into it. Plus, they didn’t sign off on letting us film in the Maldives. I tried (smiles).

L-R: Vincent Lin and Alex Chinnici

What was the spot shot on?
I worked with the very talented DP Alex Chinnici, who recommended shooting on the ARRI Alexa for many reasons. I’ll let Alex answer this one.

Alex Chinnici: Some DPs would likely answer with something sexier, like, “I love the look!” But that ignores a lot of the technical realities available to us these days. A lot of these cameras are wonderful. I can manipulate the look, so I choose a camera based on other reasons. Without an on-set live, color-capable DIT, I had to rely on the default LUT seen on set and through post. The Alexa’s default LUT is my preference among the digital cameras. For lighting and everyone on the set, we start in a wonderful place right off the bat. Post houses also know it so well, along with colorists and VFX. Knowing our limitations and expecting not to be entirely involved, I prefer giving these departments the best image/file possible.

Inherently, the color, highlight retention and skin tone are wonderful right off the bat without having to bend over backward for anyone. With the Alexa, you end up being much closer to the end rather than having to jump through hoops to get there like you would with some other cameras. Lastly, the reliability is key. With the little time that we had, and a celebrity talent, I would never put a production through the risk of some new tech. Being in a studio, we had full control but still, I’d rather start in a place of success and only make it better from there.

What about the lenses?
Chinnici: I chose the Zeiss Master Primes for similar reasons. While sharp, they are not overbearing. With some mild filtration and very soft and controlled lighting, I can adjust that in other ways. Plus, I know that post will beautify anything that needs it; giving them a clean, sharp image (especially considering the seltzer can) is key.

I shot at a deeper stop to ensure that the lenses are even cleaner and sharper, although the Master Primes do hold up very well wide open. I also wanted the Seagram’s can to be in focus as much as possible and for us to be able to see the set behind Chris Harrison, as opposed to a very shallow depth of field. I also wanted to ensure little to no flares, solid contrast, sharpness across the field and no surprises.

Thanks, Alex. Back to you, Vincent. How did you work with Alex to get the right look?
There was a lot of back and forth between Alex and me, and we pulled references to discuss. Ultimately, we knew the two most important things were to highlight Chris Harrison and the product. We also knew we wanted the spot to feel like a progression from the brand’s previous work. We decided the best way to do this was to introduce some dimensionality by giving the set depth with lighting, while keeping a clean, polished and sophisticated aesthetic. We also introduced a bit of camera movement to further pull the audience in and composed the shots in a way that all the focus would be on Chris Harrison to bring us into that vibrant CG world.

How did you work with Nice Shoes colorist Chris Ryan to make sure the look stayed on point? 
Nice Shoes is always one of our preferred partners, and Chris Ryan was perfect for the job. Our creatives, Nicole and Steven, had worked with him a number of times. As with all jobs, there are certain challenges and limitations, and we knew we had to work fast. Chris is not only detail oriented, creative and a wizard with color correction, but also able to work efficiently.

He worked on a FilmLight Baselight system off the Alexa raw files. The color grading really brought out the saturation to further reinforce the brand’s slogan, “Keep It Colorful,” but also to manage the highlights and whites so it felt inviting and bright throughout, but not at all sterile.

What about the VFX? Can you talk about how that was accomplished? 
Much like the camera work, we wanted to continue giving dimensionality to the spot by having depth in each of our CG shots. Not only depth in space but also in movement and choreography. We wanted the CG world to feel full of life and vibrant in order to highlight key elements of the beverage — the flavors, dragonfruit and passionfruit — and give it a sense of motion that draws you in while making you believe there’s a world outside of it. We wanted the hero to shine in the center and the animation to play out as if a kaleidoscope or tornado was pulling you in closer and closer.

We sought the help of creative production studio Taylor James to build the CG elements. We chose to work with a core of 3ds Max artists who could do a range of tasks using Autodesk 3ds Max and Chaos Group’s V-Ray (we also use Maya and Arnold), and we used Foundry Nuke to composite all of the shots and integrate the CGI into the footage.

One of the biggest challenges was making sure the live action felt connected to the CG world, but with each still having its own personality. There is a modern and clean feel to these spots that we wanted to uphold while still making it feel fun and playful with colors and movement. There were definitely a few earlier versions that we went a bit crazy with and had to scale down a bit.

Does a lot of your work feature live action and visual effects combined?
I think of VFX like any film technique: It’s simply a tool for directors and creatives to use. The most essential thing is to understand the brand, if it’s a commercial, and to understand the story you are trying to tell. I’ve been fortunate to do a number of spots that involve live-action and VFX now, but truth be told, VFX almost always sneaks its way in these days.

Even if I do a practical effect, there are limitless possibilities in post production and VFX. Anything from simple cleanup to enhancing, compositing, set building and extending — it’s all possible. It’d be foolish not to consider it as a viable tool. Now, that’s not to say you should rely solely on VFX to fix problems, but if there’s a way it can improve your work, definitely use it. For this particular project, obviously, the CG was crucial to let us really be immersed in a magical world at the level of realism and proximity we desired.

Anything challenging about this spot that you’d like to share?
Chris Harrison was terrible to work with and refused to wear a shirt for some reason … I’m just kidding! Chris was one of the most professional, humblest and kindest celebrity talents that I’ve had the pleasure to work with. This wasn’t a simple endorsement for him; he actually did work closely with Seagram’s Escapes over several months to create and flavor-test the Tropical Rosé beverage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


The-Artery sees red, creates VFX for Huawei’s AppGallery

The-Artery recently worked on a global campaign with agency LH in Israel for consumer electronics brand Huawei’s official app distribution platform, AppGallery.

The campaign — set to an original musical track called Explore It by artist Tomer Biran — is meant to show the AppGallery not just as a mobile app store but as a gateway to an endless world of digital content that comes with data protection and privacy.

Each scene features the platform’s signature red square logo, shown in a variety of creative ways thanks to The-Artery’s visual effects work. These include floating Tetris-like cubes that change with the beat of the music, shifts in camera focus, red-seated subway cars with a floating red cube and more.

“Director Eli Sverdlov, editor Noam Weissman and executive producer Kobi Hoffman all have distinct artistic processes that are unforgiving to conventional storytelling,” explains founder/executive creative director Vico Sharabani. “We had ongoing conversations about how to create a deeper connection between the brand and audiences. The agency, LH, gave us the freedom to really explore the fun, convenience and security behind downloading apps on the Huawei AppGallery.”

Filming took place across the globe, in Kiev, Ukraine, via production company Jiminy Creative Tel Aviv, while editing, design, animation, visual effects and color grading were all done under one roof in The-Artery’s New York studio. The entire production was completed in only 16 days.

The studio used Autodesk’s Flame and 3ds Max, SideFX Houdini and Adobe’s After Effects and Photoshop for the visual effects and graphics. Colorist Steve Picano called on Blackmagic’s DaVinci Resolve, and Asaf Bitton provided sound design.


Foundry Nuke 12.1 offers upgrades across product line

Foundry has released Nuke 12.1, with UI enhancements and tool improvements across the entire Nuke family. The largest update to Blink and BlinkScript in recent years improves Cara VR node performance and introduces new tools for developers, while extended functionality in the timeline-based applications speeds up and enriches artist and team review.

Here are the upgrade highlights:
– New Shuffle node updates the classic checkboxes with an artist-friendly, node-based UI that supports up to eight channels per layer (Nuke’s limit) and consistent channel ordering, offering a more robust tool set at the heart of Nuke’s multi-channel workflow.
– Lens Distortion Workflow improvements: The LensDistortion node in NukeX is updated to have a more intuitive workflow and UI, making it easier and quicker to access the faster and more accurate algorithms and expanded options introduced in Nuke 11.
– Blink and BlinkScript improvements: Nuke’s architecture for GPU-accelerated nodes and the associated API can now store data on the GPU between operations, resulting in what Foundry says are “dramatic performance improvements to chains of nodes with GPU caching enabled.” This new functionality is available to developers using BlinkScript (see the sketch after this list), along with bug fixes and a debug print out on Linux.
– Cara VR GPU performance improvements: The Cara VR nodes in NukeX have been updated to take advantage of the new GPU-caching functionality in Blink, offering performance improvements in viewer processing and rendering when using chains of these nodes together. Foundry’s internal tests on production projects show rendering time that’s up to 2.4 times faster.
– Updated Nuke Spherical Transform and Bilateral: The Cara VR versions of the Spherical Transform and Bilateral nodes have been merged with the Nuke versions of these nodes, adding increased functionality and GPU support in Nuke. Both nodes take advantage of the GPU performance improvements added in Nuke 12.1. They are now available in Nuke and no longer require a NukeX license.
– New ParticleBlinkScript node: NukeX now includes a new ParticleBlinkScript node, allowing developers to write BlinkScripts that operate on particles. Nuke 12.1 ships with more than 15 new gizmos, offering a starting point for artists who work with particle effects and developers looking to use BlinkScript.
– QuickTime audio and surround sound support: Nuke Studio, Hiero and HieroPlayer now support multi-channel audio. Artists can now import .mov containers holding audio on Linux and Windows without needing to extract and import the audio as a separate .wav file.

– Faster HieroPlayer launch and Nuke Flipbook integration: Foundry says new instances of HieroPlayer launch 1.2 times faster on Windows and up to 1.5 times faster on Linux in internal tests, improving the experience for artists using HieroPlayer for review. With Nuke 12.1, artists can also use HieroPlayer as the Flipbook tool for Nuke and NukeX, giving them more control when comparing different versions of their work in progress.
– High DPI Windows and Linux: UI scaling when using high-resolution monitors is now available on Windows and Linux, bringing all platforms in line with high-resolution display support added for macOS in Nuke 12.0 v1.
– Extended ARRI camera support: Nuke 12.1 adds support for ARRI formats, including Codex HDE .arx files, ProRes MXFs and the popular Alexa Mini LF. Foundry also says there are performance gains when debayering footage on CUDA GPUs, and there’s an SDK update.
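
For developers, the practical takeaway from the Blink changes is that chains of GPU-accelerated nodes can now keep intermediate results resident on the GPU instead of round-tripping through main memory. As a minimal, illustrative sketch of the workflow (the kernel below is a stock gain example rather than Foundry code, and the node and knob names follow Nuke’s documented Python API), a BlinkScript node can be created and fed a kernel from the Script Editor:

import nuke

# A simple pixel-wise Blink kernel (illustrative only): multiply the
# input by a user-controlled gain. Chains of nodes like this are what
# benefit from the new GPU caching in Nuke 12.1.
GAIN_KERNEL = """
kernel SimpleGain : ImageComputationKernel<ePixelWise>
{
  Image<eRead, eAccessPoint, eEdgeClamped> src;  // input image
  Image<eWrite> dst;                             // output image

  param:
    float gain;  // exposed as a knob on the node

  void define() {
    defineParam(gain, "gain", 1.2f);
  }

  void process() {
    dst() = src() * gain;  // per-pixel multiply, runs on the GPU
  }
};
"""

node = nuke.nodes.BlinkScript(name='SimpleGain')
node['kernelSource'].setValue(GAIN_KERNEL)
node['recompile'].execute()  # press the node's Recompile button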

The Call of the Wild director Chris Sanders on combining live-action, VFX

By Iain Blair

The Fox family film The Call of the Wild, based on the Jack London tale, tells the story of a big-hearted dog named Buck who is stolen from his California home and transported to the Canadian Yukon during the Gold Rush. Director Chris Sanders called on the latest visual effects and animation technology to bring the animals in the film to life. The film stars Harrison Ford and is based on a screenplay by Michael Green.

Sanders’ crew included two-time Oscar–winning cinematographer Janusz Kaminski; production designer Stefan Dechant; editors William Hoy, ACE, and David Heinz; composer John Powell; and visual effects supervisor Erik Nash.

I spoke with Sanders — who has helmed the animated films Lilo & Stitch, The Croods and How to Train Your Dragon — about making the film, which features a ton of visual effects.

You’ve had a very successful career in animation but wasn’t this a very ambitious project to take on for your live-action debut?
It was. It’s a big story, but I felt comfortable because it has such a huge animated element, and I felt I could bring a lot to the party. I also felt up to the task of learning — and having such an amazing crew made all of that as easy as it could possibly be.

Chris Sanders on set.

What sort of film did you set out to make?
As true a version as we could tell in a family-friendly way. No one’s ever tried to do the whole story. This is the first time. Before, people just took the last 30 pages of the novel and focused on the relationship between Buck and John Thornton, played by Harrison. And that makes perfect sense, but what you miss is the whole origin story of how they end up together — how Buck has to learn to become a sled dog, how he meets the wolves and joins their world. I loved all that, and also all the animation needed to bring it all alive.

How early on did you start integrating post and all the visual effects?
Right away, and we began with previs.

Your animation background must have helped with all the previs needed on this. Did you do a lot of previs, and what was the most demanding sequence?
We did a ton. In animation it’s called layout, a rough version, and on this we didn’t arrive on set without having explored the sequence many times in previs. It helped us place the cameras and block it all, and we also improvised and invented on set. But previs was a huge help with any heavy VFX element, like when Thornton’s going down river. We had real canoes in a river in Canada with inertial measurement devices and inertial recorders, and that was the most extensive recording we had to do. Later in post, we had to replace the stuntman in the canoe with Thornton and Buck in an identical canoe with identical movements. That was so intensive.

 

How was it working with Harrison Ford?
The devotion to his craft and professionalism… he really made me understand what “preparing for a role” really means, and he really focused on Thornton’s back story. The scene where he writes the letter to his wife? Harrison dictated all of that to me and I just wrote it down on top of the script. He invented all that. He did that quite a few times. He made the whole experience exciting and easy.

The film has a sort of retro look. Talk about working with DP Janusz Kaminski.
We talked about the look a lot, and we both wanted to evoke those old Disney films we saw as kids — something very rich with a magical storybook feel to it. We storyboarded a lot of the film, and I used all the skills I’d learned in animation. I’d see sequences a certain way, draw it out, and sometimes we’d keep them and cut them into editorial, which is exactly what you do in animation.

How tough was the shoot? It must have been quite a change of pace for you.
You’re right. It was about 50 days, and it was extremely arduous. It’s the hardest thing I’ve ever done physically, and I was not fully prepared for how exhausted you get — and there’s no time to rest. I’d be driving to set by 4:30am every day, and we’d be shooting by 6am. And we weren’t even in the Yukon — we shot here in California, a mixture of locations doubling for the Yukon and stage work.

 

Where did you post?
All on the Fox lot, and MPC Montreal did all the VFX. We cut it in relatively small offices. I’m so used to post, as all animation is basically post. I wish it was faster, but you can’t rush it.

You had two editors — William Hoy and David Heinz. How did that work?
We sent them dailies and they divided up the work since we had so much material. Having two great voices is great, as long as everyone’s making the same movie.

What were the big editing challenges?
The creative process in editorial is very different from animation, and I was floored by how malleable this thing was. I wasn’t prepared for that. You could change a scene completely in editorial, and I was blown away at what they could accomplish. It took a long time because we came back with over three hours of material in the first assembly, and we had to crush that down to 90 minutes. So we had to lose a huge amount, and what we kept had to be really condensed, and the narrative would shift a lot. We’d take comedic bits and make them more serious and vice versa.

Visual effects play a key role. Can you talk about working on them with VFX supervisor Erik Nash?
I love working with VFX, and they were huge in this. I believe there are less than 30 shots in the whole film that don’t have some VFX. And apart from creating Buck and most of the other dogs and animals, we had some very complex visual effects scenes, like the avalanche and the sledding sequence.

L-R: Director Chris Sanders and writer Iain Blair

We had VFX people on set at all times. Erik was always there supervising the reference. He’d also advise us on camera angles now and then, and we’d work very closely with him all the time. The cameras were hooked up to send data to our recording units so that we always knew what lens was on what camera at what focal length and aperture, so later the VFX team knew exactly how to lens the scenes with all the set extensions and how to light them.

The music and sound also play a key role, especially for Buck, right?
Yes, because music becomes Buck’s voice. The dogs don’t talk like they do in Lion King, so it was critical. John Powell wrote a beautiful score that we recorded on the Newman Stage at Fox, and then we mixed at 5 Cat Studios.

Where did you do the DI, and how important is it to you?
We did it at Technicolor with colorist Mike Hatzer, and I’m pretty involved. Janusz did the first pass and set the table, and then we fine-tuned it, and I’m very happy with the rich look we got.

Do you want to direct another live-action film?
Yes. I’m much more comfortable with the idea now that I know what goes into it. It’s a challenge, but a welcome one.

What’s next?
I’m looking at all sorts of projects, and I love the idea of doing another hybrid like this.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: BlueBolt VFX supervisor Richard Frazer

“If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.”

Name: Richard Frazer

Company: London’s BlueBolt

Can you describe your company?
For the last four years, I’ve worked at BlueBolt, a Soho-based visual effects company in London. We work on high-end TV and feature films, and our main area of specialty is creating CG environments and populating them. BlueBolt is a privately owned company run by two women, which is pretty rare. They believe in nurturing good talent and training them up to help break through the glass ceiling, if an artist is up for it.

What’s your job title?
I joined as a lead compositor with a view to becoming a 2D supervisor, and now I am one of the studio’s main core VFX supervisors.

What does that entail?
It means I oversee all of the visual effects work for a specific TV show or movie — from script stage to final delivery. That includes working with the director and DP in preproduction to determine what they would like to depict on the screen. We then work out what is possible to shoot practically, or if we need to use visual effects to help out.

I’ll then often be on the set during the shoot to make sure we correctly capture everything we need for post work. I’ll work with the VFX producer to calculate the costs and time scales of the VFX work. Finally, I will creatively lead our team of talented artists to create those rendered images and make sure it all fits in with the show in a visually seamless way.

What would surprise people the most about what falls under that title?
The staggering amount of time and effort involved by many talented people to create something that an audience should be totally unaware exists. If we have done our job well, the viewers should never notice the work and instead just be enjoying the storytelling.

How long have you been working in VFX?
For around a decade. I started out as a rotoscope artist in 2008 and then became a compositor. I did my first supervisor job back in 2012.

How has the VFX industry changed in the time you’ve been working?
A big shift has been just how much more visual effects work there is on TV shows and how much the standard of that work has improved. It used to be that TV work was looked down on as the poor cousin of feature film work. But shows like Game of Thrones have set audience expectations so much higher now. I worked on nothing but movies for the first part of my career, but the majority of my work now is on TV shows.

Did a particular film inspire you along this path in entertainment?
I grew up on ‘80s sci-fi and horror, so movies like Aliens and The Thing were definitely inspirations. This was back when effects were almost all done practically, so I wanted to get into model-making or prosthetics. The first time I remember being blown away by digital VFX work was seeing Terminator 2 at the cinema. I’ve ended up doing the type of work I dreamed of as a kid, just in a digital form.

Did you go to film school?
No, I actually studied graphic design. I worked for some time doing animation, video editing and motion graphics. I taught myself compositing for commercials using After Effects. But I always had a love of cinema and decided to try specializing in this area. Almost all of what I’ve learned has been on the job. I think there’s no better training than just throwing yourself at the work, absorbing everything you can from the people around you and just being passionate about what you do.

What’s your favorite part of the job?
Each project has its own unique set of challenges, and every day involves creative problem-solving. I love the process of translating what only exists in someone’s imagination and the journey of creating those images in a way that looks entirely real.

I also love the mix of being at the offices one day creating things that only exist in a virtual world, while the next day I might be on a film set shooting things in the real world. I get to travel to all kinds of random places and get paid to do so!

What’s your least favorite?
There are so many moving parts involved in creating a TV show or movie — so many departments all working together trying to complete the task at hand, as well as factors that are utterly out of your control. You have to have a perfectly clear idea of what needs to be done, but also be able to completely scrap that and come up with another idea at a moment’s notice.

If you didn’t have this job, what would you be doing instead?
Something where I can be creative and make things that physically exist. I’m always in awe of people who build and craft things with their hands.

Can you name some recent projects you have worked on?
Recent work has included Peaky Blinders, The Last Kingdom and Jamestown, as well as a movie called The Rhythm Section.

What is the project that you are most proud of?
I worked on a movie called Under the Skin a few years ago, which was a very technically and creatively challenging project. It was a very interesting piece of sci-fi that people seem to either love or hate, and everyone I ask seems to have a slightly different interpretation of what it was actually about.

What tools do you use day to day?
Almost exclusively Foundry Nuke. I use it for everything from drawing up concepts to reviewing artists’ work. If there’s functionality that I need from it that doesn’t exist, I’ll just write Python code to add features.
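
As an illustration of the kind of lightweight pipeline addition Frazer describes (a hypothetical sketch, not his actual code, though the menu and node calls are standard Nuke Python API), a small helper might be registered from a studio menu.py like this:

import nuke

def label_reads_with_filename():
    """Stamp each selected Read node's label with its filename,
    making heavy comp scripts easier to navigate at a glance."""
    for node in nuke.selectedNodes('Read'):
        path = node['file'].value()
        node['label'].setValue(path.split('/')[-1])

# Register the tool on a custom menu with a keyboard shortcut.
# ('SupeTools' and the shortcut are invented for this example.)
menubar = nuke.menu('Nuke')
tools = menubar.addMenu('SupeTools')
tools.addCommand('Label Reads With Filename', label_reads_with_filename, 'ctrl+shift+l')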

Where do you find inspiration now?
In the real world, if you just spend the time observing it in the right way. I often find myself distracted by how things look in certain light. And Instagram — it’s the perfect social media for me, as it’s just beautiful images, artwork and photography.

What do you do to de-stress from it all?
The job can be quite mentally and creatively draining and you spend a lot of time in dark rooms staring at screens, so I try to do the opposite of that. Anything that involves being outdoors or doing something physical — I find cycling or boxing are good ways to unwind.

I recently went on a paragliding trip in the French Alps, which was great, but I found myself looking at all these beautiful views of sunsets over mountains and just analyzing how the sunlight was interacting with the fog and the atmospheric hazing. Apparently, I can never entirely turn off that part of my brain.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed inside an immersive, massive 20-foot-high by 270-degree semicircular LED video wall and ceiling with a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine; and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve realtime in-camera composites on set.
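
ILM’s StageCraft renderer itself is proprietary, but the camera-frustum math behind that parallax effect is well documented: for a flat panel, an off-axis (“generalized”) perspective projection is computed from the tracked camera position relative to the wall’s corners, following the formulation popularized by Robert Kooima. A minimal numpy sketch of that calculation, with made-up panel coordinates, looks like this:

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def panel_frustum(pa, pb, pc, pe, near):
    """pa, pb, pc: the panel's lower-left, lower-right and upper-left
    corners in world space; pe: the tracked camera position. Returns
    the left/right/bottom/top extents of the view frustum at the near
    plane, ready for a glFrustum-style projection matrix."""
    vr = normalize(pb - pa)           # panel right axis
    vu = normalize(pc - pa)           # panel up axis
    vn = normalize(np.cross(vr, vu))  # panel normal, toward the camera

    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)               # camera-to-panel distance
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# Illustrative numbers: a 6m-wide, 3m-tall panel 4m from the origin,
# with the tracked camera slightly off-center at head height.
pa = np.array([-3.0, 0.0, -4.0])  # lower-left corner
pb = np.array([ 3.0, 0.0, -4.0])  # lower-right corner
pc = np.array([-3.0, 3.0, -4.0])  # upper-left corner
eye = np.array([0.5, 1.7, 0.0])   # tracked camera position

print(panel_frustum(pa, pb, pc, eye, near=0.1))

Recomputing this frustum every frame as the camera moves is what keeps the perspective on the wall locked to the lens rather than to the LED panel itself.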

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control of work typically handed off and reinterpreted in post, improving the quality of visual effects shots with perfectly integrated elements and reducing visual effects requirements in post, which is a major benefit considering today’s compressed schedules.

MPI restores The Wizard of Oz in 4K HDR

By Barry Goch

The classic Victor Fleming-directed film The Wizard of Oz, which was released by MGM in 1939 and won two of its six Academy Award nominations, has been beautifully restored by Burbank’s Warner Bros. Motion Picture Imaging (MPI).

Bob Bailey

To share its workflow on the film, MPI invited a group of journalists to learn about the 4K UHD HDR restoration of this classic film. The tour guide for our high-tech restoration journey was MPI’s VP of operations and sales Bob Bailey, who walked us through the entire restoration process — from the original camera negative to final color.

The Wizard of Oz, which starred Judy Garland, was shot on a Technicolor three-strip camera system. According to Bailey, it ran three black and white negatives simultaneously. “That is why it is known as three-strip Technicolor. The magazine on top of the camera was triple the width of a normal black and white camera because it contained each roll of negative to capture your red, green and blue records,” explained Bailey.

“When shooting in Technicolor, you weren’t just getting the camera. You would rent a package that included the camera, a camera crew with three assistants, the film, the processing and a Technicolor color consultant.”

George Feltenstein, SVP of theatrical catalog marketing for Warner Bros. Home Entertainment, spoke about why the film was chosen for restoration. “The Wizard of Oz is among the crown jewels that we hold,” he said. “We wanted to embrace the new 4K HDR technology, but nobody’s ever released a film that old using this technology. HDR, or high dynamic range, has a color range that is wider than anything that’s come before it. There are colors [in The Wizard of Oz] that were never reproducible before, so what better a film to represent that color?”

Feltenstein went on to explain that this is the oldest film to get released in the 4K format. He hopes that this is just the beginning and that many of the films in Warner Bros.’ classic library will also be released on 4K HDR and worked on at MPI under Bailey’s direction.

The Process
MPI rescanned each of the three-strip Technicolor nitrate film negatives at 8K 16-bit on the Lasergraphics Director 10K scanner, composited them together and then applied a new color grade. “We have just under 15 petabytes of storage here,” said Bailey. “That’s working storage, because we’re working on 8K movies since [some places in the world] are now broadcasting 8K.”
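
Conceptually, the recombination step is simple even though the production pipeline around it is not: each black-and-white record becomes one channel of an RGB frame. Here is a toy numpy sketch of the idea; the filenames are hypothetical, and a real restoration also registers the three records to each other, inverts negatives to positives and works at 16-bit.

import numpy as np
import imageio.v3 as iio

# One frame's three black-and-white Technicolor records (hypothetical
# filenames); assume the scans have already been inverted to positives.
red   = iio.imread('oz_red_record_0001.tif')
green = iio.imread('oz_green_record_0001.tif')
blue  = iio.imread('oz_blue_record_0001.tif')

# Stack the three records into a single RGB frame.
rgb = np.dstack([red, green, blue])
iio.imwrite('oz_rgb_0001.tif', rgb)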

Steven Anastasi

Our first stop was to look at the Lasergraphics Director. We then moved on to MPI’s climate-controlled vault, where we were introduced to Steven Anastasi, VP of technical operations at Warner Bros. Anastasi explained that the original negative vault has climate-controlled conditions with 25% humidity at 35 degrees Fahrenheit, which is the combination required for keeping these precious assets safe for future generations. He said there are 2 million assets in the building, including picture and sound.

It was amazing to see film reels for 2001: A Space Odyssey sitting on a shelf right in front of me. In addition to the feature reels, MPI also stores millions of negatives captured throughout the years by Warner productions. “We also have a very large library,” reported Anastasi. “So the original negatives from the set, a lot of unit photography, head shots in some cases and so forth. There are 10 million of these.”

Finally, we were led into the color bay to view the film. Janet Wilson, senior digital colorist at MPI, has overseen every remaster of The Wizard of Oz for the past 20 years. Wilson used a FilmLight Baselight X system for the color grade. The grading suite housed multiple screens: a Dolby Pulsar for the Dolby Vision pass, a Sony X300 and a Panasonic EZ1000 OLED 4K HDR.

“We have every 4K monitor manufactured, and we run the film through all of them,” said Bailey. “We painstakingly go through the process from a post perspective to make sure that our consumers get the best quality product that’s available out in the marketplace.”

“We want the consumer experience on all monitors to be something that’s taken into account,” added Feltenstein. “So we’ve changed our workflow by having a consumer or prosumer monitor in these color correction suites so the colorist has an idea of what people are going to see at home, and that’s helped us make a better product.”

Our first view of the feature was a side-by-side comparison of the black and white scanned negative and the sepia color corrected footage. The first part of the film, which takes place in Kansas, was shot in black and white, and then a sepia look was applied to it. The reveal scene, when Dorothy passes through the door going into Oz, was originally shot in color. For this new release, the team generated a matte so Wilson could add this sepia area to the inside of the house as Dorothy transitioned into Oz.

“So this is an example of some of the stuff that we could do in this version of the restoration,” explained Wilson. “With this version, you can see that the part of the image where she’s supposed to be in the monochrome house is not actually black and white. It was really a color image. So the trick was always to get the interior of the house to look sepia and the exterior to look like all of the colors that it’s supposed to. Our visual effects team here at MPI — Mike Moser and Richie Hiltzik — was able to draw a matte for me so that I could color inside of the house independently of the exterior and make them look right, which was always a really tricky thing to do.”

Wilson referred back to the Technicolor three-strip, explaining that because you’ve got three different pieces of film — the different records — they’re receiving the light in different ways. “So sometimes one will be a little brighter than the other. One will be a little darker than the other, which means that the Technicolor is not a consistent color. It goes a little red, and then it goes a little green, and then it goes a little blue, and then it goes a little red again. So if you stop on any given frame, it’s going to look a little different than the frames around it, which is one of the tricky parts of color correcting Technicolor. When that’s being projected by a film projector, it’s less noticeable than when you’re looking at it on a video monitor, so it takes a lot of little individual corrections to smooth those kinds of things out.”
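
The color “breathing” Wilson describes, with each record drifting brighter or darker independently from frame to frame, is at its core a temporal smoothing problem. As a hedged illustration of the concept (not MPI’s actual tooling), an automated first pass might measure each channel’s mean per frame, smooth that signal over time and gain each frame toward the smoothed target before the colorist’s hand-tuned corrections:

import numpy as np

def smooth_channel_flicker(frames, window=7):
    """frames: float array of shape (num_frames, h, w, 3).
    Returns frames with per-channel brightness eased toward a
    temporally smoothed target, reducing frame-to-frame flicker."""
    means = frames.mean(axis=(1, 2))               # (num_frames, 3)
    kernel = np.ones(window) / window
    target = np.stack(                             # smoothed per-channel means
        [np.convolve(means[:, c], kernel, mode='same') for c in range(3)],
        axis=1)
    gains = target / np.maximum(means, 1e-6)       # per frame, per channel
    return frames * gains[:, None, None, :]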

Wilson reported seeing new things with the 8K scan and 4K display. “The amount of detail that went into this film really shows up.” She said that one of the most remarkable things about the restoration was the amazing detail visible on the characters. For the first time in many generations, maybe ever, you can actually see the detail of the freckles on Dorothy’s face.

In terms of leveraging the expanded dynamic range of HDR, I asked Wilson whether she tried to map the HDR into a sweet spot, so that it’s spectacular yet not overpowering.

“I ended up isolating the very brightest parts of the picture,” she replied. “In this case, it’s mostly the sparkles on their shoes and curving those off so I could run those in, because this movie is not supposed to have modern-day animation levels of brightness. It’s supposed to be much more contained. I wanted to take advantage of brightness and the ability to show the contrast we get from this format, because you can really see the darker parts of the picture. You can really see detail within the Wicked Witch’s dress. I don’t want it to look like it’s not the same film. I want it to replicate that experience of the way this film should look if it was projected on a good print on a good projector.”

Dorothy’s ruby slippers also presented a challenge to Wilson. “They are so red and so bright. They’re so light-reflective, but there were times when they were just a little too distracting. So I had to isolate the slippers and bring them down a little bit so that they weren’t the first and only thing you saw in the image.”

If you are wondering if audio was part of this most recent restoration, the answer is no, but it had been remastered for a previous version. “As early as 1929, MGM began recording its film music using multiple microphones. Those microphone angles allowed the mixer to get the most balanced monophonic mix, and they were preserved,” explained Feltenstein. “Twenty years ago, we created a 5.1 surround mix that was organically made from the original elements that were created in 1939. It is full-frequency, lossless audio, and a beautiful restoration job was made to create that track so you can improve upon what I consider to be close to perfection without anything that would be disingenuous to the production.”

In all, it was an amazing experience to go behind the scenes and see how the wizards of MPI created a new version of this masterpiece for today and preserved it for future generations.

This restored version of The Wizard of Oz is a must-see visual extravaganza, and there is no better way to see it than in UHD, HDR, Dolby Vision or HDR10+. What I saw in person took my breath away, and I hope every movie fan out there can have the opportunity to see this classic film in its never-before-seen glory.

The 4K version of The Wizard of Oz is currently available via an Ultra HD Blu-ray Combo Pack and digital.


Barry Goch is a finishing artist at LA’s The Foundation as well as a UCLA Extension Instructor, Post Production. You can follow him on Twitter at @Gochya

Kevin Lau heads up advertising, immersive at Digital Domain

Visual effects studio Digital Domain has brought on Kevin Lau as executive creative director of advertising, games and new media. In this newly created position, Lau will oversee all short-form projects and act as a creative partner for agencies and brands.

Lau brings over 18 years of ad-based visual effects and commercial production experience, working on campaigns for brands such as Target, Visa and Sprint.

Most recently, he was the executive creative director and founding partner at Timber, an LA-based studio focused on ads (GMC, Winter Olympics) and music videos (Kendrick Lamar’s Humble). Prior to that, he held creative director positions at Mirada, Brand New School and Superfad. Throughout his career, his work has been honored with multiple awards including Clios, AICP Awards, MTV VMAs and a Cannes Gold Lion for Sprint’s “Now Network” campaign via Goodby.

Lau, who joins Digital Domain EPs Nicole Fina and John Canning as they continue to build the studio’s short-form business, will help unify the vision for the advertising, games and new media/experiential groups, promoting a consistent voice across campaigns.

Lau joins the team as the new media group prepares to unveil its biggest project to date: Time’s The March, a virtual reality recreation of the 1963 March on Washington for Jobs and Freedom. Digital Domain’s experience with digital humans will play a major role in the future of both groups as they continue to build on the photoreal cinematics and in-game characters previously created for Activision, Electronic Arts and Ubisoft.

Visible Studios produces, posts Dance Monkey music video

If you haven’t heard the song Dance Monkey by Tones and I, you soon will. Australia’s Visible Studios provided production and post on the video for the song, which has hit number one in more than 30 countries, gone seven times platinum and remained at the top of the charts in Australia for 22 weeks. The video has been viewed on YouTube more than half a billion times.

Visible Studios, a full production and post company, is run by producer Tim Whiting and director and editor Nick Kozakis. The company features a team of directors, scriptwriters, designers, motion graphic artists and editors working on films, TV commercials and music videos.

For Dance Monkey, Visible Studios worked directly with Tones and I to develop the idea for the video. The video, which was shot on Red cameras at the beginning of the song’s meteoric rise, was completed in less than a week and on a small budget.

“The Dance Monkey music video was made on an extremely quick turnaround,” says Whiting. “[Tones] was blowing up at the time, and they needed the music video out fast. The video was shot in one day, edited in two, with an extra day and a half for color and VFX.” Visible Studios called on Blackmagic DaVinci Resolve Studio for edit, VFX and color.

Dance Monkey features the singer dressed as Old Tones, an elderly man whisked away by his friends to a golf course to dance and party. On the day of production, the sun was nowhere to be found, and each shot was done against a gray and dismal background. To fix this, the team brought in a sky image as a matte and used Resolve’s match move tool, keyer, lens blur and power windows to turn gray footage into brilliant sunshine.

“In post we decided to replace the overcast skies with a cloudy blue sky. We ended up doing this all in Resolve’s color page and keyed the grass and plants to make them more lush, and we were there,” says Whiting.

Editor/directors Kozakis and Liam Kelly used Resolve for the entire editing process. “Being able to edit 6K raw footage smoothly on a 4K timeline, at a good quality debayer, means that we don’t have to mess around with proxies and that the footage gets out of the way of the editing process. The recent update for decompression and debayer on Nvidia cards has made this performance even better,” Kozakis says.

 

Missing Link, The Lion King among VES Award winners

The Visual Effects Society (VES), the industry’s global professional honorary society, held its 18th Annual VES Awards, the yearly celebration that recognizes outstanding visual effects artistry and innovation in film, animation, television, commercials, video games and special venues.

Comedian Patton Oswalt served as host for the 9th time to the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 25 awards categories. The Lion King was named the photoreal feature winner, garnering three awards. Missing Link was named top animated film, winning two awards. The Mandalorian was named best photoreal episode and garnered two awards, with Game of Thrones and Stranger Things 3 also winning two awards each. Hennessy: The Seven Worlds topped the commercial field with two wins.

Andy Serkis presented the VES Award for Creative Excellence to visual effects supervisor Sheena Duggal. Joey King presented the VES Visionary Award to director-producer-screenwriter Roland Emmerich. VFX supervisor Pablo Helman presented the Lifetime Achievement Award to director/producer/screenwriter Martin Scorsese, who accepted via video from New York. Scorsese’s The Irishman also picked up two awards, including Outstanding Supporting Visual Effects in a Photoreal Feature.

Presenters also included: directors J.J. Abrams, Jon Favreau, Rian Johnson and Josh Cooley.

Winners of the 18th Annual VES Awards are as follows:

Outstanding Visual Effects in a Photoreal Feature

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

THE IRISHMAN

Pablo Helman

Mitchell Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

Outstanding Visual Effects in a Photoreal Episode

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy K. Cancino

 

Outstanding Supporting Visual Effects in a Photoreal Episode

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

Outstanding Visual Effects in a Real-Time Project

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Outstanding Visual Effects in a Commercial

Hennessy: The Seven Worlds

Carsten Keller

Selçuk Ergen

Kiril Mirkov

William Laban

 

Outstanding Visual Effects in a Special Venue Project

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Outstanding Animated Character in a Photoreal Feature

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

Outstanding Animated Character in an Animated Feature

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

Outstanding Animated Character in an Episode or Real-Time Project

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

Outstanding Animated Character in a Commercial

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

Outstanding Created Environment in a Photoreal Feature

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

Outstanding Created Environment in an Animated Feature

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

Outstanding Virtual Cinematography in a CG Project

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

Outstanding Model in a Photoreal or Animated Project

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

François-Maxence Desplanques

 

Outstanding Effects Simulations in an Animated Feature

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

Outstanding Compositing in a Feature

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

Outstanding Compositing in an Episode

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

Outstanding Compositing in a Commercial

Hennessy: The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

VFX-heavy Skyworth OLED TV spot via The-Artery

The-Artery created a spot for Skyworth’s latest version of its W81|W81 Pro Wallpaper OLED TV, which debuted last month at the “See the Wonder” event at CES 2020.

Created using The-Artery’s newly opened Resolve-based color room and expanded design capabilities — spearheaded by colorist Stephen Picano and design director Lauren Indovina — the commercial features a couple swimming through space-like waters, children battling origami dragons while floating in a paper boat and a traveler trekking through snowy tundras while glowing jellyfish float overhead. Publicis, Skyworth’s agency, wanted the ad to reflect “the wonder” of the company’s newest television model.

“The campaign, helmed by director Eli Sverdlov, was very director-led in a way that I’ve never seen before,” explains The-Artery’s EP/MD, Deborah Sullivan. “Of course, there was still ongoing dialogue with the client and agency, but the level of creative control that was entrusted is almost unheard of. Everything was open from start to finish, including the ideation phase, color grading and design — to name a few. Our team had a lot of fun jumping straight into the edit to develop and launch what we consider as a high-end conceptual throwback to the nineties.”

Sverdlov agrees: “Our flexible creative process was in a condensed schedule and required a very unique collaboration. We were practically creating the ideas and visuals while editing and sourcing footage.”

Due to the production’s long shooting schedule and tight deadlines, the visual effects were designed via Autodesk Flame in realtime, all under one roof, while filming took place in Serbia. Additional footage was carefully curated, color graded and cut to fit the tone and flow of the rest of the piece. Creature imagery, such as the jellyfish, was created in CG.

In addition to Flame and Resolve, The-Artery called on SideFX Houdini, Autodesk Maya, Maxon’s Redshift, Otoy’s Octane, Autodesk’s Arnold, Adobe After Effects and Maxon’s Cinema 4D.

Framestore launches FPS preproduction services

VFX studio Framestore has launched FPS (Framestore Pre-production Services) for the global film and content production industries. An expansion of Framestore’s existing capability, FPS is available to clients in need of standalone preproduction support or an end-to-end production solution.

The move builds out and aligns the company’s previz, virtual production, techviz and postviz services with Framestore’s art department (which operates either as part of the Framestore workflow or as a stand-alone creative service), virtual production team and R&D unit, and integrates with the company’s VFX and animation teams. It builds on work on films such as Gravity and on the knowledge gained during the company’s eight-year London joint venture with visualization company The Third Floor. FPS is working on feature film projects as part of an integrated offering and as a standalone visualization partner, with more projects slated in the coming months.

The new team is led by Alex Webster, who joins as FPS managing director after running The Third Floor London. He will report directly to Fiona Walkinshaw, Framestore’s global managing director, film.

“This work aligns Framestore’s singular VFX and animation craft with a granular understanding of the visualization industry,” says Webster. “It marries the company’s extraordinary legacy in VFX with established visualization and emergent virtual production processes, supported by bleeding-edge technology and dedicated R&D resource to inform the nimble approach which our clients need. Consolidating our preproduction services represents a significant creative step forward.”

“Preproduction is a crucial stage for filmmakers,” says chief creative officer Tim Webber. “From mapping out environments to developing creatures and characters to helping plot action sequences, it provides unparalleled freedom in terms of seeing how a story unfolds or how characters interact with the worlds we create. Bringing together our technical innovation with an understanding of filmmaking, we want to offer a bespoke service for each film and each individual to help tell compelling, carefully crafted stories.”

“Our clients’ needs are as varied as the projects they bring to us, with some needing a start-to-finish service that begins with concept art and ends in post while others want a bespoke, standalone solution to specific creative challenges, be that in early-stage concepting, through layout and visualization or in final animation and VFX,” says Framestore CEO William Sargent. “It makes sense to bring all these services in-house — even more so when you consider how our work in adjacent fields like AR, VR and MR has helped the likes of HBO, Marvel and Warner Bros. bring their IP to new, immersive platforms. What we’ll ultimately deliver goes well beyond previz and beyond visualization.”

Main Image: (L-R) Tim Webber, Fiona Walkinshaw and Alex Webster.

FXhome’s HitFilm Express 14, ‘Pay What You Want’ option

FXhome has launched “Pay What You Want,” a goodwill program inspired by the HitFilm Express community’s requests to help fund development of the historically free video editing and VFX software. Pay What You Want gives users the option to contribute financially, with those funds allocated to future development and improvements to HitFilm.

Additionally, FXhome will contribute a percentage of the proceeds of Pay What You Want to organizations dedicated to global causes important to the company and its community. At its launch, the FXhome Pay What You Want initiative will donate a portion of its proceeds to the WWF and the Australia Emergency Bushfire Fund. The larger the contribution from customers, the more FXhome will donate.

HitFilm Express remains a free download; however, first-time customers will now have the option to “Pay What You Want” for the software. They’ll also receive some exclusive discounts on HitFilm add-on packs and effects.

Coinciding with the release of Pay What You Want, FXhome is releasing HitFilm Express 14, the first version of HitFilm Express eligible for the initiative. HitFilm Express 14 features a new and simplified export process, new text controls, a streamlined UI and a host of other improvements.

For new customers who would like to download HitFilm Express 14 and also contribute to the Pay What You Want program, there are three options available:

• Starter Pack Level: With a contribution as little as $9, new HitFilm Express 14 customers will also receive a free Starter Pack of software and effects that includes:
o Professional dark mode interface
o Edit tools including Text, Split Screen Masking, PiP, Vertical Video, Action Cam Crop
o Color tools including Exposure, Vibrance, Shadows and Highlights, Custom Gray, Color Phase, Channel Mixer and 16-bit color
o Additional VFX packs including Shatter, 3D Extrusion, Fire, Blood Spray and Animated Lasers
• Content Creator Level: With contributions of $19 or more, users will receive everything included in the Starter Pack, as well as:
o Edit: Repair Pack with Denoise, Grain Removal and Rolling Shutter
o Color: LUT Pack with LUTs and Grading Transfer
o Edit: Beautify Pack with Bilateral Blur and Pro Skin Retouch
• VFX Artist Level: Users who contribute from $39 to $99 get everything in the Starter Pack and Content Creator levels plus:
o Composite Toolkit Pack with Wire Removal, Projector, Clone and Channel Swapper
o Composite Pro-Keying Pack for Chroma Keying
o Motion Audio Visual Pack with Atomic Particles, Audio Spectrum and Audio Waveform
o VFX Neon Lights Pack with Lightsword Ultra (2-Point Auto), Lightsword Ultra (4-Point Manual), Lightsword Ultra (Glow Only) and Neon Path
o VFX Lighting Pack with Anamorphic Lens Flares, Gleam, Flicker and Auto Volumetrics

What’s new in HitFilm Express 14
HitFilm Express 14 adds a number of VFX workflow enhancements that enable even more sophisticated effects for content creators, including a simplified export workflow that allows users to export content directly from the timeline and comps, new text controls and a streamlined UI. Updates include:

• Video Textures for 3D Models: Creators who already have the 3D: Model Render Pack can now use a video layer as a texture on a 3D model to add animated bullet holes, cracked glass or changing textures.
• Improvements to the Export Process: In HitFilm Express 14, the Export Queue is now an Export Panel and is much easier to use. Exporting can also now be done from the timeline and from comps. These “in-context” exports will export the content between the In and Out points set, or the entire timeline, using the current default preset (which can be changed from the menu).
• Additional Text Controls: Customizing text in HitFilm Express 14 is now even simpler, with Text panel options for All Caps, Small Caps, Subscript and Superscript. Users can also change the character spacing, horizontal or vertical scale, as well as baseline shift (for that Stranger-Things-style titling).
• Usability and Workflow Enhancements: In addition to the new and improved export process, FXhome has also implemented new changes to the interface to further simplify the entire post production process, including a new “composite button” in the media panel, double-click and keyboard shortcuts. A new Masking feature adds new automation to the workflow; when users double-click the Rectangle or Ellipse tools, a centered mask is automatically placed to fill the center of the screen. Masks are also automatically assigned colors, which can be changed to more easily identify different masks.
• Effects: Users can now double-click an effect in the effects panel to apply it to the selected layer, and drop 2D effects directly onto layers in the viewer. Some effects — such as the Chroma Key and Light Flares — can be dropped on a specific point, or users can select a specific color to key by. Users can also now favorite effects for quick access to their five most recently used effects from the Effects menu in the toolbar.
• Additional Improvements: Users can now use Behavior effects from the editor timeline, click-drag across multiple layers to toggle “solo,” “locked” or “visibility” settings in one action, and access templates directly from the media panel with the new Templates button. Menus have also been added to the tab of each panel to make customization of the interface easier.
• Open Imerge Pro files in HitFilm: Imerge Pro files can now be opened directly from HitFilm as image assets. Any changes made in the Imerge Pro project will be automatically updated with any save, making it easier to change image assets in real time.
• Introducing Light Mode: The HitFilm Express interface is now available in Light Mode and will open in Light Mode the first time you open the software. Users with a pre-existing HitFilm Express license can easily change back to the dark theme if desired.

HitFilm Express 14 is available immediately and is a free download. Customers downloading HitFilm Express 14 for the first time are eligible to participate in the new Pay What You Want initiative. Free effects and software packs offered in conjunction with Pay What You Want are only available at initial download of HitFilm Express 14.

Rob Legato talks The Lion King‘s Oscar-nominated visual effects

By Karen Moltenbrey

There was a lot of buzz before — and after — this summer’s release of Disney’s remake of the animated classic The Lion King. And what’s not to love? From the animals to the African savannas, Disney brought the fabled world of Simba to life in what is essentially a “live-action” version of the beloved 1994 2D feature of the same name. Indeed, the filmmakers used tenets of live-action filmmaking to create The Lion King, and themselves call it a visual effects film. However, there are those who consider this remake, like the original, an animated movie, as 2019’s The Lion King used cutting-edge CGI for the photoreal beasts and environments.

Rob Legato

Whether you call it “live action” or “animation,” one thing’s for sure. This is no ordinary film. And, it was made using no ordinary production process. Rather, it was filmed entirely in virtual reality. And it’s been nominated for a Best Visual Effects Oscar this year.

“Everything in it is a visual effect, created in the same way that we would make a visual effects-oriented film, where we augment or create the backgrounds or create computer-generated characters for a scene or sequence. But in this case, that spanned the entire movie,” says VFX supervisor Rob Legato. “We used a traditional visual effects pipeline and hired MPC, which is a visual effects studio, not an animation house.”

MPC, which created the animals and environments, crafted all of the CG elements and handled the virtual production, working with Magnopus to develop the tools that would take the filmmakers from previz through shooting and, eventually, into post production. Even the location scouting occurred within VR, with Legato, director Jon Favreau and others, including cinematographer Caleb Deschanel, simultaneously walking through the sets and action using HTC Vive headsets.

Caleb Deschanel (headset) and Rob Legato. Credit: Michael Legato

The Animations and Environments
MPC, known for its photorealistic animals and more, had worked with Disney and Favreau on the 2016 remake of The Jungle Book, which was shot within a total greenscreen environment and used realistic CG characters and sets with the exception of the boy Mowgli. (It also used VR, albeit for previsualization only.) The group’s innovative effort for that work won an Oscar for visual effects. Apparently that was just the tip of the spear, so to speak, as the team upped its game with The Lion King, making the whole production entirely CG and taking the total filmmaking process into virtual reality.

“It had to look as believable as possible. We didn’t want to exaggerate the performances or the facial features, which would make them less realistic,” says Legato of the animal characters in The Lion King.

The CG skeletons were built practically bone for bone to match their real-life counterparts, and the digital fur matched the hair variations of the various species found in nature. The animators, meanwhile, studied the motion of the real-life animals and moved the digital muscles accordingly.

“Your eye picks up when [the animal] is doing something that it can’t really do, like if it stretches its leg too far or doesn’t have the correct weight distribution that’s affecting the other muscles when it puts a paw down,” says Legato, contending that it is almost impossible to tell the CG version of the characters from the real thing in a non-talking shot or a still frame.

To craft the animals and environments, the MPC artists used Autodesk’s Maya as the main animation program, along with SideFX Houdini for water and fire simulations and Pixar’s RenderMan for rendering. MPC also used custom shaders and tools, particularly for the fur, mimicking that of the actual animal. “A lion has so many different types of hair — short hair around the body, the bushy mane, thick eyebrow hairs and whiskers. And every little nuance was recreated and faithfully reproduced,” Legato adds.

MPC artists brought to life dozens and dozens of animals for the film and then generated many more unique variations — from lions to mandrills to hyenas to zebras and more, even birds and bugs. And then the main cast and background animals were placed within a photoreal environment, where they were shot with virtual cameras that mimicked real cameras.

The world comprises expansive, open landscapes. “There were many, many miles of landscapes that were constructed,” says Legato. The filmmakers would film within pockets that were dressed and populated for different scenes, from Pride Rock to the interior of a cave to the savanna to the elephant graveyard — all built in CGI.

“Everything was simulated to be the real thing, so the sum total of the illusion is that it’s all real. And everything supports each other — the grounds, the characters, what they are physically doing. The sum total of that adds up to where your brain just says, ‘OK, this must be real. I’ll stop looking for flaws and will now just watch the story,’” says Legato. “That was the creative intent behind it.”

Virtual Production
All the virtual camera work was accomplished within Unity’s engine, so all the assets were ported in and out of that game engine. “Everyone would then know where our cameras were, what our camera moves were, how we were following the action, our lens choices, where the lights were placed … all those things,” says Legato.

Magnopus created the VR tools specific for the film, which ran on top of Unity to get the various work accomplished, such as the operation of the cameras. “We had a crane, dolly and other types of cameras encoded so that it basically drove its mate in the computer. For instance, we created a dolly and then had a physical dolly with encoders on it, so everything was hand operated, and we had a dolly grip and a camera assistant pulling focus. There was someone operating the cameras, and sometimes there was a crane operator. We did SteadiCam as well through an actual SteadiCam with a sensor on it to work with OptiTrack [motion capture that was used to track the camera],” explains Legato. “We built a little rig for the SteadiCam as well as one for a drone we’d fly around the stage, and we’d create the illusion that it was a helicopter shot while flying around Africa.”
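To make the encoder idea concrete, here is a minimal, hypothetical sketch in Python of how per-frame readings from an encoded dolly might be mirrored onto its virtual “mate.” The names and structure are illustrative only, not Magnopus’ actual tools:

```python
# Hypothetical sketch -- not Magnopus' actual tools. Readings from an
# encoded physical dolly rig are mirrored onto the virtual camera that
# "mates" with it in the game engine, once per frame.
from dataclasses import dataclass

@dataclass
class EncoderFrame:
    track_pos: float  # meters traveled along the dolly track
    pan: float        # head pan, in degrees
    tilt: float       # head tilt, in degrees
    focus: float      # focus-pull distance, in meters

def apply_to_virtual_camera(cam, frame):
    """Mirror the physical rig's state onto its CG counterpart."""
    cam["translate"] = (frame.track_pos, cam.get("boom_height", 1.5), 0.0)
    cam["rotate"] = (frame.tilt, frame.pan, 0.0)  # x = tilt, y = pan
    cam["focus_distance"] = frame.focus

# One frame of hand-operated input, mirrored into the virtual scene.
camera = {"boom_height": 1.5}
apply_to_virtual_camera(camera, EncoderFrame(track_pos=2.4, pan=35.0, tilt=-5.0, focus=3.2))
```

The point of the pattern is that the physical crew keeps its familiar hand-operated tools while the engine receives the same moves, frame for frame.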

Because the area within VR was so vast, a menu system was created so the filmmakers could locate one another within the virtual environment, making location scouting much easier. They also could take snapshots of different areas and angles and share them with the group. “We were standing next to each other [on stage], but within the virtual environment, we could be miles apart and not see each other because we’re maybe behind trees or rocks.”

As Legato points out, the menu tool is pretty robust. “We basically built a game of film production. Everything was customizable,” he says. Using iPads, the group could play the animation. As the camera was in operation, they could stop the animation, wind it backward, speed it forward, shoot it in slow motion or faster motion. “These options were all accessible to us,” he adds.

Legato provides the following brief step-by-step overview of how the virtual production occurred. First, the art department created the sets — Africa with the trees, ponds, rivers, mountains, waterfalls and so forth. “Based on the script, you know somewhat where you need to be [in the set],” he says. Production designer James Chinlund would make a composite background, and then they — along with Favreau, Deschanel and animation supervisor Andrew Jones — would go into VR.

“We had built these full-size stationary chess pieces of the animals, and in VR, we’d have these tools that let us grab a lion, for instance, or a meerkat, and position them, and then we’d look through the lens and start from there,” says Legato. “We would either move them by hand or puppeteer a simple walk cycle to get the idea of the blocking.”

Jones and his team would animate that tableau and port it back into the game engine as an animation cycle. “We’d find camera angles and augment them. We’d change some of the animation or slow it down or move the animals in slightly different positions. And then we’d shoot it like it’s on a live-action stage,” explains Legato. “We’d put a dolly track down, cover the action with various types of lenses, create full-coverage film dailies… We could shoot the same scene in as many different angles as we’d wish. We could then play it out to a video deck and start editing it right away.” The shots they liked might get rendered with more light or motion blur, but a lot of the time, they’d go right off the video tap.

Meanwhile, MPC recorded everything the filmmakers did and moved — every leaf, rock, tree, animal. Then, in post, all of that information would be reconverted back into Maya sets and the animation fine-tuned.

“In a nutshell, the filmmakers were imparting a live-action quality to the process — by not faking it, but by actually doing it,” says Legato. “And we still have the flexibility of full CGI.”

The Same, But Different
According to Legato, it did not take the group long to get the hang of working in VR. And the advantages are many — chief among them, time savings when it comes to planning and creating the sequence editorially, and then instantly being able to reshoot or iterate the scene inexpensively. “There is literally no downside to exploring a bold choice or an alternate angle on the concept,” he points out.

Yes, virtual filmmaking is the future, contends Legato.

So, back to the original question: Is The Lion King a VFX film or an animated film? “It’s perhaps a hybrid,” says Legato. “But, if you didn’t know how we did it and if the animals didn’t talk, you’d think it was done in the traditional manner of a live-action film. Which it is, visually speaking. You wouldn’t necessarily describe it as looking like ‘an animated film’ because it doesn’t really look like an animated film, like a Pixar or DreamWorks movie. By labeling it as such, you’re putting it into a hole that it’s not. It’s truly just a movie. How we achieved it is immaterial, as it should be.”

Legato and his colleagues call it “live action,” which it truly is. But some, including the Golden Globes, categorized it as “animation.” (They also called 2015’s The Martian and 2010’s The Tourist “comedies.”)

Call it what you will; the bottom line is that the film is breathtaking and the storytelling is amazing. And the filmmaking is inventive and pushes traditional boundaries, perhaps making it difficult to fit into a traditional category. Therefore, “beautiful,” “riveting,” “creative” and “innovative” might be the only descriptions necessary.


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Check out MPC’s VFX breakdown on the film:

Director James Mangold on Oscar-nominated Ford v Ferrari

By Iain Blair

Filmmaker James Mangold has been screenwriting, producing and directing for years. He has made films about country legends (Walk the Line), cowboys (3:10 to Yuma), superheroes (Logan) and cops (Cop Land), and has tackled mental illness (Girl, Interrupted) as well.

Now he’s turned his attention to race car drivers and endurance racing with his movie Ford v Ferrari, which has earned Mangold an Oscar nomination for Best Picture. The film also received nods for its editing, sound editing and sound mixing.

James Mangold (beard) on set.

The high-octane drama was inspired by a true-life friendship that forever changed racing history. In 1959, Carroll Shelby (Matt Damon) is on top of the world after winning the most difficult race in all of motorsports, the 24 Hours of Le Mans. But his greatest triumph is followed quickly by a crushing blow — the fearless Texan is told by doctors that a grave heart condition will prevent him from ever racing again.

Endlessly resourceful, Shelby reinvents himself as a car designer and salesman working out of a warehouse space in Venice Beach with a team of engineers and mechanics that includes hot-tempered test driver Ken Miles (Christian Bale). A champion British race car driver and a devoted family man, Miles is brilliant behind the wheel, but he’s also blunt, arrogant and unwilling to compromise.

After Shelby’s vehicles make a strong showing at Le Mans against Italy’s venerable Enzo Ferrari, Ford Motor Company recruits the firebrand visionary to design the ultimate race car, a machine that can beat even Ferrari on the unforgiving French track. Determined to succeed against overwhelming odds, Shelby, Miles and their ragtag crew battle corporate interference, the laws of physics and their own personal demons to develop a revolutionary vehicle that will outshine every competitor. The film culminates in the historic showdown between the US and Italy at the grueling 1966 24 Hours of Le Mans.

Mangold’s below-the-line talent, many of whom have collaborated with the director before, includes Academy Award-nominated director of photography Phedon Papamichael; film editors Michael McCusker, ACE, and Andrew Buckland; visual effects supervisor Olivier Dumont; and composers Marco Beltrami and Buck Sanders.

L-R: Writer Iain Blair and Director James Mangold

I spoke with Mangold — whose other films include Logan, The Wolverine and Knight and Day — about making the film and his workflow.

You obviously love exploring very different subject matter in every film you make.
Yes, and I do every movie like a sci-fi film — meaning inventing a new world that has its own rules, customs, language, laws of physics and so on, and you need to set it up so the audience understands and they get it all. It’s like being a world-builder, and I feel every film should have that, as you’re entering this new world, whether it’s Walk the Line or The French Connection. And the rules and behavior are different from our own universe, and that’s what makes the story and characters interesting to me.

What sort of film did you set out to make?
Well, given all that, I wanted to make an exciting racing movie about that whole world, but it’s also that it was a moment when racing was free of all things that now turn me off about it. The cars were more beautiful then, and free of all the branding. Today, the cars are littered with all the advertising and trademarks — and it’s all nauseating to me. I don’t even feel like I’m watching a sport anymore.

When this story took place, it was also a time when all the new technology was just exploding. Racing hasn’t changed that much over the past 20 years. It’s just refining and tweaking to get that tiny edge, but back in the ‘60s they were still inventing the modern race car, and discovering aerodynamics and alternate building materials and methods. It was a brand-new world, so there was this great sense of discovery and charm along with all that.

What were the main technical challenges in pulling it all together?
Trying to do what I felt all the other racing movies hadn’t really done — taking the driving out of the CG world and putting it back in the real world, so you could feel the raw power and the romanticism of racing. A lot of that’s down to the particulates in the air, the vibrations of the camera, the way light moves around the drivers — and the reality of behavior when you’re dealing with incredibly powerful machines. So right from the start, I decided we had to build all the race cars; that was a huge challenge right there.

How early on did you start integrating post and all the VFX?
Day one. I wanted to use real cars and shoot the Le Mans and other races in camera rather than using CGI. But this is a period piece, so we did use a lot of CGI for set extensions and all the crowds. We couldn’t afford 50,000 extras, so just the first six rows or so were people in the stands; the rest were digital.

Did you do a lot of previz?
A lot, especially for Le Mans, as it was such a big, three-act sequence with so many moving parts. We used far less for Daytona. We did a few storyboards, and then my second unit director, Darrin Prescott — who has choreographed car chases and races in such movies as Drive, Deadpool 2, Baby Driver and The Bourne Ultimatum — and I planned it out using matchbox cars.

I didn’t want that “previzy” feeling. Even when I do a lot of previz, whether it’s a Marvel movie or like this, I always tell my previz team “Don’t put the camera anywhere it can’t go.” One of the things that often happens when you have the ability to make your movie like a cartoon in a laboratory — which is what previz is — is that you start doing a lot of gimmicky shots and flying the camera through keyholes and floating like a drone, because it invites you to do all that crazy shit. It’s all very show-offy as a director — “Look at me!” — and a turnoff to me. It takes me out of the story, and it’s also not built off the subjective experience of your characters.

This marks your fifth collaboration with DP Phedon Papamichael, and I noticed there’s no big swooping camera moves or the beauty shot approach you see in all the car commercials.
Yes, we wanted it to look beautiful, but in a real way. There’s so much technology available now, like gyroscopic setups and arms that let you chase the cars in high-speed vehicles down tracks. You can do so much, so why do you need to do more? I’m conservative that way. My goal isn’t to brand myself through my storytelling tricks.

How tough was the shoot?
It was one of the most fun shoots I’ve ever had, with my regular crew and a great cast. But it was also very grueling, as we were outside a lot, often in 115-degree heat in the desert on blacktop. And locations were big challenges. The original Le Mans course doesn’t exist anymore like it used to be, so we used several locations in Georgia to double for it. We shot the races wide-angle anamorphic with a team of a dozen professional drivers, and with anamorphic you can shoot the cars right up into the lens — just inches away from camera, while they’d be doing 150 mph or 160 mph.

Where did you post?
All on the Fox lot at my offices. We scored at Capitol Records and mixed the score in Malibu at my composer’s home studio. I really love the post, and for me it’s all part of the same process — the same cutting and pasting I do when I’m writing, and even when I’m directing. You’re manipulating all these elements and watching it take form — and particularly in this film, where all the sound design and music and dialogue are all playing off one another and are so key. Take the races. By themselves, they look like nothing. It’s just a car whipping by. The power of it all only happens with the editing.

You had two editors — Michael McCusker and Andrew Buckland. How did that work?
Mike’s been with me for 20 years, so he’s kind of the lead. Mike and Drew take and trade scenes, and they’re good friends so they work closely together. I move back and forth between them, which also gives them each some space. It’s very collaborative. We all want it to look beautiful and elegant and well-designed, but no one’s a slave to any pre-existing ideas about structure or pace. (Check out postPerspective‘s interview with the editing duo here.)

What were the big editing challenges?
It’s a car racing movie with drama, so we had to hit you with adrenalin and then hold you with what’s a fairly procedural and process-oriented film about these guys scaling the corporate wall to get this car built and on the track. Most of that’s dramatic scenes. The flashiest editing is the races, which was a huge, year-long effort. Mike was cutting the previz before we shot a foot, and initially we just had car footage, without the actors, so that was a challenge. It all transformed once we added the actors.

Can you talk about working on the visual effects with Method’s VFX supervisor Olivier Dumont?
He did an incredible job, as no one thinks there are so many. They’re really invisible, and that’s what I love — the film feels 100% analog, but of course it isn’t. It’s impossible to build giant race tracks as they were in the ‘60s. But having real foregrounds really helped. We had very few scenes where actors were wandering around in a green void like on so many movies now. So you’re always anchored in the real world, and then all the set extensions were in softer focus or backlit.

This film really lends itself to sound.
Absolutely, as every car has its own signature sound, and as we cut rapidly from interiors to exteriors, from cars to pits and so on. The perspective aural shifts are exciting, but we also tried to keep it simple and not lose the dramatic identity of the story. We even removed sounds in the mix if they weren’t important, so we could focus on what was important.

Where did you do the DI, and how important is it to you?
At Efilm with Skip Kimball (working on Blackmagic DaVinci Resolve), and it was huge on this film. Dealing with the 24-hour race, the changing light, rain and night scenes, and having to match five different locations was a nightmare, so we worked on all of that, and on the overall look, from early on in the edit.

What’s next?
Don’t know. I’ve got two projects I’m working on. We’ll see.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Picture Shop VFX acquires Denmark’s Ghost VFX

Burbank’s Picture Shop VFX has acquired Denmark’s Ghost VFX. The Copenhagen-based studio, founded in 1999, provides high-end visual effects work for film, television and several streaming platforms. The move helps Picture Shop “increase its services worldwide and broaden its talent and expertise,” according to Picture Shop VFX president Tom Kendall.

Over the years, Ghost has contributed to more than 70 feature films and episodic titles, including Star Wars: The Rise of Skywalker, The Mandalorian, The Walking Dead, See, Black Panther and Star Trek: Discovery.

“As we continue to expand our VFX footprint into the international market, I am extremely excited to have Ghost join Picture Shop VFX,” says Bill Romeo, president of Picture Head Holdings.

Ghost’s Christensen says the studio takes up three floors and 13,000 square feet in a “vintage and beautifully renovated office building” in Copenhagen. Its main tools are Autodesk Maya, Foundry Nuke and SideFX Houdini.

“We are really looking forward to a tight-knit collaboration with all the VFX teams in the Picture Shop group,” says Christensen. “Right now, Ghost will continue servicing current clients and projects, but we’re really looking forward to exploring the massive potential of being part of a larger, international family.”

Picture Shop VFX is a division of Picture Head Holdings, which has locations in Los Angeles, Vancouver, the United Kingdom and Denmark.

Main Image: Ghost artists at work.

Conductor Companion app targets VFX boutiques and freelancers

Conductor Technologies has introduced Conductor Companion, a desktop app designed to simplify the use of the cloud-based rendering service. Tailored for boutique studios and freelance artists, Companion streamlines the Conductor on-ramp and rendering experience, allowing users to easily manage and download files, write commands and handle custom submissions or plug-ins from their laptops or workstations. Along with this release, Conductor has added initial support for Blender creative software.

“Conductor was originally designed to meet the needs of larger VFX studios, focusing our efforts on maximizing efficiency and scalability when many artists simultaneously leverage the platform and optimizing how Conductor hooks into those pipelines,” explains CEO Mac Moore. “As Conductor’s user base has grown, we’ve been blown away by the number of freelance artists and small studios that have come to us for help, each of which has their own unique needs. Conductor Companion is a nod to that community, bringing all the functionality and massive render resource scale of Conductor into a user-friendly app, so that artists can focus on content creation versus pipeline management. And given that focus, it was a no-brainer to add Blender support, and we are eager to serve the passionate users of that product.”

Moore reports that this app will be the foundation of Conductor’s Intelligence Hub in the near future, “acting as a gateway to more advanced functionality like Shot Analytics and Intelligent Bid Assist. These features will leverage AI and Conductor’s cloud knowledge to help owners and freelancers make more informed business decisions as it pertains to project-to-project rendering financials.”
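For a sense of what a scripted submission might involve, here is a hypothetical sketch of a render-job payload. The field names are illustrative, not Conductor’s actual API:

```python
# Hypothetical sketch of a cloud-render job description -- the field
# names are illustrative, not Conductor's actual API.
import json

job = {
    "project": "short_film_a",
    "software": {"name": "blender", "version": "2.81"},
    "scene_file": "/shots/sh010/lighting/sh010_lgt_v012.blend",
    "frames": "1001-1096",
    "instance_type": "16-core",    # size of each cloud render node
    "output_path": "/renders/sh010/",
    "upload_dependencies": True,   # textures, caches, linked files
}

# A desktop app like Companion would validate the payload, upload the
# dependencies and hand the job off to the rendering service.
print(json.dumps(job, indent=2))
```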

Conductor Companion is currently in public beta. You can download the app here.

In addition to Blender, applications currently supported by Conductor include Autodesk Maya and Arnold; Foundry’s Nuke, Cara VR, Katana, Modo and Ocula; Chaos Group’s V-Ray; Pixar’s RenderMan; Isotropix’s Clarisse; Golaem; Ephere’s Ornatrix; Yeti; and Miarmy.

The Mill opens boutique studio in Berlin

Technicolor’s The Mill has officially launched in Berlin. The new boutique studio sits in the heart of the city, in the creative hub of Mitte, near many of Germany’s agencies, production companies and brands.

The Mill has been working with German clients for years. Recent projects include the Mercedes’ Bertha Benz spot with director Sebastian Strasser; Netto’s The Easter Surprise, directed in-house by The Mill; and BMW The 8 with director Daniel Wolfe. The new studio will bring The Mill’s full range of creative services from color to experiential and interactive, as well as visual effects and design.

The Mill Berlin crew

Creative director Greg Spencer will lead the creative team. He is a multi-award winning creative, having won several VES, Cannes Lions and British Arrow awards. His recent projects include Carlsberg’s The Lake, PlayStation’s This Could Be You and Eve Cuddly Toy. Spencer also played a role in some of Mill Film’s major titles. He was the 2D supervisor for Les Misérables and also worked on the Lord of the Rings trilogy. His resume also includes campaigns for brands such as Nike and Samsung.

Executive producer Justin Stiebel moves from The Mill London, where he has been since early 2014, to manage client relationships and new business. Since joining the company, Stiebel has produced spots such as Audi’s Next Level and Mini’s “The Faith of a Few” campaign. He has also collaborated with directors such as Sebastian Strasser, Markus Walter and Daniel Wolfe on brands like Mercedes, Audi and BMW.

Sean Costelloe is managing director of The Mill London and The Mill Berlin.

Main Image Caption: (L-R) Justin Stiebel and Greg Spencer

Directing Olly’s ‘Happy Inside Out’ campaign

How do you express how vitamins make you feel? Well, production company 1stAveMachine partnered with independent creative agency Yard NYC to develop the stylized “Happy Inside Out” campaign for Olly multivitamin gummies to show just that.

Beauty

The directing duo of Erika Zorzi and Matteo Sangalli, known as Mathery, highlighted the brand’s products and benefits by using rich textures, colors and lighting. They shot on an ARRI Alexa Mini. “Our vision was to tell a cohesive narrative, where each story of the supplements spoke the same visual language,” Mathery explains. “We created worlds where everything is possible and sometimes took each product’s concept to the extreme and other times added some romance to it.”

Each spot imagines various benefits of taking Olly products. The side-scrolling Energy, with its green palette, shows a woman jumping and flipping through life’s everyday challenges, from her home to work, doing laundry and going to the movies. Beauty, with its pink palette, features another woman “feeling beautiful” while turning the heads of a parliament of owls. Stress, with its purple/blue palette, features a woman tied up in a giant ball of yarn; as she unspools herself, the things that were tying her up spin away. In the purple-shaded Sleep, a woman lies in bed pulling off layer after layer of sleep masks until she just happily sleeps.

Sleep

The spots were shot with minimal VFX, other than a few greenscreen moments, and the team found itself making decisions on the fly, constantly managing logistics for stunt choreography, animal performances and wardrobe. Jogger Studios provided the VFX using Autodesk Flame for conform, cleanup and composite work. Adobe After Effects was used for all of the end tag animation. Cut+Run edited the campaign.

According to Mathery, “The acrobatic moves and obstacle pieces in the Energy spot were rehearsed on the same day of the shoot. We had to be mindful because the action was physically demanding on the talent. With the Beauty spot, we didn’t have time to prepare with the owls. We had no idea if they would move their heads on command or try to escape and fly around the whole time. For the Stress spot, we experimented with various costume designs and materials until we reached a look that humorously captured the concept.”

The campaign marks Mathery’s second collaboration with Yard NYC and Olly, who brought the directing team into the fold very early on, during the initial stages of the project. This familiarity gave everyone plenty of time to let the ideas breathe.

VES Awards: The Lion King and Alita earn five noms each

The Visual Effects Society (VES) has announced its nominees for the 18th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games, as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life. Alita: Battle Angel and The Lion King lead the features with five nominations each; Toy Story 4 is the top animated film contender, also with five; and Game of Thrones and The Mandalorian tie to lead the broadcast field with six nominations each.

Nominees in 25 categories were selected by VES members via events hosted by 11 VES sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on January 29 at the Beverly Hilton Hotel. The VES Lifetime Achievement Award will be presented to Academy, DGA and Emmy Award-winning director-producer-screenwriter Martin Scorsese. The VES Visionary Award will be presented to director-producer-screenwriter Roland Emmerich, and the VES Award for Creative Excellence will be given to visual effects supervisor Sheena Duggal. Award-winning actor-comedian-author Patton Oswalt will once again host the event.

The nominees for the 18th Annual VES Awards in 25 categories are:

 

Outstanding Visual Effects in a Photoreal Feature

 

ALITA: BATTLE ANGEL

Richard Hollander

Kevin Sherwood

Eric Saindon

Richard Baneham

Bob Trevino

 

AVENGERS: ENDGAME

Daniel DeLeeuw

Jen Underdahl

Russell Earl

Matt Aitken

Daniel Sudick

 

GEMINI MAN

Bill Westenhofer

Karen Murphy-Mundell

Guy Williams

Sheldon Stopsack

Mark Hawker

 

STAR WARS: THE RISE OF SKYWALKER

Roger Guyett

Stacy Bissell

Patrick Tubach

Neal Scanlan

Dominic Tuohy

 

THE LION KING

Robert Legato

Tom Peitzman

Adam Valdez

Andrew R. Jones

 

Outstanding Supporting Visual Effects in a Photoreal Feature

 

1917

Guillaume Rocheron

Sona Pak

Greg Butler

Vijay Selvam

Dominic Tuohy

 

FORD V FERRARI

Olivier Dumont

Kathy Siegel

Dave Morley

Malte Sarnes

Mark Byers

 

JOKER

Edwin Rivera

Brice Parker

Mathew Giampa

Bryan Godwin

Jeff Brink

 

THE AERONAUTS

Louis Morin

Annie Godin

Christian Kaestner

Ara Khanikian

Mike Dawson

 

THE IRISHMAN

Pablo Helman

Mitch Ferm

Jill Brooks

Leandro Estebecorena

Jeff Brink

 

Outstanding Visual Effects in an Animated Feature

 

FROZEN 2

Steve Goldberg

Peter Del Vecho

Mark Hammel

Michael Giaimo

 

KLAUS

Sergio Pablos

Matthew Teevan

Marcin Jakubowski

Szymon Biernacki

 

MISSING LINK

Brad Schiff

Travis Knight

Steve Emerson

Benoit Dubuc

 

THE LEGO MOVIE 2

David Burgess

Tim Smith

Mark Theriault

John Rix

 

TOY STORY 4

Josh Cooley

Mark Nielsen

Bob Moyer

Gary Bruins

 

Outstanding Visual Effects in a Photoreal Episode

 

GAME OF THRONES; The Bells

Joe Bauer

Steve Kullback

Ted Rae

Mohsen Mousavi

Sam Conway

 

HIS DARK MATERIALS; The Fight to the Death

Russell Dodgson

James Whitlam

Shawn Hillier

Robert Harrington

 

LADY AND THE TRAMP

Robert Weaver

Christopher Raimo

Arslan Elver

Michael Cozens

Bruno Van Zeebroeck

 

LOST IN SPACE – Episode: Ninety-Seven

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Juri Stanossek

Paul Benjamin

 

STRANGER THINGS – Chapter Six: E Pluribus Unum

Paul Graff

Tom Ford

Michael Maher Jr.

Martin Pelletier

Andy Sowers

 

THE MANDALORIAN; The Child

Richard Bluff

Abbigail Keller

Jason Porter

Hayden Jones

Roy Cancinon

 

Outstanding Supporting Visual Effects in a Photoreal Episode

 

CHERNOBYL; 1:23:45

Max Dennison

Lindsay McFarlane

Clare Cheetham

Paul Jones

Claudius Christian Rauch

 

LIVING WITH YOURSELF; Nice Knowing You

Jay Worth

Jacqueline VandenBussche

Chris Wright

Tristan Zerafa

 

SEE; Godflame

Adrian de Wet

Eve Fizzinoglia

Matthew Welford

Pedro Sabrosa

Tom Blacklock

 

THE CROWN; Aberfan

Ben Turner

Reece Ewing

David Fleet

Jonathan Wood

 

VIKINGS; What Happens in the Cave

Dominic Remane

Mike Borrett

Ovidiu Cinazan

Tom Morrison

Paul Byrne

 

Outstanding Visual Effects in a Real-Time Project

 

Call of Duty Modern Warfare

Charles Chabert

Chris Parise

Attila Zalanyi

Patrick Hagar

 

Control

Janne Pulkkinen

Elmeri Raitanen

Matti Hämäläinen

James Tottman

 

Gears 5

Aryan Hanbeck

Laura Kippax

Greg Mitchell

Stu Maxwell

 

Myth: A Frozen Tale

Jeff Gipson

Nicholas Russell

Brittney Lee

Jose Luis Gomez Diaz

 

Vader Immortal: Episode I

Ben Snow

Mike Doran

Aaron McBride

Steve Henricks

 

Outstanding Visual Effects in a Commercial

 

Anthem Conviction

Viktor Muller

Lenka Likarova

Chris Harvey

Petr Marek

 

BMW Legend

Michael Gregory

Christian Downes

Tim Kafka

Toya Drechsler

 

Hennessy: The Seven Worlds

Carsten Keller

Selcuk Ergen

Kiril Mirkov

William Laban

 

PlayStation: Feel The Power of Pro

Sam Driscoll

Clare Melia

Gary Driver

Stefan Susemihl

 

Purdey’s: Hummingbird

Jules Janaud

Emma Cook

Matthew Thomas

Philip Child

 

Outstanding Visual Effects in a Special Venue Project

 

Avengers: Damage Control

Michael Koperwas

Shereif Fattouh

Ian Bowie

Kishore Vijay

Curtis Hickman

 

Jurassic World: The Ride

Hayden Landis

Friend Wells

Heath Kraynak

Ellen Coss

 

Millennium Falcon: Smugglers Run

Asa Kalama

Rob Huebner

Khatsho Orfali

Susan Greenhow

 

Star Wars: Rise of the Resistance

Jason Bayever

Patrick Kearney

Carol Norton

Bill George

 

Universal Sphere

James Healy

Morgan MacCuish

Ben West

Charlie Bayliss

 

Outstanding Animated Character in a Photoreal Feature

 

ALITA: BATTLE ANGEL; Alita

Michael Cozens

Mark Haenga

Olivier Lesaint

Dejan Momcilovic

 

AVENGERS: ENDGAME; Smart Hulk

Kevin Martel

Ebrahim Jahromi

Sven Jensen

Robert Allman

 

GEMINI MAN; Junior

Paul Story

Stuart Adcock

Emiliano Padovani

Marco Revelant

 

THE LION KING; Scar

Gabriel Arnold

James Hood

Julia Friedl

Daniel Fotheringham

 


Outstanding Animated Character in an Animated Feature

 

FROZEN 2; The Water Nøkk

Svetla Radivoeva

Marc Bryant

Richard E. Lehmann

Cameron Black

 

KLAUS; Jesper

Yoshimishi Tamura

Alfredo Cassano

Maxime Delalande

Jason Schwartzman

 

MISSING LINK; Susan

Rachelle Lambden

Brenda Baumgarten

Morgan Hay

Benoit Dubuc

 

TOY STORY 4; Bo Peep

Radford Hurn

Tanja Krampfert

George Nguyen

Becki Rocha Tower

 

Outstanding Animated Character in an Episode or Real-Time Project

 

LADY AND THE TRAMP; Tramp

Thiago Martins

Arslan Elver

Stanislas Paillereau

Martine Chartrand

 

STRANGER THINGS 3; Tom/Bruce Monster

Joseph Dubé-Arsenault

Antoine Barthod

Frederick Gagnon

Xavier Lafarge

 

THE MANDALORIAN; The Child; Mudhorn

Terry Bannon

Rudy Massar

Hugo Leygnac

 

THE UMBRELLA ACADEMY; Pilot; Pogo

Aidan Martin

Craig Young

Olivier Beierlein

Laurent Herveic

 

Outstanding Animated Character in a Commercial

 

Apex Legends; Meltdown; Mirage

Chris Bayol

John Fielding

Derrick Sesson

Nole Murphy

 

Churchill; Churchie

Martino Madeddu

Philippe Moine

Clement Granjon

Jon Wood

 

Cyberpunk 2077; Dex

Jonas Ekman

Jonas Skoog

Marek Madej

Grzegorz Chojnacki

 

John Lewis; Excitable Edgar; Edgar

Tim van Hussen

Diarmid Harrison-Murray

Amir Bazzazi

Michael Diprose

 

Outstanding Created Environment in a Photoreal Feature

 

ALADDIN; Agrabah

Daniel Schmid

Falk Boje

Stanislaw Marek

Kevin George

 

ALITA: BATTLE ANGEL; Iron City

John Stevenson-Galvin

Ryan Arcus

Mathias Larserud

Mark Tait

 

MOTHERLESS BROOKLYN; Penn Station

John Bair

Vance Miller

Sebastian Romero

Steve Sullivan

 

STAR WARS: THE RISE OF SKYWALKER; Pasaana Desert

Daniele Bigi

Steve Hardy

John Seru

Steven Denyer

 

THE LION KING; The Pridelands

Marco Rolandi

Luca Bonatti

Jules Bodenstein

Filippo Preti

 

Outstanding Created Environment in an Animated Feature

 

FROZEN 2; Giants’ Gorge

Samy Segura

Jay V. Jackson

Justin Cram

Scott Townsend

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; The Hidden World

Chris Grun

Ronnie Cleland

Ariel Chisholm

Philippe Brochu

 

MISSING LINK; Passage to India Jungle

Oliver Jones

Phil Brotherton

Nick Mariana

Ralph Procida

 

TOY STORY 4; Antiques Mall

Hosuk Chang

Andrew Finley

Alison Leaf

Philip Shoebottom

 

Outstanding Created Environment in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Iron Throne; Red Keep Plaza

Carlos Patrick DeLeon

Alonso Bocanegra Martinez

Marcela Silva

Benjamin Ross

 

LOST IN SPACE; Precipice; The Trench

Philip Engström

Benjamin Bernon

Martin Bergquist

Xuan Prada

 

THE DARK CRYSTAL: AGE OF RESISTANCE; The Endless Forest

Sulé Bryan

Charles Chorein

Christian Waite

Martyn Hawkins

 

THE MANDALORIAN; Nevarro Town

Alex Murtaza

Yanick Gaudreau

Marco Tremblay

Maryse Bouchard

 

Outstanding Virtual Cinematography in a CG Project

 

ALITA: BATTLE ANGEL

Emile Ghorayeb

Simon Jung

Nick Epstein

Mike Perry

 

THE LION KING

Robert Legato

Caleb Deschanel

Ben Grossmann

AJ Sciutto

 

THE MANDALORIAN; The Prisoner; The Roost

Richard Bluff

Jason Porter

Landis Fields IV

Baz Idoine

 

TOY STORY 4

Jean-Claude Kalache

Patrick Lin

 

Outstanding Model in a Photoreal or Animated Project

 

LOST IN SPACE; The Resolute

Xuan Prada

Jason Martin

Jonathan Vårdstedt

Eric Andersson

 

MISSING LINK; The Manchuria

Todd Alan Harvey

Dan Casey

Katy Hughes

 

THE MAN IN THE HIGH CASTLE; Rocket Train

Neil Taylor

Casi Blume

Ben McDougal

Chris Kuhn

 

THE MANDALORIAN; The Sin; The Razorcrest

Doug Chiang

Jay Machado

John Goodson

Landis Fields IV

 

Outstanding Effects Simulations in a Photoreal Feature

 

DUMBO; Bubble Elephants

Sam Hancock

Victor Glushchenko

Andrew Savchenko

Arthur Moody

 

SPIDER-MAN: FAR FROM HOME; Molten Man

Adam Gailey

Jacob Santamaria

Jacob Clark

Stephanie Molk

 


STAR WARS: THE RISE OF SKYWALKER

Don Wong

Thibault Gauriau

Goncalo Cababca

Francois-Maxence Desplanques

 

THE LION KING

David Schneider

Samantha Hiscock

Andy Feery

Kostas Strevlos

 

Outstanding Effects Simulations in an Animated Feature

 

ABOMINABLE

Alex Timchenko

Domin Lee

Michael Losure

Eric Warren

 

FROZEN 2

Erin V. Ramos

Scott Townsend

Thomas Wickes

Rattanin Sirinaruemarn

 

HOW TO TRAIN YOUR DRAGON: THE HIDDEN WORLD; Water and Waterfalls

Derek Cheung

Baptiste Van Opstal

Youxi Woo

Jason Mayer

 

TOY STORY 4

Alexis Angelidis

Amit Baadkar

Lyon Liew

Michael Lorenzen

 

Outstanding Effects Simulations in an Episode, Commercial, or Real-Time Project

 

GAME OF THRONES; The Bells

Marcel Kern

Paul Fuller

Ryo Sakaguchi

Thomas Hartmann

 

Hennessy: The Seven Worlds

Selcuk Ergen

Radu Ciubotariu

Andreu Lucio

Vincent Ullmann

 

LOST IN SPACE; Precipice; Water Planet

Juri Bryan

Hugo Medda

Kristian Olsson

John Perrigo

 

STRANGER THINGS 3; Melting Tom/Bruce

Nathan Arbuckle

Christian Gaumond

James Dong

Aleksandr Starkov

 

THE MANDALORIAN; The Child; Mudhorn

Xavier Martin Ramirez

Ian Baxter

Fabio Siino

Andrea Rosa

 

Outstanding Compositing in a Feature

 

ALITA: BATTLE ANGEL

Adam Bradley

Carlo Scaduto

Hirofumi Takeda

Ben Roberts

 

AVENGERS: ENDGAME

Tim Walker

Blake Winder

Tobias Wiesner

Joerg Bruemmer

 

CAPTAIN MARVEL; Young Nick Fury

Trent Claus

David Moreno Hernandez

Jeremiah Sweeney

Yuki Uehara

 

STAR WARS: THE RISE OF SKYWALKER

Jeff Sutherland

John Galloway

Sam Bassett

Charles Lai

 

THE IRISHMAN

Nelson Sepulveda

Vincent Papaix

Benjamin O’Brien

Christopher Doerhoff

 

Outstanding Compositing in an Episode

 

GAME OF THRONES; The Bells

Sean Heuston

Scott Joseph

James Elster

Corinne Teo

 

GAME OF THRONES; The Long Night; Dragon Ground Battle

Mark Richardson

Darren Christie

Nathan Abbott

Owen Longstaff

 

STRANGER THINGS 3; Starcourt Mall Battle

Simon Lehembre

Andrew Kowbell

Karim El-Masry

Miklos Mesterhazy

 

WATCHMEN; Pilot; Looking Glass

Nathaniel Larouche

Iyi Tubi

Perunika Yorgova

Mitchell Beaton

 

Outstanding Compositing in a Commercial

 

BMW Legend

Toya Drechsler

Vivek Tekale

Guillaume Weiss

Alexander Kulikov

 

Feeding America; I Am Hunger in America

Dan Giraldo

Marcelo Pasqualino

Alexander Koester

 

Hennessy; The Seven Worlds

Rod Norman

Guillaume Weiss

Alexander Kulikov

Alessandro Granella

 

PlayStation: Feel the Power of Pro

Gary Driver

Stefan Susemihl

Greg Spencer

Theajo Dharan

 

Outstanding Special (Practical) Effects in a Photoreal or Animated Project

 

ALADDIN; Magic Carpet

Mark Holt

Jay Mallet

Will Wyatt

Dickon Mitchell

 

GAME OF THRONES; The Bells

Sam Conway

Terry Palmer

Laurence Harvey

Alastair Vardy

 

TERMINATOR: DARK FATE

Neil Corbould

David Brighton

Ray Ferguson

Keith Dawson

 

THE DARK CRYSTAL: THE AGE OF RESISTANCE; She Knows All the Secrets

Sean Mathiesen

Jon Savage

Toby Froud

Phil Harvey

 

Outstanding Visual Effects in a Student Project

 

DOWNFALL

Matias Heker

Stephen Moroz

Bradley Cocksedge

 

LOVE AND FIFTY MEGATONS

Denis Krez

Josephine Roß

Paulo Scatena

Lukas Löffler

 

OEIL POUR OEIL

Alan Guimont

Thomas Boileau

Malcom Hunt

Robin Courtoise

 

THE BEAUTY

Marc Angele

Aleksandra Todorovic

Pascal Schelbli

Noel Winzen

 

Recreating the Vatican and Sistine Chapel for Netflix’s The Two Popes

The Two Popes, directed by Fernando Meirelles, stars Anthony Hopkins as Pope Benedict XVI and Jonathan Pryce as current pontiff Pope Francis in a story about one of the most dramatic transitions of power in the Catholic Church’s history. The film follows a frustrated Cardinal Bergoglio (the future Pope Francis) who in 2012 requests permission from Pope Benedict to retire because of his issues with the direction of the church. Instead, facing scandal and self-doubt, the introspective Benedict summons his harshest critic and future successor to Rome to reveal a secret that would shake the foundations of the Catholic Church.

London’s Union was approached in May 2017 and supervised visual effects on location in Argentina and Italy over several months. A large proportion of the film takes place within the walls of Vatican City. The Vatican was not involved in the production and the team had very limited or no access to some of the key locations.

Under the direction of production designer Mark Tildesley, the production replicated parts of the Vatican at Rome’s Cinecittà Studios, including a life-size, open-ceiling Sistine Chapel, which took two months to build.

The team LIDAR-scanned everything available and set about amassing as much reference material as possible — photographing from a permitted distance, scanning the set builds and buying every photographic book they could lay their hands on.

From this material, the team set about building 3D models — created in Autodesk Maya — of St. Peter’s Square, the Basilica and the Sistine Chapel. The environments team was tasked with texturing all of these well-known locations using digital matte painting techniques, including recreating Michelangelo’s masterpiece on the ceiling of the Sistine Chapel.

The story centers on two key changes of pope, in 2005 and 2013. Those events attracted huge attention, filling St. Peter’s Square with people eager to discover the identity of the new pope and celebrate his ascension, while news crews from around the world camped out to provide coverage for Catholics all over the world.

To recreate these scenes, the crew shot at a school in Rome (Ponte Mammolo) that has the same floor pattern as the square. A cast of 300 extras was shot in blocks, in different positions and at different times of day, with costume tweaks (including the addition of umbrellas) to build a library flexible enough to recreate these moments in post at different times of day and in different weather conditions.

Union also called on Clear Angle Studios to individually scan 50 extras to provide additional options for the VFX team. This was an ambitious crowd project: the team couldn’t shoot in the actual location, and the end result had to stand up at 4K in very close proximity to the camera. Union designed a Houdini-based system to deal with the number of assets and clothing in such a way that the studio could easily art-direct the agents as individuals, allow the director to choreograph them and deliver a believable result.
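As a hedged illustration of that idea, the hypothetical Python sketch below (not Union’s actual Houdini network) treats each crowd agent as a point that deterministically draws a scanned body, costume variant and mocap clip from its seed, so any individual can still be singled out and overridden by hand:

```python
# Hypothetical sketch of a seeded, art-directable crowd assignment --
# not Union's actual Houdini setup.
import random

NUM_SCANS, NUM_COSTUMES, NUM_CLIPS = 50, 8, 12  # library sizes

def assign_agent(point_id, overrides=None):
    """Deterministically dress and animate one crowd agent."""
    rng = random.Random(point_id)  # stable per-agent seed
    agent = {
        "scan": rng.randrange(NUM_SCANS),        # which scanned extra
        "costume": rng.randrange(NUM_COSTUMES),  # costume variant
        "clip": rng.randrange(NUM_CLIPS),        # mocap walk/idle cycle
        "phase": rng.random(),                   # desync the cycles
    }
    agent.update(overrides or {})  # art-direct this one individual
    return agent

# The crowd regenerates identically every run; agent 137 alone is
# overridden to a specific clip for the director.
crowd = [assign_agent(i) for i in range(300)]
crowd[137] = assign_agent(137, overrides={"clip": 3})
```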

Union also conducted several in-house motion capture shoots to provide specific animation cycles matched to the occasions being recreated. This provided even more authentic-looking crowds for the post team.

Union worked on a total of 288 VFX shots, including greenscreens, set extensions, window reflections, muzzle flashes, fog and rain, and a storm that included a lightning strike on the Basilica.

In addition, the team did a significant amount of de-aging work to accommodate the film’s eight-year main narrative timeline as well as a long period in Pope Francis’ younger years.

VFX pipeline trends for 2020

By Simon Robinson

A new year, more trends — some burgeoning, and others that have been dominating industry discussions for a while. Underpinning each is the common sentiment that 2020 seems especially geared toward streamlining artist workflows, more so than ever before.

There’s an increasing push for efficiency; not just through hardware but through better business practices and solutions to throughput problems.

Exciting times lie ahead for artists and studios everywhere. I believe the trends below form the pillars of this key industry mission for 2020.

Machine Learning Will Make Better, Faster Artists
Machines are getting smarter. AI software is becoming more universally applied in the VFX industry, and with this comes benefits and implications for artist workflows.

As adoption of machine learning increases, the core challenge for 2020 lies in artist direction and participation, especially since the M.O. of machine learning is its ability to solve entire problems on its own.

The issue is this: if you rely on something 99.9% of the time, what happens if it fails in that extra 0.1%? Can you fix it? While ML means less room for human error, will people have the skills to fix something gone wrong if they don’t need them anymore?

So this issue necessitates building a bridge between artist and algorithm. ML can do the hard work, giving artists the time to get creative and perfect their craft in the final stages.

Gemini Man

We’ve seen this pay off in the face of accessible and inexpensive deepfake technology giving rise to “quick and easy” deepfakes, which rely entirely on ML. In contrast, crossing the uncanny valley remains the realm of highly skilled artists, requiring thought, artistry and care to produce something that tricks the human eye. Weta Digital’s work on Gemini Man is a prime example.

As massive projects like these continue to emerge, studios strive for efficiency and being able to produce at scale. Since ML and AI are all about data, the manipulation of both can unlock endless potential for the speed and scale at which artists can operate.

Foundry’s own efforts in this regard revolve around improving the persistence and availability of captured data. We’re figuring out how to deliver data in a more sensible way downstream, from initial capture to timestamping and synchronization, and then final arrangement in an easy, accessible format.

Underpinning our research into this is Universal Scene Description (USD), which you’ve probably heard about…

USD Becomes Uniform
Universal Scene Description has legacy and prominence from its development at Pixar, but its open-sourcing and industry adoption are more recent, and the format is still maturing for wider pipelines and workflows.

New iterations of USD are now being released at a three-month cadence, where before it used to be every two months. Each new release brings improvements as growing pains and teething issues are ironed out, and the slower pace provides some respite for artists who rely on specific versions of USD.

But challenges still exist, namely mismatched USD pipelines and scattered documentation, which means that solutions can’t be easily found. Currently, no one is officially rubber-stamping USD best practices.

Capturing volumetric datasets for future testing.

To solve this issue, the industry needs a universal application of USD so it can exist in pipelines as an application-standard plugin to prevent an explosion of multiple variants of USD, which may cause further confusion.

If this comes off, documentation could be made uniform, and information could be shared across software, teams and studios with even more ease and efficiency.

It’ll make Foundry’s life easier, too. USD is vital to us to power interoperability in our products, allowing clients to extend their software capabilities on top of what we do ourselves.

At Foundry, our lighting tool, Katana, uses USD Hydra tech as the basis for much improved viewer experiences. Most recently, its Advanced Viewport Technology aims at delivering a consistent visual experience across software.

This wouldn’t be possible without USD. Even in its current state, the benefits are tangible, and its core principles — flexibility, modularity, interoperability  — underpin 2020’s next big trends.

Artist Pipelines Will Look More Iterative 
The industry is asking, “How can you be more iterative through everything?” Calls for this will only grow louder as we move into next year.

There’s an increasing push for efficiency as the common sentiment prevails: too much work, not enough people to do it. While maximizing hardware usage might seem like a go-to solution to this, the actual answer lies in solving throughput problems by improving workflows and facilitating sharing between studios and artists.

Increasingly, VFX pipelines don’t work well as a waterfall structure anymore, where each stage is done, dusted and passed on to the next department in a structured, rigid process.

Instead, artists are thinking about how data persists throughout their pipeline and how to make use of it in a smart way. The main aim is to iterate on everything simultaneously for a more fluid, consistent experience across teams and studios.

USD helps tremendously here, since it captures all of the data layers and iterations in one. Artists can go to any one point in their pipeline, change different aspects of it, and it’s all maintained in one neat “chunk.” No waterfalls here.
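The open-source USD Python bindings make that layered, non-destructive model concrete. In this minimal sketch (file names hypothetical), one department authors a base layer and another records its changes in a stronger layer that overrides it without ever touching the base file:

```python
# Minimal sketch of USD's layered, non-destructive editing, using the
# open-source pxr Python bindings; file names are hypothetical.
from pxr import Usd, UsdGeom, Sdf

# Department A authors the base asset in its own layer.
base = Usd.Stage.CreateNew("asset_base.usda")
ball = UsdGeom.Sphere.Define(base, "/Asset/ball")
ball.GetRadiusAttr().Set(1.0)
base.Save()

# Department B works in a separate layer that sublayers the base.
# Its opinions are stronger, but the base file is never modified.
working = Sdf.Layer.CreateNew("asset_lighting.usda")
working.subLayerPaths.append("asset_base.usda")
stage = Usd.Stage.Open(working)

# This edit is recorded only in asset_lighting.usda, as an override.
over = UsdGeom.Sphere(stage.GetPrimAtPath("/Asset/ball"))
over.GetRadiusAttr().Set(2.0)
working.Save()

# Remove the lighting layer and the original radius comes back --
# every iteration persists, and nothing downstream is baked in.
```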

Compositing in particular benefits from this new style of working. Being able to easily review in context lends an immense amount of efficiency and creativity to artists working in post production.

That’s Just the Beginning
Other drivers for artist efficiency that may gain traction in 2020 include: working across multiple shots (currently featured in Nuke Studio), process automation, and volumetric-style workflows to let artists work with 3D representations featuring depth and volume.

The bottom line is that 2020 looks to be the year of the artist — and we can’t wait.


Simon Robinson is the co-founder and chief scientist at Foundry.

ILM’s Pablo Helman on The Irishman‘s visual effects

By Karen Moltenbrey

When a film stars Robert De Niro, Joe Pesci and Al Pacino, well, expectations are high. These are no ordinary actors, and Martin Scorsese is no ordinary director. These are movie legends. And their latest project, Netflix’s The Irishman, is no ordinary film. It features cutting-edge de-aging technology from visual effects studio Industrial Light & Magic (ILM) and earned the film’s VFX supervisor, Pablo Helman, an Oscar nomination.

The Irishman, adapted from the book “I Heard You Paint Houses,” tells the story of an elderly Frank “The Irishman” Sheeran (De Niro), whose life is nearing the end, as he looks back on his earlier years as a truck driver-turned-mob hitman for Russell Bufalino (Pesci) and family. While reminiscing, he recalls the role he played in the disappearance of his longtime friend, Jimmy Hoffa (Al Pacino), former president of the Teamsters, who famously disappeared in 1975 at the age of 62, and whose body has never been found.

The film contains 1,750 visual effects shots, most of which involve the de-aging of the three actors. In the film, the actors are depicted at various stages of their lives — mostly younger than their present age. Pacino is the least aged of the three, since he enters the story about a third of the way through, playing Hoffa from the late 1950s to his disappearance nearly two decades later. Pacino was 78 at the time of filming and plays Hoffa at various ages, from 44 to 62. De Niro, who was 76 at the time of filming, plays Sheeran at certain points from age 20 to 80. Pesci plays Bufalino between ages 53 and 83.

Makeup was used to age De Niro for the scenes of the significantly older, introspective Sheeran. Making the younger versions of all three actors, however, was much more difficult. Current technology makes it possible to create believable younger digital doubles, but it typically requires actors to perform alone on a soundstage wearing facial markers and helmet cameras, or requires artists to enhance or create performances with CG animation. That simply would not do for this film: neither the actors nor Scorsese wanted the tech to interfere with the acting process in any way, and recreating the performances after the fact was also off the table.

“They wanted a technology that was non-intrusive and one that would be completely separate from the performances. They didn’t want markers on their faces, they did not want to wear helmet cams and they did not want to wear the gray [markered] pajamas that we normally use,” says VFX supervisor Helman. “They also wanted to be on set with theatrical lighting, and there wasn’t going to be any kind of re-shoots of performances outside the set.”

In a nutshell, ILM needed a markerless approach that occurred on-set during filming. To this end, ILM spent two years developing Flux, a new camera system and software, whereby a three-camera rig would extract performance data from lighting and textures captured on set and translate that to 3D computer-generated versions of the actors’ younger selves.

The camera rig was developed in collaboration with The Irishman’s DP, Rodrigo Prieto, and camera maker ARRI. It included two high-resolution (3.8K) Alexa Mini witness cameras that were modified with infrared rings; the two cameras were attached to and synched up with the primary sensor camera (the director’s Red Helium 8K camera). The infrared light from the two cameras was necessary to help neutralize any shadows on the actors’ faces, since Flux does not handle shadows well, yet remained “unseen” by the production camera.

Flux, meanwhile, used that camera information and translated it into a deformable geometry mesh. “Flux takes that information from the three cameras and compares it to the lighting on set, deforms the geometry and changes the geometry and the shape of the actors on a frame-by-frame basis,” says Helman.

In fact, ILM continued to develop the software as it was working on the film. “It’s kind of like running the Grand Prix while you’re building the Ferrari,” Helman adds. “Then, you get better and better, and faster and faster, and your software gets better, and you are solving problems and learning from the software. Yes, it took a long time to do, but we knew we had time to do it and make it work.”

Pablo Helman (right) on The Irishman set.

At the beginning of the project, prior to filming, the actors were digitally scanned performing a range of facial movements using ILM’s Medusa system, as well as on a light stage, which captured texture information under different lighting conditions. All that data was then used to create a contemporary 3D digital double of each actor. The models were sculpted in Autodesk’s Maya and with proprietary tools running on ILM’s Zeno platform.

ILM applied the 3D models to the exact performance data of each actor captured on set with the special camera rig, so the physical performances were now digital. No keyframe animation was used. However, the characters were still contemporary to the actors’ ages.

As Helman explains, after the performance, the footage was returned to ILM, where an intense matchmove was done of the actors’ bodies and heads. “The first thing that got matchmoved was the three cameras that were documenting what the actor was doing in the performance, and then we matchmoved the lighting instruments that were lighting the actor because Flux needs that lighting information in order to work,” he says.

Helman likens Flux to a black box full of little drawers where various aspects are inserted, like the layout, the matchimation, the lighting information and so forth, and it combines all that information to come up with the geometry for the digital double.

The actual de-aging occurs in modeling, using a combination of libraries that were created for each actor and are connected to and referenced by Flux. Modelers created the age variations, starting with the youngest version of each person; variants were then generated gradually, using a slider to move through life’s timeline (see the sketch below). The process was labor-intensive, as artists also had to erase the effects of time, such as wrinkles and age spots.
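ILM’s tools are proprietary, but the age slider Helman describes can be pictured as piecewise-linear blending between sculpted age variants that share a single mesh topology. A minimal numpy sketch, under that assumption:

```python
import numpy as np

def age_blend(variants, ages, target_age):
    """Blend between sculpted age-variant meshes.

    variants:   list of (V, 3) vertex arrays sharing one topology
    ages:       ascending list of the age each variant represents
    target_age: where the "age slider" currently sits
    """
    ages = np.asarray(ages, dtype=float)
    t = np.clip(target_age, ages[0], ages[-1])
    # Find the bracketing pair of variants, then blend linearly.
    i = min(int(np.searchsorted(ages, t, side="right")) - 1, len(ages) - 2)
    w = (t - ages[i]) / (ages[i + 1] - ages[i])  # weight in [0, 1]
    return (1.0 - w) * variants[i] + w * variants[i + 1]

# e.g. slide a mesh to age 35 between 24- and 41-year-old sculpts:
# mesh = age_blend([v24, v41, v76], [24, 41, 76], 35)
```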

Because The Irishman is not an action movie, creating motion for decades-younger versions of the characters was not a major issue. Even so, a motion analyst was on set to work with the actors as they played the younger versions of their characters, and some visual effects work helped thin out the younger characters.

Helman points out that Scorsese stressed that he did not want to see a younger version of the actors playing roles from the past; he wanted to see younger versions of these particular characters. “He did not want to rewind the clock and see Robert De Niro as Jimmy Conway in 1990’s Goodfellas. He wanted to see De Niro as a 30-year-younger Frank Sheeran,” he explains.

When asked which actor posed the most difficulty to de-age, Helman explains that once you crack the code of capturing the performance and then retargeting the performance to a younger variation of the character, there’s little difference. Nevertheless, De Niro had the most screen time and the widest age range.

Performance capture began about 15 years ago, and Helman sees this achievement as a natural evolution of the technology. “Eventually those [facial] markers had to go away because for actors, that’s a very interesting way to work, if you really think about it. They have to try to ignore the markers and not be distracted by all the other intrusive stuff going on,” Helman says. “That time is now gone. If you let the actors do what they do, the performances will be so much better and the shots will look so much better because there is eye contact and context with another actor.”

While this technology is a quantum leap forward, there are still improvements to be made. The camera rig needs to get smaller and the software faster — and ILM is working on both aspects, Helman says. Nevertheless, the accomplishment made here is impressive and groundbreaking — the first markerless system that captures performance on set with theatrical lighting, thanks to more than 500 artists working around the world to make this happen. As a result, it opens up the door for more storytelling and acting options — not only for de-aging, but for other types of characters too.

Commenting on his Oscar nomination, Helman said, “It was an incredible, surreal experience to work with Scorsese and the actors, De Niro, Pacino and Pesci, on this movie. We are so grateful for the trust and support we got from the producers and from Netflix, and the talent and dedication of our team. We’re honored to be recognized by our colleagues with this nomination.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Shape+Light VFX boutique opens in LA with Trent, Lehr at helm


Visual effects and design boutique Shape+Light has officially launched in Santa Monica, led by managing director/creative director Rob Trent and executive producer Cara Lehr. Shape+Light provides visual effects, design and finishing services for agency and brand-direct clients. The studio, which has been quietly operating since this summer, has already delivered work for Nike, Apple, Gatorade, Lexus and Procter & Gamble.

Gatorade

Trent is no stranger to running VFX boutiques. An industry veteran, he began his career as a Flame artist, working at studios including Imaginary Forces and Digital Domain, and then at Asylum VFX as a VFX supervisor/creative director before co-founding The Mission VFX in 2010. In 2015, he established Saint Studio. During his career he has worked on big campaigns, including the launch of the Apple iPhone with David Fincher, celebrating the NFL with Nike and Michael Mann, and honoring moms with Alma Har’el and P&G for the Olympics. He has also contributed to award-winning feature films such as The Curious Case of Benjamin Button, Minority Report, X-Men and Zodiac.

Lehr is an established VFX producer with over 20 years of experience in both commercials and features. She has worked for many of LA’s leading VFX studios, including Zoic Studios, Asylum VFX, Digital Domain, Brickyard VFX and Psyop. She most recently served as EP at Method Studios, where she was on staff since 2012. She has worked on ad campaigns for brands including Apple, Microsoft, Nike, ESPN, Coca Cola, Taco Bell, AT&T, the NBA, Chevrolet and more.

Maya 2020 and Arnold 6 now available from Autodesk

Autodesk has released Autodesk Maya 2020 and Arnold 6 with Arnold GPU. Maya 2020 brings animators, modelers, riggers and technical artists a host of new tools and improvements for CG content creation, while Arnold 6 allows for production rendering on both the CPU and GPU.

Maya 2020 adds more than 60 new updates, as well as performance enhancements and new simulation features to Bifrost, the visual programming environment in Maya.

Maya 2020

Release highlights include:

— Over 60 animation features and updates to the graph editor and time slider.
— Cached Playback: New preview modes, layered dynamics caching and more efficient caching of image planes.
— Animation bookmarks: Mark, organize and navigate through specific events in time and frame playback ranges.
— Bifrost for Maya: Performance improvements, Cached Playback support and new MPM cloth constraints.
— Viewport improvements: Users can interact with and select dense geometry or a large number of smaller meshes faster in the viewport and UV editors.
— Modeling enhancements: New Remesh and Retopologize features.
— Rigging improvements: Matrix-driven workflows (see the sketch after this list), nodes for precisely tracking positions on deforming geometry and a new GPU-accelerated wrap deformer.
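As a minimal sketch of the matrix-driven workflow flagged above (the node names are illustrative), Maya 2020’s new offsetParentMatrix attribute lets a control drive a joint with no constraint network in between:

```python
import maya.cmds as cmds

# Build a toy control and joint.
ctrl = cmds.circle(name="arm_ctrl")[0]
jnt = cmds.createNode("joint", name="arm_jnt")

# One connection replaces the parentConstraint/decomposeMatrix
# network a pre-2020 rig would have needed.
cmds.connectAttr(ctrl + ".worldMatrix[0]", jnt + ".offsetParentMatrix")
```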

The Arnold GPU is based on Nvidia’s OptiX framework and takes advantage of Nvidia RTX technology. Arnold 6 highlights include:

— Unified renderer — Toggle between CPU and GPU rendering (see the sketch below).
— Lights, cameras and more — Support for OSL, OpenVDB volumes, on-demand texture loading, most LPEs, lights, shaders and all cameras.
— Reduced GPU noise — Comparable to CPU noise levels when using adaptive sampling, which has been improved to yield faster, more predictable results regardless of the renderer used.
— Optimized for Nvidia RTX hardware — Scale up rendering power when production demands it.
— New USD components — Hydra render delegate, Arnold USD procedural and USD schemas for Arnold nodes and properties are now available on GitHub.
— Performance improvements — Faster creased subdivisions, an improved Physical Sky shader and dielectric microfacet multiple scattering.

Arnold 6
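As a minimal sketch of that CPU/GPU toggle, assuming the Python bindings that ship with the Arnold SDK and a hypothetical shot.ass scene file:

```python
from arnold import *

AiBegin()
AiASSLoad("shot.ass")  # hypothetical scene export
options = AiUniverseGetOptions()
# "render_device" selects the unified renderer's backend.
AiNodeSetStr(options, "render_device", "GPU")  # or "CPU"
AiRender(AI_RENDER_MODE_CAMERA)
AiEnd()
```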

Maya 2020 and Arnold 6 are available now as standalone subscriptions or with a collection of end-to-end creative tools within the Autodesk Media & Entertainment Collection. Monthly, annual and three-year single-user subscriptions of Arnold are available on the Autodesk e-store.

Arnold GPU is also available to try with a free 30-day trial of Arnold 6. Arnold GPU is available in all supported plug-ins for Autodesk Maya, Autodesk 3ds Max, SideFX Houdini, Maxon Cinema 4D and Foundry Katana.

Storage for Visual Effects

By Karen Moltenbrey

When creating visual effects for a live-action film or television project, the artist digs right in. But not before the source files are received and backed up. Of course, during the process, storage again comes into play, as the artist’s work is saved and composited into the live-action file and then saved (and stored) yet again. At mid-sized Artifex Studios and the larger Jellyfish Pictures, two visual effects studios, storage might not be the sexiest part of the work they do, but it is vital to a successful outcome nonetheless.

Artifex Studios
An independent studio in Vancouver, BC, Artifex Studios is a small- to mid-sized visual effects facility producing film and television projects for networks, film studios and streaming services. Founded in 1997 by VFX supervisor Adam Stern, the studio has grown over the years from a one- to two-person operation to one staffed by 35 to 45 artists. During that time it has built up a lengthy and impressive resume, from Charmed, Descendants 3 and The Crossing to Mission to Mars, The Company You Keep and Apollo 18.

To handle its storage needs, Artifex uses the Qumulo QC24 four-node storage cluster for its main storage system, along with G-Tech and LaCie portable RAIDs and Angelbird Technologies and Samsung portable SSD drives. “We’ve been running [Qumulo] for several years now. It was a significant investment for us because we’re not a huge company, but it has been tremendously successful for us,” says Stern.

“The most important things for us when it comes to storage are speed, data security and minimal downtime. They’re pretty obvious things, but Qumulo offered us a system that eliminated one of the problems we had been having with the [previous] system bogging down as concurrent users were moving the files around quickly between compositors and 3D artists,” says Stern. “We have 40-plus people hitting this thing, pulling in 4K, 6K, 8K footage from it, rendering and [creating] 3D, and it just ticks along. That was huge for us.”

Of course, speed is of utmost importance, but so is maintaining the data’s safety. To this end, the new system self-monitors, taking its own snapshots to maintain its own health and making sure there are constantly rotating levels of backups. Having the ability to monitor everything about the system is a big plus for the studio as well.

Because data safety and security are non-negotiable, Artifex pairs Qumulo with Google Cloud, backing up incrementally to the cloud every night. “So while Qumulo is doing its own snapshots incrementally, we have another hard-drive system from Synology, which is more of a prosumer NAS system, whose only job is to do a local current backup,” Stern explains. “So in-house, we have two local backups between Qumulo and Synology, and then we have a third backup going to the cloud every night that’s off-site. When a project is complete, we archive it onto two sets of local hard drives, and one leaves the premises and the other is stored here.” At this point, the material is taken off the Qumulo system, and seven days later, the last of the so-called snapshots is removed.
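A rough sketch of the kind of nightly off-site sync Stern describes; the mount point and bucket are hypothetical names, and the Qumulo snapshots and Synology backup are handled by those systems themselves:

```python
import datetime
import subprocess

SOURCE = "/mnt/qumulo/projects"           # hypothetical Qumulo mount
BUCKET = "gs://studio-offsite/projects"   # hypothetical GCS bucket

def nightly_sync():
    print(f"[{datetime.datetime.now():%Y-%m-%d %H:%M}] starting off-site sync")
    # rsync only copies what changed since last night, which is what
    # makes the run incremental; -m parallelizes the transfers.
    subprocess.run(
        ["gsutil", "-m", "rsync", "-r", SOURCE, BUCKET],
        check=True,
    )

if __name__ == "__main__":
    nightly_sync()  # in practice, fired by cron or a scheduler
```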

As soon as data comes into Artifex — either via Aspera, Signiant’s Media Shuttle or hard disks — the material is immediately transferred to the Qumulo system, and then it is cataloged and placed into the studio’s ftrack database, which the studio uses for shot tracking. Then, as Stern says, the floodgates open, and all the artists, compositors, 3D team members and admin coordination team members access the material that resides on the Qumulo system.
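The cataloging step might look something like this with ftrack’s published Python API; the server URL, credentials, project and shot names here are all hypothetical:

```python
import ftrack_api

session = ftrack_api.Session(
    server_url="https://studio.ftrackapp.com",  # hypothetical
    api_user="ingest",
    api_key="...",
)

# File the newly ingested plate under its show and shot.
project = session.query('Project where name is "show_name"').one()
shot = session.create("Shot", {"name": "ep101_0010", "parent": project})
session.commit()
```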

Desktops at the studio have local storage, generally an SSD built into the machine, but as Stern points out, that is a temporary solution used by the artists while working on a specific shot, not to hold studio data.

Artifex generally works on a handful of projects simultaneously, including the Nickelodeon horror anthology Are You Afraid of the Dark? “Everything we do here requires storage, and we’re always dealing with high-resolution footage, and that project was no exception,” says Stern. For instance, the series required Artifex to simulate 10,000 CG cockroaches spilling out of every possible hole in a room — work that required a lot of high-speed caching.

“FX artists need to access temporary storage very quickly to produce those simulations. In terms of the Qumulo system, we need it to retrieve files at the speed our effects artists can simulate and cache, and make sure they are able to manage what can be thousands and thousands of files generated just within a few hours.”

Similarly, for Netflix’s Wu Assassins, the studio generated multiple simulations of CG smoke and fog within SideFX Houdini and again had to generate thousands and thousands of cache files for all the particle and volume information. Just as it did with the caching for the CG cockroaches, the current system handled caching for the smoke and fog quite efficiently.
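To give a sense of that caching workload, a Houdini filecache SOP can be driven from Python roughly as follows; the node path is hypothetical and the parameter names follow a stock filecache node, so adjust for your setup:

```python
import hou

cache = hou.node("/obj/smoke_sim/filecache1")  # hypothetical node path
cache.parm("file").set("$JOB/cache/smoke/smoke.$F4.bgeo.sc")
cache.parm("trange").set(1)            # frame-range mode
cache.parmTuple("f").set((1, 240, 1))  # one cache file per frame
cache.parm("execute").pressButton()    # write all 240 files to disk
```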

At this point, Stern says the vendor is doing some interesting things that his company has not yet taken advantage of. For instance, today one of the big pushes is working in the cloud and integrating that with infrastructures and workflows. “I know they are working on that, and we’re looking into that,” he adds. There are also some new equipment features, “bleeding-edge stuff” Artifex has not explored yet. “It’s OK to be cutting-edge, but bleeding-edge is a little scary for us,” Stern notes. “I know they are always playing with new features, but just having the important foundation of speed and security is right where we are at the moment.”

Jellyfish Pictures
When it comes to big projects with big storage needs, Jellyfish Pictures is no fish out of water. The studio works on myriad projects, from Hollywood blockbusters like Star Wars to high-end TV series like Watchmen to episodic animation like Floogals and Dennis & Gnasher: Unleashed! Recently, it has embarked on an animated feature for DreamWorks and has a dedicated art department that works on visual development for substantial VFX projects and children’s animated TV content.

To handle all this work, Jellyfish has five studios across the UK: four in London and one in Sheffield, in the north of England. What’s more, in early December, Jellyfish expanded further with a brand-new virtual studio in London seating over 150 artists — increasing its capacity to over 300 people. In line with this expansion, Jellyfish is removing all on-site infrastructure from its existing locales and moving everything to a co-location. This means that all five present locations will be wholly virtual as well, making Jellyfish the largest VFX and animation studio in the world operating this way, contends CTO Jeremy Smith.

“We are dealing with shows that have very large datasets, which, therefore, require high-performance computing. It goes without saying, then, that we need some pretty heavy-duty storage,” says Smith.

Not only must the storage solution be able to handle Jellyfish’s data needs, it must also fit into its operational model. “Even though we work across multiple sites, we don’t want our artists to feel that. We need a storage system that can bring together all locations into one centralized hub,” Smith explains. “As a studio, we do not rely on one storage hardware vendor; therefore, we need to work with a company that is hardware-agnostic in addition to being able to operate in the cloud.”

Also, Jellyfish is a TPN-assessed studio and thus has to work with vendors that are TPN-compliant — another serious, and vital, consideration when choosing its storage solution. The Trusted Partner Network (TPN) is an initiative between the Motion Picture Association of America (MPAA) and the Content Delivery and Security Association (CDSA) that provides a set of requirements and best practices for preventing leaks, breaches and hacks of pre-release, high-value media content.

With all those factors in mind, Jellyfish uses PixStor from Pixit Media for its storage solution. PixStor is a software-defined storage solution that allows the studio to use various hardware storage from other vendors under the hood. With PixStor, data moves seamlessly through many tiers of storage — from fast flash and disk tiers to cost-effective, high-capacity object storage to the cloud. In addition, the studio uses NetApp storage within a different part of the same workflow on Dell R740 hardware and alternates between SSD and spinning disks, depending on the purpose of the data and the file size.

“We’ve future-proofed our studio with the Mellanox SN2100 switch for the heavy lifting, and for connecting our virtual workstations to the storage, we are using several servers from the Dell N3000 series,” says Smith.

As a wholly virtual studio, Jellyfish has no storage housed locally; it all sits in a co-location, which is accessed through remote workstations powered by Teradici’s PCoIP technology.

According to Smith, becoming a completely virtual studio is a new development for Jellyfish. Nevertheless, the facility has been working with Pixit Media since 2014 and launched its first virtual studio in 2017, “so the building blocks have been in place for a while,” he says.

Prior to moving all the infrastructure off-site, Jellyfish ran its storage system out of its Brixton and Soho studios locally. Its own private cloud from Brixton powered Jellyfish’s Soho and Sheffield studios. Both PixStor storage solutions in Brixton and Soho were linked with the solution’s PixCache. The switches and servers were still from Dell and Mellanox but were an older generation.

“Way back when, before we adopted this virtual world we are living in, we still worked with on-premises and inflexible storage solutions. It limited us in terms of the work we could take on and where we could operate,” says Smith. “With this new solution, we can scale up to meet our requirements.”

Now, however, using Mellanox SN2100, which has 100GbE, Jellyfish can deal with obscene amounts of data, Smith contends. “The way the industry is moving with 4K and 8K, even 16K being thrown around, we need to be ready,” he says.

Before the co-location, the different sites were connected through PixCache; now the co-location and public cloud are linked via Ngenea, which pre-caches files locally to the render node before the render starts. Furthermore, the studio is able to unlock true multi-tenancy with a single storage namespace, rapidly deploying logical TPN-accredited data separation and isolation and scaling up services as needed — “probably two of the most important facets for us in running a successful studio: security and flexibility,” says Smith.

Artists access the storage via their Teradici Zero Clients, which, through the Dell switches, connect users to the standard Samba SMB network. Users who are working on realtime clients or in high resolution are connected to the Pixit storage through the Mellanox switch, where PixStor Native Client is used.

“Storage is a fundamental part of any VFX and animation studio’s workflow. Implementing the correct solution is critical to the seamless running of a project, as well as the security and flexibility of the business,” Smith concludes. “Any good storage system is invisible to the user. Only the people who build it will ever know the precision it takes to get it up and running — and that is the sign you’ve got the perfect solution.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

Reallusion’s Headshot plugin for realistic digi-doubles via AI

Reallusion has introduced a plugin for Character Creator 3 to help create realistic-looking digital doubles. According to the company, the Headshot plugin uses AI technology to automatically generate a digital human in minutes from one single photo, and those characters are fully rigged for voice lipsync, facial expression and full body animation.

Headshot allows game developers and virtual production teams to quickly funnel a cast of digital doubles into iClone, Unreal, Unity, Maya, ZBrush and more. The idea is to let these digital humans go anywhere and to give creators a solution to rapidly develop, iterate and collaborate in realtime.

The plugin has two AI modes: Auto Mode and Pro Mode. Auto Mode is a one-click solution for creating mid-rez digital human crowds; it handles one-click head and hair creation for realtime 3D head models and also generates a separate 3D hair mesh with an alpha mask to soften edge lines. The 3D hair is fully compatible with Character Creator’s conformable hair format (.ccHair), so users can add the results to their hair library and apply them to other CC characters.

Headshot Pro Mode offers full control of the 3D head generation process, with advanced features such as Image Matching, Photo Reprojection and Custom Mask, at texture resolutions up to 4,096 pixels.

The Image Matching Tool overlays an image reference plane for advanced head shape refinement and lens correction. With Photo Reprojection, users can easily fix the texture-to-mesh discrepancies resulting from face morph change.

Using high-rez source images and Headshot’s 1,000-plus morphs, users can get a scan-quality digital human face with 4K texture detail. Additional maps include normal, AO, roughness, metallic, SSS and Micro Normal for more realistic digital human rendering.

The 3D Head Morph System is designed to achieve the professional, detailed look of 3D scan models. Its sculpting design allows users to hover over a control area and use directional mouse drags to adjust the corresponding mesh shape — from full head and face sculpting to individual features (head contour, face, eyes, nose, mouth and ears) — with more than 1,000 head morphs. The system is now free with a purchase of the Headshot plugin.

The Headshot plugin for Character Creator is $199 and comes with the content pack Headshot Morph 1,000+ ($99). Character Creator 3 Pipeline costs $199.

Redshift integrates Cinema 4D noises, nodes and more

Maxon and Redshift Rendering Technologies have released Redshift 3.0.12, which has native support for Cinema 4D noises and deeper integration with Cinema 4D, including the option to define materials using Cinema 4D’s native node-based material system.

Cinema 4D noise effects have been in demand within other 3D software packages because of their flexibility, efficiency and look. Native support in Redshift means that users of other DCC applications can now access Cinema 4D noises by using Redshift as their rendering solution. Procedural noise allows artists to easily add surface detail and randomness to otherwise perfect surfaces. Cinema 4D offers 32 different types of noise and countless variations based on settings. Native support for Cinema 4D noises means Redshift can preserve GPU memory while delivering high-quality rendered results.

Redshift 3.0.12 provides content creators with deeper integration of Redshift within Cinema 4D. Redshift materials can now be defined using Cinema 4D’s nodal material framework, introduced in Release 20. Redshift materials can also use the Node Space system introduced in Release 21, which combines the native nodes of multiple render engines into a single material; Redshift is the first renderer to take advantage of the new API in Cinema 4D to implement its own Node Spaces. Users can now also use any Cinema 4D view panel as a Redshift IPR (interactive preview render) window, making it easier to work within compact layouts and interact with a scene while developing materials and lighting.

Redshift 3.0.12 is immediately available from the Redshift website.

Maxon acquired Redshift in April 2019.

Framestore VFX will open in Mumbai in 2020

Oscar-winning creative studio Framestore will open a full-service visual effects studio in Mumbai in 2020 to target India’s booming creative industry. The studio will be located in the Nesco IT Park in Goregaon, in the center of Mumbai’s technology district. The news underscores Framestore’s continued interest in India, following its major 2017 investment in Jesh Krishna Murthy’s VFX studio, Anibrain.

“Mumbai represents a rolling of wheels that were set in motion over two years ago,” says Framestore founder/CEO William Sargent. “Our investment in Anibrain has grown considerably, and we continue in our partnership with Jesh Krishna Murthy to develop and grow that business. Indeed, they will become a valued production partner to our Mumbai offering.”

Framestore plans to hire substantially in the coming months, aiming to build an initial 500-strong team that combines existing Framestore talent with the best local Indian expertise. Mumbai will work alongside the global network, including London and Montreal, to create a cohesive virtual team delivering high-quality international work.

“Mumbai has become a center of excellence in digital filmmaking. There’s a depth of talent that can deliver to the scale of Hollywood with the color and flair of Bollywood,” Sargent continues. “It’s an incredibly vibrant city and its presence on the international scene is holding us all to a higher standard. In terms of visual effects, we will set the standard here as we did in Montreal almost eight years ago.”

 

London’s Freefolk beefs up VFX team

Soho-based visual effects studio Freefolk, which has seen growth in its commercials and longform work, has expanded its staff to meet demand. As part of the uptick in work, Freefolk promoted Cheryl Payne from senior producer to head of commercial production. Additionally, Laura Rickets has joined as senior producer, and 2D artist Bradley Cocksedge has been added to the commercials VFX team.

Payne, who has been with Freefolk since the early days, has worked on some of the studio’s biggest commercials, including Warburtons for Engine, Peloton for Dark Horses and Cadburys for VCCP.

Rickets comes to Freefolk with over 18 years of production experience at some of the biggest VFX houses in London, including Framestore, The Mill and Smoke & Mirrors, as well as agency-side for McCann. Since joining the team, Rickets has VFX-produced the I’m A Celebrity IDs — a set of seven technically challenging, CG-heavy spots for the new series of the show — as well as ads for the Rugby World Cup and Who Wants to Be a Millionaire?

Cocksedge is a recent graduate who joins from Framestore, where he was working as an intern on Fantastic Beasts: The Crimes of Grindelwald. While in school at the University of Hertfordshire, he interned at Freefolk and is happy to be back in a full-time position.

“We’ve had an exciting year and have worked on some really stand-out commercials, like TransPennine for Engine and the beautiful spot for The Guardian we completed with Uncommon, so we felt it was time to add to the Freefolk family,” says Fi Kilroe, Freefolk’s co-managing director/executive producer.

Main Image: (L-R) Cheryl Payne, Laura Rickets and Bradley Cocksedge

Behind the Title: MPC’s CD Morten Vinther

This creative director/director still jumps on the Flame and also edits from time to time. “I love mixing it up and doing different things,” he says.

NAME: Morten Vinther

COMPANY: Moving Picture Company, Los Angeles

CAN YOU DESCRIBE YOUR COMPANY?
From original ideas all the way through to finished production, we are an eclectic mix of hard-working and passionate artists, technologists and creatives who push the boundaries of what’s possible for our clients. We aim to move the audience through our work.

WHAT’S YOUR JOB TITLE?
Creative Director and Director

WHAT DOES THAT ENTAIL?
I guide our clients through challenging shoots and post. I try to keep us honest in terms of making sure that our casting is right and the team is looked after and has the appropriate resources available for the tasks ahead, while ensuring that we go above and beyond on quality and experience. In addition to this, I direct projects, pitch on new business and develop methodology for visual effects.

American Horror Story

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I still occasionally jump on Flame and comp a job — right now I’m editing a commercial. I love mixing it up and doing different things.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Writing treatments. The moments where everything is crystal clear in your head and great ideas and concepts are rushing onto paper like an unstoppable torrent.

WHAT’S YOUR LEAST FAVORITE?
Writing treatments. Staring at a blank page, writing something and realizing how contrived it sounds before angrily deleting everything.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Early mornings. A good night’s sleep and freshly ground coffee creates a fertile breeding ground for pure clarity, ideas and opportunities.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I would be carefully malting barley for my next small batch of artisan whisky somewhere on the Scottish west coast.

Adidas Creators

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I remember making a spoof commercial at my school when I was about 13 years old. I became obsessed with operating cameras and editing, and I began to study filmmakers like Scorsese and Kubrick. After a failed career as a shopkeeper, a documentary production company in Copenhagen took mercy on me, and I started as an assistant editor.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
American Horror Story, Apple Unlock, directed by Dougal Wilson, and Adidas Creators, directed by Stacy Wall.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
If I had to single one out, it would probably be Apple’s Unlock commercial. The spot looks amazing, and the team was incredibly creative on this one. We enjoyed a great collaboration between several of our offices, and it was a lot of fun putting it together.

Apple’s Unlock

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My phone, laptop and PlayStation.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Some say social media rots your brains. That’s probably why I’m an Instagram addict.

CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Odesza, SBTRKT, Little Dragon, Disclosure and classic reggae.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I recently bought a motorbike, and I spin around LA and Southern California most weekends. Concentrating on how to survive the next turn is a great way for me to clear the mind.

Director Robert Eggers talks about his psychological thriller The Lighthouse

By Iain Blair

Writer/director Robert Eggers burst onto the scene when his feature film debut, The Witch, won the Directing Award in the US Dramatic category at the 2015 Sundance Film Festival. He followed up that success by co-writing and directing another supernatural, hallucinatory horror film, The Lighthouse, which is set in the maritime world of the late 19th century.

L-R: Director Robert Eggers and cinematographer Jarin Blaschke on set.

The story begins when two lighthouse keepers (Willem Dafoe and Robert Pattinson) arrive on a remote island off the coast of New England for their month-long stay. But that stay gets extended as they’re trapped and isolated due to a seemingly never-ending storm. Soon, the two men engage in an escalating battle of wills, as tensions boil over and mysterious forces (which may or may not be real) loom all around them.

The Lighthouse has the power of an ancient myth. To tell this tale, which was shot in black and white, Eggers called on many of those who helped him create The Witch, including cinematographer Jarin Blaschke, production designer Craig Lathrop, composer Mark Korven and editor Louise Ford.

I recently talked to Eggers, who got his professional start directing and designing experimental and classical theater in New York City, about making the film, his love of horror and the post workflow.

Why does horror have such an enduring appeal?
My best argument is that there’s darkness in humanity, and we need to explore that. And horror is great at doing that, from the Gothic to a bad slasher movie. While I may prefer authors who explore the complexities in humanity, others may prefer schlocky films with jump scares that make you spill your popcorn, which still give them that dose of darkness. Those films may not be seriously probing the darkness, but they can relate to it.

This film seems more psychological than simple horror.
We’re talking about horror, but I’m not even sure that this is a horror film. I don’t mind the label, even though most wannabe auteurs are like, “I don’t like labels!” It started with an idea my brother Max had for a ghost story set in a lighthouse, which is not what this movie became. But I loved the idea, which was based on a true story. It immediately evoked a black and white movie on 35mm negative with a boxy aspect ratio of 1.19:1, like the old movies, and a fusty, dusty, rusty, musty atmosphere — the pipe smoke and all the facial hair — so I just needed a story that went along with all of that. (Laughs) We were also thinking a lot about influences and writers from the time — like Poe, Melville and Stevenson — and soaking up the jargon of the day. There were also influences like Prometheus and Proteus and God knows what else.

Casting the two leads was obviously crucial. What did Willem and Robert bring to their roles?
Absolute passion and commitment to the project and their roles. Who else but Willem can speak like a North Atlantic pirate stereotype and make it totally believable? Robert has this incredible intensity, and together they play so well against each other and are so well suited to this world. And they both have two of the best faces ever in cinema.

What were the main technical challenges in pulling it all together, and is it true you actually built the lighthouse?
We did. We built everything, including the 70-foot tower — a full-scale working lighthouse, along with its house and outbuildings — on Cape Forchu in Nova Scotia, which is this very dramatic outcropping of volcanic rock. Production designer Craig Lathrop and his team did an amazing job, and the reason we did that was because it gave us far more control than if we’d used a real lighthouse.

We scouted a lot but just couldn’t find one that suited us, and the few that did were far too remote to access. We needed road access and a place with the right weather, so in the end it was better to build it all. We also shot some of the interiors there as well, but most of them were built on soundstages and warehouses in Halifax since we knew it’d be very hard to shoot interiors and move the camera inside the lighthouse tower itself.

Your go-to DP, Jarin Blaschke, shot it. Talk about how you collaborated on the look and why you used black and white.
I love the look of black and white, because it’s both dreamlike and also more realistic than color in a way. It really suited both the story and the way we shot it, with the harsh landscape and a lot of close-ups of Willem and Robert. Jarin shot the film on the Panavision Millennium XL2, and we also used vintage Baltar lenses from the 1930s, which gave the film a great look, as they make the sea, water and sky all glow and shimmer more. He also used a custom cyan filter by Schneider Filters that gave us that really old-fashioned look. Then by using black and white, it kept the overall look very bleak at all times.

How tough was the shoot?
It was pretty tough, and all the rain and pounding wind you see onscreen is pretty much real. Even on the few sunny days we had, the wind was just relentless. The shoot was about 32 days, and we were out in the elements in March and April of last year, so it was freezing cold and very tough for the actors. It was very physically demanding.

Where did you post?
We did it all in New York at Harbor Post, with some additional ADR work at Goldcrest in London with Robert.

Do you like the post process?
I love post, and after the very challenging shoot, it was such a relief to just get in a warm, dry, dark room and start cutting and pulling it all together.

Talk about editing with Louise Ford, who also cut The Witch. How did that work?
She was with us on the shoot at a bed and breakfast, so I could check in with her at the end of the day. But it was so tough shooting that I usually waited until the weekends to get together and go over stuff. Then when we did the stage work at Halifax, she had an edit room set up there, and that was much easier.

What were the big editing challenges?
The DP and I developed such a specific and detailed cinema language without a ton of coverage and with little room for error that we painted ourselves into a corner. So that became the big challenge… when something didn’t work. It was also about getting the running time down but keeping the right pace since the performances dictate the pace of the edit. You can’t just shorten stuff arbitrarily. But we didn’t leave a lot of stuff on the cutting room floor. The assembly was just over two hours and the final film isn’t much shorter.

All the sound effects play a big role. Talk about the importance of sound and working on them with sound designer Damian Volpe, whose credits include Can You Ever Forgive Me?, Leave No Trace, Mudbound, Drive, Winter’s Bone and Margin Call.
It’s hugely important in this film, and Louise and I did a lot of work in the picture edit to create temps for Damian to inspire him. And he was so relentless in building up the sound design, and even creating weird sounds to go with the actual light, and to go with the score by Mark Korven, who did The Witch, and all the brass and unusual instrumentation he used on this. So the result is both experimental and also quite traditional, I think.

There are quite a few VFX shots. Who did them, and what was involved?
We had MELS and Oblique in Quebec, and Brainstorm Digital in New York also did some. The big one was that the movie is set on an island, but we shot on a peninsula that also had a lighthouse further north, which unfortunately didn’t look at all correct, so we framed it out a lot, but we still had to erase it some of the time. And our period-correct sea ship broke down and had to be towed around by other ships, so there was a lot of cleanup — also with all the safety cables we had to use for cliff shots with the actors.

Where did you do the DI, and how important is it to you?
We did it at Harbor with colorist Joe Gawler, and it was hugely important although it was fairly simple because there’s very little latitude on the Double-X film stock we used. We did a lot of fine detail work to finesse it, but it was a lot quicker than if it’d been in color.

Did the film turn out the way you hoped?
No, they always change and surprise you, but I’m very proud of what we did.

What’s next?
I’m prepping another period piece, but it’s not a horror film. That’s all I can say.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Alkemy X adds Albert Mason as head of production

Albert Mason has joined VFX house Alkemy X as head of production. He comes to Alkemy X with over two decades of experience in visual effects and post production. He has worked on projects directed by such industry icons as Peter Jackson on the Lord of the Rings trilogy, Tim Burton on Alice in Wonderland and Robert Zemeckis on The Polar Express. In his new role at Alkemy X, he will use his experience in feature films to target the growing episodic space.

A large part of Alkemy X’s work has been for episodic visual effects, with credits that include Amazon Prime’s Emmy-winning original series, The Marvelous Mrs. Maisel, USA’s Mr. Robot, AMC’s Fear the Walking Dead, Netflix’s Maniac, NBC’s Blindspot and Starz’s Power.

Mason began his career at MTV’s on-air promos department, sharpening his production skills on top series promo campaigns and as a part of its newly launched MTV Animation Department. He took an opportunity to transition into VFX, stepping into a production role for Weta Digital and spending three years working globally on the Lord of the Rings trilogy. He then joined Sony Pictures Imageworks, where he contributed to features including Spider-Man 3 and Ghost Rider. He has also produced work for such top industry shops as Logan, Rising Sun Pictures and Greymatter VFX.

“[Albert’s] expertise in constructing advanced pipelines that embrace emerging technologies will be invaluable to our team as we continue to bolster our slate of VFX work,” says Alkemy X president/CEO Justin Wineburgh.

2019 HPA Award winners announced

The industry came together on November 21 in Los Angeles to celebrate its own at the 14th annual HPA Awards. Awards were given to individuals and teams working in 12 creative craft categories, recognizing outstanding contributions to color grading, sound, editing and visual effects for commercials, television and feature film.

Rob Legato receiving Lifetime Achievement Award from presenter Mike Kanfer. (Photo by Ryan Miller/Capture Imaging)

As was previously announced, renowned visual effects supervisor and creative Robert Legato, ASC, was honored with this year’s HPA Lifetime Achievement Award; Peter Jackson’s They Shall Not Grow Old was presented with the HPA Judges Award for Creativity and Innovation; acclaimed journalist Peter Caranicas was the recipient of the very first HPA Legacy Award; and special awards were presented for Engineering Excellence.

The winners of the 2019 HPA Awards are:

Outstanding Color Grading – Theatrical Feature

WINNER: “Spider-Man: Into the Spider-Verse”
Natasha Leonnet // Efilm

“First Man”
Natasha Leonnet // Efilm

“Roma”
Steven J. Scott // Technicolor

Natasha Leonnet (Photo by Ryan Miller/Capture Imaging)

“Green Book”
Walter Volpatto // FotoKem

“The Nutcracker and the Four Realms”
Tom Poole // Company 3

“Us”
Michael Hatzer // Technicolor

 

Outstanding Color Grading – Episodic or Non-theatrical Feature

WINNER: “Game of Thrones – Winterfell”
Joe Finley // Sim, Los Angeles

 “The Handmaid’s Tale – Liars”
Bill Ferwerda // Deluxe Toronto

“The Marvelous Mrs. Maisel – Vote for Kennedy, Vote for Kennedy”
Steven Bodner // Light Iron

“I Am the Night – Pilot”
Stefan Sonnenfeld // Company 3

“Gotham – Legend of the Dark Knight: The Trial of Jim Gordon”
Paul Westerbeck // Picture Shop

“The Man in The High Castle – Jahr Null”
Roy Vasich // Technicolor

 

Outstanding Color Grading – Commercial  

WINNER: Hennessy X.O. – “The Seven Worlds”
Stephen Nakamura // Company 3

Zara – “Woman Campaign Spring Summer 2019”
Tim Masick // Company 3

Tiffany & Co. – “Believe in Dreams: A Tiffany Holiday”
James Tillett // Moving Picture Company

Palms Casino – “Unstatus Quo”
Ricky Gausis // Moving Picture Company

Audi – “Cashew”
Tom Poole // Company 3

 

Outstanding Editing – Theatrical Feature

Once Upon a Time… in Hollywood

WINNER: “Once Upon a Time… in Hollywood”
Fred Raskin, ACE

“Green Book”
Patrick J. Don Vito, ACE

“Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese”
David Tedeschi, Damian Rodriguez

“The Other Side of the Wind”
Orson Welles, Bob Murawski, ACE

“A Star Is Born”
Jay Cassidy, ACE

 

Outstanding Editing – Episodic or Non-theatrical Feature (30 Minutes and Under)

VEEP

WINNER: “Veep – Pledge”
Roger Nygard, ACE

“Russian Doll – The Way Out”
Todd Downing

“Homecoming – Redwood”
Rosanne Tan, ACE

“Withorwithout”
Jake Shaver, Shannon Albrink // Therapy Studios

“Russian Doll – Ariadne”
Laura Weinberg

 

Outstanding Editing – Episodic or Non-theatrical Feature (Over 30 Minutes)

WINNER: “Stranger Things – Chapter Eight: The Battle of Starcourt”
Dean Zimmerman, ACE, Katheryn Naranjo

“Chernobyl – Vichnaya Pamyat”
Simon Smith, Jinx Godfrey // Sister Pictures

“Game of Thrones – The Iron Throne”
Katie Weiland, ACE

“Game of Thrones – The Long Night”
Tim Porter, ACE

“The Bodyguard – Episode One”
Steve Singleton

 

Outstanding Sound – Theatrical Feature

WINNER: “Godzilla: King of the Monsters”
Tim LeBlanc, Tom Ozanich, MPSE // Warner Bros.
Erik Aadahl, MPSE, Nancy Nugent, MPSE, Jason W. Jennings // E Squared

“Shazam!”
Michael Keller, Kevin O’Connell // Warner Bros.
Bill R. Dean, MPSE, Erick Ocampo, Kelly Oxford, MPSE // Technicolor

“Smallfoot”
Michael Babcock, David E. Fluhr, CAS, Jeff Sawyer, Chris Diebold, Harrison Meyle // Warner Bros.

“Roma”
Skip Lievsay, Sergio Diaz, Craig Henighan, Carlos Honc, Ruy Garcia, MPSE, Caleb Townsend

“Aquaman”
Tim LeBlanc // Warner Bros.
Peter Brown, Joe Dzuban, Stephen P. Robinson, MPSE, Eliot Connors, MPSE // Formosa Group

 

Outstanding Sound – Episodic or Non-theatrical Feature

WINNER: “The Haunting of Hill House – Two Storms”
Trevor Gates, MPSE, Jason Dotts, Jonathan Wales, Paul Knox, Walter Spencer // Formosa Group

“Chernobyl – 1:23:45”
Stefan Henrix, Stuart Hilliker, Joe Beal, Michael Maroussas, Harry Barnes // Boom Post

“Deadwood: The Movie”
John W. Cook II, Bill Freesh, Mandell Winter, MPSE, Daniel Colman, MPSE, Ben Cook, MPSE, Micha Liberman // NBC Universal

“Game of Thrones – The Bells”
Tim Kimmel, MPSE, Onnalee Blank, CAS, Mathew Waters, CAS, Paula Fairfield, David Klotz

“Homecoming – Protocol”
John W. Cook II, Bill Freesh, Kevin Buchholz, Jeff A. Pitts, Ben Zales, Polly McKinnon // NBC Universal

 

Outstanding Sound – Commercial 

WINNER: John Lewis & Partners – “Bohemian Rhapsody”
Mark Hills, Anthony Moore // Factory

Audi – “Life”
Doobie White // Therapy Studios

Leonard Cheshire Disability – “Together Unstoppable”
Mark Hills // Factory

New York Times – “The Truth Is Worth It: Fearlessness”
Aaron Reynolds // Wave Studios NY

John Lewis & Partners – “The Boy and the Piano”
Anthony Moore // Factory

 

Outstanding Visual Effects – Theatrical Feature

WINNER: “The Lion King”
Robert Legato
Andrew R. Jones
Adam Valdez, Elliot Newman, Audrey Ferrara // MPC Film
Tom Peitzman // T&C Productions

“Avengers: Endgame”
Matt Aitken, Marvyn Young, Sidney Kombo-Kintombo, Sean Walker, David Conley // Weta Digital

“Spider-Man: Far From Home”
Alexis Wajsbrot, Sylvain Degrotte, Nathan McConnel, Stephen Kennedy, Jonathan Opgenhaffen // Framestore

“Alita: Battle Angel”
Eric Saindon, Michael Cozens, Dejan Momcilovic, Mark Haenga, Kevin Sherwood // Weta Digital

“Pokémon Detective Pikachu”
Jonathan Fawkner, Carlos Monzon, Gavin Mckenzie, Fabio Zangla, Dale Newton // Framestore

 

Outstanding Visual Effects – Episodic (Under 13 Episodes) or Non-theatrical Feature

Game of Thrones

WINNER: “Game of Thrones – The Bells”
Steve Kullback, Joe Bauer, Ted Rae
Mohsen Mousavi // Scanline
Thomas Schelesny // Image Engine

“Game of Thrones – The Long Night”
Martin Hill, Nicky Muir, Mike Perry, Mark Richardson, Darren Christie // Weta Digital

“The Umbrella Academy – The White Violin”
Everett Burrell, Misato Shinohara, Chris White, Jeff Campbell, Sebastien Bergeron

“The Man in the High Castle – Jahr Null”
Lawson Deming, Cory Jamieson, Casi Blume, Nick Chamberlain, William Parker, Saber Jlassi, Chris Parks // Barnstorm VFX

“Chernobyl – 1:23:45”
Lindsay McFarlane
Max Dennison, Clare Cheetham, Steven Godfrey, Luke Letkey // DNEG

 

Outstanding Visual Effects – Episodic (Over 13 Episodes)

Team from The Orville – Outstanding VFX, Episodic, Over 13 Episodes (Photo by Ryan Miller/Capture Imaging)

WINNER: “The Orville – Identity: Part II”
Tommy Tran, Kevin Lingenfelser, Joseph Vincent Pike // FuseFX
Brandon Fayette, Brooke Noska // Twentieth Century FOX TV

“Hawaii Five-0 – Ke iho mai nei ko luna”
Thomas Connors, Anthony Davis, Chad Schott, Gary Lopez, Adam Avitabile // Picture Shop

“9-1-1 – 7.1”
Jon Massey, Tony Pirzadeh, Brigitte Bourque, Gavin Whelan, Kwon Choi // FuseFX

“Star Trek: Discovery – Such Sweet Sorrow Part 2”
Jason Zimmerman, Ante Dekovic, Aleksandra Kochoska, Charles Collyer, Alexander Wood // CBS Television Studios

“The Flash – King Shark vs. Gorilla Grodd”
Armen V. Kevorkian, Joshua Spivack, Andranik Taranyan, Shirak Agresta, Jason Shulman // Encore VFX

The 2019 HPA Engineering Excellence Awards were presented to:

Adobe – Content-Aware Fill for Video in Adobe After Effects

Epic Games — Unreal Engine 4

Pixelworks — TrueCut Motion

Portrait Displays and LG Electronics — CalMan LUT based Auto-Calibration Integration with LG OLED TVs

Honorable Mentions were awarded to Ambidio for Ambidio Looking Glass; Grass Valley for creative grading; and Netflix for Photon.

Creating With Cloud: A VFX producer’s perspective

By Chris Del Conte

The ‘90s was an explosive era for visual effects, with films like Jurassic Park, Independence Day, Titanic and The Matrix shattering box office records and inspiring a generation of artists and filmmakers, myself included. I got my start in VFX working on seaQuest DSV, an Amblin/NBC sci-fi series that was ground-breaking for its time, but looking at the VFX of modern films like Gemini Man, The Lion King and Ad Astra, it’s clear just how far the industry has come. A lot of that progress has been enabled by new technology and techniques, from the leap to fully digital filmmaking and emergence of advanced viewing formats like 3D, Ultra HD and HDR to the rebirth of VR and now the rise of cloud-based workflows.

In my nearly 25 years in VFX, I’ve worn a lot of hats, including VFX producer, head of production and business development manager. Each role involved overseeing many aspects of a production and, collectively, they’ve all shaped my perspective when it comes to how the cloud is transforming the entire creative process. Thanks to my role at AWS Thinkbox, I have a front-row seat to see why studios are looking at the cloud for content creation, how they are using the cloud, and how the cloud affects their work and client relationships.

Chris Del Conte on the set of the IMAX film Magnificent Desolation.

Why Cloud?
We’re in a climate of high content demand and massive industry flux. Studios are incentivized to find ways to take on more work, and that requires more resources — not just artists, but storage, workstations and render capacity. This need to scale often motivates studios to consider the cloud for production, or to deepen the use of cloud already in their pipelines. Cloud-enabled studios are much more agile than traditional shops; when opportunities arise, they can act quickly, spinning resources up and down at a moment’s notice. I realize that for some, the concept of the cloud is still a bit nebulous, which is why finding the right cloud partner is key. Every facility is different, and part of the benefit of cloud is resource customization. When studios use predominantly physical resources, they have to make up-front decisions about storage and render capacity, electrical and cooling infrastructure, and staff accommodations (and pay for them). Using the cloud allows studios to adjust easily to whatever the current situation requires.

Artistic Impact
Advanced technology is great, but artists are by far a studio’s biggest asset; automated tools are helpful but won’t deliver those “wow moments” alone. Artists bring the creativity and talent to the table, then, in a perfect world, technology helps them realize their full potential. When artists are free of pipeline or workflow distractions, they can focus on creating. The positive effects spill over into nearly every aspect of production, which is especially true when cloud-based rendering is used. By scaling render resources via the cloud, artists aren’t limited by the capacity of their local machines. Since they don’t have to wait as long for shots to render, artists can iterate more fluidly. This boosts morale because the final results are closer to what artists envisioned, and it can improve work-life balance since artists don’t have to stick around late at night waiting for renders to finish. With faster render results, VFX supervisors also have more runway to make last-minute tweaks. Ultimately, cloud-based rendering enables a higher caliber of work and more satisfied artists.
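As a minimal sketch of that elasticity, here is how a cloud-enabled studio might burst up render nodes with AWS’s boto3 SDK; the AMI, instance type and counts are hypothetical, and a production farm would usually go through a render manager rather than raw EC2 calls:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical render-node image
    InstanceType="c5.24xlarge",
    MinCount=1,
    MaxCount=50,  # sized to the render queue, released when it drains
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "render-node"}],
    }],
)
print(f"launched {len(response['Instances'])} render nodes")
```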

Budget Considerations
There are compelling arguments for shifting capital expenditures to operational expenditures with the cloud. New studios get the most value out of this model since they don’t have legacy infrastructure to accommodate. Cloud-based solutions level the playing field in this respect; it’s easier for small studios and freelancers to get started because there’s no significant up-front hardware investment. This is an area where we’ve seen rapid cloud adoption. Considering how fast technology changes, it seems ill-advised to limit a new studio’s capabilities to today’s hardware when the cloud provides constant access to the latest compute resources.

When a studio has been in business for decades and might have multiple locations with varying needs, its infrastructure is typically well established. Some studios may opt to wait until their existing hardware has fully depreciated before shifting resources to the cloud, while others dive in right away, with an eye on the bigger picture. Rendering is generally a budgetary item on project bids, but with local hardware, studios are working to recoup a sunk cost. Using the cloud, render compute can be part of a bid and becomes a negotiable item. Clients can determine the delivery timeline based on render budget, and the elasticity of cloud resources allows VFX studios to pick up more work. (Even the most meticulously planned productions can run into 911 issues ahead of delivery, and cloud-enabled studios have bandwidth to be the hero when clients are in dire straits.)

Looking Ahead
When I started in VFX, giant rooms filled with racks and racks of servers and hardware were the norm, and VFX studios were largely judged by the size of their infrastructure. I’ve heard from an industry colleague how their VFX studio’s server room was so impressive that they used to give clients tours of the space, seemingly a visual reminder of the studio’s vast compute capabilities. Today, there wouldn’t be nearly as much to view. Modern technology is more powerful and compact but still requires space, and that space has to be properly equipped with the necessary electricity and cooling. With cloud, studios don’t need racks of switches and physical storage to be competitive off the bat, and they experience fewer infrastructure headaches, like losing freon in the AC.

The cloud also opens up the available artist talent pool. Studios can dedicate the majority of physical space to artists as opposed to machines and even hire artists in remote locations on a per-project or long-term basis. Facilities of all sizes are beginning to recognize that becoming cloud-enabled brings a significant competitive edge, allowing them to harness the power to render almost any client request. VFX producers will also start to view facility cloud-enablement as a risk management tool that allows control of any creative changes or artistic embellishments up until delivery, with the rendering output no longer a blocker or a limited resource.

Bottom line: Cloud transforms nearly every aspect of content creation into a near-infinite resource, whether storage capacity, render power or artistic talent.


Chris Del Conte is senior EC2 business development manager at AWS Thinkbox.

Motorola’s next-gen Razr gets a campaign for today

Many of us have fond memories of our Razr flip phone. At the time, it was the latest and greatest. Then new technology came along, and the smartphone era was born. Now Motorola is asking, “Why can’t you have both?”

Available as of November 13, the new Razr fits in a palm or pocket when shut and flips open to reveal an immersive, full-length touch screen. There is a display screen called the Quick View when closed and the larger Flex View when open — and the two displays are made to work together. Whatever you see on Quick View then moves to the larger Flex View display when you flip it open.

In order to help tell this story, Motorola called on creative shop Los York to help relaunch the Razr. Los York created the new smartphone campaign to tap into the Razr’s original DNA and launch it for today’s user.

Los York developed a 360 campaign that included films, social, digital, TV, print and billboards, with visuals in stores and on devices (wallpapers, ringtones, startup screens). Los York treated the Razr as a luxury item and a piece of art, letting the device reveal itself unencumbered by taglines and copy. The campaign showcases the Razr as a futuristic, high-end "fashion accessory" that speaks to new industry conversations, such as whether advancing tech is leading toward a utopian or dystopian future.

The campaign features a mix of live action and CG. Los York shot on a Panavision DXL with Primo 70 lenses. CG was created using Maxon Cinema 4D with Redshift and composited in Adobe After Effects. The piece was edited in-house on Adobe Premiere.

We reached out to Los York CEO and founder Seth Epstein to find out more:

How much of this is live action versus CG?
The majority is CG, but, originally, the piece was intended to be entirely CG. Early in the creative process, we defined the world in which the new Razr existed and who would belong there. As we worked on the project, we kept feeling the pull to bring our characters to life in live action and blend the two worlds. So the live action was envisioned after the fact, which is somewhat unusual.

What were some of the most challenging aspects of this piece?
The most challenging part was that the project happened over a period of nine months. The product release wisely needed to be pushed, and we continued to evolve the project over time, which was a blessing and a curse.

How did it feel taking on a product with a lot of history and then rebranding it for the modern day?
We felt the key was to relaunch an iconic product like the Razr with an eye to the future. The trap of launching anything iconic is falling back on retro throwback references, which can come across as too obvious. We dove into the original product and campaigns to extract the brand DNA of 2004 using archetype exercises. We tapped into the attitude and voice of the Razr at that time — and used that attitude as a starting point. We also wanted to look forward and stand three years in the future and imagine what the tone and campaign would be then. All of this is to say that we wanted the new Razr to extract the power of the past but also speak to audiences in a totally fresh and new way.

Check out the campaign here.

Blur Studio uses new AMD Threadripper for Terminator: Dark Fate VFX

By Dayna McCallum

AMD has announced new additions to its high-end desktop processor family. Built for demanding desktop and content creation workloads, the 24-core AMD Ryzen Threadripper 3960X and the 32-core AMD Ryzen Threadripper 3970X processors will be available worldwide November 25.

Tim Miller on the set of Dark Fate.

AMD states that the powerful new processors provide up to 90 percent more performance and up to 2.5 times more available storage bandwidth than competitive offerings, per testing and specifications by AMD performance labs. The 3rd Gen AMD Ryzen Threadripper lineup features two new processors built on 7nm “Zen 2” core architecture, claiming up to 88 PCIe 4.0 lanes and 144MB cache with 66 percent better power efficiency.

Prior to the official product launch, AMD made the 3rd Gen Threadrippers available to LA’s Blur Studio for work on the recent Terminator: Dark Fate and continued a collaboration with the film’s director — and Blur Studio founder — Tim Miller.

Before the movie’s release, AMD hosted a private Q&A with Miller, moderated by AMD’s James Knight. Please note that we’ve edited the lively conversation for space and taken a liberty with some of Miller’s more “colorful” language. (Also watch this space to see if a wager is won that will result in Miller sporting a new AMD tattoo.) Here is the Knight/Miller conversation…

So when we dropped off the 3rd Gen Threadripper to you guys, how did your IT guys react?
Like little children left in a candy shop with no adult supervision. The nice thing about our atmosphere here at Blur is we have an open layout. So when (bleep) like these new AMD processors drops in, you know it runs through the studio like wildfire, and I sit out there like everybody else does. You hear the guys talking about it, you hear people giggling and laughing hysterically at times on the second floor where all the compositors are. That’s where these machines really kick ass — busting through these comps that would have had to go to the farm, but they can now do it on a desktop.

James Knight

As an artist, the speed is crucial. You know, if you have a machine that takes 15 minutes to render, you want to stop and do something else while you wait. It breaks your whole chain of thought. You get out of that fugue state that you produce the best art in. It breaks the chain between art and your brain. But if you have a machine that does it in 30 seconds, that chain doesn't get broken.

But really, more speed means more iterations. It means you deal with heavier scenes, which means you can throw more detail at your models and your scenes. I don’t think we do the work faster, necessarily, but the work is much higher quality. And much more detailed. It’s like you create this vacuum, and then everybody rushes into it and you have this silly idea that it is really going to increase productivity, but what it really increases most is quality.

When your VFX supervisor showed you the difference between the way it was done with your existing ecosystem and then with the third-gen Threadripper, what were you thinking about?
There was the immediate thing — when we heard from the producers about the deadline, shots that weren’t going to get done for the trailer, suddenly were, which was great. More importantly, you heard from the artists. What you started to see was that it allows for all different ways of working, instead of just the elaborate pipeline that we’ve built up — to work on your local box and then submit it to the farm and wait for that render to hit the queue of farm machines that can handle it, then send that render back to you.

It has a rhythm that is at times tiresome for the artists, and I know that because I hear it all the time. Now I say, “How’s that comp coming and when are we going to get it, tick tock?” And they say, “Well, it’s rendering in the background right now, as I’m watching them work on another comp or another piece of that comp.” That’s pretty amazing. And they’re doing it all locally, which saves so much time and frustration compared to sending it down the pipeline and then waiting for it to come back up.

I know you guys are here to talk about technology, but the difference for the artists is that instead of working here until 1:00am, they're going home to put their children to bed. That's really what this means at the end of the day. Technology is so wonderful when it enables that, not just the creativity of what we do, but the humanity… allowing artists to feel like they're really on the cutting edge, but also have a life of some sort outside.

Endoskeleton — Terminator: Dark Fate

As you noted, certain shots and sequences wouldn’t have made it in time for the trailer. How important was it for you to get that Terminator splitting in the trailer?
Marketing was pretty adamant that that shot had to be in there. There's always this push and pull between marketing and VFX as you get closer. They want certain shots for the trailer, but they're almost always those shots that are the hardest to do because they have the most spectacle in them. And that's one of the shots. The sequence was one of the last to come together because we changed the plan quite a bit, and I kept changing shots on Dan (Akers, VFX supervisor). But you tell marketing people that they can't have something, and they don't really give a (bleep) about you and your schedule or the path of that artist and shot. (Laughing)

Anyway, we said no. They begged, they pleaded, and we said, "We'll try." Dan stepped up and said, "Yeah, I think I can make it." And we just barely made it; it really was in danger of not getting done fast enough. All of this was happening in like a two-day window. If you didn't notice (in the trailer), that's a Rev 7. Gabriel Luna is a Rev 9, which is the next gen. But the Rev 7s that you see in his future flashback are just pure killers. They're still the same technology, with liquid metal on the outside and a carbon endoskeleton that splits. So you have to run the simulation where the skeleton separates through the liquid that hangs off of it in strings; it's a really hard simulation to do. That's why we thought maybe it wasn't going to get done, but running the simulation on the AMD boxes was lightning fast.

Carbon New York grows with three industry vets

Carbon in New York has grown with two senior hires — executive producer Nick Haynes and head of CG Frank Grecco — and the relocation of existing ECD Liam Chapple, who joins from the Chicago office.

Chapple joined Carbon in 2016, moving from Mainframe in London to open Carbon’s Chicago facility.  He brought in clients such as Porsche, Lululemon, Jeep, McDonald’s, and Facebook. “I’ve always looked to the studios, designers and directors in New York as the high bar, and now I welcome the opportunity to pitch against them. There is an amazing pool of talent in New York, and the city’s energy is a magnet for artists and creatives of all ilk. I can’t wait to dive into this and look forward to expanding upon our amazing team of artists and really making an impression in such a competitive and creative market.”

Chapple recently wrapped direction and VFX on films for Teflon and American Express (Ogilvy) and multiple live-action projects for Lululemon. The most recent shoot, conceived and directed by Chapple, was a series of eight live-action films focusing on Lululemon’s brand ambassadors and its new flagship store in Chicago.

Haynes joins Carbon from his former role as EP of MPC, bringing over 20 years of experience earned at The Mill, MPC and Absolute. Haynes recently wrapped the launch film for the Google Pixel phone and the Chromebook, as well as an epic Middle Earth: Shadow of War Monolith Games trailer combining photo-real CGI elements with live-action shot on the frozen Black Sea in Ukraine.  “We want to be there at the inception of the creative and help steer it — ideally, lead it — and be there the whole way through the process, from concept and shoot to delivery. Over the years, whether working for the world’s most creative agencies or directly with prestigious clients like Google, Guinness and IBM, I aim to be as close to the project as possible from the outset, allowing my team to add genuine value that will garner the best result for everyone involved.”

Grecco joins Carbon from Method Studios, where he most recently led projects for Google, Target, Microsoft, Netflix and Marvel’s Deadpool 2.  With a wide range of experience from Emmy-nominated television title sequences to feature films and Super Bowl commercials, Grecco looks forward to helping Carbon continue to push its visuals beyond the high bar that has already been set.

In addition to New York and Chicago, Carbon has a studio in Los Angeles.

Main Image: (L-R) Frank Grecco, Liam Chapple, Nick Haynes

Behind the Title: Sarofsky EP Steven Anderson

This EP's responsibilities run the gamut "from managing our production staff to treating clients to an amazing dinner."

Company: Chicago’s Sarofsky

Can you describe your company?
We like to describe ourselves as a design-driven production company. I like to think of us as that but so much more. We can be a one-stop shop for everything from concept through finish, or we can partner with a variety of other companies and just be one piece of the puzzle. It’s like ordering from a Chinese menu — you get to pick what items you want.

What’s your job title, and what does the job entail?
I’m executive producer, and that means different things at different companies and industries. Here at Sarofsky, I am responsible for things that run the gamut from managing our production staff to treating clients to an amazing dinner.

Sarofsky

What would surprise people the most about what falls under that title?
I also run payroll, and I am damn good at it.

How has the VFX industry changed in the time you’ve been working?
It used to be that when you told someone, “This is going to take some time to execute,” that’s what it meant. But now, everyone wants everything two hours ago. On the flip side, the technology we now have access to has streamlined the production process and provided us with some terrific new tools.

Why do you like being on set for shoots? What are the benefits?
I always like being on set whenever I can because decisions are being made that are going to affect the rest of the production paradigm. It’s also a good opportunity to bond with clients and, sometimes, get some kick-ass homemade guacamole.

Did a particular film inspire you along this path in entertainment?
I have been around this business for quite a while, and one of the reasons I got into it was my love of film and filmmaking. I can’t say that one particular film inspired me to do this, but I remember being a young kid and my dad taking me to see The Towering Inferno in the movie theater. I was blown away.

What’s your favorite part of the job?
Choosing a spectacular bottle of wine for a favorite client and watching their face when they taste it. My least favorite has to be chasing down clients for past due invoices. It gets old very quickly.

What is your most productive time of the day?
It’s 6:30am with my first cup of coffee sitting at my kitchen counter before the day comes at me. I get a lot of good thinking and writing done in those early morning hours.

Original Bomb Pop via agency VMLY&R

If you didn’t have this job, what would you be doing instead?
I would own a combo bookstore/wine shop where people could come and enjoy two of my favorite things.

Why did you choose this profession?
I would say this profession chose me. I studied to be an actor and made my living at it for several years, but due to some family issues, I ended up taking a break for a few years. When I came back, I went for a job interview at FCB and the rest is history. I made the move from agency producing to post executive producer five years ago and have not looked back since.

Can you briefly explain one or more ways Sarofsky is addressing the issue of workplace diversity in its business?
We are a smallish women-owned business, and I am a gay man; diversity is part of our DNA. We always look out for the best talent but also try to ensure we are providing opportunities for people who may not have access to them. For example, one of our amazing summer interns came to us through a program called Kaleidoscope 4 Kids, and we all benefited from the experience.

Name some recent projects you have worked on, which are you most proud of, and why?
My first week here as EP, we went to LA for the friends and family screening of Guardians of the Galaxy, and I thought, what an amazing company I work for! Marvel Studios is a terrific production partner, and I would say there is something special about so many of our clients because they keep coming back. I do have a soft spot for our main title for Animal Kingdom just because I am a big Ellen Barkin fan.

Original Bomb Pop via agency VMLY&R

Name three pieces of technology you can’t live without.
I’d be remiss if I didn’t say my MacBook and iPhone, but I also wouldn’t want to live without my cooking thermometer, as I’ve learned how to make sourdough bread this year, and it’s essential.

What social media channels do you follow?
I am a big fan of Instagram; it’s just visual eye candy and provides a nice break during the day. I don’t really partake in much else unless you count NPR. They occupy most of my day.

Do you listen to music while you work? Care to share your favorite music to work to?
I go in waves. Sometimes I do, but then I won't listen to anything for weeks. But I recently enjoyed listening to "Ladies & Gentlemen: The Best of George Michael." It was great to listen to an entire album, a rare treat.

What do you do to de-stress from it all?
I get up early and either walk or do some type of exercise to set the tone for the day. It’s also so important to unplug; my partner and I love to travel, so we do that as often as we can. All that and a 2006 Chateau Margaux usually washes away the day in two delicious sips.

Filmmaker Hasraf “HaZ” Dulull talks masterclass on sci-fi filmmaking

By Randi Altman

Hasraf "HaZ" Dulull is a producer/director and a hands-on VFX and post pro. His most recent credits include the feature films 2036 Origin Unknown and The Beyond, the Disney TV series Fast Layne and the Disney Channel original movie Under the Sea — A Descendants Story, which takes place between Descendants 2 and 3. Recently, Dulull developed a masterclass on Sci-Fi Filmmaking, which can be bought or rented.

Why would this already very busy man decide to take on another project and one that is a little off his current path? Well, we reached out to find out.

Why, at this point in your career, did you think it was important to create this masterclass?
I have seen other masterclasses out there to do with filmmaking, and they were always academically oriented, which turned me off. The best ones were taught by actual filmmakers who had made commercial projects, films or TV shows… not just short films. So I knew that if I was to create and deliver a masterclass, I would do it after having made a couple of feature films that had been released out in the world. I wanted to lead by example and experience.

When I was in LA explaining to studio people, executives and other filmmakers how I made my feature films, they were impressed and fascinated with my process. They were amazed that I was able to pull off high-concept sci-fi films on tight budgets and schedules but still produce a film that looked expensive to make.

When I was researching existing masterclasses or online courses as references, I found that no one was actually going through the entire process. Instead they were offering specialized training in either cinematography or VFX, but there wasn’t anything about how to break down a script and put a budget and schedule together; how to work with locations to make your film work; how to use visual effects smartly in production; how to prepare for marketing and delivering your film for distribution. None of these things were covered as a part of a general masterclass, so I set out to fill that void with my masterclass series.

Clearly this genre holds a special place in your heart. Can you talk about why?
I think it's because the genre allows for so much creative freedom; sci-fi relies on world-building and imagination. That freedom leads to some "out of this world" storytelling and visuals, but on the flip side, it can tempt a filmmaker into being too ambitious on a tight budget. That overambitious need to create amazing worlds can lead to cheap-looking films. Not many filmmakers know how to do this in a fiscally sensible way, and they may try to make Star Wars on a shoestring budget. So this is why I decided to use the sci-fi genre in this masterclass to share my experience of smart filmmaking that achieves commercially successful results.

How did you decide on what topics to cover? What was your process?
I thought about the questions people and studio executives were asking me in those LA meetings, which pretty much boiled down to, "How did you put the movie together for that tight budget and schedule?" In answering that question, I ended up mapping out my process and the various stages and approaches I took in preproduction, production and post production, but also in the deliverables stage and the marketing and distribution stage. As an indie filmmaker, you really need a good grasp of that part to ensure your film can be released by distributors and received commercially.

I also wanted the classes/episodes to vary in length and run no more than around 10 minutes (the longest is around 12 minutes; the shortest is three). I went with a bite-sized approach to make the experience snappy and fun yet in-depth, so viewers can really soak in the knowledge. It also allows for repeat viewing.

Why was it important to teach these classes yourself?
I wanted it to feel raw and personal when talking about my experience of putting two sci-fi feature films together. Plus I wanted to talk about the constant problem solving, which is what filmmaking is all about. Teaching the class myself allowed me to get this all out of my system in my voice and style to really connect with the audience intimately.

Can you talk about what the experience will be like for the student?
I want the students to be like flies on the wall throughout the classes — seeing how I put those sci-fi feature films together. By the end of the series, I want them to feel like they have been on an entire production, from receiving a script to the releasing of the movie. The aim was to inspire others to go out and make their film. Or to instill confidence in those who have fears of making their film, or for existing filmmakers to learn some new tips and tricks because in this industry we are always learning on each project.

Why the rental and purchase options? What have most people been choosing?
Before I released it, one of the big factors that kept me up at night was how to make this accessible and affordable for everyone. The rental option is for those who can't afford to purchase the course but would love to experience it; they can do so at a reduced price but can only view it within a 48-hour window. The purchase price is higher, but you can access the course as many times as you like. It's pretty much the same model as iTunes when you rent or buy a movie.

So far I have found that people have been buying more than renting, which is great, as this means audiences want to do repeat viewings of the classes.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Review: Lenovo Yoga A940 all-in-one workstation

By Brady Betzel

While more and more creators are looking for alternatives to the iMac, iMac Pro and Mac Pro, there are only a few options with high-quality built-in monitors, among them the Microsoft Surface Studio, HP Envy and Dell 7000. There are even fewer choices if you want touch and pen capabilities. It's with that need in mind that I decided to review the Lenovo Yoga A940, a 27-inch UHD, pen- and touch-capable Intel Core i7 computer with an AMD Radeon RX 560 GPU.

While I haven’t done a lot of all-in-one system reviews like the Yoga A940, I have had my eyes on the Microsoft Surface Studio 2 for a long time. The only problem is the hefty price tag of around $3,500. The Lenovo’s most appealing feature — in addition to the tech specs I will go over — is its price point: It’s available from $2,200 and up. (I saw Best Buy selling a similar system to the one I reviewed for around $2,299. The insides of the Yoga and the Surface Studio 2 aren’t that far off from each other either, at least not enough to make up for the $1,300 disparity.)

Here are the parts inside the Lenovo Yoga A940:
• Intel Core i7-8700 3.2GHz processor (up to 4.6GHz with Turbo Boost), six cores (12 threads) and 12MB cache
• 27-inch 4K UHD IPS multitouch display with 100% Adobe RGB coverage
• 16GB DDR4 2666MHz (SODIMM) memory
• 1TB 5400 RPM hard drive plus 256GB PCIe SSD
• AMD Radeon RX 560 4GB graphics processor
• 25-degree monitor tilt angle
• Dolby Atmos speakers
• Dimensions: 25 inches by 18.3 inches by 9.6 inches; weight: 32.2 pounds
• 802.11ac and Bluetooth 4.2 connectivity
• Side panel inputs: Intel Thunderbolt, USB 3.1, 3-in-1 card reader and audio jack
• Rear panel inputs: AC-in, RJ45, HDMI and four USB 3.0
• Bluetooth active pen (appears to be the Lenovo Active Pen 2)
• Qi wireless charging platform

Digging In
Right off the bat, I just happened to put my Android Galaxy phone on the odd little flat platform located on the right side of the all-in-one workstation, just under the monitor, and I saw my phone begin to charge wirelessly. Qi wireless charging is an amazing little addition to the Yoga; it really comes through in a pinch when I need my phone charged and don't have a cable or charging dock around.

Other than that nifty feature, why would you choose a Lenovo Yoga A940 over any other all-in-one system? Well, as mentioned, the price point is very attractive, but you are also getting a near-professional-level system in a very tiny footprint, including Thunderbolt 3 and USB connections, an HDMI port, a network port and an SD card reader. While it would be incredible to have an Intel i9 processor inside the Yoga, the i7 clocks in at 3.2GHz with six cores. It's not a beast, but it's enough to get the job done in Adobe Premiere and Blackmagic's DaVinci Resolve, though likely with transcoded files rather than Red raw or the like.

The Lenovo Yoga A940 is outfitted with a front-facing Dolby Atmos audio speaker as well as Dolby Vision technology in the IPS display. The audio could use a little more low end, but it is good. The monitor is surprisingly great — the whites are white and the blacks are black; something not everyone can get right. It has 100% Adobe RGB color coverage and is Pantone-validated. The HDR is technically Dolby Vision and looks great at about 350 nits (not the brightest, but it won’t burn your eyes out either). The Lenovo BT active pen works well. I use Wacom tablets and laptop tablets daily, so this pen had a lot to live up to. While I still prefer the Wacom pen, the Lenovo pen, with 4,096 levels of sensitivity, will do just fine. I actually found myself using the touchscreen with my fingers way more than the pen.

One feature that sets the A940 apart from the other all-in-one machines is the USB Content Creation dial. With the little time I had with the system, I only used it to adjust speaker volume when playing Spotify, but in time I can see myself customizing the dials to work in Premiere and Resolve. The dial has good action and resistance. To customize the dial, you can jump into the Lenovo Dial Customization Assistant.

Besides the Intel i7, there is an AMD Radeon RX 560 with 4GB of memory, two 3W and two 5W speakers, 32GB of DDR4 2666MHz memory, a 1TB 5400 RPM hard drive for storage and a 256GB PCIe SSD. I wish the 1TB drive were also an SSD, but obviously Lenovo has to keep that price point down somehow.

Real-World Testing
I use Premiere Pro, After Effects and Resolve all the time and can understand the horsepower of a machine through these apps. Whether editing and/or color correcting, the Lenovo A940 is a good medium ground — it won’t be running much more than 4K Red raw footage in real time without cutting the debayering quality down to half if not one-eighth. This system would make a good “offline” edit system, where you transcode your high-res media to a mezzanine codec like DNxHR or ProRes for your editing and then up-res your footage back to the highest resolution you have. Or, if you are in Resolve, maybe you could use optimized media for 80% of the workflow until you color. You will really want a system with a higher-end GPU if you want to fluidly cut and color in Premiere and Resolve. That being said, you can make it work with some debayer tweaking and/or transcoding.
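As a rough illustration of that offline approach (my own sketch, not something from the review), the Python script below shells out to FFmpeg to batch-transcode a folder of camera files into DNxHR mezzanine files for editing. The folder names and profile choice are assumptions, and note that stock FFmpeg cannot decode R3D natively, so Red material would first need to be debayered in Resolve or REDCINE-X; the sketch assumes sources FFmpeg can read.

```python
# Batch-transcode camera originals to a DNxHR HQ mezzanine for offline editing.
# Assumptions: FFmpeg is on PATH, sources are FFmpeg-readable .mov files,
# and the folder names below are hypothetical.
import subprocess
from pathlib import Path

SRC = Path("camera_originals")   # hypothetical input folder
DST = Path("offline_dnxhr")      # hypothetical output folder
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-c:v", "dnxhd", "-profile:v", "dnxhr_hq",  # DNxHR HQ (8-bit 4:2:2)
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",                        # uncompressed audio
        str(DST / clip.name),
    ], check=True)
```

Swapping the video options for -c:v prores_ks -profile:v 3 would produce ProRes HQ instead; either way, the edit is then conformed back to the camera originals for the final color pass.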

In my testing I downloaded some footage from Red’s sample library, which you can find here. I also used some BRAW clips to test inside of Resolve, which can be downloaded here. I grabbed 4K, 6K, and 8K Red raw R3D files and the UHD-sized Blackmagic raw (BRAW) files to test with.

Adobe Premiere
Using the same Red clips as above, I created two one-minute-long UHD (3840×2160) sequences. I also clicked "Set to Frame Size" for all the clips. Sequence 1 contained these clips with a simple contrast, brightness and color cast applied. Sequence 2 contained the same clips with the same color correction applied, but also a 110% resize, 100 sharpen and 20 Gaussian Blur. I then exported them to various codecs via Adobe Media Encoder using OpenCL for processing. Here are my results:

QuickTime (.mov) H.264, no audio, UHD, 23.98fps, Maximum Render Quality, 10Mb/s:
Color Correction Only: 24:07
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 26:11

DNxHR HQX 10-bit UHD:
Color Correction Only: 25:42
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 27:03

ProRes HQ:
Color Correction Only: 24:48
Color Correction w/ 110% Resize, 100 Sharpen, 20 Gaussian Blur: 25:34

As you can see, the export time is pretty long. And let me tell you, once the sequence with the Gaussian Blur and Resize kicked in, so did the fans. While it wasn’t like a jet was taking off, the sound of the fans definitely made me and my wife take a glance at the system. It was also throwing some heat out the back. Because of the way Premiere works, it relies heavily on the CPU over GPU. Not that it doesn’t embrace the GPU, but, as you will see later, Resolve takes more advantage of the GPUs. Either way, Premiere really taxed the Lenovo A940 when using 4K, 6K and 8K Red raw files. Playback in real time wasn’t possible except for the 4K files. I probably wouldn’t recommend this system for someone working with lots of higher-than-4K raw files; it seems to be simply too much for it to handle. But if you transcode the files down to ProRes, you will be in business.

Blackmagic Resolve 16 Studio
Resolve seemed to take better advantage of the AMD Radeon RX 560 GPU in combination with the CPU, as well as the onboard Intel GPU. In this test I added in Resolve’s amazing built-in spatial noise reduction, so other than the Red R3D footage, this test and the Premiere test weren’t exactly comparing apples to apples. Overall the export times will be significantly higher (or, in theory, they should be). I also added in some BRAW footage to test for fun, and that footage was way easier to work and color with. Both sequences were UHD (3840×2160) 23.98. I will definitely be looking into working with more BRAW footage. Here are my results:

Playback: 4K played in realtime at half-res premium decode quality; 6K and 8K would not play in realtime

H.264, no audio, UHD, 23.98fps, force sizing and debayering to highest quality. In each test, Export 1 used the native renderer, Export 2 the AMD renderer and Export 3 Intel QuickSync.

Color Only
Export 1: 3:46
Export 2: 4:35
Export 3: 4:01

Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:51
Export 2: 37:21
Export 3: 37:13

BRAW 4K (4608×2592) Playback and Export Tests

Playback: Full-res would play at about 22fps; half-res plays at realtime

H.264, no audio, UHD, 23.98fps, force sizing and debayering to highest quality

Color Only
Export 1: 1:26
Export 2: 1:31
Export 3: 1:29

Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur
Export 1: 36:30
Export 2: 36:24
Export 3: 36:22

DNxHR 10 bit:
Color Correction Only: 3:42
Color, 110% Resize, Spatial NR: Enhanced, Medium, 25; Sharpening, Gaussian Blur: 39:03

One takeaway from the Resolve exports is that the color-only export was much more efficient than in Premiere, running at just under four times realtime for the intensive Red R3D files and roughly one and a half times realtime for BRAW.
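Those multiples are just export time divided by timeline length (the Premiere sequences were one minute long, and the comparison implies the Resolve timelines were too). A quick sketch using the timings quoted above:

```python
# Convert "mm:ss" export times for a one-minute timeline into realtime multiples.
def realtime_multiple(export_mmss: str, timeline_seconds: int = 60) -> float:
    minutes, seconds = map(int, export_mmss.split(":"))
    return (minutes * 60 + seconds) / timeline_seconds

for label, t in [("Resolve R3D, color only", "3:46"),
                 ("Resolve BRAW, color only", "1:26"),
                 ("Premiere R3D, color only", "24:07")]:
    print(f"{label}: {realtime_multiple(t):.2f}x realtime")
```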

Summing Up
In the end, the Lenovo A940 is a sleek-looking all-in-one touchscreen- and pen-compatible system. While it isn't jam-packed with the latest high-end AMD GPUs or Intel i9 processors, the A940 is a mid-level system with an incredibly good-looking IPS Dolby Vision monitor and Dolby Atmos speakers. It also has some features you might not necessarily be looking for but will love to find, like an IR camera, Qi wireless charger and the USB dial.

The power adapter is like a large laptop power brick, so you will need somewhere to stash that, but overall the monitor has a really nice 25-degree tilt that is comfortable when using just the touchscreen or pen, or when using the wireless keyboard and mouse.

Because the Lenovo A940 starts at around $2,299, I think it really deserves a look when you're searching for a new system. If you are working primarily in HD video and/or graphics, this is the all-in-one system for you. Check out more at Lenovo's website.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Bonfire adds Jason Mayo as managing director/partner

Jason Mayo has joined digital production company Bonfire in New York as managing director and partner. Industry veteran Mayo will be working with Bonfire’s new leadership lineup, which includes founder/Flame artist Brendan O’Neil, CD Aron Baxter, executive producer Dave Dimeola and partner Peter Corbett. Bonfire’s offerings include VFX, design, CG, animation, color, finishing and live action.

Mayo comes to Bonfire after several years building Postal, the digital arm of the production company Humble. Prior to that he spent 14 years at Click 3X, where he worked closely with Corbett as his partner. While there he also worked with Dimeola, who cut his teeth at Click as a young designer/compositor. Dimeola later went on to create The Brigade, where he developed the network and technology that now forms the remote, cloud-based backbone referred to as the Bonfire Platform.

Mayo says a number of factors convinced him that Bonfire was the right fit for him. “This really was what I’d been looking for,” he says. “The chance to be part of a creative and innovative operation like Bonfire in an ownership role gets me excited, as it allows me to make a real difference and genuinely effect change. And when you’re working closely with a tight group of people who are focused on a single vision, it’s much easier for that vision to be fully aligned. That’s harder to do in a larger company.”

O’Neil says that having Mayo join as partner/MD is a major move for the company. “Jason’s arrival is the missing link for us at Bonfire,” he says. “While each of us has specific areas to focus on, we needed someone who could both handle the day to day of running the company while keeping an eye on our brand and our mission and introducing our model to new opportunities. And that’s exactly his strong suit.”

For the most part, Mayo’s familiarity with his new partners means he’s arriving with a head start. Indeed, his connection to Dimeola, who built the Bonfire Platform — the company’s proprietary remote talent network, nicknamed the “secret sauce” — continued as Mayo tapped Dimeola’s network for overflow and outsourced work while at Postal. Their relationship, he says, was founded on trust.

“Dave came from the artist side, so I knew the work I’d be getting would be top quality and done right,” Mayo explains. “I never actually questioned how it was done, but now that he’s pulled back the curtain, I was blown away by the capabilities of the Platform and how it dramatically differentiates us.

“What separates our system is that we can go to top-level people around the world but have them working on the Bonfire Platform, which gives us total control over the process,” he continues. “They work on our cloud servers with our licenses and use our cloud rendering. The Platform lets us know everything they’re doing, so it’s much easier to track costs and make sure you’re only paying for the work you actually need. More importantly, it’s a way for us to feel connected – it’s like they’re working in a suite down the hall, except they could be anywhere in the world.”

Mayo stresses that while the cloud-based Platform is a huge advantage for Bonfire, it’s just one part of its profile. “We’re not a company riding on the backs of freelancers,” he points out. “We have great, proven talent in our core team who work directly with clients. What I’ve been telling my longtime client contacts is that Bonfire represents a huge step forward in terms of the services and level of work I can offer them.”

Corbett believes he and Mayo will continue to explore new ways of working now that he’s at Bonfire. “In the 14 years Jason and I built Click 3X, we were constantly innovating across both video and digital, integrating live action, post production, VFX and digital engagements in unique ways,” he observes. “I’m greatly looking forward to continuing on that path with him here.”

Technicolor Post opens in Wales 

Technicolor has opened a new facility in Cardiff, Wales, within Wolf Studios. This expansion of the company’s post production footprint in the UK is a result of the growing demand for more high-quality content across streaming platforms and the need to post these projects, as well as the growth of production in Wales.

The facility is connected to all of Technicolor's locations worldwide through the Technicolor Production Network, giving creatives easy access to their projects no matter where they are shooting or posting.

The facility, an extension of Technicolor’s London operations, supports all Welsh productions and features a multi-purpose, state-of-the-art suite as well as space for VFX and front-end services including dailies. Technicolor Wales is working on Bad Wolf Production’s upcoming fantasy epic His Dark Materials, providing picture and sound services for the BBC/HBO show. Technicolor London’s recent credits include The Two Popes, The Souvenir, Chernobyl, Black Mirror, Gentleman Jack and The Spanish Princess.

Within this new Cardiff facility, Technicolor is offering 2K digital cinema projection, FilmLight Baselight color grading, realtime 4K HDR remote review, 4K OLED video monitoring, 5.1/7.1 sound, ADR recording/source connect, Avid Pro Tools sound mixing, dailies processing and Pulse cloud storage.

Bad Wolf Studios in Cardiff offers 125,000 square feet of stage space with five stages. There is flexible office space, as well as auxiliary rooms and costume and props storage.

Rising Sun Pictures’ Anna Hodge talks VFX education and training

Based in Adelaide, South Australia, Rising Sun Pictures (RSP) has created stunning visual effects for films including Spider-Man: Far From Home, Captain Marvel, Thor: Ragnarok and Game of Thrones.

It also operates a visual effects training program in conjunction with the University of South Australia in which students learn such skills as compositing, tracking, effects, lighting, look development and modeling from working professionals. Thanks to this program, many students have landed jobs in the industry.

We recently spoke with RSP’s manager of training and education, Anna Hodge, about the school’s success.

Tell us about the education program at Rising Sun Pictures.
Rising Sun Pictures is an independently owned visual effects company. We’ve worked on more than 130 films, as well as commercials and streaming series, and we are very much about employing locals from South Australia. When this is not possible, we hire staff from interstate and overseas for key senior positions.

Our education program was established in 2015 in conjunction with the University of South Australia (UniSA) in order to directly feed our junior talent pool. We found there was a gap between traditional visual effects training and the skills young artists needed to hit the ground running in a studio.

How is the program structured?
We began with a single 12-week Graduate Certificate in Visual Effects program designed for students coming out of vocational colleges and universities who want to improve their skills and employability. Students apply through a portfolio process. The program accepts 10 students each term and exposes them to Foundry's Nuke and other visual effects software. They gain experience by working on shots from past movies and creating a short film.

The idea is to give them a true industry experience, develop a showreel in the process and gain a qualification through a prestigious university. Our students are exposed to the studio floor from day one. They attend RSP five days a week. They work in our training rooms and are immersed in the life of the company. We want them to feel as much a part of RSP as our regular employees.

Our program has grown to include two graduate certificate streams: we added the Graduate Certificate in Effects and Lighting, and our first graduate certificate was rebadged as the Graduate Certificate in Compositing and Tracking. Both have been highly successful in helping graduates secure employment at RSP after their studies.

Anna Hodge and students

We also offer course work toward the university’s media arts degree. We teach two elective courses in the second year, specializing in modeling and texturing and look development and lighting. The university students attend RSP as part of their studies at UniSA. It gives them exposure to our artists, industry-type projects and expectations of the industry through workshop-based delivery.

In 2019, our education program expanded, and we introduced “visual effects specialization” as part of the media arts degree. Unlike any other degree, the students spend their entire last year of studies at RSP. They are integrated with the graduate certificate classes, and learning at RSP for the whole year enables them to build skills in both compositing and tracking and effects and lighting, making them highly skilled and desirable employees at the end of their studies.

What practical skills do students learn?
In the Media Arts Modeling and Texturing elective course, they are exposed to Maya and are introduced to Pixologic ZBrush. In the second semester, they can study look development and lighting and learn Substance Painter and how to light in SideFX Houdini.

Both degree and graduate certificate students in the dynamic effects and lighting course receive around nine weeks of Houdini training and then move onto lighting. Those in the compositing and tracking stream learn Nuke, as well as 3D Equalizer and Silhouette. All our degree and graduate certificate students are also exposed to Autodesk’s Shotgun. They learn the tools we use on the floor and apply them in the same workflow.

Skills are never taught in isolation. They learn how they fit into the whole movie-making process. Working on the short film project, run in conjunction with We Made a Thing Studios (WEMAT), students learn how to work collaboratively, take direction and gain other necessary skills required for working in a deadline-driven environment.

Where do your students come from?
We attract applications from South Australia, and over the past few years, applications from interstate and overseas have significantly increased. The benefit of our program is that it's only 12 weeks long, so students can pick up the skills they require without a huge investment of time. There is strong job growth in South Australia, so graduates are often employed locally, though some return to their hometowns to find work.

What are the advantages of training in a working VFX studio?
Our training goes beyond simple software skills. Our students are taught by some of our best artists in the world and professionals who have been working in the industry for years. Students can walk around the studio, talk to and shadow artists, and attend a company staff meeting. We schedule what we call “Day in the Life Of” presentations so students can gain an understanding of the various roles that make up our company. Students hear from department heads, senior artists, producers and even juniors. They talk about their jobs and their pathways into the industry. They provide students with sound practical advice on how to improve their skills and present themselves. We also run sessions with recruiters, who share insights in building good resumes and showreels.

We are always trying to reinvent and improve what we do. I have one-on-ones with students to find out how they are doing and what we can do to improve their learning experience. We take feedback seriously. Our instructors are passionate artists and educators. Over time, I think we’ve built something quite unique and special at RSP.

How do you support your students in their transition from the program into the professional world?
We have an excellent relationship with recruiters at other visual effects companies in South Australia, interstate and globally, and we use those connections to help our students find work. A VFX company that opened in Brisbane recently hired two of our students and wants to hire more.

Of course, one reason we created the program was to meet our own need for juniors, so I work closely with our department heads to meet their needs. If a job lands and they have positions open, I will refer students for interviews. Many of our students stay in touch after they leave here; our support doesn't stop after 12 weeks. When former students add new material to their showreels, I encourage them to send them in, and I forward them to the relevant heads of department. When one of our graduates secures his or her first VFX job, it's the best news. It really makes my day.

How do you see the program evolving over the next few years?
We are working on new initiatives with UniSA. Nothing to reveal yet, but I do expect our numbers to grow simply because our graduate results are excellent. Our employment rate is well above 70 percent. I spoke with someone yesterday who is looking to apply next year. She was at a recent film event and met a bunch of our graduates who raved about the programs they studied at RSP. Hearing that sort of thing is really exciting and something that we are really proud of.

RSP and UniSA are both mindful that when scaling up we don’t compromise on quality delivery. It is important to us that students consistently receive the same high-quality training and support regardless of class size.

Do you feel that visual effects offer a strong career path?
Absolutely. I am constantly contacted by recruiters who are looking to hire our graduates. I don’t foresee a lack of jobs, only a lack of qualified artists. We need to keep educating students to avoid a skill shortage. There has never been a better time to train for a career in visual effects.