Category Archives: compositing

Behind the Title: MPC Senior Compositor Ruairi Twohig

After studying hand-drawn animation, this artist found his way to visual effects.

NAME: NYC-based Ruairi Twohig

COMPANY: Moving Picture Company (MPC)

CAN YOU DESCRIBE YOUR COMPANY?
MPC is a global creative and visual effects studio with locations in London, Los Angeles, New York, Shanghai, Paris, Bangalore and Amsterdam. We work with clients and brands across a range of different industries, handling everything from original ideas through to finished production.

WHAT’S YOUR JOB TITLE?
I work as a 2D lead/senior compositor.

Cadillac

WHAT DOES THAT ENTAIL?
The tasks and responsibilities can vary depending on the project. My involvement with a project can begin before there's even a script or storyboard, when we need to estimate how much VFX will be involved and how long it will take. As the project develops and the direction becomes clearer, with scripts and storyboards and concept art, we refine this estimate and schedule and work with our clients to plan the shoot and make sure we have all the information and assets we need.

Once the commercial is shot and we have an edit, the bulk of the post work begins. This can involve anything from compositing fully CG environments, dragons or spaceships to beauty and product/pack-shot touch-ups or rig removal. So my role involves a combination of overall project management and planning, but I also get into the detailed shot work and ultimately deliver the final picture. The majority of the work I do requires a large team of people with different specializations, and those are usually the projects I find the most fun and rewarding due to the collaborative nature of the work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think the variety of the work would surprise most people unfamiliar with the industry. In a single day, I could be working on two or three completely different commercials with completely different challenges while also bidding future projects or reviewing prep work in the early stages of a current project.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I’ve been working in the industry for over 10 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING?
The VFX industry is always changing. I find it exciting to see how quickly the technology is advancing and becoming more widely accessible, cost-effective and faster.

I still find it hard to comprehend the idea of using optical printers for VFX back in the day … before my time. Some of the most interesting areas for me at the moment are the developments in realtime rendering from engines such as Unreal and Unity, and the implementation of AI/machine learning tools that might be able to automate some of the more time-consuming tasks in the future.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
I remember when I was 13, my older brother — who was studying architecture at the time — introduced me to 3ds Max, and I started playing around with some very simple modeling and rendering.

I would buy these monthly magazines like 3D World, which came with demo discs for different software and some CG animation compilations. One of the issues included the short CG film Fallen Art by Tomek Baginski. At the time I was mostly familiar with Pixar’s feature animation work like Toy Story and A Bug’s Life, so watching this short film created using similar techniques but with such a dark, mature tone and story really blew me away. It was this film that inspired me to pursue animation and, ultimately, visual effects.

DID YOU GO TO FILM SCHOOL?
I studied traditional hand-drawn animation at the Dun Laoghaire Institute of Art, Design and Technology in Dublin. This was a really fun course in which we spent the first two years focusing on the craft of animation and the fundamental principles of art and design, followed by another two years in which we had a lot of freedom to make our own films. It was during these final two years of experimentation that I started to move away from traditional animation and focus more on learning CG and VFX.

I really owe a lot to my tutors, who were really supportive during that time. I also had the opportunity to learn from visiting animation masters such as Andreas Deja, Eric Goldberg and John Canemaker. Although on the surface the work I do as a compositor is very different to animation, understanding those fundamental principles has really helped my compositing work; any additional disciplines or skills you develop in your career that require an eye for detail and aesthetics will always make you a better overall artist.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Even after 10 years in the industry, I still get satisfaction from the problem-solving aspect of the job, even on the smaller tasks. I love getting involved on the more creative projects, where I have the freedom to develop the “look” of the commercial/film. But, day to day, it’s really the team-based nature of the work that keeps me going. Working with other artists, producers, directors and clients to make a project look great is what I find really enjoyable.

WHAT’S YOUR LEAST FAVORITE?
Sometimes even if everything is planned and scheduled accordingly, a little hiccup along the way can easily impact a project, especially on jobs where you might only have a limited amount of time to get the work done. So it’s always important to work in such a way that allows you to adapt to sudden changes.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I used to draw all day, every day as a kid. I still sketch occasionally, but maybe I would have pursued a more traditional fine art or illustration career if I hadn’t found VFX.

Tiffany & Co.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Over the past year, I’ve worked on projects for clients such as Facebook, Adidas, Samsung and Verizon. I also worked on the Tiffany & Co. campaign “Believe in Dreams” directed by Francis Lawrence, as well as the company’s holiday campaign directed by Mark Romanek.

I also worked on Cadillac’s “Rise Above” campaign for the 2019 Oscars, which was challenging since we had to deliver four spots within a short timeframe. But it was a fun project. There was also the Michelob Ultra Robots Super Bowl spot earlier this year. That was an interesting project, as the work was completed between our LA, New York and London studios.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Last year, I had the chance to work with my friend and director Sofia Astrom on the music video for the song “Bone Dry” by Eels. It was an interesting project since I’d never done visual effects for a stop-motion animation before. This had its own challenges, and the style of the piece was very different compared to what I’m used to working on day to day. It had a much more handmade feel to it, and the visual effects design had to reflect that, which was such a change to the work I usually do in commercials, which generally leans more toward photorealistic visual effects work.

WHAT TOOLS DO YOU USE DAY TO DAY?
I mostly work with Foundry Nuke for shot compositing. When leading a job that requires a broad overview of the project and timeline management/editorial tasks, I use Nuke Studio or Autodesk Flame, depending on the requirements of the project. I also use ftrack daily for project management.

WHERE DO YOU FIND INSPIRATION NOW?
I follow a lot of incredibly talented concept artists and photographers/filmmakers on Instagram. Viewing these images/videos on a tiny phone doesn’t always do justice to the work, but the platform is so active that it’s a great resource for inspiration and finding new artists.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to run and cycle around the city when I can. During the week it can be easy to get stuck in a routine of sitting in front of a screen, so getting out and about is a much-needed break for me.

Meet the Artist: The Mill’s Anne Trotman

Anne Trotman is a senior Flame artist and VFX supervisor at The Mill in New York. She specializes in beauty and fashion work but gets to work on a variety of other projects as well.

A graduate of King's College in London, Trotman took on what she calls "a lot of very random temp jobs" before finally joining London's Blue Post Production as a runner.

“In those days a runner did a lot of ‘actual’ running around SoHo, dropping off tapes and picking up lunches,” she says, admitting she was also sent out for extra green for color bars and warm sake at midnight. After being promoted to the machine room, she spent her time assisting all the areas of the company, including telecine grading, offline, online, VFX and audio. “This gave me a strong understanding of the post production process as a whole.”

Trotman then joined the 2D VFX teams from Blue, Clear Post Production, The Hive and VTR to create a team at Prime Focus London. She moved into film compositing, where she headed up the 2D team as a senior Flame operator, overseeing projects, including shot allocation and VFX reviews. Then she joined SFG-Technicolor's commercials facility in Shanghai. After a year in China, she joined The Mill in New York, where she is today.

We reached out to Trotman to find out more about The Mill, a technology and visual effects studio, how she works and some recent projects. Enjoy.

Bumble

Can you talk about some recent high-profile projects you’ve completed?
The most recent high-profile project I've worked on was Bumble's Super Bowl 2019 spot, the company's first commercial ever. Because Bumble is a female-founded company, it was important for this project to celebrate female artists and empowerment, something I strongly support. I was thrilled to lead an all-female team for this project. The agency creatives and producers were all female, and so was almost the whole post team, including the editor, colorist and all the VFX artists.

How did you first learn Flame, and how has your use of it evolved over the years?
I had been assisting artists working on a Quantel Editbox at Blue. They then installed a Flame and hired a female artist who had worked on Gladiator. That’s when I knew I had found my calling. Working with technical equipment was very attractive to me, and in those days it was a dark art, and you had to work in a company to get your hands on one. I worked nights doing a lot of conforming and rotoscoping. I also started doing small jobs for clients I knew well. I remember assisting on an Adele pop video, which is where my love of beauty started.

When I first started using Flame, the whole job was usually completed by one artist. These days, jobs are much bigger, and with so many versions for social media, much of my day is spent coordinating the team of artists. Worksharing and remote artists are becoming a big part of our industry, so communicating with artists all over the world to bring everything together into the final film has become a big part of my job.

In addition to Flame, what other tools are used in your workflow?
Post production has changed so much in the past five years. My job is not just to press buttons on a Flame to get a commercial on television anymore; that’s only a small part. My job is to help the director and/or the agency position a brand and connect it with the consumer.

My workflow usually starts with bidding an agency or a director’s brief. Sometimes they need tests to sell an idea to a client. I might supervise a previz artist on Maxon Cinema 4D to help them achieve the director’s vision. I attend most of the shoots, which gives me an insight into the project while assessing the client’s goals and vision. I can take Flame on a laptop to my shoots to do tests for the director to help explain how certain shots will look after post. This process is so helpful all around in order for me to see if what we are shooting is correct and for the client to understand the director’s vision.

At The Mill, I work closely with the colorists who work on FilmLight Baselight before completing the work on Flame. All the artists at The Mill use Flame and Foundry Nuke, although my Flame skills are 100% better than my Nuke skills.

What are the most fulfilling aspects of the work you do?
I’m lucky to work with many directors and agency creatives that I now call friends. It still gives me a thrill when I’m able to interpret the vision of the creative or director to create the best work possible and convey the message of the brand.

I also love working with the next generation of artists. I especially love being able to work alongside the young female talent at The Mill. This is the first company I’ve worked at where I’ve not been “the one and only female Flame artist.”

At The Mill NY, we currently have 11 full-time female 2D artists working on our team, which has a 70/30 male-to-female ratio. There's still a way to go to get to 50/50, so if I can inspire another female intern or runner who is thinking of becoming a VFX artist or colorist, then it's a good day. Helping the cycle continue for female artists is so important to me.

What is the greatest challenge you’ve faced in your career?
Moving to Shanghai. Not only did I have the challenge of the language barrier to overcome but also the culture — from having lunch at noon to working with clients from a completely different background than mine. I had to learn all I could about the Chinese culture to help me connect with my clients.

Covergirl with Issa Rae

Out of all of the projects you’ve worked on, which one are you the most proud of?
There are many, but one that stands out is the Covergirl brand relaunch (2018) for director Matt Lambert at Prettybird. As an artist working on high-profile beauty brands, what they stand for is very important to me. I know every young girl will want to use makeup to make themselves feel great, but it’s so important to make sure young women are using it for the right reason. The new tagline “I am what I make-up” — together with a very diverse group of female ambassadors — was such a positive message to put out into the world.

There was also 28 Weeks Later, a feature film from director Juan Carlos Fresnadillo. My first time working on a feature was an amazing experience. I got to make lifelong friends working on this project. My technical abilities as an artist grew so much that year, from learning the patience needed to work on the same shot for two months to discovering the technical difficulties in compositing fire to be able to blow up parts of London. Such fun!

Finally, there was also a spot for the Target Summer 2019 campaign. It was directed by Whitelabel's Lacey, who I collaborate with on a lot of projects. Tristan Sheridan was the DP, and the agency was Mother NY.

Target Summer Campaign

What advice do you have for a young professional trying to break into the industry?
Try everything. Don’t get pigeonholed into one area of the industry too early on. Learn about every part of the post process; it will be so helpful to you as you progress through your career.

I was lucky my first boss in the industry (Dave Cadle) was patient and gave me time to find out what I wanted to focus on. I try to be a positive mentor to the young runners and interns at The Mill, especially the young women. I was so lucky to have had female role models throughout my career, from the person that employed me to the first person that started training me on Flame. I know how important it is to see someone like you in a role you are thinking of pursuing.

Outside of work, how do you enjoy spending your free time?
I travel as much as I can. I love learning about new cultures; it keeps me grounded. I live in New York City, which is a bubble, and if you stay here too long, you start to forget what the real world looks like. I also try to give back when I can. I’ve been helping a director friend of mine with some films focusing on the issue of female homelessness around the world. We collaborated on some lovely films about women in LA and are currently working on some London-based ones.


Anne Trotman Image: Photo by Olivia Burke


Review: FXhome’s HitFilm Pro 12 for editing, compositing, VFX

By Brady Betzel

If you have ever worked in Adobe Premiere Pro, Apple FCP X or Avid Media Composer and wished you could just flip a tab and be inside After Effects, with access to 3D objects directly in your timeline, you are going to want to take a look at FXhome’s HitFilm Pro 12.

Similar to how Blackmagic brought Fusion inside of its most recent versions of DaVinci Resolve, HitFilm Pro offers a nonlinear editor, a composite/VFX suite and a finishing suite combined into one piece of software. Haven’t heard about HitFilm yet? Let me help fill in some blanks.

Editing and 3D model import

What is HitFilm Pro 12?
Technically, HitFilm Pro 12 is a non-subscription-based nonlinear editor, compositor and VFX suite that costs $299. Not only does that price include 12 months of updates and tech support, but one license can be used on up to three computers simultaneously. In my eyes, HitFilm Pro is a great tool set for independent filmmakers, social media content generators and any editor who goes beyond editing and dives into 3D modeling, tracking, keying and the like, without necessarily having to fork over money for a bunch of expensive third-party plugins. That doesn't mean you won't want to buy third-party plugins, but you are less likely to need them with HitFilm's expansive list of native features and tools.

At my day job, I use Premiere, After Effects, Media Composer and Resolve. I often come home and want to work in something that has everything inside, and that is where HitFilm Pro 12 lives. Not only does it have the professional functionality that I am used to, such as trimming, color scopes and more, but it also has BorisFX’s Mocha planar tracking plugin built in for no extra cost. This is something I use constantly and love.

One of the most interesting recent updates to HitFilm Pro 12 is the ability to use After Effects plugins. Not all plugins will work since there are so many, but in a video released after NAB 2019, FXhome said plugins like Andrew Kramer's Video Copilot Element 3D and ones from Red Giant are on the horizon. If you are within your support window, or you continue to purchase HitFilm, FXhome will work with you to get your favorite After Effects plugins working directly inside of HitFilm.

Timeline and 3D model editor

Some additional updates to HitFilm Pro 12 include a completely redesigned user interface that resembles Premiere Pro… kind of. Threaded rendering has also been added, so Windows users with Intel and Nvidia hardware will see increased GPU speeds. There is also the ability to add titles directly in the editor, and more.

The Review
So how does HitFilm Pro 12 compare to today's modern software packages? That is an interesting question. I have become more and more of a Resolve convert over the past two years, so I am constantly comparing everything to that. In addition, being an Avid user for over 15 years, I am used to a rock-solid NLE with only a few hiccups here and there. In my opinion, HitFilm Pro 12 lands itself right where Premiere and FCP X live.

It feels prosumer-y, in a YouTuber or content-generator capacity. Would it stand up to 10 hours of abuse with content over 45 minutes? It probably would, but much like with Premiere, I would probably split my edits in scenes or acts to avoid slowdowns, especially when importing things like OBJ files or composites.

The nonlinear editor portion feels like Premiere and FCP X had a baby but left out FCP X's Magnetic Timeline feature. The trimming in the timeline feels smooth, and after about 20 minutes of getting comfortable with it, I felt like it was what I am generally used to. Cutting in footage feels good, whether using three-point edits or simply dragging and dropping. Using effects feels very similar to the Adobe world, where you can stack them on top of clips and they process from the top down.

Mocha within HitFilm Pro

Where HitFilm Pro 12 shines is in the inclusion of typically third-party tools directly in the timeline. From the ability to create a scene with 3D cameras and particle generators to being able to track using BorisFX's Mocha, HitFilm Pro 12 has many features that will help take your project to the next level. With HitFilm Pro 12's true 3D cameras, you can take flat text and enhance it with raytraced lighting, shadows and even textures. You can use the included BorisFX Continuum 3D Objects to make great titles relatively easily, and to take it a step further, you can track and animate them.

Color Tools
By day, I am an online editor/colorist who deals with the finishing aspect of media creation. Throughout the process, from color correction to exporting files, I need tools that are not only efficient but accurate. When I started to dig into the color correction side of HitFilm Pro 12, things slowed down for me. The color correction tools are very close to what you’ll find in other NLEs, like Premiere and FCP X, but they don’t quite rise to the level of Resolve. HitFilm Pro 12 does operate inside of a 32-bit color pipeline, which really helps avoid banding and other errors when color correcting. However, I didn’t feel that the toolset was making me more efficient; in fact, it was the opposite. I felt like I had to learn FXhome’s way of doing it. It wasn’t that it totally slowed me down, but I felt it could be better.
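
For context on why that 32-bit pipeline matters, here is a quick illustration of the banding arithmetic. This is a generic numpy sketch, not anything specific to HitFilm: quantizing a subtle gradient to 8 bits leaves only a couple dozen discrete steps, while float precision keeps every value distinct.

```python
import numpy as np

# A subtle 10% luminance ramp across a 1920-pixel-wide frame.
ramp = np.linspace(0.0, 0.1, 1920, dtype=np.float32)

# Quantize to 8-bit levels, as a low-bit-depth pipeline would.
banded = np.round(ramp * 255.0) / 255.0

print(np.unique(banded).size)  # ~27 discrete steps -> visible banding
print(np.unique(ramp).size)    # 1920 distinct float values -> smooth gradient
```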

Color

Summing Up
In the end, HitFilm Pro 12 will fill a lot of holes for individual content creators. If you love learning new things (like I do), then HitFilm Pro 12 will be a good investment of your time. In fact, FXhome posts tons of video tutorials on all sorts of good and topical stuff, like how to create a Stranger Things intro title.

If you are a little more inclined to work with a layer-based workflow, like in After Effects, then HitFilm Pro 12 is the app you'll want to learn. Check out HitFilm Pro 12 on FXhome's website and definitely watch some of the company's informative tutorials.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


FXhome, Vegas Creative Software partner on Vegas Post

HitFilm creator FXhome has partnered with Vegas Creative Software to launch a new suite of editing, VFX, compositing and imaging tools for video pros, editors and VFX artists called Vegas Post.

Vegas Post will combine the editing tools of Vegas Pro with FXhome’s expertise in compositing and visual effects to offer an array of features and capabilities.

FXhome is developing customized effects and compositing tools specifically for Vegas Post. The new software suite will also integrate a custom-developed version of FXhome’s new non-destructive RAW image compositor that will enable video editors to work with still-image and graphical content and incorporate it directly into their final productions. All tools will work together seamlessly in an integrated, end-to-end workflow to accelerate and streamline the post production process for artists.

The new software suite is ideally suited for video pros in post facilities of all sizes and requirements — from individual artists to large post studios, broadcasters and small/medium enterprise installations. It will be available in the third quarter, with pricing to be announced.

Meanwhile, FXhome has teamed up with Filmstro, which offers a royalty-free music library, to provide HitFilm users with access to the entire Filmstro music library for 12 months. With Filmstro available directly from the FXhome store, HitFilm users can use Filmstro soundtracks on unlimited projects and get access to weekly new music updates.

Offering more than just a royalty-free music library, Filmstro has developed a user interface that gives artists flexibility and control over selected music tracks for use in their HitFilm projects. HitFilm users can control the momentum, depth and power of any Filmstro track, using sliders to perfectly match any sequence in a HitFilm project. Users can also craft soundtracks to perfectly fit their images using a keyframe graph editor within Filmstro. Moving the sliders automatically creates keyframes for each element, and these can be edited at any point.

Filmstro offers over 60 albums' worth of music with weekly music releases. All tracks are searchable using keywords, film and video genre, musical style, instrumental palette or mood. All Filmstro music is licensed for usage worldwide and in perpetuity. The Filmstro dynamic royalty-free music library is available now on the FXhome Store for $249.


Fox Sports promotes US women’s World Cup team with VFX-heavy spots

Santa Monica creative studio Jamm worked with Wieden+Kennedy New York on the Fox Sports campaign “All Eyes on US.” Directed by Joseph Kahn out of Supply & Demand, the four spots celebrate the US Women’s soccer team as it gears up for the 2019 FIFA Women’s World Cup in June.

The newest 60-second spot, All Eyes on US, features tens of thousands of screaming fans thanks to Jamm's CG crowd work. On set, Jamm brainstormed with Kahn on how to achieve the immersive effect he was looking for. Much of the on-the-ground footage was shot using wide-angle lenses, which posed a unique set of challenges by revealing the entire environment as well as the close-up action. With pacing, Jamm achieved the sense of the game occurring in realtime, as the tempo of the camera keeps in step with the team moving the ball downfield.

The 30-second spot Goliath features the Jamm team's first CG crowd shot, filling the soccer stadium with a roaring crowd. In Goliath, the entire US women's soccer team runs toward the camera in slow motion. The shot was captured locked off but digitally manipulated via a 3D camera to create a dolly zoom effect replicating real-life parallax; the altered perspective conveys the unsettling feeling of being an opponent as the team literally runs straight into the camera.
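
For the curious, the dolly zoom rests on a simple pinhole-camera relationship: a subject's size in frame is proportional to focal length divided by camera-to-subject distance, so locking the framing while the (virtual) camera moves means scaling focal length with distance. Below is a minimal Python sketch of that math; the function and numbers are illustrative assumptions, not Jamm's actual setup.

```python
def dolly_zoom_focal_length(f0_mm: float, d0_m: float, d_m: float) -> float:
    """Focal length that keeps a subject the same size in frame when the
    camera moves from distance d0_m to d_m (pinhole model: size ~ f / d)."""
    return f0_mm * (d_m / d0_m)

# Example: a subject framed with a 35mm lens at 4m stays the same size
# at 8m with a ~70mm lens, while the background parallax shifts unnervingly.
print(dolly_zoom_focal_length(35.0, 4.0, 8.0))  # 70.0
```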

On set, Jamm got an initial Lidar scan of the stadium as a base. From there, they used that scan, along with reference photos taken on set, to build a CG stadium that included accurate seating. They also extended the stadium where there were gaps to make it a full 360-degree stadium. The stadium seating tools tied in with Jamm's in-house crowd system (based on Side Effects Houdini), allowing them to easily direct the performance of the crowd in every shot.

The Warrior focuses on Megan Rapinoe standing on the field in the rain, with a roaring crowd behind her. Whereas CG crowd simulation is typically captured with fast-moving cameras, the stadium crowd remains locked in the background of this sequence. Jamm implemented motion work and elements like confetti to make the large group of characters appear lively without detracting from Rapinoe in the foreground. Because the live-action scenes were shot in the rain, Jamm used water graphing to seamlessly blend the real-world footage and the CG crowd work.

The Finisher centers on Alex Morgan, who earned the nickname because "she's the last thing they'll see before it's too late." The team ran down the field in slow motion while the cameraman, rigged with a Steadicam, sprinted backwards through the goal. The footage was then sped up by 600%, providing a realtime quality, as Morgan kicks a perfect strike into the back of the net.

Jamm used Autodesk Flame to composite the crowds and the CG ball, using camera projections to rebuild and clean up certain parts of the environment, refine the skies and add in stadium branding. They also used Foundry Nuke and Houdini for 3D.

The edit was via FinalCut and editor Spencer Campbell. The color grade was by Technicolor’s Tom Poole.


NAB 2019: postPerspective Impact Award winners

postPerspective has announced the winners of our Impact Awards from NAB 2019. Seeking to recognize debut products with real-world applications, the postPerspective Impact Awards are voted on by an anonymous judging body made up of respected industry artists and pros (to whom we are very grateful). It’s working pros who are going to be using these new tools — so we let them make the call.

It was fun watching the user ballots come in and discovering which products most impressed our panel of post and production pros. There are no entrance fees for our awards. All that is needed is the ability to impress our voters with products that have the potential to make their workdays easier and their turnarounds faster.

We are grateful for our panel of judges, which grew even larger this year. NAB is exhausting for all, so their willingness to share their product picks and takeaways from the show isn’t taken for granted. These men and women truly care about our industry and sharing information that helps their fellow pros succeed.

To be successful, you can’t operate in a vacuum. We have found that companies who listen to their users, and make changes/additions accordingly, are the ones who get the respect and business of working pros. They aren’t providing tools they think are needed; they are actively asking for feedback. So, congratulations to our winners and keep listening to what your users are telling you — good or bad — because it makes a difference.

The Impact Award winners from NAB 2019 are:

• Adobe for Creative Cloud and After Effects
• Arraiy for DeepTrack with The Future Group’s Pixotope
• ARRI for the Alexa Mini LF
• Avid for Media Composer
• Blackmagic Design for DaVinci Resolve 16
• Frame.io
• HP for the Z6/Z8 workstations
• OpenDrives for Apex, Summit, Ridgeview and Atlas

(All winning products reflect the latest version of the product, as shown at NAB.)

Our judges also provided quotes on specific projects and trends that they expect will have an impact on their workflows.

Said one, “I was struck by the predicted impact of 5G. Verizon is planning to have 5G in 30 cities by end of year. The improved performance could reach 20x speeds. This will enable more leverage using cloud technology.

“Also, AI/ML is said to be the single most transformative technology in our lifetime. Impact will be felt across the board, from personal assistants, medical technology, eliminating repetitive tasks, etc. We already employ AI technology in our post production workflow, which has saved tens of thousands of dollars in the last six months alone.”

Another echoed those thoughts on AI and the cloud as well: “AI is growing up faster than anyone can reasonably productize. It will likely be able to do more than first thought. Post in the cloud may actually start to take hold this year.”

We hope that postPerspective’s Impact Awards give those who weren’t at the show, or who were unable to see it all, a starting point for their research into new gear that might be right for their workflows. Another way to catch up? Watch our extensive video coverage of NAB.


Autodesk’s Flame 2020 features machine learning tools

Autodesk's new Flame 2020 offers a machine-learning-powered feature set with a host of new capabilities for Flame artists working in VFX, color grading, look development or finishing. The latest update will be showcased at the upcoming NAB Show.

Advancements in computer vision, photogrammetry and machine learning have made it possible to extract motion vectors, Z depth and 3D normals based on software analysis of digital stills or image sequences. The Flame 2020 release adds built-in machine learning analysis algorithms to isolate and modify common objects in moving footage, dramatically accelerating VFX and compositing workflows.
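
To make that concrete, here is a minimal numpy sketch of the kind of depth-keyed grade such a Z-depth map enables, applying an adjustment only within a chosen distance band (as the Z-Depth Map Generator item below describes). It assumes a depth map normalized to [0, 1] and is purely illustrative; it is not Flame's implementation or API.

```python
import numpy as np

def depth_keyed_gain(rgb: np.ndarray, depth: np.ndarray,
                     near: float, far: float, gain: float) -> np.ndarray:
    """Apply a gain only to pixels whose depth falls between near and far.

    rgb:   float image, shape (H, W, 3), values in [0, 1]
    depth: normalized Z-depth map, shape (H, W); 0 = camera, 1 = far plane
    """
    mask = ((depth >= near) & (depth <= far)).astype(rgb.dtype)
    return np.clip(rgb * (1.0 + (gain - 1.0) * mask[..., None]), 0.0, 1.0)

# Example: brighten only the mid-ground (depth 0.3-0.6) of a synthetic frame.
frame = np.random.rand(270, 480, 3).astype(np.float32)
zmap = np.random.rand(270, 480).astype(np.float32)
graded = depth_keyed_gain(frame, zmap, near=0.3, far=0.6, gain=1.4)
```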

New creative tools include:
· Z-Depth Map Generator — Enables Z-depth map extraction using machine learning analysis for live-action scene depth reclamation. This allows artists doing color grading or look development to quickly analyze a shot and apply effects accurately based on distance from camera.
· Human Face Normal Map Generator — Since all human faces have common recognizable features (relative distance between eyes, nose, location of mouth), machine learning algorithms can be trained to find these patterns. This tool can be used to simplify accurate color adjustment, relighting and digital cosmetic/beauty retouching.
· Refraction — With this feature, a 3D object can now refract, distorting background objects based on its surface material characteristics. To achieve convincing transparency through glass, ice, windshields and more, the index of refraction can be set to an accurate approximation of real-world material light refraction (see the sketch after this list).
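
As optics background for that last item: refraction follows Snell's law, n1·sin(θ1) = n2·sin(θ2), and the real-world indices the feature approximates are well established (air ~1.0, water ~1.33, ice ~1.31, window glass ~1.5). A small illustrative Python sketch of the calculation, not Flame code:

```python
import math

# Approximate real-world indices of refraction.
AIR, ICE, WATER, GLASS = 1.0, 1.31, 1.33, 1.5

def refraction_angle_deg(incidence_deg: float, n1: float, n2: float):
    """Angle of the refracted ray per Snell's law: n1*sin(t1) = n2*sin(t2).
    Returns None on total internal reflection (only possible when n1 > n2)."""
    sin_t2 = (n1 / n2) * math.sin(math.radians(incidence_deg))
    if abs(sin_t2) > 1.0:
        return None
    return math.degrees(math.asin(sin_t2))

# A ray hitting a windshield at 45 degrees bends to about 28 degrees.
print(refraction_angle_deg(45.0, AIR, GLASS))  # ~28.1
```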

Productivity updates include:
· Automatic Background Reactor — Triggered immediately after a shot is modified, this mode sends jobs to process. Accelerated, automated background rendering allows Flame artists to keep projects moving, using GPU and system capacity to the fullest. This feature is available on Linux only and can function on a single GPU.
· Simpler UX in Core Areas — A new expanded full-width UX layout for MasterGrade, Image surface and several Map user interfaces is now available, allowing for easier discoverability of and access to key tools.
· Manager for Action, Image, Gmask — A simplified list schematic view, Manager makes it easier to add, organize and adjust video layers and objects in the 3D environment.
· Open FX Support — Flame, Flare and Flame Assist version 2020 now include comprehensive support for industry-standard Open FX creative plugins, whether in Batch/BFX nodes or on the Flame timeline.
· Cryptomatte Support — Available in Flame and Flare, support for the Cryptomatte open source advanced rendering technique offers a new way to pack alpha channels for every object in a 3D rendered scene.

Linux customers can now opt for monthly, yearly and three-year single-user licensing options. Customers with an existing Mac-only single-user license can transfer their license to run Flame on Linux.

Flame, Flare, Flame Assist and Lustre 2020 will be available on April 16, 2019, at no additional cost to customers with a current Flame Family 2019 subscription. Pricing details can be found at the Autodesk website.


Adobe’s new Content-Aware fill in AE is magic, plus other CC updates

By Brady Betzel

NAB is just under a week away, and we are here to share some of Adobe's latest Creative Cloud offerings. There are a few updates worth mentioning, such as a freeform project panel in Premiere Pro, AI-driven Auto Ducking for ambience in Audition and the addition of a Twitch extension for Character Animator. But, in my opinion, the Adobe After Effects updates are what this year's release will be remembered for.


Content Aware: Here is the before and after. Our main image is the mask.

There is a new expression editor in After Effects, so us old pseudo-website designers can now feel at home with highlighting, line numbers and more. There are also performance improvements, such as faster project loading times and new deBayering support for Metal on macOS. But the first-prize ribbon goes to Content-Aware Fill for video, powered by Adobe Sensei, the company's AI technology. It's one of those voodoo features that will blow you away when you use it. If you have ever used Mocha Pro by BorisFX, then you have seen a similar tool, known as Object Removal. Essentially, you draw around the object you want to remove, such as a camera shadow or boom mic, hit the magic button, and your object is removed with a new background in its place. This will save users hours of manual work.

Freeform Project panel in Premiere.

Here are some details on other new features:

● Freeform Project panel in Premiere Pro — Arrange assets visually and save layouts for shot selects, production tasks, brainstorming story ideas and assembly edits.
● Rulers and Guides — Work with familiar Adobe design tools inside Premiere Pro, making it easier to align titling, animate effects and ensure consistency across deliverables.
● Punch and Roll in Audition — The new feature provides efficient production workflows in both Waveform and Multitrack for longform recording, including voiceover and audiobook creators.
● Surprise viewers with Twitch live-streaming triggers via the Character Animator extension — Livestream performances are enhanced as audiences engage with characters in realtime through on-the-fly costume changes, impromptu dance moves, and signature gestures and poses — a new way to interact and even monetize, using Bits to trigger actions.
● Auto Ducking for ambient sound in Audition and Premiere Pro — Also powered by Adobe Sensei, Auto Ducking now allows for dynamic adjustments to ambient sounds against spoken dialog. Keyframed adjustments can be manually fine-tuned to retain creative control over a mix.
● Adobe Stock — Now offers 10 million professional-quality, curated, royalty-free HD and 4K video clips and Motion Graphics templates from leading agencies and independent editors to use for editorial content, establishing shots or filling gaps in a project.
● Premiere Rush — Introduced late last year, Rush offers a mobile-to-desktop workflow integrated with Premiere Pro for on-the-go editing and video assembly. Built-in camera functionality in Premiere Rush helps you take pro-quality video on your mobile devices.

The new features for Adobe Creative Cloud are now available with the latest version of Creative Cloud.


Timber finishes Chipotle ‘Fresh Food’ campaign

In Chipotle’s new Fresh Food campaign, directed by Errol Morris for Moxie Pictures out of agency Venables Bell & Partners, real-life employees of the food chain talk about the pride they take in their work while smashing guacamole and cutting peppers, cilantro and other fresh ingredients.

The food shots are designed to get all five of your senses moving, grabbing the audience with the visually appealing fresh food served and leading them to taste, smell and hear the authentic ingredients.

The four spots — Bre – Just Bragging, Carson – Good Food Good Person, Krista – Fresh Everyday and Robbie – Microwaves Not Welcome — are for broadcast and the web.

For Chipotle, Santa Monica’s Timber handled online, finishing and just a splash of cleanup. They used Flame on the project. According to Timber head of production Melody Alexander, “The Chipotle project was based on showcasing the realness of the products the restaurants use in their food. Minimal clean-up was required as the client was keen to keep the naturalness of the footage. We, at Timber, use a combination of finishing tools when working on online projects. The Chipotle project was completely done in Flame.”

Roper Technologies set to acquire Foundry

Roper Technologies, a technology company and a constituent of the S&P 500, Fortune 1000 and Russell 1000 indices, is set to purchase Foundry. The deal is expected to close in April 2019, subject to regulatory approval and customary closing conditions. Foundry makes software tools used to create visual effects and 3D imagery for the media and entertainment world, including Nuke, Modo, Mari and Katana.

Craig Rodgerson

It’s a substantial move that enables Foundry to remain an independent company, with Roper assuming ownership from Hg. Roper has a successful history of acquiring well-run technology companies in niche markets that have strong, sustainable growth potential.

“We’re excited about the opportunities this partnership brings. Roper understands our strategy and chose to invest in us to help us realize our ambitious growth plans,” says Foundry CEO Craig Rodgerson. “This move will enable us to continue investing in what really matters to our customers: continued product improvement, R&D and technology innovation and partnerships with global leaders in the industry.”

Autodesk cloud-enabled tools now work with BeBop post platform

Autodesk has enabled use of its software in the cloud — including 3ds Max, Arnold, Flame and Maya — and BeBop Technology will deploy the tools on its cloud-based post platform. The BeBop platform enables processing-heavy post projects, such as visual effects and editing, to run in the cloud on powerful and highly secure virtualized desktops. Creatives can process, render, manage and deliver media files from anywhere on BeBop, using any computer and an Internet connection as modest as 20Mbps.

The ongoing deployment of Autodesk software on the BeBop platform mirrors the ways BeBop and Adobe work closely together to optimize the experience of Adobe Creative Cloud subscribers. Adobe applications have been available natively on BeBop since April 2018.

Autodesk software users will now also gain access to BeBop Rocket Uploader, which enables ingestion of large media files at incredibly high speeds for a predictable monthly fee with no volume limits. Additionally, BeBop Over the Shoulder (OTS) enables secure and affordable remote collaboration, review and approval sessions in real-time. BeBop runs on all of the major public clouds, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.

Cinesite recreates Nottingham for Lionsgate’s Robin Hood

The city of Nottingham perpetually exists in two states: the metropolitan center that it is today, and the fictional home of one of the world’s most famous outlaws. So when the filmmakers behind Robin Hood, which is now streaming and on DVD, looked to recreate the fictional Nottingham, they needed to build it from scratch with help from London’s Cinesite Studio. The film stars Taron Egerton, Jamie Foxx, Ben Mendelsohn, Eve Hewson, and Jamie Dornan.

Working closely with Robin Hood's VFX supervisor Simon Stanley-Clamp and director Otto Bathurst, Cinesite created a handful of settings and backgrounds for the film, starting with a digital model of Nottingham built to scale. Given its modern look and feel, the Nottingham of today wouldn't do, so the team used Dubrovnik, Croatia, as its template. The Croatian city — best known to TV fans around the world as the model for Game of Thrones' King's Landing — has become a popular spot for filming historical fiction, thanks to its famed stone walls and medieval structures. That made it an ideal starting point for a film set around the time of the Crusades.

“Robin’s Nottingham is a teeming industrial city dominated by global influences, politics and religion. It’s also full of posh grandeur but populated by soot-choked mines and sprawling slums reflecting the gap between haves and have-nots, and we needed to establish that at a glance for audiences,” says Cinesite’s head of assets, Tim Potter. “With so many buildings making up the city, the Substance Suite allowed us to achieve the many variations and looks that were required for the large city of Nottingham in a very quick and easy manner.”

Using Autodesk Maya for the builds and Pixologic ZBrush for sculpting and displacement, the VFX team then relied on Allegorithmic Substance Designer (recently acquired by Adobe) to customize the city, creating detailed materials that would give life and personality to the stone and wood structures. From the slums inspired by Brazilian favelas to the grandiose environments of the gentry and nobility, the texturing and materials helped provide audiences with unspoken clues about the outlaw archer's world.

Creating these swings from the oppressors to the oppressed was often a matter of dirt, dust and grime, which were added to the RGB channels over the textures to add wear and tear to the city. Once the models and layouts were finalized, Cinesite then added even more intricate details using Substance Painter, giving an already realistic recreation additional touches to reflect the sometimes messy lives of the people that would inhabit a city like Nottingham.

At its peak, Cinesite had around 145 artists working on the project, including around 10 artists focusing on texturing and look development. The team spent six months alone creating the reimagined Nottingham, with another three months spent on additional scenes. Although the city of Dubrovnik informed many of the design choices, one of the pieces that had to be created from scratch was a massive cathedral, a focal point of the story. To fit with the film’s themes, Cinesite took inspiration from several real churches around the world to create something original, with a brutalist feel.

Using models and digital texturing, the team also created Robin’s childhood home of Loxley Manor, which was loosely based on a real structure in Završje, Croatia. There were two versions of the manor: one meant to convey the Loxley family in better times, and another seen after years of neglect and damage. Cinesite also helped to create one of the film’s most integral and complex moments, which saw Robin engage in a wagon chase through Nottingham. The scene was far too dangerous to use real animals in most shots, requiring Cinesite to dip back into its toolbox to create the texturing and look of the horse and its groom, along with the rigging and CFX.

“To create the world that the filmmakers wanted, we started by going through the process of understanding the story. From there we saw what the production had filmed and where the action needed to take place within the city, then we went about creating something unique,” Potter says. “The scale was massive, but the end result is a realistic world that will feel somewhat familiar, and yet still offer plenty of surprises.”

Robin Hood was released on home media on February 19.

Quick Chat: Crew Cuts’ Nancy Jacobsen and Stephanie Norris

By Randi Altman

Crew Cuts, a full-service production and post house, has been a New York fixture since 1986. Originally established as an editorial house, the company has added services over the years, as the industry evolved, to target all aspects of the workflow.

This independently owned facility is run by executive producer/partner Nancy Jacobsen, senior editor/partner Sherri Margulies Keenan and senior editor/partner Jake Jacobsen. While commercial spots might be in their wheelhouse, their projects vary and include social media, music videos and indie films.

We decided to reach out to Nancy Jacobsen, as well as EP of finishing Stephanie Norris, to find out about trends, recent work and succeeding in an industry and city that isn’t always so welcoming.

Can you talk about what Crew Cuts provides and how you guys have evolved over the years?
Jacobsen: We pretty much do it all. We have 10 offline editors as well as artists working in VFX, 2D/3D animation, motion graphics/design, audio mix and sound design, VO record, color grading, title treatment, advanced compositing and conform. Two of our editors double as directors.

In the beginning, Crew Cuts primarily offered only editorial. As the years went by and the industry climate changed, we began to cater to the needs of clients and slowly built out our entire finishing department. We started with some minimal graphics work and one staff artist in 2008.

In 2009, we expanded the team to include graphics, conform and audio mix. From there we just continued to grow and expand our department to the full finishing team we have today.

As a woman owner of a post house, what challenges have you had to overcome?
Jacobsen: When I started in this business, the industry was very different. I made less money than my male counterparts, and it took me twice as long to be promoted because I am a woman. I have since seen great change, where women are leading post houses and production houses and are finally getting the recognition they deserve for their hard work. Unfortunately, I had to "wait it out" and silently work harder than the men around me. This has paid off for me, and now I can help women get the credit they rightly deserve.

Do you see the industry changing and becoming less male-dominated?
Jacobsen: Yes, the industry is definitely becoming less male-dominated. In the current climate, with the birth of the #metoo movement and specifically in our industry with the birth of Diet Madison Avenue (@dietmadisonave), we are seeing a lot more women step up and take on leading roles.

Are you mostly a commercial house? What other segments of the industry do you work in?
Jacobsen: We are primarily a commercial house. However, we are not limited to just broadcast and digital commercial advertising. We have delivered specs for everything from the Godzilla screen in Times Square to :06 spots on Instagram. We have done a handful of music videos and also handle a ton of B2B videos for in-house client meetings, etc., as well as banner ads for conferences and trade shows. We’ve even worked on display ads for airports. Most recently, one of our editors finished a feature film called Public Figure that is being submitted around the film festival circuit.

What types of projects are you working on most often these days?
Jacobsen: The industry is all over the place. The current climate is very messy right now. Our projects are extremely varied. It’s hard to say what we work on most because it seems like there is no more norm. We are working on everything from sizzle pitch videos to spots for the Super Bowl.

What trends have you seen over the last year, and where do you expect to be in a year?
Jacobsen: Over the last year, we have noticed that the work comes from every angle. Our typical client is no longer just the marketing agency. It is also the production company, network, brand, etc. In a year we expect to be doing more production work. Seeing as how budgets are much smaller than they used to be and everyone wants a one-stop shop, we are hoping to stick with our gut and continue expanding our production arm.

Crew Cuts has beefed up its finishing services. Can you talk about that?
Stephanie Norris: We offer a variety of finishing services — from sound design to VO record and mix, compositing to VFX, 2D and 3D motion graphics and color grading. Our fully staffed in-house team loves the visual effects puzzle and enjoys working with clients to help interpret their vision.

Can you name some recent projects and the services you provided?
Norris: We just worked on a new campaign for New Jersey Lottery in collaboration with Yonder Content and PureRed. Brian Neaman directed and edited the spots. In addition to editorial, Crew Cuts also handled all of the finishing, including color, conform, visual effects, graphics, sound design and mix. This was one of those all-hands-on-deck projects. Keeping everything under one roof really helped us to streamline the process.

New Jersey Lottery

Working with Brian to carefully plan the shooting strategy, we filmed a series of plate shots as elements that could later be combined in post to build each scene. We added falling stacks of cash to the reindeer as he walks through the loading dock and incorporated CG inflatable decorations into a warehouse holiday lawn scene. We also dramatically altered the opening and closing exterior warehouse scenes, allowing one shot to work for multiple seasons. Keeping lighting and camera positions consistent was mission-critical, and having our VFX supervisor, Dulany Foster, on set saved us hours of work down the line.

For the New Jersey Lottery holiday spots, the Crew Cuts CG team, led by our creative director Ben McNamara, created a 3D inflatable display of lottery tickets, something that proved too costly and time-consuming to manufacture and shoot practically. After the initial R&D, our team created a few different CG inflatable simulations prior to the shoot, and Dulany was able to mock them up live while on set. Creating the simulations was crucial for giving the art department reference while building the set, and it also helped when shooting the plates needed to composite the scene together.

Ben and his team focused on the physics of the inflation, while also making sure the fabric simulations, textures and lighting blended seamlessly into the scene — it was important that everything felt realistic. In addition to the inflatables, our VFX team turned the opening and closing sunny, summer shots of the warehouse into a December winter wonderland thanks to heavy compositing, 3D set extension and snow simulations.

New Jersey Lottery

Any other projects you’d like to talk about?
Jacobsen: We are currently working on a project here that we are handling soup to nuts from production through finishing. It was a fun challenge to take on. The spot contains a hand model on a greenscreen showing the audience how to use a new product. The shoot itself took place here at Crew Cuts. We turned our common area into a stage for the day and were able to do so without interrupting any of the other employees and projects going on.

We are now working on editorial and finishing. The edit is coming along nicely. What really drives the piece here is the graphic icons. Our team is having a lot of fun designing these elements and implementing them into the spot. We are so proud because we budgeted wisely to make sure to accommodate all of the needs of the project so that we could handle everything and still turn a profit. It was so much fun to work in a different setting for the day and has been a very successful project so far. Clients are happy and so are we.

Main Image: (L-R) Stephanie Norris and Nancy Jacobsen

Avengers: Infinity War leads VES Awards with six noms

The Visual Effects Society (VES) has announced the nominees for the 17th Annual VES Awards, which recognize outstanding visual effects artistry and innovation in film, animation, television, commercials and video games as well as the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Avengers: Infinity War garners the most feature film nominations with six. Incredibles 2 is the top animated film contender with five nominations, and Lost in Space leads the broadcast field with six nominations.

Nominees in 24 categories were selected by VES members via events hosted by 11 of the organization's Sections, including Australia, the Bay Area, Germany, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington.

The VES Awards will be held on February 5th at the Beverly Hilton Hotel. As previously announced, the VES Visionary Award will be presented to writer/director/producer and co-creator of Westworld Jonathan Nolan. The VES Award for Creative Excellence will be given to award-winning creators/executive producers/writers/directors David Benioff and D.B. Weiss of Game of Thrones fame. Actor-comedian-author Patton Oswalt will once again host the VES Awards.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

Avengers: Infinity War

Daniel DeLeeuw

Jen Underdahl

Kelly Port

Matt Aitken

Daniel Sudick

 

Christopher Robin

Chris Lawrence

Steve Gaub

Michael Eames

Glenn Melenhorst

Chris Corbould

 

Ready Player One

Roger Guyett

Jennifer Meislohn

David Shirk

Matthew Butler

Neil Corbould

 

Solo: A Star Wars Story

Rob Bredow

Erin Dusseault

Matt Shumway

Patrick Tubach

Dominic Tuohy

 

Welcome to Marwen

Kevin Baillie

Sandra Scott

Seth Hill

Marc Chu

James Paradis

 

Outstanding Supporting Visual Effects in a Photoreal Feature 

12 Strong

Roger Nall

Robert Weaver

Mike Meinardus

 

Bird Box

Marcus Taormina

David Robinson

Mark Bakowski

Sophie Dawes

Mike Meinardus

 

Bohemian Rhapsody

Paul Norris

Tim Field

May Leung

Andrew Simmonds

 

First Man

Paul Lambert

Kevin Elam

Tristan Myles

Ian Hunter

JD Schwalm

 

Outlaw King

Alex Bicknell

Dan Bethell

Greg O’Connor

Stefano Pepin

 

Outstanding Visual Effects in an Animated Feature

Dr. Seuss’ The Grinch

Pierre Leduc

Janet Healy

Bruno Chauffard

Milo Riccarand

 

Incredibles 2

Brad Bird

John Walker

Rick Sayre

Bill Watral

 

Isle of Dogs

Mark Waring

Jeremy Dawson

Tim Ledbury

Lev Kolobov

 

Ralph Breaks the Internet

Scott Kersavage

Bradford Simonsen

Ernest J. Petti

Cory Loftis

 

Spider-Man: Into the Spider-Verse

Joshua Beveridge

Christian Hejnal

Danny Dimian

Bret St. Clair

 

Outstanding Visual Effects in a Photoreal Episode

Altered Carbon; Out of the Past

Everett Burrell

Tony Meagher

Steve Moncur

Christine Lemon

Joel Whist

 

Krypton; The Phantom Zone

Ian Markiewicz

Jennifer Wessner

Niklas Jacobson

Martin Pelletier

 

Lost in Space; Danger, Will Robinson

Jabbar Raisani

Terron Pratt

Niklas Jacobson

Joao Sita

 

The Terror; Go For Broke

Frank Petzold

Lenka Líkařová

Viktor Muller

Pedro Sabrosa

 

Westworld; The Passenger

Jay Worth

Elizabeth Castro

Bruce Branit

Joe Wehmeyer

Michael Lantieri

 

Outstanding Supporting Visual Effects in a Photoreal Episode

Tom Clancy’s Jack Ryan; Pilot

Erik Henry

Matt Robken

Bobo Skipper

Deak Ferrand

Pau Costa

 

The Alienist; The Boy on the Bridge

Kent Houston

Wendy Garfinkle

Steve Murgatroyd

Drew Jones

Paul Stephenson

 

The Deuce; We’re All Beasts

Jim Rider

Steven Weigle

John Bair

Aaron Raff

 

The First; Near and Far

Karen Goulekas

Eddie Bonin

Roland Langschwert

Bryan Godwin

Matthew James Kutcher

 

The Handmaid’s Tale; June

Brendan Taylor

Stephen Lebed

Winston Lee

Leo Bovell

 

Outstanding Visual Effects in a Realtime Project

Age of Sail

John Kahrs

Kevin Dart

Cassidy Curtis

Theresa Latzko

 

Cycles

Jeff Gipson

Nicholas Russell

Lauren Nicole Brown

Jorge E. Ruiz Cano

 

Dr Grordbort’s Invaders

Greg Broadmore

Mhairead Connor

Steve Lambert

Simon Baker

 

God of War

Maximilian Vaughn Ancar

Corey Teblum

Kevin Huynh

Paolo Surricchio

 

Marvel’s Spider-Man

Grant Hollis

Daniel Wang

Seth Faske

Abdul Bezrati

 

Outstanding Visual Effects in a Commercial 

Beyond Good & Evil 2

Maxime Luere

Leon Berelle

Remi Kozyra

Dominique Boidin

 

John Lewis; The Boy and the Piano

Kamen Markov

Philip Whalley

Anthony Bloor

Andy Steele

 

McDonald’s; #ReindeerReady

Ben Cronin

Josh King

Gez Wright

Suzanne Jandu

 

U.S. Marine Corps; A Nation’s Call

Steve Drew

Nick Fraser

Murray Butler

Greg White

Dave Peterson

 

Volkswagen; Born Confident

Carsten Keller

Anandi Peiris

Dan Sanders

Fabian Frank

 

Outstanding Visual Effects in a Special Venue Project

Beautiful Hunan; Flight of the Phoenix

R. Rajeev

Suhit Saha

Arish Fyzee

Unmesh Nimbalkar

 

Childish Gambino’s Pharos

Keith Miller

Alejandro Crawford

Thelvin Cabezas

Jeremy Thompson

 

DreamWorks Theatre Presents Kung Fu Panda

Marc Scott

Doug Cooper

Michael Losure

Alex Timchenko

 

Osheaga Music and Arts Festival

Andre Montambeault

Marie-Josee Paradis

Alyson Lamontagne

David Bishop Noriega

 

Pearl Quest

Eugénie von Tunzelmann

Liz Oliver

Ian Spendloff

Ross Burgess

 

Outstanding Animated Character in a Photoreal Feature

Avengers: Infinity War; Thanos

Jan Philip Cramer

Darren Hendler

Paul Story

Sidney Kombo-Kintombo

 

Christopher Robin; Tigger

Arslan Elver

Kayn Garcia

Laurent Laban

Mariano Mendiburu

 

Jurassic World: Fallen Kingdom; Indoraptor

Jance Rubinchik

Ted Lister

Yannick Gillain

Keith Ribbons

 

Ready Player One; Art3mis

David Shirk

Brian Cantwell

Jung-Seung Hong

Kim Ooi

 

Outstanding Animated Character in an Animated Feature

Dr. Seuss’ The Grinch; The Grinch

David Galante

Francois Boudaille

Olivier Luffin

Yarrow Cheney

 

Incredibles 2; Helen Parr

Michal Makarewicz

Ben Porter

Edgar Rodriguez

Kevin Singleton

 

Ralph Breaks the Internet; Ralphzilla

Dong Joo Byun

Dave K. Komorowski

Justin Sklar

Le Joyce Tong

 

Spider-Man: Into the Spider-Verse; Miles Morales

Marcos Kang

Chad Belteau

Humberto Rosa

Julie Bernier Gosselin

 

Outstanding Animated Character in an Episode or Realtime Project

Cycles; Rae

Jose Luis Gomez Diaz

Edward Everett Robbins III

Jorge E. Ruiz Cano

Jose Luis "Weecho" Velasquez

 

Lost in Space; Humanoid

Chad Shattuck

Paul Zeke

Julia Flanagan

Andrew McCartney

 

Nightflyers; All That We Have Found; Eris

Peter Giliberti

James Chretien

Ryan Cromie

Cesar Dacol Jr.

 

Marvel's Spider-Man; Doc Ock

Brian Wyser

Henrique Naspolini

Sophie Brennan

William Salyers

 

Outstanding Animated Character in a Commercial

McDonald’s; Bobbi the Reindeer

Gabriela Ruch Salmeron

Joe Henson

Andrew Butler

Joel Best

 

Overkill’s The Walking Dead; Maya

Jonas Ekman

Goran Milic

Jonas Skoog

Henrik Eklundh

 

Peta; Best Friend; Lucky

Bernd Nalbach

Emanuel Fuchs

Sebastian Plank

Christian Leitner

 

Volkswagen; Born Confident; Bam

David Bryan

Chris Welsby

Fabian Frank

Chloe Dawe

 

Outstanding Created Environment in a Photoreal Feature

Ant-Man and the Wasp; Journey to the Quantum Realm

Florian Witzel

Harsh Mistri

Yuri Serizawa

Can Yuksel

 

Aquaman; Atlantis

Quentin Marmier

Aaron Barr

Jeffrey De Guzman

Ziad Shureih

 

Ready Player One; The Shining, Overlook Hotel

Mert Yamak

Stanley Wong

Joana Garrido

Daniel Gagiu

 

Solo: A Star Wars Story; Vandor Planet

Julian Foddy

Christoph Ammann

Clement Gerard

Pontus Albrecht

 

Outstanding Created Environment in an Animated Feature

Dr. Seuss’ The Grinch; Whoville

Loic Rastout

Ludovic Ramiere

Henri Deruer

Nicolas Brack

 

Incredibles 2; Parr House

Christopher M. Burrows

Philip Metschan

Michael Rutter

Joshua West

 

Ralph Breaks the Internet; Social Media District

Benjamin Min Huang

Jon Kim Krummel II

Gina Warr Lawes

Matthias Lechner

 

Spider-Man: Into the Spider-Verse; Graphic New York City

Terry Park

Bret St. Clair

Kimberly Liptrap

Dave Morehead

 

Outstanding Created Environment in an Episode, Commercial, or Realtime Project

Cycles; The House

Michael R.W. Anderson

Jeff Gipson

Jose Luis Gomez Diaz

Edward Everett Robbins III

 

Lost in Space; Pilot; Impact Area

Philip Engström

Kenny Vähäkari

Jason Martin

Martin Bergquist

 

The Deuce; 42nd St

John Bair

Vance Miller

Jose Marin

Steve Sullivan

 

The Handmaid’s Tale; June; Fenway Park

Patrick Zentis

Kevin McGeagh

Leo Bovell

Zachary Dembinski

 

The Man in the High Castle; Reichsmarschall Ceremony

Casi Blume

Michael Eng

Ben McDougal

Sean Myers

 

Outstanding Virtual Cinematography in a Photoreal Project

Aquaman; Third Act Battle

Claus Pedersen

Mohammad Rastkar

Cedric Lo

Ryan McCoy

 

Echo; Time Displacement

Victor Perez

Tomas Tjernberg

Tomas Wall

Marcus Dineen

 

Jurassic World: Fallen Kingdom; Gyrosphere Escape

Pawl Fulker

Matt Perrin

Oscar Faura

David Vickery

 

Ready Player One; New York Race

Daniele Bigi

Edmund Kolloen

Mathieu Vig

Jean-Baptiste Noyau

 

Welcome to Marwen; Town of Marwen

Kim Miles

Matthew Ward

Ryan Beagan

Marc Chu

 

Outstanding Model in a Photoreal or Animated Project 

Avengers: Infinity War; Nidavellir Forge Megastructure

Chad Roen

Ryan Rogers

Jeff Tetzlaff

Ming Pan

 

Incredibles 2; Underminer Vehicle

Neil Blevins

Philip Metschan

Kevin Singleton

 

Mortal Engines; London

Matthew Sandoval

James Ogle

Nick Keller

Sam Tack

 

Ready Player One; DeLorean DMC-12

Giuseppe Laterza

Kim Lindqvist

Mauro Giacomazzo

William Gallyot

 

Solo: A Star Wars Story; Millennium Falcon

Masa Narita

Steve Walton

David Meny

James Clyne

 

Outstanding Effects Simulations in a Photoreal Feature

Avengers: Infinity War; Titan

Gerardo Aguilera

Ashraf Ghoniem

Vasilis Pazionis

Hartwell Durfor

 

Avengers: Infinity War; Wakanda

Florian Witzel

Adam Lee

Miguel Perez Senent

Francisco Rodriguez

 

Fantastic Beasts: The Crimes of Grindelwald

Dominik Kirouac

Chloe Ostiguy

Christian Gaumond

 

Venom

Aharon Bourland

Jordan Walsh

Aleksandar Chalyovski

Federico Frassinelli

 

Outstanding Effects Simulations in an Animated Feature

Dr. Seuss’ The Grinch; Snow, Clouds and Smoke

Eric Carme

Nicolas Brice

Milo Riccarand

 

Incredibles 2

Paul Kanyuk

Tiffany Erickson Klohn

Vincent Serritella

Matthew Kiyoshi Wong

 

Ralph Breaks the Internet; Virus Infection & Destruction

Paul Carman

Henrik Fält

Christopher Hendryx

David Hutchins

 

Smallfoot

Henrik Karlsson

Theo Vandernoot

Martin Furness

Dmitriy Kolesnik

 

Spider-Man: Into the Spider-Verse

Ian Farnsworth

Pav Grochola

Simon Corbaux

Brian D. Casper

 

Outstanding Effects Simulations in an Episode, Commercial, or Realtime Project

Altered Carbon

Philipp Kratzer

Daniel Fernandez

Xavier Lestourneaud

Andrea Rosa

 

Lost in Space; Jupiter is Falling

Denys Shchukin

Heribert Raab

Michael Billette

Jaclyn Stauber

 

Lost in Space; The Get Away

Juri Bryan

Will Elsdale

Hugo Medda

Maxime Marline

 

The Man in the High Castle; Statue of Liberty Destruction

Saber Jlassi

Igor Zanic

Nick Chamberlain

Chris Parks

 

Outstanding Compositing in a Photoreal Feature

Avengers: Infinity War; Titan

Sabine Laimer

Tim Walker

Tobias Wiesner

Massimo Pasquetti

 

First Man

Joel Delle-Vergin

Peter Farkas

Miles Lauridsen

Francesco Dell’Anna

 

Jurassic World: Fallen Kingdom

John Galloway

Enrik Pavdeja

David Nolan

Juan Espigares Enriquez

 

Welcome to Marwen

Woei Lee

Saul Galbiati

Max Besner

Thai-Son Doan

 

Outstanding Compositing in a Photoreal Episode

Altered Carbon

Jean-François Leroux

Reece Sanders

Stephen Bennett

Laraib Atta

 

The Handmaid's Tale; June

Winston Lee

Gwen Zhang

Xi Luo

Kevin Quatman

 

Lost in Space; Impact; Crash Site Rescue

David Wahlberg

Douglas Roshamn

Sofie Ljunggren

Fredrik Lönn

 

Silicon Valley; Artificial Emotional Intelligence; Fiona

Tim Carras

Michael Eng

Shiying Li

Bill Parker

 

Outstanding Compositing in a Photoreal Commercial

Apple; Unlock

Morten Vinther

Michael Gregory

Gustavo Bellon

Rodrigo Jimenez

 

Apple; Welcome Home

Michael Ralla

Steve Drew

Alejandro Villabon

Peter Timberlake

 

Genesis; G90 Facelift

Neil Alford

Jose Caballero

Joseph Dymond

Greg Spencer

 

John Lewis; The Boy and the Piano

Kamen Markov

Pratyush Paruchuri

Kalle Kohlstrom

Daniel Benjamin

 

Outstanding Visual Effects in a Student Project

Chocolate Man

David Bellenbaum

Aleksandra Todorovic

Jörg Schmidt

Martin Boué

 

Proxima-b

Denis Krez

Tina Vest

Elias Kremer

Lukas Löffler

 

Ratatoskr

Meike Müller

Lena-Carolin Lohfink

Anno Schachner

Lisa Schachner

 

Terra Nova

Thomas Battistetti

Mélanie Geley

Mickael Le Mezo

Guillaume Hoarau

Rodeo VFX supe Arnaud Brisebois on the Fantastic Beasts sequel

By Randi Altman

Fantastic Beasts: The Crimes of Grindelwald, directed by David Yates and written by J.K. Rowling, is a sequel to 2016's Fantastic Beasts and Where to Find Them. It follows Newt Scamander (Eddie Redmayne) and a young Albus Dumbledore (Jude Law) as they attempt to take down the dark wizard Gellert Grindelwald (Johnny Depp).

Arnaud Brisebois

As you can imagine, the film features a load of visual effects, and once again the team at Rodeo FX was called on to help. Their work included establishing the period in which the film is set and helping with the history of the Obscurus, Credence Barebone, and more.

Rodeo FX visual effects supervisor Arnaud Brisebois and team worked with the film’s VFX supervisors — Tim Burke and Christian Manz — to create digital environments, including detailed recreations of Paris in the 1920s and iconic wizarding locations like the Ministry of Magic.

Beyond these settings, the Montreal-based Brisebois was also in charge of creating the set pieces of the Obscurus’ destructive powers and a scene depicting its backstory. In all, they produced approximately 200 shots over a dozen sequences. While Brisebois visited the film’s set in Leavesden to get a better feel of the practical environments, he was not involved in principal photography.

Let’s find out more…

How early did you get involved, and how much input did you have?
Rodeo got involved in May 2017, at that point working mainly on pre-production creatures, design and concept art. I had a few calls with the film's VFX supervisors, Tim Burke and Christian Manz, to discuss the creatures and the main creative directions for us to play with. From there we tried various ideas.
At that moment in pre-production, the essence of what the creatures were was clear, but their visual representation could really swing between extremes. That was the time to invent, study and propose directions for design.

Can you talk about creating the Ministry of Magic, which was partially practical, yes?
Correct, the London Ministry of Magic was indeed partially built practically. The partial set in this case was a simple curved corridor with a ceramic-tiled wall. We still had to build the whole environment in CG in order to directly extend that practical set but, most importantly, we extended the environment itself, with its immense circular atrium filled with thousands of busy offices.

For this build, we were provided with original Harry Potter set plans from production designer Stuart Craig, as well as plan revisions meant specifically for Crimes of Grindelwald. We also had access to LIDAR scans and cross-polarized photography from areas of the Harry Potter tour in Leavesden, which was extremely useful.

Every single architectural element was precisely built as an individual unit, and each unit was composed of individual pieces. The single office variants were procedurally laid out on a flat grid over the set plan elevations and then wrapped into a cylinder using an expression.

The use of a procedural approach for this asset allowed for faster turnarounds and for changes to be made, even in the 11th hour. A crowd library was built to populate the offices and various areas of the Ministry, helping give it life and support the sense of scale.
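To make the grid-to-cylinder idea concrete, here is a minimal Python sketch of that kind of expression. This is not Rodeo FX's actual setup; the radius, floor height and unit count are invented for illustration:

import math

# Assumed dimensions, purely illustrative.
ATRIUM_RADIUS = 40.0    # meters
FLOOR_HEIGHT = 3.5      # meters per office level
UNITS_PER_RING = 48     # office units in one full ring

def wrap_unit(column, row):
    """Map a flat grid cell (column, row) to a position on the cylinder."""
    angle = (column / UNITS_PER_RING) * 2.0 * math.pi
    x = ATRIUM_RADIUS * math.cos(angle)
    z = ATRIUM_RADIUS * math.sin(angle)
    y = row * FLOOR_HEIGHT
    rotate_y = math.degrees(-angle) + 90.0  # turn the office front toward the atrium center
    return (x, y, z), rotate_y

# Lay out a 48 x 20 grid of office variants and wrap it.
placements = [wrap_unit(c, r) for r in range(20) for c in range(48)]

Because the layout is driven by a handful of parameters rather than by hand placement, a change to the set plan only means re-running the expression, which is what makes those 11th-hour revisions feasible.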

So you were able to use assets from previous films?
What really links these movies together is production designer Stuart Craig. This is definitely his world, at least in visual terms. Also, as with all the Potter films, there are a large number of references and guidelines available for inspiration. This world has its own mythology, history and visual language. One does not need to look for long before finding a hint, something to link or ground a new effect in the wizarding world.

What about the scenes involving the Obscurus? Was any of the destruction it caused practical?
Apart from a few fans blowing a bit of wind on the actors, all destruction was full-frontal CG. A complex model of Irma’s house was built with precise architectural details required for its destruction. We also built a wide library of high-resolution hero debris, which was scattered on points and simulated for the very close-up shots. In the end, only the actors were preserved from live photography.

What was the most challenging sequence you worked on?
It was definitely Irma's death. This sequence involved a wide variety of effects: cloth and RBD levitation, tearing cloth, huge RBD simulations and, of course, the Obscurus itself, which is a very abstract and complex cloth setup driving FLIP simulations. The challenge also came from the shot values, which meant everything we built or simulated had to hold up for tight close-ups as well as wide shots.

Can you talk about the tools you used for VFX, management and review and approval?
All our tracking and review is done in Autodesk Shotgun. Artists worked up versions that they would then submit for dailies. All these submissions got in front of me at one point or another, and I then reviewed them and entered notes and directives to guide artists in the right direction.
For a project the size of Crimes of Grindelwald, over the course of 10 months, I reviewed and commented on approximately 6,000 versions for about 500 assets and 200 shots.
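For a sense of what that tracking looks like in practice, here is a minimal sketch using Shotgun's Python API (shotgun_api3) to pull a pending-review queue. The site URL, script credentials, project name and the "rev" status code are all assumptions for illustration; studios configure their own status lists:

import shotgun_api3

# Connect with a script user (hypothetical credentials).
sg = shotgun_api3.Shotgun(
    "https://studio.shotgunstudio.com",
    script_name="dailies_review",
    api_key="XXXX",
)

# Find versions still waiting for review on the show.
pending = sg.find(
    "Version",
    filters=[
        ["project.Project.name", "is", "Crimes of Grindelwald"],
        ["sg_status_list", "is", "rev"],  # assumed "pending review" code
    ],
    fields=["code", "user", "created_at"],
)

for version in pending:
    print(version["code"], version["created_at"])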

We work mainly in a Maya-based pipeline, using it for modeling, rigging and shading. ZBrush is of course our main tool for organic modeling. We mostly use Mari and Substance Designer for textures. FX and CFX are handled in Houdini, and our lighting pipeline is Katana-based, using Arnold as the renderer. Our compositing pipeline is Nuke, with a little use of Flame/Flare for very specific cases. We also have proprietary tools that help us boost the potential of these great packages and offer custom solutions.

How did the workflow differ on this film from previous films?
It didn’t really differ. Working with the same team and the same crew, it really just felt like a continuation of our collaboration. These films are great to work on, not only because of their subject matter, but also thanks to the terrific people involved.

Roy H. Wagner, ASC, to speak at first Blackmagic Collective event

By Randi Altman

The newly formed Blackmagic Collective, a group founded by filmmakers for filmmakers, is holding the first of its free monthly meetings on Saturday, January 12 at the Gnomon School of Visual Effects in Hollywood.

The group, headed up by executive director Brett Harrison, says it is dedicated to education and to sharing knowledge about the art of filmmaking. "With Blackmagic Design's support, the group will feature 'TED Talk'-like presentations from media experts, panels covering post and production topics and film festivals, as well as networking opportunities."

In addition, Blackmagic Design is offering free Resolve training attached to the meetings. While Blackmagic is a sponsor, this is not a Blackmagic-run group. According to Harrison, "The Blackmagic Collective is an independent group created to support the art of filmmaking as a whole. We are also a 501(c)(3) charity, with plans to find ways to give back to the community." Membership is free, with no application process. Members can simply sign up on the site. Despite the name, Harrison insists that the group, while inspired by Blackmagic's filmmaking tools, is focused on filmmaking as a whole. "You do not need to use BMD tools to be a member," adds Harrison.

On creating the Collective, Harrison says, “After producing the Blackmagic Design Conference + Expo in LA early in 2018, I realized that a monthly group in Hollywood for filmmakers to learn from other professionals and share with and inspire each other would be well-received and vital, particularly for Blackmagic users in the industry. BMD allows for an end-to-end workflow that encompasses the spectrum of production and post, with endless topics for our group to focus on, though we will be speaking on a range of topics and not strictly BMD gear and software.”

At their first meeting, esteemed film and television cinematographer Roy H. Wagner, ASC, will be interviewed by Christian Sebaldt, ASC, with a focus on Roy’s new feature film Stand!. There will be a panel discussing the art and experiences of young colorists from Efilm, Apache and Company 3. Also, the Blackmagic Collective will be announcing a film festival that will start in April and end in November with a final competition. Filmmakers can submit films each month. Selected films will be streamed on the group website, with a select few shown at the monthly meetings starting in April. Members will have the opportunity to vote for the best each month, with a final competition for the top five films at the November event.

In case you were wondering, and we know you are, the current plan for the film festival is this:
“Our film festival submissions must use BMD technology to be eligible to enter the contest. That may include cameras, software or both, depending on the category,” explains Harrison.

The Collective will also be hosting job fairs at every other meeting.

“We are thrilled to be supporting the Blackmagic Collective,” says Blackmagic president Dan May. “Our company shares a passion with filmmakers by creating hardware and software that make their craft easier and more cost effective. We feel the Collective will provide the added resource of bringing a focus to the art form of filmmaking, as well as helping share new ideas and technology among creatives at all skill levels, from student to professional.”

You can sign up for the Resolve editing class or the event (or both) at the website.

Foundry Nuke 11.3’s performance, collaboration updates

Foundry has launched Nuke 11.3, introducing new features and updates to the company’s family of compositing and review tools. The release is the fourth update to the Nuke 11 Series and is designed to improve the user experience and to speed up heavy processing tasks for pipelines and individual users.

Nuke 11.3 lands with major enhancements to its Live Groups feature. It introduces new functionality along with corresponding Python callbacks and UI notifications that will allow for greater collaboration and offer more control. These updates make Live Groups easier for larger pipelines to integrate and give artists more visibility over the state of the Live Group and flexibility when using user knobs to override values within a Live Group.

The particle system in NukeX has been optimized to produce particle simulations up to six times faster than previous versions of the software, and up to four times faster for playback, allowing for faster iteration when setting up particle systems.

New Timeline Multiview support provides an extension to stereo and VR workflows. Artists can now use the same multiple-file stereo workflows that exist in Nuke on the Nuke Studio, Hiero and HieroPlayer timeline. The updated export structure can also be used to create multiple-view Nuke scripts from the timeline in Nuke Studio and Hiero.

Support for full-resolution stereo on monitor out makes review sessions even easier, and a new export preset helps with rendering of stereo projects.

New UI indications for changes in bounding box size and channel count help artists troubleshoot their scripts. A visual indication identifies nodes that increase bounding box size to be greater than the image, helping artists to identify the state of the bounding box at a glance. Channel count is now displayed in the status bar, and a warning is triggered when the 1024-channel limit is exceeded. The appearance and threshold for triggering the bounding box and channel warnings can be set in the preferences.
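As a rough illustration of what the bounding box indication is checking (a sketch only, not Foundry's implementation), a script along these lines can flag nodes whose bounding box has grown beyond the output resolution using Nuke's Python API:

import nuke

def oversized_bbox_nodes(tolerance=0):
    """Return names of nodes whose bbox extends past their output format."""
    flagged = []
    for node in nuke.allNodes():
        try:
            box = node.bbox()
        except Exception:
            continue  # some nodes have no image output
        width, height = node.width(), node.height()
        if (box.x() < -tolerance or box.y() < -tolerance or
                box.x() + box.w() > width + tolerance or
                box.y() + box.h() > height + tolerance):
            flagged.append(node.name())
    return flagged

print(oversized_bbox_nodes())

The built-in warning goes further, drawing the indication on the node itself and letting you tune the threshold in the preferences.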

The selection tool has also been improved in both 2D and 3D views, and an updated marquee and new lasso tool make selecting shapes and points even easier.

Nuke 11.3 is available for purchase — alongside full release details — on Foundry’s website and via accredited resellers.

Review: Foundry’s Athera cloud platform

By David Cox

I’ve been thinking for a while that there are two types of post houses — those that know what cloud technology can do for them, and those whose days are numbered. That isn’t to say that the use of cloud technology is essential to the survival of a post house, but if they haven’t evaluated the possibilities of it they’re probably living in the past. In such a fast-moving business, that’s not a good place to be.

The term "cloud computing" suffers from being hijacked by know-nothing marketeers and has become a bit vague in meaning. It's quite simple though: it just means a computer (or storage) owned and maintained by someone else, housed somewhere else and used remotely. The advantage is that a post house can reduce its destructive fixed overheads by owning fewer computers and thus save money on installation and upkeep. Cloud computers can be used as and when they are needed. This allows scaling up and down in proportion to workload.

Over the last few years, several providers have created global datacenters containing upwards of 50,000 servers per site, entirely for the use of anyone who wants to “remote in.” Amazon and Google are the two biggest providers, but as anyone who has tried to harness their power for post production can confirm, they’re not simple to understand or configure. Amazon alone has hundreds of different computer “instance” types, and accessing them requires navigating through a sea of unintelligible jargon. You must know your Elastic Beanstalks from your EC2, EKS and Lambda. And make sure you’ve worked out how to connect your S3, EFS and Glacier. Software licensing can also be tricky.

The truth is, these incredible cloud installations are for cleverer people than those of us that just like to make pretty pictures. They are more for the sort that like to build neural networks and don’t go outside very much. What our industry needs is some clever company to make a nice shiny front end that allows us to harness that power using the tools we know and love, and just make it all a bit simpler. Enter Athera, from Foundry. That’s exactly what they’ve done.

What is Athera?

Athera is a platform hosted on Google Cloud infrastructure that presents a user with icons for apps such as Nuke and Houdini. Access to each app is via short-term (30-day) rental. When an available app icon is clicked, a cloud computer is commanded into action, pre-installed with the chosen app. From then on, the app is used just as if locally installed. Of course, the app is actually running on a high-performance computer located in a secure and nicely cooled datacenter environment. Provided the user has a vaguely decent Internet connection, they’re good to go, because only the user interface is being transmitted across the network, not the actual raw image data.

Apps available on Athera include Foundry's products, plus a few others. Nuke is represented in its base form, plus a Nuke X variant, Nuke Studio, and a combination of Nuke X and Cara VR. Also available are the Mari texture-painting suite, the Katana look development app and the Modo CGI modeling software.

Athera also offers access to non-Foundry products like CGI software Houdini and Blender, as well as the Gaffer management tool.

Nuke

In my first test, I rustled up an instance of Nuke Studio and one of Blender. The first thing I wanted to test was the GPU speed, as this can be somewhat variable for many cloud computer types (usually between zero and not much). I was pleasantly surprised, as the rendering speed was close to that of a local Nvidia GeForce GTX 1080, which is pretty decent. I was also pleased to see that user preferences were maintained between sessions.

One thing that particularly impressed me was how I could call up multiple apps together and Athera would effectively build a network in the background to link them all up. Frames rendered out of Blender were instantly available in the cloud-hosted Nuke Studio, even though it was running on a different machine. This suggests the Athera infrastructure is well thought out because multi-machine, networked pipelines with attached storage are constructed with just a few clicks and without really thinking about it.

Access to the Athera apps is either by web browser or via a local client software called “Orbit.” In web browser mode, each app opens in its own browser tab. With Orbit, each app appears in a dedicated local window. Orbit boasts lower latency and the ability to use local hardware such as multiple monitors. Latency, which would show itself as a frustrating delay between control input and visual feedback, was impressively low, even when using the web browser interface. Generally, it was easy to forget that the app being used was not installed locally.

Getting files in and out was also straightforward. A Dropbox account can be directly linked, although a Google or Amazon S3 storage “bucket” is preferred for speed. There is also a hosted app called “Toolbox,” which is effectively a file browser to allow the management of files and folders.

The Athera platform also contains management and reporting features. A manager can set up projects and users, setting out which apps and projects a user has access to. Quotas can be set, and full reports are given as to who did what, when and with which app.

Athera’s pricing is laid out on their website and it’s interesting to drill into the costs and make comparisons. A user buys access to apps in 30-day blocks. Personally, I would like to see shorter blocks at some point to increase up/down scale flexibility. That said, render-only instances for many of the apps can be accessed on a per-second billing basis. The 30-day block comes with a “fair use” policy of 200 hours. This is a hard limit, which equates to around nine and a half hours per day for five-day weeks (which is technically known in post production as part time).

Figuring Out Cost
Blender is a good place to start analyzing cost because it's open-source (free) software, so the $244 Athera cost to run it for 30 days/200 hours must be for hardware only. This equates to $1.22 per hour, which, compared to direct cloud computer usage, is pretty good value for the GPU-backed machine on offer.

Modo

Another way of looking at $244 a month: a new computer costing $5,800 loses roughly that amount in value each month if depreciated over two years, that is, if it is kept for two years before being replaced. Depreciated over three years, the figure is about $80 per month less. Of course, that's just the cost of depreciation. Cost of ownership must also include the costs of updating, maintaining, powering, cooling, insuring, housing and repairing it if (when!) it breaks down. If a cloud computer breaks down, Google has a few thousand waiting in the wings. In general, the base hardware cost seems quite competitive.
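For what it's worth, the arithmetic behind those comparisons is simple enough to sanity-check:

# Back-of-envelope numbers from the paragraphs above (illustrative only).
athera_blender_block = 244.0                      # dollars per 30 days / 200 hours
hourly_rate = athera_blender_block / 200          # => $1.22 per hour

workstation_cost = 5800.0
monthly_depreciation_2yr = workstation_cost / 24  # => ~$242 per month
monthly_depreciation_3yr = workstation_cost / 36  # => ~$161 per month

print(f"${hourly_rate:.2f}/hour, ${monthly_depreciation_2yr:.0f}/month (2yr), "
      f"${monthly_depreciation_3yr:.0f}/month (3yr)")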

Of course, Blender is not really the juicy stuff. Access to a base Nuke, complete with workstation, is $685 per 30 days / 200 hours. Nuke X is $1,025. There are also “power” options for around 20% more, where a significantly more powerful machine is provided. Compared to running a local machine with purchased or rented software, these prices are very interesting. But when the ability to scale up and down with workload is factored in, especially being able to scale down to nothing during quiet times, the case for Athera becomes quite compelling.

Another helpful factor is that a single 30-day access block to a particular app can be shared between multiple users — as long as only one user has control of the app at a time. This is subject to the fair use limitation.

There is an issue if commercial (licensed) plug-ins are needed. For the time being, these can’t be used on Athera due to the obvious licensing issues relating to their installation on a different cloud machine each time. Hopefully, plugin developers will become alive to the possibilities of pay-per-use licensing, as a platform like Athera could be the perfect storefront.

Mari

Security
One of the biggest concerns about using remote computing is that of security. This concern tends to be more perceptual than real. The truth is that a Google datacenter is likely to have significantly more security than an average post company’s machine room. Also, they will be employing the best in the security business. But if material being worked on leaks out into the public, telling a client, “But I just sent it to Google and figured it would be fine,” isn’t going to sound great. Realistically, the most likely concern for security is the sending of data to and from a datacenter. A security breach inside the datacenter is very unlikely. As ever, a post producer has to remain vigilant.

Summing Up
I think Foundry has been very smart and forward thinking to create a platform that is able to support more than just Foundry products in the cloud. It would have been understandable if they just made it a storefront for alternative ways of using a Nuke (etc), but they clearly see a bigger picture. Using a platform like Athera, post infrastructure can be assembled and disassembled on demand to allow post producers to match their overheads to their workload.

Athera enables smart post producers to build a highly scalable post environment with access to a global pool of creative talent who can log in and contribute from anywhere with little more than a modest computer and internet connection.

I hate the term game-changer — it’s another term so abused by know-nothing marketeers who have otherwise run out of ideas — but Athera, or at least what this sort of platform promises to provide, is most certainly a game-changer. Especially if more apps from different manufacturers can be included.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Behind the Title: Jogger Studios’ CD Andy Brown

This veteran creative director can also often be found at the controls of his Flame working on a new spot.

NAME: Andy Brown

COMPANY: Jogger Studios (@joggerstudios)

CAN YOU DESCRIBE YOUR COMPANY?
We are a boutique post house with offices in the US and UK providing visual effects, motion graphics, color grading and finishing. We are partnered with Cut + Run for editorial and get to work with their editors from around the world. I am based in our Jogger Los Angeles office, after having helped found the company in London.

WHAT’S YOUR JOB TITLE?
Creative Director

WHAT DOES THAT ENTAIL?
Overseeing compositing, visual effects and finishing. Looking after staff and clients. Juggling all of these things and anticipating the unexpected.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I'm still working "on the box" every day. Even though my title is creative director, it is the hands-on work that is my first love as far as project collaborations go. Also, I get to reprogram the phones and crawl under the desks to get the wires looking neater when viewed from the client couch.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The variety, the people and the challenges. Just getting to work on a huge range of creative projects is such a privilege. How many people get to go to work each day looking forward to it?

WHAT’S YOUR LEAST FAVORITE?
The hours, occasionally. It's more common to have to work without clients nowadays, which definitely makes for more work sometimes, as you might need to create two or three versions of a spot to get approval. If everyone is in the room together, you reach a consensus more quickly.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I like the start of the day best, when everyone is coming into the office and we are getting set up for whatever project we are working on. Could be the first coffee of the day that does it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I want to say classic car dealer, but given my actual career path the most likely alternative would be editor.

WHY DID YOU CHOOSE THIS PROFESSION?
There were lots of reasons, when I look at it. It was either the Blue Peter Book of Television (the longest running TV program for kids, courtesy of the BBC) or my visit to the HTV Wales TV station with my dad when I was about 12. We walked around the studios and they were playing out a film to air, grading it live through a telecine. I was really struck by the influence that the colorist was having on what was seen.

I went on to do critical work on photography, film and television at the Centre for Contemporary Cultural Studies at Birmingham University. Part of that course involved being shown around the Pebble Mill BBC Studios. They were editing a sequence covering a public enquiry into the Handsworth riots in 1985. It just struck me how powerful the editing process was. The story could be told so many different ways, and the editor was playing a really big part in the process.

Those experiences (and an interest in writing) led me to think that television might be a good place to work. I got my first job as a runner at MPC after a friend had advised me how to get a start in the business.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We worked on a couple of spots for Bai recently with Justin Timberlake creating the “brasberry.” We had to make up some graphic animations for the newsroom studio backdrop for the shoot and then animate opening title graphics to look just enough like it was a real news report, but not too much like a real news report.

We do quite a bit of food work, so there’s always some burgers, chicken or sliced veggies that need a bit of love to make them pop.

There’s a nice set extension job starting next week, and we recently finished a job with around 400 final versions, which made for a big old deliverables spreadsheet. There’s so much that we do that no one sees, which is the point if we do it right.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Sometimes the job that you are most proud of isn’t necessarily the most amazing thing to look at. I used to work on newspaper commercials back in the UK, and it was all so “last minute.” A story broke, and all of a sudden you had to have a spot ready to go on air with no edit, no footage and only the bare bones of a script. It could be really challenging, but we had to get it done somehow.

But the best thing is seeing something on TV that you’ve worked on. At Jogger Studios, it is primarily commercials, so you get that excitement over and over again. It’s on air for a few weeks and then it’s gone. I like that. I saw two of our spots in a row recently on TV, which I got a kick out of. Still looking for that elusive hat-trick.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The Flame, the Land Rover Series III and, sadly, my glasses.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Just friends and family on Instagram, mainly. Although like most Flame operators, I look at the Flame Learning Channel on YouTube pretty regularly. YouTube also thinks I’m really interested in the Best Fails of 2018 for some reason.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
More often than not it is podcasts. West Wing Weekly, The Adam Buxton Podcast, Short Cuts and Song Exploder. Plus some of the shows on BBC 6 Music, which I really miss.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I go to work every day feeling incredibly lucky to be doing the job that I do, and it’s good to remember that. The 15-minute walk to and from work in Santa Monica usually does it.

Living so close to the beach is fantastic. We can get down to the sand, get the super-brella set up and get in the sea with the bodyboards in about 15 minutes. Then there’s the Malibu Cars & Coffee, which is a great place to start your Sunday.

Review: Blackmagic’s Resolve 15

By David Cox

DaVinci Resolve 15 from Blackmagic Design has now been released. The big news is that Blackmagic’s compositing software Fusion has been incorporated into Resolve, joining the editing and audio mixing capabilities added to color grading in recent years. However, to focus just on this would hide a wide array of updates to Resolve, large and small, across the entire platform. I’ve picked out some of my favorite updates in each area.

For Colorists
Each time Blackmagic adds a new discipline to Resolve, colorists fear that the color features take a back seat. After all, Resolve was a color grading system long before anything else. But I’m happy to say there’s nothing to fear in Version 15, as there are several very nice color tweaks and new features to keep everyone happy.

I particularly like the new “stills store” functionality, which allows the colorist to find and apply a grade from any shot in any timeline in any project. Rather than just having access to manually saved grades in the gallery area, thumbnails of any graded shot can be viewed and copied, no matter which timeline or project they are in, even those not explicitly saved as stills. This is great for multi-version work, which is every project these days.

Grades saved as stills (and LUTs) can also be previewed on the current shot using the "Live Preview" feature. Hovering the mouse cursor over a still and scrubbing left and right will show the current shot with the selected grade temporarily applied. It makes quick work of finding the most appropriate look from an existing library.

Another new feature I like is called "Shared Nodes." A color grading node can be set as "shared," which creates a common grading node that can be inserted into multiple shots. Changing one instance changes all instances of that shared node. This approach is more flexible and visible than using Groups, as the node can be seen in each node layout and can sit at any point in the process flow.

As well as the addition of multiple playheads, a popular feature in other grading systems, there is a plethora of minor improvements. For example, you can now drag the qualifier graphics to adjust settings, as opposed to just the numeric values below them. There are new features to finesse the mattes generated from the keying functions, as well as improvements to the denoise and face refinement features. Nodes can be selected with a single click instead of a double click. In fact, there are 34 color improvements or new features listed in the release notes.

For Editors
As with color, there are a wide range of minor tweaks all aimed at improving feel and ergonomics, particularly around dynamic trim modes, numeric timecode entry and the like. I really like one of the major new features, which is the ability to open multiple timelines on the screen at the same time. This is perfect for grabbing shots, sequences and settings from other timelines.

As someone who works a lot with VFX projects, I also like the new “Replace Edit” function, which is aimed at those of us that start our timelines with early drafts of VFX and then update them as improved versions come along. The new function allows updated shots to be dragged over their predecessors, replacing them but inheriting all modifications made, such as the color grade.

An additional feature to the existing markers and notes functions is called “Drawn Annotations.” An editor can point out issues in a shot with lines and arrows, then detail them with notes and highlight them with timeline markers. This is great as a “note to self” to fix later, or in collaborative workflows where notes can be left for other editors, colorists or compositors.

Previous versions of Resolve had very basic text titling. Thanks to the incorporation of Fusion, the edit page of Resolve now has a feature called Text+, a significant upgrade on the incumbent offering. It allows more detailed text control, animation, gradient fills, dotted outlines, circular typing and so on. Within Fusion there is a modifier called “Follower,” which enables letter-by-letter animation, allowing Text+ to compete with After Effects for type animation. On my beta test version of Resolve 15, this wasn’t available in the Edit page, which could be down to the beta status or an intent to keep the Text+ controls in the Edit page more streamlined.

For Audio
I'm not an audio guy, so my usefulness in reviewing these parts is distinctly limited. There are 25 listed improvements or new features, according to the release notes. One is the incorporation of Fairlight's Automated Dialog Replacement process, which creates a workflow for replacing unsalvageable original dialog recordings.

There are also 13 new built-in audio effects plugins, such as Chorus, Echo and Flanger, as well as de-esser and de-hummer clean-up tools.
Another useful addition for both audio mixers and editors is the ability to import entire audio effects libraries, which can then be searched and star-rated from within the Edit and Fairlight pages.

Now With Added Fusion
So to the headline act — the incorporation of Fusion into Resolve. Fusion is a highly regarded node-based 2D and 3D compositing software package. I reviewed Version 9 in postPerspective last year [https://postperspective.com/review-blackmagics-fusion-9/]. Bringing it into Resolve links it directly to editing, color grading and audio mixing to create arguably the most agile post production suite available.

Combining Resolve and Fusion will create some interesting challenges for Blackmagic, who say that the integration of the two will be ongoing for some time. Their challenge isn’t just linking two software packages, each with their own long heritage, but in making a coherent system that makes sense to all users.

The issue is this: editors and colorists need to work at a fast pace, and want the minimum number of controls clearly presented. A compositor needs infinite flexibility and wants a button and value for every function, with a graph and ideally the ability to drive it with a mathematical expression or script. Creating an interface that suits both is near impossible. Dumbing down a compositing environment limits its ability, whereas complicating an editing or color environment destroys its flow.

Fusion occupies its own "page" within Resolve, alongside pages for "Color," "Fairlight" (audio) and "Edit." This is a good solution insofar as each interface can be tuned for its dedicated purpose, and the joins between the pages work very well: a user can seamlessly move from Edit to Fusion to Color and back again, without delays, rendering or importing. If a user is familiar with Resolve and Fusion, it works very well indeed. If the user is not accustomed to high-end node-based compositing, then the Fusion page can be daunting.

I think the challenge going forward will be how to make the creative possibilities of Fusion more accessible to colorists and editors without compromising the flexibility a compositor needs. Certainly, there are areas in Fusion that can be made more obvious. As with many mature software packages, Fusion has the occasional hidden right-click or alt-click function that is hard for new users to discover. But beyond that, the answer is probably to let a subset of Fusion's abilities creep into the Edit and Color pages, where more common tasks can be accommodated with simplified control sets and interfaces. This is actually already the case with Text+, a Fusion "effect" that is directly accessible within the Edit section.

Another possible area to help is Fusion Macros. This is an inbuilt feature within Fusion that allows a designer to create an effect and then condense it down to a single node, including just the specific controls needed for that combined effect. Currently, Macros that integrate the Text+ effect can be loaded directly in the Edit page’s “Title Templates” section.

I would encourage Blackmagic to open this up further to allow any sort of Macro to be added for video transitions, graphics generators and the like. This could encourage a vibrant exchange of user-created effects, which would arm editors and colorists with a vast array of immediate and community sourced creative options.

Overall, the incorporation of Fusion is a definite success in my view, whether used to empower multi-skilled post creatives or to provide a common environment for specialized creatives to collaborate. The volume of updates, and the speed at which the Resolve software developers address the issues exposed during public beta trials, remain nothing short of impressive.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Framestore Chicago adds compositing lead Chris Beers

Framestore Chicago has added Chris Beers as compositing lead. He will be working across a variety of clients with the Chicago team, as well as diving into Nuke projects of his own.

Beers attended The Illinois Institute of Art, where he earned a BFA in visual effects and motion graphics. After graduation he honed his skills as a junior motion graphics artist at Leviathan in Chicago, and has since worked on projects of all sizes.

Beers’ career highlights include working as an After Effects artist on an expansive projection mapping project for Brazilian musician Amon Tobin’s ISAM world tour, and as a Nuke compositor for the title sequences on Marvel films Ant-Man, Captain America: Civil War and Doctor Strange. Beers was lead compositor on the series finale of Netflix science-fiction drama Sense8, having worked with the team as a Nuke compositor across both seasons of the show.

“As we recently celebrated our office’s official first year and continue to expand, it’s talent like Chris that makes our studio what it is: a creative hub with a strong sense of community, but the firepower of an integrated, global studio,” says Framestore Chicago’s MD Krystina Wilson.

Creative editorial and post boutique Hiatus opens in Detroit

Hiatus, a full-service, post production studio with in-house creative editorial, original music composition and motion graphics departments, has opened in Detroit. Their creative content offerings cover categories such as documentary, narrative, conceptual, music videos and advertising media for all video platforms.

Led by founder/senior editor Shane Patrick Ford, the new company includes executive producer/partner Catherine Pink and executive producer Joshua Magee, who joins Hiatus from the animation studio Lunar North. Additional talent includes editor Josh Beebe, composer/editor David Chapdelaine and animator James Naugle.

The roots of Hiatus began with The Factory, a music venue founded by Ford while he was still in college. It gave local Detroit musicians, as well as touring bands, a place to play. Ford, along with a small group of creatives, then formed The Work, a production company focused on commercial and advertising projects. For Ford, the launch of Hiatus is an opportunity to focus solely on his editorial projects and to expand his creative reach, and that of his team, nationally.

Leading up to the launch of Hiatus, the team has worked on projects for brands such as Sony, Ford Motor Company, Acura and Bush’s, as well as recent music videos for Lord Huron, Parquet Courts and the Wombats.

The Hiatus team is also putting the finishing touches on the company’s first original feature film Dare to Struggle, Dare to Win. The film uncovers a Detroit Police decoy unit named STRESS and the efforts made to restore civil order in 1970s post-rebellion Detroit. Dare to Struggle, Dare to Win makes its debut at the Indy Film Festival on Sunday April 29th and Tuesday May 1st in Indianapolis, before it hits the film festival circuit.

“Launching Hiatus was a natural evolution for me,” says Ford. “It was time to give my creative team even more opportunities, to expand our network and to collaborate with people across the country that I’ve made great connections with. As the post team evolved within The Work, we outgrew the original role it played within a production company. We began to develop our own team, culture, offerings and our own processes. With the launch of Hiatus, we are poised to better serve the visual arts community, to continue to grow and to be recognized for the talented creative team we are.”

“Instead of having a post house stacked with people, we’d prefer to stay small and choose the right personal fit for each project when it comes to color, VFX and heavy finishing,” explains Hiatus EP Catherine Pink. “We have a network of like-minded artists that we can call on, so each project gets the right creative attention and touch it deserves. Also, the lower overhead allows us to remain nimble and work with a variety of budget needs and all kinds of clients.”

VFX supervisor Lesley Robson-Foster on Amazon’s Mrs. Maisel

By Randi Altman

If you are one of the many who tend to binge-watch streaming shows, you’ve likely already enjoyed Amazon’s The Marvelous Mrs. Maisel. This new comedy focuses on a young wife and mother living in New York City in 1958, when men worked and women tended to, well, not work.

After her husband leaves her, Mrs. Maisel chooses stand-up comedy over therapy — or you could say stand-up comedy chooses her. The show takes place in a few New York neighborhoods, including the tony Upper West Side, the Garment District and the Village. The storyline brings real-life characters into this fictional world — Midge Maisel studies by listening to Redd Foxx comedy albums, and she also befriends comic Lenny Bruce, who appears in a number of episodes.

Lesley Robson-Foster on set.

The show, created by Amy Sherman-Palladino and Dan Palladino, is colorful and bright and features a significant amount of visual effects — approximately 80 per episode.

We reached out to the show’s VFX supervisor, Lesley Robson-Foster, to find out more.

How early did you get involved in Mrs. Maisel?
The producer Dhana Gilbert brought my producer Parker Chehak and me in early to discuss feasibility issues, as this is a period piece, and to see if Amy and Dan liked us! We've been on since the pilot.

What did the creators/showrunners say they needed?
They needed 1958 New York City, weather changes and some very fancy single-shot blending. Also, some fantasy and magic realism.

As you mentioned, this is a period piece, so I’m assuming a lot of your work is based on that.
The big period shots in Season 1 are the Garment District reconstruction. We shot on 19th Street between 5th and 6th — the brilliant production designer Bill Groom built a third of the street practically, and VFX took care of the rest, such as crowd duplication, CG cars and CG crowds. Then we shot on Park Avenue and had to remove the MetLife Building down near Grand Central, and knock out anything post-1958.

We also did a major gag with the driving footage. We shot driving plates around the Upper West Side and had a flotilla of period-correct cars with us, but could not get rid of all the parked cars. My genius design partner on the show Douglas Purver created a wall of parked period CG cars and put them over the modern ones. Phosphene then did the compositing.

What other types of effects did you provide?
Amy and Dan — the creators and showrunners — haven’t done many VFX shows, but they are very, very experienced. They write and ask for amazing things that allow me to have great fun. For example, I was asked to make a shot where our heroine is standing inside a subway car, and then the camera comes hurtling backwards through the end of the carriage and then sees the train going away down the tunnel. All we had was a third of a carriage with two and a half walls on set. Douglas Purver made a matte painting of the tunnel, created a CG train and put it all together.

Can you talk about the importance of being on set?
For me being on set is everything. I talk directors out of VFX shots and fixes all day long. If you can get it practically you should get it practically. It’s the best advice you’ll ever give as a VFX supervisor. A trust is built that you will give your best advice, and if you really need to shoot plates and interrupt the flow of the day, then they know it’s important for the finished shot.

Having a good relationship with every department is crucial.

Can you give an example of how being on set might have saved a shot or made a shot stronger?
This is a character-driven show. The directors really like Steadicam and long, long shots following the action. Even though a lot of the effects we want to do really demand motion control, I know I just can’t have it. It would kill the performances and take up too much time and room.

I run around with string and tennis balls to line things up. I watch the monitors carefully and use QTake to make sure things line up within acceptable parameters.

In my experience you have to have the production’s best interests at heart. Dhana Gilbert knows that a VFX supervisor on the crew and as part of the team smooths out the season. They really don’t want a supervisor who is intermittent and doesn’t have the whole picture. I’ve done several shows with Dhana; she knows my idea of how to service a show with an in-house team.

You shot b-roll for this? What camera did you use, and why?
We used a Blackmagic Ursa Mini Pro. We rented one on The OA for Netflix last year and found it to be really easy to use. We liked that it's self-contained and that we can use the Canon glass from our DSLR kits. It's got a built-in monitor and it can shoot 4.6K RAW. It cut in just fine with the Alexa Mini for establishing shots and plates. It fits into a single backpack, so we could get a shot at a moment's notice. The user interface on the camera is so intuitive that anyone on the VFX team could pick it up and learn how to get the shot in 30 minutes.

What VFX houses did you employ, and how do you like to work with them?
We keep as much as we can in New York City, of course. Phosphene is our main vendor, and we like Shade and Alkemy X. I like RVX in Iceland, El Ranchito in Spain and Rodeo in Montreal. I also have a host of secret-weapon individuals dotted around the world. For Parker and me, it's always horses for courses. Whom we send the work to depends on the shot.

For each show we build a small in-house team — we do the temps and figure out the design, and shoot plates and elements before shots leave us to go to the vendor.

You’ve worked on many critically acclaimed television series. Television is famous for quick turnarounds. How do you and your team prepare for those tight deadlines?
Television schedules can be relentless. Prep, shoot and post all at the same time. I like it very much as it keeps the wheels of the machine oiled. We work on features in between the series and enjoy that slower process too. It’s all the same skill set and workflow — just different paces.

If you have to offer a production a tip or two about how to make the process go more smoothly, what would it be?
I would say be involved with EVERYTHING. Keep your nose close to the ground. Really familiarize yourself with the scripts — head trouble off at the pass by discussing upcoming events with the relevant person. Be fluid and flexible and engaged!

Behind the Title: Senior compositing artist Marcel Lemme

We recently reached out to Marcel Lemme to find out more about how he works, his background and how he relaxes.

What is your job title and where are you based?
I’m a senior compositing artist based out of Hamburg, Germany.

What does your job entail?
I spend about 90 percent of my time working on commercial jobs for local and international companies like BMW, Audi and Nestle, but also dabble in feature films, corporate videos and music videos. On a regular day, I’m handling everything from job breakdowns to set supervision to conform. I’m also doing shot management for the team, interacting with clients, showing clients work and some compositing. Client review sessions and final approvals are regular occurrences for me too.

What would surprise people the most about the responsibilities that fall under that title?
When it comes to client-attended sessions, you have to be part clown, part mind-reader. Half the job is being a good artist; the other half is keeping clients happy. You have to anticipate what the client will want and balance that with what you know looks best. I not only have to create and keep a good mood in the room, but also problem-solve with a smile.

What’s your favorite part of your job?
I love solving problems when compositing solo. There’s nothing better than tackling a tough project and getting results you’re proud of.

What’s your least favorite?
Sometimes the client isn’t sure what they want, which can make the job harder.

What’s your most productive time of day?
I’m definitely not a morning guy, so the evening — I’m more productive at night.

If you didn’t have this job, what would you be doing instead?
I’ve asked myself this question a lot, but honestly, I’ve never come up with a good answer.

How’d you get your first job, and did you know this was your path early on?
I fell into it. I was young and thought I’d give computer graphics a try, so I reached out to someone who knew someone, and before I knew it I was interning at a company in Hamburg, which is how I came to know online editing. At the time, Quantel mostly dominated the industry with Editbox and Henry, and Autodesk Flame and Flint were just emerging. I dove in and started using all the technology I could get my hands on, and gradually started securing jobs based on recommendations.

Which tools are you using today, and why?
I use whatever the client and/or the project demands, whether it’s Autodesk Flame or Foundry’s Nuke; for tracking, I often use The Pixel Farm’s PFTrack and Boris FX’s Mocha. For commercial spots, I’ll do a lot of the conform and shot management in Flame and then hand off the shots to other team members. Or, if I do it myself, I’ll finish in Flame because I know I can do it fast.

I use Flame because it gives me different ways to achieve a certain look or find a solution to a problem. I can also play a clip at any resolution with just two clicks in Flame, which is important when you’re in a room with clients who want to see different versions on the fly. The recent open clip updates and Python integration have also saved me time. I can import and review shots, with new versions coming in automatically, and build new tools or automate tedious processes in the post chain that have typically slowed me down.
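
As a rough illustration of the kind of tedious chore such scripting can take over, here is a minimal Python sketch of a version scanner. It is not Flame’s actual API; the shot path and the vNNN naming convention are hypothetical.

    # Hypothetical version scanner: find the newest vNNN folder per shot
    # so new versions can be picked up automatically. Not Flame's API.
    from pathlib import Path

    def latest_versions(shot_root):
        latest = {}
        for v in Path(shot_root).glob("*/v[0-9][0-9][0-9]"):
            shot, num = v.parent.name, int(v.name[1:])
            if num > latest.get(shot, (0, None))[0]:
                latest[shot] = (num, v)
        return {shot: path for shot, (num, path) in latest.items()}

    for shot, path in latest_versions("/projects/spot/shots").items():
        print(shot, "->", path)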

Tell us about some recent project work.
I recently worked on a project for BMW as a compositing supervisor and collaborated with eight other compositors to finish a number of versions in a short amount of time. We did shot management, compositing, reviewing, versioning and such in Flame, along with individual shot compositing in Nuke and some tracking in Mocha Pro.

What is the project that you are most proud of?
There’s no one project that stands out in particular, but overall, I’m proud of jobs like the BMW spots, where I’ve led a team of artists and everything just works and flows. It’s rewarding when the client doesn’t know what you did or how you did it, but loves the end result.

Where do you find inspiration for your projects?
The obvious answer here is other commercials, but I also watch a lot of movies and, of course, spend time on the Internet.

Name three pieces of technology you can’t live without.
The off button on the telephone (they should really make that bigger), anything related to cinematography or digital cinema, and streaming technology.

What social media channels do you follow?
I’ve managed to avoid Facebook, but I do peek at Twitter and Instagram from time to time. Twitter can be a great quick reference for regional news or finding out about new technology and/or industry trends.

Do you listen to music while you work?
Less now than I did when I was younger. Most of the time, I can’t, as I’m juggling too much and it’s distracting. When I do listen to music, I appreciate techno, classical and singer/songwriter stuff: whatever sets the mood for the shots I’m working on. Right now, I’m into Iron and Wine and Trentemøller, a Danish electronic music producer.

How do you de-stress from the job?
My drive home. It can take anywhere from half an hour to an hour, depending on the traffic, and that’s my alone time. Sometimes I listen to music, other times I sit in silence. I cool down and prepare to switch gears before heading home to be with my family.

Foundry intros Mari 4.0

Foundry has launched Mari 4.0, the latest version of the company’s digital 3D painting and texturing tool. The release brings a host of advanced features that make the tool easier to use and faster to learn, including more flexible and configurable exporting, simpler navigation and a raft of improved workflows.

Key benefits of Mari 4.0 include:
Quicker start-up and export: Mari 4.0 allows artists to get projects up and running faster with a new startup mechanism that automatically performs the steps previously completed manually by the user. Shaders are automatically built, with channels connected to them as defined by the channel presets in the startup dialog. The user also now gets a choice of initial lighting and shading setup. The new Export Manager configures the batch exporting of Channels and Bake Point Nodes. Artists can create and manage multiple export targets from the same source, as well as perform format conversions during export. This allows for far more control and flexibility when passing Mari’s texture maps down the pipeline.

Better navigation: A new Palettes Toolbar containing all Mari’s palettes offers easy access and visibility to everything Mari can do. It’s now easier to expand a Palette to fullscreen by hitting the spacebar while your mouse is hovered over it. Tools of a similar function have been grouped under a single button in the Tools toolbar, taking up less space and allowing the user to better focus on the Canvas. Various Palettes have been merged together, removing duplication and simplifying the UI, making Mari both easier to learn and use.

Improved UI: The Colors Palette is now scalable for better precision, and the component sliders have been improved to show the resulting color at each point along the control. Users can now fine tune their procedural operations with precision keyboard stepping functionality brought into Mari’s numeric controls.

The HUD has been redesigned so it no longer draws over the paint subject, allowing the user to better focus on their painting and work more effectively. Basic Node Graph mode has been removed: Advanced is now the default. For everyone learning Mari, the Non-Commercial version now has full Node Graph access.

Enhanced workflows: A number of key workflow improvements have been brought to Mari 4.0. A drag-and-drop fill mechanism allows users to fill paint across their selections in a far more intuitive manner, reducing time and increasing efficiency. The Brush Editor has been merged into the Tool Properties Palette, with the brush being used now clearly displayed. It’s now easy to browse and load sets of texture files into Mari, with a new Palette for browsing texture sets. The Layers Palette is now more intuitive when working with Group layers, allowing users to achieve the setups they desire with fewer steps. And users now have a shader in Mari that previews and works with the channels that match their final 3D program/shader: the Principled BRDF, based on the 2012 paper by Brent Burley of Walt Disney Animation Studios.
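
For context, the diffuse lobe of Burley’s Principled BRDF fits in a few lines. The sketch below follows the formula from the 2012 paper; it is an illustration of the model, not Mari’s shader code.

    # Diffuse lobe of the Burley 2012 "Principled" BRDF (illustrative only).
    # n, l, v are unit-length normal, light and view vectors.
    import numpy as np

    def disney_diffuse(base_color, roughness, n, l, v):
        h = (l + v) / np.linalg.norm(l + v)              # half vector
        cos_l, cos_v, cos_d = n.dot(l), n.dot(v), l.dot(h)
        fd90 = 0.5 + 2.0 * roughness * cos_d ** 2
        def f(cos_theta):                                # Schlick-style weight
            return 1.0 + (fd90 - 1.0) * (1.0 - cos_theta) ** 5
        return (base_color / np.pi) * f(cos_l) * f(cos_v)

    n = np.array([0.0, 0.0, 1.0])
    l = np.array([0.0, 0.6, 0.8])
    v = np.array([0.6, 0.0, 0.8])
    print(disney_diffuse(np.array([0.8, 0.2, 0.1]), 0.5, n, l, v))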

Core: With the upgrade to OpenSubdiv 3.1.x and its features introduced into the UI, users can better match the behavior of mesh subdivision they get in software renderers. Mari’s user preference files are now saved with the application version embedded in the file names — meaning artists can work between different versions of Mari without the danger of corrupting their UI or preferences. Many preferences have had their groups, labels and tooltips modified to be easier to understand. All third-party libraries have been upgraded to match those specified by the VFX Reference Platform 2017.
Mari 4.0 is available now.

Storage in the Studio: Post Houses

By Karen Maierhofer

There are many pieces that go into post production, from conform, color, dubbing and editing to dailies and more. Depending on the project, a post house can be charged with one or two pieces of this complex puzzle, or even the entire workload. No matter the job, the tasks must be done on time and on budget. Unforeseen downtime is unacceptable.

That is why when it comes to choosing a storage solution, post houses are very particular. They need a setup that is secure, reliable and can scale. For them, one size simply does not fit all. They all want a solution that fits their particular needs and the needs of their clients.

Here, we look at three post facilities of various sizes and range of services, and the storage solutions that are a good fit for their business.

Liam Ford

Sim International
The New York City location of Sim has been in existence for over 20 years, operating under the former name of Post Factory NY until about a month ago, when Sim rebranded it and its seven other founding post companies as Sim International. Whether called by its new moniker or its previous one, the facility has grown to become a premier space in the city for offline editorial teams, as well as one of the top high-end finishing studios in town; the list of feature films and episodic shows that have been cut and finished at Sim is quite lengthy. This past year, Sim also launched a boutique commercial finishing division.

According to senior VP of post engineering Liam Ford, the vast majority of the projects at the NYC facility are 4K, much of which is episodic work. “So, the need is for very high-capacity, very high-bandwidth storage,” Ford says. And because the studio is located in New York, where space is limited, that same storage must be as dense as possible.

For its finishing work, Sim New York is using a Quantum Xcellis SAN, a StorNext-based appliance system that can be specifically tuned for 4K media workflow. The system, which was installed approximately two years ago, runs on a 16Gb Fibre Channel network. Almost half a petabyte of storage fits into just a dozen rack units. Meanwhile, an Avid Nexis handles the facility’s offline work.

The Sim SAN serves as the primary playback system for all the editing rooms. While there are SSDs in some of the workstations for caching purposes, the scheduling demands of clients do not leave much time for staging material back and forth between volumes, according to Ford. So, everything gets loaded back to the SAN, and everything is played back from the SAN.

As Ford explains, content comes into the studio from a variety of sources, whether drives, tapes or Internet transfers, and all of that is loaded directly onto the SAN. An online editor then soft-imports all that material into his or her conform application and creates an edited, high-resolution sequence that is rendered back to the SAN. Once at the SAN, that edited sequence is available for a supervised playback session with the in-house colorists, finishing VFX artists and so forth.

“The point is, our SAN is the central hub through which all content at all stages of the finishing process flows,” Ford adds.

Before installing the Xcellis system, the facility had been using local workstation storage only, but the huge growth in the finishing division prompted the transition to the shared SAN file system. “There’s no way we could do the amount of work we now have, and with the flexibility our clients demand, using a local storage workflow,” says Ford.

When it became necessary for the change, there were not a lot of options that met Sim’s demands for high bandwidth and reliable streaming, Ford points out, as Quantum’s StorNext and SGI’s CXFS were the main shared file systems for the M&E space. Sim decided to go with Quantum because of the work the vendor has done in recent years toward improving the M&E experience as well as the ease of installing the new system.

Nevertheless, with the advent of 25Gb and 100Gb Ethernet, Sim has been closely monitoring the high-performance NAS space. “There are a couple of really good options out there right now, and I can see us seriously looking at those products in the near future as, at the very least, an augmentation to our existing Fibre Channel-based storage,” Ford says.

At Sim, editors deal with a significant amount of Camera Raw, DPX and OpenEXR data. “Depending on the project, we could find ourselves needing 1.5GB/sec or more of bandwidth for a single playback session, and that’s just for one show,” says Ford. “We typically have three or four [shows] playing off the SAN at any one time, so the bandwidth needs are huge!”

Master of None

And the editors’ needs continue to evolve, as does their need for storage. “We keep needing more storage, and we need it to be faster and faster. Just when storage technology finally got to the point that doing 10-bit 2K shows was pretty painless, everyone started asking for 16-bit 4K,” Ford points out.
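
Those jumps are easy to put numbers on. The back-of-envelope calculation below is for uncompressed RGB streams; real DPX files add headers and padding, so treat the results as lower bounds.

    # Rough playback bandwidth for an uncompressed RGB image sequence.
    def stream_gb_per_sec(width, height, bits_per_channel, fps, channels=3):
        bytes_per_frame = width * height * channels * bits_per_channel / 8
        return bytes_per_frame * fps / 1e9

    print(stream_gb_per_sec(2048, 1080, 10, 24))  # 10-bit 2K: ~0.2 GB/sec
    print(stream_gb_per_sec(4096, 2160, 16, 24))  # 16-bit 4K: ~1.27 GB/sec

A single 16-bit 4K stream is already in the territory of Ford’s 1.5GB/sec figure, so three or four shows playing concurrently puts the SAN well into multi-gigabyte-per-second demand.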

Recently, Sim completed work on the feature American Made and the Netflix show Master of None, in addition to a number of other episodic projects. For these and other shows, the SAN acts as the central hub around which the color correction, online editing, visual effects and deliverables are created.

“The finishing portion of the post pipeline deals exclusively with the highest-quality content available. It used to be that we’d do our work directly from a film reel on a telecine, but those days are long past,” says Ford. “You simply can’t run an efficient finishing pipeline anymore without a lot of storage.”

DigitalFilm Tree
DigitalFilm Tree (DFT) opened its doors in 1999 and now occupies a 10,000-square-foot space in Universal City, California, offering full round-trip post services, including traditional color grading, conform, dailies and VFX, as well as post system rentals and consulting services.

While Universal City may be DFT’s primary location, it has dozens of remote satellite systems — mini post houses for production companies and studios — around the world. Those remote post systems, along with the increase in camera resolution (Alexa, raw, 4K), have multiplied DFT’s storage needs. Both have resulted in a sea change in the facility’s storage solution.

According to CEO Ramy Katrib, most companies in the media and entertainment industry historically have used block storage, and DFT was no different. But four years ago, the company began looking at object storage, which is used by Silicon Valley companies, like Dropbox and AWS, to store large assets. After significant research, Katrib felt it was a good fit for DFT as well, believing it to be a more economical way to build petabytes of storage, compared to using proprietary block storage.

Ramy Katrib

“We were unique from most of the post houses in that respect,” says Katrib. “We were different from many of the other companies using object storage — they were tech, financial institutions, government agencies, health care; we were the rare one from M&E — but our need for extremely large, scalable and resilient storage was the same as theirs.”

DFT’s primary work centers around scripted television — an industry segment that continues to grow. “We do 15-plus television shows at any given time, and we encourage them to shoot whatever they like, at whatever resolution they desire,” says Katrib. “Most of the industry relies on LTO to back up camera raw materials. We do that too, but we also encourage productions to take advantage of our object storage, and we will store everything they shoot and not punish them for it. It is a rather Utopian workflow. We now give producers access to all their camera raw material. It is extremely effective for our clients.”

Over four years ago, DFT began using OpenStack, open-source software that controls large pools of storage, compute and networking resources, to build and design its own object storage system. “We have our own software developers and people who built our hardware, and we are able to adjust to the needs of our clients and the needs of our own workflow,” says Katrib.

DFT designs its custom PC- and Linux-based post systems, including chassis from Super Micro, CPUs from Intel and graphic cards from Nvidia. Storage is provided from a number of companies, including spinning-disc and SSD solutions from Seagate Technology and Western Digital.

DFT then deploys remote dailies systems worldwide, in proximity to where productions are shooting. Each day clients plug their production hard drives (containing all camera raw files) into DFT’s remote dailies system. From DFT’s facility, dailies technicians remotely produce editorial, viewing and promo dailies files, and transfer them to their destinations worldwide. All the while, the camera raw files are transported from the production location to DFT’s ProStack “massively scalable object storage.” In this case, “private cloud storage” consists of servers DFT designed that house all the camera raw materials, with management from DFT post professionals who support clients with access to and management of their files.
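
OpenStack’s object store component is Swift, and the shape of such a workflow is easy to sketch with the python-swiftclient library. Everything specific below (endpoint, credentials, container and paths) is hypothetical, and this is a simplified illustration rather than DFT’s ProStack code.

    # Sketch: pushing camera raw files into an OpenStack Swift object store.
    # Endpoint, credentials, container and paths are hypothetical.
    from pathlib import Path
    from swiftclient.client import Connection

    conn = Connection(
        authurl="https://storage.example.com/auth/v1.0",
        user="dailies_tech",
        key="not-a-real-key",
    )
    conn.put_container("cameraraw_show01")

    for f in Path("/mnt/production_drive/A001").rglob("*.ari"):
        with open(f, "rb") as fh:
            # Object names keep the card/clip hierarchy for later retrieval.
            conn.put_object("cameraraw_show01",
                            str(f.relative_to("/mnt/production_drive")),
                            contents=fh)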

DFT provides color grading for Great News.

Recently, storage vendors such as Quantum and Avid have begun building and branding their own object storage solutions, not unlike what DFT has constructed at its Universal City locale. And the reason is simple: Object storage provides a clear advantage because of its reliability and low cost. “We looked at it because the storage we were paying for, proprietary block storage, was too expensive to house all the data our clients were generating. And resolutions are only going up. So, every year we needed more storage,” Katrib explains. “We needed a solution that could scale with the practical reality we were living.”

Then, about four years ago when DFT started becoming a software company, one of the developers brought OpenStack to Katrib’s attention. “The open-source platform provided several storage solutions, networking capabilities and cloud compute capabilities for free,” he points out. Of course, the solution is not a panacea, as it requires a company to customize the offering for its own needs and even contribute back to the OpenStack community. But then again, that requirement enables DFT to evolve to the changing needs of its clients without waiting for a manufacturer to do it.

“It does not work out of the box like a solution from IBM, for instance. You have to develop around it,” Katrib says. “You have to have a lab mentality, designing your own hardware and software based on pain points in your own environment. And, sometimes it fails. But when you do it correctly, you realize it is an elegant solution.” However, there are vibrant communities, user groups and tech summits of those leveraging the technology who are willing to assist and collaborate.

DFT has evolved its object storage solution, extending its capabilities from an initial hundreds of terabytes — which is nothing to sneeze at — to hundreds of petabytes of storage. DFT also designs remote post systems and storage solutions for customers in remote locations around the world. And those remote locations can be as simple as a workstation running applications such as Blackmagic’s Resolve or Adobe After Effects and connected to object storage housing all the client’s camera raw material.

The key, Katrib notes, is to have great post and IT pros managing the projects and the system. “I can now place a remote post system with a calibrated 4K monitor and object storage housing the camera raw material, and I can bring the post process to you wherever you are, securely,” he adds. “From wherever you are, you can view the conform, color and effects, and sign off on the final timeline, as if you were at DFT.”

DFT posts American Housewife

In addition to the object storage, DFT is also using Facilis TerraBlock and Avid Nexis systems locally and on remote installs. The company uses those commercial solutions because they provide benefits, including storage performance and feature sets that optimize certain software applications. As Katrib points out, storage is not one flavor fits all, and different solutions work better for certain use cases. In DFT’s case, the commercial storage products provide performance for the playback of multiple 4K streams across the company’s color, VFX and conform departments, while its ProStack high-capacity object storage comes into play for storing the entirety of the files produced by its clients.

“Rather than retrieve files from an LTO tape, as most do when working on a TV series, with object storage, the files are readily available, saving hours in retrieval time,” says Katrib.

Currently, DFT is working on a number of television series, including Great News (color correction only) and Good Behavior (dailies only). For other shows, such as the Roseanne revival, NCIS: Los Angeles, American Housewife and more, it is performing full services such as visual effects, conform, color, dailies and dubbing. And in some instances, even equipment rental.

As the work expands, DFT is looking to extend upon its storage and remote post systems. “We want to have more remote systems where you can do color, conform, VFX, editorial, wherever you are, so the DP or producer can have a monitor in their office and partake in the post process that’s particular to them,” says Katrib. “That is what we are scaling as we speak.”

Broadway Video
Broadway Video is a global media and entertainment company that has been primarily engaged in post-production services for television, film, music, digital and commercial projects for the past four decades. Located in New York and Los Angeles, the facility offers one-stop tools and talent for editorial, audio, design, color grading, finishing and screening, as well as digital file storage, preparation, aggregation and delivery of digital content across multiple platforms.

Since its founding in 1979, Broadway Video has grown into an independent studio. During this timeframe, content has evolved greatly, especially in terms of resolution, to where 4K and HD content — including HDR and Atmos sound — is becoming the norm. “Staying current and dealing with those data speeds are necessary in order to work fluidly on a 4K project at 60p,” says Stacey Foster, president and managing director, Broadway Video Digital and Production. “The data requirements are pretty staggering for throughput and in terms of storage.”

Stacey Foster

This led Broadway Video to begin searching a year ago for a storage system that would meet its needs now as well as in the foreseeable future — in short, a system that is scalable. Their solution: an all-flash Hitachi Vantara Virtual Storage Platform (VSP) G series. Although quite expensive, a flash-based system is “ridiculously powerful,” says Foster. “Technology is always marching forward, and flash-based systems are going to become the norm; they are already the norm at the high end.”

Foster has had a long-standing relationship with Hitachi for more than a decade and has witnessed the company’s growth into M&E from the medical and financial worlds where it has been firmly ensconced. According to Foster, Hitachi’s VSP series will enhance Broadway Video’s 4K offerings and transform internal operations by allowing quick turnaround, efficient and cost-effective production, post production and delivery of television shows and commercials. And, the system offers workload scalability, allowing the company to expand and meet the changing needs of the digital media production industry.

“The systems we had were really not that capable of handling DPX files that were up to 50TB, and Hitachi’s VSP product has been handling them effortlessly,” says Foster. “I don’t think other [storage] manufacturers can say that.”

Foster explains that as Broadway Video continued to expand its support of the latest 4K content and technologies, it became clear that a more robust, optimized storage solution was needed as the company moved in this new direction. “It allows us to look at the future and create a foundation to build our post production and digital distribution services on,” Foster says.

Broadway Video’s work with Netflix sparked the need for a more robust system. Recently, Comedians in Cars Getting Coffee, an Embassy Row production, transitioned to Netflix, and one of the requirements from its new home was the move from 2K to 4K. “It was the perfect reason for us to put together a 4K end-to-end workflow that satisfies this client’s requirements for technical delivery,” Foster points out. “The bottleneck in color and DPX file delivery is completely lifted, and the post staff is able to work quickly and sometimes even faster than in real time when necessary to deliver the final product, with its very large files. And that is a real convenience for them.”

Broadway Video’s Hitachi Vantara Virtual Storage Platform G series.

As a full-service post company, Broadway Video in New York operates 10 production suites of Avids running Adobe Premiere and Blackmagic Resolve, as well as three full mixing suites. “We can have all our workstations simultaneously hit the [storage] system hard and not have the system slow down. That is where Hitachi’s VSP product has set itself apart,” Foster says.

For Comedians in Cars Getting Coffee, like many projects Broadway Video encounters, the cut is done in a lower-resolution Avid file. The 4K media is then imported into the Resolve platform, so it is colored in its original material and format. In terms of storage, once the material is past the cutting stage, it is all stored on the Hitachi system. Once the project is completed, it is handed off on spinning disk for archival, though Foster foresees a limited future for spinning disks due to their inherently limited life span — “anything that spins breaks down,” he adds.

All the suites are fully HD-capable and are tied with shared SAN and ISIS storage; because work on most projects is shared between editing suites, there is little need to use local storage. Currently Broadway Video is still using its previous Avid ISIS products but is slowly transitioning to the Hitachi system only. Foster estimates that at this time next year, the transition will be complete, and the staff will no longer have to support the multiple systems. “The way the systems are set up right now, it’s just easier to cut on ISIS using the Avid workstations. But that will soon change,” he says.

Other advantages the Hitachi system provides are stability and uptime, which Foster maintains is “pretty much 100 percent guaranteed.” As he points out, there is no such thing as downtime in banking and medical, where Hitachi proved its mettle, and bringing that stability to the M&E industry “has been terrific.”

Of course, that is in addition to bandwidth and storage capacity, which is expandable. “There is no limit to the number of petabytes you can have attached,” notes Foster.

Considering that the majority of calls received by Broadway Video center on post work for 4K-based workflows, the new storage solution is a necessary technical addition to the facility’s other state-of-the-art equipment. “In the environment we work in, we spend more and more time on the creative side in terms of the picture cutting and sound mixing, and then it is a rush to get it out the door. If it takes you days to import, color correct, export and deliver — especially with the file sizes we are talking about — then having a fast system with the kind of throughput and bandwidth that is necessary really lifts the burden for the finishing team,” Foster says.

He continues: “The other day, the engineers were telling me we were delivering 20 times faster using the Hitachi technology in the final cutting and coloring of a Jerry Seinfeld stand-up special we had done in 4K,” which resulted in a DPX file of about 50TB. “And that is pretty significant,” Foster adds.

Main Image: DigitalFilm Tree’s senior colorist Patrick Woodard.

Autodesk Flame family updates offer pipeline enhancements

Autodesk has updated its Flame 2018 family of 3D visual effects and finishing software, which includes Flame, Flare, Flame Assist and Lustre. Flame 2018.3 offers more efficient ways of working in post, with feature enhancements that offer greater pipeline flexibility, speed and support for emerging formats and technology.

Flame 2018.3 highlights include:

• Action Selective: Apply FX color to an image surface or the whole action scene via the camera

• Motion Warp Tracking: Organically distort objects that are changing shape, angle and form with new 32-bit motion vector-based tracking technology

• 360-degree VR viewing mode: View LatLong images in a 360-degree VR viewing mode in the Flame player or any viewport during compositing and manipulate the field of view

• HDR waveform monitoring: Set viewport to show luminance waveform; red, green, blue (RGB) parade; color vectorscope or 3D cube; and monitor a range of HDR and wide color gamut (WCG) color spaces including Rec2100 PQ, Rec2020 and DCI P3

• Shotgun Software Loader: Load assets for a shot and build custom batches via Flame’s Python API, and browse a Shotgun project for a filtered view of individual shots (see the sketch after this list)

• User-requested improvements for Action, Batch, Timeline and Media Hub
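
To give a flavor of what the Shotgun integration builds on, here is the kind of query the Shotgun Python API (shotgun_api3) answers. This is a generic illustration with a hypothetical site and credentials, not Autodesk’s Loader code.

    # A filtered view of a project's shots via the Shotgun Python API.
    # Site URL, credentials and project name are hypothetical.
    import shotgun_api3

    sg = shotgun_api3.Shotgun(
        "https://mystudio.shotgunstudio.com",
        script_name="flame_loader",
        api_key="not-a-real-key",
    )
    shots = sg.find(
        "Shot",
        filters=[["project.Project.name", "is", "My Spot"],
                 ["sg_status_list", "is", "ip"]],  # in-progress shots only
        fields=["code", "description"],
    )
    for shot in shots:
        print(shot["code"], shot.get("description"))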

“The new standalone Python console in Flame 2018.3 is a great addition,” says Treehouse Edit finishing artist John Fegan, a Flame family beta tester. “We’re also excited about the enhanced FBX export with physically based renderer (PBR) for Maya and motion analysis updates. Using motion vector maps, we can now achieve things we couldn’t with a planar tracker or 3D track.”

Flame Family 2018.3 is available today at no additional cost to customers with a current Flame Family 2018 subscription.

Winners: IBC2017 Impact Awards

postPerspective has announced the winners of our postPerspective Impact Awards from IBC2017. All winning products reflect the latest version of the product, as shown at IBC.

The postPerspective Impact Award winners from IBC2017 are:

• Adobe for Creative Cloud
• Avid for Avid Nexis Pro
• Colorfront for Transkoder 2017
• Sony Electronics for Venice CineAlta camera

Seeking to recognize debut products and key upgrades with real-world applications, the postPerspective Impact Awards are determined by an anonymous judging body made up of industry pros. The awards honor innovative products and technologies for the post production and production industries that will influence the way people work.

“All four of these technologies are very worthy recipients of our first postPerspective Impact Awards from IBC,” said Randi Altman, postPerspective’s founder and editor-in-chief. “These awards celebrate companies that push the boundaries of technology to produce tools that actually make users’ working lives easier and projects better, and our winners certainly fall into that category. You’ll notice that our awards from IBC span the entire pro pipeline, from acquisition to on-set dailies to editing/compositing to storage.

“As IBC falls later in the year, we are able to see where companies are driving refinements to really elevate workflow and enhance production. So we’ve tapped real-world users to vote for the Impact Awards, and they have determined what could be most impactful to their day-to-day work. We’re very proud of that fact, and it makes our awards quite special.”

IBC2017 took place September 15-19 in Amsterdam. postPerspective Impact Awards are next scheduled to celebrate innovative product and technology launches at the 2018 NAB Show.

Behind the Title: Artist Jayse Hansen

NAME: Jayse Hansen

COMPANY: Jayse Design Group

CAN YOU DESCRIBE YOUR COMPANY?
I specialize in designing and animating completely fake-yet-advanced-looking user interfaces, HUDs (head-up displays) and holograms for film franchises such as The Hunger Games, Star Wars, Iron Man, The Avengers, Guardians of the Galaxy, Spider-Man: Homecoming, Big Hero 6, Ender’s Game and others.

On the side, this has led to developing untraditional, real-world, outside-the-rectangle type UIs, mainly with companies looking to have an edge in efficiency/data-storytelling and to provide a more emotional connection with all things digital.

Iron Man

WHAT’S YOUR JOB TITLE?
Designer/Creative Director

WHAT DOES THAT ENTAIL?
Mainly, I try to help filmmakers (or companies) figure out how to tell stories in quick reads with visual graphics. In a film, we sometimes only have 24 frames (one second) to get information across to the audience. It has to look super complex, but it has to be super clear at the same time. This usually involves working with directors, VFX supervisors, editorial and art directors.

With real-world companies, the way I work is similar. I help figure out what story can be told visually with the massive amount of data we have available to us nowadays. We’re all quickly finding that data is useless without some form of engaging story and a way to quickly ingest, make sense of and act on that data. And, of course, with design-savvy users, a necessary emotional component is that the user interface looks f’n rad.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
A lot of R&D! Movie audiences have become more sophisticated, and they groan if a fake UI seems outlandish, impossible or Playskool cartoon-ish. Directors strive to not insult their audience’s intelligence, so we spend a lot of time talking to experts and studying real UIs in order to ground them in reality while still making them exciting, imaginative and new.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Research, breaking down scripts and being able to fully explore and do things that have never been done before. I love the challenge of mixing strong design principles with storytelling and imagination.

WHAT’S YOUR LEAST FAVORITE?
Paperwork!

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Early morning and late nights. I like to jam on design when everyone else is sleeping.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I actually can’t imagine doing anything else. It’s what I dream about and obsess about day and night. And I have since I was little. So I’m pretty lucky that they pay me well for it!

If I lost my sight, I’d apply for Oculus or Meta brain implants and live in the AR/VR world to keep creating visually.

SO YOU KNEW THIS WAS YOUR PATH EARLY ON?
When I was 10 I learned that they used small models for the big giant ships in Star Wars. Mind blown! Suddenly, it seemed like I could also do that!

As a kid I would pause movies and draw all the graphic parts of films, such as the UIs in the X-wings in Star Wars, or the graphics on the pilot helmets. I never guessed this was actually a “specialty niche” until I met Mark Coleran, an amazing film UI designer who coined the term “FUI” (Fictional User Interface). Once I knew it was someone’s “everyday” job, I didn’t rest until I made it MY everyday job. And it’s been an insanely great adventure ever since.

CAN YOU TALK MORE ABOUT FUI AND WHAT IT MEANS?
FUI stands for Fictional (or Future, Fantasy, Fake) User Interface. UIs have been used in films for a long time to tell an audience many things, such as: their hero can’t do what they need to do (Access Denied) or that something is urgent (Countdown Timer), or they need to get from point A to point B, or a threat is “incoming” (The Map).

Mockingjay Part I

As audiences get more tech-savvy, the potential for screens to act as story devices has developed, and writers and directors have gotten more creative. Now, entire lengths of story are being told through interfaces, such as in The Hunger Games: Mockingjay – Part 1, where Katniss, Peeta, Beetee and President Snow have some of their most tense moments.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The most recent projects I can talk about are Guardians of the Galaxy 2 and Spider-Man: Homecoming, both with the Cantina Creative team and Marvel. For Guardians 2, I had a ton of fun designing and animating various screens, including Rocket, Gamora and Star-Lord’s glass screens and the large “Drone Tactical Situation Display” holograms for the Sovereign (gold people). Spider-Man was my favorite superhero as a child, so I was honored to be asked to define the “Stark-Designed” UI design language of the HUDs, holograms and various AR overlays.

I spent a good amount of time researching the comic book version of Spider-Man. His suit and abilities are actually quite complex, and I ended up writing a 30-plus page guide to all of its functions so I could build out the HUD and blueprint diagrams in a way that made sense to Marvel fans.

In the end, it was a great challenge to blend the combination of the more military Stark HUDs for Iron Man, which I’m very used to designing, and a new, slightly “webby” and somewhat cute “training-wheels” UI that Stark designed for the young Peter Parker. I loved the fact that in the film they played up the humor of a teenager trying to understand the complexities of Stark’s UIs.

Star Wars: The Force Awakens

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I think Star Wars: The Force Awakens is the one I was most proud to be a part of. It was my one bucket list film to work on from childhood, and I got to work with some of the best talents in the business. Not only JJ Abrams and his production team at Bad Robot, but with my longtime industry friends Navarro Parker and Andrew Kramer.

WHAT SOFTWARE DID YOU RELY ON?
As always, we used a ton of Maxon Cinema 4D, Adobe’s After Effects and Illustrator, and Video Copilot’s Element 3D to pull off rather complex and lengthy design sequences, such as the Starkiller Base hologram and the R2-D2/BB-8 “Map to Luke Skywalker” holograms.

Cinema 4D was essential in allowing us to be super creative while still meeting rather insane deadlines. It also integrates so well with the Adobe suite, which allowed us to iterate really quickly when the inevitable last-minute design changes came flying in. I would do initial textures in Adobe Illustrator, then design in C4D, and transfer that into After Effects using the Element 3D plugin. It was a great workflow.

YOU ALSO CREATE VR AND AR CONTENT. CAN YOU TELL US MORE ABOUT THAT?
Yes! Finally, AR and VR are allowing what I’ve been doing for years in film to actually happen in the real world. With a Meta (AR) or Oculus (VR) headset, you can actually walk around your UI like an Iron Man hologram and interact with it like the volumetric UIs we did for Ender’s Game.

For instance, today with Google Earth VR you can use a holographic mapping interface like in The Hunger Games to plan your next vacation. With apps like Medium, Quill, Tilt Brush or Gravity Sketch you can design 3D parts for your robot like Hiro did in Big Hero 6.

Big Hero 6

While wearing a Meta 2, you can surround yourself with multiple monitors of content and pull 3D models from them and enlarge them to life size.

So we have a deluge of new abilities, but most designers have only designed on flat traditional monitors or phone screens. They’re used to the two dimensions of up and down (X and Y), but have never had the opportunity to use the Z axis. So you have all kinds of new challenges like, “What does this added dimension do for my UI? How is it better? Why would I use it? And what does the back of a UI look like when other people are looking at it?”

For instance, in the Iron Man HUD, most of the time I was designing for when the audience is looking at Tony Stark, which is the back of the UI. But I also had to design it from the side. And it all had to look proper, of course, from the front. UI design becomes a bit like product design at this point.

In AR and VR, similar design challenges arise. When we share volumetric UIs, we will see other people’s UIs from the back. At times, we want to be able to understand them, and at other times, they should be disguised, blurred or shrouded for privacy reasons.

How do you design when your UI can take up the whole environment? How can a UI give you important information without distracting you from the world around you? How do you deal with additive displays where black is not a color you can use? And on and on. These are all things we tackle with each film, so we have a bit of a head start in those areas.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I love tech, but it would be fun to be stuck with just a pen, paper and a book… for a while, anyway.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m on Twitter (@jayse_), Instagram (@jayse_) and Pinterest (skyjayse). Aside from that, I also started a new FUI newsletter to discuss some of the behind-the-scenes aspects of this type of work.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Heck yeah. Lately, I find myself working to Chillstep and Deep House playlists on Spotify. But check out The Cocteau Twins. They sing in a “non-language,” and it’s awesome.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I chill with my best friend and fiancée, Chelsea. We have a rooftop wet-bar area with a 360-degree view of Las Vegas from the hills. We try to go up each evening at sunset with our puppy Bella and just chill. Sometimes it’s all fancy-like, with a glass of wine and fruit. Chelsea likes to make it all pretty.

It’s a long way from just 10 years ago, when we were hunting spare change in the car to afford 99-cent nachos from Taco Bell, so we’re super appreciative of how far we’ve come. And because of that, no matter how many times my machine has crashed, or how many changes my client wants, we always make time for just each other. It’s important to keep perspective and realize your work is not life or death, even though in films sometimes they try to make it seem that way.

It’s important to always have something that is only for you and your loved ones that nobody can take away. After all, as long as we’re healthy and alive, life is good!

Review: Blackmagic’s Fusion 9

By David Cox

At Siggraph in August, Blackmagic Design released a new version of its compositing software Fusion. For those not familiar with Fusion, it is a highly flexible node-based compositor that can composite in 2D and 3D spaces. Its closest competitor is Nuke from The Foundry.

The raft of new updates in Version 9 falls into two areas: features created in response to user requests, and a set of tools for VR. Also announced with the new release is a price drop to $299 for the full Studio version, which, judging by global resellers instantly running out of stock (Fusion ships via dongle), seems to have been a popular move!

As with other manufacturers in the film and broadcast area, the term “VR” is a little misused as they are really referring to “360 video.” VR, although a more exciting term, would demand interactivity. That said, as a post production suite for 360 video, Fusion already has a very strong tool set. It can create, manipulate, texture and light 3D scenes made from imported CGI models and built-in primitives and particles.

Added in Version 9 is a spherical camera that can capture a scene as a 360 2D or stereo 3D image. In addition, new tools are provided to cross-convert between many 360 video image formats. Another useful tool allows a portion of a 360-degree image to be unwrapped (or un-distorted) so that restoration or compositing work can be easily carried out on it before it is perfectly re-wrapped back into the 360-degree image.
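
The mapping behind those tools is the standard equirectangular (LatLong) projection, in which each pixel corresponds to a direction on the unit sphere. Below is a simplified mono sketch of that mapping, not Fusion’s implementation; re-wrapping a patch amounts to rendering these rays back through a virtual rectilinear camera.

    # Standard equirectangular (LatLong) mapping: pixel -> view ray.
    import numpy as np

    def latlong_to_rays(width, height):
        u, v = np.meshgrid((np.arange(width) + 0.5) / width,
                           (np.arange(height) + 0.5) / height)
        lon = (u - 0.5) * 2 * np.pi           # -pi..pi across the image
        lat = (0.5 - v) * np.pi               # +pi/2 at top, -pi/2 at bottom
        return np.stack([np.cos(lat) * np.sin(lon),   # x
                         np.sin(lat),                  # y
                         np.cos(lat) * np.cos(lon)],   # z
                        axis=-1)

    rays = latlong_to_rays(4096, 2048)
    print(rays.shape)  # (2048, 4096, 3) unit direction vectors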

There is also a new stabilizer for 360 wrap-around shots. A neat feature is that Fusion 9 can directly drive VR headsets such as Oculus Rift. Within Fusion, any node can be routed to any viewing monitor and the VR headset simply presents itself as an extra one of those.

Notably, Blackmagic has opted not to tackle 360-degree image stitching — the process by which images from multiple cameras facing in different directions are “stitched” together to form a single wrap-around view. I can understand this — on one hand, there are numerous free or cheap apps that perform stitching and so there’s no need for Blackmagic to reinvent that wheel. On the other hand, Blackmagic targets the mass user area, and given that 360 video production is a niche activity, productions that strap together multiple cameras form an even smaller and decreasing niche due to the growing number of single-step 360-degree cameras that provide complete wrap-around images without the need for stitching.

Moving on from VR/360, Fusion 9 now boasts some very significant additional features. While some Fusion users had expressed concern that Blackmagic was favoring Resolve, it is now clear that the Fusion development team has been very busy indeed.

Camera Tracker
First up is an embedded camera tracker and solver. Such a facility aims to deduce how the original camera in a live-action shoot moved through the scene and what lens must have been on it. From this, a camera tracker produces a virtual 3D scene into which a compositor can add objects that then move precisely with the original shot.

Fusion 9’s new camera tracker performed well in tests. It requires the user to break the process down into three logical steps: track, refine and export. Fusion initially offers auto-placed trackers, which follow scores of details in the scene quite quickly. The operator then removes any obviously silly trackers (like the ones chasing around the moving people in a scene) and sets Fusion about the task of “solving” the camera move.

Once done, Fusion presents a number of features to allow the user to measure the accuracy of the resulting track and to locate and remove trackers that are adversely affecting that result. This is a circular process by which the user can incrementally improve the track. The final track is then converted into a 3D scene with a virtual camera and a point cloud to show where the trackers would exist in 3D space. A ground plane is also provided, which the user can locate during the tracking process.

While Fusion 9’s camera tracker perhaps doesn’t have all the features of a dedicated 3D tracker such as SynthEyes from Andersson Technologies, it does satisfy the core need and has plenty of controls to ensure that the tool is flexible enough to deal with most scenarios. It will certainly be received as a welcome addition.

Planar Tracker
Next up is a built-in “planar” tracker. Planar trackers work differently than classic point trackers, which simply try to follow a small area of detail. A planar tracker follows a larger area of a shot, which makes up a flat plane — such as a wall or table top. From this, the planar tracker can deduce rotation, location, scale and perspective.

Fusion 9 Studio’s new planar tracker also performed well in tests. It assessed the track quickly and was not easily upset by foreground objects obscuring parts of the tracked area. The resulting track can either be used directly to insert another image into the resulting plane or to stabilize the shot, or indirectly by producing a separate Planar Transform node. This is used to warp any other asset such as a matte for rotoscoping work.
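
What such a tracker solves for each frame is a homography: the 3x3 plane-to-plane transform that encodes exactly those rotation, position, scale and perspective changes. The sketch below uses OpenCV to show the idea; the corner points are made up, and this is not Fusion’s internal code.

    # Per-frame planar track expressed as a homography, via OpenCV.
    import cv2
    import numpy as np

    # Four corners of a tracked plane in the reference frame...
    src = np.float32([[100, 100], [400, 110], [410, 300], [95, 290]])
    # ...and where the tracker found them in the current frame.
    dst = np.float32([[120, 130], [415, 125], [430, 330], [110, 315]])

    H, _ = cv2.findHomography(src, dst)

    # Drive a roto shape with the track instead of keyframing it by hand,
    # analogous to warping a matte with a Planar Transform node.
    roto = np.float32([[[150, 150]], [[350, 160]], [[340, 260]], [[160, 250]]])
    moved = cv2.perspectiveTransform(roto, H)
    print(moved.reshape(-1, 2))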

Inevitably, any planar tracker will be compared to the long-established “daddy” of them all, Mocha Pro from Boris FX. At a basic level, Fusion’s planar tracker worked just as well as Mocha, creating solid tracks from a user-defined area nicely and quickly. However, I would think that for complex rotoscoping, where a user will have many roto layers, driven by many tracking sources, with other layers acting as occlusion masks, Mocha’s working environment would be easier to control. Such a task would lead to many, many wired-up nodes in Fusion, whereas Mocha would present the same functions within a simpler layer list. Of course, Mocha Pro is available as an OFX plug-in for Fusion Studio anyway, so users can have the best of both worlds.

Delta Keyer
Blackmagic also added a new keyer to Fusion called the Delta Keyer. It is a color difference keyer with a wide range of controls to refine the resulting matte and the edges of the key. It worked well when tested against one of my horrible greenscreens, something I keep for these very occasions!

The Delta Keyer can also take a clean plate as a reference input, which is essentially a frame of the green/bluescreen studio without the object to be keyed. The Delta Keyer then uses this to understand which deviations from the screen color represent the foreground object and which are just part of an uneven screen color.

To assist with this process, there is also a new Clean Plate node, which is designed to create an estimate of a clean plate in the absence of one being available from the shoot (for example, if the camera was moving). The combination of the clean plate and the Delta Keyer produced good results when challenged to extract subtle object shadows from an unevenly lit greenscreen shot.
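
Underneath keyers of this family is the classic green/blue color-difference matte, with the clean plate acting as a per-pixel reference for how “green” the screen actually is at each point. A simplified numpy illustration follows; it is not the Delta Keyer’s actual math.

    # Simplified color-difference matte with a clean-plate reference.
    # img, clean: float RGB arrays in [0, 1], shape (H, W, 3).
    import numpy as np

    def color_difference_matte(img, clean, eps=1e-4):
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        cr, cg, cb = clean[..., 0], clean[..., 1], clean[..., 2]
        spill = g - np.maximum(r, b)                    # strong on greenscreen
        ref = np.maximum(cg - np.maximum(cr, cb), eps)  # screen strength per pixel
        return np.clip(1.0 - spill / ref, 0.0, 1.0)     # 1 = foreground, 0 = screen

Dividing by the clean plate’s own color difference is what lets an unevenly lit screen key consistently, rather than forcing one global threshold across the frame.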

Studio Player
Studio Player is also new for Fusion 9 Studio; it’s a multi-station shot review tool. Multiple versions of clips and comps can be added to the Studio Player’s single layer timeline, where simple color adjustments and notes can be added. A neat feature is that multiple studio players in different locations can be slaved together so that cross-facility review sessions can take place, with everyone looking at the same thing at the same time, which helps!

Fusion 9 Studio also supports the writing of Apple-approved ProRes from all its supported platforms, including Windows and Linux. Yep — you read that right. Other format support has also been widened and improved, such as faster native handling for DNxHR codecs.

Summing Up
All in all, the updates to Fusion 9 are comprehensive and very much in line with what professional users have been asking for. I think it certainly demonstrates that Blackmagic is as committed to Fusion as Resolve, and at $299, it’s a no-brainer for any professional VFX artist to have available to them.

Of course, the price drop shows that Blackmagic is also aiming Fusion squarely at the mass independent filmmaker market. Certainly, with Resolve and Fusion, those users will have pretty much all the post tools they will need.

Fusion by its nature and heritage is a more complex beast to learn than Resolve, but it is well supported with a good user manual, forums and video tutorials. I would think it likely that for this market, Fusion might benefit from some minor tweaks to make it more intuitive in certain areas. I also think the join between Resolve and Fusion will provide a lot of interest going forward for this market. Adobe has done a masterful job bridging Premiere and After Effects. The join between Resolve and Fusion is more rudimentary, but if Blackmagic gets this right, they will have a killer combination.

Finally, Fusion 9 extends what was already a very powerful and comprehensive compositing suite. It has become my primary compositing device and the additions in version 9 only serve to cement that position.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Blackmagic’s Fusion 9 is now VR-enabled

At SIGGRAPH, Blackmagic was showing Fusion 9, its newly upgraded visual effects, compositing, 3D and motion graphics software. Fusion 9 features new VR tools, an entirely new keyer technology, planar tracking, camera tracking, multi-user collaboration tools and more.

Fusion 9 is available now at a new price point — Blackmagic has lowered the price of the Studio version from $995 to $299. (Blackmagic is also offering a free version of Fusion.) The software now works on Mac, PC and Linux.

Those working in VR get a full 360º true 3D workspace, along with a new panoramic viewer and support for popular VR headsets such as Oculus Rift and HTC Vive. Working in VR with Fusion is completely interactive. GPU acceleration makes it extremely fast so customers can wear a headset and interact with elements in a VR scene in realtime. Fusion 9 also supports stereoscopic VR. In addition, the new 360º spherical camera renders out complete VR scenes, all in a single pass and without the need for complex camera rigs.

The new planar tracker in Fusion 9 calculates motion planes for accurately compositing elements onto moving objects in a scene. For example, the new planar tracker can be used to replace signs or other flat objects as they move through a scene. Planar tracking data can also be used on rotoscope shapes. That means users don’t have to manually animate motion, perspective, position, scale or rotation of rotoscoped elements as the image changes.

Fusion 9 also features an entirely new camera tracker that analyzes the motion of a live-action camera in a scene and reconstructs the identical motion path in 3D space for use with cameras inside of Fusion. This lets users composite elements with precisely matched movement and perspective of the original. Fusion can also use lens metadata for proper framing, focal length and more.

The software’s new delta keyer features a complete set of matte finesse controls for creating clean keys while preserving fine image detail. There’s also a new clean plate tool that can smooth out subtle color variations on blue- and greenscreens in live action footage, making them easier to key.

For multi-user collaboration, Fusion 9 Studio includes Studio Player, a new app that features a playlist, storyboard and timeline for playing back shots. Studio Player can track version history, display annotation notes, and has support for LUTs and more. The new Studio Player is suited for customers who need to see shots in a suite or theater for review and approval. Remote synchronization lets artists sync Studio Players in multiple locations.

In addition, Fusion 9 features a bin server so shared assets and tools don’t have to be copied onto each user’s local workstation.

Foundry’s Nuke and Hiero 11.0 now available

Foundry has made available Nuke and Hiero 11.0, the next major release for the Nuke line of products, including Nuke, NukeX, Nuke Studio, Hiero and HieroPlayer. The Nuke family is being updated to VFX Platform 2017, which includes several major updates to key libraries used within Nuke, including Python, Pyside and Qt.

The update also introduces a new type of group node, which offers a powerful new collaborative workflow for sharing work among artists. Live Groups referenced in other scripts automatically update when a script is loaded, without the need to render intermediate stages.

Nuke Studio’s intelligent background rendering is now available in Nuke and NukeX. The Frame Server takes advantage of available resources on your local machine, enabling you to continue working while rendering happens in the background. The LensDistortion node has been completely revamped, with added support for fisheye and wide-angle lenses and the ability to use multiple frames to produce better results. Nuke Studio now has new GPU-accelerated disk caching that allows users to cache part or all of a sequence to disk for smoother playback of more complex sequences.

Quick Chat: Filmmaker/DP/VFX artist Mihran Stepanyan

Veteran Armenian artist Mihran Stepanyan has an interesting background. In addition to being a filmmaker and cinematographer, he is also a colorist and visual effects artist. In fact, he won the 2017 Flame Award, which was presented to him during NAB in April.

Let’s find out how his path led to this interesting mix of expertise.

Tell us about your background in VFX.
I studied feature film directing in Armenia from 1997 through 2002. During the process, I also became very interested in being a director of photography. As a self-taught DP, I was shooting all my work, as well as films produced by my classmates and colleagues. This was great experience. Nearly 10 years ago, I started to study VFX because I had some projects that I wanted to do myself. I’ve fallen in love with that world. Some years ago, I started to work in Moscow as a DP and VFX artist for a Comedy Club Production special project. Today, I not only work as a VFX artist but also as a director and cinematographer.

How do your experiences as a VFX artist inform your decisions as a director and cinematographer?
They are closely connected. As a director, you imagine something that you want to see in the end, and you can realize that because you know what you can achieve in production and post. And, as a cinematographer, you know that if problems arise during the shoot, you can correct them in VFX and post. Experience in cinematography also complements VFX artistry, because your understanding of the physics of light and optics helps you create more realistic visuals.

What do you love most about your job?
The infinity of mind, fantasy and feelings. Also, I love how creative teams work. When a project starts, it’s fun to see how the different team members interact with one another and approach various challenges, ultimately coming together to complete the job. The result of that collective teamwork is interesting as well.

Tell us about some recent projects you’ve worked on.
I’ve worked on Half Moon Bay, If Only Everyone, Carpenter Expecting a Son and Doktor. I also recently worked on a tutorial for FXPHD that’s different from anything I’ve ever done before. It was not only work as an Autodesk Flame artist and lecturer, but also a chance to practice English, as my first language is Armenian.

Mihran’s Flame tutorial on FXPHD.

Where do you get your inspiration?
First, nature. There is nothing more perfect to me. And, I’m a picturalist, so for various projects I can find inspiration in any kind of art, from cave paintings to pictorial art and music. I’m also inspired by other artists’ work, which helps me stay current with the latest VFX developments.

If you had to choose the project that you’re most proud of in your career, what would it be, and why?
I think every artist’s favorite project is his/her last project, or the one he/she is working on right now. The emotions, feelings and ideas are very fresh and close at the moment. Still, there are always some projects that stand out more than others. For me, it’s the film Half Moon Bay. I was the DP, post production supervisor and senior VFX artist for the project.

What is your typical end-to-end workflow for a project?
It differs on each project. On some projects, I do everything from story writing to directing and digital intermediate (DI) finishing. For others, I only do editing or color grading.

How did you come to learn Flame?
During my work in Moscow, nearly five years ago, I had the chance to get a closer look at Flame and work with it. I’m a self-taught Flame artist, and since I started using the product it’s become my favorite. Now, I’m back in Armenia working on some feature films and upcoming commercials. I am also a member of the Flame and Autodesk Maya beta-testing groups.

How did you teach yourself Flame? What resources did you use?
When I started to learn Flame, there weren’t as many resources and tutorials as we have now. It was really difficult to find training documentation online. In some cases, I got information from YouTube, NAB or IBC presentations. I learned mostly by experimentation, and a lot of trial and error. I continue to learn and experiment with Flame every time I work.

Any tips for using the product?
As for tips, “knowing” the software is not about understanding the tools or shortcuts, but what you can do with your imagination. You should always experiment to find the shortest and easiest way to get the end result. Also, imagine ahead of time how you can construct your schematic without using unnecessary nodes and tools. Exploring Flame is like mixing the colors on a palette in painting to get the perfect tone. In the same way, you must imagine what tools you can “mix” together to get the result you want.

Any advice for other artists?
I would advise that you not be afraid of any task or goals, nor fear change. That will make you a more flexible artist who can adapt to every project you work on.

What’s next for you?
I don’t really know what’s next, but I am sure that it is a new beginning for me, and I am very interested to see where this all takes me tomorrow.

Nutmeg and Nickelodeon team up to remix classic SpongeBob songs

New York creative studio Nutmeg Creative was called on by Nickelodeon to create trippy music-video-style remixes of some classic SpongeBob SquarePants songs for the kids network’s YouTube channel. Catchy, sing-along kids’ songs have been an integral part of SpongeBob since its debut in 1999.

Though there are dozens of unofficial fan remixes on YouTube, Nickelodeon frequently turns to Nutmeg for official remixes: vastly reimagined versions accompanied by trippy, trance-inducing visuals that inevitably go viral. It all starts with the music, and the music is inspired by the show.

Infused with the manic energy of classic Warner Bros. Looney Tunes, SpongeBob is simultaneously slapstick and surreal, with an upbeat vibe that has attracted a cult-like following from the get-go. Now in its 10th season, SpongeBob attracts fans that span two generations: kids who grew up watching SpongeBob now have kids of their own.

The show’s sensibility and multi-generational audience inform the approach of Nutmeg sound designer, mixer and composer JD McMillin, whose remixes of three popular vintage SpongeBob songs have become viral hits: Krusty Krab Pizza and Ripped My Pants from 1999, and The Campfire Song Song (yes, that’s correct) from 2004. With musical styles ranging from reggae, hip-hop and trap/EDM to stadium rock, drum and bass and even Brazilian dance, McMillin’s remixes expand the appeal of the originals with ear candy for whole new audiences. That’s why, when Nickelodeon provides a song to Nutmeg, McMillin is given free rein to remix it.

“No one from Nick is sitting in my studio babysitting,” he says. “They could, but they don’t. They know that if they let me do my thing they will get something great.”

“Nickelodeon gives us a lot of creative freedom,” says executive producer Mike Greaney. “The creative briefs are, in a word, brief. There are some parameters, of course, but, ultimately, they give us a track and ask us to make something new and cool out of it.”

All three remixes have collectively racked up hundreds of thousands of views on YouTube, with The Campfire Song Song remix generating 655K views in less than 24 hours on the SpongeBob Facebook page.

McMillin credits the success to the fact that Nutmeg serves as a creative collaborative force: what he delivers is more reinvention than remix.

“We’re not just mixing stuff,” he says. “We’re making stuff.”

Once Nick signs off on the audio, that approach continues with the editorial. Editors Liz Burton, Brian Donnelly and Drew Hankins each bring their own unique style and sensibility, with graphic effects designer Stephen C. Walsh adding the finishing touches.

But Greaney isn’t always content with cut, shaken and stirred clips from the show, going the extra mile to deliver something unexpected. Case in point: he recently donned a pair of red track pants and high-kicked in front of a greenscreen to add a suitably outrageous element to the Ripped My Pants remix.

In terms of tools for the audio work, Nutmeg used Ableton Live, Native Instruments Maschine and Avid Pro Tools. For editorial, they called on Avid Media Composer, Sapphire and Boris FX. Graphics were created in Adobe After Effects and Mocha Pro.

Latest Autodesk Flame family updates and more

Autodesk was at NAB talking up new versions of its tools for media and entertainment, including the Autodesk Flame Family 2018 Update 1 for VFX, the Arnold 5.0 renderer, Maya 2017 Update 3 for 3D animation, performance updates for Shotgun production tracking and review software and 3ds Max 2018 software for 3D modeling.

The Autodesk Flame 2018 Update 1 includes new Action and Batch Paint improvements, such as 16-bit floating point (FP) depth support, plus scene detect and conform enhancements.

The Autodesk Maya 2017 Update 3 includes enhancements to character creation tools, such as interactive grooming with XGen, an all-new UV workflow, and updates to the motion graphics toolset, including a live link with Adobe After Effects and more.

Arnold 5.0 offers several updates, including better sampling; new standard surface, standard hair and standard volume shaders; Open Shading Language (OSL) support; light path expressions; a refactored shading API; and a VR camera.

Shotgun updates accelerate multi-region performance and make media uploads and downloads faster regardless of location.

Autodesk 3ds Max 2018 offers Arnold 5.0 rendering via a new MAXtoA 1.0 plug-in, customizable workspaces, smart asset creation tools, Bézier motion path animation, and a cloud-based large model viewer (LMV) that integrates with Autodesk Forge.

The Flame Family 2018 Update 1, Maya 2017 Update 3 and 3ds Max 2018 are all available now via Autodesk e-stores and Autodesk resellers. Arnold 5.0 and Shotgun are both available via their respective websites.

Exceptional Minds: Autistic students learn VFX, work on major feature films

After graduation, these artists have been working on projects for Marvel, Disney, Fox and HBO.

By Randi Altman

With an estimated 1 in 68 children in the US being born with some sort of autism spectrum disorder, according to the Centers for Disease Control’s Autism and Developmental Disabilities Monitoring Network, I think it’s fair to say that most people have been touched in some way by a child on the spectrum.

As a parent of a teenager with autism, I can attest to the fact that one of our biggest worries, the thing that keeps us up at night, is the question of independence. Will he be able to make a living? Will there be an employer who can see beyond his deficits to his gifts and exploit those gifts in the best possible way?

Enter Exceptional Minds, a school in Los Angeles that teaches young adults with autism how to create visual effects and animation while working as part of a team. This program recognizes how bright these young people are and how focused they can be, surrounds them with the right teachers and behavioral therapists, puts the right tools in their hands and lets them fly.

The school, which also has a VFX and animation studio that employs its graduates, was started in 2011 by a group of parents who have children on the spectrum. “They were looking for work opportunities for their kids, and quickly discovered they couldn’t find any. So they decided to start Exceptional Minds and prepare them for careers in animation and visual effects,” explains Susan Zwerman, the studio executive producer at Exceptional Minds and a long-time VFX producer whose credits include Broken Arrow, Alien Resurrection, Men of Honor, Around the World in 80 Days and The Guardian.

Since the program began, these young people have had the opportunity to work on some very high-profile films and TV programs. Recent credits include Game of Thrones, The Fate of the Furious and Doctor Strange, which was nominated for an Oscar for visual effects this year.

We reached out to Zwerman to find out more about this school, its studio and how they help young people with autism find a path to independence.

The school came first and then the studio?
Yes. We started training them for visual effects and animation and then the conversation turned to, “What do they do when they graduate?” That led to the idea to start a visual effects studio. I came on board two years ago to organize and set it up. It’s located downstairs from the school.

How do you pick who is suitable for the program?
We can only take 10 students each year, and unfortunately, there is a waiting list because we are the only program of its kind anywhere. Our educators and teachers have a review process for assessing each student’s ability to work in this area. You know, not everybody can function working on a computer for six or eight hours. There are different levels of the spectrum, so the higher-functioning and medium-functioning students are more suited to this work, which takes a lot of focus.

Students are vetted by our teachers and behavioral specialists, who take into account the student’s ability, as well as their enthusiasm for visual effects and animation — it’s very intense, and they have to be motivated.

Susie Zwerman (in back row, red hair) with artists in the Exceptional Minds studio.

I know that kids on the spectrum aren’t necessarily social butterflies, how do you teach them to work as a team?
Oh, that’s a really good question. We have what’s called our Work Readiness program. They practice interviewing, they practice working as a team, and they learn about appearance, attitude, organization and how to problem-solve in a workplace.

A lot of it is about working in a team and developing their social skills. That’s something we really stress in the behavioral curriculum.

Can you describe how the school works?
It’s a three-year program. In the first year, they learn about the principles of design and using programs like Adobe’s Flash and Photoshop. In Flash, they study 2D animation and in Photoshop they learn how to do backgrounds for their animation work.

During year two, they learn how to work in a production pipeline. They are given a project that the class works on together, and then they learn how to edit using Adobe Premiere Pro and how to composite in Adobe After Effects.

In the third year, they develop their skills in 3D via Autodesk Maya and compositing with The Foundry’s Nuke. They learn the way we work in the studio and our pipeline, as well as prepare their portfolios for the workplace. At the end of three years, each student completes their training with a demo reel and a resume of their work.

Who helps with the reels and resumes?
Their teachers supervise that process and help them with editing and picking the best pieces for their reel. Having a reel is important for many reasons. While many students will work in our studio for a year after graduation, I was able to place some directly into the work environment because their talent was so good… and their reel was so good.

What is the transition like from school to studio?
They graduate in June and we transition many of them to the studio, where they learn about deadlines and get paid for their work. Here, many experience independence for the first time. We do a lot of 2D-type visual effects clean-up work. We give them shots to work on and test them for the first month to see how they are doing. That’s when we decide if they need more training.

The visual effects side of the studio deals with paint work, wire and rod removal and tracker or marker removals — simple composites — plus a lot of rotoscoping and some greenscreen keying. We also do end title credits for the major movies.

We just opened the animation side of the studio in 2016, so it’s still in the beginning stages, but we’re doing 2D animation. We are not a 3D studio… yet! The 2D work we’ve done includes music videos, websites, PowerPoint presentations and some work for the LA Zoo. We are gearing up for major projects.

How many work in the studio?
Right now, we have about 15 artists at workstations in our current studio. Some of them will be placed at outside studios, and part of our strategic planning for the future is figuring out how much expansion we want to do over the next five years.

Thanks to your VFX background, you have many existing relationships with the major studios. Can you talk about how that has benefitted Exceptional Minds?
We have had so much support from the studios; they really want to help us get work for the artists. We started out with Fox, then Disney and then HBO for television. Marvel Studios is one of our biggest fans. Marvel’s Victoria Alonso is a big supporter, so much so that we gave her our Ed Asner Award last June.

Once we started to do tracker marker removal and end title credits for Marvel, it opened doors. People say, “Well, if you work for Marvel, you could work for us.” She has been instrumental in our success.

What were the Fox and Marvel projects?
Our very first client was Fox and we did tracker removals for Dawn of the Planet of the Apes — that was about three years ago. Marvel happened about two years ago and our first job for them was on Avengers: Age of Ultron.

What are some of the other projects Exceptional Minds has worked on?
We worked on Doctor Strange, providing tracker marker removals and end credits. We worked on Ant-Man, Captain America: Civil War, Pete’s Dragon, Alvin & the Chipmunks: The Road Chip and X-Men: Apocalypse.

Thanks to HBO’s Holly Schiffer we did a lot of Game of Thrones work. She has also been a huge supporter of ours.

It’s remarkable how far you guys have come in a short amount of time. Can you talk about how you ended up at Exceptional Minds?
I used to be a DGA production manager/location manager and then segued into visual effects as a freelance VFX producer for all the major studios. About three years ago, my best friend Yudi Bennett, who is one of the founders of Exceptional Minds, convinced me to leave my career and come here to help set up the studio. I was also tasked with producing, scheduling and budgeting work coming into the studio. For me, personally, this has been a spiritual journey. I have had such a good career in the industry, and this is my way of giving back.

So some of these kids move on to other places?
After they have worked in the studio for about a year, or sometimes longer, I look to have them placed at an outside studio. Some of them will stay here at our studio because they may not have the social skills to work on the outside.

Five graduates have been placed so far, and they are working full time at various production studios and visual effects facilities in Los Angeles. We have also had graduates in internships at Cartoon Network and Nickelodeon.

One student is at Marvel, and others are at Stargate Studios, Mr. Wolf and New Edit. To be able to place our artists on the outside is our ultimate goal. We love to place them because it’s sort of life-changing. For example, one of the first students we placed, Kevin, is at Stargate. He moved out of his parents’ apartment, he is traveling by himself to and from the studio, he is getting raises and he is moving up as a rotoscope artist.

What is the tuition like?
Students pay about 50 percent and we fundraise the other 50 percent. We also have scholarships for those who can’t afford it. We have to raise a lot of money to support the efforts of the school and studio.

Do companies donate gear?
When we first started, Adobe donated software. That’s how we were able to fund the school before the studio was up and running. Now we’re on an educational plan with them where we pay the minimum. Autodesk and The Foundry also give us discounts or try to donate licenses to us. In terms of hardware, we have been working with Melrose Mac, who is giving us discounts on computers for the school and studio.

Check out the Exceptional Minds website for more info.

Aardman creates short film, struts its stuff

By Randi Altman

All creative studios strive for creative ways to show off their talent and offerings, and London-based Aardman is no exception. Famous for its stop-motion animation work (remember the Wallace and Gromit films?), this studio now provides so much more, including live-action, CG, 2D animation and character creation.

Danny Capozzi

To help hammer home all of its offerings, and in hopes of breaking that stop-motion stereotype, Aardman has created a satirical short film called Visualize This. It depicts a conference call between a production company and an advertising agency, giving the studio the ability to show off the range of solutions it can provide for clients. Each time the fictional client suggests something, that visual pops up on the screen, whether it’s adding graffiti to a snail’s shell, texturing type or making a giant monster out of CG cardboard boxes.

We reached out to Aardman’s Danny Capozzi, who directed the short, to find out more about this project and the studio in general.

How did the idea for this short come about?
I felt that the idea of making a film based on a conference call was something that would resonate with a lot of people in any creative industry. The continuous spitballing of ideas and suggestions made a great platform to demonstrate a lot of different styles that I and Aardman can produce. Aardman is well known for its high level of stop-motion/Claymation work, but we do CGI, live action and 2D just as well. We also create brand new ways of animating by combining styles and techniques.

Why was now the right time to do this?
I think we are living in a time of uncertainty, and this film really expresses that. We do a lot of procrastinating. We have the luxury to change our minds, our tastes and our styles every two minutes. With so much choice of everything at our fingertips we can no longer make quick decisions and stick to them. There’s always that sense of “I love this… it’s perfect, but what if there’s something better?” I think Visualize This sums it up.

You guys work with agencies and directly with brands — how would you break that up percentage wise?
The large majority of our advertising work still comes through agencies, although we are increasingly doing one-off projects for clients who seek us out for our storytelling and characters. It’s hard to give a percentage on it because the one-offs vary so much in size that they can skew the numbers and give the wrong impression. More often than not, they aren’t advertising projects either and tend to fall into the realm of short films for organizations, which can be either charities, museums or visitor attractions, or even mass participation arts projects and events.

Can you talk about making the short? Your workflow?
When I first pitched the idea to our executive producer Heather Wright, she immediately loved it. After a bit of tweaking on the script and the pace of the dialogue, we went into production. The film was achieved during some downtime from commercial productions and took about 14 weeks on and off over several months.

What tools did you call on?
We used a large variety of techniques: CGI, stop-motion, 2D, live action, timelapse photography and greenscreen. Compositing and CG were done via Maya, Houdini and Nuke, and we used HDRIs (high dynamic range images). We also used Adobe’s After Effects, Premiere, Photoshop and Illustrator, along with clay sculpting, model making and blood, sweat and, of course, some tears.

What was the most complicated shot?
The glossy black oil shot. This could have been done in CGI with a very good team of modelers and lighters and compositors, but I wanted to achieve this in-camera.

Firstly, I secretly stole some of my son Vinny’s toys away to Aardman’s model-making workshop and spray painted them black. Sorry Vinny! I hot glued the black toys onto a black board (huge mistake!), you’ll see why later. Then I cleared Asda out of cheap cooking oil — 72 litres of the greasy stuff. I mixed it with black oil paint and poured it into a casket.

We then rigged the board of toys to a motion control rig. This would act as the winch to raise the toys out of the black oily soup. Another motion control was rigged to do the panning shot with the camera attached to it. This way we get a nice up and across motion in-camera.

We lowered the board of toys into the black soup and the cables that held it up sagged and released the board of toys. Noooooo! I watched them sink. Then to add insult to injury, the hot glue gave way and the toys floated up. How do you glue something to an oily surface?? You don’t! You use screws. After much tinkering it was ready to be submerged again. After a couple of passes, it worked. I just love the way the natural glossy highlights move over the objects. All well worth doing in-camera for real, and so much more rewarding.

What sort of response has it received?
I’m delighted. It has really travelled since we launched a couple of weeks ago, and it’s fantastic to keep seeing it pop up in my news feed on various social media sites! I think we are at over 20,000 YouTube views and 40,000-odd views on Facebook.

Behind the Title: Director/Designer Ash Thorp

NAME: Ash Thorp (@ashthorp)

COMPANY: ALT Creative, Inc.

CAN YOU DESCRIBE YOUR COMPANY?
ALT Creative is co-owned by my wife Monica and myself. She helps coordinate and handle the company operations, while I manage the creative needs of clients. We work with a select list of outside contractors as needed, mainly depending on the size and scale of the project.

WHAT’S YOUR JOB TITLE?
I fulfill many roles, but if I had to summarize, I would say I am most commonly hired as a director or designer.

WHAT DOES THAT ENTAIL?
Directing is about facilitating the team to achieve the best outcome on a given project. My ability to communicate with and engage my team toward a visionary goal is my top priority as a director. As a designer, I look at my role as an individual problem solver. My goal is to find the root of what is needed or requested and solve it using design as a mental process of solution.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I believe that directing is more about communication than about how well you can design, so many would be surprised by the amount of time and energy needed outside of “creative” tasks, such as emails, critiques, listening, observation and deep analysis.

WHAT’S YOUR FAVORITE PART OF THE JOB?
As a director, I love the freedom to expose the ideas in my mind to others and work closely with them to bring them to life. It’s immensely liberating and rewarding.

WHAT’S YOUR LEAST FAVORITE?
Redundancy often eats up my ambitions. Explaining my vision repeatedly to numerous teammates and partners can be taxing at times.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
The late evening because that is often when I have my mind to myself and am free of outside world distractions and noise.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Nothing. I strongly believe that this is what I was put on earth to do. This is the path I have been designed and focused on since I was a child.

SO YOU KNEW EARLY ON THIS WOULD BE YOUR PATH?
I grew up in a very artistic family; my mother’s side of the family displays creative traits in one medium or another. They were, and still are, deeply committed to supporting me in my creative endeavors. Based on my upbringing, it was a natural progression for me to also be a creative person.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
As for client projects that are publicly released, I most recently worked on the Assassin’s Creed feature film and Call of Duty: Infinite Warfare video game.

For my own projects, I designed and co-directed a concept short for Lost Boy with Anthony Scott Burns. In addition, I released two personal projects: None is a short expression film devised to capture the tone and mood of finding oneself in a city of darkness, and Epoch is an 11-minute space odyssey that merges my deep love of space and design.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
With Epoch being the most recently released project, I have received so many kind and congratulatory correspondences from viewers about how much they love the film. I am very proud of all the hard work and internal thought, development and personal growth it took to build this project with Chris Bjerre. I believe Epoch shows who I truly am, and I consider it one of the best projects of my personal career to date.

WHAT SOFTWARE DID YOU RELY ON FOR EPOCH?
We used a pretty wide spectrum of tools. Our general production tool kit comprised Adobe Photoshop for images and stills, texture building and 2D image editing; Adobe Bridge for reviewing frames and keeping a clear vision of the project; Adobe Premiere for editing everything from the beginning animatic to the final film; and, of course, our main staple in 3D was Maxon Cinema 4D, which we used to construct all of the final scenes and render everything using Octane Render.

We used Cinema 4D for everything — from building shots for the rough animatic to compiling entire scenes and shots for final render. We used it to animate the planets, moons, orbits, lights and the Vessel. It really is a rock-solid piece of software; I couldn’t imagine trying to build a film like Epoch without it. It allowed us to capture the animations, look, lighting and shots seamlessly from the project’s inception.

WHAT WAS YOUR INSPIRATION FOR THIS WORK?
I am personally inspired by so many things. Epoch was a personal tribute to Stanley Kubrick’s 2001: A Space Odyssey, Alien, Carl Sagan, my love of space and space travel, classical sci-fi art and literature, and my personal love of graphic design all combined into one. We put tremendous effort into Epoch to pay proper homage to these things, yet also invite a new audience to experience something uniquely new. We hope you all enjoyed it!

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The Internet, computers and physical traveling devices (like cars, planes).

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I try to limit my time spent on social media, but I have two Facebook accounts, plus Instagram, Twitter and a Behance account.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I frequently listen to music while I work, as it helps me fall into a deeply focused state of mind. The type of music varies, as some genres work better than others; they trigger different emotions for different tasks. When I am in deep thought, I listen to composers whose work has no lyrics to pull my mind’s focus away. When I am doing ordinary tasks or busy work, I listen to anything from heavy metal to drum and bass. The range of music really varies for me, as it’s also often based on my current mood. Music is a big part of my workday and my life.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I actually let the stress in and let it shape my decision making. I feel if I run away from it or unwind my mind, it takes double the effort to go back in to work. I embrace it as being a part of the high consumption industry in which I have chosen to work. It’s not always ideal and is often very demanding, but I often let it be the spark of the fire of my work.

An image scientist weighs in about this year’s SciTech winners

While this year’s Oscar broadcast was unforgettable due to the mix-up in naming the Best Picture, many in the industry also remember actors Leslie Mann and John Cho joking about how no one understands what the SciTech Awards are about. Well, Shed’s SVP of imaging science, Matthew Tomlinson, was kind enough to answer some questions about the newest round of winners and what the technology means to the industry.

As an image scientist, what was the most exciting thing about this year’s Oscars’ Scientific and Technical Awards?
As an imaging scientist, I was excited about the five digital cameras — Viper, Genesis, Sony F65, Red Epic and Arri — that received accolades. I’ve been working with each of these cameras for years, and each of them has had a major impact on the industry. They’ve pioneered the digital revolution and have set a very high standard for future cameras that appear on the market.

The winners of the 2017 SciTech Awards. Credit: Todd Wawrychuk/A.M.P.A.S.

Another exciting aspect is that you actually have access to your “negative” with digital cameras and, if need be, you can make adjustments to that negative after you’ve exposed it. It’s an incredibly powerful option that we haven’t even realized the full potential of yet.

From an audience perspective, even though they’ll never know it, the facial performance capture solving system developed by ILM, as well as the facial performance-based software from Digital Domain and Sony Pictures Imageworks, is incredibly exciting. The industry is continuously pushing the boundaries of the scope of the visual image. As stories become more expansive, this technology helps the audience to engage with aliens or creatures that are created by a computer but based on the actions, movements and emotions of an actor. This is helping blur the lines between reality and fantasy. The best part is that these tools help tell stories without calling attention to themselves.

Which category or discipline saw the biggest advances from last year to this year? 
The advancements in each technology that received an award this year are based on years of work behind the scenes that led up to this moment. I will say that from an audience perspective, the facial animation advancements were significant this past year. We’re reaching a point where audiences are unaware major characters are synthetic or modified. It’s really mind blowing when you think about it.

Sony’s Toshihiko Ohnishi.

Which of the advancements will have the biggest impact on the work that you do, specifically?
The integration of digital cameras and the intermixing of various cameras on one project. It’s pretty common nowadays to see Sony, Alexa and Red cameras all used on the same project. Each of these cameras comes with its own inherent colorspace and particular attributes, but part of my job is to make sure they can all work together — that we can interweave the various files they create — without the colorist having to do a lot of technical heavy lifting. Part of my job as an imaging scientist is handling the technicalities so that when creatives, such as the director, cinematographer and colorist, come together, they can concentrate on the art and not worry much about the technical aspects.
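
Conceptually, much of that interweaving reduces to transforming each camera’s RGB into a common working space, often by passing through CIE XYZ. A simplified Python sketch, assuming scene-linear data and two hypothetical camera-to-XYZ matrices (real values come from the manufacturer or a color-managed pipeline, not from this example):

```python
import numpy as np

# Hypothetical 3x3 matrices taking each camera's native RGB to CIE XYZ.
CAM_A_TO_XYZ = np.array([[0.64, 0.17, 0.14],
                         [0.28, 0.67, 0.05],
                         [0.00, 0.08, 0.97]])
CAM_B_TO_XYZ = np.array([[0.61, 0.21, 0.13],
                         [0.30, 0.63, 0.07],
                         [0.02, 0.11, 0.92]])

def cam_a_to_cam_b(rgb):
    """Rehouse camera A footage in camera B's color space via XYZ."""
    m = np.linalg.inv(CAM_B_TO_XYZ) @ CAM_A_TO_XYZ  # A -> XYZ -> B
    return rgb @ m.T

print(cam_a_to_cam_b(np.array([0.5, 0.4, 0.3])))
```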

Are you planning to use, or have you already begun using, any of these innovations in your work?
The digital cameras are very much part of my everyday life. Also, in working with a VFX house, I like to provide the knowledge and tools to help them view the imagery as it will be seen in the DI. The VFX artist spends an incredible amount of time and effort on every pixel they work on, and it’s a real goal of mine to make sure that the work they create is the best it can be throughout the DI.

Nickelodeon gets new on-air brand refresh

The children’s network Nickelodeon has debuted an all-new brand refresh of its on-air and online look and feel. Created with animation, design, global branding and creative agency Superestudio, based in Buenos Aires, Argentina, Nick’s new look features an array of kids interacting with the real world and Nick’s characters in live-action and graphic environments.

The new look consists of almost 300 deliverables, including bumpers, IDs, promo toolkits and graphic developments that first rolled out across the network’s US linear platform, followed by online, social media and off-channel. Updated elements for the network’s international channels will follow.

“We really wanted to highlight how much surprise and fun are parts of kids’ lives, so we took as our inspiration the surreal nature of GIFs, memes and emoticons and created an entire new visual vocabulary,” says Michael Waldron, SVP, creative director art and design for Nickelodeon Group and Nick@Nite. “Using a mix of real kids and on-air talent, the refresh looks through the lens of how kids see things — the unpredictable, extraordinary and joyful nature of a child’s imagination. Superestudio was the right company for this refresh because they use a great mix of different techniques, and they brought a fresh viewpoint that had just the right amount of quirk and whimsy.”

Nickelodeon’s new look was created by combining real kids with 2D and 3D graphics, building imaginative reinterpretations of Nickelodeon’s properties and characters as real-world playgrounds for kids to bring to life, rearrange and redesign. From turning SpongeBob’s face into a tongue-twisted fun zone to kids rearranging and rebuilding Lincoln Loud from The Loud House, the approach runs through everything from the overhead and docu-style camera angles to the seamless blend of real-world and tactile elements.

Nickelodeon’s classic orange logo is now set against an updated color palette of bright tones, including purple, light blue, lime and cream.

According to Superestudio executive creative director Ezequiel Rormoser, “The software that we used is Adobe After Effects and Maxon Cinema 4D. I think the most interesting thing is how we mixed live action with graphics, not in terms of technical complexity, but in the way they interact in an unexpected way.”

Recreating history for Netflix’s The Crown

By Randi Altman

If you, like me, binge-watched Netflix’s The Crown, you are now considerably better educated on the English monarchy, have a very different view of Queen Elizabeth, and were impressed with the show’s access to Buckingham Palace.

Well, it turns out they didn’t actually have access to the Palace. This is where London-based visual effects house One of Us came in. While the number of shots provided for the 10-part series varied, the average was 43 per episode.

In addition to Buckingham Palace, One of Us worked on photoreal digital set extensions, crowd replications and environments, including Downing Street and London Airport. The series follows a young Elizabeth who inherits the crown after her father, King George VI, dies. We see her transition from a vulnerable young married lady to a more mature woman who takes her role as head monarch very seriously.

We reached out to One of Us VFX supervisor Ben Turner to find out more.

How early did you join the production?
One of Us was heavily involved during an eight-month pre-production process, until shooting commenced in July 2015.

Ben Turner

Did they have clear vision of what they needed VFX vs. practical?
As we were involved from the pre-production stage, we were able to engage in discussions about how best to approach shooting the scenes with the VFX work in mind. It was important to us and the production that actors interacted with real set pieces and the VFX work would be “thrown away” in the background, not drawing attention to itself.

Were you on set?
I visited all relevant locations, assisted on set by Jon Pugh, who gathered all the VFX data required. I would attend all recces at these locations and then supervise on the shoot days.

Did you do previs? If so, what software did you use?
We didn’t do much previs in the traditional sense. We did some tech-vis to help us figure out how best to film some things, such as the arrivals at the gates of Buckingham Palace and the Coronation sequence. We also did some concept images to help inform the shoot and design of some scenes. This work was all done in Autodesk Maya, The Foundry’s Nuke and Adobe Photoshop.

Were there any challenges in working in 4K? Did your workflow change at all, and how much of your work currently is in 4K?
Working in 4K didn’t really change our workflow too much. At One of Us, we are used to working on film projects that come in all different shapes and sizes (we recently completed work on Terrence Malick’s Voyage of Time in IMAX 5K), but for The Crown we invested in the infrastructure that enabled us to take it in our stride — larger and faster disks to hold the huge amounts of data, as well as a new 4K monitor to review all the work.

What were some of your favorite, or most challenging, VFX for the show?
The most challenging work was the kind of shots that many people are already very familiar with. So the Queen’s Coronation, for example, was watched by 20 million people in 1953, and with Buckingham Palace and Downing Street being two of the most famous and recognizable addresses in the world, there wasn’t really anywhere for us to hide!

Some of my favorite shots are the ones where we were recreating real events for which there are amazing archive references, such as the tilt down on the scaffolding at Westminster Abbey on the eve of the Coronation, or the unveiling of the statue of King George VI.

Can you talk about the tools you used, and did you create any propriety tools during the workflow?
We used Enwaii and Maya for photogrammetry, Photoshop for digital matte painting and Nuke for compositing. For crowd replication we created our own in-house 2.5D tool in Nuke, which was a card generator that gave the artist a choice of crowd elements, letting them choose the costume, angle, resolution and actions required.
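
Turner doesn’t detail the tool’s internals, but the selection half of such a card generator can be pictured as a filtered, repeatable random pick from a library of filmed crowd elements. A purely hypothetical Python sketch of that logic:

```python
import random

# Hypothetical element library: each entry is one filmed person, available
# in particular costumes, camera angles and actions.
CROWD_LIBRARY = [
    {"costume": "coronation_suit", "angle": 15, "action": "waving",   "res": "2k"},
    {"costume": "coronation_suit", "angle": 45, "action": "cheering", "res": "2k"},
    {"costume": "overcoat",        "angle": 15, "action": "standing", "res": "1k"},
]

def pick_elements(costume, angle, count, seed=1953):
    """Choose crowd plates matching the artist's costume/angle request."""
    rng = random.Random(seed)  # deterministic, so re-renders are repeatable
    matches = [e for e in CROWD_LIBRARY
               if e["costume"] == costume and abs(e["angle"] - angle) <= 20]
    return [rng.choice(matches) for _ in range(count)]

print(pick_elements("coronation_suit", angle=30, count=5))
```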

What are you working on now?
We are currently hard at work on Season 2 of The Crown, which is going to be even bigger and more ambitious, so watch this space! Recent work also includes King Arthur: Legend Of The Sword (Warner Bros.) and Assassin’s Creed (New Regency).

Chaos Group and Adobe partner for photorealistic rendering in CC

Chaos Group’s V-Ray rendering technology is featured in Adobe’s Creative Cloud, allowing graphic designers to easily create photorealistic 3D rendered composites with Project Felix.

Available now, Project Felix is a public beta desktop app that helps users composite 3D assets like models, materials and lights with background images, resulting in an editable render they can continue to design in Photoshop CC. For example, users can turn a basic 3D model of a generic bottle into a realistic product shot that is fully lit and placed in a scene to create an ad, concept mock-up or even abstract art.

V-Ray acts as a virtual camera, letting users test angles, perspectives and placement of their model in the scene before generating a final high-res render. Using the preview window, Felix users get immediate visual feedback on how each edit affects the final rendered image.

By integrating V-Ray, Adobe has brought the same raytracing technology used by companies like Industrial Light & Magic to a much wider audience.

“We’re thrilled that Adobe has chosen V-Ray to be the core rendering engine for Project Felix, and to be a part of a new era for 3D in graphic design,” says Peter Mitev, CEO of Chaos Group. “Together we’re bringing the benefits of photoreal rendering, and a new design workflow, to millions of creatives worldwide.”

“Working with the amazing team at Chaos Group meant we could bring the power of the industry’s top rendering engine to our users,” adds Stefano Corazza, senior director of engineering at Adobe. “Our collaboration lets graphic designers design in a more natural flow. Each edit comes to life right before their eyes.”

The Foundry gives Jody Madden additional role, ups Phil Parsonage

The Foundry’s chief customer officer, Jody Madden, has been given the additional role of chief product officer (CPO). In another move, Phil Parsonage has been promoted to director of engineering.

As CPO, Madden — who at one time held the title of COO at The Foundry — returns to her more technical roots, which include stints at VFX studios such as Digital Domain and Industrial Light & Magic. In this new role, Madden is responsible for managing The Foundry’s full product line.

“I’m really excited to be stepping into this new role as I continue my rewarding journey with The Foundry,” says Madden. “I started [in this industry] as a customer, so I’m intimately familiar with the challenges the market faces. Now as CPO, I’m truly excited to help our customers address their technical and business challenges by continuing to push the boundaries of visual effects and design software.”

Parsonage, who has been with The Foundry for over 10 years, will run The Foundry’s engineering efforts. As part of this job, he is responsible for conceiving and implementing the company’s technical strategy.

Behind the Title: Composer Michael Carey

NAME: Michael Carey (@MichaelCarey007)

COMPANY: Resonation Music

WHAT’S YOUR JOB TITLE?
Creative director/composer (film/commercials/TV) and songwriter/producer/mixer (album work).

WHAT DOES THAT ENTAIL?
For commercials, film and TV projects, I work closely with the director, producer and agency to come up with something that meets their needs and the needs of the project. I develop an understanding of their overall vision, and then I conceptualize, compose and produce original music to capture the essence of this vision in a complementary way.

Michael Carey was composer of the main title theme and the opening scenes for ‘I Want to Say.’

This includes themes, underscore, source, main titles, end titles, etc. When it comes to album projects and soundtrack songs, I often write for (or with) the featured artist or band and produce the track from end to end. This means that I am also the engineer, programmer, session player and often mixer for a project.

On large projects that require fast turnaround, I wear the “creative director” hat, and I assemble and manage a specific team of colleagues to collaborate with me — those I know can get the job done at the highest level. I keep things focused and cohesive, and strive to maintain a consistent musical voice.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Whichever medium I’m working in, be it music-for-picture or album work, the underlying fundamentals are surprisingly similar. In both instances, it’s ultimately about storytelling – conveying maximum emotional impact in a compelling way. Using dynamics, melody, tension, release, density and space to create memorable moments and exciting transitions to keep the viewer or listener engaged.

I’m always striving to support the “main event.” In film, it’s visuals and dialog. In album work, it’s the singer’s performance. I see my job as building a metaphorical “frame” around the picture. Enhance, reinforce, complement, but never distract.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Two parts, really. First, the satisfaction of achieving a collective goal. Helping a filmmaker/artist realize their vision, while finding a way to authentically express my own musical vision and make a deeper connection with the audience experiencing the work.

There are moments in the course of a project when you hit on something that’s undeniable. Everyone involved immediately feels it. Human connections are made. Those are great moments, and ultimately you want the whole piece to feel like that.

The second part is the inspiration that comes from working collaboratively (usually with people at the top of their game) with those talented peers who challenge and push you in directions you might not have taken otherwise.

WHAT IS YOUR PROCESS FOR SCORING? HOW DO YOU BEGIN?
1) Watch film/read script. 2) Discuss with director, get a sense of their vision. 3) Create musical sketches and build a sonic palette. If there’s already some picture available to work with, then I’ll tackle a scene that feels representative of the rest of the project and refine it with input from the director. My goal is to create a musical/sonic “voice” or “sound” for the film that becomes an inextricable part of its personality.

CAN YOU WALK US THROUGH YOUR WORKFLOW?
Once overall direction has been established and scenes have been spotted, my first step with a scene is to map things out tempo/timing-wise, making note of any significant cuts, events or moments that need to be hit (or avoided) musically.

By defining this structure first, it frees me up to explore musically and texturally with a clear understanding of where the “ins” and “outs” are. By then, I usually have a pretty clear sense of what I want to hear as it pertains to realizing the vision of the director, and from that point it is about execution — programming, recording live instrumentation, processing/manipulation and mixing — whatever is required to make the scene “feel” the way it does in my head.
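
The tempo/timing map itself is simple arithmetic: at a given BPM, every picture hit falls some distance from the nearest beat, and you pick a tempo that makes the important hits land cleanly. A small illustrative Python helper (the names and numbers are mine, not Carey’s):

```python
def hit_to_bar_beat(hit_seconds, bpm, beats_per_bar=4):
    """Locate a picture hit on the musical grid at a given tempo.

    Returns (bar, beat) counting from 1, plus the offset in seconds between
    the hit and the nearest beat, to judge whether the tempo "lands."
    """
    beat_len = 60.0 / bpm
    beat_index = round(hit_seconds / beat_len)
    offset = hit_seconds - beat_index * beat_len
    bar = beat_index // beats_per_bar + 1
    beat = beat_index % beats_per_bar + 1
    return bar, beat, offset

# A hypothetical cut at 23.40s: at 120 BPM it sits 0.1s ahead of bar 12, beat 4.
print(hit_to_bar_beat(23.40, bpm=120))
```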

DOES YOUR PROCESS CHANGE DEPENDING ON THE TYPE OF PROJECT? FILM VS. SPOT, ETC?
There are certain nuances that have to be considered when approaching these different types of projects. Nailing the details in short form (commercials) is often more crucial because you have an entire world of information to convey in 30 seconds or less. There can be no missed moment or opportunity. It needs to feel cohesive with a cinematic story arc, and a compelling payoff at the end, all in an incredibly compressed window of time.

This is less evident in long-form projects. With feature films or TV, you often have the luxury to build musical movements more naturally as a scene progresses.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
That’s a tough one. As a kid I wanted to be an anthropologist. At 21, I went to a cooking school in Paris for a month thinking that that might be cool. More recently, I’ve been dabbling with building websites for friends using template-based platforms like Squarespace.

I think the common themes with these other interests are curiosity, experimentation, creativity and storytelling. Bringing an idea to life, making the abstract tangible. At the end of the day, music still allows me to do these things with a greater degree of satisfaction.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I knew music would be my path by age 14. I was playing guitar in local bands at the time, and then moved into steady club gigs. By the time I was 18, I was in a signed band, recording and touring. I couldn’t have imagined doing anything else. When I hit my 20s, I knew that writing and composing was the path ahead (vs. being a “gun for hire” guitarist).

I still played in bands and did lots of session work, but I focused more on songwriting and learning about recording and production. During that time, I had the opportunity to work with some legendary British engineer-producers. At one point, a well-known video director who had shot some videos with one of my bands started doing commercials, and he was unhappy with the music that an ad agency had put in one of his spots. So he recruited me to take a shot at composing a new score. It all clicked, and that opened the door to a couple of decades of high-profile commercial spots, as well as consistent work from major ad agencies and brands.

Eventually, this journey led me down the road of TV and film. All the while, I kept a foot in the album world, writing for and producing artists in the US and internationally.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
• I Want to Say — Composer: main title and opening scenes (Healdsburg International Film Festival, Best Documentary)
• LBS — Songwriter/producer: end title track feat. J.R. Richards of Dishwalla (Sundance Official Selection, Independent Spirit Awards nominee)
• Andy Vargas/The Beat — Producer/songwriter (winner, 2016 Hollywood Music in Media Awards, R&B/Soul)
• Escape The Fate/Alive — Songwriter (hit single, #26 Active Rock; album #2 on Billboard Hard Rock chart)

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It’s hard to pick one. Some of the projects listed above are contenders. There’s a young band I’m developing and producing right now called Bentley. I will be very proud when that is released. They’re fantastic.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Pro Tools. It’s my “instrument” as much as any guitar or keyboard. It’s allowed me to be incredibly productive and make anything I hear in my head a reality. Steven Slate, Sound Toys and PSP plug-ins. Vibe, warmth, color, saturation, detail. My extensive collection of vintage gear (amps, mics, mic pres, compressors, guitars, boutique pedals, etc.). Not sure if these qualify as “technology,” but they all have buttons and knobs and make great noises!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram, Twitter and Facebook (to a lesser extent lately).

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I have an amazing family who helps keep me centered with my eyes on the big picture. Running and exercise (not enough, but feels great when I do) and, increasingly, I try to meditate each morning. A friend and colleague whose studio demeanor I’ve always admired turned me onto it. He’s consistently calm and focused even in the midst of total drama and chaos. I’d like to think I’m getting there.

Main Image: Patricia Maureen Photography-P.M.P

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jack Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks in a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri, came in. Yes, you read that right, Kansas City, and his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush via Photoshop. Once the exact look was determined regarding the amount of attacking roaches, they animated them in 3D and composited. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple of base roach walks and behaviors and then populated each scene with instances of those. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications to texture and modeling. Some of this affected the rig we’d built, so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency-red lighting to brighter fluorescent lights. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D, 3D and by-hand 3D roto-tracking of the roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those, too, had to be painted out for many shots before the roaches reach Franco.
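
The unsqueeze step itself is straightforward: a 2:1 anamorphic scan is horizontally compressed, so the tracking plate is the image resampled to twice its width. A toy NumPy version, using nearest-neighbor column duplication purely for illustration (production tools filter properly):

```python
import numpy as np

def unsqueeze_2to1(plate):
    """Build an unsqueezed tracking plate from a 2:1 anamorphic scan.

    plate: (H, W, C) array. Anamorphic glass squeezes the image horizontally,
    so doubling the width restores correct proportions for the tracker.
    """
    return np.repeat(plate, 2, axis=1)  # duplicate every column: W -> 2W

plate = np.zeros((1080, 1024, 3), dtype=np.float32)
print(unsqueeze_2to1(plate).shape)  # (1080, 2048, 3)
```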

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red-light and white-light conditions were rendered, as well as ambient, shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that was actually double the anamorphic squeeze, to duplicate the vertical bokeh and DOF of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses. So I had just recently developed some familiarity and techniques to deal with the unusual lens attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus element vertically while the image was actually stretched the other way.

What are you working on now?
I’ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.