Review: FXhome’s HitFilm Pro 12 for editing, compositing, VFX

By Brady Betzel

If you have ever worked in Adobe Premiere Pro, Apple FCP X or Avid Media Composer and wished you could just flip a tab and be inside After Effects, with access to 3D objects directly in your timeline, you are going to want to take a look at FXhome’s HitFilm Pro 12.

Similar to how Blackmagic brought Fusion inside of its most recent versions of DaVinci Resolve, HitFilm Pro offers a nonlinear editor, a composite/VFX suite and a finishing suite combined into one piece of software. Haven’t heard about HitFilm yet? Let me help fill in some blanks.

Editing and 3D model Import

What is HitFilm Pro 12?
Technically, HitFilm Pro 12 is a non-subscription-based nonlinear editor, compositor and VFX suite that costs $299. Not only does that price include 12 months of updates and tech support, but one license can be used on up to three computers simultaneously. In my eyes, HitFilm Pro is a great tool set for independent filmmakers, social media content generators and any editor who goes beyond editing and dives into topics like 3D modeling, tracking, keying, etc. without having to necessarily fork over money for a bunch of expensive third-party plugins. That doesn’t mean you won’t want to buy third-party plugins, but you are less likely to need them with HitFilm’s expansive list of native features and tools.

At my day job, I use Premiere, After Effects, Media Composer and Resolve. I often come home and want to work in something that has everything inside, and that is where HitFilm Pro 12 lives. Not only does it have the professional functionality that I am used to, such as trimming, color scopes and more, but it also has BorisFX’s Mocha planar tracking plugin built in for no extra cost. This is something I use constantly and love.

One of the most interesting and recent updates to HitFilm Pro 12 is the ability to use After Effects plugins. Not all plugins will work since there are so many, but in a video released after NAB 2019, HitFilm said plugins like Andrew Kramer’s Video CoPilot Element3D and ones from Red Giant are on the horizon. If you are within your support window, or you continue to purchase HitFilm, FXhome will work with you to get your favorite After Effects plugins working directly inside of HitFilm.

Timeline and 3D model editor

Some additional updates to HitFilm Pro 12 include a completely redesigned user interface that resembles Premiere Pro… kind of. Threaded rendering has also been added, so Windows users with Intel and Nvidia hardware will see increased GPU speeds. There is also the ability to add titles directly in the editor, and more.

The Review
So how doees HitFilm Pro 12 compare to today’s modern software packages? That is an interesting question. I have become more and more of a Resolve convert over the past two years, so I am constantly comparing everything to that. In addition, being an Avid user for over 15 years, I am used to a rock-solid NLE with only a few hiccups here and there. In my opinion, HitFilm 12 lands itself right where Premiere and FCP X live.

It feels prosumer-y, in a YouTuber or content-generator capacity. Would it stand up to 10 hours of abuse with content over 45 minutes? It probably would, but much like with Premiere, I would probably split my edits into scenes or acts to avoid slowdowns, especially when importing things like OBJ files or composites.

The nonlinear editor portion feels like Premiere and FCP X had a baby, but left out FCP X’s Magnetic Timeline feature. The trimming in the timeline feels smooth, and after about 20 minutes of getting comfortable with it I felt like it was what I am generally used to. Cutting in footage feels good using three-point edits or simply dragging and dropping. Using effects feels very similar to the Adobe world, where you can stack them on top of clips and they each affect each other from the top down.

Mocha within HitFilm Pro

Where HitFilm Pro 12 shines is in the inclusion of typically third-party plugins directly in the timeline. From the ability to create a scene with 3D cameras and particle generators to being able to track using BorisFX’s Mocha, HitFilm Pro 12 has many features that will help take your project to the next level. With HitFilm Pro 12’s true 3D cameras, you can take flat text and enhance it with raytraced lighting, shadows and even textures. You can use the included BorisFX Continuum 3D Objects to make great titles relatively easily. To take it a step further, you can even track and animate them.

Color Tools
By day, I am an online editor/colorist who deals with the finishing aspect of media creation. Throughout the process, from color correction to exporting files, I need tools that are not only efficient but accurate. When I started to dig into the color correction side of HitFilm Pro 12, things slowed down for me. The color correction tools are very close to what you’ll find in other NLEs, like Premiere and FCP X, but they don’t quite rise to the level of Resolve. HitFilm Pro 12 does operate inside of a 32-bit color pipeline, which really helps avoid banding and other errors when color correcting. However, I didn’t feel that the toolset was making me more efficient; in fact, it was the opposite. I felt like I had to learn FXhome’s way of doing it. It wasn’t that it totally slowed me down, but I felt it could be better.

Color

Summing Up
In the end, HitFilm Pro 12 will fill a lot of holes for individual content creators. If you love learning new things (like I do), then HitFilm Pro 12 will be a good investment of your time. In fact, FXhome posts tons of video tutorials on all sorts of good and topical stuff, like how to create a Stranger Things intro title.

If you are a little more inclined to work with a layer-based workflow, like in After Effects, then HitFilm Pro 12 is the app you’ll want to learn. Check out HitFilm Pro 12 on FXhome’s website and definitely watch some of the company’s informative tutorials.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

FXhome, Vegas Creative Software partner on Vegas Post

HitFilm creator FXhome has partnered with Vegas Creative Software to launch a new suite of editing, VFX, compositing and imaging tools for video pros, editors and VFX artists called Vegas Post.

Vegas Post will combine the editing tools of Vegas Pro with FXhome’s expertise in compositing and visual effects to offer an array of features and capabilities.

FXhome is developing customized effects and compositing tools specifically for Vegas Post. The new software suite will also integrate a custom-developed version of FXhome’s new non-destructive RAW image compositor that will enable video editors to work with still-image and graphical content and incorporate it directly into their final productions. All tools will work together seamlessly in an integrated, end-to-end workflow to accelerate and streamline the post production process for artists.

The new software suite is ideally suited for video pros in post facilities of all sizes and requirements — from individual artists to large post studios, broadcasters and small/medium enterprise installations. It will be available in the third quarter, with pricing to be announced.

Meanwhile, FXhome has teamed up with Filmstro, which offers a royalty-free music library, to provide HitFilm users with access to the entire Filmstro music library for 12 months. With Filmstro available directly from the FXhome store, HitFilm users can use Filmstro soundtracks on unlimited projects and get access to weekly new music updates.

Offering more than just a royalty-free music library, Filmstro has developed a user interface that gives artists flexibility and control over selected music tracks for use in their HitFilm projects. HitFilm users can control the momentum, depth and power of any Filmstro track, using sliders to perfectly match any sequence in a HitFilm project. Users can also craft soundtracks to perfectly fit images by using a keyframe graph editor within Filmstro. Moving a slider automatically creates keyframes for each element, and these can be edited at any point.

Filmstro offers over 60 albums’ worth of music with weekly music releases. All tracks are searchable using keywords, film and video genre, musical style, instrumental palette or mood. All Filmstro music is licensed for usage worldwide and in perpetuity. The Filmstro dynamic royalty-free music library is available now on the FXhome Store for $249 and can be purchased here.

Fox Sports promotes US women’s World Cup team with VFX-heavy spots

Santa Monica creative studio Jamm worked with Wieden+Kennedy New York on the Fox Sports campaign “All Eyes on US.” Directed by Joseph Kahn out of Supply & Demand, the four spots celebrate the US Women’s soccer team as it gears up for the 2019 FIFA Women’s World Cup in June.

The newest 60-second spot, All Eyes on US, features tens of thousands of screaming fans thanks to Jamm’s CG crowd work. On set, Jamm brainstormed with Kahn on how to achieve the immersive effect he was looking for. Much of the on-the-ground footage was shot using wide-angle lenses, which posed a unique set of challenges by revealing the entire environment as well as the close-up action. With pacing, Jamm achieved the sense of the game occurring in realtime, as the tempo of the camera keeps in step with the team moving the ball downfield.

The 30-second spot Goliath features the first CG crowd shot by the Jamm team, who successfully filled the soccer stadium with a roaring crowd. In Goliath, the entire US women’s soccer team runs toward the camera in slow motion. Captured locked off but digitally manipulated via a 3D camera to create a dolly zoom technique replicating real-life parallax, the altered perspective translates the unsettling feeling of being an opponent as the team literally runs straight into the camera.

On set, Jamm got an initial Lidar scan of the stadium as a base. From there, they used that scan along with reference photos taken on set to build a CG stadium that included accurate seating. They extended the stadium where there were gaps as well to make it a full 360 stadium. The stadium seating tools tie in with Jamm’s in-house crowd system (based on Side Effects Houdini) and allowed them to easily direct the performance of the crowd in every shot.

The Warrior focuses on Megan Rapinoe standing on the field in the rain, with a roaring crowd behind her. Whereas CG crowd simulation is typically captured with fast-moving cameras, the stadium crowd remains locked in the background of this sequence. Jamm implemented motion work and elements like confetti to make the large group of characters appear lively without detracting from Rapinoe in the foreground. Because the live-action scenes were shot in the rain, Jamm used water graphing to seamlessly blend the real-world footage and the CG crowd work.

The Finisher centers on Alex Morgan, who earned the nickname because “she’s the last thing they’ll see before it’s too late.” The team ran down the field at a slow-motion pace, while the cameraman, rigged with a Steadicam, sprinted backwards through the goal. The footage was then sped up by 600%, providing a realtime quality, as Morgan kicks a perfect strike to the back of the net.

Jamm used Autodesk Flame for compositing the crowds and CG ball, for camera projections to rebuild and clean up certain parts of the environment, and for refining the skies and adding in stadium branding. They also used Foundry Nuke and Houdini for 3D.

The edit was via FinalCut and editor Spencer Campbell. The color grade was by Technicolor’s Tom Poole.

Roper Technologies set to acquire Foundry

Roper Technologies, a technology company and a constituent of the S&P 500, Fortune 1000 and Russell 1000 indices, is expected to purchase Foundry — the deal is expected to close in April 2019, subject to regulatory approval and customary closing conditions. Foundry makes software tools used to create visual effects and 3D for the media and entertainment world, including Nuke, Modo, Mari and Katana.

Craig Rodgerson

It’s a substantial move that enables Foundry to remain an independent company, with Roper assuming ownership from Hg. Roper has a successful history of acquiring well-run technology companies in niche markets that have strong, sustainable growth potential.

“We’re excited about the opportunities this partnership brings. Roper understands our strategy and chose to invest in us to help us realize our ambitious growth plans,” says Foundry CEO Craig Rodgerson. “This move will enable us to continue investing in what really matters to our customers: continued product improvement, R&D and technology innovation and partnerships with global leaders in the industry.”

Quick Chat: Crew Cuts’ Nancy Jacobsen and Stephanie Norris

By Randi Altman

Crew Cuts, a full-service production and post house, has been a New York fixture since 1986. Originally established as an editorial house, it has added services over the years as the industry evolved, targeting all aspects of the workflow.

This independently owned facility is run by executive producer/partner Nancy Jacobsen, senior editor/partner Sherri Margulies Keenan and senior editor/partner Jake Jacobsen. While commercial spots might be in their wheelhouse, their projects vary and include social media, music videos and indie films.

We decided to reach out to Nancy Jacobsen, as well as EP of finishing Stephanie Norris, to find out about trends, recent work and succeeding in an industry and city that isn’t always so welcoming.

Can you talk about what Crew Cuts provides and how you guys have evolved over the years?
Jacobsen: We pretty much do it all. We have 10 offline editors as well as artists working in VFX, 2D/3D animation, motion graphics/design, audio mix and sound design, VO record, color grading, title treatment, advanced compositing and conform. Two of our editors double as directors.

In the beginning, Crew Cuts primarily offered only editorial. As the years went by and the industry climate changed, we began to cater to the needs of clients and slowly built out our entire finishing department. We started with some minimal graphics work and one staff artist in 2008.

In 2009, we expanded the team to include graphics, conform and audio mix. From there we just continued to grow and expand our department to the full finishing team we have today.

As a woman owner of a post house, what challenges have you had to overcome?
Jacobsen: When I started in this business, the industry was very different. I made less money than my male counterparts, and it took me twice as long to be promoted because I am a woman. I have since seen great change where women are leading post houses and production houses and are finally getting the recognition for the hard work they deserve. Unfortunately, I had to “wait it out” and silently work harder than the men around me. This has paid off for me, and now I can help women get the credit they rightly deserve.

Do you see the industry changing and becoming less male-dominated?
Jacobsen: Yes, the industry is definitely becoming less male-dominated. In the current climate, with the birth of the #metoo movement and specifically in our industry with the birth of Diet Madison Avenue (@dietmadisonave), we are seeing a lot more women step up and take on leading roles.

Are you mostly a commercial house? What other segments of the industry do you work in?
Jacobsen: We are primarily a commercial house. However, we are not limited to just broadcast and digital commercial advertising. We have delivered specs for everything from the Godzilla screen in Times Square to :06 spots on Instagram. We have done a handful of music videos and also handle a ton of B2B videos for in-house client meetings, etc., as well as banner ads for conferences and trade shows. We’ve even worked on display ads for airports. Most recently, one of our editors finished a feature film called Public Figure that is being submitted around the film festival circuit.

What types of projects are you working on most often these days?
Jacobsen: The industry is all over the place. The current climate is very messy right now. Our projects are extremely varied. It’s hard to say what we work on most because it seems like there is no more norm. We are working on everything from sizzle pitch videos to spots for the Super Bowl.

What trends have you seen over the last year, and where do you expect to be in a year?
Jacobsen: Over the last year, we have noticed that the work comes from every angle. Our typical client is no longer just the marketing agency. It is also the production company, network, brand, etc. In a year we expect to be doing more production work. Seeing as how budgets are much smaller than they used to be and everyone wants a one-stop shop, we are hoping to stick with our gut and continue expanding our production arm.

Crew Cuts has beefed up its finishing services. Can you talk about that?
Stephanie Norris: We offer a variety of finishing services — from sound design to VO record and mix, compositing to VFX, 2D and 3D motion graphics and color grading. Our fully staffed in-house team loves the visual effects puzzle and enjoys working with clients to help interpret their vision.

Can you name some recent projects and the services you provided?
Norris: We just worked on a new campaign for New Jersey Lottery in collaboration with Yonder Content and PureRed. Brian Neaman directed and edited the spots. In addition to editorial, Crew Cuts also handled all of the finishing, including color, conform, visual effects, graphics, sound design and mix. This was one of those all-hands-on-deck projects. Keeping everything under one roof really helped us to streamline the process.

New Jersey Lottery

Working with Brian to carefully plan the shooting strategy, we filmed a series of plate shots as elements that could later be combined in post to build each scene. We added falling stacks of cash to the reindeer as he walks through the loading dock and incorporated CG inflatable decorations into a warehouse holiday lawn scene. We also dramatically altered the opening and closing exterior warehouse scenes, allowing one shot to work for multiple seasons. Keeping lighting and camera positions consistent was mission-critical, and having our VFX supervisor, Dulany Foster, on set saved us hours of work down the line.

For the New Jersey Lottery Holiday spots, the Crew Cuts CG team, led by our creative director Ben McNamara, created a 3D inflatable display of lottery tickets. This was something that proved too costly and time-consuming to manufacture and shoot practically. After the initial R&D, our team created a few different CG inflatable simulations prior to the shoot, and Dulany was able to mock them up live while on set. Creating the simulations was crucial for giving the art department reference while building the set, and also helped when shooting the plates needed to composite the scene together.

Ben and his team focused on the physics of the inflation, while also making sure the fabric simulations, textures and lighting blended seamlessly into the scene — it was important that everything felt realistic. In addition to the inflatables, our VFX team turned the opening and closing sunny, summer shots of the warehouse into a December winter wonderland thanks to heavy compositing, 3D set extension and snow simulations.

New Jersey Lottery

Any other projects you’d like to talk about?
Jacobsen: We are currently working on a project here that we are handling soup to nuts from production through finishing. It was a fun challenge to take on. The spot contains a hand model on a greenscreen showing the audience how to use a new product. The shoot itself took place here at Crew Cuts. We turned our common area into a stage for the day and were able to do so without interrupting any of the other employees and projects going on.

We are now working on editorial and finishing. The edit is coming along nicely. What really drives the piece here is the graphic icons. Our team is having a lot of fun designing these elements and implementing them into the spot. We are so proud because we budgeted wisely to make sure to accommodate all of the needs of the project so that we could handle everything and still turn a profit. It was so much fun to work in a different setting for the day and has been a very successful project so far. Clients are happy and so are we.

Main Image: (L-R) Stephanie Norris and Nancy Jacobsen

Behind the Title: Senior compositing artist Marcel Lemme

We recently reached out to Marcel Lemme to find out more about how he works, his background and how he relaxes.

What is your job title and where are you based?
I’m a senior compositing artist based out of Hamburg, Germany.

What does your job entail?
I spend about 90 percent of my time working on commercial jobs for local and international companies like BMW, Audi and Nestle, but also dabble in feature films, corporate videos and music videos. On a regular day, I’m handling everything from job breakdowns to set supervision to conform. I’m also doing shot management for the team, interacting with clients, showing clients work and some compositing. Client review sessions and final approvals are regular occurrences for me too.

What would surprise people the most about the responsibilities that fall under that title?
When it comes to client attended sessions, you have to be part clown, part mind-reader. Half the job is being a good artist; the other half is keeping clients happy. You have to anticipate what the client will want and balance that with what you know looks best. I not only have to create and keep a good mood in the room, but also problem-solve with a smile.

What’s your favorite part of your job?
I love solving problems when compositing solo. There’s nothing better than tackling a tough project and getting results you’re proud of.

What’s your least favorite?
Sometimes the client isn’t sure what they want, which can make the job harder.

What’s your most productive time of day?
I’m definitely not a morning guy, so the evening — I’m more productive at night.

If you didn’t have this job, what would you be doing instead?
I’ve asked myself this question a lot, but honestly, I’ve never come up with a good answer.

How’d you get your first job, and did you know this was your path early on?
I fell into it. I was young and thought I’d give computer graphics a try, so I reached out to someone who knew someone, and before I knew it I was interning at a company in Hamburg, which is how I came to know online editing. At the time, Quantel mostly dominated the industry with Editbox and Henry, and Autodesk Flame and Flint were just emerging. I dove in and started using all the technology I could get my hands on, and gradually started securing jobs based on recommendations.

Which tools are you using today, and why?
I use whatever the client and/or the project demands, whether it’s Flame or Foundry’s Nuke; for tracking, I often use The Pixel Farm PFTrack and Boris FX Mocha. For commercial spots, I’ll do a lot of the conform and shot management on Flame and then hand off the shots to other team members. Or, if I do it myself, I’ll finish in Flame because I know I can do it fast.

I use Flame because it gives me different ways to achieve a certain look or find a solution to a problem. I can also play a clip at any resolution with just two clicks in Flame, which is important when you’re in a room with clients who want to see different versions on the fly. The recent open clip updates and Python integration have also saved me time. I can import and review shots, with automatic versions coming in, and build new tools or automate tedious processes in the post chain that have typically slowed me down.

Tell us about some recent project work.
I recently worked on a project for BMW as a compositing supervisor and collaborated with eight other compositors to finish a number of versions in a short amount of time. We did shot management, compositing, reviewing, versioning and such in Flame, along with individual shot compositing in Nuke and some tracking in Mocha Pro.

What is the project that you are most proud of?
There’s no one project that stands out in particular, but overall, I’m proud of jobs like the BMW spots, where I’ve led a team of artists and everything just works and flows. It’s rewarding when the client doesn’t know what you did or how you did it, but loves the end result.

Where do you find inspiration for your projects?
The obvious answer here is other commercials, but I also watch a lot of movies and, of course, spend time on the Internet.

Name three pieces of technology you can’t live without.
The off button on the telephone (they should really make that bigger), anything related to cinematography or digital cinema, and streaming technology.

What social media channels do you follow?
I’ve managed to avoid Facebook, but I do peek at Twitter and Instagram from time to time. Twitter can be a great quick reference for regional news or finding out about new technology and/or industry trends.

Do you listen to music while you work?
Less now than I did when I was younger. Most of the time, I can’t as I’m juggling too much and it’s distracting. When I listen to music, I appreciate techno, classical and singer/songwriter stuff; whatever sets the mood for the shots I’m working on. Right now, I’m into Iron and Wine and Trentemøller, a Danish electronic music producer.

How do you de-stress from the job?
My drive home. It can take anywhere from half an hour to an hour, depending on the traffic, and that’s my alone time. Sometimes I listen to music, other times I sit in silence. I cool down and prepare to switch gears before heading home to be with my family.

MPC adds Flame artists and executive producer to its finishing team

MPC has strengthened its finishing capabilities with the addition of Flame artist and creative director Claus Hansen, senior Flame artist Noah Caddis and executive producer Robert Owens. The trio, who have joined MPC from Method, have over a decade of experience working together. They will be based in MPC’s Culver City studio.

Owens, Hansen and Caddis are all looking forward to collaborating with MPC’s colorists and artists who are located all around the world. “We were attracted to MPC for the quality of work they are renowned for. At the same time it feels very accessible, like we’re working in a collective group, all driven by the same thing, to make great work,” says Hansen. “We are at a point in our careers where we can take our knowledge and skills to make the best experience possible for the company and clients.”

“There is an assurance that all projects will be treated with an artistic eye and scrutiny that is not typically found in the fast-paced nature of finishing and beauty,” adds Caddis.

Hansen has worked with agencies, such as CP+B, Wieden + Kennedy and Deutsch, creating effects, beauty and finishing work on content for brands including BMW, Lexus, Maserati, Microsoft, Target and Revlon.

Caddis has worked on spots for Infiniti, Kia, Adobe, Diet Dr Pepper and others. He too has a strong history of partnering with high-profile agencies like Deutsch, CP+B, Media Arts Lab, Agency 215 and David & Goliath.

“Robert, Noah and I have noticed the strong sense of camaraderie since we arrived, and it’s contagious,” says Hansen. “It gives the feeling of being in a tight-knit, creatively focused group that you want to be a part of. And that’s very appealing.”

Main Image: (L-R) Noah Caddis, Robert Owens and Claus Hansen.

Review: Blackmagic Resolve 14

By David Cox

Blackmagic has released Version 14 of its popular DaVinci Resolve “color grading” suite, following a period of open public beta development. I put color grading in quotes, because one of the most interesting aspects about the V14 release is how far-reaching Resolve’s ambitions have become, beyond simply color grading.

Fairlight audio within Resolve.

Prior to being purchased by Blackmagic, DaVinci Resolve was one of a small group of high-end color grading systems being offered in the industry. Blackmagic then extended the product to include editing, and Version 14 offers several updates in this area, particularly around speed and fluidity of use. A surprise addition is the incorporation of Fairlight Audio — a full-featured audio mixing platform capable of producing feature film quality 3D soundscapes. It is not just an external plugin, but an integrated part of the software.

This review concentrates on the color finishing aspects of Resolve 14, and on first view the core color tools remain largely unchanged save for a handful of ergonomic improvements. This is not surprising given that Resolve is already a mature grading product. However, Blackmagic has added some very interesting tools and features clearly aimed at enabling colorists to broaden their creative control. I have been a long-time advocate of the idea that a colorist doesn’t change the color of a sequence, but changes the mood of it. Manipulating the color is just one path to that result, so I am happy to see more creatively expansive facilities being added.

Face Refinement
One new feature that epitomizes Blackmagic’s development direction is the Face Refinement tool. It provides features to “beautify” a face and underlines two interesting development points. Firstly, it shows an intention by the developers to create a platform that allows users to extend their creative control across the traditional borders of “color” and “VFX.”

Secondly, such a feature incorporates more advanced programming techniques that seek to recognize objects in the scene. Traditional color and keying tools simply replace one color for another, without “understanding” what objects those colors are attached to. This next step toward a more intelligent diagnosis of scene content will lead to some exciting tools and Blackmagic has started off with face-feature tracking.

Face Refinement

The Face Refinement function works extremely well where it recognizes a face. There is no manual intervention — the tool simply finds a face in the shot and tracks all the constituent parts (eyes, lips, etc). Where there is more than one face detected, the system offers a simple box selector for the user to specify which face to track. Once the analysis is complete, the user has a variety of simple sliders to control the smoothness, color and detail of the face overall, but also specific controls for the forehead, cheeks, chin, lips, eyes and the areas around and below the eyes.

I found the face de-shine function particularly successful. A light touch with the controls yields pleasing results very quickly. A heavy touch is what you need if you want to make someone look like an android. I liked the fact that you can go negative with some controls and make a face look more haggard!

In my tests, the facial tracking was very effective for properly framed faces, even those with exaggerated expressions, headshakes and so on. But it would fail where the face became partially obscured, such as when the camera panned off the face. This led to all the added improvements popping off mid shot. While the fully automatic operation makes it quick and simple to use, it affords no opportunity for the user to intervene and assist the facial tracking if it fails. All things considered though, this will be a big help and time saver for the majority of beauty work shots.

Resolve FX
New for Resolve 14 are a myriad of built-in effects called Resolve FX, all GPU-accelerated and available to be added in the edit “page” directly to clips, or in the color page attached to nodes. They are categorized into Blurs, Light, Color, Refine, Repair, Stylize, Texture and Warp. A few particularly caught my eye, for example in “color,” the color compressor brings together nearby colors to a central hue. This is handy for unifying colors of an unevenly lit client logo into their precise brand reference, or dealing with blotchy skin. There is also a color space transform tool that enables LUT-less conversion between all the major color “spaces.”

Color

The dehaze function derives a depth map by some mysterious magic to help improve contrast over distance. The “light” collection includes a decent lens flare that allows plenty of customizing. “Stylize” creates watercolor and outline looks, while “texture” includes a film grain effect with several film-gauge presets. I liked the implementation of the new Warp function. Rather than using grids or splines, the user simply places “pins” in the image to drag certain areas around. Shift-adding a pin defines a locked position immune from dragging. All simple, intuitive and realtime, or close to it.

Multi-Skilled and Collaborative Workflows
A dilemma for the Resolve developers is likely to be where to draw the line between editing, color and VFX. Blackmagic also develops Fusion, so it has the advanced side of VFX covered. But in the middle, there are editors who want to make funky transitions and title sequences, and colorists who use more effects, mattes and tracking. Resolve runs out of ability in these areas quite quickly, which forces the more adventurous editor or colorist into the alien environment of Fusion. The new features of Resolve help here, but a few additions, such as better keyframing of effects and an easier way to reference other timeline layers in the node panel, could extend Resolve’s ability to handle many common VFX-ish demands.

Some have criticized Blackmagic for turning Resolve into a multi-discipline platform, suggesting that this will create an industry of “jack of all trades and masters of none.” I disagree with this view for several reasons. Firstly, if an artist wants to major in a specific discipline, having a platform that can do more does not impede them. Secondly, I think the majority of content (if you include YouTube, etc.) is created by a single person or small teams, so the growth of multi-skilled post production people is simply an inevitable and logical progression which Blackmagic is sensibly addressing.

Edit

But for professional users within larger organisations, the cross-discipline features of Resolve take on a different meaning when viewed in the context of “collaboration.” Resolve 14 permits editors to edit, colorists to color and sound mixers to mix, all using different installations of the same platform, sharing the same media and contributing to the same project, even the same timeline. On the face of it, this promises to remove “conforms” and eradicate wasteful import/export processes and frustrating compatibility issues, while enabling parallel workflows across editing, color grading and audio.

For fast-turnaround projects, or projects where client approval cannot be sought until the project progresses beyond a “rough” stage, the potential advantages are compelling. Of course, the minor hurdle to get over will be to persuade editors and audio mixers to adopt Resolve as their chosen weapon. If they do, Blackmagic might well be on the way to providing collaborative utopia.

Summing Up
Resolve 14 is a massive upgrade from Resolve 12 (there wasn’t a Resolve 13 — who would have thought that a company called Blackmagic might be superstitious?). It provides a substantial broadening of ability that will suit multi-skilled smaller outfits as well as fit as a grading/finishing platform and collaborative backbone in larger installations.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Autodesk Flame family updates offer pipeline enhancements

Autodesk has updated its Flame 2018 family of 3D visual effects and finishing software, which includes Flame, Flare, Flame Assist and Lustre. Flame 2018.3 offers more efficient ways of working in post, with feature enhancements that offer greater pipeline flexibility, speed and support for emerging formats and technology.

Flame 2018.3 highlights include:

• Action Selective: Apply FX color to an image surface or the whole action scene via the camera

• Motion Warp Tracking: Organically distort objects that are changing shape, angle and form with new 32-bit motion vector-based tracking technology

• 360-degree VR viewing mode: View LatLong images in a 360-degree VR viewing mode in the Flame player or any viewport during compositing and manipulate the field of view

• HDR waveform monitoring: Set viewport to show luminance waveform; red, green, blue (RGB) parade; color vectorscope or 3D cube; and monitor a range of HDR and wide color gamut (WCG) color spaces including Rec2100 PQ, Rec2020 and DCI P3

• Shotgun Software Loader: Load assets for a shot and build custom batches via Flame’s Python API, and browse a Shotgun project for a filtered view of individual shots

• User-requested improvements for Action, Batch, Timeline and Media Hub

“The new standalone Python console in Flame 2018.3 is great,” says Treehouse Edit finishing artist John Fegan, a Flame family beta tester. “We’re also excited about the enhanced FBX export with physically based renderer (PBR) for Maya and motion analysis updates. Using motion vector maps, we can now achieve things we couldn’t with a planar tracker or 3D track.”

Flame Family 2018.3 is available today at no additional cost to customers with a current Flame Family 2018 subscription.

Blackmagic’s Fusion 9 is now VR-enabled

At SIGGRAPH, Blackmagic was showing Fusion 9, its newly upgraded visual effects, compositing, 3D and motion graphics software. Fusion 9 features new VR tools, an entirely new keyer technology, planar tracking, camera tracking, multi-user collaboration tools and more.

Fusion 9 is available now with a new price point — Blackmagic has lowered the price of the Studio version from $995 to $299. (Blackmagic is also offering a free version of Fusion.) The software now works on Mac, PC and Linux.

Those working in VR get a full 360º true 3D workspace, along with a new panoramic viewer and support for popular VR headsets such as Oculus Rift and HTC Vive. Working in VR with Fusion is completely interactive. GPU acceleration makes it extremely fast so customers can wear a headset and interact with elements in a VR scene in realtime. Fusion 9 also supports stereoscopic VR. In addition, the new 360º spherical camera renders out complete VR scenes, all in a single pass and without the need for complex camera rigs.
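Under the hood of any LatLong (equirectangular) VR workflow, each pixel of the panorama corresponds to a view direction on the sphere. A minimal sketch of that mapping, under an assumed axis convention (not tied to Fusion's internals):

```python
import math

def latlong_to_direction(u, v):
    """Map normalized LatLong (equirectangular) coords to a unit 3D direction.

    u in [0, 1) spans longitude -180..180 degrees; v in [0, 1] spans
    latitude +90 (top of frame) to -90 (bottom).
    Axis convention (an assumption for this sketch): +Y up, -Z forward
    at the image center.
    """
    lon = (u - 0.5) * 2.0 * math.pi   # longitude in radians
    lat = (0.5 - v) * math.pi         # latitude in radians
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = -math.cos(lat) * math.cos(lon)
    return (x, y, z)
```

The image center maps to the forward direction and the top row collapses to straight up, which is why LatLong frames look stretched at the poles yet play back undistorted inside a 360° viewer.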

The new planar tracker in Fusion 9 calculates motion planes for accurately compositing elements onto moving objects in a scene. For example, the new planar tracker can be used to replace signs or other flat objects as they move through a scene. Planar tracking data can also be used on rotoscope shapes. That means users don’t have to manually animate motion, perspective, position, scale or rotation of rotoscoped elements as the image changes.
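In broad strokes, a planar tracker's per-frame output is a plane-to-plane transform (a homography) that lets a replacement element be corner-pinned onto the tracked surface. The fitting step can be sketched with the classic direct linear transform — a generic NumPy illustration, not Fusion's implementation:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit a 3x3 homography mapping src points to dst points (DLT method).

    src, dst: (N, 2) arrays with N >= 4 point correspondences, e.g. the
    four corners of a tracked sign in the reference and current frames.
    """
    rows = []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp])
        rows.append([0, 0, 0, x, y, 1, -yp * x, -yp * y, -yp])
    # the homography is the null vector of the stacked constraint matrix
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Apply H to (N, 2) points, returning the mapped (N, 2) points."""
    p = np.hstack([pts, np.ones((len(pts), 1))])
    q = p @ H.T
    return q[:, :2] / q[:, 2:3]
```

Once a homography is known per frame, any point on the tracked plane — including every vertex of a roto shape — can be re-projected automatically, which is exactly why the tracker spares artists from hand-animating position, scale, rotation and perspective.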

Fusion 9 also features an entirely new camera tracker that analyzes the motion of a live-action camera in a scene and reconstructs the identical motion path in 3D space for use with cameras inside of Fusion. This lets users composite elements with precisely matched movement and perspective of the original. Fusion can also use lens metadata for proper framing, focal length and more.

The software’s new delta keyer features a complete set of matte finesse controls for creating clean keys while preserving fine image detail. There’s also a new clean plate tool that can smooth out subtle color variations on blue- and greenscreens in live action footage, making them easier to key.

For multi-user collaboration, Fusion 9 Studio includes Studio Player, a new app that features a playlist, storyboard and timeline for playing back shots. Studio Player can track version history, display annotation notes, and has support for LUTs and more. The new Studio Player is suited for customers who need to see shots in a suite or theater for review and approval. Remote synchronization lets artists sync Studio Players in multiple locations.

In addition, Fusion 9 features a bin server so shared assets and tools don’t have to be copied onto each user’s local workstation.

Foundry’s Nuke and Hiero 11.0 now available

Foundry has made available Nuke and Hiero 11.0, the next major release for the Nuke line of products, including Nuke, NukeX, Nuke Studio, Hiero and HieroPlayer. The Nuke family is being updated to VFX Platform 2017, which includes several major updates to key libraries used within Nuke, including Python, Pyside and Qt.

The update also introduces a new type of group node, which offers a powerful new collaborative workflow for sharing work among artists. Live Groups referenced in other scripts automatically update when a script is loaded, without the need to render intermediate stages.

Nuke Studio’s intelligent background rendering is now available in Nuke and NukeX. The Frame Server takes advantage of available resources on your local machine, enabling you to continue working while rendering happens in the background. The LensDistortion node has been completely revamped, with added support for fisheye and wide-angle lenses and the ability to use multiple frames to produce better results. Nuke Studio also has new GPU-accelerated disk caching that allows users to cache part or all of a sequence to disk for smoother playback of more complex sequences.

ILM’s Richard Bluff talks VFX for Marvel’s Doctor Strange

By Daniel Restuccio

Comic book fans have been waiting for over 30 years for Marvel’s Doctor Strange to come to the big screen, and dare I say it was worth the wait. This is in large part because of the technology now available to create the film’s stunning visual effects.

Fans have the option to see the film in traditional 2D, Dolby Cinema (worthy of an interstate or plane fare pilgrimage, in my opinion) and IMAX 3D. Doctor Strange, Marvel Studios’ 15th film offering, is also receiving good critical reviews and VFX Oscar buzz — it’s currently on the list of 20 films still in the running in the Visual Effects category for the 89th Academy Awards.

The unapologetically dazzling VFX shots, in many cases directly inspired by the original comic visuals of Steve Ditko, were created by multiple visual effects houses, including Industrial Light & Magic, Luma Pictures, Lola VFX, Method Studios, Rise FX, Crafty Apes, Framestore, Perception and previs house The Third Floor. Check out our interview with the film’s VFX supervisor Stephane Ceretti.

Director Scott Derrickson said in a recent Reddit chat that Doctor Strange is “a fantastical superhero movie.”

“Watching the final cut of the film was deeply satisfying,” commented Derrickson. “A filmmaker cannot depend upon critical reviews or box office for satisfaction — even if they are good. The only true reward for any artist is to pick a worthy target and hit it. When you know you’ve hit your target that is everything. On this one, I hit my target.”

Since we got an overview of how the visual effects workflow went from Ceretti, we decided to talk to one of the studios that provided VFX for the film, specifically ILM and their VFX supervisor Richard Bluff.

Richard Bluff

According to Bluff, early in pre-production Marvel presented concept art, reference images and previsualization on “what were the boundaries of what the visuals could be.” After that, he says, they had the freedom to search within those bounds.

During VFX presentations with Marvel, they frequently showed three versions of the work. “They went with the craziest version to the point where the next time we would show three more versions and we continued to up the ante on the crazy,” recalls Bluff.

As master coordinator of this effort for ILM, Bluff encouraged his artists “to own the visuals and try to work out how the company could raise the quality of the work or the designs on the show to another level. How could we introduce something new that remains within the fabric of the movie?”

As a result, says Bluff, they had some amazing ideas flow from individuals on the film. Jason Parks came up with the idea of traveling through the center of a subway train as it fractured. Matt Cowey invented the notion of continually rotating the camera to heighten the sense of vertigo. Andrew Graham designed the kaleidoscope-fighting arena “largely because his personal hobby is building and designing real kaleidoscopes.”

Unique to Doctor Strange is that the big VFX sequences are all very “self-contained.” For example, ILM did the New York and Hong Kong sequences, Luma did the Dark Dimension and Method did the multi-universe. ILM also designed and developed the original concept for the Eldritch Magic and provided all the shared “digital doubles” — CGI-rigged and animatable versions of the actors — that tied sequences together. The digital doubles were customized to the needs of each VFX house.

Previs
In some movies, previs material is generated and thrown away. Not so with Doctor Strange. What ILM did this time was develop a previs workflow where they could actually hang assets and keep iterating, so the previs became part of the shot from the earliest iteration.

There was extensive previs done for Marvel by The Third Floor as a creative and technical guide across the movie, and further iterations internal to ILM done by ILM’s lead visualization artist, Landis Fields.

Warning! Spoiler! Once Doctor Strange moves the New York fight scene into the mirror universe, the city starts coming apart in an M.C. Escher-meets-Chris Nolan-Inception kind of way. To make that sequence, ILM created a massive tool kit of New York set pieces and geometry, including subway cars, buildings, vehicles and fire escapes.

In the previs, Fields started breaking apart, duplicating and animating those objects, like the fire escapes, to tell the story of what a kaleidoscoping city would look like. The artists then fleshed out a sequence of shots, a.k.a. “mini beats.” They absorbed the previs into the pipeline by later switching out the gross geometry elements in Fields’ previs with the actual New York hero assets.

Strange Cam
Landis and the ILM team also designed and built what ILM dubbed the “strange cam,” a custom 3D-printed 360 GoPro rig that had to withstand the rigors of being slung off the edge of skyscrapers. ILM wanted to capture 360 degrees of rolling footage from that vantage point to be used as moving background “plates” that could be reflected in New York City’s glass buildings.

VFX, Sound Design and the Hong Kong Sequence
One of the big challenges with the Hong Kong sequence was that time was reversing and moving forward at the same time. “What we had to do was ensure the viewer understands that time is reversing throughout that entire sequence.” During the tight hand-to-hand action moments that are moving forward in time, there’s not really much screen space to show you time reversing in the background. So they designed the reversing destruction sequence to work in concert with the sound design. “We realized we had to move away from a continuous shower of debris toward rhythmic beats of debris being sucked out of frame.”


Bluff says the VFX shot count on the film — 1,450 shots — was actually a lot less than on Captain America: Civil War. From a VFX point of view, the Avengers movies lean on the assets generated in Iron Man and Captain America. The Thor movies help provide the context for what an Avengers movie would look and feel like. In Doctor Strange, “almost everything in the movie had to be designed (from scratch) because it hadn’t already existed in a previous Marvel film. It’s a brand-new character to the Marvel world.”

Bluff started development on the movie in October 2014 and really started doing hands-on work in February 2016, frequently traveling between Vancouver, San Francisco and London. A typical day, working out of the ILM London office, would see him get in early and immediately deal with review requests from San Francisco. Then he would jump into “dailies” in London and work with that team until the afternoon. After “nightlies” with London, there was a “dailies” session with San Francisco and Vancouver; he would work with them until evening, hit the hotel, grab some dinner, come back around 11:30pm or midnight and do nightlies with San Francisco. “It just kept the team together, and we never missed a beat.”

2D vs. IMAX 3D vs. Dolby Cinema
Bluff saw the entire movie for the first time in IMAX 3D, and is looking forward to seeing it in 2D. Considering sequences in the movie are surreal in nature and Escher-like, there’s an argument that suggests that IMAX 3D is a better way to see it because it enhances the already bizarre version of that world. However, he believes the 2D and 3D versions are really “two different experiences.”

Dolby Cinema is the merging of Dolby Atmos — 128-channel surround sound — with the high dynamic range of Dolby Vision, plus really comfortable seats. It is, arguably, the best way to see a movie. Bluff says as far as VFX goes, high dynamic range information has been there for years. “I’m just thankful that exhibition technology is finally catching up with what’s always been there for us on the visual effects side.”

During that Reddit interview, Derrickson commented, “The EDR (Extended Dynamic Range) print is unbelievable — if you’re lucky enough to live where an EDR print is playing. As for 3D and/or IMAX, see it that way if you like that format. If you don’t, see it 2D.”

Doctor Strange is probably currently playing in a theater near you, but go see it in Dolby Cinema if you can.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

Grading & Compositing Storage: Northern Lights

Speed is key for artist Chris Hengeveld.

By Beth Marchant

For Flame artist Chris Hengeveld of Northern Lights in New York City, high-performance file-level storage and a Fibre Channel connection mean it’s never been easier for him to download original source footage and share reference files with editorial on another floor. But Hengeveld still does 80 percent of his work the old-fashioned way: off hand-delivered drives that come in with raw footage from production.

Chris Hengeveld

The bicoastal editorial and finishing facility Northern Lights — parent company to motion graphics house Mr. Wonderful, the audio facility SuperExploder and production boutique Bodega — has an enviably symbiotic relationship with its various divisions. “We’re a small company but can go where we need to go,” says colorist/compositor Hengeveld. “We also help each other out. I do a lot of compositing, and Mr. Wonderful might be able to help me out or an assistant editor here might help me with After Effects work. There’s a lot of spillover between the companies, and I think that’s why we stay busy.”

Hengeveld, who has been with Northern Lights for nine years, uses Flame Premium, Autodesk’s visual effects finishing bundle of Flame and Flare with grading software Lustre. “It lets me do everything from final color work, VFX and compositing to plain-old finishing to get it out of the box and onto the air,” he says. With Northern Lights’ TV-centric work now including a growing cache of Web content, Hengeveld must often grade and finish in parallel. “No matter how you send it out, chances are what you’ve done is going to make it to the Web in some way. We make sure that what looks good on TV also looks good on the Web. It’s often just two different outputs. What looks good on broadcast you often have to goose a bit to get it to look good on the Web. Also, the audio specs are slightly different.”

Hengeveld provided compositing and color on this spot for Speedo.

Editorial workflows typically begin on the floor above Hengeveld in Avid, “and an increasing number, as time goes by, in Adobe Premiere,” he says. Editors are connected to media through a TerraBlock shared storage system from Facilis. “Each room works off a partition from the TerraBlock, though typically with files transcoded from the original footage,” he says. “There’s very little that gets translated from them to me, in terms of clip-based material. But we do have an Aurora RAID from Rorke (now Scale Logic) off which we run a HyperFS SAN — a very high-performance, file-level storage area network — that connects to all the rooms and lets us share material very easily.”

The Avids in editorial at Northern Lights are connected by Gigabit Ethernet, but Hengeveld’s room is connected by Fibre. “I get very fast downloading of whatever I need. That system includes Mr. Wonderful, too, so we can share what we need to, when we need to. But I don’t really share much of the Avid work except for reference files.” For that, he goes back to raw camera footage. “I’d say about 80 percent of the time, I’m pulling that raw shoot material off of G-Technology drives. It’s still sneaker-net for getting those source drives, and I don’t think that’s ever going to change,” he says. “I sometimes get 6TB of footage in for certain jobs, and you’re not going to copy all of that to centrally located storage, especially when you’ll end up using about a hundredth of that material.”

The source drives are typically dupes from the production company, which more often than not is sister company Bodega. “These drives are not made for permanent storage,” he says. “These are transitional drives. But if you’re storing stuff that you want to access in five to six years, it’s really got to go to LTO or some other system.” It’s another reason he’s so committed to Flame and Lustre, he says. Both archive every project locally with its complete media, which can then be easily dropped onto an LTO for safe long-term storage.

Time or money constraints can shift this basic workflow for Hengeveld, who sometimes receives a piece of a project from an editor that has been stripped of its color correction. “In that case, instead of loading in the raw material, I would load in the 15- or 30-second clip that they’ve created and work off of that. The downside with that is if the clip was shot with an adjustable format camera like a Red or Arri RAW, I lose that control. But at least, if they shoot it in Log-C, I still have the ability to have material that has a lot of latitude to work with. It’s not desirable, but for better stuff I almost always go back to the original source material and do a conform. But you sometimes are forced to make concessions, depending on how much time or budget the client has.”
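The latitude Hengeveld mentions comes from the log encoding itself: Log-C packs a wide scene-linear range into the 0..1 code values the grade works with. A rough sketch of the published ARRI LogC (v3, EI 800) curve illustrates the idea — the constants below are quoted from memory for illustration and should be verified against ARRI's own documentation before any production use:

```python
import math

# ARRI ALEXA LogC (v3, EI 800) constants, per ARRI's published formula.
# Quoted from memory for this sketch -- verify against ARRI's LogC
# white paper before relying on them.
CUT, A, B, C, D, E, F = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809

def linear_to_logc(x):
    """Encode scene-linear reflectance x to a LogC code value (0..1)."""
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F  # linear toe below the cut point

def logc_to_linear(t):
    """Decode a LogC code value t back to scene-linear reflectance."""
    if t > E * CUT + F:
        return (10.0 ** ((t - D) / C) - B) / A
    return (t - F) / E
```

An 18% gray card encodes to roughly 0.39 in LogC, leaving generous code-value headroom above it for highlights — which is exactly the latitude that is lost when a colorist receives a clip already baked down to display gamma.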

A recent spot for IZOD, with color by Hengeveld.

Those same constraints, paired with advances in technology, also mean far fewer in-person client meetings. “So much of this stuff is being evaluated on their computer after I’ve done a grade or composite on it,” he says. “I guess they feel more trust with the companies they’re working with. And let’s be honest: when you get into these very detailed composites, it can be like watching paint dry. Yet, many times when I’m grading, I love having a client here because I think the sum of two is always greater than one. I enjoy the interaction. I learn something and I get to know my client better, too. I find out more about their subjectivity and what they like. There’s a lot to be said for it.”

Hengeveld also knows that his clients can often be more efficient at their own offices, especially when handling multiple projects at once, influencing their preferences for virtual meetings. “That’s the reality. There’s good and bad about that trade off. But sometimes, nothing beats an in-person session.”

Our main image is from NBC’s Rokerthon.

Jon Neill joins Axis as head of lighting, rendering, compositing

Axis Animation in Glasgow, Scotland, has added Jon Neill as their new head of lighting, rendering and compositing (LRC). He has previously held senior positions at MPC and Cinesite, working on such projects as Jungle Book, Skyfall and Harry Potter and the Order of the Phoenix.

His role at Axis will be overseeing the LRC team at both the department and project level, providing technical and artistic leadership across multiple projects and managing the day-to-day production needs.

“Jon’s supervisory skills coupled with his knowledge of a diverse range of execution techniques are another step forward in raising the bar in both our short- and long-form projects,” says Graham McKenna, co-founder and head of 3D at Axis.

SGO Mistika now compatible with AJA’s Kona, Corvid

SGO, makers of the color grading/finishing tool Mistika, has partnered with video hardware developer AJA. Mistika is now fully compatible with AJA’s line of Kona and Corvid video capture and playback cards, offering optimized video output. The combination of the latest version of Mistika with the AJA hardware boosts support for extreme video formats, including 4K stereo 3D dual link, even at high frame rates up to 60p.

AJA’s Kona capture, display and mastering products for SD, HD, 3G, Dual Link HD, 2K and 4K are a good match with Mistika, which provides a complete post feature set for projects of any practical resolution and frame rate, even beyond 8K. Stereo 3D output in 4K using the Corvid 88 I/O card is already available, along with viable future 8K capabilities for Mistika Ultima 8K systems.

Blackmagic makes Fusion 8 Studio public beta available, releases Resolve 12.2

Fusion 8 Studio, the full version of Blackmagic’s visual effects and motion graphics software, is available for download for both Mac OS X and Windows. A public beta of the free version of Fusion 8 was released earlier this year at SIGGRAPH. The new Fusion 8 Studio public beta builds upon all of the tools in the free version and adds advanced optical flow tools for retiming, image repair, color smoothing and morphing between different images, along with the ability to render at resolutions larger than Ultra HD.

The Fusion 8 Studio public beta also adds advanced stereoscopic tools for converting 2D shows to 3D, support for third-party plug-ins, remote scripting and Avid Connect, a plug-in that allows customers to use Fusion directly from Media Composer timelines.

Projects created with the free version of Fusion can be opened and finished in Fusion 8 Studio, regardless of which platform they were created on. Fusion 8 Studio also includes Generation — multi-user studio software for managing assets, tracking versions and doing shot-based review and approval.

In addition, Fusion 8 Studio public beta also includes render node software that lets customers install an unlimited number of Fusion render nodes on additional computers for free, saving them thousands of dollars in licensing fees. That means customers working on high-end film and television projects in large multi-user studios can now accelerate their workflow by distributing render jobs across an unlimited number of systems on their network.

Fusion 8 is available in two versions. Fusion 8 Studio, which is now in public beta, will be available for Mac and Windows for $995, with Linux to be released in Q1 2016. Fusion 8 Studio has all of the same features as the free version and adds advanced optical flow image analysis tools for stereoscopic 3D work, retiming and stabilization. Fusion Studio also includes support for third party OpenFX plug-ins, unlimited distributed network rendering and Generation for studio-wide, multi-user collaboration to track, manage, review and approve shots when working with large creative teams on complex projects.

In other news, there is a free DaVinci Resolve 12.2 update that adds support for the latest color science technologies, along with decoding of HEVC/H.265 QuickTime files on OS X, additional high dynamic range features and more. The DaVinci Resolve 12.2 update is available now for both DaVinci Resolve 12 and DaVinci Resolve 12 Studio customers, and can be downloaded from the Blackmagic Design website.

Resolve

Since November’s release of version 12.1, Blackmagic has been adding features pro editors and colorists need, as well as support for the latest formats with expanded color spaces and wide dynamic range. With this DaVinci Resolve 12.2 update, Blackmagic Design continues to improve the software and extend its lead in color, dynamic range and image processing, putting DaVinci Resolve far ahead of other color correction software.

The DaVinci Resolve 12.2 update adds support for the latest Blackmagic and third-party cameras while also delivering significant improvements to DaVinci Resolve color management. Customers get new support for HDR Hybrid Log Gamma, conversion LUTs for Hybrid Log Gamma, ACES IDTs for Canon C300 Mk II clips, and updated ST 2084 HDR color science. That means colorists have even better tools for finishing high dynamic range projects that are going to be distributed to the latest theaters with the latest projection systems like IMAX Laser and Dolby Vision. This also lets customers prepare content that is ready for next generation HDR 4K televisions.

In addition, the DaVinci Resolve 12.2 update adds support for NewBlue Titler Pro titles using Media Composer AAF sequences, improves ProRes 4444 alpha channel support by defaulting to straight blend mode, retains Power Window opacity and invert settings when converting to Power Curve windows and more.

Quick Chat: ‘Mermaids on Mars’ director Jon V. Peters

Athena Studios, a Bay Area production and animation company, has completed work on a short called Mermaids on Mars, which is based on a children’s book and original music by the film’s producer Nancy Guettier. It was directed by Jon V. Peters and features the work of artists whose credits include the stop-motion offerings Coraline, James and the Giant Peach and The Nightmare Before Christmas, as well as many other feature length films.

The film is about a young boy who is magically transported to Mars, where he tries to stop an evil Martian from destroying the last of the planet’s mermaids. The entire story was told with stop-motion animation, which was shot on Athena Studios‘ (@AthenaStudios) soundstage.

The 24-minute film was made up of 300 shots. Many involved complex compositing, putting heavy demands on Athena’s small team of visual effects artists, who were working within a post schedule of just over three months.

Mermaids on Mars

Kat Alioshin (Coraline, The Nightmare Before Christmas, Monkeybone, Corpse Bride) was co-producer of the film, running stages and animation production. Vince De Quattro (Hellboy, Pirates of the Caribbean, Star Wars, Mighty Joe Young) is the film’s digital post production supervisor.

Let’s find out more from Peters who in addition to directing and producing Mermaids on Mars, is also the founder of Athena Studios.

Why did you decide to create Mermaids on Mars as an animated short?
The decision was budget-driven, primarily. We were originally approached by Nancy Guettier, who is the author of the book the film is based on, and one of the film’s producers. She had originally presented us with a feature length script with 12 songs. Given budgetary restrictions, however, we worked with Nancy and her screenwriter, Jarrett Galante, to cut the film down to a 24-minute short that retained five of her original songs.

What are some of the challenges you faced turning a book into an animated short?
The original book is a charming short story that centers more on mermaids conserving water. The first feature-length script had added many other elements, which brought in Martian armies and a much more detailed storyline. The biggest problem we had was trying to simplify the story as much as possible without losing the heart of the material. Because of our budget, we were also limited in the number of puppets and the design of our sets.


Are there wrong ways to go about this?
There are hundreds, perhaps thousands, of ways to approach production on a film like this. The only “wrong” way would have been to ignore the budget. As many other films have shown, limitations (financial or otherwise) can breed creativity. If you walk the budget backward it can help you define your approach. The film’s co-producer, Kat Alioshin, had worked on numerous stop-motion features previously, so she had a good handle on what the cost for each element would be.

Describe your thought process for setting the stage for Mermaids on Mars.
Originally, we looked at doing the entire production as more of a 2D stop-motion down-shooter design, but the producer really wanted 3D characters. We did not have the budget for full sets, however. As we looked at combining a 2D set design with 3D practical stop-motion puppets, it took us all the way back to Georges Méliès, the father of visual effects. He was a stage magician, and his films made use of flats in combination with his actors. We drew inspiration from his work in the design of our production.


While we wanted to shoot as much in-camera as possible we knew that because of the budget we would need to rely almost as much on post production as the production itself. We shot many of the elements individually and then combined them in post. That part of the production was headed up by veteran visual effects artist Vince De Quattro.

What cameras did you use?
Animation was shot on Canon DSLR cameras, usually the 60D, using DragonFrame. The puppeted live-action wave rank shots were done on a Blackmagic Studio Camera in RAW and then graded in DaVinci Resolve to fit with the Canon shots. Live-action shots (for the bookends of the film) were shot on Red Epic cameras.

What was used for compositing and animation?
All compositing was done in Adobe After Effects. There was no 3D animation in the film since it was all practical stop-motion, but the 3D models for the puppet bodies (used for 3D printing and casting) were done in Autodesk Maya.

Was the 2D all hand drawn?
Yes, all 2D was hand drawn and hand painted. We wanted to keep a handmade feel to as many aspects of the film as possible.

How much time did you devote to the set-up and which pieces took the longest to perfect?
It was a fairly quick production for a stop-motion piece. Given the number of stages, shop needs, size of the project and other shoots we had scheduled, we knew we could not shoot it in our main building, so we needed to find another space. We spent a lot of our time looking for the right building, one that met the criteria for the production. Once we found it we had stages set up and running within a week of signing the lease agreement.

Our production designer Tom Proost (Galaxy Quest, Star Wars: The Phantom Menace, Lemony Snicket’s, Coraline) focused on set and prop building of the hero elements, always taking a very “stage-like” approach to each. We had a limited crew, so his team worked first on those pieces that were used in the most shots. The biggest pieces were the waves of the ocean, used on both Earth and Mars, a dock set, the young boy’s bedroom, the mermaid palace, the Martian fortress and a machine called the “siphonator.”


Initial builds and animation took approximately six months, and post production took an equal amount of time.

What was your favorite set to work with, and why?
There were many great sets, but I think the wave set that Tom Proost and his team built was my favorite. It was very much a practical set that had been designed as a raked stage with slots for each of the wave ranks. It was manually puppeted by the crew as they pulled the waves back and forth to create the proper movement. That was filmed and then the post production team composited in the mermaid characters, since they could not be animated within the many wave ranks.

You did the post at Athena?
Twenty-four minutes of film with an average of five composited iterations per shot equates to approximately 300,000 frames processed to final, all completed by Athena’s small team under tight festival deadlines.

IBC: Autodesk to release Extension 1 for Flame 2016 line

Autodesk will soon release Extension 1 for its Flame 2016 family of 3D VFX software, which includes Autodesk Flame, Autodesk Flare, Autodesk Lustre and Autodesk Flame Assist. Inspired by user feedback, Autodesk added workflow improvements, new creative tools and a performance boost. Flame 2016 Extension 1 will be available to subscription customers on September 23.

Highlights of the Flame 2016 Extension 1 release are:
– Connected Conform: A new, unified media management approach to sharing, sorting and syncing media across different sequences for faster finishing in Flame Premium, Flame and Flare. New capabilities include shared sources, source sequence, shots sequence, shared segment syncing and smart replace.
- Advanced Performance: Realtime, GPU-accelerated debayering of Red and ArriRaw source media using high-performance Nvidia K6000 or M6000 graphics cards. The performance boost allows artists to begin work instantly in Flame Premium, Flame, Flare and Lustre.
– GMask Tracer: New to Flame Premium, Flame and Flare, this feature simplifies VFX creation with spline-based shape functionality and a chroma-keying algorithm.
– User-Requested Features: Proxy workflow enhancements, new batch context views, refined cache status, full-screen views, redesigned tools page and more.

Behind the Title: Encore VFX’s Robert Minshall

NAME: Robert Minshall

COMPANY: Encore VFX (@encorepost) in Hollywood.

CAN YOU DESCRIBE WHAT ENCORE DOES?
Encore is a post facility that specializes in the picture finishing of episodic television. This includes dailies, online editing, final color and VFX. Encore is a division of Deluxe Entertainment Services.

WHAT’S YOUR JOB TITLE?
Senior Compositor

WHAT DOES THAT ENTAIL?
I create VFX by combining a variety of 2D and 3D elements in a (mostly) 2D environment.


WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
In my case, it would probably be the things that aren’t included in the title. I have extensive editorial experience dating back to the mid-’80s, and I tap into those skills on a regular basis, especially when working on NCIS (pictured above). In addition to extensive VFX work, I handle all VFX drop-ins into the masters, drop-in inserts, re-conforms and, occasionally, even minor recuts to better accommodate the VFX.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Finishing difficult shots seamlessly. Of course, if I do it right, the average person would have no idea that anything was done to a particular shot, which is the ultimate objective. Client contact, which can be quite extensive, is also a part of the work that I like.

WHAT’S YOUR LEAST FAVORITE?
Working on difficult production fixes that take a lot of time or iteration, with very little payoff.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
That’s a tough question. This is actually my third career. I was an engineer when I got out of college and then spent a number of years pursuing a career in music before ending up in post. At this point, I’d probably be involved in some other aspect of television.

WHY DID YOU CHOOSE THIS PROFESSION?
It’s more like this profession chose me. I graduated with an engineering degree from MIT and moved to LA from Boston to pursue a music career. When that didn’t take off immediately, I found myself working as a materials engineer on the space shuttle program (at which point, I could actually call myself a rocket scientist from MIT).

After a few years, I chose to shift my focus back into music — with moderate success — mostly in the R&B field, working with artists such as Barry White, Deniece Williams, Johnny Nash and the Motown production team of Holland-Dozier-Holland, both recording and touring.

Since the work was sporadic, I also took side jobs to make ends meet, one of which landed me in a video facility at the time when the “video explosion” was the subject of magazine covers. Eventually, I went from wiring a facility to tape-op to editor to senior editor, with extensive visual effects work, to finally the Inferno workstation, where I still am.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
NCIS, NCIS: New Orleans, Under the Dome, Extant, Newsroom, House M.D., Weeds.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Working on the original The X-Files really helped me build a reputation in the industry. I delivered the pilot personally to Chris Carter and was the finish editor on the show for the first four years. After the fourth season, I jumped to the Inferno, which is where I still am. My involvement with such a wildly popular show provided me with an unusually high profile. I also made significant contributions to Ally McBeal, Deadwood and now NCIS, which are obvious points of pride.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My smartphone is indispensable; I use it to stay in constant contact with clients. My PC is also important because I am always emailing QuickTime files to producers for notes and approval, as well as doing various searches relevant to my work. Obviously, my workstation is key — without it, I wouldn’t be doing any of this.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Sometimes I listen to music but other times it gets in the way. I like everything from classical music to ’60s rock and ’70s R&B. I have a penchant for female vocalists, and it’s a great time for them – Rihanna, Katy Perry, Beyonce, Kelly Clarkson, etc., along with a number of more obscure ones. Taylor Swift is also hard to ignore. I enjoy Prince as well and have a soft spot for Sly and the Family Stone. As far as I’m concerned, he pretty much invented funk, which, for a time, was a large part of my life. Folk, blues, guitar-driven rock, even some hip-hop if it’s good. There isn’t much I don’t like.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’ve delivered hundreds of TV shows either the day before or the day of air, so tight deadlines are just part of what I do, and it doesn’t stress me out the way you might think. It’s just the end product of what I do.

Outside of work, I spend time with my wife. We’ve been together for 37 years and I love her as much as I ever did. I enjoy going to the beach and riding waves. I play some golf and try to play at least an hour of music every day, either piano (mostly classical) or guitar. I love to travel when I can. I am never at a loss for things to do.

Free public beta of Fusion 8 now available for Mac and PC

The public beta of the free version of Blackmagic’s Fusion 8, the company’s visual effects and motion graphics software, is now available for download from the Blackmagic Design website. This beta is for the free version of Fusion 8 and is available for both Mac OS X and Windows.

A beta for the paid version, Fusion 8 Studio, which adds stereoscopic 3D tools and is designed for multi-user workgroups and larger studios, will be available shortly. However, current Fusion Studio customers can download the public beta for the free version of Fusion 8 and start using it today.


This public beta is also the first-ever Mac compatible release of Fusion, which was previously a Windows-only product. In addition, projects can be easily moved between Mac and Windows versions of Fusion so customers can work on the platform of their choice.

In the six months since Fusion 8 was launched at NAB there have been many improvements to the user interface — it features a more modern look. There will be many more improvements to the user interface as the Fusion engineering teams continue to work with the visual effects community.

Featuring a node-based interface, Fusion makes it easy to build high-end visual effects compositions very quickly. Nodes are small icons that represent effects, filters and other image processing operations that can be connected together in any order to create unlimited visual effects. Nodes are laid out logically like a flow chart, so customers won’t waste time hunting through nested stacks of confusing layers with filters and effects. With a node-based interface, it’s easy to see and adjust any part of a project in Fusion by clicking on a node.
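As an aside, the pull-based evaluation that a node graph implies (each node asks its inputs for their results, all the way back to the source) can be sketched in a few lines of Python. The names below are purely illustrative and are not Fusion’s actual API:

```python
# Minimal illustration of a node-based compositing graph. Each node
# wraps an operation whose inputs are other nodes; evaluating the
# final node pulls values through the graph like a flow chart.

class Node:
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def evaluate(self):
        return self.op(*(n.evaluate() for n in self.inputs))

# Toy "images" are single brightness values to keep the sketch short.
source = Node(lambda: 0.5)                            # loader
brighten = Node(lambda v: min(v * 1.5, 1.0), source)  # color correct
blur = Node(lambda v: v, brighten)                    # stand-in filter
merge = Node(lambda a, b: (a + b) / 2, blur, source)  # combine branches

print(merge.evaluate())  # 0.625
```

Because each node only knows its own inputs, any stage of the graph can be inspected or re-evaluated on its own, which is exactly why clicking a single node in a node-based compositor shows you that point in the image pipeline.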

With a massive toolset consisting of hundreds of built-in tools, customers can pull keys, track objects, rotoscope, retouch images, animate titles, create amazing particle effects and much more, all in a true 3D workspace. Fusion can also import 3D models, point cloud data, cameras or even entire 3D scenes from Maya, 3ds Max or LightWave and render them seamlessly with other elements. Deep pixel tools can be used to add volumetric fog, lighting and reflection mapping of rendered objects using world position passes so customers can create amazing atmospheric effects that render in seconds, instead of hours.

Fusion has been used on thousands of feature film and television projects, including Thor, Edge of Tomorrow, The Hunger Games trilogy, White House Down, Battlestar Galactica and others.

Killer wasps and an ’80s look for horror flick ‘Stung’

The Rat Pack Films and XYZ Films movie Stung is an homage to the golden age of VHS, featuring the campy look of the 1980s horror genre. It’s proof positive that monster movies still exist in the world of low-budget horror/comedy.

They used an Arri Alexa 2K with Hawk anamorphic lenses to shoot the film. The anamorphic lenses produced a distinctive intensity that the filmmakers felt helped with the strong color definition needed to achieve a 1980s look and feel.

Stung focuses on a fancy garden party gone terribly wrong when a colony of killer wasps mutates into seven-foot tall predators.

German-based freelance colorist Peter Hacker custom-built — from top to bottom — his own PC-based compositing/grading workstation, equipped with an Nvidia GTX 760, an internal hardware RAID for storage and some SSDs for realtime playback. The calibrated NEC LCD monitors are supported by a Sony OLED screen in order to accurately judge the final color grading.

Peter Hacker

Hacker (pictured above) has a strong background in visual effects and compositing, which is why he was hired to be the VFX producer and compositor, as well as colorist — more on that in a minute — for Stung (see the trailer here). He has several years’ experience in color grading, working on numerous commercials for Mercedes, Audi and Fanta; a few indie features; and many shorts.

In collaboration with director Benni Diez and VFX supervisor Sebastian Nozon, he took over the post production management of the movie. He was also in charge of preparing all the footage for the VFX shots and handing it over to the remote animation, rigging, modeling and compositing artists.

He also developed the movie’s look and was involved in the compositing of more than a hundred shots. However, schedule conflicts with the original color grading team required a new plan, and Hacker took on the color grading and finishing of the film as well.

Hacker’s weapon of choice for his post work on Stung was Assimilate Scratch. “As a student I had worked in Scratch at Filmakademie Baden-Württemberg and really dug into fully learning the system. I found it to be the most straightforward tool suite, and I still feel that way. As a freelancer working on a variety of imagery projects, it has all the realtime functions I need — conform, color grading, versioning and finishing – as well as some bells and whistles like VR capability. And it’s now at a price I can personally afford, which means that I, as well as all indie productions, can set up an at-home studio and have a second license on the laptop for hitting the road.”

“For Stung I created different looks during the first two days of grading because I didn’t have LUTs as a reference. It’s easy to create multiple looks for review in Scratch, and those LUTs are now in my archive for possible future use. Then I created a separate Scratch Construct (timeline) with all the movie’s master shots to ensure the look would work and to allow me to track the changes within the story, which were bound to occur due to changes of the seasons and different weather/lighting conditions within a sequence.”
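For readers unfamiliar with the mechanics, a LUT is simply a sampled transfer curve, and the grading software interpolates between the stored entries. A minimal one-dimensional sketch in Python (illustrative only; production “look” LUTs are usually 3D cubes, and this is not how Scratch stores them internally):

```python
# Toy 1D LUT: a handful of output samples spaced evenly across the
# 0..1 input range, with linear interpolation in between. Real look
# LUTs are far larger (and usually 3D), but the principle is the same.

def apply_lut(value, lut):
    """Map a value in [0, 1] through the LUT with linear interpolation."""
    pos = value * (len(lut) - 1)      # fractional index into the table
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A 5-point "lifted shadows" look: blacks raised, highlights untouched.
look = [0.05, 0.30, 0.55, 0.78, 1.00]

print(apply_lut(0.0, look))   # 0.05 (black point lifted)
print(apply_lut(0.5, look))   # 0.55
print(apply_lut(1.0, look))   # 1.0
```

In practice a look like this would be saved to a standard LUT file and applied in realtime on the GPU, which is what makes LUTs such a convenient way to carry a reference grade between sessions.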

Working Remotely, and on a Budget
The horror film genre is synonymous with low-budget production, so there was not a lot of wiggle room, which meant they had to get creative with workflows, especially since they were working and reviewing shots remotely.

The finishing team at work.

The movie was shot in Berlin. Dominik Kattwinkel and Benni Diez edited on Avid Media Composer in Cologne. Also working in Cologne were animators Waldemar and Harry Fast. Sebastian Nozon and Sascha Geddert did the compositing, lighting and rendering in Berlin. “I was doing parts of the compositing and finally graded the entire movie in Ludwigsburg,” explains Hacker. “To make all the data transfer possible among those numerous locations we used BTSync, which kept us all in sync without a hassle.” The Foundry’s Nuke and Adobe After Effects were used for compositing and Autodesk 3ds Max and Maya for 3D animation and rendering.

During editing, the number of visual effects shots increased from 150 to 600. “I had 8TB of storage for the Alexa material and some Red footage from the pick-up shoot. There were 1,600 edits in the film, which runs 84 minutes, so that gives you an idea of the project’s heavy workload — and all while being on a tight budget,” explains Hacker. “To ensure the data’s safety we had back-up RAIDs set up at several locations spread over the country. Furthermore, we separated the data being worked on from the back-ups, and scheduled the day’s work to back up during the night.”

“With a couple weeks left until delivery, the rendered shots (JPEGs, in the end replaced by DPXs) were transferred from Berlin and Cologne to me in Ludwigsburg where I dropped them into the Scratch timeline. With peer-to-peer uploaded previews of the film, or just smaller sequences, we all were continually on the same page.”

They used Skype for review conversations. “Two weeks before delivery we all came together in a small office space in Ludwigsburg to finish the compositing. At that time I switched from compositing to color grading for 12 straight days in a darkened tent in the corner of the room. It was a cheerful time with all of us finally sharing the same space and adding some final touches and even bringing some sequences to life for the first time. For viewing pleasure, I brought in my 55-inch Sony TV for a few relaxed reviews, which also sped up the process and helped to keep the budget in line.”


These sessions included director Benni Diez watching back to back with Hacker. “It was very helpful that he could view and judge the color grading in realtime on a separate monitor without the need to watch over my shoulder all the time,” he says. “It was also crucial for all the VFX shots — Diez and Nozon immediately could discuss how they looked with the grading applied. It’s always a big challenge when it comes to CGI content being integrated into live action back plates. With the different nature of the content, they either fit together even better after the grading is applied, or not. Once in a while we had shots working completely fine in comps, but they got torn apart in the grading. Altogether, it was a magical experience to see all the elements come together right before your eyes, and literally any changes could be made on the fly.”

Hacker says one particularly helpful Scratch feature, which he used a lot on Stung, was the ability to continue working while Scratch was rendering in the background. “That’s a huge timesaver and I wouldn’t like to work without it.”

Chaos Group shows V-Ray for Nuke at SIGGRAPH 2015

Chaos Group’s V-Ray for Nuke is a new tool for lighting and compositing that integrates production-quality ray-traced rendering into The Foundry’s Nuke, NukeX and Nuke Studio products. V-Ray for Nuke enables compositors to take advantage of V-Ray’s lighting, shading and rendering tools inside Nuke’s node-based workflow.

V-Ray for Nuke brings the same technology used on Game of Thrones, Avengers: Age of Ultron, and other film, commercial and television projects to professional compositors.

Built on the same adaptive rendering core as V-Ray’s plugins for Autodesk 3ds Max and Maya, V-Ray for Nuke is designed for production pipelines. V-Ray for Nuke gives compositors the ability to adjust lighting, materials and render elements up to final shot delivery. Full control of 3D scenes in Nuke lets compositors match 2D footage and 3D renders simultaneously, saving time on environment and set extension work. V-Ray for Nuke includes a range of features for rendering and geometry with 36 beauty, matte and utility render elements, as well as effects for lights, cameras, materials and textures.

Review: Blackmagic Design’s Fusion 7.6 Studio

This compositing pro gives us an overview of the newest version.

By Joël Gibbs

Blackmagic made big news with its acquisition of Eyeon last year, and it didn’t take long for the company to rebrand the software and make the tool its own. The acquisition happened in mid-September, and by December 19, Fusion 7.6 Studio was shipping.

Personally, I was pretty excited about the acquisition. Those of us who have been using Fusion for a few years (I’m pretty recent, picking it up about five or six years ago) were worried about the product’s future. Fusion the software was still great, but there were marketing issues, and third-party developers were pulling away from it. Now Blackmagic has put Fusion back in the Continue reading

Making ‘Being Evel’: James Durée walks us through post

Compositing played a huge role in this documentary film.

By Randi Altman

Those of us of a certain age will likely remember being glued to the TV as a child watching Evel Knievel jump his motorcycle over cars and canyons. It felt like the world held its collective breath, hoping that something horrible didn’t happen… or maybe wondering what it would be like if something did.

Well, Johnny Knoxville, of Jackass and Bad Grandpa fame, was one of those kids, as witnessed by, well, his career. Knoxville and Oscar-winning filmmaker Daniel Junge (Saving Face) teamed up to make Being Evel, a documentary on the daredevil’s life and career. Produced by Knoxville’s Dickhouse Productions (yup, that’s right) and HeLo, it premiered at Sundance this year.

Continue reading

Five Adobe After Effects Shortcuts

By Brady Betzel

As an editor, most of my day is spent inside of Avid Media Composer, but occasionally I will get to turn on my Spotify, groove to the music and crank out some Adobe After Effects or Maxon Cinema 4D work. Over the years I’ve found some shortcuts within After Effects that make my job easier, and I wanted to share five of my favorites… from an editor’s perspective.

Double Click in the project window to import an asset
When importing assets into an Adobe After Effects project, I often see people take the archaic route: File > Import. Instead, if you just double-click in the Project window, you will save yourself a few steps. Simple, but I see it all the time.

Tilde key (`) to make full screen
Continue reading

Behind the Title: Company 3 Smoke artist Matthew Johnson

NAME: Matthew Johnson (@MattJ678​)

COMPANY: Los Angeles-based Company 3 (@Company3)

CAN YOU DESCRIBE YOUR COMPANY?
Company 3 provides high-end post services to feature film, commercial, music video and television clients. Our services include all aspects of post production, including color correction, editorial finishing and some visual effects compositing.

WHAT’S YOUR JOB TITLE?
Smoke Artist

WHAT DOES THAT ENTAIL?
Continue reading

King and Country shoots, posts Ford Transit spot for Team Detroit

To help promote the 2015 Ford Transit utility van, agency Team Detroit tapped King and Country (K&C) to produce a 30-second spot, 9 to 5’ers, which mixes 3D animation, design and live-action footage.

K&C’s concept was to illustrate how a variety of professionals can use the Transit models — from contractors to deliverymen to IT specialists, etc. The spot shows how different workers and companies can customize these vans to suit their needs. “You don’t drive to an office, your van is your office,” explains the voiceover.

“Combining live-action and CG allowed for the best coverage of the Transit, inside and out,” explains K&C partner/director Efrain Montanez. “The key to transitioning from scene to scene was keeping the tempo of the pod movements dynamic, which we achieved with a range of zooms and perspective shifts, and evenly proportioned so you seamlessly experience the singular flexibility of the model.”


According to Paul Kirner and Dan Weber, creative directors at Team Detroit, “For us, the key was collaboration. Our commercial was intricate, fast-paced and CG-intense. We needed a partner with the design chops to create something beautiful and real, and the communication skills to make sure every detail was nailed. King and Country worked in perfect sync with us, from early concept boards all the way through final post.”

“By varying the van colors and transforming the interiors in CG, as well as the graphic aesthetics, we were able to express the immense versatility of the Transit for the various occupations featured in the spot,” says Montanez.

The 2D graphics unfold on an orange, black and white palette. These two elements were layered with the flexibility to stand alone or integrate with the 3D world. The K&C team also designed fictional business logos for the different vans.

K&C shot the spot with a Red Epic camera over two days at a soundstage in LA, using both greenscreen and practical sets, including one where a cross-section of a man cave was flooded with 3,000 gallons of water.

Interior and exterior van details, props, and talent were captured in-camera and augmented in CG. Lighting was used to create the realistic look of the vehicles within the varied environments. Rather than studio lighting, K&C used warm natural light, which pops from the graphic background.

Tools used by K&C included Gazelle Motion Control, Autodesk Maya 3D software and Adobe After Effects for compositing.


IBC: Blackmagic acquires eyeon, shows new products

Those of us here in Amsterdam woke up this morning to the news that Blackmagic Design acquired eyeon Software, the Toronto-based maker of high-end digital compositing, VFX and motion graphics software used in features, spots, TV and broadcast. eyeon, founded and run by longtime industry vet Steve Roberts, is now a wholly owned subsidiary of Blackmagic Design.

Fusion 7, being demonstrated at the Blackmagic stand at IBC, has been used on thousands of feature film and television projects. Most recently, Fusion 7 has been used on feature films like Maleficent, Edge of Tomorrow, Sin City: A Dame to Kill For, The Amazing Spider-Man 2, Captain America, Gravity and more.

Continue reading

Quick Chat: Northern Lights’ Chris Hengeveld

By Randi Altman

Last fall I had the pleasure of moderating a panel on color grading during the CCW Show in New York City. The panel was made up of dedicated colorists. One of them, Chris Hengeveld, is more than just a colorist: he also provides compositing and visual effects via a variety of Autodesk tools.

Hengeveld is a post production veteran who started out as an assistant editor back in 1983 at a company called Creative Technology in Akron, Ohio. In 1986, he moved to New York and landed a job as editor at National Video Center, where he began work on an early version of Smoke. While there he continued editing but also started providing graphic compositing for the likes of MTV and VH-1. In 2002 he moved on to Sony Recording Studios, where he was a senior Smoke Continue reading

Review: Adobe Creative Cloud 2014

By Brady Betzel

When I got the call to review the latest release of Adobe Creative Cloud for postPerspective, I almost jumped out of my skin with excitement. I am a big fan of Adobe tools in addition to how they handle their social media and customer outreach. You can submit a feature request and it seems like Adobe addresses it instantly.

About a year ago I asked another company for features, such as support for projects larger than 1920×1080, and there is still no answer. On Twitter you can see @AdobeAE or @AdobePremiere answer technical, support or even feature-request questions. Long story short, they really seem to care about their products and the people who use them.

For this review I’m focusing on the video and motion graphics side of CC — Adobe Premiere and Adobe After Effects — but I will lightly touch on some of the other products like Continue reading

Eyeon updates Fusion to version 7

Eyeon Software has updated its compositing engine to Fusion 7, featuring core updates that significantly increase speed and efficiencies.

Fusion 7’s 3D system and renderer import geometry from FBX and Alembic, as well as OBJ, 3DS and Collada. Millions of polygons, complex Shaders, Ambient Occlusion, Deep Volumetric Atmospherics, Particle Systems and other toolsets are now all final rendered with advanced optimization for GPUs, benchmarking in seconds instead of hours.

Fusion 7 offers the ability to have multiple 3D renderers all in one project, all integrated and rendering different aspects from the same scene. Generating Deep Passes, such as World Position, Normals, UV and Velocity, with Fusion’s flexibility to combine 3D and 2D in a single workflow, is a significant demarcation point from other applications.

Productivity and workflow are streamlined further with automation tools and enhanced rendering. Fusion 7 includes a built-in Render Manager and a scripting engine that supports Python 2.x and 3.x, as well as Lua, with example scripts that ship as part of Fusion 7. Developing Macro tools and managing and sharing tools, jobs and footage are now part of Fusion 7’s design for studio-wide use via the integrated Bin System.

Fusion can have multiple projects open at the same time, with cut and paste abilities between comps adding to the integrated environment design. It is now a matter of seconds to test one comp while rendering another. The Integrated Script Debug Console works for Python and Lua to step through code, set breakpoints, and have multiple scripts open at the same time.

Just-In-Time Compiling for Fuses and OpenCL tools gives the ability to develop sophisticated tools without having a C++ development environment. Immediately compile and use these tools while working in the Fusion comp. These are multithreaded for speed, and OpenCL is GPU accelerated.

Fusion 7 updates and enhancements include:
• Animation Indicators
• Drag and Drop Layout
• User Interface Templates
• Learning Environment
• Multi Projects/Documents
• Connected Node Position and Prediction
• Templates
• Native Camera Support
• Screen Space Ambient Occlusion
• 3D Custom Vertex
• Alembic Import
• Latest FBX Library
• Replace Normals 3D
• 3D Interactive Splines
• 3D Ribbon
• UV Render and Super Sampling
• 3D Text Bevel Shaper
• Dimension – Optical Flow and Stereoscopic Tools
• Just-In-time Compiling
• Script Development Interface
• Linear Light Color/Open Color IO
• Deep Volume Processing
• Roto Onion Skinning


Review: FXHome’s HitFilm 2 Ultimate, HitFilm Plug-ins for indies

By Brady Betzel

Over the past few months, I’ve been hearing a lot of buzz about what FXHome is doing with its products. It interests me because recently I’ve been doing more YouTube-based work and side projects, and many of my industry friends have that budget-strapped “passion project” they are working on.

In addition, we all know that the traditional “editor” role is being superseded by the “editor/VFX/compositor” role, so the more you know — whether you are just starting out or a veteran learning something new — the more valuable you become.

All of this has left me very interested in seeing an offering that combines editing, VFX and compositing in one package. That is where FXhome’s HitFilm 2 Ultimate comes in. And my timing couldn’t have been better. When I contacted them about reviewing their HitFilm…

Continue reading

SGO at NAB with updates to Mistika, Mamba FX

Madrid, Spain — At the NAB Show in Las Vegas, SGO unveiled Version 8 of its new Mistika range: Mistika Post, Mistika Optima and Mistika Ultima. Mistika combines realtime grading, compositing and editing tools in one system.

New Mistika V.8 features include a spatial keyer that allows grading selections or keying mattes to be derived from CGI object metadata; an all-new, intuitive node-based compositing interface; support for Canon’s RMF and Sony’s XAVC formats; enhanced AAF support; ProRes 4:4:4:4 file support; a re-branding output render module; Dolby Atmos DCP generation; and realtime playback of 10-bit DPX RGB 4K at 60p.

SGO also announced that Mistika will support the Precision grading panel made by Digital Vision.

Mistika Air, a version of Mistika tailored for broadcasters working in HD, Ultra HD, 8K and beyond, was also shown. Mistika offers 4K HFR modes at 59p and 60p for the UHD market, supported through the quad-SDI standard, and Mistika Air can now render to Sony XAVC.

Making its first US appearance was Mamba FX, which SGO launched at IBC 2013. It offers unlimited compositing layers and effects, and features a node-based graphical interface. It runs on Linux and Windows and will soon be available on Mac.

Mamba FX can also be extended with OFX plug-ins or additional options from SGO. These options include DCP creation and access to SGO’s stereo correction tools.

Mamba FX incorporates a variety of realistic effects and filters, including SGO’s “optical flow” technology. GenArts Sapphire 7 support was recently added, offering compositors more than 250 effects and presets.

FuseFX strengthens VFX pipeline, preps for end-to-end post

Burbank — FuseFX is busy around the clock these days, providing visual effects for television, film and commercials. Its credit list is impressive: the company is the primary effects house for Disney/Marvel’s Agents of S.H.I.E.L.D.; it worked on all three seasons of American Horror Story; and it provided effects for Hell on Wheels (for which it is nominated in this year’s VES Awards for Outstanding Created Environment), Criminal Minds and Glee, to name a few.

Continue reading

Digital Film Tools: Composite Suite Pro

By Brady Betzel

Editing reality television often forces an editor to wear multiple hats. Most importantly, they edit, but increasingly they are also being asked to provide basic as well as advanced effects. Sometimes they create those effects within their NLE, and sometimes they are asked to work in After Effects, Motion, etc.

Continue reading

SGO offers free open Mamba FX beta

MADRID – SGO’s compositing software Mamba FX, which was introduced at IBC last month, is now available as a free beta for evaluation via the company’s website, www.sgo.es/shop. Mamba FX, which will ultimately cost $299 US, can be used by independents as well as TV and studio productions. It runs under Windows and is available in a variety of PC configurations.

As well as offering an entire visual effects suite with keying, tracking, painting and restoration tools, Mamba FX can also extend its feature set as a fully OFX-compliant plug-in host. Using a new, intuitive node-based graphic interface, SGO reports that Mamba FX offers unlimited compositing layers and effects. Its compositing “trees” also generate plain-text files describing the chain of processes, which can be scripted and manipulated to automate functions and workflows. In addition, Mamba FX can run other SGO feature options (at additional cost), including its stereo 3D toolset for shot-by-shot corrections and DCP creation.
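SGO hasn’t published the format of those text files, but the automation idea is easy to illustrate. Below is a minimal Python sketch using a hypothetical one-process-per-line comp description — the node names, parameters and syntax are invented for illustration, not Mamba FX’s actual format. It retargets a comp tree from one source clip to another, the kind of batch change a plain-text comp format makes trivial:

```python
def retarget_comp(comp_text, old_clip, new_clip):
    """Rewrite every source-clip reference in a plain-text comp tree.

    Assumes a hypothetical format in which each line describes one
    process in the chain (node name, operation, parameters).
    """
    # Because the tree is plain text, a simple substitution is enough;
    # a real pipeline script might parse each line and edit parameters.
    return comp_text.replace(old_clip, new_clip)


# A tiny invented comp tree: load a clip, key it, composite over a plate.
comp = """Loader1 load clip=shot010_v1.dpx
Key1 chromakey tolerance=0.2
Comp1 over background=bg_plate.dpx
"""

updated = retarget_comp(comp, "shot010_v1.dpx", "shot010_v2.dpx")
print(updated.splitlines()[0])  # → Loader1 load clip=shot010_v2.dpx
```

Because the whole chain of processes lives in a text file, the same approach scales to re-versioning hundreds of shots with a few lines of script, which appears to be the kind of workflow automation SGO is describing.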


The same software optimization used in SGO’s Mistika product exists inside Mamba FX, allowing effects to be constructed and reviewed directly in realtime or processed at fast rendering speeds. According to the company, this is possible thanks to SGO’s experience super-charging its algorithms through efficient programming and extensive use of Nvidia GPUs.