Review: Blackmagic Resolve 14

By David Cox

Blackmagic has released Version 14 of its popular DaVinci Resolve “color grading” suite, following a period of open public beta development. I put color grading in quotes because one of the most interesting aspects of the V14 release is how far-reaching Resolve’s ambitions have become, beyond simply color grading.

Fairlight audio within Resolve.

Prior to being purchased by Blackmagic, DaVinci Resolve was one of a small group of high-end color grading systems being offered in the industry. Blackmagic then extended the product to include editing, and Version 14 offers several updates in this area, particularly around speed and fluidity of use. A surprise addition is the incorporation of Fairlight Audio — a full-featured audio mixing platform capable of producing feature film quality 3D soundscapes. It is not just an external plugin, but an integrated part of the software.

This review concentrates on the color finishing aspects of Resolve 14, and on first view the core color tools remain largely unchanged save for a handful of ergonomic improvements. This is not surprising given that Resolve is already a mature grading product. However, Blackmagic has added some very interesting tools and features clearly aimed at enabling colorists to broaden their creative control. I have been a long-time advocate of the idea that a colorist doesn’t change the color of a sequence, but changes the mood of it. Manipulating the color is just one path to that result, so I am happy to see more creatively expansive facilities being added.

Face Refinement
One new feature that epitomizes Blackmagic’s development direction is the Face Refinement tool. It provides features to “beautify” a face and underlines two interesting development points. Firstly, it shows an intention by the developers to create a platform that allows users to extend their creative control across the traditional borders of “color” and “VFX.”

Secondly, such a feature incorporates more advanced programming techniques that seek to recognize objects in the scene. Traditional color and keying tools simply replace one color with another, without “understanding” what objects those colors are attached to. This next step toward a more intelligent diagnosis of scene content will lead to some exciting tools, and Blackmagic has started off with face-feature tracking.

The Face Refinement function works extremely well where it recognizes a face. There is no manual intervention — the tool simply finds a face in the shot and tracks all its constituent parts (eyes, lips, etc). Where more than one face is detected, the system offers a simple box selector so the user can specify which face to track. Once the analysis is complete, the user has a variety of simple sliders to control the smoothness, color and detail of the face overall, as well as specific controls for the forehead, cheeks, chin, lips, eyes and the areas around and below the eyes.

I found the face de-shine function particularly successful. A light touch with the controls yields pleasing results very quickly. A heavy touch is what you need if you want to make someone look like an android. I liked the fact that you can go negative with some controls and make a face look more haggard!

In my tests, the facial tracking was very effective for properly framed faces, even those with exaggerated expressions, headshakes and so on. But it would fail where the face became partially obscured, such as when the camera panned off the face. This led to all the added improvements popping off mid shot. While the fully automatic operation makes it quick and simple to use, it affords no opportunity for the user to intervene and assist the facial tracking if it fails. All things considered though, this will be a big help and time saver for the majority of beauty work shots.

Resolve FX
New for Resolve 14 is a large set of built-in, GPU-accelerated effects called Resolve FX, which can be added in the edit “page” directly to clips, or in the color page attached to nodes. They are categorized into Blurs, Light, Color, Refine, Repair, Stylize, Texture and Warp. A few particularly caught my eye. In “Color,” the color compressor pulls nearby colors toward a central hue, which is handy for unifying the colors of an unevenly lit client logo into their precise brand reference, or for dealing with blotchy skin. There is also a color space transform tool that enables LUT-less conversion between all the major color “spaces.”
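To make the color compressor idea concrete, here is a generic Python sketch of the concept (my own simplified illustration, not Resolve's actual implementation; the function names and parameters are hypothetical): hues within a tolerance of a chosen center are pulled toward that center, and everything else is left untouched.

```python
import colorsys

def compress_hue(h, center=0.33, tolerance=0.10, strength=0.8):
    """Pull a hue (0..1) lying within `tolerance` of `center` toward the
    center by `strength` (0 = no change, 1 = full snap to center)."""
    # signed hue distance on the circular hue wheel, in -0.5..0.5
    d = (h - center + 0.5) % 1.0 - 0.5
    if abs(d) <= tolerance:
        h = (center + d * (1.0 - strength)) % 1.0
    return h

def compress_rgb(rgb, center=0.33, tolerance=0.10, strength=0.8):
    """Apply the hue compression to one (r, g, b) pixel in 0..1."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(compress_hue(h, center, tolerance, strength), s, v)
```

Run per pixel, this nudges all the slightly-off greens of a logo toward one reference green while leaving the rest of the frame alone.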

The dehaze function derives a depth map by some mysterious magic to help improve contrast over distance. The “Light” collection includes a decent lens flare that allows plenty of customizing. “Stylize” creates watercolor and outline looks, while Texture includes a film grain effect with several film-gauge presets. I liked the implementation of the new Warp function. Rather than using grids or splines, the user simply places “pins” in the image to drag certain areas around. Shift-adding a pin defines a locked position immune from dragging. All simple, intuitive and realtime, or close to it.

Multi-Skilled and Collaborative Workflows
A dilemma for the Resolve developers is likely to be where to draw the line between editing, color and VFX. Blackmagic also develops Fusion, so the advanced end of VFX is covered. But in the middle are editors who want to make funky transitions and title sequences, and colorists who use more effects, mattes and tracking. Resolve runs out of ability in these areas quite quickly, which forces the more adventurous editor or colorist into the alien environment of Fusion. The new features help here, but a few further additions, such as better keyframing of effects and an easier way to reference other timeline layers in the node panel, would extend Resolve’s ability to handle many common VFX-ish demands.

Some have criticized Blackmagic for turning Resolve into a multi-discipline platform, suggesting that this will create an industry of “jacks of all trades and masters of none.” I disagree with this view for several reasons. Firstly, if an artist wants to major in a specific discipline, having a platform that can do more does not impede them. Secondly, I think the majority of content (if you include YouTube, etc.) is created by a single person or small teams, so the growth of multi-skilled post production people is simply an inevitable and logical progression, which Blackmagic is sensibly addressing.

But for professional users within larger organisations, the cross-discipline features of Resolve take on a different meaning when viewed in the context of “collaboration.” Resolve 14 permits editors to edit, colorists to color and sound mixers to mix, all using different installations of the same platform, sharing the same media and contributing to the same project, even the same timeline. On the face of it, this promises to remove “conforms” and eradicate wasteful import/export processes and frustrating compatibility issues, while enabling parallel workflows across editing, color grading and audio.

For fast-turnaround projects, or projects where client approval cannot be sought until the project progresses beyond a “rough” stage, the potential advantages are compelling. Of course, the minor hurdle to get over will be to persuade editors and audio mixers to adopt Resolve as their chosen weapon. If they do, Blackmagic might well be on the way to providing collaborative utopia.

Summing Up
Resolve 14 is a massive upgrade from Resolve 12 (there wasn’t a Resolve 13 — who would have thought that a company called Blackmagic might be superstitious?). It provides a substantial broadening of ability that will suit multi-skilled smaller outfits, while also fitting as a grading/finishing platform and collaborative backbone in larger installations.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Review: Blackmagic’s Fusion 9

By David Cox

At Siggraph in August, Blackmagic Design released a new version of its compositing software Fusion. For those not familiar with Fusion, it is a highly flexible node-based compositor that can composite in 2D and 3D spaces. Its closest competitor is Nuke from The Foundry.

The raft of new updates in Version 9 falls into two areas: features created in response to user requests, and a set of tools for VR. Also announced with the new release is a price drop to $299 for the full Studio version, which, judging by global resellers instantly running out of stock (Fusion ships via dongle), seems to have been a popular move!

As with other manufacturers in the film and broadcast area, the term “VR” is a little misused as they are really referring to “360 video.” VR, although a more exciting term, would demand interactivity. That said, as a post production suite for 360 video, Fusion already has a very strong tool set. It can create, manipulate, texture and light 3D scenes made from imported CGI models and built-in primitives and particles.

Added in Version 9 is a spherical camera that can capture a scene as a 360 2D or stereo 3D image. In addition, new tools are provided to cross-convert between many 360 video image formats. Another useful tool allows a portion of a 360-degree image to be unwrapped (or un-distorted) so that restoration or compositing work can be easily carried out on it before it is perfectly re-wrapped back into the 360-degree image.
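The unwrap/re-wrap workflow rests on the standard equirectangular (lat-long) mapping between 360 image coordinates and 3D view directions; once a region's directions are known, it can be re-projected flat, fixed, and projected back. A minimal Python sketch of that mapping (the standard math, not Fusion's code; function names are mine):

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular coords (u, v in 0..1) to a unit
    3D view direction. u spans longitude -pi..pi and v spans latitude
    +pi/2 (top of frame) to -pi/2 (bottom)."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (0.5 - v) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

def direction_to_equirect(x, y, z):
    """Inverse mapping: a unit direction back to (u, v)."""
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return (lon / (2.0 * math.pi) + 0.5, 0.5 - lat / math.pi)
```

The center of the frame maps to the straight-ahead direction, and the round trip is lossless, which is what lets repair work be re-wrapped into the 360 image without visible seams.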

There is also a new stabilizer for 360 wrap-around shots. A neat feature is that Fusion 9 can directly drive VR headsets such as Oculus Rift. Within Fusion, any node can be routed to any viewing monitor and the VR headset simply presents itself as an extra one of those.

Notably, Blackmagic has opted not to tackle 360-degree image stitching — the process by which images from multiple cameras facing in different directions are “stitched” together to form a single wrap-around view. I can understand this — on one hand, there are numerous free or cheap apps that perform stitching and so there’s no need for Blackmagic to reinvent that wheel. On the other hand, Blackmagic targets the mass user area, and given that 360 video production is a niche activity, productions that strap together multiple cameras form an even smaller and decreasing niche due to the growing number of single-step 360-degree cameras that provide complete wrap-around images without the need for stitching.

Moving on from VR/360, Fusion 9 now boasts some very significant additional features. While some Fusion users had expressed concern that Blackmagic was favoring Resolve, it is now clear that the Fusion development team has been very busy indeed.

Camera Tracker
First up is an embedded camera tracker and solver. Such a facility aims to deduce how the original camera in a live-action shoot moved through the scene and what lens must have been on it. From this, a camera tracker produces a virtual 3D scene into which a compositor can add objects that then move precisely with the original shot.

Fusion 9’s new camera tracker performed well in tests. It requires the user to break the process down into three logical steps: track, refine and export. Fusion initially offers auto-placed trackers, which follow scores of details in the scene quite quickly. The operator then removes any obviously silly trackers (like the ones chasing around the moving people in a scene) and sets Fusion about the task of “solving” the camera move.

Once done, Fusion presents a number of features to allow the user to measure the accuracy of the resulting track and to locate and remove trackers that are adversely affecting that result. This is a circular process by which the user can incrementally improve the track. The final track is then converted into a 3D scene with a virtual camera and a point cloud to show where the trackers would exist in 3D space. A ground plane is also provided, which the user can locate during the tracking process.
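The measure-and-refine loop described above revolves around reprojection error: project the solved 3D points through the solved camera and see how far they land from the observed 2D tracks. A simplified Python sketch of that metric (assuming, for brevity, a camera with identity rotation; the function names are mine, not Fusion's):

```python
import math

def reproject(point3d, cam_pos, focal):
    """Pinhole projection of a 3D point to 2D image coordinates,
    for a camera at cam_pos looking down +z with no rotation."""
    x, y, z = (p - c for p, c in zip(point3d, cam_pos))
    return (focal * x / z, focal * y / z)

def rms_reprojection_error(points3d, tracks2d, cam_pos, focal):
    """RMS pixel distance between projected solve points and the
    observed 2D tracker positions: the 'solve error' a tracker reports."""
    total = 0.0
    for p3, (tx, ty) in zip(points3d, tracks2d):
        px, py = reproject(p3, cam_pos, focal)
        total += (px - tx) ** 2 + (py - ty) ** 2
    return math.sqrt(total / len(points3d))
```

Deleting the trackers that contribute the largest individual errors and re-solving is exactly the incremental improvement loop the tool guides you through.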

While Fusion 9’s camera tracker perhaps doesn’t have all the features of a dedicated 3D tracker such as SynthEyes from Andersson Technologies, it does satisfy the core need and has plenty of controls to ensure that the tool is flexible enough to deal with most scenarios. It will certainly be received as a welcome addition.

Planar Tracker
Next up is a built-in “planar” tracker. Planar trackers work differently than classic point trackers, which simply try to follow a small area of detail. A planar tracker follows a larger area of a shot, which makes up a flat plane — such as a wall or table top. From this, the planar tracker can deduce rotation, location, scale and perspective.
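Under the hood, a planar track amounts to a 3x3 homography per frame, and applying one to a point (with the perspective divide that gives planar tracking its perspective-correct results) is straightforward. A generic sketch of that mapping, not Fusion's internals:

```python
def apply_homography(h, point):
    """Map a 2D point through a 3x3 homography matrix `h`, the kind of
    per-frame transform a planar tracker produces. The divide by w is
    the perspective divide that handles foreshortening."""
    x, y = point
    xh = h[0][0] * x + h[0][1] * y + h[0][2]
    yh = h[1][0] * x + h[1][1] * y + h[1][2]
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return (xh / w, yh / w)
```

Mapping the four corners of an insert image through the frame's homography is corner pinning; mapping a roto shape's control points through it is what a Planar Transform node does.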

Fusion 9 Studio’s new planar tracker also performed well in tests. It assessed the track quickly and was not easily upset by foreground objects obscuring parts of the tracked area. The resulting track can either be used directly to insert another image into the resulting plane or to stabilize the shot, or indirectly by producing a separate Planar Transform node. This is used to warp any other asset such as a matte for rotoscoping work.

Inevitably, any planar tracker will be compared to the long-established “daddy” of them all, Mocha Pro from Boris FX. At a basic level, Fusion’s planar tracker worked just as well as Mocha, creating solid tracks from a user-defined area nicely and quickly. However, I would think that for complex rotoscoping, where a user will have many roto layers, driven by many tracking sources, with other layers acting as occlusion masks, Mocha’s working environment would be easier to control. Such a task would lead to many, many wired-up nodes in Fusion, whereas Mocha would present the same functions within a simpler layer list. Of course, Mocha Pro is available as an OFX plug-in for Fusion Studio anyway, so users can have the best of both worlds.

Delta Keyer
Blackmagic also added a new keyer to Fusion called the Delta Keyer. It is a color difference keyer with a wide range of controls to refine the resulting matte and the edges of the key. It worked well when tested against one of my horrible greenscreens, something I keep for these very occasions!

The Delta Keyer can also take a clean plate as a reference input, which is essentially a frame of the green/bluescreen studio without the object to be keyed. The Delta Keyer then uses this to understand which deviations from the screen color represent the foreground object and which are just part of an uneven screen color.
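The core idea of a color-difference key, and how a clean plate changes it, can be sketched generically. This is my own simplified illustration, not the Delta Keyer's actual math; real keyers add edge processing, spill suppression and many more controls.

```python
def key_alpha(pixel, clean_plate=None, gain=1.0):
    """Matte value for one (r, g, b in 0..1) pixel: 0 = background,
    1 = foreground. Without a plate, a plain color-difference key:
    strong green spill (g - max(r, b)) means greenscreen. With a clean
    plate, key on the pixel's deviation from the plate instead, which
    absorbs uneven screen lighting."""
    r, g, b = pixel
    if clean_plate is None:
        alpha = 1.0 - (g - max(r, b)) * gain
    else:
        dev = sum(abs(p - c) for p, c in zip(pixel, clean_plate))
        alpha = dev * gain
    return max(0.0, min(1.0, alpha))
```

With the plate supplied, a dim or blotchy patch of screen keys out just as cleanly as a bright one, because the reference at that position is dim or blotchy too; that is what preserves subtle shadows cast on an uneven screen.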

To assist with this process, there is also a new Clean Plate node, which is designed to create an estimate of a clean plate in the absence of one being available from the shoot (for example, if the camera was moving). The combination of the clean plate and the Delta Keyer produced good results when challenged to extract subtle object shadows from an unevenly lit greenscreen shot.

Studio Player
Studio Player is also new for Fusion 9 Studio; it’s a multi-station shot review tool. Multiple versions of clips and comps can be added to the Studio Player’s single layer timeline, where simple color adjustments and notes can be added. A neat feature is that multiple studio players in different locations can be slaved together so that cross-facility review sessions can take place, with everyone looking at the same thing at the same time, which helps!

Fusion 9 Studio also supports the writing of Apple-approved ProRes from all its supported platforms, including Windows and Linux. Yep – you read that right. Other format support has also been widened and improved, including faster native handling of DNxHR codecs.

Summing Up
All in all, the updates to Fusion 9 are comprehensive and very much in line with what professional users have been asking for. I think it certainly demonstrates that Blackmagic is as committed to Fusion as it is to Resolve, and at $299, it’s a no-brainer for any professional VFX artist to have available to them.

Of course, the price drop shows that Blackmagic is also aiming Fusion squarely at the mass independent filmmaker market. Certainly, with Resolve and Fusion, those users will have pretty much all the post tools they will need.

Fusion by its nature and heritage is a more complex beast to learn than Resolve, but it is well supported with a good user manual, forums and video tutorials. I would think it likely that for this market, Fusion might benefit from some minor tweaks to make it more intuitive in certain areas. I also think the join between Resolve and Fusion will provide a lot of interest going forward for this market. Adobe has done a masterful job bridging Premiere and After Effects. The join between Resolve and Fusion is more rudimentary, but if Blackmagic gets this right, they will have a killer combination.

Finally, Fusion 9 extends what was already a very powerful and comprehensive compositing suite. It has become my primary compositing device and the additions in version 9 only serve to cement that position.


David Cox is a VFX compositor and colorist with 20-plus years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full body motion sensors and 4D/AR experiences.

Creating a deeper talent pool: training for Mistika, Mamba FX

Now that SGO Mistika systems are being installed here in the US, postPerspective thought it would make sense to find out what the company is doing about training artists so studios that have invested in the product have a deeper pool of talent to pick from when the need arises.

With that in mind, we reached out to David Cox, who has been helping SGO with their training efforts. Here is his take on the subject.

By David Cox

An interesting challenge for any manufacturer that aspires to bring a new product to market — or a different way of thinking to an existing market — is how to cultivate an extensive user-base by training enough individuals to allow their technology to take hold.
