Tag Archives: Blackmagic

Review: Blackmagic’s Fusion 9

By David Cox

At Siggraph in August, Blackmagic Design released a new version of its compositing software Fusion. For those not familiar with Fusion, it is a highly flexible node-based compositor that can composite in 2D and 3D spaces. Its closest competitor is Nuke from The Foundry.

The raft of new updates in Version 9 falls into two areas: features created in response to user requests, and a set of tools for VR. Also announced with the new release is a price drop to $299 for the full Studio version, which, judging by global resellers instantly running out of stock (Fusion ships via dongle), seems to have been a popular move!

As with other manufacturers in the film and broadcast space, the term “VR” is a little misused here, as what is really meant is “360 video.” VR, although a more exciting term, would demand interactivity. That said, as a post production suite for 360 video, Fusion already has a very strong tool set. It can create, manipulate, texture and light 3D scenes made from imported CGI models, built-in primitives and particles.

Added in Version 9 is a spherical camera that can capture a scene as a 360 2D or stereo 3D image. In addition, new tools are provided to cross-convert between many 360 video image formats. Another useful tool allows a portion of a 360-degree image to be unwrapped (or un-distorted) so that restoration or compositing work can be easily carried out on it before it is perfectly re-wrapped back into the 360-degree image.
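For readers curious about what the unwrap step involves, the underlying math is the standard equirectangular-to-rectilinear projection. The sketch below is a generic illustration of that idea, not Fusion's code; it assumes NumPy, a lat/long (equirectangular) frame stored as an array, and hypothetical function and parameter names.

```python
# A minimal sketch of unwrapping a rectilinear view from an equirectangular
# 360 frame (generic lat/long math, not Fusion's code). `equi` is assumed to
# be a NumPy image array; fov/yaw/pitch are illustrative parameters.
import numpy as np

def unwrap_view(equi, out_w=1280, out_h=720, fov_deg=90.0, yaw=0.0, pitch=0.0):
    h, w = equi.shape[:2]
    f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)  # virtual pinhole focal length

    # Build a ray through every pixel of the virtual rectilinear camera.
    x = np.arange(out_w) - out_w / 2
    y = np.arange(out_h) - out_h / 2
    xv, yv = np.meshgrid(x, y)
    rays = np.stack([xv, yv, np.full_like(xv, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Point the virtual camera: yaw around Y, pitch around X.
    cy, sy = np.cos(np.radians(yaw)), np.sin(np.radians(yaw))
    cp, sp = np.cos(np.radians(pitch)), np.sin(np.radians(pitch))
    rot_y = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rot_x = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rays = rays @ (rot_y @ rot_x).T

    # Convert each ray to longitude/latitude, then to source pixel coordinates.
    lon = np.arctan2(rays[..., 0], rays[..., 2])        # -pi .. pi
    lat = np.arcsin(np.clip(rays[..., 1], -1.0, 1.0))   # -pi/2 .. pi/2
    u = ((lon / np.pi + 1) * 0.5 * (w - 1)).astype(int)
    v = ((lat / (np.pi / 2) + 1) * 0.5 * (h - 1)).astype(int)
    return equi[v, u]  # nearest-neighbor sample of the unwrapped view
```

The re-wrap is simply the inverse mapping: each pixel of the repaired patch is sent back through the same longitude/latitude math to find where it lands in the equirectangular frame.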

There is also a new stabilizer for 360 wrap-around shots. A neat feature is that Fusion 9 can directly drive VR headsets such as Oculus Rift. Within Fusion, any node can be routed to any viewing monitor and the VR headset simply presents itself as an extra one of those.

Notably, Blackmagic has opted not to tackle 360-degree image stitching — the process by which images from multiple cameras facing in different directions are “stitched” together to form a single wrap-around view. I can understand this: on one hand, there are numerous free or cheap apps that perform stitching, so there’s no need for Blackmagic to reinvent that wheel. On the other hand, Blackmagic targets the mass user market, and given that 360 video production is already a niche activity, productions that strap together multiple cameras are an even smaller and shrinking niche, thanks to the growing number of single-step 360-degree cameras that deliver complete wrap-around images without the need for stitching.

Moving on from VR/360, Fusion 9 now boasts some very significant additional features. While some Fusion users had expressed concern that Blackmagic was favoring Resolve, it is now clear that the Fusion development team has been very busy indeed.

Camera Tracker
First up is an embedded camera tracker and solver. Such a facility aims to deduce how the original camera in a live-action shoot moved through the scene and what lens must have been on it. From this, a camera tracker produces a virtual 3D scene into which a compositor can add objects that then move precisely with the original shot.

Fusion 9’s new camera tracker performed well in tests. It requires the user to break the process down into three logical steps: track, refine and export. Fusion initially offers auto-placed trackers, which follow scores of details in the scene quite quickly. The operator then removes any obviously silly trackers (like the ones chasing around the moving people in a scene) and sets Fusion about the task of “solving” the camera move.

Once done, Fusion presents a number of features to allow the user to measure the accuracy of the resulting track and to locate and remove trackers that are adversely affecting that result. This is a circular process by which the user can incrementally improve the track. The final track is then converted into a 3D scene with a virtual camera and a point cloud to show where the trackers would exist in 3D space. A ground plane is also provided, which the user can locate during the tracking process.
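For the curious, the core of a camera solve can be sketched in a few lines. The example below is not Fusion's solver (which works across many frames and refines the result as described above); it is a minimal two-frame illustration using OpenCV, with assumed inputs and made-up names, showing how matched 2D tracker positions yield a relative camera rotation and translation.

```python
# A minimal two-frame sketch of a camera solve (not Fusion's multi-frame
# solver). Assumes OpenCV and NumPy; pts_a and pts_b are (N, 2) float arrays
# of the same trackers' positions in two frames, and focal_px is a guess at
# the lens focal length in pixels.
import numpy as np
import cv2

def solve_two_frames(pts_a, pts_b, image_w, image_h, focal_px):
    # Assumed camera intrinsics: guessed focal length, centered principal point.
    K = np.array([[focal_px, 0.0, image_w / 2],
                  [0.0, focal_px, image_h / 2],
                  [0.0, 0.0, 1.0]])

    # RANSAC rejects outlier trackers (the ones stuck to moving people).
    E, inliers = cv2.findEssentialMat(pts_a, pts_b, K, cv2.RANSAC, 0.999, 1.0)

    # Decompose into rotation and translation direction (overall scale is unknown).
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
    return R, t, inliers.ravel().astype(bool)
```

Triangulating the surviving inlier tracks from the recovered poses is what produces the kind of 3D point cloud Fusion displays alongside the virtual camera.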

While Fusion 9’s camera tracker perhaps doesn’t have all the features of a dedicated 3D tracker such as SynthEyes from Andersson Technologies, it does satisfy the core need and has plenty of controls to ensure that the tool is flexible enough to deal with most scenarios. It will certainly be received as a welcome addition.

Planar Tracker
Next up is a built-in “planar” tracker. Planar trackers work differently than classic point trackers, which simply try to follow a small area of detail. A planar tracker follows a larger area of the shot that forms a flat plane, such as a wall or a table top. From this, the planar tracker can deduce rotation, location, scale and perspective.
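Under the hood, that flat-plane assumption means the tracker is estimating a homography: a single 3x3 matrix that encodes all of those transforms at once. Here is a minimal sketch of the idea using generic OpenCV/NumPy calls, not Fusion's implementation; the point arrays are assumed inputs.

```python
# A minimal sketch of the planar-tracking idea using a homography (generic
# OpenCV, not Fusion's implementation). ref_pts and cur_pts are assumed
# (N, 2) float arrays of matched feature positions on the tracked plane.
import cv2

def planar_transform(ref_pts, cur_pts):
    # One 3x3 matrix captures translation, rotation, scale and perspective.
    H, inliers = cv2.findHomography(ref_pts, cur_pts, cv2.RANSAC, 3.0)
    return H

# The same matrix can then warp an insert (a replacement sign, a roto matte)
# into place for the current frame:
#   warped = cv2.warpPerspective(insert_image, H, (frame_w, frame_h))
```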

Fusion 9 Studio’s new planar tracker also performed well in tests. It assessed the track quickly and was not easily upset by foreground objects obscuring parts of the tracked area. The resulting track can be used directly, to insert another image into the tracked plane or to stabilize the shot, or indirectly, by producing a separate Planar Transform node that warps any other asset, such as a matte for rotoscoping work.

Inevitably, any planar tracker will be compared to the long-established “daddy” of them all, Mocha Pro from Boris FX. At a basic level, Fusion’s planar tracker worked just as well as Mocha, creating solid tracks from a user-defined area nicely and quickly. However, I would think that for complex rotoscoping, where a user will have many roto layers, driven by many tracking sources, with other layers acting as occlusion masks, Mocha’s working environment would be easier to control. Such a task would lead to many, many wired-up nodes in Fusion, whereas Mocha would present the same functions within a simpler layer list. Of course, Mocha Pro is available as an OFX plug-in for Fusion Studio anyway, so users can have the best of both worlds.

Delta Keyer
Blackmagic also added a new keyer to Fusion called the Delta Keyer. It is a color difference keyer with a wide range of controls to refine the resulting matte and the edges of the key. It worked well when tested against one of my horrible greenscreens, something I keep for these very occasions!

The Delta Keyer can also take a clean plate as a reference input, which is essentially a frame of the green/bluescreen studio without the object to be keyed. The Delta Keyer then uses this to understand which deviations from the screen color represent the foreground object and which are just part of an uneven screen color.

To assist with this process, there is also a new Clean Plate node, which is designed to create an estimate of a clean plate in the absence of one being available from the shoot (for example, if the camera was moving). The combination of the clean plate and the Delta Keyer produced good results when challenged to extract subtle object shadows from an unevenly lit greenscreen shot.
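The general technique behind this kind of keyer is easy to sketch. The following is the textbook color-difference approach, not Blackmagic's actual Delta Keyer algorithm; it assumes float RGB NumPy arrays and a hypothetical gain parameter.

```python
# A sketch of the textbook color-difference key, with and without a clean
# plate (the generic technique, not Blackmagic's Delta Keyer algorithm).
# Assumes float RGB NumPy arrays in the 0-1 range; `gain` is illustrative.
import numpy as np

def color_difference_matte(frame, clean_plate=None, gain=1.0):
    def screen_strength(img):
        # How strongly "greenscreen" a pixel reads: green minus max(red, blue).
        return img[..., 1] - np.maximum(img[..., 0], img[..., 2])

    shot = screen_strength(frame)
    if clean_plate is not None:
        # Normalize by the clean plate so an unevenly lit screen still reads
        # as full backing wherever the shot matches the empty plate.
        shot = shot / np.maximum(screen_strength(clean_plate), 1e-4)

    # Alpha: 1.0 over foreground, 0.0 over the backing, in between on shadows.
    return np.clip(1.0 - gain * shot, 0.0, 1.0)
```

Dividing by the clean plate's own screen strength is what lets an unevenly lit screen still read as fully transparent, while a shadow cast on the screen reduces the ratio and survives as partial alpha, which matches the shadow-extraction behavior described above.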

Studio Player
Studio Player is also new for Fusion 9 Studio; it’s a multi-station shot review tool. Multiple versions of clips and comps can be added to the Studio Player’s single-layer timeline, where simple color adjustments and notes can be added. A neat feature is that multiple Studio Players in different locations can be slaved together so that cross-facility review sessions can take place, with everyone looking at the same thing at the same time, which helps!

Fusion 9 Studio also supports writing Apple-approved ProRes from all its supported platforms, including Windows and Linux. Yep – you read that right. Other format support has also been widened and improved, such as faster native handling of DNxHR codecs.

Summing Up
All in all, the updates in Fusion 9 are comprehensive and very much in line with what professional users have been asking for. I think it certainly demonstrates that Blackmagic is as committed to Fusion as it is to Resolve, and at $299, it’s a no-brainer for any professional VFX artist to have available.

Of course, the price drop shows that Blackmagic is also aiming Fusion squarely at the mass independent filmmaker market. Certainly, with Resolve and Fusion, those users will have pretty much all the post tools they will need.

Fusion, by its nature and heritage, is a more complex beast to learn than Resolve, but it is well supported with a good user manual, forums and video tutorials. I would think it likely that for this market, Fusion might benefit from some minor tweaks to make it more intuitive in certain areas. I also think the join between Resolve and Fusion will provide a lot of interest going forward for this market. Adobe has done a masterful job bridging Premiere and After Effects; the Resolve/Fusion join is more rudimentary, but if Blackmagic gets it right, it will have a killer combination.

Finally, Fusion 9 extends what was already a very powerful and comprehensive compositing suite. It has become my primary compositing tool, and the additions in Version 9 only serve to cement that position.


David Cox is a VFX compositor and colorist with more than 20 years of experience. He started his career with MPC and The Mill before forming his own London-based post facility. Cox recently created interactive projects with full-body motion sensors and 4D/AR experiences.

Blackmagic’s Fusion 9 is now VR-enabled

At SIGGRAPH, Blackmagic was showing Fusion 9, its newly upgraded visual effects, compositing, 3D and motion graphics software. Fusion 9 features new VR tools, an entirely new keyer technology, planar tracking, camera tracking, multi-user collaboration tools and more.

Fusion 9 is available now at a new price point — Blackmagic has lowered the price of the Studio version from $995 to $299. (Blackmagic is also offering a free version of Fusion.) The software now works on Mac, PC and Linux.

Those working in VR get a full 360º true 3D workspace, along with a new panoramic viewer and support for popular VR headsets such as Oculus Rift and HTC Vive. Working in VR with Fusion is completely interactive. GPU acceleration makes it extremely fast so customers can wear a headset and interact with elements in a VR scene in realtime. Fusion 9 also supports stereoscopic VR. In addition, the new 360º spherical camera renders out complete VR scenes, all in a single pass and without the need for complex camera rigs.

The new planar tracker in Fusion 9 calculates motion planes for accurately compositing elements onto moving objects in a scene. For example, the new planar tracker can be used to replace signs or other flat objects as they move through a scene. Planar tracking data can also be used on rotoscope shapes. That means users don’t have to manually animate motion, perspective, position, scale or rotation of rotoscoped elements as the image changes.

Fusion 9 also features an entirely new camera tracker that analyzes the motion of a live-action camera in a scene and reconstructs the identical motion path in 3D space for use with cameras inside of Fusion. This lets users composite elements with precisely matched movement and perspective of the original. Fusion can also use lens metadata for proper framing, focal length and more.

The software’s new delta keyer features a complete set of matte finesse controls for creating clean keys while preserving fine image detail. There’s also a new clean plate tool that can smooth out subtle color variations on blue- and greenscreens in live action footage, making them easier to key.

For multi-user collaboration, Fusion 9 Studio includes Studio Player, a new app that features a playlist, storyboard and timeline for playing back shots. Studio Player can track version history, display annotation notes, has support for LUTs and more. The new Studio Player is suited for customers that need to see shots in a suite or theater for review and approval. Remote synchronization lets artists sync Studio Players in multiple locations.

In addition, Fusion 9 features a bin server so shared assets and tools don’t have to be copied onto each user’s local workstation.

A glimpse at what was new at NAB

By Lance Holte

I made the trek out to Las Vegas last week for the annual NAB show to take in the latest in post production technology, discuss new trends and products and get lost in a sea of exhibits. With over 1,700 exhibitors, it’s impossible to see everything (especially in the two days I was there), but here are a handful of notable things that caught my eye.

Blackmagic DaVinci Resolve Studio 14: While the “non-studio” version is still free, it’s hard to beat the $299 license for the full version of Resolve. As 4K and 3D media becomes increasingly prevalent, and with the release of the new Micro and Mini panels, Resolve can be a very affordable solution for editors, mobile colorists and DITs.

The new editorial and audio tools are particularly appealing to someone like me, who is often more hands-on on the editorial side than the grading side of post. In that regard, the new tracking features look to provide extra ease of use for quick and simple grades. I also love that Blackmagic has gotten rid of the dongles, which removes the hassle of tracking numerous dongles in a post environment where systems and rooms are swapped regularly. Oh, and there’s bin, clip and timeline locking for collaborative workflows, which easily pushes Resolve into the competition for an end-to-end post solution.

Adobe Premiere CC 2017 with After Effects and Audition: Adobe Premiere is typically my editorial application of choice, and the increased integration of AE and Audition promises to make an end-to-end Creative Cloud workflow even smoother. I’ve been hoping for a revamp of Premiere’s title tool for a while, and the Essential Graphics panel/new Title Tool appears to greatly increase and streamline Premiere’s motion graphics capabilities — especially since I do almost all my graphics work in After Effects and Photoshop. The more integrated the various applications can be, the better; and Adobe has been pushing that aspect for some time now.

On the audio side, Premiere’s Essential Sound Panel tools for volume matching, organization, cleanup and other effects without going directly into Audition (or exporting for ProTools, etc.) will be really helpful, especially for smaller projects and offline mixes. And as a last note, the new Camera Shake Deblur effect in After Effects is fantastic.

Dell UltraSharp 4K HDR Monitor — There were a lot of great looking HDR monitors at the show, but I liked that this one fell in the middle of the pack in terms of price point ($2K), with solid specs (1000 nits, 97.7% of P3, and 76.9% of Rec. 2020) and a reasonable size (27 inches). Seems like a good editorial or VFX display solution, though the price might be pushing budgetary constraints for smaller post houses. I wish it was DCI 4K instead of UHD and a little more affordable, but that will hopefully come with time.

On that note, I really like HP’s DreamColor Z31x Studio Display. It’s not HDR, but it covers 99% of the P3 color space, and it’s DCI 4K — as well as 2K, by multiplying every pixel of a 2K image into exactly four pixels — so there’s no odd-numbered scaling and sharpening required. Also, I like working with large monitors, especially at high resolutions. It offers automated (and schedulable) color calibration, though I’d love to see a non-automated display in the future if it could bring the price down. I could see the HP monitor as a great alternative to using more expensive HDR displays for the majority of workstations at many post houses.

As another side note, Flanders Scientific’s OLED 55-inch HDR display was among the most beautiful I’ve ever seen, but with numerous built-in interfaces and scaling capabilities, it’s likely to come at a higher price.

Canon 4K600STZ 4K HDR laser projector — This looks to be a great projection solution for small screening rooms or large editorial bays. It offers huge 4096×2400 resolution, is fairly small and compact, and apparently has very few restraints when it comes to projection angle, which would be nice for a theatrical edit bay (or a really nice home theater). The laser light source is also attractive because it will be low maintenance. At $63K, it’s at the more affordable end of 4K projector pricing.

Mettle 360 Degree/VR Depth plug-ins: I haven’t worked with a ton of 360-degree media, but I have dealt with the challenges of doing depth-related effects in a traditional single-camera space, so the fact that Mettle is doing depth-of-field effects, dolly effects and depth volumetric effects with 360-degree/VR content is pretty incredible. Plus, their plug-ins are designed to integrate with Premiere and After Effects, which is good news for an Adobe power user. I believe they’re still going to be in beta for a while, but I’m very curious to see how their plug-ins play out.

Finally, in terms of purely interesting tech, Sony’s Bravia 4K acoustic surface TVs are pretty wild. Their displays are OLED, so they look great, and the fact that the screen vibrates to create sound instead of having separate speakers or an attached speaker bar is awfully cool. Even at very close viewing, the screen doesn’t appear to move, though it can clearly be felt vibrating when touched. A vibrating acoustic surface raises some questions about mounting, so it may not be perfect for every environment, but interesting nonetheless.


Lance Holte is an LA-based post production supervisor and producer. He has spoken and taught at such events as NAB, SMPTE, SIGGRAPH and Createasphere. You can email him at lance@lanceholte.com.

FotoKem’s Alastor Arnold helps set look for ‘Ash vs Evil Dead’

The colorist worked hand in hand with director Sam Raimi and editor Bob Murawski

By Randi Altman

Halloween is known for its ghosts, goblins and gruesome zombies, but this year we got an extra serving of the non-alive, dished up by Sam Raimi and Starz Network. Fans of Raimi’s The Evil Dead (1981) and its sequels (Evil Dead 2, Army of Darkness) were treated to the pilot episode of Ash vs Evil Dead. Many consider The Evil Dead films cult classics, but they are so much more than that. Yes, they are campy and gory and more bloody than necessary, but it’s all done in an effort to make people laugh.

Back for this comedy/action/horror series on Starz is Bruce Campbell as Ash, the man who lost his hand in battle and then cleverly replaced it with a chainsaw. His quick wit and sarcasm have amazingly not diminished over the years. You know, it’s not easy to keep your sense of humor when evil dead people are after you!

Alastor Arnold

Raimi, who directed the first episode, worked very closely with long-time editor and collaborator Bob Murawski and FotoKem colorist Alastor Arnold to create the look of the pilot.

While the show was shot digitally on Arri Alexa (with a couple of pickups shot via a Sony F55), Raimi wanted a filmic look, and that is a big part of what Murawski and Arnold worked to accomplish.

Arnold has some history with Raimi and Murawski — he remastered The Evil Dead for theatrical and Blu-ray release. While Murawski and Arnold work together often, Ash vs Evil Dead is only the second project for the colorist and Raimi.

“I do a lot of work with Bob. In addition to being an Oscar-winning editor (The Hurt Locker), he has a company called Grindhouse Releasing,” explains Arnold. “They specialize in the restoration and distribution of exploitation and horror films, and I’ve had the pleasure of remastering numerous titles with Bob over the years. When he can bring me in to work with him, he does. And that’s how we got to do the pilot of Ash vs Evil Dead.”

Let’s find out more about the color grade and creating the look for the pilot and series.

How early were you brought on?
Just after shooting — when they started cutting. They had some questions about what work could be accomplished in the color suite when they were doing their rough cuts for the executive screeners. There was one scene in particular… they wanted to see if we could accomplish a specific look without having to go to visual effects.

What was that look?
There was a scene in a room with no lights, and it needed to be lit by a spinning flashlight. So the actors would be coming in and out of darkness, illuminated by only a flashlight. Originally when they shot it, they intended it to be a visual effect, so it was shot brighter than intended. Through color correction, we were able to create the effect they were going for.

How did they describe the look that they wanted for the pilot and the series?
Bob and Sam are both fans of a “filmic” look. They like the image to stay warm and high contrast. Based on their relationship, Sam entrusted Bob with the first pass of color. When Sam walked in for his first day of grading, the show was already in a good place for dialing in looks and trims, with a focus on shaping the frame with Power Windows and integrating visual effects more thoroughly. The look of the pilot is very warm, saturated and punchy, very chromatic — not what I would call a typical kind of horror movie look. A lot of times horror movies are drab or pretty desaturated and a lot of the times they are very cool. This is against that grain.

The pilot was shot almost entirely with an Arri Alexa. How did that play a role in getting the filmic look?
Arri has done a fantastic job with their color science. It responds in a natural way. All the base grades started with a film emulation, internally built at FotoKem with our color scientist, and based on our film lab experience.

The series has a campy feel. Would you say that’s reflected in the look?
The first Evil Dead was much more of a horror movie when compared to Evil Dead 2 and Army of Darkness. The tone of the series has evolved. Sam always injects humor into his movies, even in the first Evil Dead. In the TV show, there’s lots of horror and definitely gore, but it’s actually really funny. There’s an ingrained sense of humor in what Sam does, and that really comes through. Maybe that is reflected in the chromatic, warm look. It may complement that.

What kind of terms or language do you like to use when talking to someone about a look? And do you get examples, such as stills?
I like to approach color from an instinctual, artistic level. When I start a project it’s important for me to engage with clients and discuss not only what they’d literally like to achieve but also what they’re going for emotionally, and how color might enhance that. In addition, visual references are always great. I’m always happy when they reference other movies or projects or bring in stills. It’s common these days for looks to be set somewhat in dailies. Any visual reference is always good, but for me, I find it more important to engage artistically and emotionally with people to derive a look for a project.

What about the technical aspects of the grade and the system, in your case Blackmagic’s DaVinci Resolve?
There’s an expectation when people walk into a room with a professional colorist that the technical side of things won’t be an issue; that the colorist is going to be able to help you reach your creative goals. Solidifying and understanding what those creative goals are in the beginning is very important. So, I’m generally less concerned with how to technically arrive somewhere than creatively. Often the technical side of things can be driven by the creative goals.

It’s very important to experiment and have fun; that’s what this process is all about. Engage creatively and artistically; that is the most important part. The technical will happen.

Were Sam and Bob open to suggestions and experimenting?
Bob has been involved in just about everything Sam has done since Darkman (1990), which was their first project together; they have a shorthand. Sam was very involved in this episode, and we spent probably two or three days together going through the show, but Sam is less technically driven. When he walked into the room, Bob had already gone through it and gotten it to a good starting place, based on his knowledge of Sam’s sensibilities.

Sam is generally more concerned with what is going to enhance the performances or the emotion of a scene. There’s lots of Windowing in different parts of the frame to either bring things up or down, or tinting things slightly to enhance an emotional feel. That’s where Sam comes from.

So the initial sessions with Bob are where you did the heavy lifting and decided on the overall look?
Yes, the technical grading — matching shots, fixes, general levels and looks. That’s what Bob focuses on during the pre-grading.

Ash vs Evil Dead

Can you talk about the lighting and working with the Resolve?
Lighting-wise, it’s actually pretty bright, even though the intent may be to have it slightly darker in the final color. The nice thing about Resolve is that its tracking tools are very good, so you can bring up parts of the frame individually while still keeping other areas very dark.

We did have to do some noise reduction in certain parts as well. The built-in noise reduction tool is very good. I find it very easy to use — I don’t find myself struggling to reach a look or correction; it generally happens quickly and easily. That’s important when you have a client in the room. You don’t want to take too long to come up with something.

FotoKem used Resolve for the online as well?
Yes. With the exception of the visual effects, the entire online edit was completed in Resolve, in addition to the color and deliverables.

How does being able to do so much in that one system help you?
I came up working on a system that was more of a hero suite, so it did the color, it did the graphics, it did the minor visual effects work. So it’s nice to see Resolve now competing at that level.

Although I didn’t do the bulk of the editorial work, it was nice to be in the room with Bob and be able to slip a shot a couple of frames, or drop in the visual effects as they came in last minute along with their associated mattes… it all happens very quickly and easily in Resolve.

Where do you find your inspiration?
I love movies and find my inspiration in them. I always try to stay artistically engaged; I like to work on my own projects, in addition to enjoying and contributing to other people’s work. I make an effort to get to the theater two or three times a week. I’m a member of the Visual Effects Society, so I go to lots of their member screenings too. To me, it’s important to stay current in my craft and to be inspired by other people’s work. I enjoy seeing what people are doing with different cameras and how things hold up in different theaters. I like seeing films in a theater as they’re intended and viewing them with an audience. To see how other people are practicing the craft is important. If you’re a painter, you’re going to go to the museum. If you’re a colorist, you should go to the movies, and lots of them.

What have you seen recently that you respected?
I really liked the movie The Diary of a Teenage Girl. It was beautiful. Also Cartel Land, which was lovely, especially considering it was a documentary. Those are small movies, but I saw Sicario recently and that was a very impressive and pretty movie… beautifully shot.

Another movie I enjoyed this year was Tangerine, which was shot entirely on an iPhone. The artist in me wanted to see it for the story and craft. But it was also really important for me to view it in the theater on a large screen and see how well it held up technically. For a colorist it’s an artistic and technical exercise to watch movies.

—–
Ash vs. Evil Dead can be seen weekly on Starz at 9pm EST.

AlphaDogs employs roundtripping workflow for surfing film ‘Gone’

The AlphaDogs post house in Burbank color graded the film Gone, from producer/director Mark Kronemeyer of Pargo Media. Gone takes audiences on a journey through Mexican deserts and jungles, from Baja to Oaxaca, on the search for the soul of surfing in Mexico.

Edited in Final Cut Pro X by Kronemeyer, Gone required a roundtrip workflow through DaVinci Resolve to reconcile mixed frame rates between FCP X and Resolve before the color grading process could begin. Roundtripping often causes playback judder if not done properly. To avoid this problem, AlphaDogs colorist Sean Stack, who was in charge of creating the look for the film, rendered the footage out of Resolve at the original source frame rate, then allowed for adjustment of playback quality once the footage was back in the editing application.

Non-native frame rates can sometimes appear jittery, which is especially problematic with action footage. The post house used Cinema Tools on short clips to simply convert the playback rate to match the timeline. Although there is a slight speed ramp applied when using this technique, it is typically not noticeable on shorter clips.
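The size of that speed ramp is just the ratio of the two frame rates. A quick sketch (the rates below are illustrative; the article does not say which rates Gone mixed):

```python
# A quick sketch of the speed change introduced when a clip is conformed to
# the timeline frame rate (the rates below are illustrative; the article does
# not say which rates Gone mixed).
def conform_speed(source_fps, timeline_fps, clip_seconds):
    speed = timeline_fps / source_fps                 # playback speed factor
    new_duration = clip_seconds * source_fps / timeline_fps
    return speed, new_duration

speed, dur = conform_speed(25.0, 23.976, 10.0)        # 10 s clip, 25 -> 23.976 fps
print(f"{speed:.2%} speed, {dur:.2f} s")              # ~95.90% speed, ~10.43 s
```

On a short clip, a few percent of retiming spread over a handful of seconds is why the technique is "typically not noticeable."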

Gone was shot in various locations throughout Mexico, so it encompasses a wide variety of beach terrain. To give each location its own personality and character, Stack made specific creative color decisions, such as making southern beaches more teal and green in color while adding more blue and purple/red into the shadows of the surf on northern beaches. Kronemeyer specifically wanted the sections of larger waves to appear even more dangerous and menacing. Stack achieved this look by punching up the blue in the surf, making the water appear darker and in turn giving the waves a deeper and more hazardous look.

While FCP X and Resolve workflows are mostly reliable when it comes to roundtrip accuracy, Stack remains diligent in making sure he always has a QuickTime reference movie with time code delivered to the color session before any conforming begins.

“Without that roadmap, commonly known as a ‘chase reference,’ I cannot guarantee sync with the original offline locked cut,” explains Stack. “The audio mixer should use the same chase reference as the colorist, as this will further guarantee that the mix stems will sync up perfectly with the color graded final sequence.”

Round-trip workflows also present unique challenges when it comes to audio. Because FCP X cannot export proper materials for a pro mix, specific steps are required so as not to slow down the audio process in post. AlphaDogs audio engineer Curtis Fritsch used workaround methods, such as applying Assisted Editing’s Xto7 app and streamlining the audio tracks, to ease the transition from FCP X to Pro Tools. Fritsch then added extra EQ to the low and high ends of each song to help elevate the drive of the music to better match the fast pace and lush visuals of the beaches in Mexico.

NAB 2015: Love and hate, plus blogs and videos

By Randi Altman

I have been to more NABs than I would like to admit, and I loved them all… I’ve also hated them all, but that is my love/hate relationship with the show. I love seeing the new technology, trends and friends I’ve made from my many years in the business.

I hate the way my feet feel at the end of the day. I hate the way that there is not enough lotion on the planet to keep my skin from falling off.  I extra-hate the cab lines, but mostly I hate not being able to see everything that needs to be seen.

NAB 2015: A veteran’s perspective

Where have all the buttons gone? They’ve gone to servers.

By Jonathan Moser

For me, NAB used to be about “the buttons.” The latest and the greatest. Which black box could do what to a picture… and sound. K-Scopes, ADO, A53, DDRs… switchers with a LOT of buttons and colors. Consoles, decks, 1-inch, all the Ds (D1, D2… D5). Cool hardware. But post hardware (with the exception of some, like NewTek’s TriCaster and a handful of others) has gotten decidedly boring. Let’s face it — hard drives and ASCII keyboards just aren’t sexy.

Now, NAB post production seems to be about everything else — servers, distribution, Ultra HD…

NAB Day 1: Me, myself and Monday

By William Rogers

Let’s dive right into the craziness.

RED sat me down with the other members of the press in a comfortably dark theater and blasted my face with footage demoed from its new Weapon-sensor-equipped cameras. There was a bit of awkwardness in the air between the RED representatives and the press members; RED admitted it hadn’t done this sort of sleek, private reveal at NAB before.

Behind the Title: Cinetic Studios colorist/editor Jason Bowdach

NAME: Jason Bowdach

COMPANY: Cinetic Studios (@CineticStudios)

CAN YOU DESCRIBE YOUR COMPANY?
Cinetic Studios is a boutique-style color and finishing studio that was created to provide easier access to high-end color grading and finishing services that most assume are out of their reach.

Our slogan, “We Tell Stories With Color,” represents our belief that color is a very powerful narrative tool that shouldn’t be overlooked. On the technical side, we take a bleeding-edge approach, as we feel the latest technology allows us to offer services that are cost-effective yet quality driven.

WHAT’S YOUR JOB TITLE? Colorist and Online Editor

Review: Autopilot Camera Stabilizer from ProAm USA

By Brady Betzel

In the last five years, content creation and distribution have exploded. Every person with a smartphone has the ability to create outstanding content. Think about it… everyone with an iPhone has a fully capable 1080p video camera in his or her pocket at all times. So once the explosion of random and, frankly, terrible content settled (or is currently settling), viewers began looking for quality, not just quantity.

YouTube has been instrumental in content distribution; the amount of content is truly amazing if you allow yourself to get lost down the YouTube rabbit hole. So the question becomes: How do I set myself apart from the other million YouTube content creators? If you’re a kid who wants to make skateboard videos, how do you go beyond the now-standard “crazy” GoPro angle? Or how do you save time in post by not having to stabilize every piece of footage you shoot…