Tag Archives: Adobe Premiere

Editing 360 Video in VR (Part 2)

By Mike McCarthy

In the last article I wrote on this topic, I looked at the options for shooting 360-degree video footage, and what it takes to get footage recorded on a Gear 360 ready to review and edit on a VR-enabled system. The remaining steps in the workflow will be similar regardless of which camera you are using.

Previewing your work is important, so if you have a VR headset, you will want to make sure it is installed and functioning with your editing software. I will be basing this article on using an Oculus Rift to view my work in Adobe Premiere Pro 11.1.2 on a ThinkPad P71 with an Nvidia Quadro P5000 GPU. Premiere requires an extra set of plugins to interface with the Rift headset. Adobe acquired Mettle’s Skybox VR Player plugin back in June, and has made it available to Creative Cloud users upon request, which you can do here.

Skybox VR player

Skybox can project the Adobe UI to the Rift, as well as the output, so you could leave the headset on when making adjustments, but I have not found that to be as useful as I had hoped. Another option is to use the GoPro VR Player plugin to send the Adobe Transmit output to the Rift, which can be downloaded for free here (use the 3.0 version or above). I found this to have slightly better playback performance, but fewer options (no UI projection, for example). Adobe is expected to integrate much of this functionality into the next release of Premiere, which should remove the need for most of the current plugins and increase the overall functionality.

Once our VR editing system is ready to go, we need to look at the footage we have. In the case of the Gear 360, the dual spherical image file recorded by the camera is not directly usable in most applications and needs to be processed to generate a single equirectangular projection, stitching the images from both cameras into a single continuous view.

There are a number of ways to do this. One option is to use the application Samsung packages with the camera: Action Director 360. You can download the original version here, but you will need the activation code that came with the camera in order to use it. Upon import, the software automatically processes the original stills and video into equirectangular 2:1 H.264 files. Instead of exporting from that application, I pull the temp files that it generates on media import and use them in Premiere. By default, they should be located in C:\Users\[Username]\Documents\CyberLink\ActionDirector\1.0\360. While this is the simplest solution for PC users, it introduces an extra transcoding step to H.264 (after the initial H.265 recording), and I frequently encountered an issue where there was a black hexagon in the middle of the stitched image.
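
If you take this temp-file approach, a few lines of scripting can save the digging. Here is a minimal sketch (my own, not part of Action Director) that copies the stitched temps into a project folder; the destination path is a hypothetical placeholder:

```python
from pathlib import Path
import shutil

# Default Action Director temp location from the article; adjust the
# version number for your install
temp_dir = Path.home() / "Documents" / "CyberLink" / "ActionDirector" / "1.0" / "360"
dest = Path("D:/VR_project/source")  # hypothetical project media folder
dest.mkdir(parents=True, exist_ok=True)

for clip in sorted(temp_dir.glob("*.mp4")):  # the stitched H.264 temps
    shutil.copy2(clip, dest / clip.name)     # copy so the originals survive cleanup
    print("copied", clip.name)
```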

Action Director

Activating Automatic Angle Compensation in the Preferences->Editing panel gets around this bug, while also trying to stabilize your footage to some degree. I later discovered that Samsung had released a separate Version 2 of Action Director, available for Windows or Mac, which solves this issue. But I couldn’t get the stitched files to work directly in the Adobe apps, so I had to export them, which added yet another layer of video compression. You will need the Samsung activation code that came with the Gear 360 to use either version, and both took twice a clip’s run time to stitch it on my P71 laptop.

An option that gives you more control over the stitching process is to do it in After Effects. Adobe’s recent acquisition of Mettle’s SkyBox VR toolset makes this much easier, but it is still a process. Currently you have to manually request and install your copy of the plugins as a Creative Cloud subscriber. There are three separate installers, and while this stitching process only requires Skybox Suite AE, I would install both the AE and Premiere Pro versions for use in later steps, as well as the Skybox VR player if you have an HMD to preview with. Once you have them installed, you can use the Skybox Converter effect in After Effects to convert from the Gear 360’s fisheye files to the equirectangular assets that Premiere requires for editing VR.

Unfortunately, Samsung’s format is not one of the default conversions supported by the effect, so it requires a little more creativity. The two sensor images have to be cropped into separate comps, with the plugin applied to each of them. Setting the input to fisheye and the output to equirectangular for each image will give the desired distortion. A feathered mask applied to the image circle adjusts the seam, and the overlap can be tuned with the FOV and re-orient camera values.
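
For a sense of the math the converter is doing, here is a rough NumPy sketch of an equidistant fisheye-to-equirectangular remap. This is my own simplification, not Mettle’s algorithm; the 195-degree FOV and nearest-neighbor sampling are assumptions, and a real stitch would also blend the two lenses across the seam:

```python
import numpy as np

def fisheye_to_equirect(img, fov_deg=195.0, out_w=2048):
    """Remap one square fisheye image (lens centered, image circle filling
    the frame) into the portion of an equirectangular frame it covers."""
    out_h = out_w // 2
    h, w = img.shape[:2]
    # Longitude/latitude of every output pixel
    lon = (np.arange(out_w) / out_w - 0.5) * 2 * np.pi
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Unit direction vectors (lens looks down +z)
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    # Equidistant fisheye model: image radius grows linearly with the
    # angle off the lens axis, hitting 1.0 at half the lens FOV
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = theta / np.radians(fov_deg / 2.0)
    phi = np.arctan2(y, x)
    u = np.clip(((0.5 + 0.5 * r * np.cos(phi)) * (w - 1)).astype(int), 0, w - 1)
    v = np.clip(((0.5 + 0.5 * r * np.sin(phi)) * (h - 1)).astype(int), 0, h - 1)
    out = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    valid = r <= 1.0                      # pixels the other lens must supply
    out[valid] = img[v[valid], u[valid]]
    return out
```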

Since this can be challenging to set up, I have posted an AE template that is already configured for footage from the Gear 360. The included directions should be easy to follow, and the projection, overlap and stitch can be further tweaked by adjusting the position, rotation and mask settings in the sub-comps, and the re-orientation values in the Skybox Converter effects. Hopefully, once you find the correct adjustments for your individual camera, they should remain the same for all of your footage, unless you want to mask around an object crossing the stitch boundary. More info on those types of fixes can be found here. It took me five minutes to export 60 seconds of 360 video using this approach, and there is no stabilization or other automatic image analysis.

VideoStitch Studio

Orah makes VideoStitch Studio, a similar product with a slightly different feature set and approach. One limitation I couldn’t find a way around is that the program expects the various fisheye source images to be in separate files; unlike Kolor AVP, I couldn’t get the source cropping tool to work without first rendering the dual fisheye image into separate square video files. There should be a way to avoid that step, but I couldn’t find one. (You can use the crop effect to remove 1920 pixels from one side or the other to make the conversions in Media Encoder relatively quickly, as sketched below.) Splitting the source file and rendering separate fisheye spheres adds a workflow step and render time, and my one-minute clip took 11 minutes to export. This is a slower option, which might be significant if you have hours of footage to process instead of minutes.
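
As a sketch of that splitting step, the ffmpeg crop filter can carve the dual-fisheye frame into two squares without opening an NLE. This assumes the Gear 360’s 3840x1920 dual-fisheye layout and a hypothetical filename; any intermediate codec works in place of the x264 settings shown:

```python
import subprocess

SRC = "gear360_dual_fisheye.mp4"  # hypothetical source, assumed 3840x1920

# Each lens occupies a 1920x1920 half; crop=w:h:x:y takes the crop origin
for name, x_offset in (("left", 0), ("right", 1920)):
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-vf", f"crop=1920:1920:{x_offset}:0",
        "-c:v", "libx264", "-crf", "16",   # lightly compressed intermediate
        "-c:a", "copy",
        f"fisheye_{name}.mp4",
    ], check=True)
```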

Clearly, there are a variety of ways to get your raw footage stitched for editing. The results vary greatly between the different programs, so I made a video to compare the different stitching options on the same source clip. My first attempt was with a locked-off shot in the park, but that shot was too simple to reveal the differences, and it didn’t allow for comparison of the stabilization options available in some of the programs. So I shot some footage from a moving vehicle to see how well the motion and shake would be handled by the various programs. The result is now available on YouTube, fading between each of the five labeled options over the course of the minute-long clip. I would categorize this as testing how well the various applications handle non-ideal source footage, which happens a lot in the real world.

I didn’t feel that any of the stitching options were perfect solutions, so hopefully we will see further developments in that regard in the future. You may want to explore them yourself to determine which one best meets your needs. Once your footage is correctly mapped to equirectangular projection, ideally in a 2:1 aspect ratio, and the projects are rendered and exported (I recommend Cineform or DNxHR), you are ready to edit your processed footage.

Launch Premiere Pro and import your footage as you normally would. If you are using the Skybox Player plugin, turn on Adobe Transmit with the HMD selected as the only dedicated output (in the Skybox VR configuration window, I recommend setting the hot corner to top left, to avoid accidentally hitting the start menu, desktop hide or application close buttons during preview). In the playback monitor, you may want to right-click the wrench icon and select Enable VR to preview a pan-able perspective of the video, instead of the entire distorted equirectangular source frame. You can cut, trim and stack your footage as usual, and apply color corrections and other non-geometry-based effects.

In version 11.1.2 of Premiere, there is basically one VR effect (VR Projection), which allows you to rotate the video sphere along all three axes. If you have the Skybox Suite for Premiere installed, you will have some extra VR effects. The Skybox Rotate Sphere effect is basically the same. You can add titles and graphics and use the Skybox Project 2D effect to project them into the sphere where you want. Skybox also includes other effects for blurring and sharpening the spherical video, as well as denoise and glow. If you have Kolor AVP installed, it adds two new effects as well. GoPro VR Horizon is similar to the other sphere-rotation effects, but allows you to drag the image around in the monitor window to rotate it, instead of manually adjusting the axis values, so it is faster and more intuitive. The GoPro VR Reframe effect is applied to equirectangular footage to extract a flat perspective from within it. The field of view can be adjusted and rotated around all three axes.

Most of the effects are pretty easy to figure out, but Skybox Project 2D may require some experimentation to get the desired results. Avoid placing objects near the edges of the 2D frame that you apply it to, to keep them facing toward the viewer. The rotate projection values control where the object is placed relative to the viewer. The rotate source values rotate the object at the location it is projected to. Personally, I think they should be placed in the reverse order in the effects panel.

Encoding the final output is not difficult: just send it to Adobe Media Encoder using either the H.264 or H.265 format. Make sure the “Video is VR” box is checked at the bottom of the Video Settings pane, and in this case that the frame layout is set to monoscopic. There are presets for some of the common frame sizes, but I would recommend lowering the bitrates, at least if you are using Gear 360 footage. Also, if you have ambisonic audio, set the channels to 4.0 in the audio pane.

Once the video is encoded, you can upload it directly to Facebook. If you want to upload to YouTube, exports from AME with the VR box checked should work fine, but videos from other sources will need their metadata modified with this app here. Once your video is uploaded to YouTube, you can embed it on any webpage that supports 2D web videos. And YouTube videos can be streamed directly to your Rift headset using the free DeoVR video player.
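
For those non-AME sources, the injection step can also be scripted. A minimal sketch, assuming the linked app is Google’s open-source Spatial Media Metadata Injector (github.com/google/spatial-media) checked out locally; it rewrites the container with the spherical metadata without re-encoding the video:

```python
import subprocess

# "-i" injects the spherical metadata YouTube looks for and writes a new
# file; both filenames are hypothetical placeholders
subprocess.run(
    ["python", "spatialmedia", "-i", "equirect_master.mp4", "equirect_youtube.mp4"],
    check=True,
)
```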

That should give you a 360-video production workflow from start to finish. I will post more updated articles as new software tools are developed, and as I get new 360 cameras with which to test and experiment.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Michael Kammes’ 5 Things – Video editing software

By Randi Altman

Technologist Michael Kammes is back with a new episode of 5 Things, which focuses on simplifying film, TV and media technology. The web series answers, according to Kammes, the “five burning tech questions” people might have about technologies and workflows in the media creation space. This episode tackles professional video editing software being used (or not used) in Hollywood.

Why is now the time to address this segment of the industry? “The market for NLEs is now more crowded than it has been in over 20 years,” explains Kammes. “Not since the dawn of modern NLEs have there been this many questions over what tools should be used. In addition, the massive price drop of NLEs, coupled with the pricing shift (monthly/yearly, as opposed to outright) has created more confusion in the market.”

In his video, Kammes focuses on Avid Media Composer, Adobe Premiere, Apple Final Cut Pro, Lightworks, Blackmagic Resolve and others.

Considering its history and use on some major motion pictures (such as The Wolf of Wall Street), why hasn’t Lightworks made more strides in the Hollywood community? “I think Lightworks has had massive product development and marketing issues,” shares Kammes. “I rarely see the product pushed online, at user groups or in forums. EditShare, the parent company of Lightworks, also deals heavily in storage, so one can only assume the marketing dollars are being spent on larger-ticket items like professional and enterprise storage over a desktop application.”

What about Resolve, considering its updated NLE tools and the acquisition of audio company Fairlight? Should we expect to see more Resolve being used as a traditional NLE? “I think in Hollywood, adoption will be very, very slow for creative editorial, and unless something drastic happens to Avid and Adobe, Resolve will remain in the minority. For dailies, transcodes or grading, I can see it only getting bigger, but I don’t see larger facilities adopting Resolve for creative editorial. Outside of Hollywood, I see it gaining more traction. Those outlets have more flexibility to pivot and try different tools without the locked-in TV and feature film machine in Hollywood.”

Check it out:

Jimmy Helm upped to editor at The Colonie

The Colonie, the Chicago-based editorial, visual effects and motion graphics shop, has promoted Jimmy Helm to editor. Helm has honed his craft over the past seven years, working with The Colonie’s senior editors on a wide range of projects. Most recently, he has been managing ongoing social media work with Facebook and conceptualizing and editing short format ads. Some clients he has collaborated with include Lyft, Dos Equis, Capital One, Heineken and Microsoft. He works on both Avid Media Composer and Adobe Premiere.

A filmmaking major at Columbia College Chicago, Helm applied for an internship at The Colonie in 2010. Six months later he was offered a full-time position as an assistant editor, working alongside veteran cutter Tom Pastorelle on commercials for McDonald’s, Kellogg’s, Quaker and Wrangler. During this time, Helm edited numerous projects on his own, including broadcast commercials for Centrum and Kay Jewelers.

“Tom is incredible to work with,” says Helm. “Not only is he a great editor but a great person. He shared his editorial methods and taught me the importance of bringing your instinctual creativity to the process. I feel fortunate to have had him as a mentor.”

In 2014, Helm was promoted to senior assistant editor and continued to hone his editing skills while taking on a leadership role.

“My passion for visual storytelling began when I was young,” says Helm. “Growing up in Memphis, I spent a great deal of time watching classic films by great directors. I realize now that I was doing more than watching — I was studying their techniques and, particularly, their editing styles. When you’re editing a scene, there’s something addictive about the rhythm you create and the drama you build. I love that I get to do it every day.”

Helm joins The Colonie’s editorial team, which includes Joe Clear, Keith Kristinat, Pastorelle and Brian Salazar, along with editors and partners Bob Ackerman and Brian Sepanik.

Quick Chat: Lucky Post’s Sai Selvarajan on editing Don’t Fear The Fin

Costa, maker of polarized sunglasses, has teamed up with Ocearch, a group of explorers and scientists dedicated to generating data on the movement, biology and health of sharks, in order to educate people on how saving the sharks will save our oceans. In a 2.5-minute video, three shark attack survivors — Mike Coots, Paul de Gelder and Lisa Mondy — explain why they are now on a quest to help save the very thing that attacked them, took their limbs and almost took their lives.

The video, edited by Lucky Post’s Sai Selvarajan for agency McGarrah Jessee and Rabbit Food Studios, tells the viewer that the number of sharks killed by long-lining, illegal fishing and the shark-finning trade exceeds human shark attacks by millions. And as go the sharks, so go our oceans.

For editor Selvarajan, the goal was to strike a balance between the intimate stories and the global message, drawing on striking footage filmed in Hawaii’s surf mecca, the North Shore. “Stories inside stories,” is how Selvarajan describes it; he admires the subjects’ dedication to saving the misunderstood creatures despite their life-changing encounters.

We spoke with the Texas-based editor to find out more about this project.

How early on did you become involved in the project?
I got a call when the project was greenlit, and Jeff Bednarz, the creative head at Rabbit Foot, walked me through the concept. He wanted to showcase the whole teamwork aspect of Costa, Ocearch and shark survivors all coming together and using their skills to save sharks.

Did working on Don’t Fear The Fin change your perception of sharks?
Yes, it did. Before working on the project I had no idea that sharks were in trouble. After working on Don’t Fear the Fin, I’m totally for shark conservation, and I admire anyone who is out there fighting for the species.

What equipment did you use for the edit?
Adobe Premiere on a Mac tower.

What were the biggest creative challenges?
The biggest creative challenge was how to tell the shark survivors’ stories and then the shark’s story, and then Ocearch/Costa’s mission story. It was stories inside stories, which made it very dense and challenging to cut into a three-minute story. I had to do justice to all the stories and weave them into each other. The footage was gorgeous, but there had to be a sense of gravity to it all, so I used pacing and score to give us that gravity.

What do you think of the fact that sharks are not shown much in the film?
We made a conscious effort to show sharks and people in the same shot. The biggest misconception is that sharks are these big man-eating monsters. Seeing people diving with the sharks tied them to our story and the mission of the project.

What’s your biggest fear, and how would/can you overcome it?
Snakes are my biggest fear. I’m not sure how I’ll ever overcome it. I respect snakes and keep a safe distance. Living in Texas, I’ve read up on which ones are poisonous, so I know which ones to stay away from. But if I came across a rat snake in the wild, I’m sure to jump 20 feet in the air.

Check out the full video below…

Adobe acquires Mettle’s SkyBox tools for 360/VR editing, VFX

Adobe has acquired all SkyBox technology from Mettle, a developer of 360-degree and virtual reality software. As more media and entertainment companies embrace 360/VR, there is a need for seamless, end-to-end workflows for this new and immersive medium.

The SkyBox toolset is designed exclusively for post production in Adobe Premiere Pro CC and Adobe After Effects CC, and complements Adobe Creative Cloud’s existing 360/VR cinematic production technology. Adobe will integrate SkyBox plugin functionality natively into future releases of Premiere Pro and After Effects.

To further strengthen Adobe’s leadership in 360-degree and virtual reality, Mettle co-founder Chris Bobotis will join Adobe, bringing more than 25 years of production experience to his new role.

“We believe making virtual reality content should be as easy as possible for creators. The acquisition of Mettle SkyBox technology allows us to deliver a more highly integrated VR editing and effects experience to the film and video community,” says Steven Warner, VP of digital video and audio at Adobe. “Editing in 360/VR requires specialized technology, and as such, this is a critical area of investment for Adobe, and we’re thrilled Chris Bobotis has joined us to help lead the charge forward.”

“Our relationship started with Adobe in 2010 when we created FreeForm for After Effects, and has been evolving ever since. This is the next big step in our partnership,” says Bobotis, now director, professional video at Adobe. “I’ve always believed in developing software for artists, by artists, and I’m looking forward to bringing new technology and integration that will empower creators with the digital tools they need to bring their creative vision to life.”

Introduced in April 2015, SkyBox was the first plugin to leverage Mettle’s proprietary 3DNAE technology, and its success quickly led to additional development of 360/VR plugins for Premiere Pro and After Effects.

Today, Mettle’s plugins have been adopted by companies such as The New York Times, CNN, HBO, Google, YouTube, Discovery VR, DreamWorks TV, National Geographic, Washington Post, Apple and Facebook, as well as independent filmmakers and YouTubers.

Comprimato plug-in manages Ultra HD, VR files within Premiere

Comprimato, maker of GPU-accelerated storage compression and video transcoding solutions, has launched Comprimato UltraPix. This video plug-in offers proxy-free, auto-setup workflows for Ultra HD, VR and more on hardware running Adobe Premiere Pro CC.

The challenge for post facilities finishing in 4K or 8K Ultra HD, or working on immersive 360­ VR projects, is managing the massive amount of data. The files are large, requiring a lot of expensive storage, which can be slow and cumbersome to load, and achieving realtime editing performance is difficult.

Comprimato UltraPix addresses this by building on JPEG2000, a compression format that offers high image quality (including a mathematically lossless mode) and generates smaller versions of each frame as an inherent part of the compression process. Comprimato UltraPix delivers the file at a size that the user’s hardware can accommodate.
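
The “smaller versions for free” property comes from JPEG2000’s wavelet transform: each decomposition level leaves a half-resolution low-pass band, so a decoder can stop early and return a reduced frame. A toy illustration with PyWavelets (this demonstrates the principle only, not Comprimato’s implementation):

```python
import numpy as np
import pywt  # PyWavelets

frame = np.random.rand(2160, 3840)             # stand-in for a UHD luma plane
coeffs = pywt.wavedec2(frame, "db2", level=3)  # 3-level 2D wavelet decomposition
lowpass = coeffs[0]                            # roughly 1/8-scale version of the frame
print(frame.shape, "->", lowpass.shape)        # (2160, 3840) -> approx. (272, 482)
```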

Once Comprimato UltraPix is loaded on any hardware, it configures itself with auto-setup, requiring no specialist knowledge from the editor who continues to work in Premiere Pro CC exactly as normal. Any workflow can be boosted by Comprimato UltraPix, and the larger the files the greater the benefit.

Comprimato UltraPix is a multi-platform video processing software for instant video resolution in realtime. It is a lightweight, downloadable video plug-in for OS X, Windows and Linux systems. Editors can switch between 4K, 8K, full HD, HD or lower resolutions without proxy-file rendering or transcoding.

“JPEG2000 is an open standard, recognized universally, and post production professionals will already be familiar with it as it is the image standard in DCP digital cinema files,” says Comprimato founder/CEO Jiří Matela. “What we have achieved is a unique implementation of JPEG2000 encoding and decoding in software, using the power of the CPU or GPU, which means we can embed it in realtime editing tools like Adobe Premiere Pro CC. It solves a real issue, simply and effectively.”

“Editors and post professionals need tools that integrate ‘under the hood’ so they can focus on content creation and not technology,” says Sue Skidmore, partner relations for Adobe. “Comprimato adds a great option for Adobe Premiere Pro users who need to work with high-resolution video files, including 360 VR material.”

Comprimato UltraPix plug-ins are currently available for Adobe Premiere Pro CC and Foundry Nuke and will be available on other post and VFX tools soon. You can download a free 30-day trial or buy Comprimato UltraPix for $99 a year.

Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developers of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. This new version features new client Review Pages, which expand content review and sharing. In addition, the new release offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which is designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, those thoughts flow directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team, or a .csv or .xml file containing tons of additional data for further processing.

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” let artists see at a glance who left a comment and where, so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when getting lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.
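
A plausible way to picture how copies can be storage-free (my illustration; Frame.io hasn’t published its backend): store each asset once under a content hash and let projects hold references.

```python
import hashlib

class AssetStore:
    """Toy content-addressed store: blobs live once, projects hold refs."""
    def __init__(self):
        self.blobs = {}     # content hash -> bytes, stored exactly once
        self.projects = {}  # project name -> list of content hashes

    def add(self, project: str, content: bytes) -> None:
        digest = hashlib.sha256(content).hexdigest()
        self.blobs.setdefault(digest, content)   # no-op if already stored
        self.projects.setdefault(project, []).append(digest)

    def copy_project(self, src: str, dst: str) -> None:
        # A "copy" duplicates references, never bytes, so a thousand
        # copies cost a thousand list entries and zero extra storage
        self.projects[dst] = list(self.projects.get(src, []))
```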

Review: Nvidia’s new Pascal-based Quadro cards

By Mike McCarthy

Nvidia has announced a number of new professional graphics cards, filling out their entire Quadro line-up with models based on their newest Pascal architecture. At the absolute top end, there is the new Quadro GP100, which is a PCIe card implementation of their supercomputer chip. It has similar 32-bit (graphics) processing power to the existing Quadro P6000, but adds 16-bit (AI) and 64-bit (simulation) capabilities. It is intended to combine compute and visualization capabilities into a single solution. It has 16GB of new HBM2 (High Bandwidth Memory), and two cards can be paired together with NVLink at 80GB/sec to share a total of 32GB between them.

This powerhouse is followed by the existing P6000 and P5000 announced last July. The next addition to the line-up is the single-slot, VR-ready Quadro P4000. With 1,792 CUDA cores running at 1200MHz, it should outperform a previous-generation M5000 for less than half the price. It is similar to its predecessor the M4000 in having 8GB RAM, four DisplayPort connectors, and running on a single six-pin power connector. The new P2000 follows with 1,024 cores at 1076MHz and 5GB of RAM, giving it similar performance to the K5000, which is nothing to scoff at. The P1000, P600 and P400 are all low-profile cards with Mini-DisplayPort connectors.

All of these cards run on PCIe Gen3 x16 and use DisplayPort 1.4, which adds support for HDR and DSC. They all support 4Kp60 output, with the higher-end cards allowing 5K and 4Kp120 displays. On the high-resolution display front, Nvidia continues to push forward, allowing up to 32 synchronized displays to be connected to a single system, provided you have enough slots for eight Quadro P4000 cards and two Quadro Sync II boards.

Nvidia also announced a number of Pascal-based mobile Quadro GPUs last month, with the mobile P4000 having roughly comparable specifications to the desktop version. But you can read the paper specs for the new cards elsewhere on the Internet. More importantly, I have had the opportunity to test out some of these new cards over the last few weeks, to get a feel for how they operate in the real world.

DisplayPorts

Testing
I was able to run tests and benchmarks with the P6000, P4000 and P2000 against my current M6000 for comparison. All of these tests were done on a top-end Dell 7910 workstation, with a variety of display outputs, primarily using Adobe Premiere Pro, since I am a video editor after all.

I ran a full battery of benchmark tests on each of the cards using Premiere Pro 2017. I measured both playback performance and encoding speed, monitoring CPU and GPU use, as well as power usage throughout the tests. I had HD, 4K, and 6K source assets to pull from, and tested monitoring with an HD projector, a 4K LCD and a 6K array of TVs. I had assets that were RAW R3D files, compressed MOVs and DPX sequences. I wanted to see how each of the cards would perform at various levels of production quality and measure the differences between them to help editors and visual artists determine which option would best meet the needs of their individual workflow.

I started with the intuitive expectation that the P2000 would be sufficient for most HD work, but that a P4000 would be required to effectively handle 4K. I also assumed that a top-end card would be required to play back 6K files and split the image between my three Barco Escape formatted displays. And I was totally wrong.

Except when using the higher-end options within Premiere’s Lumetri-based color corrector, all of the cards were fully capable of every editing task I threw at them. To be fair, the P6000 usually renders out files about 30 percent faster than the P2000, but that is a minimal difference compared to the difference in cost. Even the P2000 was able to play back my uncompressed 6K assets onto my array of Barco Escape displays without issue. It was only when I started making heavy color changes in Lumetri that I began to observe any performance differences at all.

Lumetri

Color correction is an inherently parallel, graphics-related computing task, so this is where GPU processing really shines. Premiere’s Lumetri color tools are based on SpeedGrade’s original CUDA processing engine, and they can really harness the power of the higher-end cards. The P2000 can make basic corrections to 6K footage, but it is possible to max out the P6000 with HD footage if I adjust enough different parameters. Fortunately, most people aren’t looking for footage more stylized than the movie 300, so in this case my original assumptions seem to be accurate. The P2000 can handle reasonable corrections to HD footage, the P4000 is probably a good choice for VR and 4K footage, while the P6000 is the right tool for the job if you plan to do a lot of heavy color tweaking or are working with massive frame sizes.
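
The “inherently parallel” point is easy to see in code: a primary grade touches every pixel independently, so the work splits cleanly across thousands of GPU cores. Here is a vectorized sketch of one common lift/gamma/gain formulation (illustrative only, not Lumetri’s actual math):

```python
import numpy as np

def lift_gamma_gain(frame, lift=0.0, gamma=1.0, gain=1.0):
    """Each output pixel depends only on the matching input pixel,
    which is exactly the shape of work GPUs excel at."""
    x = frame.astype(np.float32) / 255.0
    x = np.clip(gain * (x + lift * (1.0 - x)), 0.0, 1.0) ** (1.0 / gamma)
    return (x * 255.0).astype(np.uint8)
```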

The other way I expected to be able to measure a difference between the cards was playback performance while rendering in Adobe Media Encoder. By default, Media Encoder pauses exports during timeline playback, but this behavior can be disabled by reopening Premiere after queuing your encode. Even with careful planning to avoid reading from the same disks the encoder was accessing, I was unable to get significantly better playback performance from the P6000 than from the P2000. This says more about the software than about the cards.

P6000

The largest difference I was able to consistently measure across the board was power usage, with each card averaging about 30 watts more as I stepped up from the P2000 to the P4000 to the P6000. But they all are far more efficient than the previous M6000, which frequently sucked up an extra 100 watts in the same tests. While “watts” may not be a benchmark most editors worry too much about, among other things it equates to money for electricity. Lower wattage also means less cooling is needed, which results in quieter systems that can be kept closer to the editor without distracting from the creative process or interfering with audio editing. It also allows these new cards to be installed in smaller systems with smaller power supplies, using fewer power connectors. My HP Z420 workstation only has one six-pin PCIe power plug, so the P4000 is the ideal GPU solution for that system.

Summing Up
It appears that we have once again reached a point where hardware processing capabilities have surpassed the software’s capacity to use them, at least within Premiere Pro. This leads to the cards performing relatively similarly to one another in most of my tests, but true 3D applications might reveal much greater differences in their performance. Further optimization of the CUDA implementation in Premiere Pro might also lead to better use of these higher-end GPUs in the future.


Mike McCarthy is an online editor and workflow consultant with 10 years of experience on feature films and commercials. He has been on the forefront of pioneering new solutions for tapeless workflows, DSLR filmmaking and now multiscreen and surround video experiences. If you want to see more specific details about performance numbers and benchmark tests for these Nvidia cards, check out techwithmikefirst.com.

Review: Apple’s new MacBook Pro

By Brady Betzel

What do you need to know about the latest pro laptop from Apple? Well, the MacBook Pro is fast and light; the new Touch Bar is handy and sharp but not fully realized; the updated keys on the keyboard are surprisingly great; and working with ProRes QuickTime files in resolutions higher than 1920×1080 inside of FCP X, or any NLE for that matter, is blazing fast.

When I was tasked with reviewing the new MacBook Pro, I came into it with an open mind. After all, I did read a few other reviews that weren’t exactly glowing, but I love speed and innovation among professional workstation computers, so I was eager to test it myself.

I am pretty open-minded when it comes to operating systems and hardware. I love Apple products and I love Windows-based PCs. I think both have their place in our industry, and to be quite honest it’s really a bonus for me that I don’t rely heavily on one OS or get too tricked by the Command Key vs. Windows/Alt Key.

Let’s start with the call I had with the Apple folks as they gave me the lowdown on the new MacBook Pro. The Apple reps were nice, energetic, knowledgeable and extremely helpful. While I love Apple products, including this laptop, it’s not the be-all-end-all.

The Touch Bar is nice, but not a revolution. It feels like the first step in an evolution, a version 1 of an innovation that I am excited to see more of in later iterations. When I talked with the Apple folks they briefed me on what Tim Cook showed off in the reveal: emoji buttons, wide gamut display, new speakers and USB-C/Thunderbolt 3 connectivity.

NLEs
They had an FCPX expert on the call, which was nice considering I planned on reviewing the MacBook Pro with a focus on nonlinear editing apps, such as Adobe Premiere Pro, Avid Media Composer and Blackmagic’s Resolve. Don’t get me wrong, FCPX is growing on me — it’s snappy jumping around the timeline with ProRes 5K footage; assigning roles is something I wish every other app would pick up on; and the timeline is more of a breeze to use with the latest update.

The other side to this is that in my 13 years of working in television post I have never worked on a show that primarily used FCP or FCPX to edit or finish on. This doesn’t mean I don’t like the NLE, it simply means I haven’t relied on it in a professional working environment. Like I said, I really like the road it’s heading down, and if they work their way into mainstream broadcast or streaming platforms a little more I am sure I will see it more frequently.

Furthermore, with the ever-growing reduction in reliance on large groups of editors and finishing artists, apps like FCPX are poised to shine with their innovation. After all that blabbering: in this review I will touch on FCPX, but I really wanted to see how the MacBook Pro performed with the pro NLEs I encounter the most.

Specs
Let’s jump into the specs. I was sent a top-of-the-line 15-inch MacBook Pro with Touch Bar, which costs $3,499 if configured online. It comes with a quad-core Intel Core i7 2.9GHz (up to 3.8GHz using Turbo Boost) processor, 16GB of 2133MHz memory, a 1TB PCIe SSD and a Radeon Pro 460 with 4GB of memory. It’s loaded. I think the only thing that could be upgraded beyond this configuration would be a 2TB drive, which would add another $800 to the price tag.

Physically, the MacBook Pro is awesome — very sturdy, very thin and very light. It feels great when holding it and carrying it around. Apple even sent along a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, which costs an extra $29, and a USB-C to Lightning cable that also costs an extra $29.

So yes, it feels great. Apple has made a great new MacBook Pro. Is it worth upgrading if you have a new-ish MacBook Pro at home already? Probably not, unless the Touch Bar really gets you going. The speed is not too far off from the previous version. However, if you have a lot of Thunderbolt 3/USB-C-connected peripherals, or plan on moving to them, then it is a good upgrade.

Testing
I ran some processor/graphics card intensive tests while I had the new MacBook Pro and came to the conclusion that FCPX is not that much faster than Adobe Premiere Pro CC 2017 when working with non-ProRes-based media. Yes, FCPX tears through ProRes QuickTimes if you already have your media in that format. What about if you shoot on a camera like the Red and don’t want to transcode to a more edit-friendly codec? Well, that is another story. To test out my NLEs, I grabbed a sample Red 6K 6144×3160 23.98fps clip from the Red sample footage page, strung out a 10-minute-long sequence in all the NLEs and exported both a color-graded version and a non-color-graded version as ProRes HQ QuickTime files matching the source file’s specs.

In order to work with Red media in some of the NLEs, you must download a few patches: for FCPX you must install the Red Apple workflow installer and for Media Composer you must install the Red AMA plug-in. Premiere doesn’t need anything extra.

Test 1: Red 6K 6144×3160 23.98fps R3D — 10-minute sequence (no color grade or FX) exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 57 minutes. Media Composer = two hours, 42 minutes. (Good news: Media Composer’s interface and fonts display correctly on the new display.)

You’ll notice that Resolve is missing from this list. That’s because I installed Resolve 12.5.4 Studio but then realized my USB dongle won’t fit into the USB-C port — and I am not buying an adapter for a laptop I don’t get to keep. So, unfortunately, I didn’t test a true 6K ProRes HQ export from Resolve, but in the last test you will see some Resolve results.

Overall, there was not much difference in speeds. In fact, I felt that Premiere Pro CC 2017 played the Red file a little more smoothly and at a higher frames-per-second count. FCPX struggled a little. Granted, a 6K Red file with no debayer quality reduction is one of the harder files for a CPU to process, but Apple touts this as a semi-replacement for the Mac Pro for the time being, and I am holding them to their word.

Test 2: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 58 minutes. Media Composer = two hours, 34 minutes.

It’s important to note that the GPU definitely helped out in both Adobe Premiere and FCPX: little to no extra time was added to the ProRes HQ export. I was really excited to see this, as without a good GPU, resizing and GPU-accelerated effects like color correction will slow your system to a snail’s pace if it doesn’t fully crash. Media Composer surprisingly sped up its export when I added the color grade as a new color layer in the timeline. Adding the color correction on a separate layer might have forced the Radeon to kick in and help push the file out. Not really sure what that is about, to be honest.

Test 3: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence resized to 1920×1080 on export as ProRes HQ. Premiere > Media Encoder = one hour, 16 minutes. FCPX = one hour, 14 minutes. Media Composer = one hour, 48 minutes. Resolve = one hour, 16 minutes.

So after these tests, it seems that exporting and transcoding speeds are all about the same. It doesn’t really come as too big of a surprise that all the NLEs, except for Media Composer, processed the Red file in the same amount of time. Regardless of the NLE, you would need to knock the debayering down to half or lower to start playing these clips in realtime in a timeline. If you have the time to transcode to ProRes, you will get much better playback and rendering speed results. Obviously, transcoding all of your files to a codec like ProRes or Avid DNx takes way more time up front, but could be worth it if you are crunched for time on the back end.

In addition to Red 6K files, I also tested ProRes HQ 4K files inside of Premiere and FCPX, and both played them extremely smoothly without hiccups, which is pretty amazing. Just a few years ago I was having trouble playing back 10:1 compressed files in Media Composer, and now I can play back superb-quality 4K files without a problem, a tremendous tip of the hat to technology and, specifically, Apple for putting so much power in a thin and light package.

While I was in the mood to test speeds, I hooked up a Thunderbolt 2 SSD RAID (OWC ThunderBay 4 mini) configured in RAID-0 to see what kind of read/write bandwidth I would get running through the Apple Thunderbolt 3 to Thunderbolt 2 adapter. I used both the AJA System Test and the Blackmagic Disk Speed Test. The AJA test reported a write speed of 929MB/sec. and a read speed of 1120MB/sec. The Blackmagic test reported write speeds of 683.1MB/sec. and 704.7MB/sec. across different runs, and a read speed of 1023.3MB/sec. I set the test file size for both at 4GB. These speeds are faster than what I have previously measured from this same Thunderbolt 2 SSD RAID on other systems.

For comparison, the AJA test reported a write speed of 1921MB/sec. and read speed of 2134MB/sec. when running on the system drive. The Blackmagic test doesn’t allow for testing on the system drive.
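
For the curious, a throughput test of this kind is simple to approximate. Below is a crude sequential write/read timer; it is not the AJA or Blackmagic methodology (and the read pass can be flattered by the OS cache), but it shows where the MB/sec figures come from:

```python
import os
import time

def disk_speed(path, size_gb=4, block_mb=64):
    """Sequential write then read of a size_gb test file, in MB/sec."""
    block = os.urandom(block_mb * 1024 * 1024)
    total_mb = size_gb * 1024
    t0 = time.perf_counter()
    with open(path, "wb", buffering=0) as f:
        for _ in range(total_mb // block_mb):
            f.write(block)
        os.fsync(f.fileno())              # make sure the data hit the disk
    write_speed = total_mb / (time.perf_counter() - t0)
    t0 = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while f.read(block_mb * 1024 * 1024):
            pass
    read_speed = total_mb / (time.perf_counter() - t0)
    os.remove(path)
    return write_speed, read_speed

print(disk_speed("/Volumes/ThunderBay/speedtest.bin"))  # hypothetical mount
```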

What Else You Need to Know
So what about the other upgrades and improvements? When exporting the R3D files, I noticed the fan kicked on when resizing or adding color grading to the files. It seems the GPU kicked in and heated up, which is to be expected. The fan is not the loudest, but it is noticeable.

The battery life on the new MacBook Pro is great when just playing music, surfing the web or writing product reviews. I found that the battery lasted about two days without having to plug in the power adapter. However, when exporting QuickTimes from either Premiere or FCPX, the battery life dropped — a lot. I was getting a battery life of one hour and six minutes, which is not good when your export will take two hours. Obviously, you need to plug in when doing heavy work; you don’t really have an option.

This leads me to the new USB-C/Thunderbolt 3 ports — and, yes, you still have a headphone jack (thank goodness they didn’t talk with the iPhone developers). First off, I thought the MagSafe power adapter should have won a Nobel Peace Prize. I love it. It must be responsible for saving millions of dollars in equipment when people trip over a power cord — gracefully disconnecting without breaking or pulling your laptop off the table. However, I am disappointed Apple didn’t create a new type of MagSafe cable with the USB-C port. I will miss it a lot. The good news is you can now plug in your power adapter to either side of the MacBook Pro.

Adapters and dongles will have to be purchased if you pick up a new MacBook Pro. Each time I wanted to use an external peripheral or memory card, like an SD card, a Tangent Ripple color correction panel or an external hard drive, I was disappointed that I couldn’t just plug it in. Nonetheless, a good Thunderbolt 3 dock is a necessity in my opinion. You could survive with dongles, but my OCD starts flaring up when I have to dig around my backpack for adapters. I’m just not a fan. I love how Apple dedicated themselves to a fast I/O like USB-C/Thunderbolt 3, but I really wish they had given it another year. Just one old-school USB port would have been nice. I might have even gotten over no SD card reader.

The Touch Bar
I like it. I would even say that I love it — in the apps that are compatible. Right now there aren’t many. Adobe released an update to Photoshop that added Touch Bar compatibility, and it is really handy, especially when you don’t have your Wacom tablet available (or a USB dongle to attach it). I love how it puts so many levels of your tools’ functionality within immediate reach.

It has super-fast feedback. When I adjusted the contrast on the Touch Bar, I found that the MacBook Pro responded immediately. This becomes even more evident in FCPX and the latest Resolve 12.5.4 update. It’s clear Apple did their homework and made their apps like Mail and Messages work with the Touch Bar (hence emojis on the Touch Bar). FCPX has a sweet ability to scrub the timeline, zoom in to the timeline, adjust text and more from just the Touch Bar — it’s very handy, and after a while I began missing it when using other computers.

In Blackmagic’s latest DaVinci Resolve release, 12.5.4, they have added Touch Bar compatibility. If you can’t plug in your color correction panels, the Touch Bar does a nice job of easing the pain. You can do anything from contrast work to saturation, even adjusting the midtones and printer lights, all from the Touch Bar. If you use external input devices a lot, like Wacom tablets or color correction panels, the Touch Bar will be right up your alley.

One thing I found missing was a simple application launcher on the Touch Bar. If you do pick up the new MacBook Pro with Touch Bar, you might want to download Touch Switcher, a free app I found via 9to5mac.com that allows you to have an app launcher on your Touch Bar. You can hide the dock, allowing you more screen real estate and the efficient use of the Touch Bar to launch apps. I am kind of surprised Apple didn’t make something like this standard.

The Display
From a purely superficial and non-scientific point of view, the newly updated P3-compatible wide-gamut display looks great… really great, actually. The colors are rich and vibrant. I did a little digging under the hood and noticed that it is an 8-bit display (you can find this by locating the pixel depth under System Information > Graphics/Displays), which might limit the color gradations when working in a color space like P3, as opposed to a 10-bit display working in P3. Simply put, you have a wider array of colors in P3 but a smaller number of shades to fill it.

The MacBook Pro display is labeled as 32-bit color, meaning the RGB and alpha channels each have 8 bits, for a total of 32 bits. Eight-bit color gives 256 shades per color channel, while 10-bit gives 1,024 shades per channel, allowing for much smoother transitions between colors and luminance values (imagine a sky at dusk going smoothly from orange to light blue to dark blue; more shades per channel make for a smoother gradient between lights and darks). A 10-bit display would carry 30-bit color, with each channel having 10 bits.
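
The arithmetic behind those numbers, for anyone who wants to check it:

```python
# Shades per channel and total RGB colors at each bit depth
for bits in (8, 10):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades:,} shades/channel, {shades ** 3:,} RGB colors")
# 8-bit: 256 shades/channel, 16,777,216 RGB colors
# 10-bit: 1,024 shades/channel, 1,073,741,824 RGB colors
```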

I tried to hook up a 10-bit display, but the supplied Thunderbolt 3 to Thunderbolt 2 dongle Apple sent me did not work with the display’s Mini DisplayPort connection. I did a little digging, and it seems people are generally not happy that Apple doesn’t allow this to work, especially since Thunderbolt 2 and Mini DisplayPort use the same connector. Some people have been able to get around this by daisy-chaining the display through something like a Thunderbolt 2 RAID.

While I couldn’t directly test an external display when I had the MacBook Pro, I’ve read that people have been able to push 10-bit color out of the USB-C/Thunderbolt 3 ports to an external monitor. So as long as you are at a desk with a monitor you can most likely have 10-bit color output from this system.

I reached out to Apple on the types of adapters they recommend for an external display and they suggest a USB-C to DisplayPort adapter made by Aukey. It retails for $9.99. They also recommend the USB-C to DisplayPort cable from StarTech, which retails for $39.99. Make sure you read the reviews on Amazon because the experience people have with this varies wildly. I was not able to test either of these so I cannot give my personal opinion.

Summing Up
In the end, the new MacBook Pro is awesome. If you own a recent release of the MacBook Pro and don’t have $3,500 to spare, I don’t know if this is the update you will be looking for. If you are trying to find your way around going to a Windows-based PC because of the lack of Mac Pro updates, this may ease the pain slightly. Without more than 16GB of memory and an Intel Xeon or two, however, it might actually slow you down.

The battery life is great when doing light work, one of the longest-lasting batteries I’ve used in a laptop. But when doing heavy work, you need to be near an outlet. When plugged into that outlet, be careful no one yanks out your USB-C power adapter, as it might throw your MacBook Pro to the ground or break off inside.

I really do love Apple products. They typically just work. I didn’t even touch on the new Touch ID Sensor that can immediately switch you to a different profile or log you in after waking up the MacBook Pro from sleep. I love that you can turn the new MacBook Pro on and it simply works, and works fast.

The latest iteration of FCPX is awesome as well, and just because I don’t see it being used a lot professionally doesn’t mean it shouldn’t be. It’s a well-built NLE that should be given a fairer shake than it has been given. If you are itching for an update to an old MacBook Pro, don’t mind having a dock or carrying around a bunch of dongles, then the 2016 MacBook Pro with the Touch Bar is for you.

The new MacBook Pro chews through ProRes-based media at 1920×1080 and up to 4K; 6K and higher will play but might slow down. If you are a Red footage user, this new MacBook Pro works great, but you still might have to knock the debayering down a couple of notches.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Mettle VR plug-ins for Adobe Premiere

By Barry Goch

I was very frustrated. I took a VR production class and I bought an LG 360 camera, but I felt like I was missing something. Then it dawned on me — I wanted more control. I started editing 360 videos using the VR video viewing tools in Adobe Premiere Pro, but I still lacked the control I desired. I wanted my audience to have a guided, immersive experience without having to sit in a swivel chair to get the most out of my work. Then, like a bolt of lightning, it came to me — I needed to rotate the 360 video sphere. I needed to be able to reorient it to accomplish my vision. But how would I do that?

Rotate Sphere plug-in showing keyframing.

Mettle’s Skybox 360/VR Tools are exactly what I was looking for. The Rotate Sphere plug-in alone is worth the price of the entire plug-in package. With this one plug-in, you’re able to re-orient your 360 video without worrying about any technical issues — it gives you complete creative control to re-frame your 360 video — and it’s completely keyframable too! For example, I mounted my 360 camera on my ski helmet this winter and went down a ski run at Heavenly in Lake Tahoe. There are amazing views of the lake from this run, but I also needed to follow the skiers ahead of me. Plus, the angle of the slope changed and the angle to the subjects I was following changed as well. Since the camera was fixed, how could I guide the viewer? By using the Rotate Sphere plug-in from Mettle and keyframing the orientation of the shot as the slope/subject relationship changed relative to my position.
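
The special case of yaw (spinning the sphere about its vertical axis) shows why equirectangular reorientation is tractable at all: it is just a horizontal wrap-around shift of the frame. A tiny NumPy sketch of that one axis (pitch and roll need the full per-pixel remap the plug-in handles):

```python
import numpy as np

def yaw_equirect(frame, degrees):
    """Rotate an equirectangular frame about the vertical axis by
    shifting columns with wrap-around; no resampling artifacts."""
    width = frame.shape[1]
    shift = int(round(width * degrees / 360.0))
    return np.roll(frame, shift, axis=1)
```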

My second favorite plug-in is Project 2D. Without the Project 2D plug-in, when you add titles to your 360 videos they become warped and you have very little control over their appearance. In Project 2D, you create your title using the built-in titler in Premiere Pro, add it to the timeline, then apply the Project 2D Mettle Skybox plug-in. Now you have complete control over the scale, rotation of the titling element and the placement of the title within the 360 video sphere. You can also use the Project 2D plug-in to composite graphics or video into your 360 video environment.

Mobius Zoom transition in action.

Rounding out the Skybox plug-in set are 360 video-aware plug-ins that every content creator needs. What do I mean by 360 video-aware? For example, when you apply a blur that is not 360-content-aware, it crosses the seam where the equirectangular video’s edges join together and makes the seam unseemly. With the Skybox Blur, Denoise, Glow and Sharpen plug-ins, you don’t have this problem. Just as the Rotate Sphere plug-in does the crazy math to rotate your 360 video without distortion or artifacts, these plug-ins do the same.
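
One way to picture a seam-aware blur, as a hedged sketch rather than Mettle’s method: pad the frame horizontally with wrap-around before filtering, so pixels on the left edge borrow neighbors from the right edge. (A true spherical blur also widens the kernel toward the poles, which this skips.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def seam_aware_blur(equirect, sigma=5):
    """Gaussian blur that respects the left/right seam of an
    equirectangular frame (expects an H x W x channels array)."""
    pad = 4 * sigma                       # cover the kernel's support
    wrapped = np.concatenate(
        [equirect[:, -pad:], equirect, equirect[:, :pad]], axis=1)
    blurred = gaussian_filter(wrapped, sigma=(sigma, sigma, 0))
    return blurred[:, pad:-pad]
```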

Transitioning between cuts in 360 video is an evolving art form. There is really no right or wrong way. Longer cuts, shorter cuts, dissolves and dips to black are some of the basic options. Now, Mettle is adding to our creative toolkit by applying their crazy math skills to transitions in 360 video. Mettle started with a first pack of four transitions: Mobius Zoom, Random Blocks, Gradient Wipe and Iris Wipe. I used the Mobius Zoom to transition from the header card to the video, and then the Iris Wipe with a soft edge to transition from one shot to the next in the linked video.

Check out this video, which uses Rotate Sphere, Project 2D, Mobius Zoom and Iris wipe effects.

New Plug-Ins
I’m pleased to be among the first to show you their second set of plug-ins designed specifically for 360/VR video: Chroma Leaks, Light Leaks, Spherical Blurs and everyone’s favorite, Light Rays!

Mettle plug-ins work on both Mac and Windows platforms — on qualified systems — and in realtime. The Mettle plug-ins are also both mono- and stereo-aware.

The Skybox plug-in set for Adobe Premiere Pro is truly the answer I’ve been looking for since I started exploring 360 video. It’s changed the way I work and opened up a world of control that I had been wishing for. Try it for yourself by downloading a demo at www.mettle.com.


Barry Goch is currently a digital intermediate editor for Deluxe in Culver City, working on Autodesk Flame. He started his career as a camera tech for Panavision Hollywood. He then transitioned to an offline Avid/FCP editor. His resume includes Passengers, Money Monster, Eye in the Sky and Game of Thrones. His latest endeavor is VR video.