Adobe Max 2018: Creative Cloud updates and more

By Mike McCarthy

I attended my first Adobe MAX last week in Los Angeles. This huge conference takes over the LA convention center and overflows into the surrounding venues. It began on Monday morning with a two-and-a-half-hour keynote outlining the developments and features being released in the newest updates to Adobe’s Creative Cloud. This was followed by all sorts of smaller sessions and training labs for attendees to dig deeper into the new capabilities of the various tools and applications.

The South Hall was filled with booths from various hardware and software partners, with more available than any one person could possibly take in. Tuesday started off with some early morning hands-on labs, followed by a second keynote presentation about creative and career development. I got a front row seat to hear five different people, who are successful in their creative fields — including director Ron Howard — discuss their approach to work and life. The rest of the day was so packed with various briefings, meetings and interviews that I didn’t get to actually attend any of the classroom sessions.

By Wednesday, the event was beginning to wind down, but there was still a plethora of sessions and other options for attendees to split their time. I presented the workflow for my most recent project Grounds of Freedom at Nvidia’s booth in the community pavilion, and spent the rest of the time connecting with other hardware and software partners who had a presence there.

Adobe released updates for most of its creative applications concurrent with the event. Many of the most relevant updates to the video tools were previously announced at IBC in Amsterdam last month, so I won’t repeat those, but there are still a few new video announcements, as well as many updates that are broader in scope, covering media as a whole.

Adobe Premiere Rush
The biggest video-centric announcement is Adobe Premiere Rush, which offers simplified video editing workflows for mobile devices and PCs.  Currently releasing on iOS and Windows, with Android to follow in the future, it is a cloud-enabled application, with the option to offload much of the processing from the user device. Rush projects can be moved into Premiere Pro for finishing once you are back on the desktop.  It will also integrate with Team Projects for greater collaboration in larger organizations. It is free to start using, but most functionality will be limited to subscription users.

Let’s keep in mind that I am a finishing editor for feature films, so my first question (as a Razr-M user) was, “Who wants to edit video on their phone?” But what if the user shot the video on their phone? I don’t do that, but many people do, so I know this will be a valuable tool. This has me thinking about my own mentality toward video. If I were a sculptor, I would be sculpting stone, while many people are sculpting with clay or silly putty. Because of that, I would have trouble sculpting in clay and would see little value in tools that can only sculpt clay. But there is probably benefit to being well versed in both.

I would have no trouble showing my son’s first-year video compilation to a prospective employer because it is just that good — I don’t make anything less than that. But there was no second-year video, even though I have the footage because that level of work takes way too much time. So I need to break free from that mentality, and get better at producing content that is “sufficient to tell a story” without being “technically and artistically flawless.” Learning to use Adobe Rush might be a good way for me to take a step in that direction. As a result, we may eventually see more videos in my articles as well. The current ones took me way too long to produce, but Adobe Rush should allow me to create content in a much shorter timeframe, if I am willing to compromise a bit on the precision and control offered by Premiere Pro and After Effects.

Rush allows up to four layers of video, with various effects and 32-bit Lumetri color controls, as well as AI-based audio filtering for noise reduction and de-reverb and lots of preset motion graphics templates for titling and such.  It should allow simple videos to be edited relatively easily, with good looking results, then shared directly to YouTube, Facebook and other platforms. While it doesn’t fit into my current workflow, I may need to create an entirely new “flow” for my personal videos. This seems like an interesting place to start, once they release an Android version and I get a new phone.

Photoshop Updates
There is a new version of Photoshop released nearly every year, and most of the time I can’t tell the difference between the new and the old. This year’s differences will probably be a lot more apparent to most users after a few minutes of use. The Undo command now works like it does in other apps, instead of being limited to toggling the last action. Transform operates very differently: proportional transform is now the default behavior, instead of requiring users to hold Shift every time they scale. The anchor point can be hidden to prevent people from moving the anchor instead of the image, and the “commit changes” step at the end has been removed. These are all positive improvements, in my opinion, though they might take a bit of getting used to for seasoned pros.

There is also a new Frame Tool, which allows you to scale or crop any layer to a defined resolution. Maybe I am the only one, but I frequently find myself creating new documents in Photoshop just so I can drag a new layer, preset to the resolution I need, back into my current document. For example, if I need a 200x300px box in the middle of my HD frame, how else do you do that currently? This Frame Tool should fill that hole, offering more precise control over layer and object sizes and positions, as well as providing easily adjustable, non-destructive masking.

They also showed off a very impressive AI-based auto selection of the subject or background. It creates a standard selection that can be manually modified anywhere the initial attempt didn’t give you what you were looking for. Being someone who gives software demos, I don’t trust prepared demonstrations, so I wanted to try it for myself with a real-world asset. I opened up one of the source photos from my animation project, clicked the “Select Subject” button with no further input, and got a largely accurate selection. It needed some cleanup at the bottom, and refinement in the newly revamped Select and Mask tool, but this is a huge improvement over what I had to do on hundreds of layers earlier this year. They also demonstrated a similar feature they are working on for video footage in Tuesday night’s Sneak previews. Named Project Fast Mask, it automatically propagates masks of moving objects through video frames and, while not released yet, it looks promising. Combined with the content-aware background fill for video that Jason Levine demonstrated in After Effects during the opening keynote, basic VFX work is going to get a lot easier.

There are also some smaller changes to the UI, allowing math expressions in the numerical value fields and making it easier to differentiate similarly named layers by showing the beginning and end of the name if it gets abbreviated.  They also added a function to distribute layers spatially based on the space between them, which accounts for their varying sizes, compared to the current solution which just evenly distributes based on their reference anchor point.

In other news, Photoshop is coming to iPad, and while that doesn’t affect me personally, I can see how this could be a big deal for some people. They have offered various trimmed down Photoshop editing applications for iOS in the past, but this new release is supposed to be based on the same underlying code as the desktop version and will eventually replicate all functionality, once they finish adapting the UI for touchscreens.

New Apps
Adobe also showed off Project Gemini, which is a sketch and painting tool for iPad that sits somewhere between Photoshop and Illustrator (hence the name, I assume). This doesn’t have much direct application to video workflows besides being able to record time-lapses of a sketch, which should make it easier to create those “white board illustration” videos that are becoming more popular.

Project Aero is a tool for creating AR experiences, and I can envision Premiere and After Effects being critical pieces in the puzzle for creating the visual assets that Aero will be placing into the augmented reality space.  This one is the hardest for me to fully conceptualize. I know Adobe is creating a lot of supporting infrastructure behind the scenes to enable the delivery of AR content in the future, but I haven’t yet been able to wrap my mind around a vision of what that future will be like.  VR I get, but AR is more complicated because of its interface with the real world and due to the variety of forms in which it can be experienced by users.  Similar to how web design is complicated by the need to support people on various browsers and cell phones, AR needs to support a variety of use cases and delivery platforms.  But Adobe is working on the tools to make that a reality, and Project Aero is the first public step in that larger process.

Community Pavilion
Adobe’s partner companies in the Community Pavilion were showing off a number of new products. Dell has a new 49-inch IPS monitor, the U4919DW, which offers the resolution and desktop space of two 27-inch QHD displays without the seam (5120x1440, to be exact). HP was displaying its recently released ZBook Studio x360 convertible laptop workstation (which I will be posting a review of soon), as well as its ZBook x2 tablet and the rest of its Z workstations. Nvidia was exhibiting its new Turing-based cards with 8K Red decoding acceleration, ray tracing in Adobe Dimension and other GPU-accelerated tasks. AMD was demoing 4K Red playback on a MacBook Pro with an eGPU solution, and CPU-based ray tracing on its Ryzen systems. The other booths spanned the gamut from GoPro cameras and server storage devices to paper stock products for designers. I even won a Thunderbolt 3 docking station at Intel’s booth. (Although in the next drawing they gave away a brand-new Dell Precision 5530 2-in-1 convertible laptop workstation.) Microsoft also garnered quite a bit of attention when it gave away 30 MS Surface tablets near the end of the show. There was lots to see and learn everywhere I looked.

The Significance of MAX
Adobe MAX is quite a significant event, especially now that I have been in the industry long enough to start to see the evolution of certain trends — things are not as static as we may expect. I have attended NAB for the last 12 years, and the focus of that show has shifted significantly away from my primary professional focus. (No Red, Nvidia or Apple booths, among many other changes.) This was the first year that I had the thought “I should have gone to Sundance,” and a number of other people I know had the same impression. Adobe MAX is similar, although I have been a little slower to catch on to that change. It has been happening for over ten years, but the show has grown dramatically in size and significance recently. If I still lived in LA, I probably would have started attending sooner, but it was hardly on my radar until three weeks ago. Now that I have seen it in person, I probably won’t miss it in the future.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Adobe updates Creative Cloud

By Brady Betzel

You know it’s almost fall when pumpkin spice lattes are back and Adobe announces its annual updates. At this year’s IBC, Adobe had a variety of updates to its Creative Cloud line of apps. From more info on its new editing platform Project Rush to the addition of Characterizer to Character Animator — there are a lot of updates, so I’m going to focus on a select few that I think really stand out.

Project Rush

I use Adobe Premiere quite a lot these days; it’s quick and relatively easy to use and will work with pretty much every codec in the universe. In addition, the Dynamic Link between Adobe Premiere Pro and Adobe After Effects is an indispensable feature in my world.

With the 2018 fall updates, Adobe Premiere will be closer to a color tool like Blackmagic’s Resolve with the addition of new hue saturation curves in the Lumetri Color toolset. In Resolve, these are some of the most important aspects of the color corrector, and I think that will be the same for Premiere. From Hue vs. Sat, which can help isolate a specific color and desaturate it, to Hue vs. Luma, which can help add or subtract brightness values from specific hues and hue ranges — these new color correcting tools further Premiere’s venture into true professional color correction. These new curves will also be available inside of After Effects.

After Effects features many updates, but my favorites are the ability to access depth matte data of 3D elements and the addition of the new JavaScript engine for building expressions.
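For readers who haven’t written expressions before: they are snippets of JavaScript attached to layer properties, and the new engine brings modern syntax to them. After Effects supplies interpolation helpers like `linear()` natively; the standalone re-implementation below is only a sketch to illustrate the kind of clamped interpolation a typical expression performs.

```javascript
// A standalone sketch of AE's built-in linear() interpolation helper.
// Inside After Effects this function already exists; re-implementing it
// here just shows the logic an expression relies on.
const linear = (t, tMin, tMax, v1, v2) => {
  if (t <= tMin) return v1; // clamp below the input range
  if (t >= tMax) return v2; // clamp above the input range
  return v1 + ((t - tMin) / (tMax - tMin)) * (v2 - v1);
};

// Example expression logic: fade a layer's opacity from 0% to 100%
// between 1s and 2s of composition time.
const opacityAt = (time) => linear(time, 1, 2, 0, 100);
```

In an actual expression, `time` is provided by After Effects and the last evaluated value becomes the property value; the function above mirrors that behavior outside the app.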

There is one update that runs across both Premiere and After Effects that seems to be a sleeper update. The improvements to motion graphics templates, if implemented correctly, could be a time and creativity saver for both artists and editors.

AI
Adobe, like many other companies, seems to be diving heavily into the “AI” pool, which is amazing, but… with great power comes great responsibility. Others might not feel this way, but sometimes I don’t want all the work done for me. With new features like Auto Lip Sync and Color Match, editors and creators of all kinds should not lose the forest for the trees. I’m not telling people to ignore these features, but asking that they put a few minutes into discovering how the color of a shot was matched, so they can fix it if something goes wrong. You don’t want to be the editor who says, “Premiere did it,” without a good way to repair the result.

What Else?
I would love to see Adobe take a stab at digging up the bones of SpeedGrade and integrating it into the Premiere Pro world as a new tab. Call it Lumetri Grade, or whatever. A page with a more traditional colorist layout and clip organization would go a long way.

In the end, there are plenty of other updates to Adobe’s 2018 Creative Cloud apps, and you can read their blog to find out about other updates.

NAB: Adobe’s spring updates for Creative Cloud

By Brady Betzel

Adobe has had a tradition of releasing Creative Cloud updates prior to NAB, and this year is no different. The company has been focused on improving existing workflows and adding new features, some based on Adobe’s Sensei technology, as well as improved VR enhancements.

In this release, Adobe has announced a handful of Premiere Pro CC updates. While I personally don’t think that they are game changing, many users will appreciate the direction Adobe is going. If you are color correcting, Adobe has added the Shot Match function that allows you to match color between two shots. Powered by Adobe’s Sensei technology, Shot Match analyzes one image and tries to apply the same look to another image. Included in this update is the long-requested split screen to compare before and after color corrections.

Motion graphic templates have been improved with new adjustments like 2D position, rotation and scale. Automatic audio ducking has been included in this release as well. You can find this feature in the Essential Sound panel, and once applied it will essentially dip the music in your scene based on dialogue waveforms that you identify.

Still inside of Adobe Premiere Pro CC, but also applicable to After Effects, is Adobe’s enhanced Immersive Environment. This update is for people who use VR headsets to edit and/or process VFX. Team Project workflows have been updated with better version tracking and indicators of who is using bins and sequences in realtime.

New Timecode Panel
Overall, while these updates are helpful, none are barn burners. The thing that does have me excited is the new Timecode Panel — it’s the biggest new update to the Premiere Pro CC app. For years now, editors have been clamoring for more than just one timecode view. You can view sequence timecodes, source media timecodes from the clips on the different video layers in your timeline, and you can even view the same sequence timecode in a different frame rate (great for editing those 23.98 shows to a 29.97/59.94 clock!). And one of my unexpected favorites is the clip name in the timecode window.

I was testing this feature in a pre-release version of Premiere Pro, and it was a little wonky. First, I couldn’t dock the timecode window. While I could add lines and access the different menus, my changes wouldn’t apply to the row I had selected. In addition, I could only right-click and try to change the first row of contents, but it would choose a random row to change. I am assuming the final release has this all fixed. If the wonkiness gets flushed out, this is a phenomenal (and necessary) addition to Premiere Pro.

Codecs, Master Property, Puppet Tool, more
There have been some compatible codec updates, specifically Raw Sony X-OCN (Venice), Canon Cinema Raw Light (C200) and Red IPP2.

After Effects CC has also been updated with Master Property controls. Adobe said it best during their announcement: “Add layer properties, such as position, color or text, in the Essential Graphics panel and control them in the parent composition’s timeline. Use Master Property to push individual values to all versions of the composition or pull selected changes back to the master.”

The Puppet Tool has been given some love with a new Advanced Puppet Engine, which improves the mesh and starch workflows used to animate static objects. Beyond updates that make the Add Grain, Remove Grain and Match Grain effects multi-threaded, enhanced disk caching and project management improvements have been added.

My favorite update for After Effects CC is the addition of data-driven graphics. You can drop a CSV or JSON data file and pick-whip data to layer properties to control them. In addition, you can drag and drop data right onto your comp to use the actual numerical value. Data-driven graphics is a definite game changer for After Effects.
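Conceptually, data-driven graphics map fields in a JSON or CSV file onto layer properties. The sketch below illustrates that mapping in plain JavaScript; the data shape (`team`/`score`) and the 400px bar scale are hypothetical, and in After Effects the actual linkage is made by pick-whipping in the UI rather than in code.

```javascript
// Hypothetical data file contents: two teams with scores from 0 to 100.
const json = '[{"team":"A","score":42},{"team":"B","score":58}]';
const rows = JSON.parse(json);

// Map each score (0-100) onto a bar-graph height (0-400px): 4px per point.
const pxPerPoint = 400 / 100;
const barHeights = rows.map(r => r.score * pxPerPoint); // [168, 232]
```

Updating the data file and re-rendering is what makes the workflow a game changer: the graphic follows the numbers instead of being re-keyframed by hand.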

Audition
While Adobe Audition is an audio mixing application, it has some updates that will directly help anyone looking to mix their edit in Audition. In the past, to get audio to a mixing program like Audition, Pro Tools or Fairlight, you would have to export an AAF (or, if you are old like me, possibly an OMF). In the latest Audition update you can simply open your Premiere Pro projects directly in Audition, re-link video and audio and begin mixing.

I asked Adobe whether you could go back and forth between Audition and Premiere, but it seems like it is a one-way trip. They must be expecting you to export individual audio stems once done in Audition for final output. In the future, I would love to see back-and-forth capabilities between apps like Premiere Pro and Audition, much like the Fairlight tab in Blackmagic’s Resolve. There are some other updates, like larger tracks and under-the-hood improvements, which you can read more about at https://theblog.adobe.com/creative-cloud/.

Adobe Character Animator has some cool updates, like improvements to overall character building, but I am not too involved with Character Animator, so you should definitely read about things like the Trigger improvements on Adobe’s blog.

Summing Up
In the end, it is great to see Adobe moving forward on updates to its Creative Cloud video offerings. Data-driven animation inside of After Effects is a game-changer. Shot color matching in Premiere Pro is a nice step toward a professional color correction application. Importing Premiere Pro projects directly into Audition is definitely a workflow improvement.

I do have a wishlist though: I would love for Premiere Pro to concentrate on tried-and-true solutions before adding fancy updates like audio ducking. For example, I often hear people complain about how hard it is to export a QuickTime out of Premiere with either stereo or mono/discrete tracks. You need to set up the sequence correctly from the jump, adjust the pan on the tracks, as well as adjust the audio settings and export settings. Doesn’t sound streamlined to me.

In addition, while shot color matching is great, let’s get an Adobe SpeedGrade-style view tab into Premiere Pro so it works like a professional color correction app… maybe Lumetri Pro? I know if the color correction setup was improved I would be way more apt to stay inside of Premiere Pro to finish something instead of going to an app like Resolve.

Finally, consolidating and transcoding used clips with handles is hit or miss inside of Premiere Pro. Can we get a rock-solid consolidate-and-transcode feature inside of Premiere Pro? Despite these few negatives, Premiere Pro is an industry staple and it works very well.

Check out Adobe’s NAB 2018 update video playlist for details on each and every update.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: Dell’s 8K LCD monitor

By Mike McCarthy

At CES 2017, Dell introduced its UP3218K, a 32-inch LCD that was the first commercially available 8K display. It runs 7680×4320 pixels at 60fps, driven by two DisplayPort 1.4 cables. That is over 33 million pixels per frame, and nearly 2 billion per second, which requires a lot of GPU power to generate. It has been available since March, and I was recently offered one to review as part of a wider exploration of 8K video production workflows; there will be more articles about that larger story in the near future.
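The arithmetic behind those figures is easy to verify; a quick sketch:

```javascript
// Checking the pixel math for a 7680x4320 display at 60fps.
const width = 7680, height = 4320, fps = 60;
const pixelsPerFrame = width * height;        // 33,177,600: "over 33 million"
const pixelsPerSecond = pixelsPerFrame * fps; // 1,990,656,000: "nearly 2 billion"
```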

For this review, I will be focusing on only this product and its uses.

The UP3218K showed up in a well-designed box that was easy to unpack — it was also easy getting the monitor onto the stand. I plugged it into my Nvidia Quadro P6000 card with the included DisplayPort cables, and it came up as soon as I turned it on… at full 60Hz and without any issues or settings to change. Devices with only one DisplayPort 1.4 connector will only power the display at 30Hz, as a full 60Hz connection saturates the bandwidth of two DP 1.4 cables. The display does require a DisplayPort 1.4 connection, and will not revert to lower resolution when connected to a 1.2 port. This limits the devices that can drive it to Pascal-based GPUs on the Nvidia side, or top-end Vega GPUs on the AMD side. I have a laptop with a P5000 in it, so I was disappointed to discover that its DisplayPort connector was still only 1.2, making it incompatible with this 8K monitor.
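A back-of-the-envelope bandwidth estimate shows why two cables are needed. The numbers below use the nominal HBR3 payload rate per cable and ignore blanking intervals and Display Stream Compression, so treat them as approximations rather than spec values:

```javascript
// Rough bandwidth estimate for 8K at 60Hz with 8-bit RGB color.
const bitsPerPixel = 24; // 8 bits per channel, 3 channels
const rawGbps = 7680 * 4320 * 60 * bitsPerPixel / 1e9; // ~47.8 Gbit/s of pixel data

// DP 1.4 HBR3 carries ~25.92 Gbit/s of payload per cable
// (32.4 Gbit/s raw link rate minus 8b/10b encoding overhead).
const dp14PayloadGbps = 25.92;
const cablesNeeded = Math.ceil(rawGbps / dp14PayloadGbps); // 2
```

One cable cannot carry ~47.8 Gbit/s of pixel data, which is why single-connector DP 1.4 devices fall back to 30Hz.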

Dell’s top Precision laptops (7720 and 7520) support DP 1.4, while HP’s and Lenovo’s mobile workstations do not yet. This is a list of every device I am aware of that explicitly claims to support 8K output:
1. Nvidia Quadro P6000, P5000, P4000 and P2000 workstation GPU cards
2. Nvidia Titan X and GeForce 10 Series graphics cards
3. AMD Radeon Pro SSG, WX9100 and WX7100 workstation GPU cards
4. AMD RX Vega 64 and 56 graphics cards
5. Dell Precision 7520 and 7720 mobile workstations

If you know of other laptops with DP 1.4, leave a comment.

So once you have a system that can drive the monitor, what can you do with it? Most people reading this article will probably be using this display as a dedicated full-screen monitor for their 8K footage. But smooth 8K editing and playback is still a ways away for most people. The other option is to use it as your main UI monitor to control your computer and its applications. In either case, color can be as important as resolution when it comes to professional content creation, and Dell has brought everything it has to the table in this regard as well.

The display supports Dell’s PremierColor toolset, which is loosely similar to the functionality that HP offers under its DreamColor branding. PremierColor means a couple of things: the display has the internal processing power to correctly emulate different color spaces, and it can be calibrated with an X-Rite i1Display Pro independent of the system driving it. It also interfaces with a few software tools that Dell has developed for its professional users. The most significant functionality within that feature set is the factory-calibrated options for emulating AdobeRGB, sRGB, Rec.709 and DCI-P3. Dell tests each display individually after manufacturing to ensure that it is color accurate. These are great features, but they are not unique to this monitor, and many users have been using them on other display models for the last few years. While color accuracy is important, the main selling point of this particular model is resolution, and lots of it. And that is what I spent the majority of my time analyzing.

Resolution
The main issue here is the pixel density. Ten years ago, 24-inch displays were 1920×1200, and 30-inch displays had 2560×1600 pixels. This was around 100 pixels per inch, and most software was hard coded to look correct at that size. When UHD displays were released, the 32-inch version had a DPI of 140. That resulted in applications looking quite small and hard to read on the vast canvas of pixels, but this trend increased pressure on software companies to scale their interfaces better for high DPI displays. Windows 7 was able to scale things up an extra 50%, but a lot of applications ignored that setting or were not optimized for it. Windows 10 now allows scaling beyond 300%, which effectively triples the size of the text and icons. We have gotten to the point where even 15-inch laptops have UHD screens, resulting in 280 DPI, which is unreadable to most people without interface scaling.
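The density figures above follow directly from resolution and panel size: pixel density is the diagonal resolution in pixels divided by the diagonal size in inches. This sketch assumes typical panel diagonals (31.5 inches for the 32-inch class, 15.6 inches for a 15-inch laptop), which lands close to the rounded numbers quoted:

```javascript
// Pixels per inch = diagonal pixel count / diagonal inches.
const ppi = (w, h, inches) => Math.hypot(w, h) / inches;

const uhd32  = ppi(3840, 2160, 31.5); // ~140: the 32-inch UHD figure
const laptop = ppi(3840, 2160, 15.6); // ~282: a UHD 15-inch laptop
const eightK = ppi(7680, 4320, 31.5); // ~280: the 8K monitor
```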

With 8K resolution, this monitor has 280 DPI, twice that of a 4K display of similar size. This is on par with a 15-inch UHD laptop screen, but laptops are usually viewed from a much closer range. Since I am still using Windows 7 on my primary workstation, I was expecting 280 DPI to be unusable for effective work. And while everything is undoubtedly small, it is incredibly crisp, and once I enabled Windows scaling at 150%, it was totally usable (although I am used to small fonts and lots of screen real estate). The applications I use, especially Adobe CC, scale much smoother than they used to, so everything looks great, even with Windows 7, as long as I sit fairly close to the monitor.

I can edit 6K footage in Premiere Pro at full resolution for the first time, with space left over for my timeline and tool panels. In After Effects, I can work on 4K shots in full resolution and still have 70 layers of data visible in my composition. In Photoshop, setting the UI to 200% makes the panels behave as they do on a standard 4K 32-inch display, but with your image having four times the detail. I can edit my 5.6K DSLR files in full resolution, with nearly every palette open, and still work smoothly through my various tools.

This display replaces my 34-inch curved U3415W as my new favorite monitor for Adobe apps, although I would still prefer the extra-wide 34-inch display for gaming and other general usability. But for editing or VFX work, the 8K panel is a dream come true. Every tool is available at the same time, and all of your imagery is available at HiDPI quality.

When gaming, the resolution doesn’t typically affect the field of view of 3D applications, but for older 2D games, you can see the entire map at once. Age of Empires II HD offers an expansive view of really small units, but there is a texture issue with the background of the bottom quarter of the screen. I think I used to see this at 4K as well, and it got fixed in an update, so maybe the same thing will happen with this one, once 8K becomes more common.

I had a similar UI artifact issue in RedCine player when I full-screened the window on the 8K display, which was disappointing, since that was one of the few ways to smoothly play 8K footage on the monitor at full resolution. Using it as a dedicated output monitor works as well, but I did run into some limitations. I did eventually get it to work with RedCine-X Pro, after initially experiencing some aspect ratio issues. It would play back cached frames smoothly, but only for 15 seconds at a time before running out of decoded frames, even with a Rocket-X accelerator card.

When configured as a secondary display for dedicated full-screen output, it is accessible via Mercury Transmit in the Adobe apps. This is where it gets interesting, because the main feature that this monitor brings to the table is increased resolution. While that is easy to leverage in Photoshop, it is very difficult to drive that many pixels in real-time for video work, and decreasing the playback resolution negates the benefit of having an 8K display. At this point, effectively using the monitor becomes more an issue of workflow.

I was going to use 8K Red footage for my test, but that wouldn’t play smoothly in Premiere, even on my 20-core workstation, so I converted it to a variety of other formats to test with. I created 8K test assets that matched the monitor resolution in DNxHR, Cineform, JPEG 2000, OpenEXR and HEVC. DNxHR was the only format that offered full-resolution playback at 8K, and even that dropped frames on a regular basis. Being able to view 8K video is pretty impressive, and it has probably forever shifted my subjective sense of “sharp,” but we are still waiting for hardware processing power to catch up before 8K video editing becomes an effective reality for users.

Summing Up
The UP3218K is the ultimate monitor for content creators and artists looking for a large digital canvas, regardless of whether that is measured in inches or pixels. All those pixels come at a price — it is currently available from Dell for $3,900. Is it worth it? That will depend on what your needs and your budget are. Is a Mercedes Benz worth the increased price over a Honda? Some people obviously think so.

There is no question that this display, and the hardware to drive it effectively, would be a luxury for the average user. But for people who deal with high-resolution content on a regular basis, the increased functionality it offers can’t be measured in the same way, and reading an article and seeing pictures online can’t compare to actually using the physical item. The screenshots are all scaled to 25% to be a reasonable size for the web; I am just trying to communicate a sense of the scope of the desktop real estate available to users on an 8K screen. So yes, it is expensive, but at the moment it is the highest resolution monitor that money can buy, and the nearest alternatives (5K screens) are not even in the same league.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

 

Adobe intros updates to Creative Cloud, including Team Projects

Later this year, Adobe will be offering new capabilities within its Adobe Creative Cloud video tools and services. This includes updates for VR/360, animation, motion graphics, editing, collaboration and Adobe Stock. Many of these features are powered by Adobe Sensei, the company’s artificial intelligence and machine learning framework. Adobe will preview these advancements at IBC.

The new capabilities coming later this year to Adobe Creative Cloud for video include:
• Access to motion graphics templates in Adobe Stock and through Creative Cloud Libraries, as well as usability improvements to the Essential Graphics panel in Premiere Pro, including responsive design options for preserving spatial and temporal relationships.
• Character Animator 1.0 with changes to core and custom animation functions, such as pose-to-pose blending, new physics behaviors and visual puppet controls. Adobe Sensei will help improve lip-sync capability by accurately matching mouth shape with spoken sounds.
• Virtual reality video creation with a dedicated viewing environment in Premiere Pro. Editors can experience the deeply engaging qualities of content, review their timeline and use keyboard driven editing for trimming and markers while wearing the same VR head-mounts as their audience. In addition, audio can be determined by orientation or position and exported as ambisonics audio for VR-enabled platforms such as YouTube and Facebook. VR effects and transitions are now native and accelerated via the Mercury playback engine.
• Improved collaborative workflows with Team Projects on the Local Area Network with managed access features that allow users to lock bins and provide read-only access to others. Formerly in beta, the release of Team Projects will offer smoother workflows hosted in Creative Cloud and the ability to more easily manage versions with auto-save history.
• Flexible session organization comes to multi-take workflows, along with continuous playback while editing, in Adobe Audition. Powered by Adobe Sensei, auto-ducking is added to the Essential Sound panel, automatically adjusting levels by type: dialogue, background sound or music.

Integration with Adobe Stock
Adobe Stock now offers over 90 million assets, including photos, illustrations and vectors. Customers have access to over 4 million HD and 4K Adobe Stock video clips directly within their Creative Cloud video workflows and can now search and scrub those assets in Premiere Pro.

Coming to this new release are hundreds of professionally created motion graphics templates for Adobe Stock, available later this year. Additionally, motion graphics artists will be able to sell their templates for Premiere Pro through Adobe Stock. Earlier this year, Adobe added editorial and premium collections from Reuters, USA Today Sports, Stocksy and 500px.

Chaos Group and Adobe partner for photorealistic rendering in CC

Chaos Group’s V-Ray rendering technology is featured in Adobe’s Creative Cloud, allowing graphic designers to easily create photorealistic 3D rendered composites with Project Felix.

Available now, Project Felix is a public beta desktop app that helps users composite 3D assets like models, materials and lights with background images, resulting in an editable render they can continue to design in Photoshop CC. For example, users can turn a basic 3D model of a generic bottle into a realistic product shot that is fully lit and placed in a scene to create an ad, concept mock-up or even abstract art.

V-Ray acts as a virtual camera, letting users test angles, perspectives and placement of their model in the scene before generating a final high-res render. Using the preview window, Felix users get immediate visual feedback on how each edit affects the final rendered image.

By integrating V-Ray, Adobe has brought the same raytracing technology used by companies like Industrial Light & Magic to a much wider audience.

“We’re thrilled that Adobe has chosen V-Ray to be the core rendering engine for Project Felix, and to be a part of a new era for 3D in graphic design,” says Peter Mitev, CEO of Chaos Group. “Together we’re bringing the benefits of photoreal rendering, and a new design workflow, to millions of creatives worldwide.”

“Working with the amazing team at Chaos Group meant we could bring the power of the industry’s top rendering engine to our users,” adds Stefano Corazza, senior director of engineering at Adobe. “Our collaboration lets graphic designers design in a more natural flow. Each edit comes to life right before their eyes.”

Review: Lenovo ThinkStation P410

By Brady Betzel

With the lukewarm reaction of the professional community to the new Apple MacBook Pro, there are many creative professionals who are seriously — for the first time in their careers — considering whether or not to jump into a Windows-based world.

I grew up using an Apple IIGS from 1986 (I was born in 1983, if you’re wondering), but I have always worked on both Windows and Apple computers. I guess my father really instilled the idea of being independent and not relying on one thing or one way of doing something; he wanted me to rely on my own knowledge and not on others.

Not to get too philosophical, but when he purchased all the parts I needed to build my own Windows system, it was incredibly gratifying. I would have loved to have built my own Apple system, but obviously never could. That is why I am so open to computer systems of any operating system software.

If you are deciding whether or not to upgrade your workstation and have never considered solutions other than HP, Dell or Apple, you will want to read what I have to say about Lenovo‘s latest workstation, the P410.

When I set out on this review, I didn’t have any DisplayPort-compatible monitors, and Lenovo was nice enough to send their beautiful ThinkVision Pro2840m, another great piece of hardware.

Digging In
I want to jump right into the specs of the ThinkStation P410. Under the hood is an Intel Xeon E5-1650 v4, which in plain terms is a six-core, 3.6GHz CPU with 15MB of cache that can reach all the way up to 4.0GHz when needed using Intel’s Turbo Boost technology. The graphics card is a medium-sized monster: the Nvidia Quadro M4000 with 8GB of GDDR5 memory and 1,664 CUDA cores. It has four DisplayPort 1.2 ports to power those four 30-bit 4096×2160 @60Hz displays you will run when editing and color correcting.

If you need more CUDA power, you could step up to the Nvidia Quadro M5000, which runs 2048 CUDA cores, or the M6000, which runs 3072 CUDA cores, but that power isn’t cheap (and as of this review they are not even an option in Lenovo’s P410 customization; you would probably have to step up to a higher model number).

There is 16GB of DDR4-2400 ECC memory and a 1TB 2.5-inch SATA 6Gb/s SSD (made by Micron), plus a few extras like a DVD writer, media card reader, keyboard and mouse. At the time I was writing this review, you could configure this system for a grand total of $2,794, but if you purchase it online at shop.lenovo.com it will cost a little under $2,515 with some online discounts. As I priced this system out over a few weeks, I noticed the prices changed, so keep in mind it could be higher. I configured a similarly specced HP Z440 workstation for around $3,600 and a Dell Precision Tower 5000 for around $3,780, so Lenovo’s prices are on the low end for major-brand workstations.

For expansion (which Windows-based PCs seem to lead the pack in), you have a total of four DIMM slots for memory (two are taken up already by two 8GB sticks), four PCIe slots and four hard drive bays. Two of the hard drive bays are considered Flex Bays, which can be used for hard drives, hard drive + slim optical drive or something like front USB 3.0 ports.

On the back there are your favorite PS/2 keyboard port and mouse port, two USB 2.0 ports, four USB 3.0 ports, audio in/out/mic and four DisplayPorts.

Testing
I first wanted to test the P410’s encoding speed using Adobe Media Encoder. I took an eight-minute, 30-second 1920×1080 23.98fps ProRes HQ QuickTime that I had filmed using a Blackmagic Pocket Cinema Camera, did a quick color balance in Adobe Premiere Pro CC 2017 using the Lumetri Color tools and exported a single-pass, variable-bit-rate 25Mb/s H.264 using Media Encoder. Typically, when exporting from Premiere Pro CC or Media Encoder, GPU acceleration kicks in only if you’ve applied GPU-accelerated effects, such as transitions, scaling or Lumetri color correction; otherwise, if you are just transcoding from one codec to another, the CPU handles the task.

In this test, it took Media Encoder about six minutes to encode the H.264 with Mercury Playback Engine GPU Acceleration (CUDA) enabled. Without GPU acceleration it took 14 minutes. So using the GPU cut the encode time by more than half (roughly a 57 percent reduction, or a 2.3x speedup), thanks to the power of the Nvidia Quadro M4000 with 8GB of GDDR5 RAM.
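As a quick sanity check on those encode times, the arithmetic works out like this:

```python
# Deriving the GPU benefit from the two measured encode times above.
cpu_minutes = 14   # software-only encode (Mercury Playback Engine: Software)
gpu_minutes = 6    # CUDA-accelerated encode

speedup = cpu_minutes / gpu_minutes          # how many times faster
time_saved = 1 - gpu_minutes / cpu_minutes   # fraction of wall-clock time saved
print(f"{speedup:.1f}x speedup, {time_saved:.0%} time saved")
# prints: 2.3x speedup, 57% time saved
```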

For comparison, I ran the same test on a newly released MacBook Pro with Touch Bar (2.9GHz quad-core i7, 16GB of 2133MHz LPDDR3 memory and an AMD Radeon Pro 460 with 4GB of RAM, which uses OpenCL rather than CUDA); it took Media Encoder about nine minutes using the GPU.

Another test I love to run uses Maxon’s Cinebench, which runs real-world scenarios like photorealistic rendering and a 3D car chase, taxing your system with nearly one million polygons and textures. Basically, it makes your system do a bunch of math, which helps separate underpowered systems from true professional workstations. This system came in around 165 frames per second; compared with other systems configured similarly to the P410, it placed first or second. So it’s fast.

Lenovo Performance Tuner
While the low price is what really sets the P410 apart from the rest of the pack, Lenovo has also recently released a free tuning app called Lenovo Performance Tuner, which helps focus your Lenovo workstation’s resources on the app you are using. For instance, I use Adobe CC a lot at home, so when I am working in Premiere I want all of my power focused there, with minimal power going to background apps I may not have turned off; sometimes I let Chrome run in the background, or I jump between Premiere, Resolve and Photoshop. You simply launch Performance Tuner and click the app you want to run in Lenovo’s “optimized” state. You can go further in the Settings tab and customize things like Power Management Mode to always be on Max Performance. It’s a pretty handy tool when you want to quickly funnel all of your computing resources to one app.

The Think Vision Pro Monitor
Lastly, I wanted to quickly touch on the ThinkVision Pro2840m LED-backlit LCD monitor Lenovo let me borrow for this review. The color fidelity is awesome, and it supports resolutions up to 3840×2160 (UHD, not full 4K). It will tilt and rotate almost any way you need it to, and it will even go fully vertical at 90 degrees.

When working with the P410, I had some problems with DisplayPort not always kicking in with the monitor, or any monitor for that matter. Sometimes I had to unplug the DisplayPort cable and plug it back in while the system was on for the monitor to be recognized and turn on. Nonetheless, the monitor is awesome at 28 inches. Keep in mind it has a glossy finish, so it might not be for you if you are near a lot of light or windows; while the color and brightness punch through, there is some glare with other light sources in the room.

Summing Up
In the end, the Lenovo ThinkStation P410 workstation is a workhorse. Even though it sits at the entry level of Lenovo’s workstation line, it has a lot of power at a great price. When I priced out a similar system using PCPartPicker, it ran about $2,600; you can check out the DIY build I put together at https://pcpartpicker.com/list/r9H4Ps.

A drawback of DIY custom builds, though, is that they don’t include strong support, a complete warranty from a single company or ISV (independent software vendor) certifications. Simply put, ISV certification is how major workstation builders like HP, Dell and Lenovo verify their machines against commonly used software, such as Premiere Pro or Avid Media Composer, for workstation-focused industries like editing and motion graphics.

One of the most misunderstood benefits of a workstation is that it’s meant to run day and night. So not only do you get enterprise-level components like Nvidia Quadro graphics cards and Intel Xeon CPUs, the components are made for durability as well as performance. This way there is little downtime, especially in mission-critical environments. I didn’t get to run this system for months constantly, but I didn’t see any sign of problems in my testing.

When you buy a Lenovo workstation it comes with a three-year on-site warranty, which covers anything that goes wrong with the hardware itself, including faulty workmanship. But it won’t cover things like spills, drops or electrical surges.

I liked the Lenovo ThinkStation P410. It’s fast, does the job and has quality components. I felt that it lacked a few of today’s necessary I/O ports like USB-C/Thunderbolt 3.

The biggest pro for this workstation is the overwhelmingly low price point for a major brand workstation like the ThinkStation P410. Check out the Lenovo website for the P410 and maybe even wander into the P910 aisle, which showcases some of the most powerful workstations they make.

Check out this video I made that gives you a closer look at (and inside) the workstation.

Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Billboard Magazine video editor Zack Wolder

Name: Zack Wolder

Company: Billboard Magazine

Job Title: Video Editor

What does that entail?
I edit behind-the-scenes videos of magazine cover shoots, as well as on-site event coverage for music festivals, award shows and industry conferences. I also create branded content pieces, multi-camera studio performances and mini docs.

What are typical projects you work on?
Recently my focus has been on branded content and mini docs.

Can you name some?
I’m currently working on a video series for Sour Patch Kids. Each video features a different artist staying at The Patch House, talking about their upcoming tour/album and it usually features a short performance. The Patch House is a space provided by Sour Patch Kids for artists to stay, relax and create while on tour. One was for artist Raury (pictured). It’s about Raury writing the theme song for the Heaven Sent chute-less skydive that Fox aired at the end of July.  It was presented by Stride Gum.  This video follows him while he meets the skydiver Luke Aikins and does a jump with him.

What is the typical turnaround on these?
Turnaround time varies from project to project. The behind-the-scenes videos for the magazine are sometimes turned around in two to three days, while I have about a week for the more narrative pieces.

How do you manage and deal with the challenges of quick turnarounds?
It’s all about being organized and well-managed. The first thing I do is listen to the interview in its entirety while making a few notes. My goal is to get a good understanding of the story so I don’t have to fully watch it again. Note taking is essential. After the first pass-through I usually have a story mapped out in my head and know how it will unfold. Then I skim through the b-roll, just to get an idea of how much there is, the quality and variety.

Zack cutting on-location at a music festival.

These few steps typically take a few hours; it all depends on the length of the interview. My goal is to have a complete story edit with a music bed done by the end of day one and maybe start a b-roll pass. The first half of day two is filling in with b-roll, and the second half is some basic color correcting. Day three is for revisions.

What editing system do you use? Any plug-ins? What about storage?
I use the Adobe Creative Cloud, mainly Premiere Pro, After Effects and SpeedGrade (I have not yet updated to the newest version of Premiere, which removes the direct link to SpeedGrade function.)

I don’t use many plug-ins other than Twixtor and a few free transition plug-ins that I’ve found over the years.

For storage, we have a large EditShare server set up in the office. We have another, slightly smaller EditShare unit that we bring to the larger festivals and conferences. For smaller events we use LaCie and G-Tech G-RAID drives.

Are you asked to do more than editing on some of these? If so, what are you asked to do?
Mostly editing. For festivals I tend to help plan the post workflow. I recently planned a simple workflow for our Hot 100 Festival, which includes six editors (one offsite), a DIT and 13 cameras.

 

Updates to Adobe Creative Cloud include project sharing, more

By Brady Betzel

Adobe has announced team project sharing!! You read that right: the next Adobe Creative Cloud update, to be released later this year, will have the one thing I’ve always said kept Adobe from cutting into Avid’s hold on episodic TV and film editors.

While “one thing” is a bit of hyperbole, Team Projects will be much more than just simple sharing within Adobe Premiere Pro. Team Projects, in its initial stage, will also work with Adobe After Effects, but not with Adobe Audition… at least not in the initial release. Technically speaking, sharing projects within Creative Cloud seems like it will follow a check-in/check-out workflow, allowing you to approve another person’s updates to override yours or vice-versa.
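Adobe hasn’t detailed how check-in/check-out will behave under the hood, but the general model is familiar from other asset-management systems. As a purely illustrative sketch (not Adobe’s API; every name here is hypothetical), a minimal check-in/check-out lock looks something like this:

```python
# Illustrative only: a generic check-in/check-out lock model, NOT Adobe's
# actual Team Projects implementation. All names are hypothetical.
class SharedProject:
    def __init__(self):
        self.checked_out_by = None   # who currently holds the edit lock
        self.version = 0

    def check_out(self, user):
        if self.checked_out_by is not None:
            raise RuntimeError(f"locked by {self.checked_out_by}")
        self.checked_out_by = user   # everyone else gets read-only access

    def check_in(self, user, accept_changes=True):
        if self.checked_out_by != user:
            raise RuntimeError("you don't hold the lock")
        if accept_changes:           # approve the update, bump the version
            self.version += 1
        self.checked_out_by = None   # release the lock

proj = SharedProject()
proj.check_out("editor_a")
proj.check_in("editor_a")            # version history now records v1
```

The key design choice in any such system is what happens when two editors want the same bin at once; locking avoids conflicts entirely, at the cost of blocking the second editor.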

During a virtual press demo, I was shown how the Team Projects will work. I asked if it would work “offline,” meaning without Internet connection. Adobe’s representative said that Team Projects will work with intermittent Internet disconnections, but not fully offline. I asked this because many companies do not allow their NLEs or their storage to be attached to any Internet-facing network connections. So if this is important to you, you may need to do a little more research once we actually can get our hands on this release.

My next question was if Team Projects was a paid service. The Adobe rep said they are not talking the business side of this update yet. I took this as an immediate yes, which is fine, but officially they have no comment on pricing or payment structure, or if it will even cost extra at all.

Immediately after I asked my last question, I realized that this will definitely tie in with the Creative Cloud service, which likely means a monthly fee. Then I wondered where exactly will my projects live? In the cloud? I know the media can live locally on something like an Avid ISIS or Nexis, but will the projects be shared over the Internet? Will we be able to share individual sequences and/or bins or just entire projects? There are so many questions and so many possibilities in my mind, it really could change the multiple editor NLE paradigm if Adobe can manage it properly. No pressure Adobe.

Other Updates
Some other Premiere Pro updates include:
• Improved caption and subtitling tools.
• Updated Lumetri Color tools, including a much-needed improvement to the HSL secondaries color picker.
• Automatic recognition of VR/360 video and the type of mapping it needs, plus an improved virtual reality workflow.
• Destination publishing that now includes Behance (no Instagram export option?).
• Improved Live Text Templates, including a simplified workflow that lets you share Live Text Templates with other users (it will even sync fonts from Typekit if they aren’t present) without needing an After Effects license.
• Native DNxHD and DNxHR QuickTime export support.
• Audio effects from Adobe Audition.
• Global FX mute to toggle all video effects in a sequence on and off.
• And, best of all, a visual keyboard to map shortcuts!
Finally, another prayer for Premiere Pro has been answered. Unfortunately, After Effects users will have to wait for a visual keyboard for shortcut assignment (bummer).

After Effects has some amazing updates in addition to Project Sharing, including a new 3D render engine! Wow! I know this has been an issue for anybody trying to do 3D inside of After Effects via Cineware. Most people will purchase VideoCopilot’s Element 3D to get around this, but for those that want to work directly with Maxon’s Cinema 4D, this may be the update that alleviates some of your 3D disdain via Cineware. They even made mention that you do not need a GPU for this to work well. Oh, how I would love for this to come to fruition. Finally, there’s a new video preview architecture for faster playback that will hopefully allow for a much more fluid and dynamic playback experience.

Adobe Character Animator has some updates too. If you haven’t played with Character Animator, you need to download it now and just watch the simple tutorials that come with the app; you will be amazed, or at least your kids will be. If you haven’t seen how The Simpsons used Character Animator, check it out with a YouTube search. It is pretty sweet. In terms of incoming updates, there will be faster and easier puppet creation, improved roundtrip workflows between Photoshop and Illustrator, and the ability to use grouped keyboard triggers.

Summing Up
In the end, the future is still looking up for the Adobe Creative Cloud video products, like Premiere Pro and After Effects. If there is one thing to jump out of your skin over in the forthcoming update it is Team Projects. If Team Projects works and works well, the NLE tide may be shifting. That is a big if though because there have been some issues with previous updates — like media management within Premiere Pro — that have yet to be completely ironed out.

Like I said, if Adobe does this right it will be game-changing for them in the shared editing environment. In my opinion, Adobe is beginning to get its head above water in the video department. I would love to see these latest updates come in guns blazing and working. From the demo I saw it looks promising, but really there is only one way to find out: hands-on experience.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Top 3: My picks from Adobe’s Creative Cloud update

By Brady Betzel

Adobe’s resolve to update its Creative Cloud apps on a regular basis has remained strong. The latest updates, released on December 1, really hammer home Adobe’s commitment to make editing video, creating visual effects and color correcting on a tablet a reality, but it doesn’t end there. They have made their software stronger across the board, whether you are using a tablet, mobile workstation or desktop.

After Effects and Stacked Panels

I know everyone is going to have their own favorites, but here are my top three from the latest release:

1. Stacked Panels
In both After Effects and Premiere you will notice the ability to arrange your menus in Stacked Panels. I installed the latest updates on a Sony VAIO tablet and these Stacked Panels were awesome!

It’s really a nice way to have all of your tools on screen without having them take up too much real estate. In addition, Adobe has improved touch-screen interaction with the improved ability to pinch and zoom different parts of the interface, like increasing the size of a thumbnail with a pinch-to-zoom.

In Premiere, to find the Stacked Panels, open the drop-down menu in the Project panel, locate Panel Group Settings and choose Stacked Panel Group (and Solo Panels in Stack if you want to view only one panel at a time). I highly recommend Stacked Panels if you are using a touchscreen, like a tablet or one of the newer mobile workstations out there. Even if you aren’t, I really think it works well.

Premiere Pro and Optical Flow


2. Optical Flow Time Remapping
Most editors are probably thinking, “Avid has had this for years and years, just like Avid had Fluid Morph years before Adobe introduced Morph Cut.” While I thought the exact same thing, I really love that Adobe’s version is powered by the GPU, which lets it take full advantage of the latest HP Z840 with Nvidia Quadro or GTX 980 Ti graphics cards and all their CUDA cores. Be warned though: Optical Flow (much like Morph Cut) works only in certain situations.

If you’ve ever used Twixtor or Fluid Motion in Media Composer, you know there is sometimes a lot of work that goes into making those effects look right. It’s not always the right solution for time-remapping footage, especially if you are working on content that will air on broadcast television; even though Optical Flow may look great, some content will fail certain networks’ quality control because of the weird Jello-like artifacting that can occur.
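For context on why optical flow is harder (and more artifact-prone) than older retiming methods: the simpler approaches just repeat or cross-blend neighboring frames, while optical flow estimates per-pixel motion and warps new in-between frames, which is where the Jello-like warping can creep in. A minimal sketch of the blend-based approach, using tiny NumPy arrays as stand-in frames:

```python
import numpy as np

# Sketch of the simplest retime alternative to optical flow: frame blending.
# Optical flow instead warps pixels along estimated motion vectors, which
# looks smoother but can produce warping artifacts on complex motion.
# Frames here are just 2x2 grayscale arrays for illustration.
frames = [np.full((2, 2), v, dtype=np.float64) for v in (0.0, 100.0)]

def blend_retime(frames, t):
    """Sample the clip at fractional frame position t by linear blending."""
    i = int(t)
    frac = t - i
    if frac == 0 or i + 1 >= len(frames):
        return frames[i]           # landed on (or past) a real frame
    return (1 - frac) * frames[i] + frac * frames[i + 1]

half = blend_retime(frames, 0.5)   # 50/50 mix of frames 0 and 1
print(half[0, 0])                  # prints 50.0
```

Blending never invents motion, so it can’t Jello, but it double-exposes moving objects; that trade-off is exactly why optical flow exists.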

After Effects and the Lumetri Color Panel

3. Lumetri Color inside of After Effects
While you might already have a favorite coloring app or plug-in to use, having the ability to take clips from Premiere to After Effects, while carrying over the color correction you made inside of the Lumetri panels, is key. In addition, you can use the Lumetri effect inside of After Effects (located under the Utility category) to quickly color your clips inside of After Effects.

Overall, this round of updates seemed to be par for the course, nothing completely revolutionary but definitely useful and wanted. Personally, I don’t think that adding HDR capabilities should have taken precedence over some other updates, such as collaboration improvements (think Avid Media Composer and Avid’s Shared Storage solution, ISIS), general stability improvements, media management, etc. But Adobe is holding true to their word and bringing some of the latest and greatest improvements to their users… and causing users (and manufacturers) of other tools to take notice.

Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter, @allbetzroff.