Tag Archives: Adobe Creative Cloud

Adobe’s new Content-Aware fill in AE is magic, plus other CC updates

By Brady Betzel

NAB is just under a week away, and we are here to share some of Adobe’s latest Creative Cloud offerings. There are a few updates worth mentioning, such as a freeform Project panel in Premiere Pro, AI-driven Auto Ducking for ambience in Audition and the addition of a Twitch extension for Character Animator. But, in my opinion, the Adobe After Effects updates are what this year’s release will be remembered for.


Content-Aware Fill: Here is the before and after. Our main image is the mask.

There is a new expression editor in After Effects, so those of us who are old pseudo-website designers can now feel at home with highlighting, line numbers and more. There are also performance improvements, such as faster project loading times and new deBayering support for Metal on macOS. But the first-prize ribbon goes to Content-Aware Fill for video, powered by Adobe Sensei, the company’s AI technology. It’s one of those voodoo features that will blow you away the first time you use it. If you have ever used Mocha Pro by Boris FX, then you have seen a similar tool in its “Object Removal” feature. Essentially, you draw around the object you want to remove, such as a camera shadow or boom mic, hit the magic button, and the object is removed with a new background generated in its place. This will save users hours of manual work.

Freeform Project panel in Premiere.

Here are some details on other new features:

● Freeform Project panel in Premiere Pro— Arrange assets visually and save layouts for shot selects, production tasks, brainstorming story ideas, and assembly edits.
● Rulers and Guides—Work with familiar Adobe design tools inside Premiere Pro, making it easier to align titling, animate effects, and ensure consistency across deliverables.
● Punch and Roll in Audition—The new feature provides efficient production workflows in both Waveform and Multitrack for longform recording, including voiceover and audiobook creators.
● Surprise viewers with Twitch live-streaming triggers in the Character Animator extension — Livestream performances are enhanced as audiences engage with characters in realtime through on-the-fly costume changes, impromptu dance moves, and signature gestures and poses — a new way to interact, and even monetize, using Bits to trigger actions.
● Auto Ducking for ambient sound in Audition and Premiere Pro — Also powered by Adobe Sensei, Auto Ducking now allows for dynamic adjustments to ambient sounds against spoken dialog. Keyframed adjustments can be manually fine-tuned to retain creative control over a mix.
● Adobe Stock now offers 10 million professional-quality, curated, royalty-free HD and 4K video clips and Motion Graphics templates from leading agencies and independent editors, for use in editorial content, establishing shots or filling gaps in a project.
● Premiere Rush, introduced late last year, offers a mobile-to-desktop workflow integrated with Premiere Pro for on-the-go editing and video assembly. Built-in camera functionality in Premiere Rush helps you take pro-quality video on your mobile devices.

These new features are now available with the latest version of Creative Cloud.

Adobe acquires Allegorithmic, makers of Substance

Adobe has acquired Allegorithmic, makers of Substance, the industry standard for 3D textures and material creation in game and post production. By combining Allegorithmic’s Substance 3D design tools with Creative Cloud’s imaging, video and motion graphics tools, Adobe will empower video game creators, VFX artists working in film and television, designers and marketers to deliver the next generation of immersive experiences.

As brands look to compete and differentiate themselves, compelling, interactive experiences enabled by 3D content, VR and AR will become more critical to their future success. 3D content is already transforming traditional workflows into fully immersive and digital ones that save time, reduce cost and open new creative horizons. With the acquisition of Allegorithmic, Adobe has added expanded 3D and immersive workflows to Creative Cloud and provides Adobe’s users a new set of tools for 3D projects.

“We are seeing an increasing appetite from customers to leverage 3D technology across media, entertainment, retail and marketing to design and deliver fully immersive experiences,” says Scott Belsky, chief product officer/executive VP, Creative Cloud, Adobe. “Substance products are a natural complement to existing Creative Cloud apps that are used in the creation of immersive content, including Photoshop, Dimension, After Effects and Project Aero.”

Allegorithmic has users across the gaming, film and television, automotive, design and advertising industries, including brands like Electronic Arts, Ubisoft, BMW, Ikea, Louis Vuitton and Foster + Partners. Its tools are used on AAA gaming franchises, including Call of Duty, Assassin’s Creed and Forza, and were used in the making of films including Blade Runner 2049, Pacific Rim Uprising and Tomb Raider.

Allegorithmic tools are already offered as a subscription service to individuals and enterprise customers, and in the future Adobe will focus on expanding the availability of the Allegorithmic tools via subscription. Later this year, Adobe will announce an update on new offerings that will bring the full power of Allegorithmic technology and Adobe Creative Cloud together.

You can now export ProRes on a PC with Adobe’s video apps

By Brady Betzel

Listen up post pros! With the latest release of Adobe’s Premiere Pro, After Effects and Media Encoder, you can now natively export ProRes from a Windows 10-based PC for as little as a $20.99-a-month subscription.

I can’t overstate how big of a deal this is. Previously, the only way to export ProRes from a PC was to use a knock-off, reverse-engineered codec that mimicked the process (creating footage that would often fail QC checks at networks), to own a high-end app like Fusion, Nuke, Nucoda or Scratch, or to have a Cinedeck in your hands and output your files through it in realtime. But, starting today, you can export native ProRes 4444 and ProRes 422 from Adobe Creative Cloud apps like Premiere Pro, After Effects and Media Encoder. Have you wanted to use those two or three Nvidia GTX 1080 Ti graphics cards that you can’t stuff into a Mac Pro? Well, now you can. No more being tied to AMD for ProRes exports.

Apple seems to be leaving its creative clients in the dust. Unless you purchased an iMac Pro or MacBook Pro, you have been stuck using a 2013 Mac Pro to export or encode your files to ProRes specifications. A lot of customers who had given Apple the benefit of the doubt, sticking around a year or two longer than they probably should have while waiting for a new Mac Pro (allegedly being released in 2019), began transitioning to Windows-based platforms. All the while, most would keep that older Mac just to export ProRes files while using the more powerful, updated Windows PC for their daily tasks.

Well, that day is now over, and, in my opinion, it suggests that Apple is less concerned with keeping its professional clients than ever before. That being said, I love that Apple has finally opened its ProRes codecs up to the Adobe Creative Cloud.

Let’s hope it becomes a system-wide feature, or is at least added to Blackmagic’s Resolve and Avid’s Media Composer. You can individually rent Adobe Premiere Pro or After Effects for $20.99 a month, rent the entire Adobe Creative Cloud library for $52.99 a month or, if you are a student or teacher, take advantage of the best deal around: $19.99 a month for ALL the Creative Cloud apps.

Check out Adobe’s blog about the latest Windows ProRes export features.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Adobe Max 2018: Creative Cloud updates and more

By Mike McCarthy

I attended my first Adobe MAX last week in Los Angeles. This huge conference takes over the LA Convention Center and overflows into the surrounding venues. It began on Monday morning with a two-and-a-half-hour keynote outlining the developments and features being released in the newest updates to Adobe’s Creative Cloud. This was followed by all sorts of smaller sessions and training labs for attendees to dig deeper into the new capabilities of the various tools and applications.

The South Hall was filled with booths from various hardware and software partners, with more available than any one person could possibly take in. Tuesday started off with some early morning hands-on labs, followed by a second keynote presentation about creative and career development. I got a front row seat to hear five different people, who are successful in their creative fields — including director Ron Howard — discuss their approach to work and life. The rest of the day was so packed with various briefings, meetings and interviews that I didn’t get to actually attend any of the classroom sessions.

By Wednesday, the event was beginning to wind down, but there was still a plethora of sessions and other options for attendees to split their time between. I presented the workflow for my most recent project, Grounds of Freedom, at Nvidia’s booth in the community pavilion, and I spent the rest of the time connecting with other hardware and software partners who had a presence there.

Adobe released updates for most of its creative applications concurrent with the event. Many of the most relevant updates to the video tools were previously announced at IBC in Amsterdam last month, so I won’t repeat those, but there are still a few new video ones, as well as many that are broader in scope with regard to media as a whole.

Adobe Premiere Rush
The biggest video-centric announcement is Adobe Premiere Rush, which offers simplified video editing workflows for mobile devices and PCs. Currently releasing on iOS and Windows, with Android to follow, it is a cloud-enabled application, with the option to offload much of the processing from the user’s device. Rush projects can be moved into Premiere Pro for finishing once you are back on the desktop. It will also integrate with Team Projects for greater collaboration in larger organizations. It is free to start using, but most functionality will be limited to subscription users.

Let’s keep in mind that I am a finishing editor for feature films, so my first question (as a Razr-M user) was, “Who wants to edit video on their phone?” But what if the user shot the video on their phone? I don’t do that, but many people do, so I know this will be a valuable tool. This has me thinking about my own mentality toward video. If I were a sculptor, I would be sculpting stone, while many people sculpt with clay or Silly Putty. Because of that, I would have trouble sculpting in clay and see little value in tools that can only sculpt clay. But there is probably a benefit to being well versed in both.

I would have no trouble showing my son’s first-year video compilation to a prospective employer because it is just that good — I don’t make anything less than that. But there was no second-year video, even though I have the footage, because that level of work takes way too much time. So I need to break free from that mentality and get better at producing content that is “sufficient to tell a story” without being “technically and artistically flawless.” Learning to use Adobe Rush might be a good way for me to take a step in that direction. As a result, we may eventually see more videos in my articles as well. The current ones took me way too long to produce, but Adobe Rush should allow me to create content in a much shorter timeframe, if I am willing to compromise a bit on the precision and control offered by Premiere Pro and After Effects.

Rush allows up to four layers of video, with various effects and 32-bit Lumetri color controls, as well as AI-based audio filtering for noise reduction and de-reverb, plus lots of preset motion graphics templates for titling and such. It should allow simple videos to be edited relatively easily, with good-looking results, then shared directly to YouTube, Facebook and other platforms. While it doesn’t fit into my current workflow, I may need to create an entirely new “flow” for my personal videos. This seems like an interesting place to start, once they release an Android version and I get a new phone.

Photoshop Updates
There is a new version of Photoshop released nearly every year, and most of the time I can’t tell the difference between the new and the old. This year’s differences will probably be a lot more apparent to most users after a few minutes of use. The Undo command now works like it does in other apps, instead of being limited to toggling the last action. Transform operates very differently: proportional transform is now the default behavior, instead of requiring users to hold Shift every time they scale. The anchor point can be hidden to prevent people from moving the anchor instead of the image, and the “commit changes” step at the end has been removed. These are all positive improvements, in my opinion, though they might take a bit of getting used to for seasoned pros. There is also a new Frame Tool, which allows you to scale or crop any layer to a defined resolution. Maybe I am the only one, but I frequently find myself creating new documents in Photoshop just so I can drag a new layer, preset to the resolution I need, back into my current document. For example, I need a 200x300px box in the middle of my HD frame; how else do you do that currently? This Frame Tool should fill that hole, offering more precise control over layer and object sizes and positions, as well as easily adjustable, non-destructive masking.
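For what it’s worth, the arithmetic behind that centered-box example is trivial but easy to fumble in a hurry. A quick sketch in plain Python (nothing Photoshop-specific; the function name is mine):

```python
def centered_box(frame_w, frame_h, box_w, box_h):
    """Top-left corner (x, y) that centers a box_w x box_h box in a frame."""
    return ((frame_w - box_w) // 2, (frame_h - box_h) // 2)

# The 200x300px box centered in an HD (1920x1080) frame:
x, y = centered_box(1920, 1080, 200, 300)
print(x, y)  # 860 390
```

With a resolution-preset Frame Tool, that positioning becomes a drag instead of mental math.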

They also showed off a very impressive AI-based auto selection of the subject or background. It creates a standard selection that can be manually modified anywhere the initial attempt didn’t give you what you were looking for. Being someone who gives software demos, I don’t trust prepared demonstrations, so I wanted to try it for myself with a real-world asset. I opened up one of the source photos for my animation project and clicked the “Select Subject” button with no further input. The result needed some cleanup at the bottom, and refinement in the newly revamped Select & Mask tool, but it was a huge improvement over what I had to do on hundreds of layers earlier this year. They also demonstrated a similar feature they are working on for video footage in Tuesday night’s Sneak previews. Named “Project Fast Mask,” it automatically propagates masks of moving objects through video frames and, while not released yet, it looks promising. Combined with the content-aware background fill for video that Jason Levine demonstrated in After Effects during the opening keynote, basic VFX work is going to get a lot easier.

There are also some smaller changes to the UI, such as allowing math expressions in numerical value fields and making it easier to differentiate similarly named layers by showing both the beginning and end of a name when it gets abbreviated. They also added a function that distributes layers based on the space between them, which accounts for their varying sizes, compared to the current behavior of evenly distributing based on each layer’s reference anchor point.
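To make the difference concrete, here is a rough one-dimensional sketch of gap-based distribution. This is my own illustration of the idea, not Adobe’s implementation: the empty space between neighbors is made equal, regardless of how wide each layer is.

```python
def distribute_by_gaps(left, right, widths):
    """Place layers of the given widths between the left and right edges
    so the empty gap between neighbors is equal (1-D sketch)."""
    n = len(widths)
    gap = (right - left - sum(widths)) / (n - 1) if n > 1 else 0
    positions, x = [], left
    for w in widths:
        positions.append(x)  # left edge of this layer
        x += w + gap
    return positions

# Three layers of different widths across a 100-unit span:
print(distribute_by_gaps(0, 100, [10, 30, 10]))  # [0, 35.0, 90.0]
```

Anchor-point distribution would instead space the layer centers evenly, which looks lopsided when layer sizes vary; equal gaps is usually what designers actually want.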

In other news, Photoshop is coming to iPad, and while that doesn’t affect me personally, I can see how this could be a big deal for some people. They have offered various trimmed-down Photoshop editing applications for iOS in the past, but this new release is supposed to be based on the same underlying code as the desktop version and will eventually replicate all of its functionality, once they finish adapting the UI for touchscreens.

New Apps
Adobe also showed off Project Gemini, a sketching and painting tool for iPad that sits somewhere between Photoshop and Illustrator (hence the name, I assume). It doesn’t have much direct application to video workflows, besides being able to record time-lapses of a sketch, which should make it easier to create those “whiteboard illustration” videos that are becoming more popular.

Project Aero is a tool for creating AR experiences, and I can envision Premiere and After Effects being critical pieces in the puzzle for creating the visual assets that Aero will be placing into the augmented reality space.  This one is the hardest for me to fully conceptualize. I know Adobe is creating a lot of supporting infrastructure behind the scenes to enable the delivery of AR content in the future, but I haven’t yet been able to wrap my mind around a vision of what that future will be like.  VR I get, but AR is more complicated because of its interface with the real world and due to the variety of forms in which it can be experienced by users.  Similar to how web design is complicated by the need to support people on various browsers and cell phones, AR needs to support a variety of use cases and delivery platforms.  But Adobe is working on the tools to make that a reality, and Project Aero is the first public step in that larger process.

Community Pavilion
Adobe’s partner companies in the Community Pavilion were showing off a number of new products. Dell has a new 49-inch IPS monitor, the U4919DW, which offers basically the resolution and desktop space of two 27-inch QHD displays without the seam (5120x1440, to be exact). HP was displaying its recently released ZBook Studio x360 convertible laptop workstation (which I will be posting a review of soon), as well as its ZBook x2 tablet and the rest of its Z workstations. Nvidia was exhibiting its new Turing-based cards with 8K Red decoding acceleration, ray tracing in Adobe Dimension and other GPU-accelerated tasks. AMD was demoing 4K Red playback on a MacBook Pro with an eGPU solution, and CPU-based ray tracing on its Ryzen systems. The other booths spanned the gamut from GoPro cameras and server storage devices to paper stock products for designers. I even won a Thunderbolt 3 docking station at Intel’s booth (although in the next drawing they gave away a brand-new Dell Precision 5530 2-in-1 convertible laptop workstation). Microsoft also garnered quite a bit of attention when it gave away 30 Surface tablets near the end of the show. There was lots to see and learn everywhere I looked.

The Significance of MAX
Adobe MAX is quite a significant event, especially now that I have been in the industry long enough to see the evolution of certain trends; things are not as static as we may expect. I have attended NAB for the last 12 years, and the focus of that show has shifted significantly away from my primary professional focus (no Red, Nvidia or Apple booths, among many other changes). This was the first year I had the thought, “I should have gone to Sundance,” and a number of other people I know had the same impression. Adobe MAX is similar, although I have been a little slower to catch on to that change. It has been happening for over 10 years, but the show has grown dramatically in size and significance recently. If I still lived in LA, I probably would have started attending sooner, but it was hardly on my radar until three weeks ago. Now that I have seen it in person, I probably won’t miss it in the future.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Adobe updates Creative Cloud

By Brady Betzel

You know it’s almost fall when pumpkin spice lattes are back and Adobe announces its annual updates. At this year’s IBC, Adobe had a variety of updates to its Creative Cloud line of apps. From more info on its new editing platform Project Rush to the addition of Characterizer to Character Animator, there are a lot of updates, so I’m going to focus on a select few that I think really stand out.

Project Rush

I use Adobe Premiere quite a lot these days; it’s quick, relatively easy to use and will work with pretty much every codec in the universe. In addition, the Dynamic Link between Adobe Premiere Pro and Adobe After Effects is an indispensable feature in my world.

With the 2018 fall updates, Adobe Premiere will be closer to a color tool like Blackmagic’s Resolve thanks to the new hue saturation curves in the Lumetri Color toolset. In Resolve, these are some of the most important parts of the color corrector, and I think that will be the same for Premiere. From Hue vs. Sat, which can help isolate a specific color and desaturate it, to Hue vs. Luma, which can add or subtract brightness from specific hues and hue ranges, these new tools further Premiere’s venture into true professional color correction. The new curves will also be available inside of After Effects.

After Effects features many updates, but my favorites are the ability to access depth matte data of 3D elements and the addition of the new JavaScript engine for building expressions.

There is one update that runs across both Premiere and After Effects that seems to be a sleeper update. The improvements to motion graphics templates, if implemented correctly, could be a time and creativity saver for both artists and editors.

AI
Adobe, like many other companies, seems to be diving heavily into the “AI” pool, which is amazing, but with great power comes great responsibility. Others might not feel this way, but sometimes I don’t want all the work done for me. With new features like Auto Lip Sync and Color Match, editors and creators of all kinds should not lose the forest for the trees. I’m not telling people to ignore these features; I’m asking that they put a few minutes into discovering how the color of a shot was matched, so they can fix it if something goes wrong. You don’t want to be the editor who says, “Premiere did it,” and has no good solution when it breaks.

What Else?
I would love to see Adobe take a stab at digging up the bones of SpeedGrade and integrating that into the Premiere Pro world as a new tab. Call it Lumetri Grade, or whatever? A page with a more traditional colorist layout and clip organization would go a long way.

In the end, there are plenty of other updates to Adobe’s 2018 Creative Cloud apps, and you can read their blog to find out about other updates.

NAB: Adobe’s spring updates for Creative Cloud

By Brady Betzel

Adobe has a tradition of releasing Creative Cloud updates prior to NAB, and this year is no different. The company has been focused on improving existing workflows and adding new features, some based on Adobe’s Sensei technology, as well as on VR enhancements.

In this release, Adobe has announced a handful of Premiere Pro CC updates. While I personally don’t think they are game-changing, many users will appreciate the direction Adobe is going. If you are color correcting, Adobe has added the Shot Match function, which allows you to match color between two shots. Powered by Adobe’s Sensei technology, Shot Match analyzes one image and tries to apply the same look to another image. Included in this update is the long-requested split screen to compare before-and-after color corrections.

Motion graphic templates have been improved with new adjustments like 2D position, rotation and scale. Automatic audio ducking has been included in this release as well. You can find this feature in the Essential Sound panel, and once applied it will essentially dip the music in your scene based on dialogue waveforms that you identify.

Still inside of Adobe Premiere Pro CC, but also applicable to After Effects, is Adobe’s enhanced Immersive Environment. This update is for people who use VR headsets to edit and/or process VFX. Team Projects workflows have been updated with better version tracking and indicators of who is using bins and sequences in realtime.

New Timecode Panel
Overall, while these updates are helpful, none are barn burners. The thing that does have me excited is the new Timecode panel; it’s the biggest new addition to the Premiere Pro CC app. For years now, editors have been clamoring for more than just one timecode view. You can now view sequence timecodes, source media timecodes from the clips on the different video layers in your timeline, and even the same sequence timecode in a different frame rate (great for editing those 23.98 shows to a 29.97/59.94 clock!). One of my unexpected favorites is the clip name in the timecode window.

I was testing this feature in a pre-release version of Premiere Pro, and it was a little wonky. First, I couldn’t dock the timecode window. While I could add lines and access the different menus, my changes wouldn’t apply to the row I had selected. In addition, I could only right-click and try to change the first row of contents, but it would choose a random row to change. I am assuming the final release has this all fixed. If the wonkiness gets flushed out, this is a phenomenal (and necessary) addition to Premiere Pro.

Codecs, Master Property, Puppet Tool, more
There have been some codec compatibility updates, specifically Sony X-OCN raw (Venice), Canon Cinema Raw Light (C200) and Red IPP2.

After Effects CC has also been updated with Master Property controls. Adobe said it best during their announcement: “Add layer properties, such as position, color or text, in the Essential Graphics panel and control them in the parent composition’s timeline. Use Master Property to push individual values to all versions of the composition or pull selected changes back to the master.”

The Puppet tool has been given some love with a new Advanced Puppet Engine, improving the mesh and starch workflows used to animate static objects. Beyond updates that make the Add Grain, Remove Grain and Match Grain effects multi-threaded, enhanced disk caching and project management improvements have been added.

My favorite update for After Effects CC is the addition of data-driven graphics. You can drop in a CSV or JSON data file and pick-whip the data to layer properties to control them. In addition, you can drag and drop data right onto your comp to use the actual numerical values. Data-driven graphics is a definite game-changer for After Effects.
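To give a sense of the kind of data file involved, here is a hypothetical CSV (the column names and values below are made up for illustration) parsed with plain Python, purely to show the row-and-column structure that After Effects maps onto layer properties:

```python
import csv
import io

# A hypothetical CSV of the sort you might drop into After Effects;
# the team names, scores and colors are invented for this example.
data = """team,score,color
Falcons,98,#FF3B30
Comets,87,#34C759
"""

rows = list(csv.DictReader(io.StringIO(data)))
for row in rows:
    print(row["team"], row["score"], row["color"])
# Falcons 98 #FF3B30
# Comets 87 #34C759
```

Each column becomes a property you can pick-whip (a score driving a bar’s scale, a hex color driving a fill), so updating the spreadsheet updates the graphic.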

Audition
While Adobe Audition is an audio mixing application, it has some updates that will directly help anyone looking to mix their edit in Audition. In the past, to get audio to a mixing program like Audition, Pro Tools or Fairlight, you would have to export an AAF (or, if you are old like me, possibly an OMF). In the latest Audition update, you can simply open your Premiere Pro projects directly in Audition, relink video and audio and begin mixing.

I asked Adobe whether you could go back and forth between Audition and Premiere, but it seems to be a one-way trip. They must be expecting you to export individual audio stems once done in Audition for final output. In the future, I would love to see back-and-forth capabilities between apps like Premiere Pro and Audition, much like the Fairlight tab in Blackmagic’s Resolve. There are some other updates, like larger tracks and under-the-hood improvements, which you can read more about at https://theblog.adobe.com/creative-cloud/.

Adobe Character Animator has some cool updates, including overall character-building improvements, but I am not too involved with Character Animator, so you should definitely read about things like the trigger improvements on Adobe’s blog.

Summing Up
In the end, it is great to see Adobe moving forward on updates to its Creative Cloud video offerings. Data-driven animation inside of After Effects is a game-changer. Shot color matching in Premiere Pro is a nice step toward a professional color correction application. Importing Premiere Pro projects directly into Audition is definitely a workflow improvement.

I do have a wishlist, though: I would love for Premiere Pro to concentrate on tried-and-true solutions before adding fancy updates like audio ducking. For example, I often hear people complain about how hard it is to export a QuickTime out of Premiere with either stereo or mono/discrete tracks. You need to set up the sequence correctly from the jump, adjust the pan on the tracks, and adjust the audio settings and export settings. That doesn’t sound streamlined to me.

In addition, while shot color matching is great, let’s get an Adobe SpeedGrade-style view tab into Premiere Pro so it works like a professional color correction app… maybe Lumetri Pro? I know if the color correction setup was improved I would be way more apt to stay inside of Premiere Pro to finish something instead of going to an app like Resolve.

Finally, consolidating and transcoding used clips with handles is hit or miss inside of Premiere Pro. Can we get a rock-solid consolidate-and-transcode feature? Despite these few negatives, Premiere Pro is an industry staple, and it works very well.

Check out Adobe’s NAB 2018 update video playlist for details on each and every update.



Review: Dell’s 8K LCD monitor

By Mike McCarthy

At CES 2017, Dell introduced the UP3218K, a 32-inch LCD monitor that was the first commercially available 8K display. It runs 7680x4320 pixels at 60fps, driven by two DisplayPort 1.4 cables. That is over 33 million pixels per frame, and nearly 2 billion per second, which requires a lot of GPU power to generate. The monitor has been available since March, and I was recently offered one to review as part of a wider exploration of 8K video production workflows; there will be more articles about that larger story in the near future.
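Those headline numbers check out with simple arithmetic; here is the pixel math spelled out (plain Python, just multiplying the published specs):

```python
# Pixel throughput of the UP3218K's 8K panel at its full refresh rate.
width, height, fps = 7680, 4320, 60

pixels_per_frame = width * height
pixels_per_second = pixels_per_frame * fps

print(f"{pixels_per_frame:,}")   # 33,177,600 (over 33 million per frame)
print(f"{pixels_per_second:,}")  # 1,990,656,000 (nearly 2 billion per second)
```

Every one of those ~2 billion pixel values has to be computed by the GPU each second, which is why driving this panel is demanding.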

For this review, I will be focusing on only this product and its uses.

The UP3218K showed up in a well-designed box that was easy to unpack, and it was easy getting the monitor onto the stand. I plugged it into my Nvidia Quadro P6000 card with the included DisplayPort cables, and it came up as soon as I turned it on, at full 60Hz and without any issues or settings to change. Certain devices with only one DisplayPort 1.4 connector will only power the display at 30Hz, as a full 60Hz connection saturates the bandwidth of two DP 1.4 cables. The display does require DisplayPort 1.4, and will not revert to a lower resolution when connected to a 1.2 port. This limits the devices that can drive it to Pascal-based GPUs on the Nvidia side or top-end Vega GPUs on the AMD side. I have a laptop with a P5000 in it, so I was disappointed to discover that its DisplayPort connector is still only 1.2, making it incompatible with this 8K monitor.
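A back-of-envelope calculation shows why one cable isn’t enough. This sketch assumes 8-bit RGB (24 bits per pixel) and ignores blanking overhead; the DP 1.4 figures are the standard HBR3 link rate (4 lanes x 8.1 Gbps = 32.4 Gbps raw, roughly 25.92 Gbps of payload after 8b/10b encoding):

```python
# Rough check of why 8K60 needs two DisplayPort 1.4 cables (without DSC).
# Assumes 8-bit RGB (24 bpp); ignores blanking intervals for simplicity.
active_bits_per_sec = 7680 * 4320 * 60 * 24   # bits of pixel data per second
one_cable = 25.92e9                           # ~HBR3 payload, 8b/10b encoded

print(active_bits_per_sec / 1e9)              # 47.775744 (Gbps)
print(active_bits_per_sec > one_cable)        # True: one cable can't do 60Hz
print(active_bits_per_sec < 2 * one_cable)    # True: two cables can
```

That ~47.8 Gbps is nearly double one cable’s payload, which matches the observed behavior: a single DP 1.4 connector drops the panel to 30Hz.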

Dell’s top Precision laptops (7720 and 7520) support DP 1.4, while HP’s and Lenovo’s mobile workstations do not yet. Here is a list of every device I am aware of that explicitly claims to support 8K output:
1. Quadro P6000, P5000, P4000, P2000 workstation GPU cards
2. TitanX and Geforce10 Series graphics cards
3. RadeonPro SSG, WX9100 & WX7100 workstation GPU cards
4. RX Vega 64 and 56 graphics cards
5. Dell Precision 7520 and 7720 mobile workstations
6. Comment if you know of other laptops with DP1.4

So once you have a system that can drive the monitor, what can you do with it? Most people reading this article will probably be using this display as a dedicated full-screen monitor for their 8K footage. But smooth 8K editing and playback is still a ways away for most people. The other option is to use it as your main UI monitor to control your computer and its applications. In either case, color can be as important as resolution when it comes to professional content creation, and Dell has brought everything it has to the table in this regard as well.

The display supports Dell’s PremierColor toolset, which is loosely similar to the functionality that HP offers under its DreamColor branding. PremierColor means a couple of things, including that the display has the internal processing power to correctly emulate different color spaces; it can also be calibrated with an X-Rite i1Display Pro independent of the system driving it. It also interfaces with a few software tools that Dell has developed for its professional users.

The most significant functionality within that feature set is the factory-calibrated options for emulating AdobeRGB, sRGB, Rec. 709 and DCI-P3. Dell tests each display individually after manufacturing to ensure that it is color accurate. These are great features, but they are not unique to this monitor; many users have had them on other display models for the last few years. While color accuracy is important, the main selling point of this particular model is resolution, and lots of it. That is what I spent the majority of my time analyzing.

Resolution
The main issue here is the pixel density. Ten years ago, 24-inch displays were 1920×1200, and 30-inch displays had 2560×1600 pixels. This was around 100 pixels per inch, and most software was hard coded to look correct at that size. When UHD displays were released, the 32-inch version had a DPI of 140. That resulted in applications looking quite small and hard to read on the vast canvas of pixels, but this trend increased pressure on software companies to scale their interfaces better for high DPI displays. Windows 7 was able to scale things up an extra 50%, but a lot of applications ignored that setting or were not optimized for it. Windows 10 now allows scaling beyond 300%, which effectively triples the size of the text and icons. We have gotten to the point where even 15-inch laptops have UHD screens, resulting in 280 DPI, which is unreadable to most people without interface scaling.
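Pixel density is simply the diagonal pixel count divided by the diagonal size in inches. A quick sketch of the densities discussed here (the UP3218K is a 31.5-inch panel):

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(f'24-inch 1920x1200: {ppi(1920, 1200, 24):.0f} PPI')    # ~94
print(f'32-inch UHD:       {ppi(3840, 2160, 32):.0f} PPI')    # ~138
print(f'15.6-inch UHD:     {ppi(3840, 2160, 15.6):.0f} PPI')  # ~282
print(f'31.5-inch 8K:      {ppi(7680, 4320, 31.5):.0f} PPI')  # ~280
```

This confirms the round numbers in the text: UHD at 32 inches lands near 140 PPI, and 8K at the same size doubles that to roughly 280 PPI, the same density as a UHD laptop screen.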

Premiere Pro

With 8K resolution, this monitor has 280 DPI, twice that of a 4K display of similar size. This is on par with a 15-inch UHD laptop screen, but laptops are usually viewed from a much closer range. Since I am still using Windows 7 on my primary workstation, I was expecting 280 DPI to be unusable for effective work. And while everything is undoubtedly small, it is incredibly crisp, and once I enabled Windows scaling at 150%, it was totally usable (although I am used to small fonts and lots of screen real estate). The applications I use, especially Adobe CC, scale much smoother than they used to, so everything looks great, even with Windows 7, as long as I sit fairly close to the monitor.

I can edit 6K footage in Premiere Pro at full resolution for the first time, with space left over for my timeline and tool panels. In After Effects, I can work on 4K shots at full resolution and still have 70 layers of data visible in my composition. In Photoshop, setting the UI to 200% makes the panels behave like those on a standard 4K 32-inch display, but with your image showing four times the detail. I can edit my 5.6K DSLR files at full resolution, with nearly every palette open, and work smoothly through my various tools.

This display replaces my 34-inch curved U3415W as my new favorite monitor for Adobe apps, although I would still prefer the extra-wide 34-inch display for gaming and other general usability. But for editing or VFX work, the 8K panel is a dream come true. Every tool is available at the same time, and all of your imagery is available at HiDPI quality.

Age of Empires II

When gaming, the resolution doesn’t typically affect the field of view of 3D applications, but for older 2D games, you can see the entire map at once. Age of Empires II HD offers an expansive view of really small units, but there is a texture issue with the background of the bottom quarter of the screen. I think I used to see this at 4K as well, and it got fixed in an update, so maybe the same thing will happen with this one, once 8K becomes more common.

I had a similar UI artifact issue in RedCine-X Player when I ran the window full screen on the 8K display, which was disappointing, since that was one of the few ways to smoothly play 8K footage on the monitor at full resolution. Using it as a dedicated output monitor works as well, but I did run into some limitations. I eventually got it working with RedCine-X Pro after some initial aspect ratio issues. It would play back cached frames smoothly, but only for 15 seconds at a time before running out of decoded frames, even with a Rocket-X accelerator card.

When configured as a secondary display for dedicated full-screen output, it is accessible via Mercury Transmit in the Adobe apps. This is where it gets interesting, because the main feature that this monitor brings to the table is increased resolution. While that is easy to leverage in Photoshop, it is very difficult to drive that many pixels in real-time for video work, and decreasing the playback resolution negates the benefit of having an 8K display. At this point, effectively using the monitor becomes more an issue of workflow.

After Effects

I was going to use 8K Red footage for my test, but that wouldn’t play smoothly in Premiere, even on my 20-core workstation, so I converted it to a variety of other formats to test with. I created 8K test assets that matched the monitor resolution in DNxHR, Cineform, JPEG 2000, OpenEXR and HEVC. DNxHR was the only format that offered full-resolution playback at 8K, and even that dropped frames on a regular basis. Being able to view 8K video is pretty impressive, and it probably forever shifts my subjective sense of “sharp,” but we are still waiting for hardware processing power to catch up before 8K video editing becomes an effective reality for most users.
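The struggle makes sense once you look at the raw pixel volume. A rough sketch, assuming 8-bit RGB and ignoring audio and container overhead:

```python
# Uncompressed data rates, assuming 8-bit RGB (3 bytes per pixel).
# The codecs above compress far below this, but decode work still
# scales with pixel count, which is why 8K playback is so demanding.

def uncompressed_gb_per_sec(width, height, fps, bytes_per_pixel=3):
    return width * height * fps * bytes_per_pixel / 1e9

for label, (w, h) in {"HD ": (1920, 1080),
                      "UHD": (3840, 2160),
                      "8K ": (7680, 4320)}.items():
    print(f"{label}: {uncompressed_gb_per_sec(w, h, 24):.2f} GB/s at 24fps")
```

At roughly 2.4 GB/s of decoded pixels, 8K pushes sixteen times the data of HD, so even efficient codecs leave the system little headroom.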

Summing Up
The UP3218K is the ultimate monitor for content creators and artists looking for a large digital canvas, regardless of whether that is measured in inches or pixels. All those pixels come at a price — it is currently available from Dell for $3,900. Is it worth it? That will depend on what your needs and your budget are. Is a Mercedes Benz worth the increased price over a Honda? Some people obviously think so.

There is no question that this display, and the hardware to drive it effectively, would be a luxury for the average user. But for people who deal with high-resolution content on a regular basis, the increased functionality it offers can’t be measured the same way, and reading an article and seeing pictures online can’t compare to actually using the physical item. (The screenshots here are all scaled to 25% to be a reasonable size for the web; I am just trying to communicate the scope of the desktop real estate available on an 8K screen.) So yes, it is expensive, but at the moment it is the highest resolution monitor that money can buy, and the closest alternative (5K screens) doesn’t come close.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

 

Adobe intros updates to Creative Cloud, including Team Projects

Later this year, Adobe will be offering new capabilities within its Adobe Creative Cloud video tools and services. This includes updates for VR/360, animation, motion graphics, editing, collaboration and Adobe Stock. Many of these features are powered by Adobe Sensei, the company’s artificial intelligence and machine learning framework. Adobe will preview these advancements at IBC.

The new capabilities coming later this year to Adobe Creative Cloud for video include:
• Access to motion graphics templates in Adobe Stock and through Creative Cloud Libraries, as well as usability improvements to the Essential Graphics panel in Premiere Pro, including responsive design options for preserving spatial and temporal relationships.
• Character Animator 1.0 with changes to core and custom animation functions, such as pose-to-pose blending, new physics behaviors and visual puppet controls. Adobe Sensei will help improve lip-sync capability by accurately matching mouth shape with spoken sounds.
• Virtual reality video creation with a dedicated viewing environment in Premiere Pro. Editors can experience the immersive qualities of their content, review their timeline and use keyboard-driven editing for trimming and markers, all while wearing the same VR head-mounted displays as their audience. In addition, audio can be determined by orientation or position and exported as ambisonic audio for VR-enabled platforms such as YouTube and Facebook. VR effects and transitions are now native and accelerated via the Mercury Playback Engine.
• Improved collaborative workflows with Team Projects on the Local Area Network with managed access features that allow users to lock bins and provide read-only access to others. Formerly in beta, the release of Team Projects will offer smoother workflows hosted in Creative Cloud and the ability to more easily manage versions with auto-save history.
• Flexible session organization, multi-take workflows and continuous playback while editing in Adobe Audition. Powered by Adobe Sensei, auto-ducking has been added to the Essential Sound panel, automatically adjusting levels by content type: dialogue, background sound or music.

Integration with Adobe Stock
Adobe Stock now offers over 90 million assets, including photos, illustrations and vectors. Customers have access to over 4 million HD and 4K Adobe Stock video clips directly within their Creative Cloud video workflows and can now search and scrub assets in Premiere Pro.

Coming to this new release are hundreds of professionally created motion graphics templates for Adobe Stock, available later this year. Additionally, motion graphics artists will be able to sell their own motion graphics templates for Premiere Pro through Adobe Stock. Earlier this year, Adobe added editorial and premium collections from Reuters, USA Today Sports, Stocksy and 500px.

Chaos Group and Adobe partner for photorealistic rendering in CC

Chaos Group’s V-Ray rendering technology is featured in Adobe’s Creative Cloud, allowing graphic designers to easily create photorealistic 3D rendered composites with Project Felix.

Available now, Project Felix is a public beta desktop app that helps users composite 3D assets like models, materials and lights with background images, resulting in an editable render they can continue to design in Photoshop CC. For example, users can turn a basic 3D model of a generic bottle into a realistic product shot that is fully lit and placed in a scene to create an ad, concept mock-up or even abstract art.

V-Ray acts as a virtual camera, letting users test angles, perspectives and placement of their model in the scene before generating a final high-res render. Using the preview window, Felix users get immediate visual feedback on how each edit affects the final rendered image.

By integrating V-Ray, Adobe has brought the same raytracing technology used by companies like Industrial Light & Magic to a much wider audience.

“We’re thrilled that Adobe has chosen V-Ray to be the core rendering engine for Project Felix, and to be a part of a new era for 3D in graphic design,” says Peter Mitev, CEO of Chaos Group. “Together we’re bringing the benefits of photoreal rendering, and a new design workflow, to millions of creatives worldwide.”

“Working with the amazing team at Chaos Group meant we could bring the power of the industry’s top rendering engine to our users,” adds Stefano Corazza, senior director of engineering at Adobe. “Our collaboration lets graphic designers design in a more natural flow. Each edit comes to life right before their eyes.”

Review: Lenovo ThinkStation P410

By Brady Betzel

With the lukewarm reaction of the professional community to the new Apple MacBook Pro, there are many creative professionals who are seriously — for the first time in their careers — considering whether or not to jump into a Windows-based world.

I grew up using an Apple IIGS from 1986 (I was born in 1983, if you’re wondering), but I always worked on both Windows and Apple computers. I guess my father really instilled the idea of being independent and not relying on one thing or one way of doing something — he wanted me to rely on my own knowledge and not on others.

Not to get too philosophical, but when he purchased all the parts I needed to build my own Windows system, it was incredibly gratifying. I would have loved to have built my own Apple system, but obviously never could. That is why I am so open to computer systems of any operating system software.

If you are deciding whether or not to upgrade your workstation and have never considered solutions other than HP, Dell or Apple, you will want to read what I have to say about Lenovo‘s latest workstation, the P410.

When I set out on this review, I didn’t have any DisplayPort-compatible monitors, so Lenovo was nice enough to send its beautiful ThinkVision Pro2840m — another great piece of hardware.

Digging In
I want to jump right into the specs of the ThinkStation P410. Under the hood is an Intel Xeon E5-1650 v4 — in plain terms, a 6-core, 3.6GHz CPU with 15MB of cache that can reach up to 4.0GHz when needed using Intel’s Turbo Boost technology. The graphics card is a medium-sized monster: the Nvidia Quadro M4000 with 8GB of GDDR5 memory and 1,664 CUDA cores. It has four DisplayPort 1.2 ports to power the four 30-bit 4096×2160 @60Hz displays you will run when editing and color correcting.

If you need more CUDA power you could step up to the Nvidia M5000, which runs 2048 CUDA cores or the M6000, which runs 3072 CUDA cores, but that power isn’t cheap (and as of this review they are not even an option from Lenovo in the P410 customization — you will probably have to step up to a higher model number).

There is 16GB of DDR4-2400 ECC memory, a 1TB 2.5-inch SATA 6Gb/s SSD (made by Micron), plus a few extras like a DVD writer, media card reader, keyboard and mouse. At the time I was writing this review, you could configure this system for a grand total of $2,794, but if you purchase it online at shop.lenovo.com it costs a little under $2,515 with online discounts. As I priced this system out over a few weeks, I noticed the prices changed, so keep in mind it could be higher. I configured a similarly equipped HP Z440 workstation for around $3,600 and a Dell Precision Tower 5000 for around $3,780, so Lenovo’s prices are on the low end for major-brand workstations.

For expansion (which Windows-based PCs seem to lead the pack in), you have a total of four DIMM slots for memory (two are taken up already by two 8GB sticks), four PCIe slots and four hard drive bays. Two of the hard drive bays are considered Flex Bays, which can be used for hard drives, hard drive + slim optical drive or something like front USB 3.0 ports.

On the back there are your favorite PS/2 keyboard port and mouse port, two USB 2.0 ports, four USB 3.0 ports, audio in/out/mic and four DisplayPorts.

Testing
I first wanted to test the P410’s encoding speed using Adobe Media Encoder. I took an eight-minute, 30-second 1920×1080 23.98fps ProRes HQ QuickTime that I had filmed on a Blackmagic Pocket Cinema Camera, did a quick color balance in Adobe Premiere Pro CC 2017 using the Lumetri Color tools and exported a single-pass, variable bit rate 25Mb/s H.264 from Media Encoder. When exporting from tools like Premiere Pro or Media Encoder, GPU acceleration kicks in only if you’ve applied GPU-accelerated effects, such as transitions, scaling or Lumetri color correction. Otherwise, if you are just transcoding from one codec to another, the CPU handles the task.

In this test, it took Media Encoder about six minutes to encode the H.264 with Mercury Playback Engine GPU Acceleration (CUDA) enabled. Without GPU acceleration it took 14 minutes. So using the GPU cut the export time by more than half (roughly a 2.3x speedup), thanks to the power of the Nvidia Quadro M4000 with 8GB of GDDR5 RAM.
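A quick sketch of the arithmetic behind those two timings, since percentage claims for render tests are easy to get backwards:

```python
def compare_export_times(baseline_minutes, accelerated_minutes):
    """Express a render-time improvement as a speedup factor and time saved."""
    speedup = baseline_minutes / accelerated_minutes
    time_saved_pct = (baseline_minutes - accelerated_minutes) / baseline_minutes * 100
    return speedup, time_saved_pct

speedup, saved = compare_export_times(baseline_minutes=14, accelerated_minutes=6)
print(f"{speedup:.1f}x faster, {saved:.0f}% less time")  # 2.3x faster, 57% less time
```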

For comparison, I ran the same test on a newly released MacBook Pro with Touch Bar (2.9GHz quad-core i7, 16GB of 2133MHz LPDDR3 memory and an AMD Radeon Pro 460 with 4GB of RAM, which uses OpenCL as opposed to CUDA); it took Media Encoder about nine minutes using the GPU.

Another test I love to run uses Maxon’s Cinebench, which runs real-world scenarios like photorealistic rendering and a 3D car chase. This taxes your system with almost one million polygons and textures. Basically, it makes your system do a bunch of math, which helps separate consumer systems from professional workstations. This system came in around 165 frames per second. Compared with other systems of similar configuration, it placed first or second. So it’s fast.

Lenovo Performance Tuner
While the low price is what really sets the P410 apart from the rest of the pack, Lenovo has also recently released a hardware tuning program called Lenovo Performance Tuner. Performance Tuner is a free app that helps focus your Lenovo workstation on the app you are using. For instance, I use Adobe CC a lot at home, so when I am working in Premiere I want all of my power focused there, with minimal power going to background apps I may not have turned off — sometimes I let Chrome run in the background, or I want to jump between Premiere, Resolve and Photoshop. You simply launch Performance Tuner and click the app you want to run in Lenovo’s “optimized” state. You can go further by jumping into the Settings tab and customizing things like Power Management Mode to always run at Max Performance. It’s a handy tool when you want to quickly funnel all of your computing resources into one app.

The Think Vision Pro Monitor
Lastly, I wanted to quickly touch on the ThinkVision Pro2840m LED-backlit LCD monitor Lenovo let me borrow for this review. The color fidelity is awesome, and it runs at resolutions up to 3840×2160 (UHD, not full 4K). It tilts and rotates almost any way you need it to, and it will even pivot to full vertical at 90 degrees.

When working with the P410, I had some problems with DisplayPort not always kicking in with the monitor — or any monitor, for that matter. Sometimes I had to unplug the DisplayPort cable and plug it back in while the system was on for the monitor to be recognized and turn on. Nonetheless, the monitor is awesome at 28 inches. Keep in mind it has a glossy finish, so it might not be for you if you are near a lot of light or windows — while the color and brightness punch through, there is some glare from other light sources in the room.

Summing Up
In the end, the Lenovo ThinkStation P410 workstation is a workhorse. Even though it’s at the entry level of Lenovo’s workstations, it has a lot of power and a great price. When I priced out a similar system using PC Partpicker, it ran about $2,600 — you can check out the DIY build I put together on PCPartpicker.com: https://pcpartpicker.com/list/r9H4Ps.

A drawback of DIY custom builds, though, is that they don’t include strong support, a complete warranty from a single company or ISV (independent software vendor) certifications. Simply put, ISV certification is how major workstation builders like HP, Dell and Lenovo verify their workstations against commonly used software, like Premiere Pro or Avid Media Composer, for workstation-focused industries such as editing and motion graphics.

One of the most misunderstood benefits of a workstation is that it’s meant to run day and night. So not only do you get enterprise-level components like Nvidia Quadro graphics cards and Intel Xeon CPUs, the components are made for durability as well as performance. This way there is little downtime, especially in mission-critical environments. I didn’t get to run this system for months constantly, but I didn’t see any sign of problems in my testing.

When you buy a Lenovo workstation it comes with a three-year on-site warranty, which covers anything that goes wrong with the hardware itself, including faulty workmanship. But it won’t cover things like spills, drops or electrical surges.

I liked the Lenovo ThinkStation P410. It’s fast, does the job and has quality components, though I felt it lacked a few of today’s necessary I/O ports, like USB-C/Thunderbolt 3.

The biggest pro for this workstation is the overwhelmingly low price point for a major brand workstation like the ThinkStation P410. Check out the Lenovo website for the P410 and maybe even wander into the P910 aisle, which showcases some of the most powerful workstations they make.

Check out this video I made that gives you a closer look at (and inside) the workstation.

Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.