Author Archives: Randi Altman

VFX house Kevin adds three industry vets

Venice, California-based visual effects house Kevin, founded by Tim Davies, Sue Troyan and Darcy Parsons, has beefed up its team even further with the hiring of head of CG Mike Dalzell, VFX supervisor Theo Maniatis and head of technology Carl Loeffler. This three-month-old studio has already worked on spots for Jaguar, Land Rover, Target and Old Spice, and is currently working on a series of commercials for the Super Bowl.

Dalzell brings years of experience as a CG supervisor and lead artist — he started as a 3D generalist before focusing on look development and lighting — at top creative studios including Digital Domain, MPC, Psyop, The Mill, Sony Imageworks and Method. He was instrumental in the look development for the VFX Gold Clio- and British Arrow-winning Call of Duty Seize Glory and GE’s Childlike Imagination. He has also worked on commercials for Nissan, BMW, Lexus, Visa, Cars.com, the Air Force and others. Early on, Dalzell honed his skills on music videos in Toronto, and then on feature films such as Iron Man 3 and The Matrix movies, as well as The Curious Case of Benjamin Button.

Maniatis, a Flame artist and on-set VFX supervisor, has a wide breadth of experience in the US, London and his native Sydney. “Tim [Davies] and I used to work together back in Australia, so reconnecting with him and moving to LA has been a blast,” he says.

Maniatis’s work includes spots for Apple Watch 3 + Apple Music’s Roll (directed by Sam Brown), TAG Heuer’s To Jack (directed by and featuring Patrick Dempsey), Destiny 2’s Rally the Troops and Titanfall 2’s Become One (via Blur Studios), and PlayStation VR’s Batman Arkham and Axe’s Office Love, both directed by Filip Engstrom. Prior to joining Kevin, Maniatis worked with Blur Studios, Psyop, The Mill, Art Jail and Framestore.

Loeffler is creating the studio’s production model using the latest Autodesk Flame systems, high-end 3D workstations and render nodes, and putting new networking and storage systems into place. Kevin’s new Culver City studio will open its doors in Q1 2018, and Loeffler will guide the current growth in both hardware and software, plan for the future and make sure Kevin’s studio is optimized for the needs of production. He has over two decades of experience building out and expanding the technologies for facilities including MPC and Technicolor.

Image: (L-R) Mike Dalzell, Carl Loeffler and Theo Maniatis.

Quick Chat: Ntropic CD, NIM co-founder Andrew Sinagra

Some of the most efficient tools being used by pros today were created by their peers, those working in real-world post environments who develop workflows in-house. Many are robust enough to share with the world. One such tool is NIM, a browser-based studio management app for post houses that tracks a production pipeline from start to finish.

Andrew Sinagra, co-founder of NIM Labs and creative director of Ntropic, a creative studio that provides VFX, design, color and live action, was kind enough to answer some trends questions relating to tight turnarounds in post and visual effects.

What do you feel are the biggest challenges facing post and VFX studios in the coming year?
It’s an interesting time for VFX in general. The post-Netflix era has ushered in a whole new range of opportunities, but the demands have shifted. We’re seeing quality expectations for television soar, but schedules and budgets have remained the same — or have tightened.

The challenge facing post production studios is how to continue creating quality, competitive work while also contending with faster turnarounds and ever-fluctuating budgets. It seems like an impossible problem, but thankfully tools, technology and talent continue to improve and deliver better results in a fraction of the time. By investing in those three Ts, forward-thinking studios can balance expectation with necessary cost.

What have you found to be the typical pain points for studios with regards to project management in the past? What are the main complaints you hear time and time again?
Throughout my career I have met with many industry pros, from on-the-box artists and creative directors through to heads of production and studio owners. They have all shared their trials and tribulations – as well as their methods for staying ahead of the curve. The common question is always the same: “How can I get a clearer view of my studio operations on a daily basis, from resource utilization through running actuals?” Managing budgets in particular has been a major pain point. Most studios just want a better way to visualize, and gain back some control over, what’s being spent and where. It’s all about the need for efficiency and clarity of vision on a project.

Is business intelligence very important to post studios at this point? Do you see it as an emerging trend over 2018?
Yes, absolutely. Studios need to know what’s going on, on any project, at a moment’s notice. They need to know if it will be affected by endless change orders, or if they’re consistently underbidding on a specific discipline, or if they’re marking something up in a way that is actually affecting their overall margins. These are the kinds of statistics that can impact the bottom line, but the problem is they are incredibly difficult to pull out of an ocean of numbers on a spreadsheet.

Studios that invest in business intelligence, and can see such issues immediately quantified, will be capable of performing at a much higher efficiency level than those that do not. The status quo of comparing spreadsheets and juggling emails works to an extent, but it’s very difficult to pull analysis out of that. Studios instead need solutions that help them better visualize their operations from the inside out, enabling stakeholders to make decisions with their brains rather than their guts. I can’t imagine any studio heading into 2018 will want to brave the turbulent seas without having that kind of business intelligence on their side.

What are the limitations with today’s approaches to bidding and the time and materials model? What changes do you see around financial modeling in VFX in the coming years?
The time and materials model seems largely dead, and has been for quite some time. I have seen a few studios still working with the time and materials model with specific clients, but as a whole I find studios working to flat bids with explicitly clear statements of work. The burden is then on the studio to stay within its limits and find creative solutions to the project challenges. This puts extra stress on producers to fully understand the financial ramifications of decisions made on a day-to-day basis. Will slipping in a client request push the budget when we don’t have the margin to spare? How can I reallocate my crew to be more efficient? Can we reorganize the project so that waiting for client feedback doesn’t stop us dead in the water? These are just a few of the questions that, when answered, can squeeze out that extra 10% to get the job done.

Additionally, having the right information arms the studio with the right ammunition to approach the client for overages when the time comes. Having at your fingertips exactly how much time has been spent on a project, and what any requested changes would require, gives studios the opportunity to educate their clients. And educating clients is a big part of being profitable.

What will studios need to do in 2018 to ensure continued success? What advice would you give them at this stage?
Other than business intelligence, staying ahead of the curve in today’s environment will also mean staying flexible, scalable and nimble. Nimbleness is perhaps the most important of the three — studios need to have this attribute to work in the ever-changing world of post production. It is rare that projects reach the finish line with the deliveries matching exactly what was outlined in the initial bid. Studios must be able to respond to the inevitable requested changes even in the middle of production. That means being able to make informed decisions that meet the client’s expectations, while also remaining within the scope of the budget. That can mean the difference between a failed project and a triumphant delivery.

Basically, my advice is this: Going into 2018, ask yourself, “Are you using your resources to your maximum potential, or are you leaving man hours on the table?” Take a close look at everything you’re doing and ensure you’re not pouring budget into areas where it’s simply not needed. With so many moving pieces in production, it’s imperative to understand at a glance where your efforts are being placed and how you can better use your artists.

Review: Dell’s 8K LCD monitor

By Mike McCarthy

At CES 2017, Dell introduced its UP3218K LCD 32-inch monitor, which was the first commercially available 8K display. It runs 7680×4320 pixels at 60fps, driven by two DisplayPort 1.4 cables. That is over 33 million pixels per frame, and nearly 2 billion per second, which requires a lot of GPU power to generate. The monitor has been available since March, and I was recently offered one to review as part of a wider exploration of 8K video production workflows; there will be more articles about that larger story in the near future.
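Those pixel figures are easy to verify with a little arithmetic:

```python
# Pixel-rate arithmetic behind the 8K figures quoted above
width, height, fps = 7680, 4320, 60
pixels_per_frame = width * height            # 33,177,600 pixels in one 8K frame
pixels_per_second = pixels_per_frame * fps   # ~2 billion pixels every second
```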

For this review, I will be focusing on only this product and its uses.

The UP3218K showed up in a well-designed box that was easy to unpack, and it was also easy to get the monitor onto the stand. I plugged it into my Nvidia Quadro P6000 card with the included DisplayPort cables, and it came up as soon as I turned it on… at full 60Hz and without any issues or settings to change. Devices with only one DisplayPort 1.4 connector will only drive the display at 30Hz, since a full 60Hz connection saturates the bandwidth of two DP 1.4 cables. The display does require a DisplayPort 1.4 connection, and it will not revert to a lower resolution when connected to a 1.2 port. This limits the devices that can drive it to Pascal-based GPUs on the Nvidia side, or top-end Vega GPUs on the AMD side. I have a laptop with a P5000 in it, so I was disappointed to discover that its DisplayPort connector was still only 1.2, making it incompatible with this 8K monitor.
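A back-of-the-envelope check explains the two-cable requirement. Assuming 8-bit RGB (24 bits per pixel) and ignoring blanking and protocol overhead (my simplifications, not Dell's spec), 8K at 60Hz overflows the roughly 25.92 Gbit/s usable payload of a single DP 1.4 (HBR3) link, while 30Hz just squeezes onto one cable:

```python
# Why 8K60 needs two DisplayPort 1.4 cables (rough sketch).
# Assumption: 24 bits/pixel RGB, no blanking overhead, versus
# DP 1.4 HBR3's ~25.92 Gbit/s usable payload per cable.
DP14_PAYLOAD_GBPS = 25.92

def video_gbps(w, h, fps, bits_per_pixel=24):
    """Raw video bandwidth in Gbit/s for the given mode."""
    return w * h * fps * bits_per_pixel / 1e9

eight_k_60 = video_gbps(7680, 4320, 60)  # ~47.8 Gbit/s -> two cables
eight_k_30 = video_gbps(7680, 4320, 30)  # ~23.9 Gbit/s -> fits on one
```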

Dell’s top Precision laptops (7720 and 7520) support DP1.4, while HP and Lenovo’s mobile workstations do not yet. This is a list of every device I am aware of that explicitly claims to support 8K output:
1. Quadro P6000, P5000, P4000, P2000 workstation GPU cards
2. Titan X and GeForce 10 Series graphics cards
3. Radeon Pro SSG, WX9100 and WX7100 workstation GPU cards
4. RX Vega 64 and 56 graphics cards
5. Dell Precision 7520 and 7720 mobile workstations
6. Comment if you know of other laptops with DP1.4

So once you have a system that can drive the monitor, what can you do with it? Most people reading this article will probably be using this display as a dedicated full-screen monitor for their 8K footage. But smooth 8K editing and playback is still a ways away for most people. The other option is to use it as your main UI monitor to control your computer and its applications. In either case, color can be as important as resolution when it comes to professional content creation, and Dell has brought everything it has to the table in this regard as well.

The display supports Dell’s PremierColor toolset, which is loosely similar to the functionality that HP offers under its DreamColor branding. PremierColor means a couple of things, including that the display has the internal processing power to correctly emulate different color spaces; it can also be calibrated with an X-Rite i1Display Pro independent of the system driving it. It also interfaces with a few software tools that Dell has developed for its professional users.

The most significant functionality within that feature set is the factory-calibrated options for emulating AdobeRGB, sRGB, Rec.709 and DCI-P3. Dell tests each display individually after manufacturing to ensure that it is color accurate. These are great features, but they are not unique to this monitor, and many users have been using them on other display models for the last few years. While color accuracy is important, the main selling point of this particular model is resolution, and lots of it. That is what I spent the majority of my time analyzing.

Resolution
The main issue here is the pixel density. Ten years ago, 24-inch displays were 1920×1200, and 30-inch displays had 2560×1600 pixels. This was around 100 pixels per inch, and most software was hard coded to look correct at that size. When UHD displays were released, the 32-inch version had a DPI of 140. That resulted in applications looking quite small and hard to read on the vast canvas of pixels, but this trend increased pressure on software companies to scale their interfaces better for high DPI displays. Windows 7 was able to scale things up an extra 50%, but a lot of applications ignored that setting or were not optimized for it. Windows 10 now allows scaling beyond 300%, which effectively triples the size of the text and icons. We have gotten to the point where even 15-inch laptops have UHD screens, resulting in 280 DPI, which is unreadable to most people without interface scaling.

Premiere Pro

With 8K resolution, this monitor has 280 DPI, twice that of a 4K display of similar size. This is on par with a 15-inch UHD laptop screen, but laptops are usually viewed from a much closer range. Since I am still using Windows 7 on my primary workstation, I was expecting 280 DPI to be unusable for effective work. And while everything is undoubtedly small, it is incredibly crisp, and once I enabled Windows scaling at 150%, it was totally usable (although I am used to small fonts and lots of screen real estate). The applications I use, especially Adobe CC, scale much smoother than they used to, so everything looks great, even with Windows 7, as long as I sit fairly close to the monitor.

I can edit 6K footage in Premiere Pro at full resolution for the first time, with space left over for my timeline and tool panels. In After Effects, I can work on 4K shots at full resolution and still have 70 layers of data visible in my composition. In Photoshop, setting the UI to 200% makes the panels behave similarly to a standard 4K 32-inch display, but with your image having four times the detail. I can edit my 5.6K DSLR files at full resolution, with nearly every palette open, and work smoothly through my various tools.

This display replaces my 34-inch curved U3415W as my new favorite monitor for Adobe apps, although I would still prefer the extra-wide 34-inch display for gaming and other general usability. But for editing or VFX work, the 8K panel is a dream come true. Every tool is available at the same time, and all of your imagery is available at HiDPI quality.

Age of Empires II

When gaming, the resolution doesn’t typically affect the field of view of 3D applications, but for older 2D games, you can see the entire map at once. Age of Empires II HD offers an expansive view of really small units, but there is a texture issue with the background of the bottom quarter of the screen. I think I used to see this at 4K as well, and it got fixed in an update, so maybe the same thing will happen with this one, once 8K becomes more common.

I had a similar UI artifact issue in the RedCine player when I full-screened the window on the 8K display, which was disappointing since that was one of the few ways to smoothly play 8K footage on the monitor at full resolution. Using it as a dedicated output monitor works as well, but I did run into some limitations. I did eventually get it to work with RedCine-X Pro, after initially experiencing some aspect ratio issues. It would play back cached frames smoothly, but only for 15 seconds at a time before running out of decoded frames, even with a Rocket-X accelerator card.

When configured as a secondary display for dedicated full-screen output, it is accessible via Mercury Transmit in the Adobe apps. This is where it gets interesting, because the main feature that this monitor brings to the table is increased resolution. While that is easy to leverage in Photoshop, it is very difficult to drive that many pixels in real-time for video work, and decreasing the playback resolution negates the benefit of having an 8K display. At this point, effectively using the monitor becomes more an issue of workflow.

After Effects

I was going to use 8K Red footage for my test, but that wouldn’t play smoothly in Premiere, even on my 20-core workstation, so I converted it to a variety of other files to test with. I created 8K test assets that matched the monitor resolution in DNxHR, Cineform, JPEG2000, OpenEXR and HEVC. DNxHR was the only format that offered full-resolution playback at 8K, and even that dropped frames on a regular basis. Being able to view 8K video is pretty impressive, and it has probably forever shifted my subjective sense of “sharp,” but we are still waiting for hardware processing power to catch up before 8K video editing is an effective reality for users.
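It is easy to see why full-resolution 8K playback strains even a high-end workstation. A rough sketch, assuming uncompressed 8-bit RGB (3 bytes per pixel) — real codecs like DNxHR compress well below this, but decode cost still scales with frame size:

```python
# Uncompressed data rate of 8K60 video, assuming 8-bit RGB.
# Codec bitrates are far lower, but this is the pixel volume
# the decoder ultimately has to reconstruct each second.
width, height, fps = 7680, 4320, 60
bytes_per_frame = width * height * 3                 # ~95 MB per frame
gigabytes_per_second = bytes_per_frame * fps / 1e9   # ~6 GB/s sustained
```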

Summing Up
The UP3218K is the ultimate monitor for content creators and artists looking for a large digital canvas, regardless of whether that is measured in inches or pixels. All those pixels come at a price — it is currently available from Dell for $3,900. Is it worth it? That will depend on what your needs and your budget are. Is a Mercedes Benz worth the increased price over a Honda? Some people obviously think so.

There is no question that this display and the hardware to drive it effectively would be a luxury to the average user. But for people who deal with high-resolution content on a regular basis, the increased functionality it offers can’t be measured in the same way, and reading an article and seeing pictures online can’t compare to actually using the physical item. The screenshots here are all scaled to 25% to be a reasonable size for the web; they are just meant to communicate the scope of the desktop real estate available on an 8K screen. So yes, it is expensive, but at the moment it is the highest resolution monitor that money can buy, and the nearest alternative (5K screens) is not even close.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


Quantum’s Xcellis scale-out NAS targets IP workflows for M&E

Quantum is now offering a new Xcellis Scale-out NAS targeting data-heavy IP-based media workflows. Built off Quantum’s StorNext shared storage and data management platform, the multi-protocol, multi-client Xcellis Scale-out NAS system combines media and metadata management with high performance and scalability. Users can configure an Xcellis solution with both scale-out SAN and NAS to provide maximum flexibility.

“Media professionals have been looking for a solution that combines the performance and simplified scalability of a SAN with the cost efficiency and ease of use of NAS,” says Quantum’s Keith Lissak. “Quantum’s new Xcellis Scale-out NAS platform bridges that gap. By affordably delivering high performance, petabyte-level scalability and advanced capabilities such as integrated AI, Xcellis Scale-out NAS is [a great] solution for migrating to all-IP environments.”

Specific benefits of Xcellis Scale-out NAS include:
• Increased Productivity in All-IP Environments: It features a converged architecture that saves space and power, continuous scalability for simplified scaling of performance and capacity, and unified access to content.
• Cost-Effective Scaling of Performance and Capacity: One appliance provides 12 GB/sec per client. An Xcellis cluster can scale performance and capacity together or independently to reach hundreds of petabytes in capacity and more than a terabyte per second in performance. When deployed as part of a multitier StorNext infrastructure ― which can include object, tape and cloud storage ― Xcellis Scale-out NAS can cost as little as 1/10 that of an enterprise-only NAS solution with the same capacity.
• Lifecycle, Location and Cost Management: It’s built off of Quantum’s StorNext software, which provides automatic tiering between flash, disk, tape, object storage and public cloud. Copies can be created for content distribution, collaboration, data protection and disaster recovery.
• Integrated Artificial Intelligence: Xcellis can integrate artificial intelligence (AI) capabilities to enable users to extract more value from their assets through the automated creation of metadata. The system can actively interrogate data across multiple axes to uncover events, objects, faces, words and sentiments, automatically generating new, custom metadata that unlocks additional possibilities for the use of stored assets.

Xcellis Scale-out NAS will be generally available this month with entry configurations and those leveraging tiering starting at under $100 per terabyte (raw).

Shotgun 7.6 adds analytics feature set for VFX and animation

Shotgun Software has released Shotgun 7.6, the latest version of its cloud-based review and production tracking software, featuring a new set of analytics and reporting tools that give studios the ability to visualize key production information, keep a close eye on the progress of their projects and make business-critical decisions quickly.

The new normal is shorter turnaround, tighter budgets and growing creative demands, so studios need to be efficient, identify business issues quickly and adjust where and how resources are being used during production. Production Insights in Shotgun provides studios with an overview of the health of projects as well as the ability to dive into the details to see where time and resources are used, so operations can be streamlined and better decisions can be made.

“Our new Production Insights features help Shotgun customers answer urgent and costly production questions such as: Are we going to hit our deadline? How much work is there left to do? Where are we struggling?” explains James Pycock, head of product management for Shotgun. “Having access to these tools out of the box gives everyone instant at-a-glance visualizations of how and where they are spending time and resources.”

Shotgun Production Insights include:

– Analytics: The ability to apply production data in Shotgun to optimize how resources are used, plan ahead for tight deadlines and budgets, and accurately compile bids for upcoming projects.
– Data Visualization: In addition to the existing horizontal bar chart in Shotgun, there are now new graph types, including pie charts, vertical bar charts and line charts.
– Data Grouping: Data can now be displayed as stacked (see picture) or unstacked bar charts to visualize even greater at-a-glance detail.
– Presets: Users can drag and drop from a number of pre-configured presets to build reports instantly, with flexible customization options.

Shotgun pricing starts at $30 per account/per month with what they call “Awesome” support, or $50 per account/per month for “Super Awesome” support. They are offering free trials here.

Rogue takes us on VR/360 tour of Supermodel Closets

Rogue is a NYC-based creative boutique that specializes in high-end production and post for film, advertising and digital. Since its founding two years ago, executive creative director Alex MacLean and his team have produced a large body of work, providing color grading, finishing and visual effects for clients such as HBO, Vogue, Google, Vice, Fader and more. For the past three years, MacLean has also been at the forefront of VR/360 content for narratives and advertising.

MacLean recently wrapped up post production on four five-minute episodes of 360-degree tours of Supermodel Closets. The series is a project of Conde Nast Entertainment and Vogue for Vogue’s 125th anniversary. If you’re into fashion, this VR tour gives you a glimpse at what supermodels wear in their daily lives. Viewers can look up, down and all around to feel immersed in the closet of each model as she shows her favorite fashions and shares the stories behind her most prized pieces.


Tours include the closets of Lily Aldridge, Cindy Crawford, Kendall Jenner and Amber Valletta.

MacLean worked with director Julina Tatlock, who is a co-founder and CEO of 30 Ninjas, a digital entertainment company that develops, writes and produces VR, multi-platform and interactive content. Rogue and 30 Ninjas worked together to determine the best workflow for the series. “I always think it’s best practice to collaborate with the directors, DPs and/or production companies in advance of a VR shoot to sort out any technical issues and pre-plan the most efficient production process from shoot to edit, stitching through all the steps of post-production,” reports MacLean. “Foresight is everything; it saves a lot of time, money, and frustration for everyone, especially when working in VR, as well as 3D.”

According to MacLean, they worked with a new camera format, the YI Halo camera, which is designed for professional VR data acquisition. “I often turn to the Assimilate team to discuss the format issues because they always support the latest camera formats in their Scratch VR tools. This worked well again because I needed to define an efficient VR and 3D workflow that would accommodate the conforming, color grading, creating of visual effects and the finishing of a massive amount of data at 6.7K x 6.7K resolution.”


The Post
“The post production process began by downloading 30 Ninjas’ editorial, stitched footage from the cloud to ingest into our MacBook Pro workstations to do the conform at 6K x 6K,” explains MacLean. “Organized data management is a critical step in our workflow, and Scratch VR is a champ at that. We were simultaneously doing the post for more than one episode, as well as other projects within the studio, so data efficiency is key.”

“We then moved the conformed 6.7K x 6.7K raw footage to our HP Z840 workstations to do the color grading, visual effects, compositing and finishing. You really need powerful workstations when working at this resolution and with this much data,” reports MacLean. “Spherical VR/360 imagery requires focused concentration, and then we’re basically doing everything twice when working in 3D. For these episodes, and for all VR/360 projects, we create a lat/long that breaks out the left eye and right eye into two spherical images. We then replicate the work from one eye to the other, and color correct any variances. The result is seamless color grading.


“We’re essentially using the headset as a creative tool with Scratch VR, because we can work in realtime in an immersive environment and see the exact results of work in each step of the post process,” he continues. “This is especially useful when doing any additional compositing, such as clean-up for artifacts that may have been missed or adding or subtracting data. Working in realtime eases the stress and time of doing a new composite of 360 data for the left eye and right eye 3D.”
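The lat/long eye-split MacLean describes can be sketched in a few lines. This is a minimal illustration assuming an over/under (top/bottom) stereo layout with the left eye on top, which is one common convention; the actual layout Rogue used isn't specified, and side-by-side is equally common:

```python
import numpy as np

def split_lat_long_stereo(frame):
    """Split a stereo lat/long (equirectangular) frame into left- and
    right-eye spherical images. Assumes an over/under layout with the
    left eye on top (hypothetical; layouts vary by pipeline)."""
    half = frame.shape[0] // 2
    left_eye = frame[:half]
    right_eye = frame[half:]
    return left_eye, right_eye

# A 6720x6720 over/under frame yields two 2:1 equirect images of
# 6720x3360 each, matching the ~6.7K x 6.7K footage described above.
```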

Playback of content in the studio is very important to MacLean and team, and he calls the choice of multiple headsets another piece of the VR/360 puzzle. “The VR/3D content can look different in each headset, so we need to determine a mid-point aesthetic look that displays well in each one. We have our own playback black box that we use to preview the color grading and visual effects before committing to rendering. And then we do a final QC review of the content, and for these episodes we did so in Google Daydream (untethered), HTC Vive (tethered) and the Oculus Rift (tethered).”

MacLean sees rendering as one of their biggest challenges. “It’s really imperative to be diligent throughout all the internal and client reviews prior to rendering. It requires being very organized in your workflow from production through finishing, and a solid QC check. Content at 6K x 6K, VR/360 and 3D means extremely large files and numerous hours of rendering, so we want to restrict re-rendering as much as possible.”

First Impressions: Apple’s new iMac Pro

This London-based video editor gives it a ride

By Thomas Grove Carter

Over the last few days I’ve had the chance to play with the new iMac Pro from Apple. I’m a professional editor at Trim Editing in London, where I cut high-end commercials, music videos and films. I was really excited to see how this new machine, and the upcoming version of Final Cut Pro X (10.4) NLE, could benefit us here and what sorts of things it might be able to achieve.

The Design
This thing looks like an iMac, no doubt about it. It’s the same all-in-one form factor we’ve become accustomed to, but in space grey. I love this design, and I’m a sucker for anything that nears a matte black finish. It’s pretty incredible to have a machine this powerful essentially living inside a display, and it looks great in the edit suite, especially as it comes paired with a space grey keyboard, mouse and trackpad.

Space grey aside, the only external tweaks are around the back — there are four USB 3 ports, four Thunderbolt 3 ports, a 10Gb Ethernet port and large “Vader-like” vents to help cool the eager internals. While those Thunderbolt ports can support two additional 5K displays, what I’m most excited about here is the 10Gb Ethernet port. We can now directly attach our LumaForge Jellyfish shared storage without the need for Thunderbolt conversion.

One last point, because I know I’d be asking this question. Can you buy the keyboard, mouse and trackpad separately? Sadly, apparently you cannot. But if you can somehow justify spending $4,999 on a space grey keyboard, mouse and trackpad, at least you’ll get a free iMac Pro!

The Performance
As I said, I’ve only had my hands on the machine for a couple of days, so I haven’t had the chance to run a full-blown editing job through it yet. But it’s abundantly clear to me that this thing is a beast. It’s by far the fastest Mac I’ve ever used, and according to Apple the most powerful they’ve ever built.

Thermal cooling

The machine I had access to featured a 10-core 3GHz processor, 128GB memory, 2TB SSD and Radeon Pro Vega 64 graphics with 16GB memory. The internal SSD is ridiculously fast. When I tested the speed I got 3021MB/s write and 2465MB/s read. And for anyone who knows what it means (not me), the Geekbench 4 score on the processors was 37,003.

But let’s forget the paper specs for a moment. Here are a few real-world editing tests I ran:

A feature film has been cutting here at Trim over the past few months, so I took the opportunity to hijack the project to see what the export speeds were like. A ProRes HD file took 2 minutes 34 seconds, which is pretty great for a 90-minute timeline. But compressed H.264s are far more common for me as an editor when dealing with upload and review of my cuts. My biggest frustration with all previous Mac Pro machines was that their H.264 export speeds always seemed terrible, because “workstation-class” chips don’t have the hardware acceleration necessary for these tasks. So I was pleasantly surprised to find that Apple seems to be bypassing these limitations somehow, and the iMac Pro delivers fast H.264 exports as well. I have no idea what they are doing behind the scenes to achieve this, but it works and will save me hours in encoding time.
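A quick sanity check on how fast that ProRes export is, assuming the timeline runs the full 90 minutes:

```python
# Realtime multiple for the ProRes export quoted above:
# a 90-minute timeline exported in 2 minutes 34 seconds.
timeline_seconds = 90 * 60               # 5400 s of program
export_seconds = 2 * 60 + 34             # 154 s to export
realtime_multiple = timeline_seconds / export_seconds  # ~35x realtime
```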

Next I decided to push the resolution right up and see how it might handle a ludicrous 8K timeline with footage shot on the Panavision Millennium DXL. With 8K ProRes 4:4:4:4 files, the iMac Pro played the sequence back perfectly. Even after adding a couple of color corrections and a blur to the clips it still didn’t drop a frame. I should add that this was playing back at better quality and without rendering. I’ll repeat that once more. 8K. Color correction. Blur. No Rendering. No “1/4 quality” BS. No frames dropped.

Yes, 8K is an impressive number, but I was also interested to see how it might handle a less friendly codec like R3D, which is notoriously heavy for computers to decode/debayer and play back at full quality. The maximum I managed to test here was 5K Red RAW footage in a 5K timeline. Again, best quality and unrendered. Adding color correction, resizes and titles didn’t cause the machine to drop frames. The sequence played through smoothly, which is nuts.

Trim Editing

While this last test is really impressive, there aren’t many real-world jobs where I’ll be storing an entire film shoot of Red RAW rushes on my internal SSD. So I also checked how this played out on external storage. I’m happy to report that loading the same media onto our Jellyfish shared storage and accessing it over direct-attached 10Gb Ethernet gave me the same results.

These tests really blew me away. They aren’t necessarily going to be everyday scenarios for most people, or even me, but they make it possible to imagine editing workflows in which you’re working at close to the highest quality possible throughout the entire process… on a desktop computer. A space grey one. It’s going to be really interesting to see how the rest of the company reacts to this computer moving forward. While we mainly deal in offline workflows, we have begun to look at possibly taking on more conforming, online, grading work in-house. It’s not hard to conceive that the iMac Pro could be the tool to bring all these elements together for us in a streamlined way.

The Bottom Line
While I really haven’t had enough time to do a deep dive, it’s clearly the best Mac I’ve ever used — it’s stupidly powerful and great to work on.

Thomas Grove Carter

But who is it actually for? Clearly not everyone. It’s quite obviously a pro machine and it comes with a price tag to fit — $4,999. If you’re a pro user who needs a pro Mac, it’s probably for you (and you can get your hands on one starting December 14). If you’re already an iMac user but you need more power, it’s probably for you too. If I had to make a wildly uninformed guess, I’d say this will be more than enough computer for 90% of pros.

There will still understandably be a number of places where this machine will not be enough, and I don’t mean it’s lacking in power — if you’re someone who needs rack-mountable, user-expandable hardware, this may not be for you.

For me, if an equally powerful Mac Pro existed, I’d still choose this iMac Pro over it, because I love the all-in-one compact design and the way it sits in my edit suite. I can’t wait to use the iMac Pro for genuine work and really put it through its paces. I’m excited and slightly dizzied by its power, and the potential that power has for delivering amazing work.

Also, did I mention that it’s space grey…


Thomas Grove Carter is an editor at Trim Editing in London, where they cut commercials, music videos and films. Follow him on Twitter @thomasgcarter.

Apache colorists: Cullen Kelly added, Quinn Alvarez promoted

Santa Monica color and post studio Apache has added colorist Cullen Kelly. The studio has also promoted colorist Quinn Alvarez from an assistant’s role.

Kelly joins from Labrador Post, a color grading studio he founded in Austin, Texas. He has relocated to Southern California. Alvarez has been with Apache since 2015, joining from the production company Prettybird, where he handled all post duties and worked closely with its directors and producers.

“We’re currently working on several scripted and documentary shows for Hulu, Netflix and Amazon, in addition to our commercial work for agencies,” says Apache managing partner LaRue Anderson. “We needed additional artists who bring a unique perspective to color grading to handle these assignments, not just helping hands.”

Kelly worked with Apache earlier this year as a freelancer, doing finishing for the debut season of Netflix’s American Vandal series. Kelly, who studied film at the Art Center College of Design in Pasadena before launching his career, worked in several post jobs before focusing on color grading. In addition to his work on the Netflix series, his reel includes short films and promos for The History Channel, FX Networks and SXSW.

When asked what drew him to color grading, Kelly said, “I’m a very visual person, and I love the amount of detail and energy that goes into color work. And it’s so collaborative; you’re working with people and helping bring their vision to life.”

A graduate of UC Berkeley, Alvarez says that while working at Prettybird he learned the craft of color grading from a director’s point of view, stressing the importance of story and substance. “I like the pace of color work, too,” he adds. “There’s always a new challenge, and new clients to work with. It keeps me fresh. And color is typically one of the final stages in a project — you’re putting the polish on things, so to speak, so people always leave happy.”

His reel includes work for such brands as Nike, Absolut, Jack Daniels, Tumi, Toyota, Lexus and Mercedes-Benz, as well as music videos shot by such directors as Paul Hunter, Eric Wareheim and Andy Hines.

Both Kelly and Alvarez use Blackmagic Resolve.

Apache’s branching out from just color to handling finishing is also driving its need to add more creative talent, reports Anderson: “Keeping the color and finish under the same roof, particularly for long-form projects, allows us to swiftly complete a show. That adds valuable time to our clients’ often-constrained post schedules, without compromising the look and feel of the film. And we’re finding that cinematographers and directors are moving to original series work, because it can offer more creative freedom. With the addition of Cullen and the promotion of Quinn, we now have five colorists to help transform their digital visions into reality.”

Three Billboards Outside Ebbing, Missouri director Martin McDonagh

By Iain Blair

Anglo-Irish playwright Martin McDonagh won an Academy Award for Best Live Action Short Film for Six Shooter, his first foray into film, and followed that project with his feature film debut In Bruges. Starring Colin Farrell, Ralph Fiennes and Brendan Gleeson, that gangster action/comedy premiered at the Sundance Film Festival in 2008 and won McDonagh a BAFTA Award and an Oscar nom for Best Original Screenplay.

He followed that up with Seven Psychopaths, another twisted tale about some incompetent dognappers and vengeful mobsters that reunited him with Farrell, along with a stellar cast that included Woody Harrelson, Sam Rockwell, Christopher Walken and Tom Waits.

Now McDonagh is back with his latest film, Three Billboards Outside Ebbing, Missouri. This darkly comedic drama stars Oscar-winner Frances McDormand as Mildred Hayes, a grieving, no-holds-barred vengeful mother. After months have passed without any progress in her daughter’s murder case, she takes matters into her own hands and commissions three signs leading into town with a controversial message directed at William Willoughby (Woody Harrelson), the town’s respected chief of police. When his second-in-command Officer Dixon (Sam Rockwell), an immature mother’s boy with a penchant for violence, gets involved, the battle between Mildred and Ebbing’s law enforcement is only exacerbated.

The Fox Searchlight Pictures release also features an impressive team of collaborators behind the camera: director of photography Ben Davis, BSC, production designer Inbal Weinberg, film editor Jon Gregory and composer Carter Burwell.

I recently spoke with McDonagh about making the film, which was just nominated for a Golden Globe for Best Picture — Drama and is already generating a lot of Oscar buzz. McDormand, McDonagh and Rockwell were also nominated for Golden Globes. Ok, let’s find out more…

L-R: Martin McDonagh and Woody Harrelson on set.

This film starts off like a simple tale of revenge, but it then becomes apparent that there’s a lot more going on.
Exactly. I never wrote it as a simple revenge piece. It was always going to be a dark comedy, and the main thing I wanted was to have a very strong woman in the lead — and a shockingly outrageous one at that. I wanted to write the character as real and human in her grief as possible.

Is it true you wrote the role with Frances in mind?
It is. I immediately thought of her as Mildred as I felt she had all the elements that Mildred needed. She had to have a kind of working class sensibility and also not sentimentalize the character. And I knew she had the range and could play the anguish and darkness of Mildred but also deal with the humor, while staying true to who Mildred is as a character.

So what would have happened if she’d turned down the role?
I probably wouldn’t have done the film. I’m so glad she wanted to do it and I didn’t have to worry about it. It definitely wouldn’t have been the film it is without her.

What did she bring to the role?
First, she’s the best actor of her generation, I think, and she brought a lot of integrity and honesty, and I knew she’d play it truthfully and not just go for laughs, and not patronize Mildred and try and make her more lovable — because she isn’t very lovable. She was completely fearless about taking it on.

What about Woody?
He has less time to play with his character, but again he brought a lot of integrity, and he is very lovable — a guy you instantly like, a decent good guy.

You also reunited with Sam Rockwell, whose character ultimately takes the biggest journey.
Like with Frances, I specifically wrote the part for him, as I always have his voice in my mind when I write these dark but slightly comedic characters. And again, there’s something inherently lovable about Sam, so while Dixon seems to be everything you would despise in a man — he’s a racist, he’s violent, he’s obnoxious — Sam also makes him redeemable, and gives him this slightly child-like quality. By the end, he doesn’t do a 180-degree turn, but Sam gives him enough of an inner change that should come as a surprise.

Ebbing is a fictional place. Where did you shoot?
In a little town called Sylva, near Asheville, North Carolina, in the Great Smoky Mountains. It’s a nice place that doesn’t hint at anything dark, and it had all the locations we needed, all close by. It was a really joyful shoot, just under two months, and it had a great family feel as I’d worked with several of the actors before — and the DP, 1st AD and some others. All of the locals were very helpful.

How do you feel about the post process?
I really enjoy it, especially the editing: looking through every single take, making notes and then going through all the performances and crafting and sculpting a scene with editor Jon Gregory. I love all that, and watching actors do what they do. That’s the biggest joy for me. What’s so interesting about post is that scenes I felt could never be cut out when I wrote or shot them may turn out to be unnecessary in the edit, and I’m happy to lose them. I find editing very relaxing; you have time to explore all the material as you piece it together. The parts of post that I find a bit tedious are dealing with CGI and VFX, and all the waiting around for them.

Where did you edit and post this?
We did it all in Soho, London, at Goldcrest and various places.

Tell us about working with the editor. Was he on the set?
Jon came out to North Carolina a week before the shoot so he could see the place and we could talk about stuff. And as it’s a small, walkable place, I could wander over and see what he was up to while we were shooting. Jon’s great in that if he felt we’d missed a shot or moment, he’d let me know and we could do a pick-up, which is no problem when you have all the actors there. That happened a couple of times.

The main challenge was keeping the right balance between all the dark stuff and the comedy so that it flowed and wasn’t jarring. Tone and pacing are always key for me in the edit, and finding the moments of tenderness — that look in someone’s eyes as the rage and anger take care of themselves. I think the film’s more about loss and pain and hope than dark anger.

Who did the visual effects work, and how many visual effects shots were there?
There are a few, mainly taking stuff out and clean up. All of the fire sequences were done with real fire, but then we added VFX flames to enhance the look. And the whole Molotov cocktail scene was done with VFX. The scene where Sam goes across the street, up the stairs and then throws the guy out the window was all real — and done in one unbroken shot.

L-R: Martin McDonagh and writer Iain Blair.

How important are sound and music to you?
Hugely important. It’s half the film, at least, and I’ve loved Carter Burwell’s work ever since I saw Blood Simple. He’ll always do the opposite of what you’d expect and play against convention, which is partly why he’s so good. But he also comes up with beautiful melodies. I went over to see him in New York in the middle of the edit, and he played me a few ideas, which I loved as it had this great mix of Americana and Spaghetti Western. The score he wrote works perfectly for the characters and the themes. I love doing the sound mix and hearing how it elevates all the visuals so much. (Burwell received one of the film’s six Golden Globe nominations.)

Who did the DI?
Colorist Adam Glasman (at Goldcrest Post), who did my other films. I’m very involved, and pop in and give notes, but I really trust Adam and the DP to get the look I want.

The film’s getting a lot of Oscar and awards season buzz. How important is that to you?
It’s a small film with a small budget — obviously not one of the huge blockbusters like Star Wars and so on, so it’s great to be included in the conversation. It’s helped give it a lot of momentum, and I kind of like all the attention!


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Senior compositing artist Marcel Lemme

We recently reached out to Marcel Lemme to find out more about how he works, his background and how he relaxes.

What is your job title and where are you based?
I’m a senior compositing artist based out of Hamburg, Germany.

What does your job entail?
I spend about 90 percent of my time working on commercial jobs for local and international companies like BMW, Audi and Nestle, but also dabble in feature films, corporate videos and music videos. On a regular day, I’m handling everything from job breakdowns to set supervision to conform. I’m also doing shot management for the team, interacting with clients, showing clients work and some compositing. Client review sessions and final approvals are regular occurrences for me too.

What would surprise people the most about the responsibilities that fall under that title?
When it comes to client attended sessions, you have to be part clown, part mind-reader. Half the job is being a good artist; the other half is keeping clients happy. You have to anticipate what the client will want and balance that with what you know looks best. I not only have to create and keep a good mood in the room, but also problem-solve with a smile.

What’s your favorite part of your job?
I love solving problems when compositing solo. There’s nothing better than tackling a tough project and getting results you’re proud of.

What’s your least favorite?
Sometimes the client isn’t sure what they want, which can make the job harder.

What’s your most productive time of day?
I’m definitely not a morning guy, so the evening — I’m more productive at night.

If you didn’t have this job, what would you be doing instead?
I’ve asked myself this question a lot, but honestly, I’ve never come up with a good answer.

How’d you get your first job, and did you know this was your path early on?
I fell into it. I was young and thought I’d give computer graphics a try, so I reached out to someone who knew someone, and before I knew it I was interning at a company in Hamburg, which is how I came to know online editing. At the time, Quantel mostly dominated the industry with Editbox and Henry, and Autodesk Flame and Flint were just emerging. I dove in and started using all the technology I could get my hands on, and gradually started securing jobs based on recommendations.

Which tools are you using today, and why?
I use whatever the client and/or the project demands, whether it’s Flame or Foundry’s Nuke; for tracking, I often use The Pixel Farm’s PFTrack and Boris FX’s Mocha. For commercial spots, I’ll do a lot of the conform and shot management on Flame and then hand off the shots to other team members. Or, if I do it myself, I’ll finish in Flame because I know I can do it fast.

I use Flame because it gives me different ways to achieve a certain look or find a solution to a problem. I can also play a clip at any resolution with just two clicks in Flame, which is important when you’re in a room with clients who want to see different versions on the fly. The recent open clip updates and Python integration have also saved me time. I can import and review shots, with automatic versions coming in, and build new tools or automate tedious processes in the post chain that have typically slowed me down.

Tell us about some recent project work.
I recently worked on a project for BMW as a compositing supervisor, collaborating with eight other compositors to finish a number of versions in a short amount of time. We did shot management, compositing, reviewing, versioning and such in Flame, with individual shot compositing in Nuke and some tracking in Mocha Pro.

What is the project that you are most proud of?
There’s no one project that stands out in particular, but overall, I’m proud of jobs like the BMW spots, where I’ve led a team of artists and everything just works and flows. It’s rewarding when the client doesn’t know what you did or how you did it, but loves the end result.

Where do you find inspiration for your projects?
The obvious answer here is other commercials, but I also watch a lot of movies and, of course, spend time on the Internet.

Name three pieces of technology you can’t live without.
The off button on the telephone (they should really make that bigger), anything related to cinematography or digital cinema, and streaming technology.

What social media channels do you follow?
I’ve managed to avoid Facebook, but I do peek at Twitter and Instagram from time to time. Twitter can be a great quick reference for regional news or finding out about new technology and/or industry trends.

Do you listen to music while you work?
Less now than I did when I was younger. Most of the time, I can’t as I’m juggling too much and it’s distracting. When I listen to music, I appreciate techno, classical and singer/songwriter stuff; whatever sets the mood for the shots I’m working on. Right now, I’m into Iron and Wine and Trentemøller, a Danish electronic music producer.

How do you de-stress from the job?
My drive home. It can take anywhere from a half an hour to an hour, depending on the traffic, and that’s my alone time. Sometimes I listen to music, other times I sit in silence. I cool down and prepare to switch gears before heading home to be with my family.