Object Matrix and Arvato Systems have partnered to help companies instantly access, manage, browse and edit clips from their digital archives.
Using Arvato’s production asset management platform, VPMS EditMate, along with MatrixStore, the media-focused object storage solution from Object Matrix, the companies report that organizations can significantly reduce the time needed to manage media workflows while making content easily discoverable. The integration makes it easy to unlock assets held in archive, enable creative collaboration and monetize archived assets.
MatrixStore is a media-focused private and hybrid cloud storage platform that provides instant access to all media assets. Built upon object-based storage technology, MatrixStore delivers digital content governance through an integrated and automated storage platform that supports multiple media-based workflows while remaining secure and scalable.
VPMS EditMate is a toolkit built for managing and editing projects in a streamlined, intuitive and efficient manner, all from within Adobe Premiere Pro. From project creation and collecting media, to the export and storage of edited material, users benefit from a series of features designed to simplify the spectrum of tasks involved in a modern and collaborative editing environment.
Linda Hamilton’s Sarah Connor and Arnold Schwarzenegger’s T-800 are back to save humanity from a dystopian future in this latest installment of the Terminator franchise. James Cameron is also back and brings with him writing and producing credits, which is fitting — Terminator: Dark Fate is in essence Cameron’s sequel to Terminator 2: Judgment Day.
Tim Miller (Deadpool) is at the helm to direct the tale. It’s roughly two decades after the time of T2, and a new Rev-9 machine has been sent from an alternate future to kill Dani Ramos (Natalia Reyes), an unsuspecting auto plant worker in Mexico. But the new future’s resistance has sent back Grace (Mackenzie Davis), an enhanced super-soldier, to combat the Rev-9 and save her. They cross paths with Connor, and the story sets off on a mad dash to the finale at Hoover Dam.
Miller brought back much of his Deadpool team, including his VFX shop Blur, DP Ken Seng and editor Julian Clarke. This is also the second pairing of Miller and Clarke with Adobe. Both Deadpool and Terminator: Dark Fate were edited using Premiere Pro. In fact, Adobe was also happy to tie in with the film’s promotion through its own #CreateYourFate trailer remix challenge. Participants could re-edit their own trailer using supplied content from the film.
I recently spoke with Clarke about the challenges and fun of cutting this latest iteration of such an iconic film franchise.
Terminator: Dark Fate picks up two decades after Terminator 2, leaving out the timelines of the subsequent sequels. Was that always the plan, or did it evolve out of the process of making the film?
That had to do with the screenplay. You were written into a corner by the various sequels. We really wanted to bring Linda Hamilton’s character back. With Jim involved, we wanted to get back to first principles and have it based on Cameron’s mythology alone. To get back to the Linda/Arnold character arcs, and then add some new stuff to that.
Many fans were attracted to the franchise by Cameron’s two original Terminator films. Was there a conscious effort at integrating that nostalgia?
I come from a place of deep fandom for Terminator 2. As a teenager I had VHS copies of Aliens and Terminator 2 and watched them on repeat after school! Those films are deeply embedded in my psyche, and both of them have aged well — they still hold up. I watched the sequels, and they just didn’t feel like Terminator films to me. So the goal was definitely to make it of the DNA of those first two movies. There’s going to be a chase. It’s going to be more grounded. It’s going to get back into the Sarah Connor character and have more heart.
This film has elements of humor, unlike most other action films. That must have posed a challenge to set the right tone without getting campy.
The humor thing is interesting. Terminator 2 has a lot of humor throughout. We have a little bit of humor in the first half and then more once Arnold shows up, but that’s really the way it had to be. The Dani Ramos character — who’s your entry point into the movie — is devastated when her whole family is killed. To have a lot of jokes happening would be terrible. It’s not the same in Terminator 2 because John Connor’s stepparents get very little screen time, and they don’t seem that nice. You feel bad for them, but it’s OK that you get into this funny stuff right off the bat. On this one we had to ease into the humor so you could [experience] the gravity of the situation at the start of the movie.
Did you have to do much to alter that balance during the edit?
There were one or two jokes that we nipped out, but it wasn’t like that whole first act was chock full of jokes. The tone of the first act is more like Terminator, which is more of a thriller or horror movie. Then it becomes more like T2 as the action gets bigger and the jokes come in. So the first half is like a bigger Terminator and the second half more like T2.
Deadpool, which Tim Miller also directed, used a very nonlinear story structure, balancing action, comedic moments and drama. Terminator was always designed with a linear, straightforward storyline. Right?
A movie hands you certain editing tools. Deadpool was designed to be nonlinear, with characters in different places, so there are a whole bunch of options for you. Terminator: Dark Fate is more like a road movie. The destinations of certain stops along the road are predetermined. You can’t be in Texas before Mexico. So the structural options you had were where to check in with the Rev-9, as well as the inter-scene structure. Once you are in the detention center, who are you cutting to? Sarah? Dani? However, where that is placed in the movie is pretty much set. All you can do is pace it up, pace it down, adjust how to get there. There aren’t a lot of mobile pieces that can be swapped around.
When we had talked after Deadpool, you discussed how you liked the assistants to build string-outs — what some call a Kem roll — where similar action from every take is assembled back to back into a single sequence, in order. Did you use that same organizational method on Terminator: Dark Fate?
Sometimes we were so swamped with material that there wasn’t time to create string-outs. I still like to have those. It’s a nice way to quickly see all the pieces that cover a moment. If you are trying to find the one take or action that’s 5% better than another, then it’s good to see them all in a row, rather than trying to keep it all in your head for a five-minute take. There was a lot of footage that we shot in the action scenes, but we didn’t do 11 or 12 takes for a dialogue scene. I didn’t feel like I needed some tool to quickly navigate through the dialogue takes. We would string out the ones that were more complicated.
Depending on the directing style, a series of takes may show increasingly calibrated performances as shooting progresses. With other directors, each take might be a lot different from the ones before and after it. What is your approach to evaluating which is the best take to use?
It’s interesting when you use the earlier takes versus the later takes and what you get from them. The later takes are usually the ones that are most directed. The actors are warmed up and most closely nail what the director has in mind. So they are strong in that regard, but sometimes they can become more self-conscious. So sometimes the first take is more of a throwaway and may have less power, but it feels more real — more off the cuff. Sometimes a delivered dialogue line feels less written, and you’ll buy it more. Other times you’ll want that more dramatic quality of the later takes. My instinct is to first use the later takes, but as you start to revise a scene, you often go back to pieces of the earlier takes to ground it a little more.
How long did the production and post take?
It took a little over 100 days of shooting with a lot of units. I work on a lot of mid-budget films, so this seemed like a really long shoot. It was a little relentless for everyone — even squeezing it into those 100 days. Shooting action with a lot of VFX is slow due to the reset time needed between takes. The ending of the movie is 30 minutes of action in a row. That’s a big job shooting all of that stuff. When they have a couple of units cranking through the dialogue scenes plus shooting action sequences — that’s when I have to work hard to keep up. Once you hit the roadblocks of shooting just those little action pieces, you get a little time to catch up.
We had the usual director’s cut period and finished by the end of this September. The original plan was to finish by the beginning of September, but we needed the time for VFX. So everything piled up with the DI and the mix in order to still hit the release date. September got a little crazy. It seems like a long time — a total of 13 or 14 months — but it still was an absolute sprint to get the movie in shape and get the VFX into the film in time. This might be normal for some of these films, but compared to the other VFX movies I’ve done, it was definitely turning things up a notch!
I imagine that there was a fair amount of previz required to lay out the action for the large VFX and CG scenes. Did you have that to work with as placeholder shots? How did you handle adjusting the cut as the interim and final shots were delivered?
Tim is big into previz with his background in VFX and animation and owning his own VFX company. We had very detailed animatics going into production. Depending on a lot of factors, you still abandon a lot of things. For example, the freeway chases are quite a bit different because when you go there and do it with real cars, they do different things. Or only part of the cars look like they are going fast enough. Those scenes became quite different than the previz.
Others are almost 100% CG, so you can drop in the previz as placeholders. Although, even in those cases, sometimes the finished shot doesn’t feel real enough. In the “cartoon” world of previz, you can do wild camera moves and say, “Wow, that seems cool!” But when you start doing it at photoreal quality, then you go, “This seems really fake.” So we tried to get ahead of that stuff and find what to do with the camera to ground it. Kind of mess it up so it’s not too dynamic and perfect.
How involved were you with shaping the music? Did you use previous Terminator films’ scores as a temp track to cut with?
I was very involved with the music production. I definitely used a lot of temp music. Some of it was ripped from old Terminator movies, but there’s only so much Terminator 2 music you can put in. Those scores used a lot of synthesizers that date the sound. I did use “Desert Suite” from Terminator 2, when Sarah is in the hotel room. I loved having a very direct homage to a Sarah Connor moment while she’s talking about John. Then I begged our composer, Tom Holkenborg (aka Junkie XL), to consider doing a version of it for our movie. So it is essentially the same chord progression.
That raised an interesting question, musically and in general, about how much you lean into the homage thing. It’s powerful when you do it, but if you do it too much, it starts to feel artificial or pandering. So I tried to hit the sweet spot so you knew you were watching a Terminator movie, but not so much that it felt like Terminator karaoke. How many times can you go da-dum-dum-da-da-dum? You have to pick your moments for those Terminator motifs. It’s diminishing returns if you do it too much.
Another inspirational moment for me was another part in Terminator 2. There’s a disturbing industrial sound for the T-1000. It sounds more like a foghorn or something in a factory rather than music, and it created this unnerving quality to the T-1000 scenes, when he’s just scoping things out. So we came up with a modern-day electronic equivalent for the Rev-9 character, and that was very potent.
Was James Cameron involved much in the post production?
He’s quite busy with his Avatar movies. Some of the time he was in New Zealand, some of the time he was in Los Angeles. Depending on where he was and where we were in the process, we would hit milestones, like screenings or the first cut. We would send him versions and download a bunch of his thoughts.
Editing is very much a part of his wheelhouse. Unlike many other directors, he really thinks about this shot, then that shot, then the next shot. His mind really works that way. Sometimes he would give us pretty specific, dialed-in notes on things. Sometimes it would just be bigger suggestions, like, “Maybe the action cutting pattern could be more like this …” So we’d get his thoughts — and, of course, he’s Jim Cameron, and he knows the business and the Terminator franchise — so I listened pretty carefully to that input.
This is the second film that you’ve cut with Premiere Pro. Deadpool was first, and there were challenges using it on such a complex project. What was the experience like this time around?
Whenever you set out to use a new workflow, there are going to be growing pains — not that Premiere is new, because it’s been around a long time and has millions of users, but it’s unusual to use it on large VFX movies, for specific reasons.
L-R: Matthew Carson and Julian Clarke
On Deadpool, that led to certain challenges, and that’s just what happens when you try to do something new. For instance, we had to split the movie into separate projects for each reel instead of working in one large project. Even so, the size of our project files made it tough. They were so full of media that they would take five minutes to open. Nevertheless, we made it work, and there are lots of benefits to using Adobe over other applications.
In comparison, the interface to Avid Media Composer looks like it was designed 20 years ago, but they have multi-user collaboration nailed, and I love the trim tool. Yet, some things are old and creaky. Adobe’s not that at all. It’s nice and elegant in terms of the actual editing process. We got through it and sat down with Adobe to point out things that needed work, and they worked on them. When we started up Terminator, they had a whole new build for us. Project files now opened in 15 seconds. They are about halfway there in terms of multi-user editing. Now everyone can go into a big, shared project, and you can move bins back and forth. Although, only one user at a time has write access to the master project.
This is not simple software they are writing. Adobe is putting a lot of work into making it a more fitting tool for this type of movie. Even though this film was exponentially larger than Deadpool, from the Adobe side it was a smoother process. Props to them for doing that! The cool part about pioneering this stuff is the amount of work that Adobe is on board to do. They’ll have people work on stuff that is helpful to us, so we get to participate a little in how Adobe’s software gets made.
With two large Premiere Pro projects under your belt, what sort of new features would you like to see Adobe add to the application to make it even better for feature film editors?
They’ve built out the software from being a single-user application to being multi-user software, but the inherent software at the base level is still single-user. Sometimes your render files get unlinked when you go back and forth between multiple users. There’s probably stuff where they have to dig deep into the code to make those minor annoyances go away. Another item I’d like to see: not having to use third-party software to send change lists to the mix stage.
I know Premiere Pro integrates beautifully with After Effects, but for me, After Effects is this precise tool for executing shots. I don’t want a fine tool for compositing — I want to work in broad strokes and then have someone come back and clean it up. I would love to have a tracking tool to composite two shots together for a seamless, split screen of two combined takes — features like that.
The After Effects integration and the color correction are awesome features for a single user to execute the film, but I don’t have the time to be the guy to execute the film at that high level. I just have to keep going. I want to be able to do a fast and dirty version so I know it’s not a terrible idea, and then turn to someone else and say, “OK, make that good.” After Effects is cool, but it’s more for VFX editors or single users who are trying to make a film on their own.
After all of these action films, are you ready to do a different type of film, like a period drama?
Funny you should say that. After Deadpool I worked on The Handmaid’s Tale pilot, and it was exactly that. I was working on this beautifully acted, elegant project with tons of women characters and almost everything was done in-camera. It was a lot of parlor room drama and power dynamics. And that was wonderful to work on after all of this VFX/action stuff. Periodically it’s nice to flex a different creative muscle.
It’s not that I only work on science-fiction/VFX projects — which I love — but, in part, people start associating you with a certain genre, and then that becomes an easy thing to pursue and get work for.
Much like acting, if you want to be known for doing a lot of different things, you have to actively pursue it. It’s easy to go where momentum will take you. If you want to be the editor who can cut any genre, you have to make it a mission to pursue those projects that will keep your resume looking diverse. For a brief moment after Deadpool, I might have been able to pivot to a comedy career (laughs). That was a real hybrid, so it was challenging to thread the needle of the different tones of the film and make it feel like one piece.
Any final thoughts on the challenges of editing Terminator: Dark Fate?
The biggest challenge of the film was that, in a way, the film was an ensemble with the Dani character, the Grace character, the Sarah character and Arnold’s character — the T-800. All of these characters are protagonists who have their individual arcs. Adequately servicing those arcs without grinding the movie to a halt, or going too long without touching base with a character — figuring out how to dial that in was the major challenge of the movie, plus the scale of the VFX and finessing all the action scenes. I learned a lot.
Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com
At IBC 2019, Adobe introduced a new reframing/reformatting feature for Premiere Pro called Auto Reframe. Powered by Adobe Sensei, the company’s AI/machine learning framework, Auto Reframe intelligently reframes and reformats video content for different aspect ratios, from square to vertical to cinematic 16:9 versions. Like the recently introduced Content-Aware Fill for After Effects, Auto Reframe uses AI and machine learning to accelerate manual production tasks without sacrificing creative control.
For anyone who needs to optimize content for different platforms, Auto Reframe will save valuable hours by automating the tedious task of manually reframing content every time a different video platform comes into play. It can be applied as an effect to individual clips or to whole sequences.
Auto Reframe will launch in Premiere Pro later this year. You can watch Adobe’s Victoria Nece talk about Auto Reframe and more from the IBC 2019 show floor.
Microsoft has released its Windows Mixed Reality (WMR) platform as part of the Fall Creators Update to Windows 10. The platform supports a variety of immersive experiences, and thankfully there are now WMR headsets available from many familiar names in the hardware business. One of those is Lenovo, which kindly sent me its Explorer WMR headset to test on my ThinkPad P71. This provided me with a complete VR experience on Lenovo hardware.
On November 15, Microsoft released beta support for SteamVR on WMR devices, which allows WMR headsets to be used in applications that are compatible with SteamVR. For example, the newest release of Adobe Premiere Pro (CC 2018, or v12.0) uses SteamVR for 360 video preview.
My goal for this article was to see if I could preview my 360 videos in a Lenovo headset while editing in Premiere, especially now that I had new 360 footage from my GoPro Fusion camera. I also provide some comparisons to the Oculus Rift, which I reviewed for postPerspective in October.
There are a number of advantages to the WMR options, including lower prices and hardware requirements, higher image resolution and simpler setup. Oculus and HTC’s VR-Ready requirements have always been a bit excessive for 360 video, because unlike true 3D VR there is no 3D rendering involved when playing back footage from a fixed perspective. But would it work? No one seemed to know if it would, but Lenovo was willing to let me try.
The first step is to upgrade your installation of Windows 10 with the Fall Creators Update, which includes integrated support for Windows Mixed Reality headsets. Once it is installed, you can plug in the headset’s single cable (one USB 3.0 and one HDMI connector) and Windows will automatically configure the device and its drivers for you. You will also need to install Valve’s Steam application and SteamVR, which adds support for VR content. The next step is to find Microsoft’s Windows Mixed Reality for SteamVR in the Steam store, which is a free installation. Once you confirm that the headset is functioning in WMR and then in SteamVR, open up Premiere Pro and test it out.
Working in Premiere Pro
Within Premiere Pro, preview and playback worked immediately within my existing immersive project. I watched footage captured with my Samsung Gear 360 and GoPro Fusion cameras. The files played, and the increased performance within the new version of the software is noticeable. My 4K and 5K 30fps content worked great, but my new 3Kp60 content only played when Mercury Playback was set to software-only, which disabled most of the new Immersive Video effects. In CUDA mode, I could hold down the right arrow and watch it progress in slow motion, but pressing the space bar caused the VR preview to freeze even though it played fine on the laptop monitor. The 60p content played fine in the Rift, so this appears to be an issue specific to WMR. Hopefully, that will be addressed in a software update in the near future.
The motion controllers were visible in the interface and allowed me to scrub the timeline, but I still had to use the space bar to start and stop playback. (Update: The 12.1 release of Premiere Pro supports WMR headsets, and testing confirms that 60p now works and that the motion controllers are fully functional and can control playback.) One other issue that arose was that the mouse cursor is hidden when the display is snapped down into place over my eyes, which is an intrinsic feature of WMR. I had to tip it up out of the way every time I wanted to make a change, instead of just peeking under it, which means a lot of snapping the headset up and down.
I found the WMR experience to be slightly less solid than the Oculus system. It would occasionally lag on the tracking for a couple of frames, causing the image to visibly jump. This may be due to the integrated tracking, as opposed to dedicated external cameras. The boundary system is a visual distraction, so I would recommend disabling it if you are primarily using the headset for 360 video, because that doesn’t require moving much within your space. Setup on the WMR is better: it is much easier, with lower requirements and fewer ports needed. The resolution is higher than the Oculus Rift I had tested (1440×1440 per eye instead of 1080×1200), so I wanted to see how much of a difference that would make. The Explorer also has a narrower field of view (105 degrees instead of 110), which I wouldn’t expect to make a difference, but I think it did.
By my calculations, the increased resolution should allow you to resolve a 5K sphere, compared to the 3.9K resolution available from the Rift (1440 pixels ÷ 105° × 360° vs. 1080 pixels ÷ 110° × 360°). You will also want a pair of headphones or earbuds to plug into the headset so the audio tracks with your head (compared to your computer speakers, which are fixed).
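For anyone who wants to check that back-of-the-envelope math, here is a minimal sketch of the calculation, using the per-eye pixel counts and fields of view quoted above (the helper function is my own):

```python
def sphere_width(pixels_across, fov_degrees):
    """Horizontal pixels a full 360-degree sphere would need
    to match the headset's pixel density across its field of view."""
    return pixels_across / fov_degrees * 360

print(round(sphere_width(1440, 105)))  # Lenovo Explorer: ~4937, roughly a 5K sphere
print(round(sphere_width(1080, 110)))  # Oculus Rift: ~3535; plugging in the Rift's
                                       # 1200-pixel dimension instead gives ~3927 (~3.9K)
```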
The Feel of the Headset
The headset is designed very differently from the Rift, and the display can be tipped up out of the way while the headband is still on. It is also way easier to put on and remove, but a bit less comfortable to keep on for longer periods of time. The headband has to be on tight enough to hold the display in front of your eyes, since it doesn’t rest on your face, and the cabling has to slide through a clip on the headband when you fold the display upward. And since you have to fold the display upward to use the mouse, it is a frequent annoyance. But between the motion controllers and the keyboard, you can navigate and playback while the headset is on.
Using the Microsoft WMR lobby interface was an interesting experience, but I’m not sure if it’s going to catch on. SteamVR’s lobby experience isn’t much better, but Steam does offer a lot more content for its users. I anticipate Steam will be the dominant software platform based on the fact that most hardware vendors — HTC, Oculus, WMR — support it. The fact that Adobe chose SteamVR for its immersive preview experience is why these new WMR headsets work in Premiere Pro without any further software updates needed on Adobe’s part. (Adobe doesn’t officially support this configuration yet, hence the “beta” designation in SteamVR, but besides 60p playback, I was very happy.) Hopefully, we will see further increased support and integration between the various hardware and software options in the future.
Currently, the Lenovo Explorer and the Oculus Rift are both priced the same at $399 — I say currently because prices have been fluctuating, so investigate thoroughly. So which one is better? Well, neither is a clear winner. Each has its own strengths. The Rift has more specific hardware requirements and lower total resolution. The Explorer requires Windows 10, but will work on a wider array of systems. The Rift is probably better for periods of extended use, while I would recommend the Explorer if you are going to be doing something that involves taking it on and off all the time (like tweaking effects settings in Adobe apps). Large fixed installations may offer a better user experience with the Rift or Vive on a powerful GPU, but most laptop users will probably have an easier time with the Explorer (no external camera to calibrate and fewer ports needed).
Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.
There is a new movie coming out this week that is fairly unique. Telling the true story of Eric LeMarque surviving eight days lost in a blizzard, 6 Below: Miracle on the Mountain is the first film shot and edited in its entirety for the new Barco Escape theatrical format. If you don’t know what Barco Escape is, you are about to find out.
This article is meant to answer just about every question you might have about the format and how we made the film, on which I was post supervisor, production engineer and finishing editor.
What is Barco Escape?
Barco Escape is a wraparound visual experience — it consists of three projection screens filling the width of the viewer’s vision with a total aspect ratio of 7.16:1. The exact field of view will vary depending on where you are sitting in the auditorium, but it is usually 120-180 degrees. Similar to IMAX, the point is not to fill the entire screen with your main subject, but to leave that in front of the audience and let the rest of the image surround them and fill their peripheral vision for a more immersive experience. Three separate 2K scope theatrical images play at once, resulting in 6144×858 pixels of imagery to fill the room.
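The arithmetic behind those numbers is easy to verify. A quick sketch using only the figures quoted above:

```python
# Three 2K "scope" (2048x858) images side by side form the Escape canvas.
screens, screen_w, screen_h = 3, 2048, 858
canvas_w = screens * screen_w
print(f"{canvas_w}x{screen_h}")       # 6144x858
print(round(canvas_w / screen_h, 2))  # 7.16, i.e. the 7.16:1 aspect ratio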
Is this the first Barco Escape movie?
Technically, four other films have screened in Barco Escape theaters, the most popular one being last year’s release of Star Trek Beyond. But none of these films used the entire canvas offered by Escape throughout the movie. They had up to 20 minutes of content on the side screens, but the rest of the film was limited to the center screen that viewers are used to. Every shot in 6 Below was framed with the surround format in mind, and every pixel of the incredibly wide canvas is filled with imagery.
How are movies created for viewing in Escape?
There are two approaches that can be used to fill the screen with content. One is to place different shots on each screen in the process of telling the story. The other is to shoot a wide enough field of view and high enough resolution to stretch a single image across the screens. For 6 Below, director Scott Waugh wanted to shoot everything at 6K, with the intention of filling all the screens with main image. “I wanted to immerse the viewer in Eric’s predicament, alone on the mountain.”
Cinematographer Michael Svitak shot with the Red Epic Dragon. He says, “After testing both spherical and anamorphic lens options, I chose to shoot Panavision Primo 70 prime lenses because of their pristine quality across the entire imaging frame.” He recorded in 6K-WS (2.37:1 aspect ratio at 6144×2592), framing with both 7:1 Barco Escape and a 2.76:1 4K extraction in mind. Red does have an 8:1 option and a 4:1 option that could work if Escape were your only deliverable, but since there are very few Escape theaters at the moment, you would be painting yourself into a corner. Having more vertical resolution available in the source footage opens up all sorts of workflow possibilities.
This still left a few challenges in post: to adjust the framing for the most comfortable viewing and to create alternate framing options for other deliverables that couldn’t use the extreme 7:1 aspect ratio. Other projects have usually treated the three screens separately throughout the conform process, but we treated the entire canvas as a single unit until the very last step, breaking out three 2K streams for the DCP encode.
What extra challenges did Barco Escape delivery pose for 6 Below’s post workflow?
Vashi Nedomansky edited the original 6K R3D files in Adobe Premiere Pro, without making proxies, on some maxed-out Dell workstations. We did the initial edit with curved ultra-wide monitors and 4K TVs. “Once Mike McCarthy optimized the Dell systems, I was free to edit the source 6K Red RAW files and not worry about transcodes or proxies,” he explains. “With such a quick turnaround every day, and so much footage coming in, it was critical that I could jump on the footage, cut my scenes, see if they were playing well and report back to the director that same day if we needed additional shots. This would not have been possible time-wise if we were transcoding and waiting for footage to cut. I kept pushing the hardware and software, but it never broke or let me down. My first cut was 2 hours and 49 minutes long, and we played it back on one Premiere Pro timeline in realtime. It was crazy!”
All of the visual effects were done at the full shooting resolution of 6144×2592, as was the color grade. Once Vashi had the basic cut in place, there was no real online conform, just some cleanup work to do before sending it to color as an 8TB stack of 6K frames. At that point, we started examining it from the three-screen perspective with three TVs to preview it in realtime, courtesy of the Mosaic functionality built into Nvidia’s Quadro GPU cards. Shots were realigned to avoid having important imagery in the seams, and some areas were stretched to compensate for the angle of the side screens from the audience’s perspective.
DP Michael Svitak and director Scott Waugh
Once we had the final color grade completed (via Mike Sowa at Technicolor using Autodesk Lustre), we spent a day in an Escape theater analyzing the reflections between the screens and their effect on contrast. We made a lot of adjustments to keep the luminance of the side screens from washing out the darks on the center screen, which you can’t simulate on TVs in the edit bay. “It was great to be able to make the final adjustments to the film in realtime in that environment. We could see the results immediately on all three screens and how they impacted the room,” says Waugh.
Once we added the 7.1 mix, we were ready to export assets for our delivery in many different formats and aspect ratios. Making the three streams for Escape playback was as simple as using the crop tool in Adobe Media Encoder to trim the sides in 2K increments.
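To illustrate the idea (a sketch only, not our actual delivery script: we used Media Encoder, and the file names and the use of ffmpeg’s crop filter here are my own assumptions), stepping a 2048-pixel-wide crop window across the 6144×858 master yields the three screens:

```python
import subprocess

# Hypothetical file names; each Escape screen gets a 2048x858 slice of the master.
master = "6below_escape_master.mov"
for i, screen in enumerate(["left", "center", "right"]):
    subprocess.run([
        "ffmpeg", "-i", master,
        "-vf", f"crop=2048:858:{i * 2048}:0",    # crop=width:height:x:y
        "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes HQ
        f"6below_{screen}.mov",
    ], check=True)
```

The only numbers doing any work are the 2048-pixel steps; everything else is packaging for the DCP encode downstream.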
How can you see movies in the Barco Escape format?
Barco maintains a list of theaters that have Escape screens installed, which can be found at ready2escape.com. But for readers in the LA area, the only opportunity to see a film in Barco Escape in the foreseeable future is to attend one of the Thursday night screenings of 6 Below at the Regal LA Live Stadium or the Cinemark XD at Howard Hughes Center. There are other locations available to see the film in standard theatrical format, but as a new technology, Barco Escape is only available in a limited number of locations. Hopefully, we will see more Escape films and locations to watch them in the future.
Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.
LumaForge, which designs and sells high-performance servers and shared storage appliances for video workflows, will be at IBC this year showing full support for new collaboration features in Adobe Premiere Pro CC. When combined with LumaForge’s Jellyfish or ShareStation post production servers, the new Adobe features — including multiple open projects and project locking — allow production groups and video editors to work more effectively with shared projects and assets. This is something that feature film and TV editors have been asking for from Adobe.
Project locking allows multiple users to work with the same content. In a narrative workflow, an editing team can divide their film into shared projects per reel or scene. An assistant editor can get to work synchronizing and logging one scene, while the editor begins assembling another. Once the assistant editor is finished with their scene, the editor can refresh their copy of the scene’s Shared Project and immediately see the changes.
An added benefit of using Shared Projects on productions with large amounts of footage is the significantly reduced load time of master projects. When a master project is broken into multiple shared project bins, footage from those shared projects is only loaded once that shared project is opened.
“Adobe Premiere Pro facilitates a broad range of editorial collaboration scenarios,” says Sue Skidmore, partner relations for Adobe Professional Video. “The LumaForge Jellyfish shared storage solution complements and supports them well.”
All LumaForge Jellyfish and LumaForge ShareStation servers will support the Premiere Pro CC collaboration features for both Mac OS and Windows users, connecting over 10Gb Ethernet.
CAN YOU DESCRIBE YOUR COMPANY?
PS260 is a boutique editorial house (with offices in Venice, California and New York City) specializing in commercials, music videos and features. We also have a motion graphics and visual effects department. We’re a small team that fosters real creativity and experimentation in the work that we do.
WHAT’S YOUR JOB TITLE?
Editor.
WHAT DOES THAT ENTAIL?
Editing is essentially taking video, audio and images and crafting them into the most effective telling of a story. It is an extremely collaborative process that involves many components — understanding the technology, working with directors/writers/creatives, coordinating sound and effects — but at its heart, editing is telling a compelling visual story over time.
WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I suppose for people who aren’t a part of the post process the most surprising thing might be how drastically a story can change in the edit. There’s that saying that a film (or short, or commercial, or whatever) is written three times: first as a script, then as it ends up being shot and finally as it’s edited. Very rarely does a final product end up as “boarded,” and I still find it amazing how such small changes can completely change the viewer’s idea of what’s happening on the screen — what if we held this shot so the character blinks weirdly one more time? Or how about we add a sound effect of his keys rustling? What if we open with the other character so now we’re in their POV for the rest of the scene?
WHAT’S YOUR FAVORITE PART OF THE JOB?
Editors can frequently act as fixers — with the stress and unpredictability of productions, things often don’t go the way they’re planned, and the editors are then tasked with making sense of a puzzle with missing pieces. I love being challenged to find some outrageous way to tell the story we want to tell with the material that’s in front of us.
WHAT’S YOUR LEAST FAVORITE?
Sometimes these creative solutions to production problems work really well, and sometimes they feel a bit lacking. It’s at these times you know the problems could have been solved if editorial had been involved earlier in the process. I’ve been lucky to be involved in some projects through both pre- and post-production; along with minimizing prep in post and allowing more time to be spent on creative editorial, that early involvement meant potential issues were caught and ironed out before they became bigger concerns.
WHAT IS YOUR FAVORITE TIME OF THE DAY?
Lunchtime is always a pretty great thing. PS260 makes sure we’re all well fed.
IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Probably desperately trying to garner YouTube hits for my speculative fiction essay videos.
HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I’ve been editing since I was a kid, re-editing Star Wars audio books on cassette tape to tell new stories. Later, I began capturing analog video that I shot or recorded from the TV with a Dazzle Movie Star box and editing in Adobe Premiere 5.0 (I never thought I’d go back to Premiere almost 20 years later, but I did). I always knew I wanted to work with video and loved to experiment with new effects and ways to craft a story, so I went to art school and got a degree in video art. After that I found I needed to make rent, so getting paid to do what I love was the easiest decision I’ve ever made.
CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Recently, I’ve done a lot of video work for Depeche Mode. The director Tim Saccenti and I created the live visuals for their current tour, along with two performance music videos and three 360 music videos, which should be out very soon!
I’ve also just finished a wonderful feature documentary, Illustrated Man, about tattooed men and the history of tattooing in NYC, with director Sophy Holland. Currently, I’m working on some videos for Elizabeth Arden, starring Reese Witherspoon.
YOU HAVE WORKED ON ALL SORTS OF PROJECTS. DO YOU PUT ON A DIFFERENT HAT WHEN CUTTING FOR A SPECIFIC GENRE?
The goal of every project is the same: to make the audience feel what you want them to feel, whether it’s laughter or sadness, or that rush of adrenaline as they’re making their way home from the theater. But each project comes with its own set of limitations.
With TV spots, you’re confined to 30 or 60 seconds and you have to temper your grand ideas of how best to tell the story with the economy of time, not to mention the sometimes limiting concerns of the brand or product you’re representing. Long form and features can allow you all the time you may need, but you have to be mindful of the audience’s attention span.
The best thing you can do is continually learn, and have at-the-ready techniques to help you with a specific form or genre, like knowing when to be in a wide or a close-up shot, using the camera’s distance to create tension or reveal emotion. Or in a comedy, for example, knowing not to reveal new information right after a big joke because the audience will miss it while they’re laughing (thanks Ren & Stimpy).
WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
My first feature, We’ve Forgotten More Than We Ever Knew, was an incredible learning experience for me. It taught me so much about how to cut dialogue, build out a scene and carry a character’s emotional arc across 90 minutes. Plus, it’s just a really cool film.
WHAT DO YOU USE TO EDIT?
Right now I’m using Adobe Premiere CC 2017. The recent updates have finally stolen me away from Avid and Final Cut.
WHAT IS YOUR FAVORITE PLUG-IN?
This is like asking someone what their favorite book or movie is, so it’ll probably change depending on the day of week. Right now I’m into using stock reverb plug-ins, or things like iZotope Vinyl to mix in sound elements in interesting ways. Tomorrow it could be star wipes.
ARE YOU OFTEN ASKED TO DO MORE THAN EDIT? IF SO, WHAT ELSE ARE YOU ASKED TO DO?
Definitely. Because of the progression of technology, clients are expecting more and more, and the divide between offline and online is narrowing. I do a lot of the online effects in the edit, whether it’s motion graphics, correcting eye lines when the actors stray, or comping split screens.
In the features and music videos I work on, I have a lot of freedom to work on bigger CG shots and effects set pieces, which is always a lot of fun.
NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Of course, no one can live without their phone nowadays. It does help a lot for my job as well, allowing me to remote in to my workstation to check on a render or reference an EDL at a session.
The other two would be my corded Apple full-size keyboard and Logitech M500 mouse. They’re amazingly simple tools, but they make things so much easier.
WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
It’s all about having a good work-life balance, which is an issue most editors have to grapple with. Fortunately, I have some amazing people in my life who make sure to occasionally pull me away from all the screens.
For UK-based father and son Paul and Josh Butterworth, working together on the short film 25 Million Reasons to Smile was a chance for both of them to show off their respective talents — Paul as an actor/producer and Josh as an aspiring filmmaker.
The film features two old friends, and literal partners in crime, who get together to enjoy the spoils of their labors after serving time in prison. After so many years apart, they are now able to explore a different and more intimate side of their relationship.
In addition to writing the piece, Josh served as DP and director, calling on his Canon 700D for the shoot. “I bought him that camera when he started film school in Manchester,” says Paul.
Josh and Paul Butterworth
The film stars Paul Butterworth (The Full Monty) and actor and dialect/voice coach Jon Sperry as the thieves, who are filled with regret and hope. 25 Million Reasons to Smile was shot in Southern California over the course of one day.
We reached out to the filmmakers to find out why they shot the short film, what they learned and how it was received.
With tools becoming more affordable these days, making a short is now an attainable goal. What are the benefits of creating something like 25 Million Reasons to Smile? Josh: It’s wonderful. Young and old aspiring filmmakers alike are so lucky to have the ability to make short films. This can lead to issues, however, because people can lose sight of what is important: character and story. What was so good about making 25 Million was the simplicity. One room, two brilliant actors, a cracking story and a camera is all you really need.
What about the edit? Paul: We had one hour and six minutes (a full day’s filming) to edit down to about six minutes, which we were told was a day’s work. An experienced editor starts at £500 a day, which would have been half our total budget in one bite! I budgeted £200 for edit, £100 for color grade and £100 for workflow.
At £200 a day, you’re looking at editors with very little experience, usually no professional broadcast work and often no show reel… so I took a risk and went for somebody who had a couple of shorts in good festivals, named Harry Baker. Josh provided a lot of notes on the story, and Harry went from there. And crucial cuts, like staying off the painting as long as possible and cutting to the outside of the cabin for the final lines — those ideas came from our executive producer, Ivana Massetti, who was brilliant.
How did you work with the colorist on the look of the film? Josh: I had a certain image in my head of getting as much light as possible into the room to show the beautiful painting in all its glory. When the colorist, Abhishek Hans, took the film, I gave him the freedom to do what he thought was best, and I was extremely happy with the results. He used Adobe Premiere Pro for the grade.
Paul: Josh was DP and director, so on the day he just shot the best shots he could using natural light — we didn’t have lights or a crew, not even a reflector. He just moved the actors round in the available light. Luckily, we had a brilliant white wall just a few feet away from the window and a great big Venice Beach sun, which flooded the room with light. The white walls bounced light everywhere.
The colorist gave Josh a page of notes on how he envisioned the color grade — different palettes for each character, how he’d go for the dominant character when it was a two-shot, and changing the color mood from beginning to end as the character arc/resolution changed and it went from heist movie to relationship movie.
What about the audio? Paul: I insisted Josh hire a professional Røde microphone and a TASCAM sound box from his university. This actually saved the shoot, as we didn’t have a sound person on the boom, and consequently the sound box wasn’t turned up… and also we swiveled the microphone rather than moving it between actors, so one had a reverb on the voice while the other didn’t.
The sound was unusable (too low), but since the gear was so good, sound designer Matt Snowden was able to boost it in post to broadcast standard without distortion. Sadly, he couldn’t do anything about the reverb.
Can you comment on the score? Paul: A BAFTA mate of mine, composer David Poore, offered to do the music for free. It was wonderful, and he was so professional. Dave already had a really good hold on the project, as we’d had long chats, but he took Josh’s notes and we ended up with a truly beautiful score.
Was the script followed to the letter? Any improvisations? Josh: No, not quite. Paul and Jon were great, and certainly added a lot to the dialogue through conversations before and during the shoot. Jon, especially, was very helpful in Americanizing his character, Jackson’s, dialogue.
Paul: Josh spent a long time on the script and worked on every word. We had script meetings at various LA cafes and table reads with me and Jon. On the shoot day, it was as written.
Josh ended up cutting one of my lines in the edit as it wasn’t entirely necessary, and the reverb was bad. It tightened it up. And our original ending had our hands touching on the bottle, but it didn’t look right so Josh went with the executive producer’s idea of going to the cabin.
What are the benefits of creating something like 25 Million Reasons to Smile? Paul: Wow! The benefits are amazing… as an actor I never realized the process. The filming is actually a tiny proportion of the entire process. It gave me the whole picture (I’m now in awe of how hard producers work, and that’s only after playing at it!) and showed how much of a team effort it is — how the direction, edit, sound design and color grade can rewrite the film. I can now appreciate how the actor doesn’t see the bigger picture and has no control over any of these elements. They are (rightly) fully immersed in their character, which is exactly what the actor’s role is: to turn up and do the lines.
I got a beautiful paid short film out of it, current footage for my show reel and a fantastic TV job — I was cast by Charles Sturridge in the new J.K. Rowling BBC1/HBO series Cormoran Strike as the dad of the female lead, Robin (Holliday Grainger). I’d had a few years out bringing Josh up and getting him into film school. I relaunched when he went to university, but my agent said I needed a current credit, as the career gap was causing casting directors problems. So I decided to take control and make my own footage — but it had to stand up on my show reel against clips like The Full Monty. If it wasn’t going to be broadcast-standard technically, then it had to have something in the script, and my acting had to show that I could still do the job (and my fellow actor had to be good too).
Josh met a producer in LA who’s given him runner work over here in England, and a senior producer with an international film company saw this and has given him an introduction to their people in Manchester. He also got a chance to write and direct a non-student short using industry professionals, which in the “real” world he might not get for years. And it came with real money and real consequences.
Josh, what did you learn from this experience from a filmmaker’s point of view?
More hands on deck is never a bad thing! It’s great having a tight-knit cast and crew, but the shoot would definitely have benefited from more people to help with lighting and sound, and the whole process would have run more smoothly.
Any surprises pop up? Any challenges? Josh: The shoot actually ran very smoothly. The one challenge we had to face was time. Every shot took longer than expected, and we nearly ran out of time but got everything we needed in the end. It helped having such professional and patient actors.
Paul: I was surprised how well Josh (at 20 years old and at the start of film school) directed two professional middle-aged actors. Especially as one was his dad… and I was surprised by how filmic his script was.
Any tips for those looking to do something similar? Josh: Once you have a story, find some good actors and just do it. As I said before, keep it simple and try to use character not plot to create drama.
Paul: Yes, my big tip would be to get the script right. Spend time and money on that and don’t film it till it’s ready. Get professional help/mentoring if you can. Secondly, use professional actors — just ask! You’d be surprised how many actors will take a project if the script and director are good. Of course, you need to pay them (not the full rate, but something).
Finally, don’t worry too much about the capture — as a producer said to me, “If I like a project I can buy in talent behind the camera. In a short I’m looking for a director’s voice and talent.”
Wax, an editorial house based in NYC, has added film and commercial editor Eddie Ringer. Ringer comes to Wax from Wildchild + Bonch in New York. Prior to that, he spent over eight years at Sausalito-based agency Butler Shine Stern + Partners (BSSP), where he edited and directed advertising projects spanning broadcast commercials, viral campaigns and branded content.
Ringer says he calls on his agency background for his editing work. “Working on the agency side I saw firsthand the tremendous amount of thought and hard work that goes into creating a campaign. I take this into consideration on every project. It focuses me. The baton has been passed, and it’s my responsibility to make sure the collective vision is carried through to the end.”
In addition to his agency experience, Ringer enjoys the way sound design can dictate the flow of the edit and stresses the importance of balancing the creative part with the commerce side of things and understanding why it works. “At the end of the day,” he notes, “we’re trying to connect with an audience to sell a product and a brand.”
Ringer’s first job with Wax was a new spot for ITV London promoting the horse-racing channel. It features momentum edits, hard cuts, energy and, of course, lots of sound design.
His tool of choice is Adobe Premiere Pro. “I made the switch to Premiere about four years ago and never looked back. I find the functionality more intuitive than other NLEs I’ve used in the past,” he says.
What do you need to know about the latest pro laptop from Apple? Well, the MacBook Pro is fast and light; the new Touch Bar is handy and sharp but not fully realized; the updated keys on the keyboard are surprisingly great; and working with ProRes QuickTime files in resolutions higher than 1920×1080 inside of FCP X, or any NLE for that matter, is blazing fast.
When I was tasked with reviewing the new MacBook Pro, I came into it with an open mind. After all, I did read a few other reviews that weren’t exactly glowing, but I love speed and innovation among professional workstation computers, so I was eager to test it myself.
I am pretty open-minded when it comes to operating systems and hardware. I love Apple products and I love Windows-based PCs. I think both have their place in our industry, and to be quite honest it’s really a bonus for me that I don’t rely heavily on one OS or get too tricked by the Command Key vs. Windows/Alt Key.
Let’s start with the call I had with the Apple folks as they gave me the lowdown on the new MacBook Pro. The Apple reps were nice, energetic, knowledgeable and extremely helpful. While I love Apple products, including this laptop, it’s not the be-all-end-all.
The Touch Bar is nice, but not a revolution. It feels like the first step in an evolution, a version 1 of an innovation that I am excited to see more of in later iterations. When I talked with the Apple folks they briefed me on what Tim Cook showed off in the reveal: emoji buttons, wide gamut display, new speakers and USB-C/Thunderbolt 3 connectivity.
They had an FCPX expert on the call, which was nice considering I planned on reviewing the MacBook Pro with a focus on the use of nonlinear editing apps, such as Adobe Premiere Pro, Avid Media Composer and Blackmagic’s Resolve. Don’t get me wrong, FCPX is growing on me — it’s snappy jumping around the timeline with ProRes 5K footage; assigning roles is something I wish every other app would pick up on; and the timeline is more of a breeze to use with the latest update.
The other side to this is that in my 13 years of working in television post, I have never worked on a show that primarily used FCP or FCPX to edit or finish. This doesn’t mean I don’t like the NLE; it simply means I haven’t relied on it in a professional working environment. Like I said, I really like the road it’s heading down, and if it works its way into mainstream broadcast or streaming platforms a little more, I am sure I will see it more frequently.
Furthermore, with the industry relying less and less on large groups of editors and finishing artists, apps like FCPX are poised to shine with their innovation. After all that blabbering, in this review I will touch on FCPX, but I really wanted to see how the MacBook Pro performed with the pro NLEs I encounter the most.
Let’s jump into the specs. I was sent a top-of-the-line 15-inch MacBook Pro with Touch Bar, which costs $3,499 if configured online. It comes with a quad-core Intel Core i7 2.9GHz (up to 3.8GHz using Turbo Boost) processor, 16GB of 2133MHz memory, a 1TB PCIe SSD and a Radeon Pro 460 with 4GB of memory. It’s loaded. I think the only upgrade beyond this configuration would be a 2TB hard drive, which would add another $800 to the price tag.
Physically, the MacBook Pro is awesome — very sturdy, very thin and very light. It feels great when you’re holding it and carrying it around. Apple even sent along a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, which costs an extra $29, and a USB-C to Lightning cable, which also costs an extra $29.
So yes, it feels great. Apple has made a great new MacBook Pro. Is it worth upgrading if you have a new-ish MacBook Pro at home already? Probably not, unless the Touch Bar really gets you going. The speed is not too far off from the previous version. However, if you have a lot of Thunderbolt 3/USB-C-connected peripherals, or plan on moving to them, then it is a good upgrade.
I ran some processor/graphics card-intensive tests while I had the new MacBook Pro and came to the conclusion that FCPX is not that much faster than Adobe Premiere Pro CC 2017 when working with non-ProRes-based media. Yes, FCPX tears through ProRes QuickTimes if you already have your media in that format. What about if you shoot on a camera like the Red and don’t want to transcode to a more edit-friendly codec? Well, that is another story. To test out my NLEs, I grabbed a sample Red 6K 6144×3160 23.98fps clip from the Red sample footage page, strung out a 10-minute-long sequence in each of the NLEs and exported both a color-graded version and a non-color-graded version as ProRes HQ QuickTime files matching the source file’s specs.
In order to work with Red media in some of the NLEs, you must download a few patches: for FCPX you must install the Red Apple workflow installer and for Media Composer you must install the Red AMA plug-in. Premiere doesn’t need anything extra.
Test 1: Red 6K 6144×3160 23.98fps R3D — 10-minute sequence (no color grade or FX) exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 57 minutes. Media Composer = two hours, 42 minutes. (Good news: Media Composer’s interface and fonts display correctly on the new display.)
You’ll notice that Resolve is missing from this list. That’s because I installed Resolve 12.5.4 Studio but then realized my USB dongle won’t fit into the USB-C port — and I am not buying an adapter for a laptop I don’t get to keep. So, unfortunately, I didn’t test a true 6K ProRes HQ export from Resolve, but in the last test you will see some Resolve results.
Overall, there was not much difference in speeds. In fact, I felt that Premiere Pro CC 2017 played the Red file a little more smoothly and at a higher frames-per-second count; FCPX struggled a little. Granted, a 6K Red file at full debayer quality is one of the harder files for a CPU to process, but Apple touts this as a semi-replacement for the Mac Pro for the time being, and I am holding them to their word.
Test 2: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 58 minutes. Media Composer = two hours, 34 minutes.
It’s important to note that the GPU definitely helped out in both Adobe Premiere and FCPX: little to no extra time was added to the ProRes HQ export. I was really excited to see this because sometimes, without a good GPU, resizing and GPU-accelerated effects like color correction will slow your system to a snail’s pace, if it doesn’t fully crash. Media Composer surprisingly sped up its export when I added the color grade as a new color layer in the timeline. By putting the color correction on another layer, Avid might have forced the Radeon to kick in and help push the file out. Not really sure what that is about, to be honest.
Test 3: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence resized to 1920×1080 on export as ProRes HQ. Premiere > Media Encoder = one hour, 16 minutes. FCPX = one hour, 14 minutes. Media Composer = one hour, 48 minutes. Resolve = one hour, 16 minutes.
So after these tests, it seems that exporting and transcoding speeds are all about the same. It doesn’t really come as too big of a surprise that all the NLEs, except for Media Composer, processed the Red file in the same amount of time. Regardless of the NLE, you would need to knock the debayering down to half or more to start playing these clips in realtime in a timeline. If you have the time to transcode to ProRes, you will get much better playback and rendering speeds. Obviously, transcoding all of your files to a codec like ProRes or Avid DNx takes way more time up front, but it could be worth it if you’re crunched for time on the back end.
In addition to Red 6K files, I also tested ProRes HQ 4K files inside of Premiere and FCPX, and both played them extremely smoothly without hiccups, which is pretty amazing. Just a few years ago I was having trouble playing back 10:1 compressed files in Media Composer, and now I can play back superb-quality 4K files without a problem — a tremendous tip of the hat to technology and, specifically, to Apple for putting so much power in a thin and light package.
While I was in the mood to test speeds, I hooked up a Thunderbolt 2 SSD RAID (OWC Thunderbay 4 mini) configured in RAID-0 to see what kind of read/write bandwidth I would get running through the Apple Thunderbolt 3 to Thunderbolt 2 adapter. I used both AJA System Test as well as the Blackmagic Disk Speed Test. The AJA test reported a write speed of 929MB/sec. and read speed of 1120MB/sec. The Blackmagic test reported a write speed of 683.1MB/sec. and 704.7MB/sec. from different tests and a read speed of 1023.3MB/sec. I set the test file for both at 4GB. These speeds are faster than what I have previously found when testing this same Thunderbolt 2 SSD RAID on other systems.
For comparison, the AJA test reported a write speed of 1921MB/sec. and read speed of 2134MB/sec. when running on the system drive. The Blackmagic test doesn’t allow for testing on the system drive.
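For the curious, here’s roughly what those benchmark utilities are measuring. This is a minimal homebrew sketch, not a substitute for either tool — the volume path is a placeholder, and real benchmarks work much harder to defeat OS caching (the read number here may be flattered by the file cache).

```python
# Rough sequential write/read throughput test, in the spirit of the AJA and
# Blackmagic utilities. Path is a hypothetical RAID mount point.
import os
import time

TEST_FILE = "/Volumes/Thunderbay/speedtest.bin"   # placeholder volume
CHUNK = 64 * 1024 * 1024                          # 64MB per write
TOTAL = 4 * 1024 * 1024 * 1024                    # 4GB, matching the tests above

def write_speed() -> float:
    data = os.urandom(CHUNK)
    start = time.time()
    with open(TEST_FILE, "wb") as f:
        for _ in range(TOTAL // CHUNK):
            f.write(data)
        f.flush()
        os.fsync(f.fileno())  # force data to disk so the timing is honest
    return TOTAL / (time.time() - start) / 1e6    # MB/sec

def read_speed() -> float:
    start = time.time()
    with open(TEST_FILE, "rb") as f:
        while f.read(CHUNK):
            pass
    return TOTAL / (time.time() - start) / 1e6

print(f"write: {write_speed():.0f} MB/sec, read: {read_speed():.0f} MB/sec")
```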
What Else You Need to Know
So what about the other upgrades and improvements? When exporting these R3D files, I noticed the fan kicked on when resizing or adding color grading to the files. It seems the GPU kicked in and heated up, which is to be expected. The fan is not the loudest, but it is noticeable.
The battery life on the new MacBook Pro is great when just playing music, surfing the web or writing product reviews. I found that the battery lasted about two days without having to plug in the power adapter. However, when exporting QuickTimes from either Premiere or FCPX the battery life dropped — a lot. I was getting a battery life of one hour and six minutes, which is not good when your export will take two hours. Obviously, you need to plug in when doing heavy work; you don’t really have an option.
This leads me to the new USB-C/Thunderbolt 3 ports — and, yes, you still have a headphone jack (thank goodness they didn’t talk with the iPhone developers). First off, I thought the MagSafe power adapter should have won a Nobel Peace Prize. I love it. It must be responsible for saving millions of dollars in equipment when people trip over a power cord — gracefully disconnecting without breaking or pulling your laptop off the table. However, I am disappointed Apple didn’t create a new type of MagSafe cable with the USB-C port. I will miss it a lot. The good news is you can now plug in your power adapter to either side of the MacBook Pro.
Adapters and dongles will have to be purchased if you pick up a new MacBook Pro. Each time I reached for an external peripheral — an SD card, the Tangent Ripple color correction panel or an external hard drive — I was disappointed that I couldn’t plug it in directly. A good Thunderbolt 3 dock is a necessity, in my opinion. You could survive with dongles, but my OCD starts flaring up when I have to dig around my backpack for adapters. I’m just not a fan. I love how Apple dedicated itself to fast I/O like USB-C/Thunderbolt 3, but I really wish it had given the transition another year. Just one old-school USB port would have been nice. I might have even gotten over no SD card reader.
The Touch Bar
I like it. I would even say that I love it — in the apps that are compatible. Right now there aren’t many. Adobe released an update to Photoshop that added compatibility with the Touch Bar, and it is really handy, especially when you don’t have your Wacom tablet available (or a USB dongle to attach it). I love how it puts so many levels of your tools’ functionality within immediate reach.
It has super-fast feedback. When I adjusted the contrast from the Touch Bar, the MacBook Pro responded immediately. This becomes even more evident in FCPX and the latest Resolve 12.5.4 update. It’s clear Apple did its homework and made its own apps, like Mail and Messages, work with the Touch Bar (hence the emojis). FCPX has a sweet ability to scrub the timeline, zoom in to the timeline, adjust text and more from just the Touch Bar — it’s very handy, and after a while I began missing it when using other computers.
In Blackmagic’s latest DaVinci Resolve release, 12.5.4, they have added Touch Bar compatibility. If you can’t plug in your color correction panels, the Touch Bar does a nice job of easing the pain. You can do anything from contrast work to saturation, even adjust the midtones and printer lights, all from the Touch Bar. If you use external input devices a lot, like Wacom tablets or color correction panels, the Touch Bar will be right up your alley.
One thing I found missing was a simple application launcher on the Touch Bar. If you do pick up the new MacBook Pro with Touch Bar, you might want to download Touch Switcher, a free app I found via 9to5mac.com that allows you to have an app launcher on your Touch Bar. You can hide the dock, allowing you more screen real estate and the efficient use of the Touch Bar to launch apps. I am kind of surprised Apple didn’t make something like this standard.
From a purely superficial and non-scientific point of view, the newly updated P3-compatible wide-gamut display looks great… really great, actually. The colors are rich and vibrant. I did a little digging under the hood and noticed that it is an 8-bit display (you can find this by locating the pixel depth under System Information > Graphics/Displays), which might limit color gradations when working in a color space like P3, as opposed to a 10-bit display working in P3. Simply put, you have a wider array of colors in P3 but a smaller number of shades to fill it up.
The MacBook Pro display is labeled as 32-bit color, meaning the RGB and alpha channels each have 8 bits, giving a total of 32 bits. Eight-bit color gives 256 shades per color channel, while 10-bit gives 1,024 shades per channel, allowing for much smoother transitions between colors and luminance values (imagine a sky at dusk going smoothly from orange to light blue to dark blue — more shades per channel allow for a smoother gradient between lights and darks). A 10-bit display would be labeled 30-bit color, with each channel having 10 bits.
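To make the arithmetic concrete, here is the shades-per-channel math in a few lines of Python — nothing Apple-specific, just the powers of two behind the 8-bit/10-bit comparison:

```python
# Shades per channel is 2**bits; total displayable colors is that value
# cubed (one factor each for R, G and B -- alpha bits don't add colors).
for bits in (8, 10):
    shades = 2 ** bits
    total = shades ** 3
    print(f"{bits}-bit: {shades:,} shades per channel, "
          f"{total:,} total RGB colors")

# 8-bit: 256 shades per channel, 16,777,216 total RGB colors
# 10-bit: 1,024 shades per channel, 1,073,741,824 total RGB colors
```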
I tried to hook up a 10-bit display, but the supplied Thunderbolt 3 to Thunderbolt 2 dongle Apple sent me did not work with Mini DisplayPort. I did a little digging, and it seems people are generally not happy that Apple doesn’t allow this to work, especially since Thunderbolt 2 and Mini DisplayPort use the same connector. Some people have been able to get around this by daisy-chaining the display through something like a Thunderbolt 2 RAID.
While I couldn’t directly test an external display when I had the MacBook Pro, I’ve read that people have been able to push 10-bit color out of the USB-C/Thunderbolt 3 ports to an external monitor. So as long as you are at a desk with a monitor you can most likely have 10-bit color output from this system.
I reached out to Apple on the types of adapters they recommend for an external display and they suggest a USB-C to DisplayPort adapter made by Aukey. It retails for $9.99. They also recommend the USB-C to DisplayPort cable from StarTech, which retails for $39.99. Make sure you read the reviews on Amazon because the experience people have with this varies wildly. I was not able to test either of these so I cannot give my personal opinion.
In the end, the new MacBook Pro is awesome. If you own a recent MacBook Pro and don’t have $3,500 to spare, I don’t know that this is the update you’re looking for. If you are trying to avoid moving to a Windows-based PC because of the lack of Mac Pro updates, this may ease the pain slightly. Without more than 16GB of memory and an Intel Xeon or two, however, it might actually slow you down.
The battery life is great when doing light work — one of the longest-lasting batteries I’ve used on a laptop. But when doing heavy work, you need to be near an outlet. And when plugged into that outlet, be careful no one yanks the USB-C power cable, as it might throw your MacBook Pro to the ground or break off inside.
I really do love Apple products. They typically just work. I didn’t even touch on the new Touch ID Sensor that can immediately switch you to a different profile or log you in after waking up the MacBook Pro from sleep. I love that you can turn the new MacBook Pro on and it simply works, and works fast.
The latest iteration of FCPX is awesome as well, and just because I don’t see it being used a lot professionally doesn’t mean it shouldn’t be. It’s a well-built NLE that deserves a fairer shake than it has gotten. If you are itching for an update to an old MacBook Pro, and don’t mind having a dock or carrying around a bunch of dongles, then the 2016 MacBook Pro with Touch Bar is for you.
The new MacBook Pro chews through ProRes-based media from 1920×1080 to 4K; 6K and higher will play but might slow down. If you are a Red footage user, this new MacBook Pro works great, but you still might have to knock the debayering down a couple of notches.
Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at email@example.com. Follow him on Twitter @allbetzroff.
WHAT’S YOUR JOB TITLE?
I’m a film editor working in commercials, feature films and experiential media projects. My experience is with brands such as Nike, BMW, Time Warner, Ford, Jeep, Coca-Cola, Scion, Altoids, LG, Sega, Dyson, Warner Bros., USPS, Chrysler, L’Oréal, Neutrogena, Budweiser, LucasArts, Dodge, Adidas, Toyota and AT&T.
CAN YOU DESCRIBE YOUR COMPANY?
I am currently working freelance in the Los Angeles and New York markets. I have worked with Jump, Optimus, Cosmo Street and Red Car.
WHAT DO YOU EDIT ON?
I use Avid Media Composer and Adobe Premiere Pro — visual effects, sound design and music are all-inclusive.
WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT YOUR ROLE AS AN EDITOR?
I nurture projects from award to final mix and delivery. Once on board a project, I’m there every step of the way.
Bob Mori in his edit suite.
WHAT’S YOUR FAVORITE PART OF THE JOB?
I have a few: working with a creative team that understands everyone’s participation is important to success; elevation of the concept even at the smallest level; dedication, conceptual thought and grace through years of my own personal experience; and watching an audience’s reaction to all that hard work!
WHAT DOES THAT ENTAIL?
Being nimble and wearing many hats throughout the process. Having aesthetic and technical knowledge that includes different approaches. Starting again never scares me.
WHAT’S YOUR LEAST FAVORITE?
Running out of time and knowing that we could have made it better. Everyone has experienced this, and it’s a reality we’ve all come to accept.
WHAT IS YOUR FAVORITE TIME OF THE DAY?
Watching dailies. Although, now we call them “media assets”… right? Usually it’s at double speed due to time constraints.
IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I never had a “Plan B” mapped out. This is bliss for me. I love what I do and can’t dream of doing anything else.
HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
That die was cast early on for me. I shot and edited films on 8mm film in high school. I was a film major at Columbia College in Chicago.
CAN YOU NAME SOME RECENT PROJECTS YOU’VE WORKED ON?
In late 2015, my commercial projects included Equifax, Home Advisor and South of Wilshire.
I also did work for On the Record With Mick Rock. Rock is a British photographer, best known for his iconic shots of rock and roll legends such as Queen, David Bowie, Syd Barrett, Lou Reed, Iggy Pop and The Sex Pistols.
WHAT RECENT PROJECT ARE YOU MOST PROUD OF?
Who’s Driving Doug just had its world premiere at the Santa Barbara International Film Festival. It opened in theaters and on VOD on February 26. The film stars RJ Mitte of Breaking Bad, Paloma Kwiatkowski of Bates Motel, Ray William Johnson of Equals Three, and Daphne Zuniga.
NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Premiere Pro CC, Avid Media Composer, Photoshop and, of course, my iPhone.
WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Every. Single. One. Facebook, Instagram, you name it. I’m easy to find and communicate with everywhere in the world.
DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Music is essential to picture cutting. Not just popular music, but every genre. Some of my best friends are music composers. And usually (with rare exception) music is half of what you are emotionally feeling while watching.
WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Yoga. Hiking. The beach. Walking my dog Bruno (who is in our main photo). And escaping into film. As Frank Capra famously said, “As with heroin, the antidote to film is more film.”
Director Tim Miller’s Deadpool is action-packed, vulgar (in a good way) and a ton of fun. It’s also one of the few Hollywood blockbusters to be edited entirely on Adobe’s Premiere Pro.
On the Saturday following the film’s release, Adobe hosted a panel on the Fox Studios lot that included Deadpool’s post supervisor Joan Bierman, first assistant editor Matt Carson and Adobe consultants Vashi Nedomansky and Mike Kanfer. Here are some takeaways…
Why Premiere Pro?
According to Bierman, much of the credit for choosing Premiere Pro for the edit goes to Tim Miller. “Even before we had a crew, Tim knew he wanted to do this,” she said. Miller, a first-time feature director, is no stranger to technology — he is co-founder of Culver City’s Blur Studio, which specializes in visual effects and animation.
Miller’s friend, director David Fincher, is a big advocate of Adobe Premiere. It’s likely that Fincher’s use of it to edit Gone Girl — the first feature cut with the product — inspired Miller. The rest of the credit goes to Ted Gagliano, president of post production at Fox, for giving the go-ahead for the road less taken.
Training and Storage
The first step in this undertaking was getting all the editors and assistants — who were used to editing on Media Composer and Final Cut — trained on Premiere. So they brought in editor Vashi Nedomansky, a Premiere Pro workflow consultant, who spent an initial three weeks training all five editors and establishing the workflow. He then returned for at least 12 days over the next nine months to further refine the workflow and answer questions both technical and editorial.
Additionally, he showed them features that are unique to Premiere, such as Dynamic Linking to After Effects projects and tapping the tilde (~) key to “full screen” the workspace section. “In our shared editing environment, because the editors were all coming from an Avid workflow, we treated Premiere Pro sequences as Avid bins,” explained Nedomansky. “Because Premiere Pro only allows one open project at a time… we shared sequences like you would share bins in Avid to allow all the editors access to the latest cuts.”
The next step was to get the multi-user editorial environment set up. They wanted to have several users, assistant editors and editors, get in and start working on the film simultaneously, without crashing into each other and corrupting files.
Jeff Brue’s Open Drives provided storage for the film via its product Velocity, which delivered 180TB of solid-state storage. With 5GB/s of “normal” throughput, the team had projects that would open in less than two minutes.
The solution to the multi-user access problem was much simpler and lower tech. When someone was working on a project file, they would move it to their named directory so nobody opened it mid-edit. Then, once they were done, they moved it back. So a little discipline went a long way in making sure that sharing media in a multi-user environment was stress-free.
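The panel described this as manual discipline rather than software, but the convention is simple enough to sketch as a script. Everything below is hypothetical — the mount point, the folder layout, the function names — it just encodes the “move it to your named directory before opening, move it back when done” rule:

```python
# A minimal sketch of the Deadpool team's manual check-out discipline.
# Paths and names are placeholders, not their actual setup.
import shutil
from pathlib import Path

SHARED = Path("/Volumes/OpenDrives/projects")   # hypothetical shared directory

def check_out(project: str, editor: str) -> Path:
    """Move a project file into the editor's named folder so nobody else opens it."""
    mine = SHARED / editor
    mine.mkdir(exist_ok=True)
    src = SHARED / project
    if not src.exists():
        raise FileNotFoundError(f"{project} is already checked out or missing")
    return Path(shutil.move(str(src), str(mine / project)))

def check_in(project: str, editor: str) -> Path:
    """Move the project back to the shared directory after the edit session."""
    return Path(shutil.move(str(SHARED / editor / project), str(SHARED / project)))

# check_out("reel5.prproj", "editor_a")  ... edit ...  check_in("reel5.prproj", "editor_a")
```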
When they needed a sequence in a project, they were able to link to it from another Premiere project without harming the source project. All of this allowed them to keep everything, as Nedomansky put it, “contained, safe and sharable.”
Re-Framing and Multi-Format Shooting
With all this in place, the team was ready to start cutting the wide array of footage the crew was producing. The film was shot primarily on the Arri Alexa at 3.2K RAW, but footage was also captured on 5K and 6K Red cameras and at least one Phantom. All of the footage was downsized to the common container format of 2048×1152 for the offline in Premiere and encoded in ProRes LT. This allowed them to do a center extraction, which gave the director and editor the ability to re-frame when they wanted to.
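The article doesn’t say what tool generated those offline files, so here is a hedged sketch of the same recipe using ffmpeg, assuming you have it installed: downsize to the 2048×1152 container and encode ProRes LT. Profile 1 of ffmpeg’s prores_ks encoder corresponds to LT; the filenames are placeholders.

```python
# Sketch of an offline proxy transcode in the spirit of the workflow above.
import subprocess

def make_offline_proxy(src: str, dst: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", "scale=2048:1152",   # the common offline container size
        "-c:v", "prores_ks",
        "-profile:v", "1",          # 1 = ProRes LT
        "-c:a", "pcm_s16le",        # keep audio simple for the offline
        dst,
    ], check=True)

make_offline_proxy("A001_C001.mov", "A001_C001_proxy.mov")
```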
For the online, they went back to the Arri RAW, or other RAW formats, depending on their needs. The center extraction gave them a lot of creative freedom — so much so that they reframed the entire movie in the online. “If I had it to do over again, I would have done it [the reframing] in a cheaper room,” said Bierman.
Throughout the edit, the post team was burning its way through Mac Pros — the machines were having an issue with their AMD FirePro D700 cards in OS X. In all, the team burned through 10 of the cards, which would occasionally melt down during renders.
“There were some incredibly complex reels on Deadpool,” said Kanfer. “At one point midway through the production, reel five was taking over 10 minutes to load. Our engineers quickly regrouped, and within a week were able to optimize the situation; the same reel took only 2 1/2 minutes to load once the fix was made. Other less complex reels in the film loaded in a minute or less.”
Vashi Nedomansky, Matt Carson and Joan Bierman.
The sound team had to create a slight workaround for audio turnovers. In a traditional Avid workflow, the handoff to Avid Pro Tools is relatively seamless — as you would expect, since they are made by the same company — but going from Premiere required a little more effort. The package was the same as a normal sound turnover, including QuickTimes, guide tracks and EDLs, along with the AIFFs. The trouble occurred when the conform wasn’t always in sync with what had been turned over.
Adobe looks at all of this as an opportunity to make its product even stronger. The company said its engineers “love to tackle problems, and there is no better place to tackle those problems than live on an edit.”
Final Takeaway
Even with training the editorial team to use a new program, working through audio conform hiccups and a pile of dead Mac towers, the team produced a polished film that had the best opening weekend for an R-rated film in history.
With improved sound turnover options hinted at for future versions of Premiere, we will very likely see more “Edited with Adobe Premiere Pro” logos in future film end credits.
If it seems like I’m reviewing Rampant Design Tools’ latest releases every few months, it’s because I am. Sean and Stefanie Mullen, the creators of Rampant Design Tools, are creating brand new sets of overlays, transitions, paint strokes, flares and tons of other tools every month.
Typically when I do reviews there isn’t much personal interaction with the business owners, but Sean and Stefanie made themselves available for questions every step of the way. Even when I’m not doing a Rampant review, I email them, and they are always ready to help and even give advice. For them it’s about their customers, and they are continually releasing top-shelf tools that I believe every editor and motion graphics artist should have in their toolbox.
Before I get into what is new, you should download their free samples at www.4kfree.com. Almost every editor I show these to says, “I had no idea that’s what those were. I thought they were just stock footage elements.” Rampant Design Tools are not stock footage elements; they are color overlays, animated motion graphics elements, transitions, glitches and more. They are elements that can be used in any program that can apply an Add, Multiply, Screen or any other composite mode to footage — really, any NLE or VFX app made. If you are a Blackmagic Design DaVinci Resolve user, you can jump into the Edit page, place the Rampant clip on top of your original clip, select the Rampant clip, open the Inspector and choose your desired mode from the composite mode pop-up menu.
Typically, Add mode will do the job, but each mode has some cool differences that you will want to try out for yourself — for a stark contrast check out Hard Light. If you are an Avid Media Composer or Symphony user, check out my previous write-up on discovering the elusive composite or blending modes within Media Composer: https://postperspective.com/tutorial-blending-modes-rampant-inside-media-composer.
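Those composite modes are just per-pixel arithmetic, which is why they work in any NLE or VFX app. Here is a minimal NumPy sketch of the three modes named above, operating on frames normalized to the 0.0–1.0 range (the frames themselves are random stand-ins for real video):

```python
# Per-pixel math behind Add, Multiply and Screen composite modes.
import numpy as np

def add_mode(base: np.ndarray, overlay: np.ndarray) -> np.ndarray:
    return np.clip(base + overlay, 0.0, 1.0)    # brightens; clips at white

def multiply_mode(base: np.ndarray, overlay: np.ndarray) -> np.ndarray:
    return base * overlay                        # darkens; white is neutral

def screen_mode(base: np.ndarray, overlay: np.ndarray) -> np.ndarray:
    return 1.0 - (1.0 - base) * (1.0 - overlay)  # brightens; black is neutral

frame = np.random.rand(1080, 1920, 3)        # stand-in for a video frame
light_leak = np.random.rand(1080, 1920, 3)   # stand-in for an overlay element
out = screen_mode(frame, light_leak)
```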
I think of Rampant offerings as quick and efficient tools that can add texture and interest to footage. In their latest rollout of releases, Rampant has sets of Designer Overlays, Film Burns, Matte Transitions, Flare Transitions, Glitch Transitions, Paint Stroke Transitions, and even animated motion graphics for editors. I’ll go into a few of the ones I find particularly interesting, but to find out more check out http://rampantdesigntools.com/rampant-all-products.
Matte Transitions are really useful. Not only can they be used traditionally, as transitions between scenes or footage, but they can also be used in non-traditional ways, such as revealing a color treatment or effect. In Adobe Premiere, I will duplicate my footage in the timeline, apply a unique color treatment to the duplicate, add the Set Matte effect and tell it to use the alpha channel of the Matte Transition. While this is a unique way to transition a color effect, the technique can be used in all sorts of circumstances.
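The math behind that Set Matte trick is just as simple, continuing the NumPy sketch above: the transition’s alpha channel mixes between the graded duplicate and the original, per pixel.

```python
import numpy as np

def matte_reveal(original: np.ndarray, graded: np.ndarray,
                 matte_alpha: np.ndarray) -> np.ndarray:
    # matte_alpha is a 2D array of 0.0-1.0 values; 1.0 shows the graded clip
    a = matte_alpha[..., None]               # broadcast alpha over RGB channels
    return graded * a + original * (1.0 - a)

# As the Matte Transition plays, its alpha animates from all zeros toward all
# ones, wiping the color treatment across the frame.
```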
Designer Overlays Sample
My favorite is when a producer or even another editor comes in and just wants something different; they don’t know what they want but they know it needs to be totally different. You can easily throw on a few different Rampant Design Tool overlays and get very different treatments quickly. You can even use the mattes to reveal text in a lower third or main title. It really adds depth to your work.
Paint Strokes are a really cool way to reveal or transition out of text or footage. I really like to use these to reveal color in a scene. Recently, I used it on a very desaturated piece I was working on. In the last 10 seconds of the piece I used a Paint Stroke to add a vibrant splash of paint to the project. The client really liked how it left a lasting impression of vibrancy and color.
If you have seen what is going on in the land of YouTube, you might have noticed how flashy and eye-catching the videos are (and if you haven’t, you’d better get over there and get inspired before you are asked to work on something and end up under-delivering in the “wow” department). One thing that gets tricky is designing new or altered transitions. Rampant Design has tons of transitions that are great to have in your editor’s toolbox, from the ever-popular Glitch transitions to Flares, Paint Strokes and even Color Overlays. I sometimes like to add a white flash under a light leak to turn it into a transition.
Motion Graphics for Editors
Finally, my interest was captured by the Motion Graphics for Editors bundle. It contains lots of motion graphics elements, such as Grids, Signs, Rays, Loaders, Lines, pre-made aspect ratios and even Triangles. These little elements can take a ton of time to create. Usually, if you are looking for them, you are an editor who knows enough about motion graphics to be dangerous but who doesn’t have time to build each element individually. Some uses include lower thirds — which would otherwise be a boring gradient with text over the top — and infographics. While infographics seem easy, they are most definitely not; they take tons of time if you want them to look great. With Rampant’s alpha channels, these elements are really easy to use.
In the end, if you are looking for elements that are not stock footage, but instead handcrafted elements like organic paint strokes or unique Designer Overlays, you need to get over to www.rampantdesigntools.com. I have experienced firsthand the power these elements have. I’ve been at the end of my rope on projects that weren’t paying enough to justify the drain on my brain power; then, remembering I had Rampant Design Tools, I spent about an hour applying about 20 different treatments, transitions and effects to footage, color and text.
Film Burns and Matte Transition
In the end, the client was happy, and I was happy that I didn’t have to spend my time creating the elements from scratch. Rampant Design Tools takes projects to the next level quickly and easily — drag and drop, work faster and more efficiently, and make more money in the process. I leave you with these highlights: unique non-serialized graphic overlays; the ability to easily combine color corrections into unique color grades; and the newly added Motion Graphics for Editors.
Brady Betzel is an online editor at Margarita Mix in Hollywood. Previously, he was editing The Real World at Bunim Murray Productions. You can email Brady at firstname.lastname@example.org, and follow him on Twitter, @allbetzroff.
What is a Trim Session? This blog postulates that Trim Session is a newly enhanced and highly nuanced approach to trim editing in Premiere Pro CC 2015. I’m suggesting three feature requests that will further establish Trim Session editing as an efficient editing workflow in Premiere Pro… I’ve even included a video demonstration of the requests.
When it comes to NLE updates, the hope of every editor is two-fold: improved media management and faster editing. Additional features are just a bonus. So, let’s ignore the Lumetri color panel for a moment and look at an enhancement in Premiere Pro CC 2015 that improves editing. This feature is hidden in the terminology of the Revert Trim Session button.
Revert Trim Session
In a previous post, I hinted that a Revert Trim Session has deeper implications beyond its own functionality. Here’s the Revert Trim Session feature description in the CC 2015 release notes: “A Revert Trim Session button can be added to the Program Monitor to enable an edit point to be returned to its original position before Trim Mode was entered (Premiere Pro Blog).”
But what is a Trim Session? There is no documentation for it, other than it can be reverted. You won’t find either “trim” or “session” anywhere in the Premiere Pro CC 2014 press release. Perhaps it’s an expanded capability of Trim Mode? Then why not call it Reset Trim Mode? No, Trim Session feels more nuanced than just editing in Trim Mode. So what is it besides semantics?
Trim Session Defined
This video defines Trim Session and how it can be improved with three feature requests.
It’s logical to infer that a Trim Session includes all trim activity between the time Trim Mode was entered and exited. In other words, a Trim Session is a series of Trim Mode edits that Premiere Pro treats as one event. Keep in mind, editing in Trim Mode supports continuous loop playback. Now, apply this to editing a radio edit or finessing a rough cut in the timeline. Suddenly, Trim Session begins to look and feel different than other timeline editing methods. Instead of click-and-drag trimming or in-and-out-point editing — both of which require playback to be stopped and manually reset to review changes — Trim Session enables continuous, uninterrupted trim editing, which can be collectively undone in one command. Dynamite.
An Enhanced Trim Editing Workflow
Through the Revert Trim Session feature, Adobe has, knowingly or unknowingly, introduced a new term for an enhanced trim editing workflow. However, it’s as if Trim Session has been teased without fully delivering on its obvious strengths. Certain functionality is still needed in order to establish Trim Session as its own editing workflow in Premiere Pro. The absence of three features in particular prevents Trim Session from achieving its full potential.
Trim Session Feature Requests
1. The ability to jump between edit points during loop playback in a Trim Session.
2. The ability to Toggle Trim Type during loop playback in a Trim Session.
3. The ability to toggle target audio/video tracks during loop playback in a Trim Session.
Trim Session editing will truly come into its own when given the capability to switch trim type, jump between edit points and toggle target audio/video tracks without having to stop loop playback. This will enable Premiere Pro editors to work down the timeline, trimming edit points on specific tracks more quickly, and all as one collective event. This ability to continuously adjust and collectively undo without interrupting playback will make Trim Session a very attractive editing method to all Premiere Pro editors.
Note, continuous loop playback is an advantage of Trim Session editing, not a requirement. Loop playback can be paused for fine-tuned edits and still be contained within a Trim Session.
The key message of this post is not about reinventing the wheel or adopting new terminology, but improving Premiere Pro functionality for its users and satisfying the desire for faster editing workflows. Whether or not it’s called “Trim Session” is beside the point. Call it whatever you want; these three feature requests provide the functionality Revert Trim Session inherently suggests, but Premiere Pro does not yet provide. Not to mention, they also give editing in Trim Mode the appeal it never really had.
If you would like to support these three feature requests, simply copy the feature request text above and paste it into Adobe’s Feature Request form.
I realize feature requests are largely opinionated and, if you read this far, I want to thank you for investing your time in something I feel passionate about.
Premiere Bro is the alias for Sean Schools. Sean is the video editor for JK Design. He is a Full Sail University graduate who did time in Brooklyn. You can email Sean at email@example.com, and follow him on Twitter @premierebro. You will also find this blog on his website www.premierebro.com.
Utopic editor Katherine Pryor didn’t grow up a racing fan, but a recent short film for an iconic car company turned her head. New York City-based production company ADDigital and Chicago edit house Utopic teamed up on a documentary-style film for Porsche. Director Sam Ciaramitaro and Pryor worked side by side on the web offering, via agency Cramer-Krasselt Chicago.
The five-minute-plus film, called The Enduring Bond, is the first of two long-form projects Ciaramitaro and Pryor are slated to collaborate on. The second one will shoot at Road Atlanta this fall. The Enduring Bond, which shot over four days this past March, offers a fly-on-the-wall style and features two personal stories: one showing how much the 12 Hours of Sebring endurance race means to the crews and drivers from Porsche, and another following a family that attends the race year after year.
We reached out to Pryor to find out more about editing the project, working through 25 hours of footage and collaborating with the director.
How early did you get involved in the project?
I had several conversations with director Sam Ciaramitaro prior to production. I had worked with him before, so when we knew we were teaming up again for this one, he would send me ideas and we would discuss things like music, style and pacing.
Can you talk about how you worked with them before the edit process began?
I was given some “homework” before production. There were a few docs I watched as examples of great techniques for fly-on-the-wall-style documentaries. In addition to that, I had a learning curve for what endurance racing actually is. I had never seen a car race or understood the culture of fans and drivers, so I watched the feature film Le Mans (Steve McQueen) and Senna, a documentary composed entirely of found footage about Formula 1 driver Ayrton Senna. This all helped get me into the driver’s-seat POV, so to speak.
You mentioned that you and the director had worked together in the past. That must have made things a bit easier?
Sam and I have a great shorthand. There was a lot of collaboration back and forth during post. Also, it was great to be on set and see him come back from a location excited to tell me what they had just captured and what to keep my eyes open for as I looked through footage.
So you were on set, not near set?
The set was the entire track — 3.74 miles — and the surrounding areas where fans were camped out all weekend. We were set up in a room where media and TV people were stationed, behind the grandstand. I was editing on set with my assistant Christen Nehmer. On several occasions we were able to hop in a golf cart and head out to a location for parts of the shoot. I really couldn’t have done this without her there – she was syncing interviews and logging wild sounds right alongside me so I could focus on pulling selects.
It was essential to be there to have a grasp on where and how everything was shot, and how sound was captured. We were also able to go into the pit area to observe. Having been at a race track for four days, I can now distinguish the sound of a Porsche engine from any other race car!
L-R: DP Steven Huber and director Sam Ciaramitaro.
What was the piece shot on, and how did they come to that specific format/camera?
We had two very agile and talented DPs — Anthony Arendt and Steve Huber — who shot on Sony A7S cameras paired with Atomos Shogun 4K recorders. These cameras are great for their small size and the ease of getting around quickly. The footage looks fantastic. And with everything at 4K, I had a lot of opportunity to blow shots up and move around the frame.
When did you start getting footage, and what was the workflow like?
We started getting footage on the second day of the four-day shoot. We were set up near the DIT, and as he transcoded the footage, we would then copy to our drives. We worked off of 15-inch Apple MacBook Pros, running Adobe Premiere CC, with 4TB G-Tech G-RAID Thunderbolt drives.
It was a 12-hour race on Saturday, day four of the shoot, so we had lots of footage trickling in as the day went on. We spent most of Saturday organizing the three days of footage we had already gotten. By Sunday morning, we had everything in hand, which was around 25 hours of footage. Then we got on a plane and flew back to Chicago. Sunday night Christen and Jarrad Quadir, another rockstar Utopic assistant, transferred all the footage over so I could continue editing on Monday without missing a beat. Sam came to town Tuesday and by Thursday we had about a nine-minute working cut.
What kind of direction did you get about the edit?
The race itself was never the focus of the piece. Nor was the goal to sell Porsches. The focus was to tell the human side of motorsports. To let the personal stories unfold, side by side. I knew that the meat of this was going to be the family’s story, followed by the driver, Jörg Bergmeister. With this kind of documentary style, there are always surprises. There was no traditional board or script to work from, so I started with the director’s treatment.
How would you describe your creative process on this one?
I watched every frame before I started laying anything into a timeline. I was extremely disciplined. With 25 hours of footage, it would be tempting to skip through a lot of it, but I made sure to screen everything, pull selects and really just digest all of it. With that amount of footage, I wanted to be sure of everything we had to work with.
Next, I started with Sam’s treatment as my roadmap. Everything in his treatment was captured. It came down to finding the most essential and best moments to tell the family’s story, and to balance that with the driver’s perspective leading up to the race. It was also important to fill it in with cinematic moments — like Jörg shaving and then driving to the track with a fellow race car driver. The goal was to create tension and build up the characters, then end with the beginning of the race.
Was there a part that was most challenging?
I think the challenge was what to do once we got to the actual race! Since it was not a focal point, I wasn’t quite sure how to wrap it all up. I felt like we could just keep building and building. Once I saw what an emotional story we had from the Diaz family, I knew we had to end with that. In fact, Sam discussed it with me on set immediately after he shot it — that was the ending… Javier’s tears. So I definitely had that in mind from the beginning when I started cutting.
The actual race result was a bit of a surprise. Porsche ended up having a rough final hour of the 12 and lost despite holding leads throughout the day. Later in the post process, it became important to get their message across, which ultimately ties back into the theme of The Enduring Bond. Creating the drama of that losing moment while still conveying their will to win was a bit of a challenge.
What are you most proud of?
I absolutely love this piece. I’m so thrilled to have been brought into this project by Sam, ADD and CK. I would say I am most proud of the sound design that I built with the variety of elements I had — original music, ambience, pit-to-car radio communication, track announcer voices, wild track audio of the race and sync sound from interviews. It was definitely outside of how I normally work on projects. I really pushed myself to build and layer the audio especially during the rough-cut stage. Then, of course, I worked closely with Brian Leitner, Utopic’s sound designer and music composer.
Can you talk more about the music?
Before production we created an original music track, based on direction that Sam had given us. I wanted a track to cut with that could eventually be post scored. Instead of doing a traditional music search or getting hooked on a song from a band or soundtrack, I asked Brian Leitner to create something. It turned out to be an amazing piece of music, and perfect for this film.
Premiere Pro CC 2015 brought more to editors than awesome color grading tools and magical transitions. The new release also brought several enhancements to Premiere Pro’s trimming capabilities.
If you’re a Premiere Pro editor who has never edited in Trim Mode, CC 2015 is the time and version to start. This post highlights three new trim features along with many tips for maximizing the efficiency of Trim Mode editing in Premiere Pro.
Shortcut sharing sounds like chaos: two editing functions — Trim and Nudge — battling it out underneath the keyboard for priority. But it’s not as scary as it sounds. Premiere Pro will perform a Trim when an edit point is selected and will perform a Nudge when a clip is selected. It’s actually profoundly intuitive, and it’s a feature that will soon be taken for granted.
By enabling Trim and Nudge to share the same keyboard shortcuts, Premiere Pro consolidates valuable keystrokes by giving them twice the capability. Obviously, only the Trim function of the shared shortcut applies while in Trim Mode. This tutorial shows how to map Trim commands to the default Nudge keyboard shortcuts: https://youtu.be/iEsWIE7hx9I.
Simply put, Revert Trim Session undoes successive trim edits made in Trim Mode. The ability to return an edit point to its original place, prior to changes, with one click will make Trim Mode more appealing to many Premiere Pro editors. The Revert Trim Session feature is also particularly intriguing because it introduces a new trimming term: “Trim Session.” Although it’s logical to assume that Trim Session refers to all trim activity within Trim Mode, there’s no official documentation for this functionality. It may be reading too much between the lines, but it’s as if Adobe is using this language to suggest an enhanced trim editing workflow. More on that in a future post. Learn how to set up Revert Trim Session in this tutorial: https://youtu.be/yQb7a2ilgCM.
We’ll coin this feature “Live Trimming” until a more official term is given by Adobe. It’s similar to “J-K-L Dynamic Trimming” (which still works in CC 2015) but it’s uniquely different in that making an edit does not require playback to stop.
While playback is looping in Trim Mode, pressing “I” and “O” will set a new in and out point (based on the current trim type) for the outgoing or incoming clips. When an edit is made, loop playback will reset on the new edit point and further editing can continue.
In a way, Live Trimming feels similar to multicam switching in being able to watch playback and make an edit when it feels right. This new functionality within Trim Mode gives Premiere Pro editors a more dynamic and interactive trim editing experience. Watch this tutorial to see Live Trimming in action: https://youtu.be/FXe-mjxR5ko.
Key Point Recap
The following tips will increase the speed and efficiency of trim editing in Premiere Pro CC 2015:
• Assign keyboard shortcuts to each of the “Select Nearest Edit Point…” commands. This will allow you to jump to the nearest edit point with a specific trim type, instead of having to select the edit point and then Toggle Trim Type (Ctrl+T).
• In Trim Mode, select your trim type before you begin loop playback. Playback must be stopped to change trim type.
• Try first using “I” and “O” Live Trimming to trim the edit point to where it feels right. Then, continue to finesse using the Trim keyboard shortcuts.
• Cmd+Z will undo the last trim edit without exiting Trim Mode or interrupting loop playback.
• Assign keyboard shortcuts to each of the “Toggle Target Video…” commands. This will allow you to make trim edits to clips on specific video tracks. Do the same for all the “Toggle Target Audio” commands.
Coming Soon to this space: a post defining Trim Session, including two feature requests, and how it is a unique trim editing workflow.
Premiere Bro is the alias for Sean Schools. Sean is the video editor for JK Design. He is a Full Sail University graduate who did time in Brooklyn. You can email Sean at firstname.lastname@example.org, and follow him on Twitter @premierebro. You will also find this blog on his website www.premierebro.com.
The cloud is everywhere. Workflows, as well as companies making tools for those workflows, are popping up all over, but still some post pros are dubious. What exactly is the cloud? How will it help beyond regular workflows? How does it keep my assets secure? Those are just some questions being thrown around by those who have yet to make the transition.
We thought reaching out to a company that capitalized on the cloud early and from a post production perspective might be a good way to get some of these questions answered.
David Peto owned London-based post house Unit Post Production until 2009 when he started Aframe, a cloud platform that enables teams and organizations to collaborate, organize and move media. He designed the product from a user’s perspective. Let’s find out more about the cloud and its benefits for post pros…
Some people don’t have a clear understanding of the cloud. Can you help them out?
In its simplest definition, the cloud is a combination of software and services that run on Internet-accessible servers rather than on local computers.
Can you describe how your company uses the cloud?
Aframe was built as a cloud platform from the very beginning. We recognized that people were shipping hard drives all around the world, making unnecessary copies and versions of their media, losing comments and other metadata, and generally spending too much time just waiting to work with media. Using the cloud gives organizations a central repository — a one-to-many point of distribution — that enables more people to access media and do their work regardless of where they may be located and what time of day it is.
There are private and public clouds. Can you describe the differences? What are the benefits of each?
Public clouds are generally owned and operated by a third party such as Amazon’s Web Services, Rackspace or Microsoft Azure. They are provisioned for use by many different types of users — banks, pharmaceutical companies or content distribution networks featuring the latest grumpy cat video!
Private clouds are owned by one company for the specific use of its employees and partners, and generally have very high security standards and limited accessibility.
Which does Aframe use, and why?
Aframe sits somewhere in between being a public and private cloud, which we think offers benefits from both types. What we offer is sometimes referred to as a Vendor Cloud. Like a public cloud offering, Aframe is globally available and accessible to all, but has been purpose-built for a specific task: handling large and complex media files.
Like private clouds, we offer very tight security, greater flexibility and features tailored to users. We also own and operate all of the equipment in the datacenters and do not outsource any portion of our infrastructure to third parties.
For content creators and owners dealing with large, often complex, high-resolution media files, dedicated processes and services are required to enable post workflows. Far beyond simply storing content is the need to automatically transcode to different formats and extract and add descriptive metadata, while also providing a method to review and approve assets for all stakeholders on any device.
How do you educate people who have concerns about data security?
The best way is to explain the different security areas that must be considered. First, there is file transfer security: anytime your media is moving to or from the cloud, it should be encrypted in transit. With Aframe, media in transit is protected by extended-validation 256-bit SSL encryption at all times. This means that our corporate identity and place of business have been verified by SSL certification.
For application security, we use 256-bit SSL encryption at all times in the browser. As with online banking, your browser will display a green box showing a verified connection when you are logged in. For server security inside our datacenters, access is protected by powerful 2048-bit RSA encryption keys, which can only be used by senior members of the Aframe team.
For physical security and backups of the data itself, we firmly believe that it is not safe unless it is verified to be stored redundantly in at least two geographically separate locations. Our customers rest easy knowing that their files are backed up hundreds of miles apart at opposite ends of the country.
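For readers who want to see this layer of the stack for themselves, here is a small sketch using Python’s standard library that performs the same certificate-verified TLS handshake a browser does and reports what was negotiated. The hostname is a placeholder, not an Aframe endpoint:

```python
# Inspect the TLS handshake a service offers: protocol, cipher, certificate.
import socket
import ssl

host = "example.com"                 # substitute the service you want to check
ctx = ssl.create_default_context()   # verifies the certificate chain by default

with ctx.wrap_socket(socket.create_connection((host, 443)),
                     server_hostname=host) as s:
    print("protocol:", s.version())  # e.g. TLSv1.2
    print("cipher:", s.cipher())     # (name, protocol, secret bits)
    print("issued to:", dict(x[0] for x in s.getpeercert()["subject"]))
```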
Why should a post organization consider a cloud-based solution? What are the advantages?
Most post houses, regardless of their size, are experiencing the headaches I’ve mentioned with regards to teams working in different locations, having to FedEx media and having silos of workflows where not everyone has access to the types of files they need. Only certain people need high-resolution files, others are happy to view proxies.
Being able to add timecode-accurate notes and comments for the editor or producer is critical, but so is automatically transcoding, uploading and downloading files as your workflow requires. All of these are necessary to maximize productivity and hit ever-shrinking deadlines and budgets. In the end, people should be concentrating on the creative aspects of their jobs, not the mundane moving of files between departments and other stakeholders.
Can you give any examples of how post facilities are using Aframe?
There are many workflows being used by our post customers today. Some users upload dailies to the cloud so that stakeholders can view and log comments and even embed complete transcriptions. Others are using Aframe as a central repository where team members in offices across town or across countries can collaborate and get access to the latest footage. All metadata is indexed and preserved so that searching for just the right shot is effortless. Avid, Final Cut or Adobe editors benefit by seeing all comments and feedback when they transfer the metadata into the edit.
How is Aframe different from something like Amazon S3’s cloud offering?
That’s a great question. Amazon is a true cloud solution. It’s big, and in use by countless organizations every day. Unfortunately, that’s exactly where it can fall short for companies working with high-resolution files and broadcast masters. Amazon was built to serve many masters and as such, they have a different business model that can be quite costly for content creators.
Amazon charges for uploads and downloads; Aframe does not. That can make moving a 500GB show master very expensive. With Amazon, it’s not as clear where your media is stored, or where the backup of that media is stored. Finally, Amazon shares its bandwidth across a huge cross-section of customers in a way that Aframe does not. It’s just not built for the media and entertainment industry.
A lot of people say the Internet is not fast enough to support upload/download of full-res video for any meaningful post workflows. How do you answer that?
That’s a completely legitimate question, because it is true that your experience in the cloud is only as good as your Internet connection. However, we have optimized our file transfer protocols to be the fastest in the industry. Our transfer speeds are 15x faster than FTP, for example. There are many users working from home or from coffee shops who are quite successfully viewing media and making comments over 3G or 4G connections with as little as 5Mbps.
Obviously, you’ll want more than that if you are uploading dailies, but the good news is that Internet speeds are increasing exponentially every year and most post organizations have very capable connections in place today.
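Peto’s caveat is easy to quantify. A quick sketch of the bits-versus-bytes arithmetic shows why a 5Mbps link is fine for proxies and comments but not for moving masters:

```python
# A 5Mbps connection moves 5 million *bits* per second, i.e. 0.625 megabytes.
def transfer_hours(size_gb: float, link_mbps: float) -> float:
    bits = size_gb * 8e9                 # gigabytes to bits
    return bits / (link_mbps * 1e6) / 3600

for mbps in (5, 100, 1000):
    print(f"500GB master at {mbps}Mbps: {transfer_hours(500, mbps):.1f} hours")

# 500GB master at 5Mbps: 222.2 hours
# 500GB master at 100Mbps: 11.1 hours
# 500GB master at 1000Mbps: 1.1 hours
```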
At NAB, Aframe showed a collaboration between Aframe and Adobe Anywhere. Can you talk about that?
Since the beginning of Aframe, I’ve dreamed of true, no-download cloud editing. I’ve seen a lot of people fail at it for various reasons. However, four years ago Adobe showed me its Anywhere product, which allowed full-resolution material to be streamed down a standard Internet connection, letting you edit in Adobe Premiere Pro on your laptop just as if you were back at the facility. I was blown away at the possibility, because I could imagine hosting Anywhere on Aframe’s cloud platform, allowing full-broadcast-quality, no-download editing in the cloud using Adobe Premiere on your laptop.
That’s exactly what we showed at this year’s NAB. It’s pretty amazing to see someone upload material into the cloud and never download it again, through the entire post process, until someone actually sees it.
This is significant for our industry because it means that you could now be editing from the office, home, the beach, the local café… anywhere really. It brings editing into the modern world and unchains the editor from all the storage and big iron workstations when you need to be someplace else!
Video processed with Adobe After Effects or Premiere Pro will soon be able to take advantage of Vidcheck’s intelligent “Vidapps-Video” plug-in to correct RGB gamut and YUV levels within the NLE.
UK-based Vidcheck, which makes automated quality control software with patented intelligent video and audio correction, will be at NAB this year showing the latest version of its Vidchecker and Vidfixer product suites. These have been extended to include a range of video applications (Vidapps) for Adobe After Effects and Premiere Pro, enabling users to check and automatically correct video and audio errors without leaving the Adobe environment.
This means that users of Adobe After Effects and Premiere Pro can correct illegal video levels to broadcast-safe parameters using Vidcheck’s patented algorithms, which correct the video without clamping it. (Clamping is an antiquated means of achieving broadcast-safe content that can cause undesirable degradation of the resulting picture.)
Vidcheck’s Vidapps-Video provides checking and correction as part of the edit process and is designed to be used as the last stage of post production, immediately before the media is rendered. This approach means the user can be confident that the rendered media will fully comply with the specified requirements before it leaves the Adobe environment.
As part of using Vidapps, an XML report can be generated and saved as a record that QC corrections were done and, if required, forwarded to the client along with the media file. The report can also be ‘skinned’ with the logo and colors of the post house or video editor to make it highly specific and identifiable to them.
Additional Vidapps plug-ins are currently available for audio and photosensitive epilepsy (PSE) checking, and others are in development for introduction later in 2015.
Vidcheck’s core Vidchecker and Vidfixer AQC products are scalable from low-cost versions for post production to sophisticated Vidchecker/Vidfixer Grid systems suitable for larger enterprises. In addition to watch folder automation, Vidcheck’s API has been integrated into many MAM and workflow engine solutions across the industry for seamless addition of complex AQC into any workflow.
Agoura Hills, California-based Larry Jordan & Associates, headed by industry vet and long-time trainer Larry Jordan, is now offering educational institutions as well as students discounted pricing and specially tailored training.
These discounts apply to tools professional editors are using out in the world, including Apple Final Cut Pro X and Adobe Premiere Pro. Jordan and company will also make additional educational support materials available alongside the training.
On the heels of this announcement, we reached out to Jordan for some details.