Whitehouse Post editor Lisa Gunning has relocated from the company’s London headquarters to its Los Angeles office. The move allows her to cut more long-form projects in addition to her spot work.
Gunning’s arrival at Whitehouse LA coincided with her editing the feature film Newness for commercial and narrative director Drake Doremus. The film was completed in only three months and premiered at this year’s Sundance Film Festival. Well known for her commercial work, Gunning wrapped Adidas’ Basketball Without Creativity, starring James Harden, for frequent director collaborator Stacy Wall in late 2016. In recent years, she has also teamed up with Wieden+Kennedy, 72andSunny, Y&R and BBH to work on brands including Nike, Corona, Land Rover and Johnnie Walker.
Regarding her decision to relocate, Gunning explains that LA offers an opportunity to expand her commercial portfolio and cater to her long-form interests. “I feel like I’m in the epicenter of where my work is based now.”
Along with her spot work, Gunning has lent her editing talent to films including Nowhere Boy, Seven Psychopaths and Fifty Shades of Grey.
In addition to editing, Gunning has grown her directing skills with several projects, including three short films in collaboration with Nowness and Mini and multiple music videos. “Directing is great for editing, and what I learn on commercials is great for working in long-form,” she explains. “The varied experiences make me a better director and editor because I’m able to empathize with all of the processes and think of them as a whole, as opposed to just one side of it.”
Capturing an event with pro know-how and flexible tools
By David Hurd
I recently had an opportunity to shoot a gala event at a mall for the Tampa Innovation Alliance. The CEOs of all the big local companies, as well as the mayor, were there, along with 600 guests. The event was held in the large space that used to be an Old Navy store, and there were booths out in the mall that needed coverage as well.
The plan was for Tracy, the interviewer, to get short interviews with the VIPs before the sit-down part of the event, and then I would record the speakers. The footage would then be edited down into a five-minute 720p YouTube video.
Because of the many set-ups and the size of the venue, I needed a rig that was quick and portable. I started with a pair of American Grip Dana Dolly Baby Combo Stands on wheels. These things are awesome and built like tanks. I then attached a 48-inch SmartSystem slider to the top of the stands, along with a Manfrotto head and pan bar. The 48-inch SmartSystem slider can take a lot of weight and allows me to use any camera rig.
I assembled the rig in the parking lot, and just rolled it into the mall. During the shoot, I used the slider to re-position shots quickly when the crowd got in my way, and it came in handy for creating moving shots as well. Let’s talk about the camera.
I have grown to love my Blackmagic Production Camera 4K for jobs like these. I use a 35mm Rokinon lens, which due to the crop factor ends up at around 50mm. Indoors, I set the camera to Film mode (ISO 800) and a color balance of 4000K, which always seems to work best. I also turn on the 2.35:1 mask so that I have an idea of what the image will look like later.
The Rokinon lens is f/1.3, so it does well in low light. Since I was going to be constantly on the move, I just used available light. Did I mention that the lighting inside the event looked like a dark bar? That’s where Film mode (ISO 800) and the fast lens saved my butt.
For important jobs, I record a 220Mb/sec stream in ProRes 422 HQ; otherwise, ProRes 422 at 100Mb/sec works fine for the web. You will only see the difference when you zoom in a lot in post. For power, I used a V-mount Blueshape battery; these are what many professionals are switching to, and the one I used that night lasted the whole shoot.
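To put those data rates in perspective, here is a quick back-of-the-envelope storage calculation (a rough sketch; actual ProRes rates vary with resolution and frame rate, so treat the figures as approximations):

```python
# Rough storage math for the data rates mentioned above.
def gb_per_hour(mbit_per_sec: float) -> float:
    """Convert a video data rate in Mb/sec to storage in GB per hour."""
    mb_per_sec = mbit_per_sec / 8          # megabits -> megabytes
    return mb_per_sec * 3600 / 1000        # an hour of seconds -> gigabytes

print(f"ProRes 422 HQ @ 220 Mb/sec: {gb_per_hour(220):.0f} GB/hour")  # ~99 GB
print(f"ProRes 422 @ 100 Mb/sec: {gb_per_hour(100):.0f} GB/hour")     # ~45 GB

# The 90 minutes of footage from this shoot at 220 Mb/sec works out to
# roughly 150GB, which fits comfortably on a 480GB recording SSD.
```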
For audio, I use the amazing little JuicedLink BMC366 mixer for Blackmagic cameras. It’s small, lightweight, and has everything I need. I used a Shure VP64 mic, plugged into a Sennheiser RF transmitter in one channel of the mixer for the interviews. I also needed the house audio for the sit-down speeches. For this I used a Sennheiser lav transmitter plugged into a sub out on the house mixer via a 1/4-inch jack. Since the jack was mono and the mixer was stereo, I only pushed in the jack to the first click to avoid shorting it out. After adjusting the in and out levels, the Sennheiser transmitted the house audio to wherever I was in the room.
The interview part of the shoot went something like this: Tracy walked around with his mic in hand, finding interview victims. I followed him, happily pushing my rig along. When he found one, I directed the subject into position to make use of available light, framed a wide shot, focused and hit record. It was painless, and the process took about one to three minutes per interview.
When everyone went inside for the sit-down part of the evening, I found a place off to one side of the stage, about 30 feet from the podium. Using the same lens, I could get most of the stage in the shot. After a quick battery change in the house audio transmitter, I was ready to rock.
About an hour later, after the event, we stood by the exit and snagged people for interviews as they were leaving. Then I rolled the rig to the parking lot, took it apart, loaded it up, and headed home for the edit.
The edit is where the magic happens. Thunderbolt is wonderful, and I have built up a system that is fairly state of the art, so that I don’t have to wait much while editing.
I called on a Mac Pro “Trash Can” with 64GB of memory and 12GB of GPU memory across the two video cards. The computer is connected to four G-Tech G-Speed eS Pro drive boxes via two HighPoint RocketStor 6328 RAID controllers. Each controller is connected to its own Thunderbolt channel. Each set of two boxes (eight drives) is a RAID-5, and the two sets are striped as RAID-0 in OS X. The system reads data at 2000MB/sec and writes at over 1700MB/sec — perfect for 4K editing.
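For readers unfamiliar with nested RAID, this layout is commonly called RAID 50: parity within each RAID-5 set, striping across the sets. Here is a minimal sketch of the capacity and throughput math, using assumed per-drive numbers for illustration (not the actual drives’ specs):

```python
# Sketch of the RAID 50 layout described above: two 8-drive RAID-5 sets,
# striped together as RAID-0. Per-drive figures below are assumptions.
SETS = 2
DRIVES_PER_SET = 8
DRIVE_TB = 4            # assumed capacity per drive
DRIVE_MB_S = 130        # assumed sustained throughput per drive, MB/sec

# RAID-5 spends one drive's worth of capacity per set on parity.
usable_tb = SETS * (DRIVES_PER_SET - 1) * DRIVE_TB
raw_tb = SETS * DRIVES_PER_SET * DRIVE_TB

# Reads can stripe across all 16 spindles; real-world throughput lands
# below this ceiling because of parity and controller overhead.
read_ceiling = SETS * DRIVES_PER_SET * DRIVE_MB_S

print(f"Usable capacity: {usable_tb}TB of {raw_tb}TB raw")
print(f"Theoretical read ceiling: ~{read_ceiling}MB/sec")  # ~2080, near the measured 2000
```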
For viewing, there are two 32-inch monitors, one of which is a Boland broadcast monitor run through a Blackmagic UltraStudio 4K interface box via SDI.
The workflow is easy. I simply drop the SSDs from the Blackmagic camera into my RocketStor 5212, which transfers the data to my RAID via Thunderbolt very quickly. I record on OWC 480GB Mercury Extreme Pro 6G SSDs, so the transfer rate is over 550MB/sec.
In Apple FCPX, I create a 720p timeline, and when I import the 4K footage, I select “Leave Files in Place.” Basically, I am dropping roughly 4000×2000-pixel footage onto a 1280×720 timeline.
For more of a “film” look, I place a 2.35:1 aspect ratio mask that I made in Photoshop over the footage. Now, I simply open up the scopes and color correct the footage, which is much easier to do before it’s all cut up.
My intention was to have the original wide shot plus zoomed-in medium and close-up shots, so first I had to decide where I wanted to cut them. To do this I went through the footage and made cuts with the Blade tool. For example, I may start close-up on Tracy and go to a two-shot when he introduces his guest. Then I go to the guest when he says something interesting and then back to a two-shot for the close.
With the cuts made, I clicked on the clips, re-sized them and repositioned them into the medium and close-up shots. Because I had about 4000×2000 pixels to work with, I was able to zoom in up to 300 percent and still have pixel-to-pixel coverage. If a shot was in focus but looked a little soft, I would call on a sharpen filter to fix it.
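That 300 percent figure is just the ratio of source resolution to timeline resolution; a quick sketch of the arithmetic, assuming UHD source footage:

```python
# How far you can punch into UHD footage on a 720p timeline before the
# NLE has to start upscaling (inventing pixels).
SOURCE_W, SOURCE_H = 3840, 2160      # UHD, roughly the "4000x2000" above
TIMELINE_W, TIMELINE_H = 1280, 720

max_zoom = min(SOURCE_W / TIMELINE_W, SOURCE_H / TIMELINE_H)
print(f"Max pixel-to-pixel zoom: {max_zoom:.0%}")  # 300%

# At 300% the 720p frame is cut 1:1 from the source pixels; anything
# beyond that is interpolation, which is when a shot starts looking soft.
```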
Since I shoot with a prime lens, there is no zoom. If the client wants a slow zoom, I just use keyframes. This is actually better than trying to zoom in and out at the event, where there are no re-takes.
This rig and workflow turned what would have been a lot of lifting and moving about in a crowded space into an efficient one-man shoot. I didn’t have to worry about zooming, or getting the exact framing, which removed a lot of stress. I got 90 minutes of footage, and I only needed five.
This story has a happy ending. The client was pleased with the video, and I got paid.
David Hurd is the owner of David Hurd Productions in Tampa, Florida. He has been in the business for over 40 years.
What do you need to know about the latest pro laptop from Apple? Well, the MacBook Pro is fast and light; the new Touch Bar is handy and sharp but not fully realized; the updated keys on the keyboard are surprisingly great; and working with ProRes QuickTime files in resolutions higher than 1920×1080 inside of FCPX, or any NLE for that matter, is blazing fast.
When I was tasked with reviewing the new MacBook Pro, I came into it with an open mind. After all, I did read a few other reviews that weren’t exactly glowing, but I love speed and innovation among professional workstation computers, so I was eager to test it myself.
I am pretty open-minded when it comes to operating systems and hardware. I love Apple products and I love Windows-based PCs. I think both have their place in our industry, and to be quite honest, it’s really a bonus for me that I don’t rely heavily on one OS or get tripped up switching between the Command key and the Windows/Alt key.
Let’s start with the call I had with the Apple folks as they gave me the lowdown on the new MacBook Pro. The Apple reps were nice, energetic, knowledgeable and extremely helpful. While I love Apple products, including this laptop, it’s not the be-all-end-all.
The Touch Bar is nice, but not a revolution. It feels like the first step in an evolution, a version 1 of an innovation that I am excited to see more of in later iterations. When I talked with the Apple folks they briefed me on what Tim Cook showed off in the reveal: emoji buttons, wide gamut display, new speakers and USB-C/Thunderbolt 3 connectivity.
They had an FCPX expert on the call, which was nice considering I planned on reviewing the MacBook Pro with a focus on the use of nonlinear editing apps, such as Adobe Premiere Pro, Avid Media Composer and Blackmagic’s Resolve. Don’t get me wrong, FCPX is growing on me — it’s snappy jumping around the timeline with ProRes 5K footage; assigning roles is something I wish every other app would pick up on; and the timeline is more of a breeze to use with the latest update.
The other side to this is that in my 13 years of working in television post I have never worked on a show that primarily used FCP or FCPX to edit or finish on. This doesn’t mean I don’t like the NLE, it simply means I haven’t relied on it in a professional working environment. Like I said, I really like the road it’s heading down, and if it works its way into mainstream broadcast or streaming platforms a little more, I am sure I will see it more frequently.
Furthermore, with the ever-growing reduction in reliance on large groups of editors and finishing artists, apps like FCPX are poised to shine with their innovation. After all that blabbering, in this review I will touch on FCPX, but I really wanted to see how the MacBook Pro performed with the pro NLEs I encounter the most.
Let’s jump into the specs. I was sent a top-of-the-line 15-inch MacBook Pro with Touch Bar, which costs $3,499 if configured online. It comes with a quad-core Intel Core i7 2.9GHz (up to 3.8GHz using Turbo Boost) processor, 16GB of 2133MHz memory, a 1TB PCIe SSD and a Radeon Pro 460 with 4GB of memory. It’s loaded. I think the only thing that could be upgraded beyond this configuration is a 2TB hard drive, which would add another $800 to the price tag.
Physically, the MacBook Pro is awesome — very sturdy, very thin and very light. It feels great when holding it and carrying it around. Apple even sent along a Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, which costs an extra $29, and a USB-C to Lightning cable that costs another $29.
So yes, it feels great. Apple has made a great new MacBook Pro. Is it worth upgrading if you have a new-ish MacBook Pro at home already? Probably not, unless the Touch Bar really gets you going. The speed is not too far off from the previous version. However, if you have a lot of Thunderbolt 3/USB-C-connected peripherals, or plan on moving to them, then it is a good upgrade.
I ran some processor/graphics card-intensive tests while I had the new MacBook Pro and came to the conclusion that FCPX is not that much faster than Adobe Premiere Pro CC 2017 when working with non-ProRes-based media. Yes, FCPX tears through ProRes QuickTimes if you already have your media in that format. What about if you shoot on a camera like the Red and don’t want to transcode to a more edit-friendly codec? Well, that is another story. To test out my NLEs, I grabbed a sample Red 6K 6144×3160 23.98fps clip from the Red sample footage page, strung out a 10-minute-long sequence in all the NLEs and exported both a color-graded version and a non-color-graded version as ProRes HQ QuickTime files matching the source file’s specs.
In order to work with Red media in some of the NLEs, you must download a few extras: for FCPX you must install the Red Apple Workflow Installer, and for Media Composer you must install the Red AMA plug-in. Premiere doesn’t need anything extra.
Test 1: Red 6K 6144×3160 23.98fps R3D — 10-minute sequence (no color grade or FX) exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 57 minutes. Media Composer = two hours, 42 minutes. (Good news: Media Composer’s interface and fonts render correctly on the new display.)
You’ll notice that Resolve is missing from this list and that is because I installed Resolve 12.5.4 Studio but then realized my USB dongle won’t fit into the USB-C port — and I am not buying an adapter for a laptop I do not get to keep. So, unfortunately, I didn’t test a true 6K ProRes HQ export from Resolve but in the last test you will see some Resolve results.
Overall, there was not much difference in speeds. In fact, I felt that Premiere Pro CC 2017 played the Red file a little more smoothly and at a higher frames-per-second count. FCPX struggled a little. Granted, a 6K Red file at full debayer is one of the harder files for a CPU to process, but Apple touts this as a semi-replacement for the Mac Pro for the time being, and I am holding them to their word.
Test 2: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence exported as ProRes HQ matching the source file’s specs. Premiere > Media Encoder = one hour, 55 minutes. FCPX = one hour, 58 minutes. Media Composer = two hours, 34 minutes.
It’s important to note that the GPU definitely helped out in both Adobe Premiere and FCPX: little to no extra time was added to the ProRes HQ export. I was really excited to see this because sometimes, without a good GPU, resizing and GPU-accelerated effects like color correction will slow your system to a snail’s pace, if it doesn’t fully crash. Media Composer surprisingly sped up its export when I added the color grade as a new color layer in the timeline. By putting the color correction on another layer, Avid might have forced the Radeon to kick in and help push the file out. I’m not really sure what that’s about, to be honest.
Test 3: Red 6K 6144×3160 23.98fps R3D — 10-minute color-graded sequence resized to 1920×1080 on export as ProRes HQ. Premiere > Media Encoder = one hour, 16 minutes. FCPX = one hour, 14 minutes. Media Composer = one hour, 48 minutes. Resolve = one hour, 16 minutes.
So after these tests, it seems that exporting and transcoding speeds are all about the same. It doesn’t really come as too big of a surprise that all the NLEs, except for Media Composer, processed the Red file in the same amount of time. Regardless of the NLE, you would need to knock the debayering down to half or lower to start playing these clips back in realtime in a timeline. If you have the time to transcode to ProRes, you will get much better playback and rendering speeds. Obviously, transcoding all of your files to a codec like ProRes or Avid DNx takes way more time up front, but it could be worth it if you’re crunched for time on the back end.
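The reason debayering matters so much is that each step down halves the linear resolution the CPU has to reconstruct, which quarters the per-frame workload. A quick sketch using the 6K clip from these tests:

```python
# Decoded pixel counts for the Red 6K test clip at reduced debayer settings.
W, H = 6144, 3160

for label, divisor in [("Full", 1), ("1/2", 2), ("1/4", 4), ("1/8", 8)]:
    w, h = W // divisor, H // divisor
    print(f"{label:>4} debayer: {w} x {h} = {w * h / 1e6:.1f} MP per frame")

# Full debayer is ~19.4 MP per frame at 23.98fps; half debayer drops that
# to ~4.9 MP, which is often the difference between stuttering playback
# and realtime.
```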
In addition to Red 6K files, I also tested ProRes HQ 4K files inside of Premiere and FCPX, and both played them extremely smoothly without hiccups, which is pretty amazing. Just a few years ago I was having trouble playing back 10:1-compressed files in Media Composer, and now I can play back superb-quality 4K files without a problem, a tremendous tip of the hat to technology and, specifically, to Apple for putting so much power in such a thin and light package.
While I was in the mood to test speeds, I hooked up a Thunderbolt 2 SSD RAID (OWC Thunderbay 4 mini) configured in RAID-0 to see what kind of read/write bandwidth I would get running through the Apple Thunderbolt 3 to Thunderbolt 2 adapter. I used both AJA System Test as well as the Blackmagic Disk Speed Test. The AJA test reported a write speed of 929MB/sec. and read speed of 1120MB/sec. The Blackmagic test reported a write speed of 683.1MB/sec. and 704.7MB/sec. from different tests and a read speed of 1023.3MB/sec. I set the test file for both at 4GB. These speeds are faster than what I have previously found when testing this same Thunderbolt 2 SSD RAID on other systems.
For comparison, the AJA test reported a write speed of 1921MB/sec. and read speed of 2134MB/sec. when running on the system drive. The Blackmagic test doesn’t allow for testing on the system drive.
What Else You Need to Know
So what about the other upgrades and improvements? When exporting the R3D files, I noticed the fan kicked on when resizing or adding color grades. It seems the GPU kicked in and heated up, which is to be expected. The fan is not the loudest, but it is noticeable.
The battery life on the new MacBook Pro is great when just playing music, surfing the web or writing product reviews. I found that the battery lasted about two days without having to plug in the power adapter. However, when exporting QuickTimes from either Premiere or FCPX the battery life dropped — a lot. I was getting a battery life of one hour and six minutes, which is not good when your export will take two hours. Obviously, you need to plug in when doing heavy work; you don’t really have an option.
This leads me to the new USB-C/Thunderbolt 3 ports — and, yes, you still have a headphone jack (thank goodness they didn’t talk with the iPhone developers). First off, I thought the MagSafe power adapter should have won a Nobel Peace Prize. I love it. It must be responsible for saving millions of dollars in equipment when people trip over a power cord — gracefully disconnecting without breaking or pulling your laptop off the table. However, I am disappointed Apple didn’t create a new type of MagSafe cable with the USB-C port. I will miss it a lot. The good news is you can now plug in your power adapter to either side of the MacBook Pro.
Adapters and dongles will have to be purchased if you pick up a new MacBook Pro. Each time I wanted to use an external peripheral or memory card, like an SD card, my Tangent Ripple color correction panel or an external hard drive, I was disappointed that I couldn’t just plug it in. A good Thunderbolt 3 dock is a necessity in my opinion. You could survive with dongles, but my OCD starts flaring up when I have to dig around my backpack for adapters. I’m just not a fan. I love that Apple dedicated itself to a fast I/O like USB-C/Thunderbolt 3, but I really wish it had given the transition another year. Just one old-school USB port would have been nice. I might have even gotten over the missing SD card reader.
The Touch Bar
I like it. I would even say that I love it — in the apps that are compatible, and right now there aren’t many. Adobe released an update to Photoshop that added Touch Bar compatibility, and it is really handy, especially when you don’t have your Wacom tablet available (or a USB dongle to attach it). I love how it puts so many levels of your tools’ functionality within immediate reach.
It has super-fast feedback. When I adjusted the contrast on the Touch Bar I found that the MacBook Pro was responding immediately. This becomes even more evident in FCPX and the latest Resolve 12.5.4 update. It’s clear Apple did their homework and made their apps like Mail and Messages work with the Touch Bar (hence emojis on the Touch Bar). FCPX has a sweet ability to scrub the timeline, zoom in to the timeline, adjust text and more from just the Touch Bar — it’s very handy, and after a while I began missing it when using other computers.
In Blackmagic’s latest DaVinci Resolve release, 12.5.4, they have added Touch Bar compatibility. If you can’t plug in your color correction panels, the Touch Bar does a nice job of easing the pain. You can do anything from contrast work to saturation, even adjust the midtones and printer lights, all from the Touch Bar. If you use external input devices a lot, like Wacom tablets or color correction panels, the Touch Bar will be right up your alley.
One thing I found missing was a simple application launcher on the Touch Bar. If you do pick up the new MacBook Pro with Touch Bar, you might want to download Touch Switcher, a free app I found via 9to5mac.com that allows you to have an app launcher on your Touch Bar. You can hide the dock, allowing you more screen real estate and the efficient use of the Touch Bar to launch apps. I am kind of surprised Apple didn’t make something like this standard.
From a purely superficial and non-scientific point of view, the newly updated P3-compatible wide-gamut display looks great… really great, actually. The colors are rich and vibrant. I did a little digging under the hood and noticed that it is an 8-bit display (you can find this by locating the pixel depth under System Information > Graphics/Displays), which might limit the color gradations when working in a color space like P3, as opposed to a 10-bit display working in P3. Simply put, you have a wider array of colors in P3 but a smaller number of shades to fill it.
The MacBook Pro display is labeled as 32-bit color, meaning the RGB and alpha channels each get 8 bits, for a total of 32. Eight-bit color gives 256 shades per channel, while 10-bit gives 1,024 shades per channel, allowing for much smoother transitions between colors and luminance values (imagine a sky at dusk going smoothly from orange to light blue to dark blue — more shades per channel allow for a smoother gradient between lights and darks). A 10-bit display would be labeled 30-bit color, with each channel having 10 bits.
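The shade counts are simple powers of two; here is the arithmetic as a quick sketch:

```python
# Shades per channel and total displayable colors at 8-bit vs. 10-bit.
for bits in (8, 10):
    shades = 2 ** bits          # 256 at 8-bit, 1,024 at 10-bit
    total = shades ** 3         # R x G x B combinations
    print(f"{bits}-bit: {shades:,} shades per channel, {total:,} colors total")

# 8-bit:  256 shades per channel,    16,777,216 colors total
# 10-bit: 1,024 shades per channel, 1,073,741,824 colors total
# Stretching 256 steps across the wider P3 gamut makes each step larger,
# which is where banding in smooth gradients can creep in.
```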
I tried to hook up a 10-bit display, but the Thunderbolt 3 to Thunderbolt 2 adapter Apple sent me did not work with Mini DisplayPort. I did a little digging, and it seems people are generally not happy that Apple doesn’t allow this to work, especially since Thunderbolt 2 and Mini DisplayPort use the same connector. Some people have been able to get around this by daisy-chaining the display through something like a Thunderbolt 2 RAID.
While I couldn’t directly test an external display when I had the MacBook Pro, I’ve read that people have been able to push 10-bit color out of the USB-C/Thunderbolt 3 ports to an external monitor. So as long as you are at a desk with a monitor you can most likely have 10-bit color output from this system.
I reached out to Apple on the types of adapters they recommend for an external display and they suggest a USB-C to DisplayPort adapter made by Aukey. It retails for $9.99. They also recommend the USB-C to DisplayPort cable from StarTech, which retails for $39.99. Make sure you read the reviews on Amazon because the experience people have with this varies wildly. I was not able to test either of these so I cannot give my personal opinion.
In the end, the new MacBook Pro is awesome. If you own a recent release of the MacBook Pro and don’t have $3,500 to spare, I don’t know if this is the update you will be looking for. If you are trying to find your way around going to a Windows-based PC because of the lack of Mac Pro updates, this may ease the pain slightly. Without more than 16GB of memory and an Intel Xeon or two, however, it might actually slow you down.
The battery life is great when doing light work; it’s one of the longest-lasting laptop batteries I’ve used. But when doing heavy work, you need to be near an outlet. When plugged into that outlet, be careful no one yanks the USB-C power cable, as it might throw your MacBook Pro to the ground or break off inside the port.
I really do love Apple products. They typically just work. I didn’t even touch on the new Touch ID Sensor that can immediately switch you to a different profile or log you in after waking up the MacBook Pro from sleep. I love that you can turn the new MacBook Pro on and it simply works, and works fast.
The latest iteration of FCPX is awesome as well, and just because I don’t see it being used a lot professionally doesn’t mean it shouldn’t be. It’s a well-built NLE that deserves a fairer shake than it has gotten. If you are itching for an update to an old MacBook Pro and don’t mind having a dock or carrying around a bunch of dongles, then the 2016 MacBook Pro with the Touch Bar is for you.
The new MacBook Pro chews through ProRes-based media from 1920×1080 to 4K; 6K and higher will play but might slow down. If you are a Red footage user, this new MacBook Pro works great, but you still might have to knock the debayering down a couple of notches.
Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. Follow him on Twitter @allbetzroff.
For seasoned picture editor Joe Walker, ACE, his work with directors Denis Villeneuve and Steve McQueen might best be described as “three times a charm.” His trio of successes with Villeneuve includes the drug enforcement drama Sicario, the alien visitor film Arrival and the much-anticipated upcoming sci-fi drama Blade Runner 2049, which is currently in post. His three films with McQueen are Hunger, Shame and the 2014 Best Picture Oscar winner 12 Years a Slave, which earned Walker a nomination for his editing work.
In addition, he has worked on a broad array of films, ranging from director Michael Mann’s cyber thriller Blackhat to writer/director Rupert Wyatt’s The Escapist to director Daniel Barber’s Harry Brown to writer/director Rowan Joffe’s Brighton Rock, which is a reworking of the Graham Greene classic.
We are currently in the midst of awards season, and Paramount’s Arrival recently received eight Oscar noms, including Best Director and a Best Editing nod for Walker. The film also picked up nine BAFTA nominations, including Best Editing, Best Director and Best Film, and it has been nominated for an American Cinema Editors Eddie in the Best Edited Feature Film — Dramatic category. (Read our interview with director Denis Villeneuve here.)
“My approach to all the films I have edited is to find the basic ‘rhythm’ of a scene,” Walker says. His background as a sound designer and composer enhances those sensibilities in terms of internal pacing, beat and dramatic pulse.
The editor’s path toward Villeneuve began at a 2010 screening of Incendies in his native London. ”I was blown away and set my heart on working with this director. That same heart was beating out of my chest a few years later watching 2014’s Prisoners. While finishing Michael Mann’s Blackhat in 2015, my agent got me into the room with Denis for Sicario, which had a very solid script. That evolution felt like it was going in the right direction for me. Cinematographer Roger Deakins produced stunning work — he’s also cinematographer on Blade Runner 2049.” (Deakins was nominated for both Oscar and BAFTA Awards for Sicario.)
For Arrival, Walker’s biggest challenge was reconciling the two parallel worlds that existed within the evolving dramatic arcs. While several alien spacecraft land around the world, a linguistics expert (Amy Adams) is recruited by the military to determine whether they come in peace. “On the one hand we have the natural setting of the mother/daughter relationship, with beautiful, intimate material shot by a lakeside near Montreal, and the narrative content on a far lower gas,” explains Walker. “That’s pitted against the high-tech world of space ships as we learn more about the alien visitors and the psychological task faced as the lead character tries to decode their complex written language. Without CGI visuals of the Heptapods — the multi-limb visitors — I had to make early decisions about what space to leave in a scene for their eventual movements. From what was shot on set, all we had were puppeteers holding tennis balls on a stick.”
Walker saw every Arrival daily and started his cut early. “We had to turn over the Heptapod sequences to Montreal VFX house Hybride almost as soon as the director’s cut began,” he says. “And because, for me, sound always drives a lot of what I do, I brought on creature sound designer Dave Whitehead ahead of the game. I’d been impressed by Dave’s work on [Neill Blomkamp’s] District 9. I needed to know what type of sounds would be used for the aliens, and cut accordingly. He developed a coherent language with an inbuilt syntax and really nailed the ‘character’ of the Heptapods. I laid up his sounds onto tracks in my Avid Media Composer and they stayed pretty much unchanged all the way through post.”
In terms of pace and narrative arcs, Walker states that director Villeneuve “chose to starve the audience of information and just offer intriguing nuggets, teasing out the suspense and keeping them waiting for the pay off. For example, on one scene we hold on Amy Adams’ face watching the breaking news on the TV rather than the TV show itself,” which was reporting the mysterious spacecraft touching down in 12 cities. “Forest Whitaker [US Army Colonel Weber] plays our first audio of the Heptapods on a Dictaphone and it stimulates such curiosity about how they may look or behave. We avoided any pressure of cutting for the sake of cutting. Instead, we stayed on a shot, let it play and did not do all the thinking for the audience. While editing 12 Years a Slave, we stay on the hanging scene and don’t cut away. There’s no relief, it allows the audience to be truly troubled by the horrible inertia of the scene.”
Again, the word “rhythm” figures prominently within Walker’s creative vocabulary. “I always try to find the rhythm of a scene — one that works with the sounds and music elements. For Sicario, I developed peaks and troughs in the dramatic flow that supported different points of view” as the audience slowly begins to understand the complexity of the drug enforcement campaign. “Bad sound disturbs me, including distorted or widely variable dialogue levels. I always work hard to get the best out of the production tracks, perhaps more than I really have time for.
“With both Steve McQueen and Denis Villeneuve, I’ve always tried to avoid using music temp tracks, so that we do not become too influenced during the editing process,” he continues. “By holding off until we’re late into a final cut, we can stay critical in our judgments about the story and characters. When brought in later, music becomes a huge bonus since you’ve already been ruthless with the story. You use music only where it’s absolutely necessary, allowing silence or sound effects to have their day. I think composers want the freedom of a blank canvas. Otherwise, as the English composer Matthew Herbert once said, ‘Music is in an abusive relationship with film.’”
Changing Direction During Edit
While cutting Arrival, Walker recalls that one key scene took a dramatic left turn. “As scripted and shot,” he explains, “the nightmare sequence started out as a normal scene in which Amy Adams’ character, Louise, is visited in her quarters by colleague Ian [Jeremy Renner] and her boss, Colonel Webber, who decides to bench her. This was the beginning of a long piece of story tubing, which felt redundant. We’d tried to discard it, but the scene had an essential piece of information that we couldn’t live without: the notion that exposure to a language can rewire your mind.
“We thought about conveying that information elsewhere as voiceover or ADR, but instead, as an experiment, we strung together very crudely only the pieces we needed, thereby creating at one point a jarring join between one line of Ian’s dialogue and another. I always try to be ballsy with material, to stay on it with confidence or maul it, to tell the story a better way.”
In that pivotal scene in Arrival, during a close-up, Adams’ character is looking off-camera toward Whitaker. “But we never cut to him because it would take us down the path we wanted to avoid,” explains Walker. “As it happened, that same day in the cutting room, we saw the first test shots from Hybride’s VFX team of an alien crawling forward, looking like an elephant shrouded in mist. That first look inspired our decision to hold onto Adams’ off-camera look for as long as we could, and then — instead of going to a matching reverse revealing Forest Whitaker — we cut to this huge alien crouching in the corner of her bedroom.
“The scene was rounded off by a shot of Amy’s character waking up and looking utterly thrown. We kept the jarring cut [from Ian and then back to him], and added the incongruous sound of a canary, since it signaled early on that all is not as it seems. A nightmare was a great way to get inside Louise’s head. Ian’s presence in her dream also platforms their romance, which enters so late in the story. Normally, returning material to a cut can feel like putting wet swimming trunks back on, but here it set our minds alight.”
Adams’ performance throughout Arrival was thrilling to cut, says Walker. “She is very real in every take and always true to character, keeping her performance at just the right temperature for each scene. Every nuance counts, particularly in a film that has to hold up to scrutiny on a second or third viewing when more is understood about the true nature of things. To hold the audience’s attention in a scene, an editor’s craft involves a balance between time and tension.”
Walker says, “Time is our superpower since we can slow a moment down, speed it up or jump from one shard of a timeline to another. In Arrival we had two parallel worlds: the real-life world of the army camp with all the news on TVs and heavy technology. In opposition is the child’s world of caterpillars and nature. I could cut those together at will and flip quickly from one to the other.”
Walker says that after the 10-week shoot for Arrival, he spent a week finalizing his editor’s cut and then 10 to 14 weeks on the director’s cut with basic CGI. “We then went through test screenings as the final photorealistic CGI elements slowly took shape,” he recalls. “We refined the film’s overall pace and rhythm and made sure that each tiny fragment of this fantastic puzzle was told as well as we could. I consider the result to be really one of the most successful edits I have been involved with.”
The American Cinema Editors (ACE) have named the nominees for the 67th ACE Eddie Awards, which recognize editing in 10 categories across film, television and documentaries.
Winners will be announced during ACE’s annual awards ceremony on January 27 at the Beverly Hilton Hotel. In addition to the regular editing awards, J.J. Abrams will receive the ACE Golden Eddie Filmmaker of the Year award.
Check out the nominees:
BEST EDITED FEATURE FILM (DRAMATIC)
Arrival
Joe Walker, ACE
Hacksaw Ridge
John Gilbert, ACE
Hell or High Water
Jake Roberts, ACE
Manchester by the Sea
Jennifer Lame
Moonlight
Nat Sanders and Joi McMillon

BEST EDITED FEATURE FILM (COMEDY)
Deadpool
Julian Clarke, ACE
The Jungle Book
Mark Livolsi, ACE
La La Land
Tom Cross, ACE

BEST EDITED ANIMATED FEATURE FILM
Kubo and the Two Strings
Christopher Murrie, ACE
Moana
Jeff Draheim, ACE
Zootopia
Fabienne Rawley and Jeremy Milton

BEST EDITED DOCUMENTARY (FEATURE)
The Beatles: Eight Days a Week — The Touring Years
Paul Crowder
OJ: Made in America
Bret Granato, Maya Mumma and Ben Sozanski
Weiner
Eli B. Despres

BEST EDITED DOCUMENTARY (TELEVISION)
The Choice 2016
Steve Audette, ACE
Everything Is Copy
Bob Eisenhardt, ACE
We Will Rise: Michelle Obama’s Mission to Educate Girls Around the World

BEST EDITED HALF-HOUR SERIES
Silicon Valley: “The Uptick”
Brian Merken, ACE
Veep: “Morning After”
Steven Rasch, ACE

BEST EDITED ONE-HOUR SERIES — COMMERCIAL
Better Call Saul: “Fifi”
Skip Macdonald, ACE
Better Call Saul: “Nailed”
Kelley Dixon, ACE and Chris McCaleb
Mr. Robot: “eps2.4m4ster-s1ave.aes”
This Is Us: “Pilot”
David L. Bertman, ACE

BEST EDITED ONE-HOUR SERIES — NON-COMMERCIAL
The Crown: “Assassins”
Yan Miles, ACE
Game of Thrones: “Battle of the Bastards”
Tim Porter, ACE
Stranger Things: “Chapter One: The Vanishing of Will Byers”
Dean Zimmerman
Stranger Things: “Chapter Seven: The Bathtub”
Kevin D. Ross
Westworld: “The Original”
Stephen Semel, ACE and Marc Jozefowicz

BEST EDITED MINISERIES OR MOTION PICTURE (NON-THEATRICAL)
All the Way
Carol Littleton, ACE
The Night Of: “The Beach”
Jay Cassidy, ACE
The People v. O.J. Simpson: American Crime Story: “Marcia, Marcia, Marcia”
Adam Penn, Stewart Schill, ACE and C. Chi-yoon Chung

BEST EDITED NON-SCRIPTED SERIES
Anthony Bourdain: Parts Unknown: “Manila”
Hunter Gross, ACE
Anthony Bourdain: Parts Unknown: “Senegal”
Deadliest Catch: “Fire at Sea: Part 2”
Josh Earl, ACE and Alexander Rubinow, ACE
Final ballots will be mailed on January 6, and voting ends on January 17. The Blue Ribbon screenings, where judging for all television categories and the documentary categories takes place, will be held on January 15. Projects in those categories are viewed and judged by committees of professional editors, all ACE members. All 850-plus ACE members vote during the final balloting of the ACE Eddies, including active members, life members, affiliate members and honorary members.
In the scheme of things, we work in a very small industry where relationships, work ethic and talent matter. Brian Scofield is living proof of that. He is one of a team of editors who worked on Warren Beatty’s recent Rules Don’t Apply.
That team included lead editor Billy Weber, Leslie Jones and Robin Gonsalves. It was the veteran Weber, who cut Beatty’s 1998 film Bulworth, who brought Scofield on board as a second editor.
Weber was Scofield’s mentor while he was in the MFA program at USC. “Not long after I completed graduate school, Billy helped me reconnect with the Malick camp, who I met while working in the camera crew on Tree of Life,” he explains. “I then became an apprentice on To the Wonder, and then an editor on Knight of Cups. When Billy came in as an advisor at the end of Knight of Cups, we reconnected in LA. He had just begun working on Rules Don’t Apply with Warren, and when I finished my work on Knight of Cups, he brought me aboard.”
Scofield recognizes that relationships open doors, but says you have to walk through them and prove you belong in the room all by yourself. “I think people often make the mistake of thinking that networking trumps talent and work ethic, or the other way around, and that just isn’t true. All three are required to have a career as a film editor — the ability to form lasting relationships, the diligence to work really hard, and having natural instincts that you’re always striving to improve upon.”
Scofield says he will always be grateful to Weber and the example he’s set. “I’m only one of over a dozen people whose careers Billy has helped launch over the years. It’s in large part his generosity and mentorship that inspires me to pay it forward any chance I get.”
Let’s find out more from Scofield about his editing process, what he’s learned over the years, and the importance of collaboration.
You have worked with two Hollywood icons in Terrence Malick and Warren Beatty. I’m assuming you’re not easily intimidated.
It’s been a transformative experience in every way. These two guys, who have been making films for over 40 years, are constantly challenging themselves to try new things… to experiment, to learn. They’re always re-evaluating pretty much everything from the story to the style, and yet these are two guys with such distinct voices that really shine through their work. You know a Malick or Beatty film when you see it. The inexhaustibility of the cinematic art form, I guess, is what I really took away from both of them.
They are both very different kinds of filmmakers.
You would never think that working on a Terrence Malick film would prepare you to work on a Warren Beatty film. Knight of Cups is a stream-of-consciousness, meditative tome about the meaning of life. Warren’s film is a romantic comedy with a historical drama slant. Aesthetically, they’re very different films, but the process of constantly finding ways to break open the movie all over again, and the mindset that requires, is very similar.
Both Terry and Warren are uncompromising and passionate about making movies the way they want and not bending to conventions, yet at the same time looking for ways to reach people on a very deep level. In this case, both films were also deeply personal for the director. When you work on something like that, it adds another layer of pressure because you want to honor how much of themselves they’re willing to put into their work. But that’s also where I believe the most exciting films come from. That pressure just becomes inspiration.
How early did you get involved on Rules Don’t Apply?
Right after production wrapped. I was finishing up with Terry on the mix stage for Knight of Cups when Billy called. They had an assembly of the film when I joined — everything was in there — and that version was probably about four hours long. Interestingly, some things have changed dramatically since that version and some are remarkably similar.
I was on Rules Don’t Apply for just over a year, but I’ve been back several times since officially finishing. I took a good amount of time off and went back, and since then I’ve popped in and out whenever Warren has needed me. Robin became a true caretaker of the film, staying with Warren through that additional time leading up to the release.
Is that typically how you’ve worked? Coming in after there’s an assembly?
I’ve come in as an additional set of eyes on some films, and I’ve been on films during production, sending cuts to the director while they’re in the middle of shooting. This includes giving feedback on pick-ups they need to grab or things to be wary of performance-wise, those types of things.
Both are thrilling experiences. It’s fun to come in when there has been one specific approach and they’re open to new ideas. You kind of get to shake people out of the one way they’ve been going about the film. When I’m the editor that’s been working on the film since the beginning, that initial discovery period when you see the film take shape for the first time is always thrilling. The relationship you form with both the film and the director is hard to beat. But then, I’m always excited for someone to come in and shake things up, to help me think differently. That’s why you do feedback screenings. That’s why you bring other editors into the room to take a look and to make you think about things from a different angle.
How was it on Rules Don’t Apply?
When I came on, so much of it was working really well from the first assembly, but I did want to strengthen the love story between Frank and Marla and make their attraction more evident early in the film so that it paid off later. I started by going through all of the scenes and looking for little moments where we could build up glances between them or find little raindrops before the storm of that budding relationship.
There were a few storylines going on at the same time as well?
The story takes place over a long period of time — you’ve got Warren Beatty playing Howard Hughes, you’re dealing with a young love story, you’re dealing with an incredible supporting cast, all of whom could be bigger characters or smaller characters. When you come in a little bit later, it’s often your job to help figure out which storylines or themes are going to become the main thrust of the movie.
So there are different definitions of co-editor?
Well, it varies every day. Some days Warren would want to work on a couple of different scenes, so one editor would take one and I would take the other. Sometimes you would have worked on a scene for a long time and somebody else would say, “Let me have a stab at that. I’ve got a different idea.” Sometimes we were all together in one room with one of us driving the Avid and the others offering a different set of eyes — eyes that aren’t staring at the timeline — and they’re looking at it side-by-side with the director, almost as a viewer instead of within the nitty-gritty of making the cut. We would take turns doing that.
You’ve got to check your ego at the door, I suppose? Everybody’s on the same team these days.
There’s no pecking order, and I think Billy Weber is really the one who sets that tone because he’s such a generous and experienced editor and man. There are people out in the industry that might be protective of their work versus letting anybody else touch it, but there’s none of that in any of the editing rooms that I’ve been fortunate enough to work in. Everybody’s respectful of each other.
On this film we had Billy, myself, Leslie Jones and Robin all working at the same time. You’ve got almost three generations of editors in that room, and to be treated as an equal really opens up your mind and your creativity. You feel the freedom to really present big ideas.
How is it collaborating with Warren?
He is such a unique guy. His favorite thing to do is to have a fight — he doesn’t want people who are just going to accept what he says. He wants a fiery debate, which can make people uncomfortable, but I’m okay with it. I actually really enjoyed that, especially when you realize he’s not taking it personally and neither should I. This is about making a movie the best that it can be. He wants people that are going to challenge him and push back.
So it’s part of his creative process?
Yes, it’s all about the discourse. If he has a strong point of view, he wants to argue it to make sure that he really believes it. And if you have a strong point of view, he wants you to be able to tell him why. I would say the fiercest fights led to him being most happy afterwards. At the end of the screaming, he would always say, “That was such a productive conversation. I’m so glad we did that!” He surrounds himself with people he knows he trusts. He knows that’s what he needs to make him as productive and as creative as he can be.
It’s been a long time since Warren directed a film. How did he react to the new technology?
He was thrilled with all of the new abilities of the technology. This movie was shot on the Alexa, for the most part, and we did a good amount of combining it with archival footage. This is a very modern movie in many ways, but it also has a distinctive throwback vibe. We had to try to marry those things without going overboard.
We resized frames, added a few push-ins, speed ramps, and so on. Ultimately, all of these tools just allowed him to explore the footage even more than he’s used to doing. He really loved taking advantage of new editorial opportunities that couldn’t have been done even 15 years ago, at least not as easily.
How do you organize things within the Avid Media Composer?
Any time I start a new job, I send a Google Doc to the assistant that specifies exactly how I want the project set up. It’s an evolution of things I’ve learned in different editing rooms over time.
For every scene, I have a bin with a frame view. If the bin is the size of my monitor, I should be able to see all clips in that one view without scrolling. Each set-up is separated from each other, so I can see very quickly, “Oh there are four takes of that shot, there are four takes of that shot, there are three takes of that one.” I have the assistant prepare three sequences: one that’s just a pure string-out of all of the clips, so I can, in one sequence, scrub through everything that’s there. I do a string-out “clean,” which is when you take out all the slates and you take out all the director’s talking, so I can be impartial and just look at the footage. Then I usually have one more sequence that’s just circle takes that the director chose on set. Then I go through and I make a select reel based off of everything that I watch. That’s the basic bin set-up.
For films that have multiple editors, organization is really important because somebody else has to be able to understand how your work is organized. You have to be able to find things that you did a year ago.
Any special tricks, like speed ramps, sound effects, transitions? I’m imagining that changes per project?
Yeah, it’s pretty unique to the project. There are a lot of editors who have specific effects that they go back to over and over again in their own bin. I’ve got a few of those, but I almost always end up tailoring them and sometimes just starting from scratch. I go on the hunt for the right effect when I need it.
I’ve gotten pretty adept at tailoring the built-in effects to my needs as they come up, but people who use those effects all the time are working on more crazy action or stylized films because they’ve got a lot more demand for those than when you’re working on character-driven content.
Do you typically work with a template from a colorist, or do you do any temp color corrections yourself?
Most of the films have a look that the DP has already applied, and I do tweaking as needed. If we come up with a creative reason for color correction, I’ll do a sketch. I do a lot of work with sound, but with color, it just depends. If it needs to be changed in order to understand what the idea is or if we’re screening it for somebody that we don’t trust to be able to see what it is without color correction, then of course we’re going to go in and we’re going to tweak it. I’ve worked on a film where all the exteriors were really magenta, so we came up with our kind of default fix to be applied to all of those shots.
Can you elaborate on the sound part?
I cut as much for sound as I do for picture. I think people grossly underestimate the influence that sound has on how you watch a movie. I’m not a sound designer, but I try my best to provide a sketch for when we go into that next phase so the sound designer has a pretty clear idea of what we’re going for. Then, of course, they use their creativity to expand and do their own thing.
How do you work with your assistant editors? Do you encourage them to edit, or are they strictly technical?
It depends on the project and its timeframe. In the beginning, the priority is on getting everything set up. Then the priority is on helping me build a first sound pass after we’ve gotten an assembly. They help bring in effects and smooth over things I’ve sketched out. Sometimes they’re just gathering effects for me, and sometimes they’re cutting them in themselves. Sometimes we toss them back and forth: I do a rough pass and ask them to mix it, clean up the levels and add a couple of accents here and there. Once we’re through with that, we have at least a ground floor of sound to cut with.
When given the opportunity, I love to let my assistants get creative. I let them take a stab at scenes, or at least have them be present in the room to give feedback. When the director isn’t present, I rely a lot on my assistant just to check in and say, “Hey, is this crazy?” or try to engage them as much as I can in that creative process. It all just depends on the demands of the project and the experience level of the assistant.
Is there anything you would like to add?
Film is a collaborative art form, and in order to help a director do their best work, you need to be their friend, their antagonist, their therapist, their partner. Whatever it takes is what your job is. I was so fortunate to learn an enormous amount from Warren, but also from my fellow editors. I hope everybody has as much fun watching this crazy little movie as we did making it.
Finally, I’d just love to say that working with Warren will undoubtedly be one of the most cherished experiences of my life. Reputations be damned, he’s a kind, brilliant and uncompromising artist who it was endlessly inspiring to spend so much time with. I’ll forever be grateful I had the opportunity to both work for him and to call him a friend.
Main Image: Robin Gonsalves, Warren Beatty and Brian Scofield.
TwoPoint0 has added two veteran editors to its New York-based studio: David Cornman and Debbie McMurtrey.
Cornman is a commercial editor who has cut comedy, effects-driven, dramatic and documentary-style spots for clients such as AIG, GE, Accenture, Bank of America, Staples, Verizon and Computer Associates. He has won AICE, AICP, Clio and Addy awards, and he has an Emmy nomination in the Best Commercial category.
Cornman’s recent projects include a package of Crayola spots for McGarry-Bowen and P&G work out of Havas, as well as several digital projects for Facebook’s Creative Shop. A recent passion project included shooting and editing a piece for Atria Senior Living in Rye Brook, New York, which gave residents the chance to try rowing for the first time. The rowers ranged in age from 85 to 97. “That was fun to be part of,” he says.
McMurtrey started her career at Crew Cuts in 1999. In 2007, she was hired as the first editor at Nomad’s East Coast office. From there she worked at Cutting Room, Red Car and Alkemy X. In addition to spots and branded web content, she has also cut short films that have screened in over 30 festivals, a sitcom pilot for VH1, and parody commercials for Saturday Night Live. She recently collaborated with director/producer Greg Kohs on his feature documentary, The Great Alone, which chronicles the comeback journey of four-time Iditarod champion Lance Mackey. McMurtrey considers her specialty to be docu-style. She excels at taking raw footage and finding the narrative in order to shape the story. She also enjoys editing dialogue and comedy.
McMurtrey has recently worked with director Zack Resnicoff of Impressionista Films on three campaigns for Fisher-Price, including 20 individual spots. They have previously worked together on projects for Macy’s, Blue Cross and the Centers for Disease Control and Prevention (CDC). Other recent projects completed by McMurtrey include the “We the Voters” campaign and a series of films for Stephens Bank, including a bio of Alexander Hamilton. She has also edited projects this fall for Facebook, Hewlett Packard and Nintendo.
Cornman’s and McMurtrey’s reels can be viewed on the studio’s site.
Basic color correction is rapidly becoming a skill that is expected of an editor, or even an assistant editor. If you have had the luxury of using a colorist and/or an online editor, you have probably seen them use apps such as Blackmagic Resolve, Avid Symphony, FilmLight’s Baselight or other color grading tools. These systems have so many levels of intricacy that, without years of color correction experience, most editors never get past the beginning stage.
If you are an editor looking to do basic color correction, slight secondary correction and, maybe, even a creative grade, you probably want to stay inside of your NLE, whether it’s Adobe Premiere, Apple FCPX, Avid Media Composer, Magix Vegas, or even After Effects. This is where NewBlueFX’s latest color correction and grading plug-in comes into play.
Featuring over 60 different looks (sometimes referred to as creative LUTs or preset color grades), skin-tone isolation and the ability to isolate regions of an image for the video scopes to analyze, NewBlue ColorFast 2 is a modest color correction plug-in without the overwhelming toolset of a full-fledged color grading application.
ColorFast 2 costs $99 and works in apps like Vegas Pro 10+, Resolve 11+, Premiere CS6/6.5/CC, After Effects 5+, FCPX, Media Composer/Symphony 6+ and Grass Valley Edius 7 and 8. If you are using an app like Resolve, you would probably only use ColorFast 2 for its preset looks, since you already have access to all of the color correction tools included in the plug-in — unless you like the region-isolating feature for the video scopes, something I find really intriguing.
The ColorFast 2 RGB scope and the Lumetri RGB scope.
Most people reading this review will probably want to know why they should buy ColorFast 2 when Premiere Pro has many of these features built into its Lumetri color correction tools. To be honest, there are only a few things ColorFast 2 has that Premiere, or other apps for that matter, don’t: region-controlled video scopes, skin-color isolation and NewBlueFX’s color presets. You should really check out NewBlueFX’s product page for ColorFast 2 to see more examples of the color presets and download a trial for yourself.
Right off the bat, I felt that stacking ColorFast 2 after the Lumetri color correction tools in the effects panel in Premiere is the proper order of operations. If you are familiar with LUTs and how the chain of command works, you probably have experimented with color correcting before and after the LUT is applied.
Typically, a LUT gives the colorist a good starting point to grade from, but these days you may see creative LUTs. If the creative LUT doesn’t quite look right, you will want to add color correction first in the chain of command and then the LUT. This is how I would work with ColorFast 2 and the Lumetri color correction tools: you will be correcting the footage to work with your creative LUT instead of correcting the LUT, which most of the time will give you inadequate results. Long story short: stack your ColorFast 2 effect after the Lumetri tools in the effects window and then fine-tune the Basic Correction settings with your ColorFast 2 preset to get a great color grade.
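To make that order-of-operations point concrete, here is a minimal Python/numpy sketch (my own illustration, not ColorFast 2’s actual math) showing that an exposure fix and a creative look don’t commute: correcting before the look is applied lands somewhere different than correcting after it.

```python
# A minimal sketch, not ColorFast 2's internals: exposure correction and a
# creative LUT do not commute, so the order you stack them in matters.
import numpy as np

def exposure(img, stops):
    """Simple linear exposure adjustment, clipped to the 0-1 range."""
    return np.clip(img * (2.0 ** stops), 0.0, 1.0)

def creative_lut(img):
    """Stand-in for a creative LUT: a crushed, slightly warm 'look'."""
    return np.clip(img ** 1.5 * np.array([1.05, 1.0, 0.9]), 0.0, 1.0)

pixel = np.array([[0.2, 0.3, 0.4]])  # one underexposed RGB pixel, 0-1 range

print(creative_lut(exposure(pixel, 1.0)))  # correct first, then grade
print(exposure(creative_lut(pixel), 1.0))  # grade first, then "correct" the look
```

The two print statements produce different pixels; correcting first means the look lands on properly exposed footage, which is the stacking order recommended above.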
The ColorFast2 waveform with isolated scope region.
I was excited to check out the video scopes inside of ColorFast 2, so I jumped to the bottom where the Region Scopes twirl-down menu is. Under that is the Video Scopes menu, which contains Vectorscope (Classic), Vectorscope (Color), Vectorscope (Sat), RGB Parade, Waveform and Histogram. The real beauty is that NewBlueFX gives you the ability to isolate a square region of your footage to be output through the video scope. This allows you to pinpoint your correction a little more easily, and I really love this feature… but I also noticed that when you have both the Lumetri video scopes and the ColorFast 2 scopes open, there is a discrepancy in values. I tended to like the Lumetri video scopes a little better. In fact, they go all the way up to 100, where the ColorFast 2 scopes only go up to 80 — this could very well be a compatibility bug between ColorFast 2 and the new Adobe Premiere CC 2015.4.
One issue I found with the ColorFast 2 scopes was that I couldn’t move the actual scope around or have more than one open at a time. While region selection is an awesome feature, being able to see your full image is sometimes more important, which is why I would probably stick with the NLE’s built-in scopes.
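For the curious, here is a rough Python sketch of what a region-isolated scope does conceptually: restrict the analysis to a user-chosen rectangle before computing the waveform values. The crop coordinates and Rec. 709 luma weights are my assumptions for illustration, not anything pulled from NewBlueFX’s code.

```python
# Conceptual sketch of region-isolated scopes: analyze only a chosen
# rectangle of the frame instead of the whole image.
import numpy as np

frame = np.random.rand(1080, 1920, 3)  # stand-in for one HD video frame, 0-1 RGB

# Isolate a square region (say, a face) the way the Region Scopes feature does.
x0, y0, x1, y1 = 800, 300, 1100, 700
region = frame[y0:y1, x0:x1]

# Rec. 709 luma, the usual basis for a waveform display.
luma = region @ np.array([0.2126, 0.7152, 0.0722])

# A waveform is essentially the spread of luma values per column of pixels.
col_min, col_max = luma.min(axis=0), luma.max(axis=0)
print("region luma range: %.2f to %.2f" % (luma.min(), luma.max()))
```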
Primary, Secondary, Output Correction Menus
Going back to the top of the ColorFast 2 Effect Editor menus, up first is the Primary Correction twirl-down menu. Here you can quickly white-balance your footage with an eyedropper, and even keyframe it. In addition, you can adjust the White Strength, White Tweak (fine-tune control of the white color), Hue, Saturation, Exposure, Brightness and Film Gamma. A problem I encountered was that if you do a primary color correction on your image and then choose a color preset, all of your primary work gets reset, which is a real bummer if you want to correct and then grade your footage. So, if you want to work in ColorFast 2 in a more traditional way, where you color correct and then color grade, you may want to do it in two separate effects. Alternatively, you can do your primary color correction inside the Lumetri tools and then stack ColorFast 2 on top.
Next up is the Secondary Correction twirl-down menu, which gets you into the real meat and potatoes of the plug-in. There is a helpful “Show Mask” drop-down that will allow you to isolate and view Highlights, Midtones, Shadows, a Skin Color Mask and a Shape Mask. Inside each of these you can adjust Tint, Saturation and overall Level, and even enable or disable that secondary if you want. Further down in the secondary menu you can adjust the High, Mid and Shadow thresholds (basically the transitions from high to mid or mid to shadow), as well as the blending and spread.
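Under the hood, this kind of luma-range isolation generally comes down to a soft mask built from the thresholds and a blending width. Here is a hedged numpy sketch of the general technique; the smoothstep feathering and the 0.25/0.75 thresholds are illustrative guesses, not NewBlueFX’s implementation.

```python
# A rough sketch of luma-range secondary isolation: build a feathered
# midtone mask from shadow/highlight thresholds, then correct only there.
import numpy as np

def smoothstep(edge0, edge1, x):
    """Smooth 0-to-1 ramp between two edges (the 'blending' control)."""
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def midtone_mask(luma, shadow=0.25, high=0.75, blend=0.1):
    """1.0 in the midtones, feathering to 0.0 past each threshold."""
    rise = smoothstep(shadow - blend, shadow + blend, luma)
    fall = 1.0 - smoothstep(high - blend, high + blend, luma)
    return rise * fall

frame = np.random.rand(1080, 1920, 3)
luma = frame @ np.array([0.2126, 0.7152, 0.0722])
mask = midtone_mask(luma)[..., None]

# Example secondary: desaturate the midtones by half, leaving shadows
# and highlights untouched.
gray = luma[..., None].repeat(3, axis=-1)
graded = frame * (1 - 0.5 * mask) + gray * (0.5 * mask)
```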
While still in the secondary twirl-down menu you can jump into the Skin Mask, which will quickly help you identify skin color, soften imperfections and even help keep skin color fidelity while adjusting the rest of your image.
The last menu is the Output Correction twirl-down. Here you can apply a broad correction that lands after all the fine-tuning, adjusting overall Saturation, Exposure and Brightness.
In the end, I think ColorFast 2 is best suited for people who want a quick color grade by applying a preset look but who also want a little ability to fine-tune that look. ColorFast 2 has some pretty good-looking presets like Vintage, Fallout, Gotham and even some black and white presets like B&W Ink. It’s even more fun to go and purposely change your white balance to something crazy, like a deep purple, for interesting grades. You should definitely try NewBlueFX’s ColorFast 2 if you are looking for some additional creative grade looks while still being able to individually tweak the output.
Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at email@example.com. Follow him on Twitter @allbetzroff.
VideoStitch is offering a new version of its 360 video post software, VideoStitch Studio, including support for ProRes and the H.265 codec, rig presets and feathering.
“With the new version of VideoStitch Studio we give professional 360 video content creators a great new tool that will save them a lot of valuable time during the post production process without compromising the quality of their output,” says Nicolas Burtey, CEO of VideoStitch.
VR pros are already using VideoStitch’s interactive high-resolution live preview and rapid processing. With various new features, VideoStitch Studio 2.2 promises an easier and faster workflow. ProRes support ensures high quality and interoperability with third parties, while H.265 support widens the range of cameras that can be used with the software. Newly added rig presets allow for quick and automatic stitching with optimal calibration results, and feathering provides improved blending of the input videos. Audio and motion synchronization has also been enhanced so that various inputs can be integrated seamlessly. Lastly, the software supports the latest Nvidia graphics cards, the GTX 10 series.
VideoStitch Studio 2.2 is available for trial download at www.video-stitch.com. The full license costs $295.
I’ve been trying to get my hands on a professional drone to review for a few years now. My wife even got me a drone from a local store that was a ton of fun to play with but was hard to master. For years, I’ve been working on television shows that use drone footage and capture incredible imagery, but it always seemed out of reach for me as an editor. Finally, after much persistence (or pestering, depending on who you ask), DJI agreed to send me the Phantom 4 to test out, and boy is it awesome!
By now you’ve probably made your way through the ubiquitous reviews, including the endless supply on YouTube, but on the small chance you are reading this without much prior drone knowledge and you work in production or post production, I have some ideas for you.
When reading this review, think about how you could take a drone, run outside and maybe grab some b-roll for something you are working on. If you create opening titles or sizzle reels, you could grab some great aerial shots or fast-paced shots to use as transitions. The possibilities are really endless, as long as you get your video picture settings dialed in.
Before I started as an online editor (which, for those who don’t know, focuses on the technical side of editing — color correction, grading, transcoding, outputting, exporting, anything that ends in “-porting” or “-linking,” basically), I worked my way through being a post coordinator and post production supervisor, all the way to offline editor. One thing I noticed on many of the non-union live-to-tape shows (like late night comedy or talk shows) is that the editor has a lot of freedom to be creative and can push the envelope a little.
Maybe the editor needs some b-roll for an edit that isn’t in the system, so as the post supervisor you might run out and shoot it yourself. Why not with a drone? If you need a quick aerial of a house from directly above, you might be able to get away with footage from your own drone, saving the project money while showing some talent that may get you more jobs in the future!
I really love the idea of people acquiring as much knowledge across different job positions as possible. Whether you work in craft services or as an executive producer, if you can do things like operate a camera, hold a boom mic or fly a drone, you will probably make a lasting impression and be known as someone who is hungry to work and to create a great end product, regardless of your position.
Not to be a total wet blanket and put a huge wrench in your drone flying, but some laws have recently been passed (more like clarified) to standardize drone use between hobbyists and commercial fliers (basically anyone who wants to make money from their footage). You should definitely check out the Federal Aviation Administration’s Getting Started page for more info.
If you are flying your drone for fun, and as long as it has the weight and footprint of the DJI Phantom 4 — it weighs about 3lbs and measures about 14 inches diagonally without propellers, which can add a couple of inches — there is minimal work that you need to do. However, if you are planning on making money from your drone footage, there are many steps you must take, including passing an official test. There is a lot you need to know that is beyond the scope of this review, so definitely check out the FAA link above for more.
Easy to Use
Since I hadn’t flown a professional drone before I had nothing to compare it to, but I can tell you that I picked up the Phantom 4 and was flying it within five minutes. It really is that easy to get up and running.
Step 1, charge your remote and battery; Step 2, plug in your phone or tablet via USB to the remote; Step 3, attach the propellers; Step 4, fly! You should probably boot up your Phantom before you go outside to make sure it is functional and to update your firmware. As a side note, I’m not sure if I was up and running so quickly because the Phantom 4 I was loaned for review had been charged and used before, or if it really is that easy.
For this review, I really wanted to see how easy it was to get shots like wide sweeping pans and tilts or tracking shots, and it was relatively easy. Obviously, you will need to practice your camera work with the Phantom 4 to get nice shots that aren’t boring and have substance, but it’s pretty simple. I brought the Phantom 4 to an open field where I had tons and tons of space. I turned it on by pressing the power button once and then holding it down until it powered up. I had forgotten to download the DJI Go app to my iPhone 6, so after downloading it, I connected the USB-to-Lightning cable from my iPhone 6 to the controller. While the iPhone 6 worked great, you have minimal screen real estate with so many controls available, so I would suggest using an iPad if you can, or an iPhone 6 Plus or 7 Plus. I tried using an iPad mini, but had trouble getting the Phantom 4 and the iPad to connect, so I stuck with the iPhone.
Once my propellers were spinning, I flew it straight up into the sky. I felt like a little kid with my first remote control car, except that the handling and precision the Phantom 4 offers is exceptional. You can even take your hands off the joysticks and the Phantom 4 will hover. I noticed that once I got the Phantom 4 high in the air, I could hear it battle the winds. It really stuck to its position even through some decent-strength gusts.
When I took the Phantom 4 out for a second time, I wanted to test out its upgraded collision avoidance system. I also wanted to test out my camera moves. The collision avoidance was awesome! Not only does it sense the ground beneath it, but objects in front of it. I started flying toward a basketball hoop and it caught it in its sights and maneuvered to the right. Then, with just one prior flight, I noticed I was really getting the hang of long shots while tilting and panning the camera — a real testament to how easy it is to control.
Keep in mind that the DJI Go app has a built-in flight simulator to help you get your moves and techniques down before you go outside. Unfortunately, you have to be connected to your drone while using the flight simulator, but still it’s pretty handy for practicing — something you should definitely use before you fly, even if your pride is telling you not to.
Beyond my pure joy at flying the Phantom 4, there are some fancy tech specs you should know about. For my money, the DJI Phantom 4 really shows its worth in its camera: a 4K-capable 1/2.3-inch CMOS image sensor, an ISO range of 100-3,200 for video (100-1,600 for photos) and shutter speeds between eight seconds and 1/8000 of a second.
There are many different recording modes, including 4096×2160 (true 4K resolution) at 24/25 progressive frames per second, 3840×2160 (UHD) at 24/25/30p, 2704×1520 (2.7K) at 24/25/30p, 1920×1080 (HD) at 24/25/30/48/50/60/120p and 1280×720, for some reason, at 24/25/30/48/50/60p. All these resolutions are recorded at a max bit rate of 60Mbps, which is decent, but really should be higher in my opinion (probably more in the 100Mbps range).
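To put numbers on that, here is some quick back-of-the-envelope Python (my own arithmetic, not DJI’s spec sheet) spreading a fixed 60Mbps budget across a few of the listed modes. The fewer bits each pixel gets per frame, the more visible the compression.

```python
# Back-of-the-envelope: how thin does a fixed 60Mbps budget get spread
# as resolution and frame rate climb?
BITRATE = 60e6  # bits per second

modes = {
    "4K DCI 24p": (4096, 2160, 24),
    "UHD 30p":    (3840, 2160, 30),
    "2.7K 30p":   (2704, 1520, 30),
    "1080p 60p":  (1920, 1080, 60),
}

for name, (w, h, fps) in modes.items():
    print(f"{name}: {BITRATE / (w * h * fps):.2f} bits/pixel/frame")
# 4K gets roughly 0.28 bits per pixel per frame versus about 0.48 at 1080p60,
# which is why 4K footage falls apart sooner when you zoom in.
```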
In terms of image quality, the Phantom 4 is amazing for a flying ship that captures video. However, it isn’t going to match cameras like the Sony a7S II, Panasonic GH4 or Blackmagic Cinema Cameras, exactly. The Phantom 4 definitely rivals the GoPro Hero 5 Black in video quality, or at least gives it a good run for its money. The only problem is that the camera isn’t removable from the gimbal on the Phantom 4. I would really like to see a removable camera on the Phantom 4, much like the new GoPro Karma drone with its connection to the Karma Gimbal.
After flying the Phantom 4 a few times, I began to realize how finicky and important the picture and video profile settings are. The first time I recorded video, I simply hit record. I was in Vivid mode, presumably at the baseline Saturation, Sharpening and Contrast of 0,0,0. It looked great at first glance, and anyone who just wants to pick up the Phantom 4 and shoot should probably leave it there, or maybe knock the sharpness down to -1. If you plan on color correcting later or adding a creative LUT on top of your footage, then you are going to want a flatter image.
I thought the D-Log setting would be the way to go, as it should give you the flattest image in terms of saturation and exposure, letting you pull the most life out of your image. Unfortunately, I found that is not the case. I tried many variations of Saturation, Sharpness and Contrast, from 0,0,0 to -3,-3,-3, and wasn’t really happy with any of them. After running through the usable color profiles (I’m omitting black and white and other filters like that, because you should really just apply those looks while color correcting or editing, since all NLEs have an easy way to add them), I found that D-Cinelike and None were the profiles I should really stay in, and I settled on Sharpness -1, Contrast -2 and Saturation -2.
Before I go on about D-Cinelike and None, I think anyone buying a drone should consider ND (neutral density) filters. When shooting outdoors you will get a lot of contrasting light values, such as dark shadows and blown-out highlights. Instead of having to pick your favorite, you can knock the exposure down externally with an ND filter, which lets you keep your shutter speed and ISO values at more appropriate levels.
Without ND filters, you are going to have to ramp up the shutter speed on your Phantom 4 when filming at a low ISO, such as 100, to properly expose your image, leaving your footage looking choppier and less cinematic (I hate using the word cinematic to describe this, but essentially cinematic = motion blur in this instance).
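If you want to put numbers on the ND choice, here is a small hedged Python calculation using the common 180-degree shutter guideline (a shutter of roughly 1/(2 x fps)), which is a filmmaking convention rather than a DJI requirement; the 1/1600 metered shutter is a made-up bright-daylight example.

```python
# How much ND do you need to hold a "cinematic" shutter at ISO 100?
# Assumes the 180-degree shutter guideline; the metered value is hypothetical.
import math

fps = 24
target_shutter = 1.0 / (2 * fps)  # 1/48s for natural motion blur at 24fps
metered_shutter = 1.0 / 1600      # example: what bright sun demands at ISO 100

# Each stop of ND halves the light, letting the shutter double in length.
stops = math.log2(target_shutter / metered_shutter)
print(f"need ~{stops:.1f} stops of ND, roughly an ND{2 ** round(stops)} filter")
```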
If this sounds interesting to you, you should Google shutter speed techniques and rules, but be careful. It is a deep rabbit hole. From my simple research, I found ND filters ranging anywhere from $20 to $99 or more, depending on quality and where you buy them. Polar Pro looks to make some sweet ones, including the Vivid Collection in their Cinema Series of polarized ND filters at $99 for a three-pack — another rabbit hole, so be careful not to get G.A.S. (Gear Acquisition Syndrome).
Moving on… D-Cinelike and None are flat color profile shooting modes that allow for decent color grading in post production, but without the midtone muddiness that D-Log seemed to produce for me. D-Cinelike seemed to warm up the shot a little, with more orange and yellow tints and possibly less shadow detail. In None, I felt like I got the flattest color profile possible, which allowed for the best color correction and grading scenario with the Phantom 4 footage. Don’t forget to dial in your custom picture profile settings. Personally, I liked the picture best when I knocked Sharpness down to -1 or -2. Contrast and Saturation could also be knocked down a little, but this is something you should test when you buy a Phantom 4, since it is definitely a matter of personal taste.
If you go on YouTube and search Phantom 4 color settings you will find a lot of videos. You should probably sort by upload date and watch the more recent videos that might take into account firmware updates. I really liked watching Bill Nichol’s YouTube Channel BillNicholsTV. He has a bunch of great and practical reviews.
You should still try out the Phantom 4’s D-Log mode. Hopefully, it works for you better than it did for me. If you use Blackmagic Resolve, you can check out DJI’s D-Log to sRGB LUT instructions and find the LUT under the software downloads here.
While I didn’t want to get too deep into the technical side of the Phantom 4 (though I clearly fell down the picture profile settings abyss), I still want to highlight some automated flight modes the Phantom 4 excels at. Some of the new features that separate the Phantom 4 from previous Phantom models include Active Track, TapFly, the Obstacle Sensing System, Sport Mode, easier-to-use push-and-release propellers, up to 28 minutes of battery life (although I only got between 20-22 minutes, with the Phantom 4 automatically returning home when the battery ran low), an improved camera with less chromatic aberration, and much more.
New Features That Editors Will Like
I now want to touch on the upgraded features that would get me, as an editor, interested in the Phantom 4. Active Track is an amazing feature that can track objects specified through the DJI Go app. You simply click the object or person you want to track and bam! The Phantom 4 will follow them from what DJI calls a “safe distance,” and it really is.
TapFly is another great feature that will help pilots who aren’t as comfortable flying in tight spaces fly in a straight line. Simply tap the remote icon on your phone or tablet, tap TapFly, then tap a visual point you want the Phantom 4 to fly to, and it will essentially go into autopilot. You still have control over the camera, and even the Phantom 4 itself, but it’s basically a coached flying system.
Again, there are a lot of technical specs I didn’t go into too much detail on, but if you want more info you can find it on DJI’s Phantom 4 page. For some simple and short videos check out: http://www.dji.com/edu/edu_videos or download the DJI Go app.
In the end, I really, really, really loved flying the Phantom 4! One of the easiest parts was installing the propellers — an easy turn and lock. If you find yourself getting frustrated when filming or flying the Phantom 4, remember that it takes people many hours to get good at shooting with a camera, let alone a drone with a camera and gimbal to control all at once. I spent many nights watching YouTubers’ reviews, wondering why I couldn’t get a great picture out of the D-Log setting, until I found Casey Faris’ video on the Mavic Pro, which described the same problem I was having with the Phantom 4. With some more tests, I was able to fail and succeed my way through the different picture profiles.
When reviewing products, I try to break them, and I did that with the Phantom 4. Really. I accidentally crashed it while in Sport mode, and only one of the propeller caps flew off in that yard sale — a real testament to the Phantom 4’s sturdy construction.
Once back online, I tried to fly it into a tree but the Obstacle Sensing System and the Forward Vision System prevented the Phantom 4 from crashing. It’s like an extra layer of insurance.
I really like how the Phantom 4 has very advanced controls and features but is also “dummy” proof. If you’re editing a project and it begs for a tracking shot of a car that just isn’t in the dailies, you can grab a Phantom 4 and run out and film something. Even if it doesn’t make it into the final edit, it will give the producers and director a greater sense of what you are trying to convey. You could really help sell your vision, and your future job prospects.
I haven’t been able to get my hands on the recently announced Mavic Pro foldable drone from DJI, but I was able to get the recently announced GoPro Karma (you can see some of my in-flight footage on my YouTube page).
I really don’t think these drones compare to one another, so I won’t be going into a tit-for-tat comparison, but with so much drone competition it is an exciting time in the UAV world.
One thing I did notice when I went out to test the Phantom 4 was how many people were ready to become FAA/police authorities and tell you that you can’t fly. It was almost laughable. In fact, every time I think about it I laugh. The moral of the story: keep in mind before making a purchase like this that if you live in a city, you probably live within five miles of an airport, helipad, etc., and technically you can’t fly your drone there. It is a conversation starter, whether or not you want it to be.
Definitely check out the FAA’s website for the rules on where and when you can fly drones; otherwise, you might have an awesome grey box in your room with nowhere to fly. On the flip side, I’ve been reading people’s comments on forums, and if you are a hobbyist flyer who has registered your drone and wants to fly, you can contact your local airport, let them know you want to fly at or below a certain altitude, and they will usually say it’s fine. Those aren’t my words, but a summation of what I have been reading — of course, please do your own research!
My only criticism is that the Phantom 4’s 60Mbps data rate isn’t high enough to get the best quality footage from your drone. If you’ve been paying attention to the news lately, or to my Twitter (@allbetzroff), you may have seen DJI’s latest reveal of the Phantom 4 Pro, Pro+ and Inspire 2, which can film at a much better data rate of 100Mbps. Maybe this could come to the Phantom 4 as a simple firmware update (but probably not). Nonetheless, 60Mbps is acceptable for 1920×1080, or maybe 2.7K (2704×1520, 16×9 aspect ratio) and below, but once you get up into the higher frame sizes, you can really see the video footage break down. If you zoom into the footage, the compression becomes noticeable and the color fidelity begins to fade.
While writing this review, the DJI Phantom 4 retailed for $1,199 on the DJI online store without any accessories, or more like $1,399 with two extra batteries and an external battery charger. I even found a refurbished Phantom 4 on DJI’s site for $899. The Phantom 4 Pro starts at $1,499 and the Phantom 4 Pro+ at $1,799. Oh yeah, don’t forget a few 64GB MicroSD cards at $20-$35 apiece. A pretty expensive investment if you ask me, but if you find yourself being a major gear nerd like me, or editing and needing to shoot your own footage, the DJI Phantom 4 is a must-have. Once you fly the Phantom 4, you will be hooked.
Watch some of the video I shot with the Phantom 4 on my YouTube Channel.
Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at firstname.lastname@example.org. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration. Follow him on Twitter @allbetzroff.