Pioneering 6K post workflows for ‘Gone Girl’

By Daniel Restuccio

David Fincher’s Gone Girl, shot on a Red Dragon in 6K, boasts an innovative editorial workflow that integrates top-end hardware and software into a seamless 6K editing, VFX and conforming system.

Pioneering a tightly integrated SSD shared-storage system, Gone Girl was edited on 13 custom-built workstations, based on Apple Mac Pros and HP Z820s with Nvidia GPU cards and Fusion-io storage, running customized versions of Adobe Premiere Pro CC and After Effects CC, with editorial media managed through an Open Drives storage system.

“Basically what happened,” explains assistant editor Tyler Nelson, “is we brought Jeff (Brue, co-founder/CTO of Open Drives) on board for (Fincher’s) House of Cards season one and he built this amazing infrastructure for us to use as a template for Gone Girl.”

“This is the first studio feature film to be edited from a completely ground-up designed shared SSD editorial system, mostly due to the speed and performance of Adobe Premiere Pro with Nvidia GPUs,” says Brue.

Jeff Brue

Brue’s system for connecting the offline workstations consisted of an Open Drives Velocity with 36TB of SSD, a 60TB hard-drive tier, custom caching volumes to accelerate Adobe Premiere Pro, and six Solarflare 10GbE ports.

Two-time Academy Award-winning editor Kirk Baxter edited Gone Girl on Adobe Premiere Pro CC (a first for him) using one of four offline systems built on 2012 Mac Pros with 64GB of DDR3 RAM, Solarflare 10GbE and an Nvidia Quadro K5000 GPU, configured for editorial with a third monitor out for support. See our interview with him here.

“I found Premiere very, very easy to work with and very user friendly,” says Baxter, “but the attraction to it is all of the After Effects shots and how we can be doing visual effects in-house and reducing a lot of that cost and increasing a lot of the speed of these things and the promise of what’s to come.”

“Adobe very much promised to work hand in hand,” continues Baxter. “They (believed we would be) able to streamline the post production with this series of tools that they would provide. At the time, Apple Final Cut was blurry about what the future was. We were more than happy to try this new path. It seems to be paying off.”

There were also two additional offline systems: 2011 Mac Pros with 32GB of DDR3 RAM, Nvidia Quadro 5000 cards and Solarflare 10GbE adapters.

“One of the really cool things about this project,” says Brue, “was that, because of the CUDA implementation of REDline that Red wrote, we were able to render and debayer all footage through Nvidia GPUs.”

6K
The Red Dragon records 6K (6144×3160, about 19.4 megapixels per frame) in wavelet-compressed R3D files. According to Brue, that’s technically roughly double the resolution of 4K (4096×2160), which means double the data rate: playing back a single stream of 6K DPX files happens at 1.8 gigabytes a second, compared to a single stream of 4K, which averages 800 megabytes a second.

“I think a lot of people are scared away by the file size,” says Nelson. “It’s really not that scary to deal with 6K media. 6K looks beautiful, and if you have that pixel value everything just becomes so much clearer. It is bigger for sure, but you go ahead knowing that with the infrastructure that Jeff has built for us it’s not a big deal.”

Tyler Nelson

“From a technical perspective I would definitely say working in 6K does allow for a level of additional tracking points, which allows for better visual effects integration,” adds Brue. “Plus the higher resolution can compensate for the resolution reduction that occurs whenever you do any image movement.”

In addition to getting a better-looking picture at acquisition, Nelson and Brue devised an approach that let them look inside the frame in realtime during editing, with room to reposition, stabilize and reframe the image.

They started with a 6K resolution of 6144×3072 and extracted a 5K (5120×2133) image. With that framing they can move the image left, right, up and down to reframe for consistency throughout a scene. “The camera operator is moving the shot to go with an actor, but he doesn’t always hit the same mark take after take. So when we cut back to a shot two or three edits after the original use of that angle, and the framing is a few hundred pixels left or right, you can notice that,” explains Nelson.

“So if you want to adjust for consistency for shoulder position or head room, you can move that around. You have this very, very large palette that you’re extracting a portion of, and you’re able to manipulate the framing and make it absolutely perfect. That’s what we’ve done on all of Fincher’s movies ever since The Social Network.”

“On the Red Dragon camera you can set frame guides,” continues Nelson, “and that’s exactly what the DPs and camera operators are framing for. When David (Fincher) is watching playback on set, he sees a ‘ground glass’ view of what they are framing for, as well as the full image that the sensor is capturing.”

For offline editorial, says Brue, they scaled everything to 2304×1152 pixels. That format meant they could take a center extraction corresponding to the 5K (5120×2560) finish out of the 6K (6144×3072) frame. So throughout offline editorial they could reposition, stabilize and reframe a shot without ever worrying about running out of pixel room.

“But you could only do that,” says Brue, “if at every single moment and every single time you’re re-rendering the frame with live repositioning. That’s one of the things that only Adobe Premiere can do.

“This is the first all-SSD array for offline editing,” he continues. “We had to keep the entire movie online in that manner for one simple fact: when Premiere goes to export, it can export with an Nvidia card five times faster than realtime now. When you have eight editors all hitting up a single volume at five times faster than realtime, your average storage system can’t actually hold up under that. Particularly when you go into a multicam scenario where you’re asked to play back five streams of 2304×1152. At certain points the offline editorial system was being hit up for over 3.5GB a second, because of this and the After Effects dynamic link system.”


Visual Effects
Nelson explains that they processed offline and online/VFX files concurrently. For offline they transcoded everything to 2304×1152 ProRes 422 LT QuickTimes, a scaled-down version of the 6K (6144×3072) R3D files. For online/VFX they rendered full-frame 6K DPX files. To keep track of it all they created a “code book.”

“This is not your standard code book,” Nelson states emphatically. The code book carries all the metadata associated with every clip that goes through the system. And by doing so, says Nelson, “it accelerates everything that we do from offline into online. We can take an offline edit, output an EDL or an XML, ingest it into this code book, and it takes all the clips and the metadata associated with them and makes this ‘online package,’ which is rendered out based on specific parameters. In our case, we’ve rendered everything out as RedColor3/RedLogFilm 10-bit DPX image sequences.

“Every single VFX shot that was delivered went through our code book and was tracked with it as well. If I needed to deliver a VFX shot to one of our in-house vendors I can literally have it to them in about five minutes.”

Online systems for post and visual effects included:

- Two HP Z820s: Intel Xeon E5-2697 (12-core, 2.7GHz, released Q3 2013), 256GB DDR3, Nvidia K6000 in a Cubix expansion chassis, two Fusion-io ioFX 1.6TB cards (2.6GB/s sustained), 64TB G-Tech G-Speed eS Pro, Solarflare dual-port 10GbE network adapter.
- Four HP Z820s: Intel Xeon E5-2670 (8-core, 2.6GHz, released Q1 2012), 128GB DDR3, Nvidia K5200, two HP Z Turbo 256GB drives (1.8GB/s sustained), Solarflare dual-port 10GbE network adapter.
- One HP Z820: Intel Xeon (12-core, 2.7GHz), 128GB DDR3, Nvidia K5200, AJA Io 4K, two Fusion-io ioFX 1.6TB cards (2.6GB/s sustained).


For the online shared storage, Brue used an Open Drives Exos with 342TB of storage, a 1TB SSD cache and six Solarflare 10GbE ports. “The online volume was the massive place where we actually had to keep this all,” he says.

Visual effects were done literally at arm’s length from editorial. “They are fantastic compositors,” says Nelson, “and it makes for an amazing collaborative workflow when you have somebody 20 feet away from you who is passionate about making the movie look as good as it can.”

The VFX artists worked directly with 6K DPX files using Adobe After Effects running on the high-end HP Z820s. “For our hero workstations, we used the combination of the Fusion-io cards, which cleared 2.6 gigabytes a second, and the Nvidia K6000s, tied in with the Solarflare adapters and the overall SSD storage network,” describes Brue.

Regarding the Io 4K, he says, “AJA’s Io 4K was used as a reference-grade device, as it allowed us to play back DPX frames without any tearing or artifacts when reviewing VFX shots. The Io 4K was ultimately chosen for the quality level, absolute sync and also for the fact that it can operate in 10-bit.”

Brue says the biggest thing was pushing around 1.8 gigabytes a second in After Effects just to play back. “Whenever we do split comps, whenever we do retimes, whenever we do performance retimes, whenever we do stabilization, every single iteration needs that throughput, and it’s an immense, immense amount of data. That meant we could actually do iterations in a very, very rapid manner to get to a finish state on a 6K film. Working with Fincher, it’s definitely about iterations.”

The visual effects and pre-conform were built as After Effects projects nested into Premiere Pro. The offline changed a lot, says Brue: shots started as dynamically linked After Effects projects inside the Premiere Pro timeline and were slowly replaced by actual final VFX shots. If you zoomed out on the Premiere Pro timeline, those probably made up 70 to 80% of it. Essentially they had online quality in the offline and were able to update the VFX shots whenever they needed to. “We were playing back 6K VFX shots quite regularly, with the 5K extraction for correct framing, for VFX review.”

There is so much more to tell. Stay tuned for Part 2 to this piece.

Gone Girl Photos: Credit Merrick Morton.


2 thoughts on “Pioneering 6K post workflows for ‘Gone Girl’”

  1. Gavin Greenwalt

    Let me recap to see if I understand the workflow fully:
    1) Shoot 6k.
    2) Transcode R3Ds to ProRes LT and edit offline. Store RMD settings in database.
    2b) Edit
    3) Generate XML
    4)
    5) Generate 6k DPXes of shot in REDLine. GPU? Rocket-X?
    6)
    7) Composite in linked AE comp with 6k DPXes
    8) Output 6k Online DPXes from Premiere -> Light Iron

    I would be interested in knowing more about your workflow in steps 4 and 6. How did you mix the offline and online edits? Did you scale up the Offline HD Quicktimes to 6k to maintain a consistent timeline? Once a shot was “onlined” how did you handle changes to the offline edit or once you passed a shot to the online edit for VFX was it ‘trapped’ in the online timeline?

    How did you transfer the repositioning and reframing in the Offline Edit to the Online Edit in step 4/6? Did you modify the EDL and scale up all of the transforms in the XML while swapping the file paths from the MOVs to the DPXs?

    Also were the Offline editors able to collaborate or did they work in separate projects? And how did you log and manage your bins? Did you have a system to share bins between editors?

    Thanks!
    Gavin

  2. Gavin Greenwalt

    Oops, the commenting system filtered out my 4/6 steps as HTML.

    4) Magic Script to Generate REDLine batch operations using Database Metadata
    5) Generate 6k DPXes of shot in REDLine. GPU? Rocket-X?
    6) Magic Script to Generate Premiere Timeline with 6k DPXes and rescale transforms

