Tag Archives: DIT

Building a workflow for The Great Wall

Bling Digital, which is part of the SIM Group, was called on to help establish the workflow on Legendary/Universal’s The Great Wall, starring Matt Damon as a European mercenary imprisoned within the wall. While being held, he sees exactly why the Chinese built this massive barrier in the first place — and it’s otherworldly. This VFX-heavy mystery/fantasy was directed by Zhang Yimou.

We reached out to Bling’s director of workflow services, Jesse Korosi, to talk us through the process on the film, including working with data from the Arri 65, which at that point hadn’t yet been used on a full-length feature film. Bling Digital is a post technology and services provider that specializes in on-set data management, digital dailies, editorial system rentals and data archiving.

Jesse Korosi

When did you first get involved on The Great Wall and in what capacity?
Bling received our first call about The Great Wall from unit production manager Kwame Parker in December of 2014, asking us to provide on-set data management, dailies, VFX and stereo pulls, Avid rentals and a customized process for the digital workflow.

At this time the information was pretty vague, but outlined some of the bigger challenges, like the film being shot in multiple locations within China, and that the Arri 65 camera may be used, which had not yet been used on a full-length feature. From this point on I worked with our internal team to figure out exactly how we would tackle such a challenge. This also required a lot of communication with the software developers to ensure that they would be ready to provide updated builds that could support this new camera.

That started with talks with the DP Stuart Dryburgh, the studio and a few other members of production. A big part of my job, and that of anyone on my workflow team, is to get involved as early as possible, so our role doesn’t necessarily start on day one of principal photography. We want to get in and start testing and communicating with the rest of the crew well ahead of time so that by the first day, the process runs like a well-oiled machine and the client never has to be concerned with “week-one kinks.”

Why did they opt for the Arri 65 camera and what were some of the challenges you encountered?
Many of the people we work with love Arri. The cameras are known for recording beautiful images. Those who aren’t huge Arri fans might dislike the lower resolution of some of the cameras, but it’s very uncommon for someone to dislike the final look of the recorded files. Enter the Arri 65, a new camera that records 6.5K files (6560×3100), with every hour of footage weighing in at a whopping 2.8TB.

When dealing with this kind of data consumption, you really need to re-evaluate your pipeline. The cards can’t be downloaded with traditional card readers — you need to use Codex Vaults. Let’s say someone records three hours of footage in a day — that equals 8.7TB of data. If you’re sending that to another facility, even over a 500Mb/s Internet line, it would take 38 hours! LTO-ing this kind of media is also dreadfully slow. For The Great Wall we ended up setting up a dedicated LTO area with eight decks running at any given time.
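That transfer-time figure is easy to sanity-check. Below is a minimal back-of-the-envelope sketch using the numbers quoted above; it assumes decimal units (1TB = 10^12 bytes) and a line that sustains its full rated speed, which real-world links rarely do.

```python
# Sanity-checking the article's transfer-time math.
# Assumes 1 TB = 10^12 bytes and a link running at full rated speed.

def transfer_hours(terabytes: float, line_mbps: float) -> float:
    """Hours needed to move `terabytes` of data over a `line_mbps` link."""
    bits = terabytes * 1e12 * 8          # TB -> bits
    seconds = bits / (line_mbps * 1e6)   # bits / (bits per second)
    return seconds / 3600

# A day with ~3 hours of Arri 65 footage, as described above:
print(f"{transfer_hours(8.7, 500):.1f} hours")  # 38.7 hours, matching the ~38 quoted
```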

Aside from data consumption, we faced the challenge of having no dailies software that could even read the files. We worked with Colorfront to get a new build that could, and luckily, having been through the same ordeal recording Arri Open Gate on Warcraft, we knew how to make this happen and set the client at ease.

Were you on set? Near set? Remote?
Our lab was located in the production office, which also housed editorial. Considering all of the traveling this job entailed, from Beijing and Qingdao to Gansu, we were mostly working remotely. We wanted to be as close to production as possible, but still within a controlled environment.

The dailies set-up was right beside editor Craig Wood’s suite, making for a close-knit workflow with editorial, which was great. Craig would often pull our dailies team into his suite to view how the edit was coming along, which really helped when assessing how the dailies color was working and referencing scenes in the cut when timing pickup shots.

How did you work with the director and DP?
At the start of the show we established some looks with the DP Stuart Dryburgh, ASC. The idea was that we would handle all of the dailies color in the lab. The DIT/DMT would note as much valuable information on set about the conditions that day and we would use our best judgment to fulfill the intended look. During pre-production we used a theatre at the China Film Group studio to screen and review all the test materials and dial in this look.

With our team involved from the very beginning of these color talks, we were able to ensure that decisions made on color and data flow would track through each department, all the way to the end of the job. It’s very common for color decisions made at the start of a job to get lost in the shuffle once production has wrapped. Plus, sometimes there isn’t anyone available in the post stage who recognizes why certain decisions were made up front.

Can you talk us through the workflow? 
In terms of workflow, the Arri 65 was recording media onto Codex cards, which were backed up on set with a Vault S. After this media was backed up, the Codex card would be forwarded on to the lab. Within the lab we had a Vault XL that would then be used to back this card up to the internal drive. Unfortunately, you can’t go directly from the card to your working drive; you need to do two separate passes on the card, a “Process” and a “Transfer.”

The Transfer moves the media off the card and onto an internal drive on the Vault. The Process then converts all the native camera files into .ari files. Once this media was processed and on the internal drive, we were able to move it onto our SAN. From there we could run the footage through OSD (Colorfront’s On-Set Dailies) and make LTO back-ups. We also made additional back-ups to G-Tech G-Speed Studio drives that were sent back to LA. However, for security purposes as well as efficiency, we encrypted and shipped the bare drives rather than the entire chassis. This meant that when the drives were received in LA, we were able to mount them into our dock and work directly off of them, i.e., no need to wait on any copies.
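For illustration, here is a minimal sketch of the bookkeeping behind that kind of two-pass offload. The function names and checksum-verify step are hypothetical stand-ins, not the Codex Vault’s actual software interface; the real Transfer and Process passes run on the Vault hardware itself.

```python
# Hypothetical sketch of a two-pass card offload: Transfer (copy + verify),
# then Process (convert to .ari) before media moves on to the SAN.
import hashlib
import shutil
from pathlib import Path

def md5sum(path: Path) -> str:
    """Chunked MD5 so multi-gigabyte camera files never load fully into memory."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_pass(card: Path, vault_drive: Path) -> dict[str, str]:
    """Pass 1: copy every file off the card, verifying each against the source."""
    manifest = {}
    for src in sorted(card.rglob("*")):
        if not src.is_file():
            continue
        dst = vault_drive / src.relative_to(card)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        src_sum, dst_sum = md5sum(src), md5sum(dst)
        if src_sum != dst_sum:
            raise IOError(f"checksum mismatch on {src.name}")
        manifest[str(dst)] = dst_sum
    return manifest

def process_pass(vault_drive: Path, san: Path) -> None:
    """Pass 2: stand-in for the native-to-.ari conversion, staging files for the SAN."""
    for f in sorted(vault_drive.rglob("*")):
        if f.is_file():
            dst = san / f.with_suffix(".ari").name
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dst)   # the real Vault converts here; we only copy
```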

Another thing that required a lot of back and forth with the DI facility was ensuring that our color pipeline followed the same path they would take once they hit final color. We ended up having input LUTs for any camera that recorded a non-LogC color space. As for my involvement, during production in China I had a few members of my team on the ground and I was overseeing things remotely. Once things came back to LA and we were working out of Legendary, I became much more hands-on.
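As a rough illustration of that input-LUT routing, the sketch below sends any camera that doesn’t record LogC through a conversion LUT before the shared show look is applied. The camera names and LUT filenames are hypothetical, not the production’s actual configuration.

```python
# Hypothetical input-LUT routing into a common LogC dailies pipeline.
INPUT_LUTS = {
    "arri_65":    None,                   # records LogC natively: no input LUT
    "arri_alexa": None,                   # LogC as well
    "red_epic":   "redlog_to_logc.cube",  # illustrative conversion LUT
    "canon_c500": "clog_to_logc.cube",    # illustrative conversion LUT
}

def luts_for_clip(camera: str, show_lut: str = "show_look.cube") -> list[str]:
    """Ordered LUT chain for dailies color: input LUT (if any), then show look."""
    chain = []
    input_lut = INPUT_LUTS.get(camera)
    if input_lut:
        chain.append(input_lut)
    chain.append(show_lut)
    return chain

print(luts_for_clip("red_epic"))  # ['redlog_to_logc.cube', 'show_look.cube']
print(luts_for_clip("arri_65"))   # ['show_look.cube']
```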

What kind of challenges did providing offline editorial services in China bring, and how did that transition back to LA?
We sent a tech to China to handle the set-up of the offline editorial suites and also had local contacts to assist during the run of the project. Our dailies technicians also helped with certain questions or concerns that came up.

Shipping gear for the Avids is one thing; shipping consoles (desks) for the editors, however, would have been far too heavy. This was probably one of the bigger challenges — ensuring the editors were working with the same caliber of workspace they were used to in Los Angeles.

The transition of editorial from China to LA required Dave French, director of post engineering, and his team to mirror the China set-up in LA and have both up and running at the same time to streamline the process. Essentially, the editors needed to stop cutting in China and have the ability to jump on a plane and resume cutting in LA immediately.

Once back in LA, you continued to support VFX, stereo and editorial, correct?
Within the Legendary office we played a major role in building out the technology and workflow behind what was referred to as the Post Hub. This Post Hub was made up of a few different systems all KVM’d into one desk that acted as the control center for VFX and stereo reviews, VFX and stereo pulls and final stereo tweaks. All of this work was controlled by Rachel McIntire, our dailies, VFX and stereo management tech. She was a jack-of-all-trades who played a huge role in making the post workflow so successful.

For the VFX reviews, Rachel and I worked closely with ILM to develop a workflow to ensure that all of the original on-set/dailies color metadata would carry into the offline edit from the VFX vendors. It was imperative that during these sessions we could add or remove the color, make adjustments and match exactly what they saw on set, in dailies and in the offline edit. Automating this process through values from the VFX editor’s EDL was key.
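As an illustration of that kind of EDL-driven automation, the sketch below pulls ASC CDL values from the common `*ASC_SOP`/`*ASC_SAT` comment convention in a CMX3600-style EDL. It assumes the color rode along as CDL comments; the show’s actual mechanism wasn’t specified.

```python
# Parse ASC CDL slope/offset/power/saturation out of EDL comment lines.
import re

ASC_SOP = re.compile(r"\*\s*ASC_SOP\s*\(([^)]+)\)\s*\(([^)]+)\)\s*\(([^)]+)\)")
ASC_SAT = re.compile(r"\*\s*ASC_SAT\s*([\d.]+)")

def parse_cdl(edl_text: str) -> list[dict]:
    """Return one dict of slope/offset/power/saturation per EDL event."""
    events, current = [], {}
    for line in edl_text.splitlines():
        if sop := ASC_SOP.search(line):
            current["slope"], current["offset"], current["power"] = (
                [float(v) for v in group.split()] for group in sop.groups())
        elif sat := ASC_SAT.search(line):
            current["saturation"] = float(sat.group(1))
            events.append(current)
            current = {}
    return events

sample = """001  A001C001 V  C  01:00:00:00 01:00:05:00 01:00:00:00 01:00:05:00
* ASC_SOP (1.02 0.99 1.00)(0.01 0.00 -0.01)(1.00 1.00 1.05)
* ASC_SAT 0.95
"""
print(parse_cdl(sample))
```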

Looking back on the work provided, what would you have done differently knowing what you know now?
I think the area I would focus on next time around would be upgrading the jobs database. With any job we manage at Bling, we always ensure we keep a log of every file recorded and any metadata that we track. At the time, this was a little weak. Since then, I have been working on overhauling this database so that creatives can access all camera metadata, script metadata, location data, lens data, etc. in one centralized location. We have just used this on our first job in a client-facing capacity, and I think it would have done wonders for our VFX and stereo crews on The Great Wall. All too often, people are digging around for information already captured by someone else. I want to make sure there is a central repository for that data.

Assimilate Scratch and Scratch VR Suite upgraded to V.8.6

Assimilate is now offering an open beta for Scratch 8.6 and the Scratch VR Suite 8.6, the latest versions of its realtime post tools and workflow for VR/360 and 2D/3D content, from dailies through conform, grading, compositing and finishing. Expanded HDR functions are featured throughout the product line, including in Scratch VR, which now offers stitching capabilities.

Both open beta versions give pros the opportunity to actively use the full suite of Scratch and Scratch VR tools, while evaluating and submitting requests and recommendations for additional features or updates.

Scratch Web for cloud-based, realtime review and collaboration, and Scratch Play for immediate review and playback, are also included in the ecosystem updates. Both products support VR/360 and 2D/3D content.

Current users of the Scratch VR Suite 8.5 and Scratch Finishing 8.5 can download the Scratch 8.6 and Scratch VR Suite 8.6 open betas, which are available now.

“V8.6 is a major update for both Scratch and the Scratch VR Suite, with significant enhancements to the HDR and ACES workflows. We’ve added stitching to the VR toolset so that creators have a complete and streamlined end-to-end VR workflow,” says Jeff Edson, CEO at Assimilate. “The open beta helps us to continue developing the best and most useful post production features and techniques all artists need to perfect their creativity in color grading and finishing. We act on all input, much of it immediately and some in regular updates.”

Here are some details of the update:

HDR
• PQ and HLG transfer functions are now an integral part of Scratch color management.
• Scopes automatically switch to HDR mode when needed and show levels on a nit scale, highlighting any reference level you set.
• At the project level, define the HDR mastering metadata: color space, color primaries and white levels, luminance levels and more. The metadata is automatically included in the Video HDMI interface (AJA, BMD, Bluefish444) for display.
• A function calculates static luminance metadata such as MaxCLL and MaxFALL (a worked example follows this list).
• HDR footage can be published directly to YouTube with HDR metadata.
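As a worked example of that MaxCLL/MaxFALL calculation, the sketch below assumes per-pixel luminance has already been converted to nits (e.g., via the PQ EOTF); the frame data is purely illustrative.

```python
# MaxCLL: brightest pixel in any frame. MaxFALL: highest frame-average
# light level. Both in nits, computed over the whole program.
def max_cll_fall(frames: list[list[float]]) -> tuple[float, float]:
    max_cll = max(max(frame) for frame in frames)
    max_fall = max(sum(frame) / len(frame) for frame in frames)
    return max_cll, max_fall

frames = [                    # three tiny "frames" of pixel luminances in nits
    [120.0, 80.0, 95.0],
    [900.0, 110.0, 100.0],    # a single bright highlight spikes MaxCLL
    [200.0, 190.0, 210.0],    # an overall bright frame drives MaxFALL
]
print(max_cll_fall(frames))   # (900.0, 370.0)
```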

VR/360 – Scratch VR Suite
• 360 stitching functionality: load all your source media from your 360 cameras into Scratch VR and combine it into a single equirectangular image. Support for camera stitch templates: AutoPano projects, Hugin and PTStitch scripts.
• Ambisonic Audio: Scratch VR can load, set and play back ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles to the existing 2D-equirectangular feature for more easily positioning 2D elements in a 360 scene (see the sketch below for the underlying mapping).
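For a sense of the math behind equirectangular stitching and 2D-overlay placement, here is a minimal sketch mapping a 3D view direction to pixel coordinates in an equirectangular frame. Axis and orientation conventions vary between tools; this is one common choice, not necessarily Scratch VR’s internals.

```python
# Map a 3D view direction to (u, v) pixels in an equirectangular image.
import math

def direction_to_equirect(x: float, y: float, z: float,
                          width: int, height: int) -> tuple[float, float]:
    r = math.sqrt(x * x + y * y + z * z)
    lon = math.atan2(x, z)    # -pi..pi, rotation around the vertical axis
    lat = math.asin(y / r)    # -pi/2..pi/2, up/down
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead (+z) lands in the center of a 4K x 2K frame:
print(direction_to_equirect(0, 0, 1, 4096, 2048))  # (2048.0, 1024.0)
```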

DIT Reporting Function
• Create a report of all clips in a timeline, a project or just a selection of shots (a minimal sketch follows this list).
• Reports include a thumbnail and metadata such as clip name, timecode, scene, take, comments and any other metadata attached to a clip.
• Choose from predefined templates or create your own.
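A minimal sketch of that kind of clip report, with illustrative field names; Scratch’s own templates define their own columns:

```python
# Write a simple per-clip metadata report to CSV.
import csv

CLIPS = [
    {"clip": "A001C001", "timecode": "01:00:00:00", "scene": "12A",
     "take": "3", "comments": "hero take"},
    {"clip": "A001C002", "timecode": "01:00:45:10", "scene": "12A",
     "take": "4", "comments": "boom in frame"},
]

def write_report(clips: list[dict], path: str) -> None:
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(clips[0].keys()))
        writer.writeheader()
        writer.writerows(clips)

write_report(CLIPS, "dit_report.csv")
```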

This DIT talks synchronization and action scenes

By Peter Welch

As a digital imaging technician on feature films, I work closely with the director of photography, camera crew, editorial and visual effects teams to make sure the right data is collected for post production to transform raw digital footage into a polished movie as efficiently as possible.

A big part of this role is to anticipate how things could go wrong during a shoot, so I can line up the optimal combination of technical kit to minimize the impact should the worst happen. With all the inherent unpredictability that comes with high-speed chases and pyrotechnics, feature film action sequences can offer up some particularly tough challenges. These pivotal scenes are incredibly costly to get wrong, so every technological requirement is amplified. This includes the need to generate robust and reliable timecode.

I take timecode very seriously, seeing it as an essential part of the camera rather than a nice-to-have add-on. Although it doesn’t really affect us on set, further down the line, a break in timecode can cause other areas a whole world of problems. In the case of last year’s Avengers: Age of Ultron, creating the spectacular scenes and VFX we’ve come to expect from Marvel involved developing solid workflows for some very large multi-camera set-ups. For some shots, as many as 12 cameras were rolling with a total camera package of 27 cameras, including Arri Alexa XTs, Canon C500s with Codex recorders, Red Epics and Blackmagic cameras. The huge amounts of data generated made embedding accurate, perfectly synced timecode into every piece of footage an important technical requirement.

One of the largest action sequences for Age of Ultron was filmed in Korea with eight cameras rigged to capture footage — four Arri Alexas and four Canon C500s — and huge volumes of RAW output going to Codex recorders. With this shoot, there was a chance that cameras could be taken out while filming, putting footage at risk of being lost. As a result, while the Alexas were strategically rigged a safe distance from the main action, the less costly C500s were placed in and around the explosion, at an increased risk of being caught in the line of fire.

As an added complication, once the set was built, and definitely once it was hot with explosives, we couldn’t go back in to adjust camera settings. So while I was able to manually jam-sync the Alexas, the C500s had to be set to record with timecode running at the point of rigging. There wasn’t an opportunity to go back later and re-jam midway through the day — they had to stay in sync throughout, whatever twists and turns the filming process took.

With the C500 cameras placed in strategic positions to maximize the action, the Codex recorders, Preston MDRs and power were built into recording and camera control boxes (or ‘safe boxes’) and positioned at a distance from the cameras and then connected via a bespoke set of cables. Within each C500’s “safe box,” I also placed a Timecode Systems Minitrx+ set in receive mode. This was synced over RF to a master unit back outside of the “hot” zone.

With an internal Li-Polymer battery powering it for up to 12 hours, the Minitrx+ units in the C500 “safe boxes” could be left running throughout a long shooting day with complete confidence and no requirement for manual jamming or resetting. This set-up ensured all footage captured by the C500s in the “hot” zone was stamped with the same frame-accurate timecode as the Alexas. The timecode could also be monitored via the return video signal’s embedded SDI feed.

But it’s not just the pyrotechnics that inject unpredictability into shooting this kind of scene — the sheer scale of the locations can be as much of a challenge. The ability to synchronize timecode over RF definitely helps, but even with long-range RF it’s good to have a backup. For example, for one scene in 2015’s Spectre, 007 piloted a motorboat down a sizeable stretch of the Thames in London. For this scene, I rigged one camera with a Minitrx+ on a boat in Putney, powered it up and left it onboard filming James Bond. I then got in my car and raced down the Embankment to Westminster to set up the main technical base with the camera crews, with a Timecode Systems unit set to the same timecode as that on the boat.

Even though the boat started its journey out of range of its paired unit, the crystal inside the Minitrx+ continued to feed timecode to that camera accurately. As soon as the boat drifted into range, it synced back to the master unit again with zero drift. There was no need to reset or re-jam.
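The reason a unit can hold sync while out of range comes down to oscillator accuracy: clock error in parts per million translates directly into timecode drift over a shooting day. A rough sketch of that arithmetic, using an assumed 0.5ppm figure rather than the Minitrx+’s actual spec:

```python
# Frames of timecode drift accumulated by a free-running clock.
def drift_frames(ppm: float, hours: float, fps: float = 25.0) -> float:
    seconds = hours * 3600
    return seconds * (ppm / 1e6) * fps

# Over a 12-hour day at 25fps, an assumed 0.5 ppm clock stays under a frame:
print(f"{drift_frames(0.5, 12):.2f} frames")  # 0.54 frames
```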

Action sequences are certainly getting more ambitious, with footage being captured from a growing number and variety of camera sources. Although it’s possible to fix sync problems in post, it’s time consuming and expensive. Getting it right at the point of shooting offers considerable efficiencies to the production process, something that every production demands — even those working with superhero budgets.

Peter Welch is a London-based digital imaging technician (DIT) with Camera Facilities.

Quick Chat: DP Dejan Georgevich, ASC

By Randi Altman

Long-time cinematographer Dejan Georgevich, ASC, has been working in television, feature film production and commercials for over 35 years. In addition to being on set, Georgevich regularly shares his experience and wisdom as a professor of advanced cinematography at New York’s School of Visual Arts.

Georgevich’s TV credits include the series Mercy, Cupid, Hope & Faith, The Book of Daniel and The Education of Max Bickford. In the world of documentaries, he has worked on HBO’s Arthur Ashe: Citizen of the World, PBS’ A Wayfarer’s Journey: Listening to Mahler and The Perfumed Road.

One of his most recent projects was as DP on Once in a Lifetime, a 30-minute television pilot about two New Jersey rockers trying to make it in the music business. The show’s musical roots are real — Once in a Lifetime was written by Iron Maiden’s bass player and songwriter, Stephen Harris.

Georgevich, who was in Australia on a job, was kind enough to use some of his down time to answer our questions about shooting, lighting, inspiration and more. Enjoy…

How did you decide TV production and cinematography, in particular, would be your path?
Perhaps it all started when I hauled around a Bell & Howell projector half my size in elementary school, showing films to an assembly of kids transfixed by a giant screen. Working on the stage crew in middle school revealed to me that I was like “a fish to water” when it came to lighting.

You work on a variety of projects. How does your process change, if at all, going from a TV spot to a TV series to a documentary, etc.?
Each genre informs the other and has made me a better storyteller. For example, my work in documentaries demands being sensitive to anticipating and capturing the moment. The same skills translate perfectly when shooting dramas, which require making the best choices that visually express the idea, mood and emotion of a scene.

How do you decide what is the right camera for each job? Or do you have a favorite that you use again and again?
I choose a camera that offers the widest dynamic range, renders lovely skin tones and a natural color palette, and is user-friendly and ergonomic in handling. My camera choice will also be influenced by whether the end result will be projected theatrically or shown on a small screen.

Once in a Lifetime

You used the Panasonic Varicam 35 on the TV pilot Once in a Lifetime. Why was this the right camera for this project, and was most of the shooting outdoors?
Once in a Lifetime was an independently financed TV pilot, on a tight schedule and budget, requiring a considerable amount of shooting in low-light conditions. This production demanded speed and a limited lighting package because we were shooting on-location night interiors/exteriors, including nightclubs, rooftops, narrow tenement apartments and dimly lit city streets. The Panasonic Varicam 35’s dual native ISOs of 800 and 5000 provided unbelievable image capture in low-light conditions, rendering rich blacks with no noise!

What were some of the challenges of this project? Since it was a pilot, you were setting a tone for the entire series. How did you go about doing that?
The biggest challenge for me was to “re-educate my eye” working with the Panasonic Varicam 35, which sees more than what my eye sees, especially in darkness. To my eye, a scene would look considerably under-lit at times, but surprisingly the picture on the monitor looked organic and well motivated. I was able to light predominantly with LEDs and low-wattage lights augmenting the practicals or, in the case of the rooftop, the Manhattan night skyline. House power and/or portable putt-putt generators were all that was necessary to power the lights.

The pilot’s tone, or look, was achieved using the combination of wide-angle lenses and high-contrast lighting, not only with light and shadow but with evocative primary and secondary colors. This is a comedic story about two young rockers wanting to make it in the music business and their chance meeting with a rock ’n’ roll legend offering the real possibility of fulfilling their dreams.

How did you work with the DIT on this project, and on projects in general?
I always prefer and request a DIT on my projects. I see my role as the “guardian of the image,” and having a DIT helps preserve my original intent in creating the look of the show. In other words, with the help of my DIT, I like to control the look as much as possible in-camera during production. I was very fortunate to have Dave Satin as my DIT on the pilot — we have worked together for many years — and it’s very much like a visual pitcher/catcher type of creative relationship. What’s more, he’s my second set of eyes and technical insurance against any potential digital disaster.

Can you talk about lighting? If you could share one bit of wisdom about lighting, what would it be?
As with anything to do with the arts, I believe that lighting should be seamless. Don’t wear it on your sleeve. Keep it simple… less is best! Direction of light is important as it best describes a story’s soul and character.

What about working with colorists after the shoot. Do you do much of that?
As a DP, I believe it’s critically important that we are active participants in post color correction. I enjoy outstanding collaborations with some of the top colorists in the business. In order to preserve the original intent of our image we, as directors of photography, must be the guiding hand through all phases of the workflow. Today, with the advent of digital image capture, the cinematographer must battle against too many entities that threaten to change our images into something other than what was originally intended.

What inspires you? Fine art? Photography?
I make it a point to get my “creative fix” by visiting art museums as often as possible. I’m inspired by the works of the Grand Master painters and photographers — Rembrandt, Vermeer, Caravaggio, Georges de la Tour, Edward Hopper, Henri Cartier-Bresson, William Eggleston — too many to name! Recreating the world through light and perspective is magical and a necessary reminder of what makes us alive!

What haven’t I asked that you feel is important to talk about?
We’re currently experiencing a digital revolution that is being matched by an emerging revolution in lighting (i.e. LED technology). The tools will always change, but it’s our craft reflecting the heart and mind that remains constant and so important.

Assimilate’s Scratch 8.3 and Scratch Lab now in one toolset

Scratch 8.3 from Assimilate is a cloud-based ecosystem of digital cinema tools for DITs, DPs, directors, editors, colorists, post artists, and other creative pros. Scratch 8.3 integrates the full Scratch DI workflow and Scratch Lab for on-set and VFX dailies into one toolset for $650 per year. Filmmakers and artists now have an uninterrupted Scratch workflow with a single user interface for consistent color throughout the filmmaking process.

The Scratch 8.3 ecosystem also includes Scratch Web and Scratch Play 8.3 for realtime publishing and sharing of native RAW camera files or other media with the entire creative team — in any format and any resolution at any time, anywhere in the world. Unicode is now included to support the high-growth, budget-limited Asian markets.

All current Scratch and Scratch Lab customers on subscription or active support receive an upgrade to Scratch 8.3 at no charge, as well as a Scratch Web channel to publish RAW and other media data for collaboration and review.

On-Set: VFX data gathering with Arri Alexa

By Hasraf “HaZ” Dulull

A few months ago I was VFX supervising on location for a rather large-scale commercial shot on the lovely Arri Alexa.

Whenever I am doing on-set VFX supervising, one of the many things I do is take notes on the lens used, the height of the jib crane, f-stops, FOV, etc. To do this I usually hassle the camera assistants for that info and then scribble it down as we move on to the next shot and setup.

But when things start changing, such as different takes with different angles and lens or new setups made on the fly, it can be a nightmare to keep on top of all the camera info.
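One way to stay on top of all that is a simple structured log filled in per setup. Here is a minimal sketch, with illustrative field names; a real on-set log would track whatever the VFX team needs per shot.

```python
# A per-setup camera metadata log for on-set VFX data gathering.
from dataclasses import dataclass, asdict

@dataclass
class CameraSetup:
    shot: str
    lens_mm: float
    f_stop: float
    camera_height_m: float
    fov_deg: float | None = None   # often derived later from lens + sensor
    notes: str = ""

log: list[CameraSetup] = []
log.append(CameraSetup(shot="010_020", lens_mm=32, f_stop=2.8,
                       camera_height_m=4.5, notes="jib, take 3, new angle"))
print(asdict(log[0]))
```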

So on this particular commercial shoot, as I passed the DIT guy, I noticed he was dumping the…

Review: Assimilate’s Scratch Play

By Keith Putnam

Recently, postPerspective contacted me in my role as a “working DIT” to test drive the relatively new (and free) offering Scratch Play in a production environment. Having now auditioned it on a couple of jobs and on two different computer platforms, I can offer my thoughts.

Prior to this, I had never really used any Assimilate software. My exposure to Scratch was limited to the post environment as a finishing tool for final color grades and deliverables creation. Some of my DIT colleagues use Scratch on a daily basis as their go-to on-set workflow center, but I tend to employ a combination of Colorfront Express Dailies…