
Digging into the dailies workflow for HBO’s Sharp Objects

By Randi Altman

If you have been watching HBO’s new series Sharp Objects, you might have some theories about who is murdering teenage girls in a small Missouri town, but at this point they are only theories.

Sharp Objects revolves around Amy Adams’ character, Camille, a journalist living in St. Louis, who returns to her dysfunctional hometown armed with a deadline from her editor, a drinking problem and some really horrific childhood memories.

Drew Dale

The show is shot in Atlanta and Los Angeles, with dailies out of Santa Monica’s Local Hero and post out of its sister company, Montreal’s Real by Fake. Real by Fake did all the post on the HBO series Big Little Lies.

Local Hero’s VP of workflows, Drew Dale, managed the dailies workflow on Sharp Objects, coming up against the challenges of building a duplicate dailies setup in Atlanta as well as dealing with HBO’s strict delivery requirements — not just for transcoding, but for labeling files and more. Local Hero co-owner Steve Bannerman calls it “the most detailed and specific dailies workflow we’ve ever designed.”

To help cope with such a high level of complexity, Dale turned to Assimilate’s Scratch as the technical heart of his workflow. Since Scratch is a very open system, it was able to integrate seamlessly with all the software and hardware tools that were needed to meet the requirements.

Local Hero’s DI workflow is something that Dale and the studio have been developing for about five or six years and adjusting for each show or film they work on. We recently reached out to Dale to talk about that workflow and their process on Sharp Objects, which was created by Marti Noxon and directed by Jean-Marc Vallée.

Can you describe your workflow with the footage?
Basically, the DIT hands a shuttle RAID (we use either OWC or Areca RAIDs) to a PA, and they’ll take it to our operator. Our operators tend to start as soon as wrap hits, or as soon as lunch breaks, depending on whether you’re doing one or two breaks a day.

We’ll ingest into Scratch and apply the show LUT. The LUT is typically designed by our lead colorist and is based on a node stack in Blackmagic Resolve that we can use on the back end as the first pass of the DI process. Once the LUT is loaded, we’ll do our grades using the CDL protocol, though we didn’t do the grade on Sharp Objects. Then we’ll go through, sync all the audio, QC the footage and make our LTO back-ups.
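For readers unfamiliar with the CDL protocol Dale mentions, an ASC CDL grade is just per-channel slope, offset and power plus a saturation value. The sketch below is a minimal illustration of that math in Python; the numbers are placeholders, not anything from the show.

```python
# Minimal sketch of an ASC CDL-style primary grade on normalized 0-1 RGB.
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Per-channel slope/offset/power, then a saturation adjustment."""
    rgb = np.asarray(rgb, dtype=np.float64)
    out = np.clip(rgb * np.asarray(slope) + np.asarray(offset), 0.0, None) ** np.asarray(power)

    # Saturation per the ASC CDL approach: mix each channel toward Rec.709 luma.
    luma = np.sum(out * np.array([0.2126, 0.7152, 0.0722]), axis=-1, keepdims=True)
    out = luma + saturation * (out - luma)
    return np.clip(out, 0.0, 1.0)

# Example: a slightly warm, lifted look (placeholder numbers only).
print(apply_cdl([0.40, 0.42, 0.45],
                slope=[1.05, 1.00, 0.95],
                offset=[0.01, 0.00, -0.01],
                power=[0.98, 1.00, 1.02],
                saturation=0.9))
```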

What are you looking for in the QC?
Things like crew in the shot, hot pixels, corrupt footage, lens flares, just weird stuff that’s going to cost money on the backend. Since we’re working in conjunction with production a lot of the time, we can catch those things reasonably early; a lot earlier than if you were waiting until editorial. We flag those and say, “This scene that you shot yesterday is out of focus. You should probably re-shoot.” This allows them to adjust more quickly to that sort of thing.

After the QC we do a metadata pass, where we take the embedded information from the WAV files provided by the sound mixer, as well as custom metadata entered by our operator and apply that throughout the footage. Then we’ll render out editorial media — typically Avid but sometimes Premiere or Final Cut — which will then get transferred to the editors either via online connection or shipped shuttle drives. Or, if we’re right next to them, we’ll just push it to their system from our computer using a fiber or Ethernet intranet.
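As an aside, the “embedded information from the WAV files” Dale describes typically lives in an iXML chunk inside the broadcast WAV. Here is a rough Python sketch of pulling track names out of that chunk; the TRACK_LIST/TRACK/NAME layout is the common iXML convention rather than a guarantee for every recorder, and the file name in the example is hypothetical.

```python
# Hedged sketch: walk the RIFF chunks of a broadcast WAV and read track names
# from its iXML chunk, the kind of sound-mixer metadata mentioned above.
import struct
import xml.etree.ElementTree as ET

def read_ixml_track_names(path):
    with open(path, "rb") as f:
        riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                return []                      # no iXML chunk found
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            data = f.read(chunk_size + (chunk_size & 1))   # chunks are word-aligned
            if chunk_id == b"iXML":
                xml_text = data[:chunk_size].decode("utf-8", "replace").strip("\x00").strip()
                root = ET.fromstring(xml_text)
                return [t.findtext("NAME", default="")
                        for t in root.findall(".//TRACK_LIST/TRACK")]

# Example (hypothetical file name):
# print(read_ixml_track_names("A001_T01.wav"))
```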

We’ll also create web dailies. Web dailies are typically H.264s, and those will either get loaded onto an iPad for the director, uploaded to Pix or Frame.io for web review, or both.
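For context, an H.264 web daily of the kind Dale describes can be produced with a stock ffmpeg transcode. The sketch below assumes ffmpeg is on the PATH, and the resolution and quality settings are illustrative, not the show’s actual delivery spec.

```python
# Minimal sketch of transcoding an H.264 web daily with ffmpeg (assumed installed).
import subprocess

def make_web_daily(src, dst):
    subprocess.run([
        "ffmpeg", "-y",
        "-i", src,
        "-vf", "scale=-2:1080",                    # scale to 1080 lines, keep aspect
        "-c:v", "libx264", "-crf", "21", "-preset", "medium",
        "-c:a", "aac", "-b:a", "192k",
        dst,
    ], check=True)

# make_web_daily("35B-01.mov", "35B-01_web.mp4")   # hypothetical clip names
```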

You didn’t grade the dailies on Sharp Objects?
No, they wanted a specific LUT applied; one that was used on the first season of Big Little Lies, and is being used on the second season as well. So they have a more generic look applied, but they do have very specific needs for metadata, which is really important. For example, they require the input of shoot date and shoot day information, so you can track things.

We also ingest track information from WAV files, so when the editor is cutting the footage you can see the individual audio channel names in the edit, which makes cutting audio a lot easier. It also helps sync things up on the backend with the audio mix. As per HBO’s requests, a lot of extra information in the footage goes to the editor.

The show started in LA and then moved to Atlanta, so you had to build your workflow for a second time? Can you talk about that?
The tricky part of working on location is making sure the Internet is set up properly and getting a mobile version of our rig to wherever it needs to go. Then it’s dealing with the hassle of being on location. I came up in the production world in the camera department, so it reminds me of being back on set and being in the middle of nowhere with a lot less infrastructure than you’re used to when sitting at a post house in Los Angeles. Most of the challenge of being on location is finding creative ways to implement the same workflow in the face of these hurdles.

Let’s get back to working with HBO’s specific specs. Can you talk about different tools you had to call on to make sure it was all labeled and structured correctly?
A typical scene identifier for us is something like “35B-01”: “35” signifies the scene, “B” signifies the shot and “01” signifies the take.
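That convention is simple enough to parse programmatically. The sketch below assumes the pattern implied by the example (digits for scene, letters for shot, a two-digit take); it is an illustration, not a formal slate spec.

```python
# Parse a "scene-shot-take" identifier like "35B-01" (pattern assumed from the example).
import re

SLATE_RE = re.compile(r"^(?P<scene>\d+)(?P<shot>[A-Z]+)-(?P<take>\d+)$")

def parse_slate(identifier):
    m = SLATE_RE.match(identifier)
    if not m:
        raise ValueError(f"unrecognized identifier: {identifier}")
    return {"scene": int(m["scene"]), "shot": m["shot"], "take": int(m["take"])}

print(parse_slate("35B-01"))   # {'scene': 35, 'shot': 'B', 'take': 1}
```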

The way that HBO structured things on Sharp Objects was more by setup, so it was a much more fluid way of shooting. It would be like “Episode 1, setup 32, take one, two, three, four, five.” But each of those takes individually was more like a setup and less like a take itself. A lot of the takes were 20 minutes long, 15 minutes long, where they would come in, reset the actors, reset the shot, that kind of thing.

In addition to that, there was a specific naming convention and a lot of specific metadata required by the editors. For example, the aforementioned WAV track names. There are a lot of ways to process dailies, but most software doesn’t provide the same kind of flexibility with metadata as Scratch.

For this show it was these sorts of things, as well as very specific LTO naming conventions and structure, which took a lot of effort on our part to get used to. Typically, with a smaller production or smaller movie, the LTO backups they require are basically just to make sure that the footage is placed somewhere other than our hard drives, so we can store it for a long period of time. But with HBO, very specific manifests are required with naming conventions on each tape as well as episode numbers, scene and take info, which is designed to make it easier for un-archiving footage later for restoration, or for use in later seasons of a show. Without that metadata, it becomes a much more labor-intensive job to track down specific shots and scenes.
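To make the idea of a tape manifest concrete, here is a purely illustrative Python sketch that writes one row per clip with tape label, episode, scene and take. The columns, file names and tape-naming pattern are hypothetical; HBO’s and YoYotta’s actual formats are more detailed.

```python
# Hypothetical per-tape manifest: one CSV row per archived clip.
import csv

def write_manifest(path, tape_label, clips):
    """clips: iterable of dicts with episode, scene, take, filename, size_bytes."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["tape", "episode", "scene", "take", "filename", "size_bytes"])
        writer.writeheader()
        for clip in clips:
            writer.writerow({"tape": tape_label, **clip})

# Illustrative entry only; names do not reflect the show's real media.
write_manifest("SO_EP101_LTO01.csv", "SO_EP101_LTO01", [
    {"episode": "101", "scene": "35", "take": "01",
     "filename": "clip0001.mov", "size_bytes": 12_884_901_888},
])
```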

HBO also requires us to use multiple LTO brands so that if one brand suddenly ceases to support the medium, or a company goes under, they can still un-archive the footage 30 years from now. I think a lot of companies are starting to move toward future-proofing their footage in case you need to go back and remaster it.

Does that make your job harder? Easier?
It makes it harder in some ways, and easier in others. Harder because there is a lot of material being generated. I think the total count for the show was something like 120TB of footage, which is not an excessive amount for a show this big, but it’s definitely a lot of data to manage over the course of a show.

Could you name some of the tools that you used?
As I mentioned, the heartbeat of all our dailies workflows is Scratch. I really love Scratch for three reasons. First, I can use it to do fully color graded, fully animated dailies with power windows, ramping curves — everything. Second, it handles metadata very well. This was crucial for Sharp Objects. And finally, it’s pretty affordable.

Beyond Scratch, the software that we tend to use most for copying footage is Silverstack. We use that for transferring files to and from the RAID to make sure everything’s verified. We use Scratch for processing the footage; that’s sort of the big nexus of everything. We use YoYottaID for LTO creation; that’s what HBO suggests we use to handle their specific LTO requirements. One of the things I love is the ability to export ALEs directly out of Scratch and into YoYottaID. This saves us time and errors. We use Aspera for transferring files back and forth between HBO and ourselves. We use Pix for web daily distributions. Pix access was specifically provided to us by HBO.
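The ALE files Dale mentions are simple tab-delimited text, which is part of why the Scratch-to-YoYotta handoff saves time. Below is a hedged sketch of writing a bare-bones ALE; the columns, frame rate and clip values are illustrative only.

```python
# Hedged sketch of a bare-bones ALE (Avid Log Exchange) writer.
def write_ale(path, rows, fps="23.976", video_format="1080"):
    columns = ["Name", "Tape", "Start", "End", "Scene", "Take"]
    with open(path, "w", newline="") as f:
        f.write("Heading\n")
        f.write("FIELD_DELIM\tTABS\n")
        f.write(f"VIDEO_FORMAT\t{video_format}\n")
        f.write(f"FPS\t{fps}\n\n")
        f.write("Column\n")
        f.write("\t".join(columns) + "\n\n")
        f.write("Data\n")
        for row in rows:
            f.write("\t".join(str(row.get(c, "")) for c in columns) + "\n")

# Hypothetical clip entry for illustration.
write_ale("day_012.ale", [
    {"Name": "A012C004", "Tape": "A012", "Start": "11:02:14:08",
     "End": "11:04:51:12", "Scene": "35B", "Take": "01"},
])
```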

Hardware-wise, we’re mostly working on either Mac Pros or Silverdraft Demon PCs for dailies. We used to use mostly Mac Pros, but we find that they aren’t quite robust enough for larger projects, though they can be useful for mid-range or smaller jobs.

We typically use Flanders monitors for our on-set grading, but we’ve also used Sonys and JVCs, depending on the budget level and what’s available on hand. We tend to use the G-Speed Shuttle XLs for the main on-set RAIDs, and we like to use OWC Thunderbays or Areca Thunderbolt RAIDs for our transfer drives.

What haven’t I asked that is important?
For me it’s important to have tools, operators and infrastructure that are reliable so we can generate trust with our clients. Trust is the biggest thing for me, and the reason we vetted all the software… we know what works. We know it does what we need it to do to be flexible for everybody’s needs. It’s really about just showing the clients that we’ve got their back.

VR Post: Hybrid workflows are key

By Beth Marchant

Shooting immersive content is one thing, but posting it for an ever-changing set of players and headsets is a whole other multidimensional can of beans.

With early help from software companies that have developed off-the-shelf ways to tackle VR post — and global improvements to their storage and networking infrastructures — some facilities are diving into immersive content by adapting their existing post suites with a hybrid set of new tools. As with everything else in this business, it’s an ongoing challenge to stay one step ahead.

Chris Healer

The Molecule
New York- and Los Angeles-based motion graphics and VFX post house The Molecule leapt into the VR space more than a year and a half ago when it fused The Foundry’s Nuke with the open-source panoramic photo-stitching software Hugin. Then, CEO Chris Healer took the workflow one step further. He developed an algorithm that rendered stereoscopic motion graphics spherically in Nuke.

Today, those developments have evolved into a robust pipeline that fuels The Molecule’s work for Conan O’Brien’s eponymous TBS talk show, The New York Times’s VR division and commercial work. “It’s basically eight or ten individual nodes inside Nuke that complete one step or another of the process,” says Healer. “Some of them overlap with Cara VR,” The Foundry’s recently launched VR plug-in for Nuke, “but all of it works really well for our artists. I talk to The Foundry from time to time and show them the tools, so there’s definitely an open conversation there about what we all need to move VR post forward.”

Collaborating with VR production companies like SuperSphere, Jaunt and Pixvana in Seattle, The Molecule is heading first where mass VR adoption seems likeliest. “The New York Times, for example, wants to have a presence at film festivals and new technology venues, and is trying to get out of the news-only business and into the entertainment-provider business. And the job for Conan was pretty wild — we had to create a one-off gag for Comic-Con that people would watch once and go away laughing to the next thing. It’s kind of a cool format.”

Healer’s team spent six weeks on the three-minute spot. “We had to shoot plates, model characters, animate them, composite it, build a game engine around it, compile it, get approval and iterate through that until we finished. We delivered 20 or so precise clips that fit into a game engine design, and I think it looks great.”

Healer says the VR content The Molecule is posting now is, like the Conan job, a slight variation on more typical recent VR productions. “I think that’s also what makes VR so exciting and challenging right now,” he says. “Everyone’s got a different idea about how to take it to the next level. And a lot of that is in anticipation of AR (augmented reality) and next-generation players/apps and headsets.

‘Conan’

“The Steam store,” the premier place online to find virtual content, “has content that supports multiple headsets, but not all of them.” He believes that will soon gel into a more unified device driver structure, “so that it’s just VR, not Oculus VR or Vive VR. Once you get basic head tracking together, then there’s the whole next thing: Do you have a controller of some kind, are you tracking in positional space, do you need to do room setup? Do we want wands or joysticks or hand gestures, or will keyboards do fine? What is the thing that wins? Those hurdles should solidify in the next year or two. The key factor in any of that is killer content.”

The biggest challenge facing his facility, and anyone doing VR post right now, he says, is keeping pace with changing resolutions and standards. “It used to be that 4K or 4K stereo was a good deliverable and that would work,” says Healer. “Now everything is 8K or 10K, because there’s this idea that we also have to future-proof content and prepare for next-gen headsets. You end up with a lot of new variables, like frame rate and resolution. We’re working on a stereo commercial right now, and just getting the footage of one shot converted from only six cameras takes almost 3TB of disk space, and that’s just the raw footage.”

When every client suddenly wants to dip their toes into VR, how does a post facility respond? Healer thinks the onus is on production and post services to provide as many options as possible while using their expertise to blaze new paths. “It’s great that everyone wants to experiment in the space, and that puts a certain creative question in our field,” he says. “You have to seriously ask of every project now, does it really just need to be plain-old video? Or is there a game component or interactive component that involves video? We have to explore that. But that means you have to allocate more time in Unity (https://unity3d.com/) building out different concepts for how to present these stories.”

As the client projects get more creative, The Molecule is relying on traditional VFX processes like greenscreen, 3D tracking and shooting plates to solve VR-related problems. “These VFX techniques help us get around a lot of the production issues VR presents. If you’re shooting on a greenscreen, you don’t need a 360 lens, and that helps. You can shoot one person walking around on a stage and then just pan to follow them. That’s one piece of footage that you then composite into some other frame, as opposed to getting that person out there on the day, trying to get their performance right and then worrying about hiding all the other camera junk. Our expertise in VFX definitely gives us an advantage in VR post.”

From a post perspective, Healer still hopes most for new camera technology that would radically simplify the stitching process, allowing more time for concepting and innovative project development. “I just saw a prototype of a toric lens,” shaped like the donut-like torus that results from revolving a circle in three-dimensional space, “that films 360 minus a little patch, where the tripod is, in a single frame,” he says. “That would be huge for us. That would really change the workflow around, and while we’re doing a lot of CG stuff that has to be added to VR, stitching takes the most time. Obviously, I care most about post, but there are also lots of production issues around a new lens like that. You’d need a lot of light to make it work well.”

Local Hero Post
For longtime Scratch users Local Hero Post, in Santa Monica, the move to begin grading and compositing in Assimilate Scratch VR was a no-brainer. “We were one of the very first American companies to own a Scratch when it was $75,000 a license,” says founder and head of imaging Leandro Marini. “That was about 10 years ago and we’ve since done about 175 feature film DIs entirely in Scratch, and although we also now use a variety of tools, we still use it.”

Leandro Marini

Marini says he started seeing client demand for VR projects about two years ago and he turned to Scratch VR. He says it allows users to do traditional post the way editors and colorists are used to, “with all the same DI tools that let you do complicated paint-outs, visual effects and 50-layer-deep color corrections, Power Windows, in realtime on a VR sphere.”

New Deal Studios’ 2015 Sundance film, Kaiju Fury, was an early project, “when Scratch VR was first really user-friendly and working in realtime.” Now Marini says their VR workflow is “pretty robust. [It’s] currently the only system that I know of that can work in VR in realtime in multiple ways,” which include an equirectangular projection, which gives you a YouTube 360-type of feel, and an Oculus headset view.

“You can attach the headset, put the Oculus on and grade and do visual effects in the headset,” he says. “To me, that’s the crux: you really have to be able to work inside the headset if you are going to grade and do VR for real. The difference between seeing a 360 video on a computer screen and seeing it from within a headset and being able to move your head around is huge. Those headsets have wildly different colors than a computer screen.”

The facility’s — and likely the industry’s — highest profile and biggest budget project to date is Invisible, a new VR scripted miniseries directed by Doug Liman and created by 30 Ninjas, the VR company he founded with Julina Tatlock. Invisible premiered in October on Samsung VR and the Jaunt app and will roll out in coming months in VR theaters nationwide. Written by Dallas Buyers Club screenwriter Melisa Wallack and produced by Jaunt and Condé Nast Entertainment, it is billed as the first virtual reality action-adventure series of its kind.

‘Invisible’

“Working on that was a pretty magical experience,” says Marini. “Even the producers and Liman himself had never seen anything like being able to do the grade, do VFX and do composite and stereo fixes in 3D virtual reality all with the headset on. That was our initial dilemma for this project, until we figured it out: do you make it look good for the headset, for the computer screen or for iPhones or Samsung phones? Everyone who worked on this understood that every VR project we do now is in anticipation of the future wave of VR headsets. All we knew was that about a third would probably see it on a Samsung Gear VR, another third would see it on a platform like YouTube 360 and the final third would see it on some other headset like Oculus Rift, HTC or Google’s new Daydream.”

How do you develop a grading workflow that fits all of the above? “This was a real tricky one,” admits Marini. “It’s a very dark and moody film and he wanted to make a family drama thriller within that context. A lot of it is dark hallways and shadows and people in silhouette, and we had to sort of learn the language a bit.”

Marini and his team began exclusively grading in the headset, but that was way too dark on computer monitors. “At the end of the day, we learned to dial it back a bit and make pretty conservative grades that worked on every platform so that it looked good everywhere. The effect of the headset is it’s a light that’s shining right into your eyeball, so it just looks a lot brighter. It had to still look moody inside the headset in a dark room but not so moody that it vanishes on a laptop in a bright room. It was a balancing act.”

Local Hero

Local Hero also had to figure out how to juggle the new VR work with its regular DI workload. “We had to break off the VR services into a separate bay and room that is completely dedicated to it,” he explains. “We had to slice it off from the main pipeline because it needs around-the-clock custom attention. Very quickly we realized we needed to quarantine this workflow. One of our colorists here has become a VR expert, and he’s now the only one allowed to grade those projects.” The facility upgraded to a Silverdraft Demon workstation with specialized storage to meet the exponential demand for processing power and disk space.

Marini says Invisible, like the other VR work Local Hero has done before, is, in essence, a research project in these early days of immersive content. “There is no standard color space or headset or camera. And we’re still in the prototype phase of this. While we are in this phase, everything is an experiment. The experience of being in 3D space is interesting but the quality of what you’re watching is still very, very low resolution. The color fidelity relative to what we’re used to in the theater and on 4K HDR televisions is like 1980s VHS quality. We’re still very far away from truly excellent VR.”

Scratch VR workflows in Invisible included a variety of complicated processes. “We did things like dimensionalizing 2D shots,” says Marini. “That’s complicated stuff. In 3D with the headset on we would take a shot that was in 2D, draw a rough roto mask around the person, create a 3D field, pull their nose forward, push their eyes back, push the sky back — all in a matter of seconds. That is next-level stuff for VR post.”

Local Hero also used Scratch Web for reviews. “Moments after we finished a shot or sequence it was online and someone could put on a headset and watch it. That was hugely helpful. Doug was in London, Condé Nast in New York. Lexus was a sponsor of this, so their agency in New York was also involved. Jaunt is down the street from us here in Santa Monica. And there were three clients in the bay with us at all times.”

‘Invisible’

As such, there is no way to standardize a VR DI workflow, he says. “For Invisible, it was definitely all hands on deck and every day was a new challenge. It was 4K 60p stereo, so the amount of data we had to push, 4K 60p to both eyes, was unprecedented.” Strange stereo artifacts would appear for no apparent reason. “A bulge would suddenly show up on a wall and we’d have to go in there and figure out why and fix it. Do we warp it? Try something else? It was like that throughout the entire project: invent the workflow every day and fudge your way through. But that’s the nature of experimental technology.”
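A quick back-of-envelope calculation shows why 4K 60p stereo pushes so much data. The sketch below assumes uncompressed 10-bit RGB frames at 4096x2160, which is an upper bound since real codecs compress heavily; it is an illustration, not the production’s actual data rate.

```python
# Rough upper-bound data rate for 4K 60p stereo, assuming uncompressed 10-bit RGB.
width, height = 4096, 2160
bits_per_pixel = 3 * 10          # RGB, 10 bits per channel
fps, eyes = 60, 2

bytes_per_second = width * height * bits_per_pixel * fps * eyes / 8
print(f"{bytes_per_second / 1e9:.1f} GB/s uncompressed")   # ~4.0 GB/s
```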

Will there be a watershed VR moment in the year ahead? “I think it all depends on the headsets, which are going to be like mobile phones,” he says. “Every six months there will be a new group of them that will be better and more powerful with higher resolution. I don’t think there will be a point in the future when everyone has a self-contained high-end headset. I think the more affordable headsets that you put your phone into, like Gear VR and Daydream, are the way most people will begin to experience VR. And we’re only 20 percent of the way there now. The whole idea of VR narrative content is completely unknown and it remains to be seen if audiences care and want it and will clamor for it. When they do, then we’ll develop a healthy VR content industry in Hollywood.”


Beth Marchant has been covering the production and post industry for 21 years. She was the founding editor-in-chief of Studio/monthly magazine and the co-editor of StudioDaily.com. She continues to write about the industry.

Duck Grossberg joins Local Hero as CTO, will grow dailies, VR biz

Santa Monica-based Local Hero, a boutique post facility working on feature and independent films, has hired Duck Grossberg as chief technology officer.

Grossberg, who was most recently at Modern Videofilm, will drive the overall technology vision for Local Hero, as well as expand the dailies part of the studio’s end-to-end workflow services. In addition, Grossberg’s significant virtual reality production and DI experience will also help fuel Local Hero’s rapidly growing VR business.

Grossberg has held a variety of technical roles over the past 15 years, working with facilities such as The Creative Cartel, Deluxe Labs, The Post Group, Modern, Cameron/Pace, Tyler Perry Studios and 20th Century Fox.

As a DIT, digital lab supervisor and colorist (dailies and on-set), Grossberg’s credits include Real Steel, Life of Pi and Dawn of the Planet of the Apes, as well as TV shows such as Dig, Tyrant and Sleepy Hollow.

“Local Hero experienced exponential growth in our core dailies, DI, VFX and finishing business in 2015,” says Leandro Marini, founder/president of Local Hero. “We also saw rapid growth in our VR dailies and finishing business, delivering nearly 20 projects for clients such as Fox, Jaunt Studios and the NFL. The addition of Duck is a crucial component to our expansion at Local Hero. The combination of his technical prowess, creative skills and client experience make him uniquely positioned to help drive our aggressive growth.”

Quick Chat: Assimilate’s Lucas Wilson talks about Scratch Web

Recently, Assimilate launched Scratch Web, a cloud-based multi-user collaboration tool that offers individual clip, timeline or timeline plus version sharing (known as Scratch Construct) as well as native support for industry-standard RAW camera formats.

It’s already in use at Santa Monica’s Local Hero Post, where founder and supervising colorist Leandro Marini has made it a part of his everyday workflow. In fact, keep an eye on this space for a Scratch Web review from Marini.

To find out more about the product itself, we picked the brain of Assimilate’s Lucas Wilson.