Category Archives: post production

Life is but a Streambox

By Jonathan Abrams

My interest in Streambox originated with their social media publishing capabilities (Facebook Live, YouTube Live, Twitter). I was shuttled to an unsecured, disclosed location (Suite 28140 at The Venetian) for a meeting with Tony Taylor (business development manager) and Bob Hildeman (CEO), where they were conducting user-focused presentations within a quiet and relaxing setting.

The primary use for Streambox in post production is live editorial and color review. Succinctly, it’s WebEx for post. A majority of large post production facilities use Streambox for live review services. It allows remote editorial and color grading over the public Internet at mezzanine quality.

The process starts with either a software or hardware encoder. With the software encoder, you need to have your own I/O. As Bob mentioned this, he reached for a Blackmagic Design Mini Converter. The software encoder is limited to 8 bits. They also have two hardware encoders that occupy 1RU: one works with 4K video, and a new one shipping in June uses a new version of their codec and works with 2K video. The 2K encoder will likely receive a software upgrade eventually that enables it to work with 4K. All of their hardware encoders operate at 10-bit with 4:2:2 sampling and have additional post-specific features, including genlock, frame sync, encryption and IFB audio talkback. Post companies offering remote color grading services use a hardware encoder.

Streambox uses a proprietary ACT (Advanced Compression Technology) L3/L4 codec and LDMP (Low Delay Multi Path) protocol. For HD and 2K contribution over the public Internet, their claim is that the ACT-L3/L4 codec is more bandwidth- and picture-quality-efficient than H.264 (AVC), H.265 (HEVC) and JPEG 2000. The codec’s low and, most importantly, sustained latency comes from its use of LDMP video transport. The software and hardware decoders have about two seconds of latency, while the web output (browser) latency is 10 seconds. You can mix and match encoders and decoders; put another way, you could use a hardware encoder and a software decoder.

TCP (Transmission Control Protocol), which is used for HTTP data transfer, is designed to have the receiving device confirm to the sender that it received each packet. This acknowledgment traffic creates overhead that reduces the bandwidth available for data transmission.

With forward error correction (FEC), recovered packets display artifacts (macroblocking, buffering) when network saturation becomes problematic during playback. This does not generally affect lower-bandwidth streams that use a caching topology for network delivery, but for persistent streaming of video above 4Mb/s the problem becomes apparent because of the large bandwidth needed for high-quality contribution content. UDP (User Datagram Protocol) eliminates this overhead, at the cost of packets that were not delivered being unrecoverable. Streambox uses UDP to send its data, and the decoder can detect and request lost packets. This keeps the transmission overhead low while eliminating lost packets. If you do have to limit your bandwidth, you can set a bitrate ceiling and not have to account for overhead. Streambox supports AES-128 encryption as an add-on, and the key length can be higher (192 or 256 bits).
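The detect-and-request pattern described above can be sketched in a few lines. This is a toy model, not Streambox’s actual protocol: the sender buffers recent packets by sequence number, and the receiver spots gaps in the sequence and asks for only the missing packets, so there is no per-packet acknowledgment overhead as with TCP.

```python
# Toy sketch of NACK-based recovery over an unreliable transport.
# (Illustrative only; not Streambox's proprietary LDMP protocol.)

class Sender:
    def __init__(self):
        self.buffer = {}       # seq -> payload, kept for retransmission
        self.next_seq = 0

    def send(self, payload):
        seq = self.next_seq
        self.buffer[seq] = payload
        self.next_seq += 1
        return (seq, payload)  # the datagram that goes on the wire

    def retransmit(self, seq):
        return (seq, self.buffer[seq])

class Receiver:
    def __init__(self):
        self.received = {}
        self.highest = -1

    def on_datagram(self, datagram):
        """Store a packet; return the list of missing seqs (the NACKs)."""
        seq, payload = datagram
        self.received[seq] = payload
        self.highest = max(self.highest, seq)
        return [s for s in range(self.highest + 1) if s not in self.received]

sender = Sender()
packets = [sender.send(b"frame%d" % i) for i in range(5)]
rx = Receiver()
rx.on_datagram(packets[0])
missing = rx.on_datagram(packets[3])   # packets 1 and 2 were "lost"
for seq in missing:                    # receiver NACKs, sender resends
    rx.on_datagram(sender.retransmit(seq))
assert sorted(rx.received) == [0, 1, 2, 3]
```

The key design point is that the receiver only speaks up when something is missing, which is why the overhead stays low compared with TCP’s constant acknowledgments.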

Streambox Cloud allows the encoder to connect to the geographically closest cloud out of 10 sites available and have the data travel in the cloud until it reaches what is called the last mile to the decoder. All 10 cloud sites use Amazon Web Services, and two of those cloud sites also use Microsoft Azure. The cloud advantage in this situation is the use of global transport services, which minimize the risk of bandwidth loss while retaining quality.
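As a rough illustration of the “geographically closest cloud” idea, a client could probe each ingest site’s round-trip time and stream to the fastest responder. The site names and timings below are invented for illustration; they are not Streambox’s actual site list.

```python
# Hypothetical sketch of closest-site selection: measure round-trip
# time to each candidate ingest point and pick the lowest.

def pick_ingest_site(rtts_ms):
    """Return the site with the lowest measured round-trip time (ms)."""
    return min(rtts_ms, key=rtts_ms.get)

# Invented probe results for a client on the US west coast.
probes = {"us-west": 42.0, "us-east": 88.5, "eu-west": 150.2, "ap-east": 210.9}
assert pick_ingest_site(probes) == "us-west"
```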

Streambox has a database-driven service called Post Cloud that is evolving from broadcast-centric roots. It is effectively a v1 system, with post-specific reports and functionality added and broadcast-specific options stripped away. This is also where publishing to Facebook Live, YouTube Live and Twitter happens. After providing your live publishing credentials, Streambox manages the transcoding for the selected service. The publishing functionality does not prevent select users from establishing higher quality connections. You can have HQ streams to hardware and software decoders running simultaneously with a live streaming component.

The cloud effectively acts as a signal router to multiple destinations. Streamed content can be recorded and encrypted. Other cloud functionality includes realtime stitching of Ricoh Theta S camera outputs for 360º video.


Jonathan Abrams is Chief Technical Engineer at NYC’s Nutmeg Creative.

A virgin’s view of NAB

By Nolan Jennings

I have a confession: I am 28 years old, have lived in Los Angeles for over six years and had not been to Las Vegas until this past week. My naiveté concerning Las Vegas is generally a well-kept secret among my millennial peers, who seem to consider a debaucherous Las Vegas weekend a requisite validation of 21st century American life.

In addition, this was my first experience attending the National Association of Broadcasters convention (NAB). The simultaneous exposure to these two monolithic experiences was almost too great to bear, but I can happily report that I am safely back in Los Angeles, giddily recalling all of the incredible experiences I had over the course of the past four days.

Below, I’d love to share with you some background on how I arrived at NAB, and a few of the cool things I witnessed there.

How I Got There: An Introduction to Blue Collar Post Collective
I would not have been able to attend NAB without the wonderful generosity of the Blue Collar Post Collective (BCPC). For those not aware, BCPC is an organization of post production artists. Initially founded in New York City, BCPC has branched out to Los Angeles in recent years, and now boasts a bi-coastal membership representing every facet of the post industry.

Here I am (center) with the two other BCPC members who were sent to NAB: Eugene Vernikov and Tara Pennington.

BCPC hosts networking events as well as educational seminars, connecting various arms of the industry in order to promote awareness across all facets of post, including editorial, motion graphics, VFX, color, etc. If you haven’t been to one of these events, do yourself a favor and go to their website or sign up on the BCPC Facebook page to learn more and find their next one.

In addition to educational seminars, they are now offering the Professional Development Accessibility Program (PDAP), which assists folks like me who would not otherwise be able to take advantage of opportunities like NAB. Through this program, two other talented post pros and I were able to travel to Las Vegas, stay there for four days and take advantage of quite literally every opportunity available.

PART 1: Arrival
We touched down at McCarran International Airport. I exited the gate and was immediately confronted with slot machines. Slot machines! In the terminal! Any illusions I had about my peers’ exaggerations of Vegas were immediately decimated. Thus initiated, I hired a cab to take me to the front steps of the Las Vegas Convention Center. The car pulled up, and I was struck again by Vegas bravado. A giant window wrap advertising DaVinci Resolve 14 covered the entire face of the South Hall. Hordes of attendees streamed toward the entrance sporting their NAB badges. I emerged from the cab and tentatively joined them, holding my backpack close and clutching my pre-registered badge.

Once inside the South Hall I headed for a talk hosted by Rob Legato, the three-time Oscar-winning VFX supervisor whose most recent win was for Jon Favreau’s adaptation of The Jungle Book. Say what you will about the movie as a narrative, but I dare anyone to challenge its visual sophistication. Legato talked candidly about the process required to bring the movie to life and showed extensive clips that revealed the extent of previs that preceded actual production, which was completed entirely on sound stages in downtown Los Angeles.

In particular, he emphasized the advent of virtual previsualization, and the benefits this process can provide to not only big-budget productions, but also micro projects due to the ease of use and rapidly sinking costs of HD cameras and previs software. The chance to witness those evolutions in Legato’s talk was a quantum leap forward in my understanding of the current state of affairs in post.

Following Legato’s presentation, I attended talks and classes with experts in various fields, such as motion graphics, virtual and augmented reality, and machine learning before finally checking into my hotel and attending a BCPC meet-up.

Part 2: NAB’s Show Floor
On the morning of day two I returned to the convention center and began to truly acquaint myself with the show. NAB is roughly divided into three types of experiences.

The Show Floor
You could arguably spend your entire time walking the show floor (and accumulating thousands of Fitbit steps). The variety of technology was absolutely astounding, and experts from each company were on hand to answer any questions you might have. The VR and AR demonstrations were some of the most fun to play with. Studios such as Digital Domain and companies like Nokia with its Ozo camera are making some terrific advances in the field, particularly in the realm of live-event VR broadcasting.

Another standout was Adobe and its Character Animator, an animation application that uses facial recognition technology to animate an illustration based on the acting you perform in front of your computer’s camera. If you have a Creative Cloud subscription, this comes bundled with After Effects, and I highly recommend playing around with it. You don’t need any experience with animation in order to have a blast playing with Character Animator.

The Classes
There were also many classes taught during NAB, covering everything from HTML 5 animations to motion graphics to assistant editing in TV/film and beyond. The classes were taught by experts in their respective fields who were all very happy to answer questions and talk after the sessions were done.

I spent much of my time learning the various tools used in documentary editing, After Effects tricks and how video can be creatively and interactively incorporated into HTML 5 using cinemagraphs and other techniques. I knew many of the instructors from online tutorials and sites such as School of Motion. It was a very nerdy sort of star-struck feeling.

The Panels
Earlier I mentioned Rob Legato’s talk on The Jungle Book. His presentation was extraordinary, but it was only one of many amazing panels. Stand-outs included the editorial team from the new movie Logan, who spoke at length about their experience working on the movie. From pre-production to dailies ingest to test screenings to final delivery, their insights into the art of editing were enthralling. They showed several clips and discussed the various stages of the cut, as well as the discussions and even arguments that occurred over certain choices made in the edit. For an assistant editor such as myself who aspires to work on a project like Logan, this was a very special experience.

Other panels covered the emerging distribution landscape in VR entertainment, the cinematic innovations required to bring Ghost in the Shell to life, and a presentation of Big Little Lies hosted by Avid. These talks were indispensable elements of my NAB experience, and I’ll be going over the notes I took for months to come.

Part 3: Vendors
Through the auspices of Blue Collar Post Collective as well as postPerspective, I was able to meet with a few particular vendors who gave me a comprehensive tour of their new products.

One of these was Christie. In the world of post, there is always a heavy focus on the tools that allow us to create wonderful content, but there is often little attention paid to the tools that shepherd that content into the eyes of audiences around the world. What good is a finely crafted story if it never reaches an audience, or reaches an audience in the wrong way? Christie offers an impressive line-up of industry-leading projectors. If you have doubts as to the quality of their product, look no further than James Cameron’s endorsement. He’s using their RGB laser projection series to ensure his images are up to snuff.

JMR have been creating storage solutions for years, and their innovations continue to push the boundaries of speed and mobility for post pros. As an example, their Lightning LTNG-XD-8-MMDT uses a Mac mini integration to power an extremely fast and portable system that can provide up to 64TB of native disk storage capacity for your DI cart or other on-site production workflows.

I was lucky enough to attend Canon’s NAB 2017 dinner, where Canon showed its guests their new Compact-Servo lenses — 4K Super 35mm lenses that deliver the quality of cinema lenses at a fraction of the price and with all the ease of use that you typically associate with photography lenses. Additionally, they showed footage of the aurora borealis shot in Alaska, and the images they were able to get in an incredibly low-light environment were astounding, with extremely little noise in the image. I personally have been using Canon DSLRs for years, and this presentation convinced me that my love for Canon won’t be ending anytime soon.

Part 4: Au Revoir
The finale in my NAB experience was the 16th Annual Las Vegas SuperMeet, with presentations by Blackmagic, Adobe, HP and many others. These presentations were punctuated by raucous raffle announcements, with winners running up to the stage with their ticket in hand, jumping and screaming in delight as they walked away with some seriously valuable prizes, including a full DaVinci Resolve system.

These events were impressive, but the highlight for me was an interview with Dody Dorn, ACE, editor of many of my favorite films, including Memento. Her insights into the craft of editing were extraordinary. She focused entirely on character and storytelling, answering questions with a sense of humor and humility that belied her extreme talents and accomplishments.

I walked away from the SuperMeet wishing I could stay at NAB for the entire week. Unfortunately, duty called. I was on my way to the airport, back to Los Angeles where a list of director notes sat in my editing bay, ready to be addressed. I can’t emphasize enough how impactful the NAB experience was. My perspective on our industry shifted radically, and my knowledge base expanded more in four days than over the past year. I look forward to returning next year, the year after that, and so on. Au revoir NAB — see you next time.
——-
Nolan Jennings is an LA-based assistant editor currently working on Season 5 of The Fosters for Freeform.


Xytech intros MediaPulse Sky mobile interface

Xytech will be at NAB with its new MediaPulse Sky, a mobile interface for its MediaPulse facility management platform. The company is also showing over 400 new features for MediaPulse, a platform used by studios, networks, media companies and post facilities for functions such as scheduling, asset management, resource allotment, personnel, equipment and financials.

Expanding access and mobility for users, MediaPulse Sky is available for all portable devices. Greg Dolan, COO of Xytech, says, “Full and timely access to data is standard in today’s media business. We developed Sky to give our clients secure access to what they need.”

Configurable through MediaPulse Layout Editor, the MediaPulse Sky interface is available with new dashboards and charts for key performance indicators. Additionally, MediaPulse Sky is optimized for cellular networks. Users can actualize orders, confirm crewing assignments, provision video feeds, schedule sessions and review assignments wherever they may be, and with whatever device they choose.

MediaPulse Transmission 2017 also has a new, configurable operation screen with on-screen alerts and notifications. A new auto-routing feature allows users to find the best routes between two points quickly and leverages the MediaPulse Rules Engine to add user-defined criteria to the routing choices. MediaPulse Transmission offers enterprise-class operations and financial management tools designed specifically for the transmission environment, and the new features help to manage the complexities of circuits for the broadcast sector.

At NAB, Xytech will also demonstrate an automated camera-to-distribution workflow encapsulating metadata management, order management, transcoding and quality control, all performed without requiring user actions. In addition, the company’s MediaPulse development platform now offers the ability to integrate easily with any other system in the client ecosystem, enabling MediaPulse to act as a single console.


A chat with Emmy-winning comedy editor Sue Federman

This sitcom vet talks about cutting Man With A Plan and How I Met Your Mother.

By Dayna McCallum

The art of sitcom editing is widely enjoyed and underappreciated. While millions of people literally laugh out loud every day enjoying their favorite situation comedies, very few give credit to the maestro behind the scenes: the sitcom editor.

Sue Federman is one of the best in the business. Her work on the comedy How I Met Your Mother earned three Emmy wins and six nominations. Now the editor of CBS’ new series, Man With A Plan, Federman is working with comedy legends Matt LeBlanc and James Burrows to create another classic sitcom.

However, Federman’s career in entertainment didn’t start in the cutting room; it started in the orchestra pit! After working as a professional violinist with orchestras in Honolulu and San Francisco, she traded in her bow for an Avid.

We sat down to talk with Federman about the ins and outs of sitcom editing, that pesky studio audience, and her journey from musician to editor.

When did you get involved with your show, and what is your workflow like?
I came onto Man With A Plan (MWAP) after the original pilot had been picked up. They recast one of the leads, so there was a reshoot of about 75 percent of the pilot with our new Andi, Liza Snyder. My job was to integrate the new scenes with the old. It was interesting to preserve the pace and feel of the original and to be free to bring my own spin to the show.

The workflow of the show is pretty fast since there’s only one editor on a traditional audience sitcom. I usually put a show together in two to three days, then work with the producers for one to two days, and then send a pretty finished cut to the studio/network.

What are the biggest challenges you face as an editor on a traditional half-hour comedy?
One big challenge is managing two to three episodes at a time — assembling one show while doing producer or studio/network notes on another, as well as having to cut preshot playbacks for show night, which can be anywhere from three to eight minutes of material that has to be cut pretty quickly.

Another challenge is the live audience laughter. It’s definitely a unique part of this kind of show. I worked on How I Met Your Mother (HIMYM) for nine years without an audience, so I could completely control the pacing. I added fake laughs that fit the performances and things like that. When I came back to a live audience show, I realized the audience is a big part of the way the performances are shaped. I’ve learned all kinds of ways to manipulate the laughs and, hopefully, still preserve the spontaneous live energy of the show.

How would you compare cutting comedy to drama?
I haven’t done much drama, but I feel like the pace of comedy is faster in every regard, and I really enjoy working at a fast pace. Also, as opposed to a drama or anything shot single-camera, the coverage on a multi-cam show is pretty straightforward, so it’s really all about performance and pacing. There’s not a lot of music in a multi-cam, but you spend a lot of time working with the audience tracks.

What role would you say an editor has in helping to make a “bit” land in a half-hour comedy?
It’s performance, timing and camera choices — and when it works, it feels great. I’m always amazed at how changing an edit by a frame or two can make something pop. Same goes for playing something wider or closer depending on the situation.

MWAP is shot before a live studio audience. How does that affect your rhythm?
The audience definitely affects the rhythm of the show. I try to preserve the feeling of the laughs and still keep the show moving. A really long laugh is great on show night, but usually we cut it down a bit and play off more reactions. The actors on MWAP are great because they really know how to “ride” the laughs and not break character. I love watching great comedic actors, like the cast of I Love Lucy, for example, who were incredible at holding for laughs. It’s a real asset and very helpful to the editor.

Can you describe your process? And what system do you edit the show on?
I’ve always used the Avid Media Composer. Dabbled with Final Cut, but prefer Avid. I assemble the whole show in one sequence and go scene by scene. I watch all of the takes of a scene and make choices for each section or sometimes for each line. Then I chunk the scene together, sometimes putting in two choices for a line or area. I then cut into the big pieces to select the cameras for each shot. After that, I go back and find the rhythm of the scene — tightening the pace, cutting into the laughs and smoothing them.

After the show is put together, I go back and watch the whole thing again, pretending that I’ve never seen it, which is a challenge. That makes me adjust it even more. I try to send out a pretty polished first cut, without cutting any dialogue to show the producers everything, which seems to make the whole process go faster. I’m lucky that the directors on MWAP are very seasoned and don’t really give me many notes. Jimmy Burrows and Pam Fryman have directed almost all of the episodes, and I don’t send out a separate cut to either of them. Particularly with Pam, as I’ve worked with her for about 11 years, so we have a nice shorthand.

How do assistant editors work into the mix?
My assistant, Dan “Steely” Esparza, is incredible! He allows me to show up to work every day and not think about anything other than cutting the show. He’s told me, even though I always ask, that he prefers not to be an editor, so I don’t push him in that direction. He is excellent at visual effects and enjoys them, so I always have him do those. On HIMYM, we had quite a lot of visual effects, so he was pretty busy there. But on MWAP, it’s mostly rough composites for blue/greenscreen scenes and painting out errant boom shadows, boom mics and parts of people.

Your work on HIMYM was highly lauded. What are some of your favorite “editing” moments from that show and what were some of the biggest challenges they threw at you?
I really loved working on that show — every episode was unique, and it really gave me opportunities to grow as an editor. Carter Bays and Craig Thomas were amazing problem solvers. They were able to look at the footage and make something completely different out of it if need be. I remember times when a scene wasn’t working or was too long, and they would write some narration, record the temp themselves, and then we’d throw some music over it and make it into a montage.

Some of the biggest editing challenges were the music videos/sequences that were incorporated into episodes. There were three complete Robin Sparkles videos and many, many other musical pieces, almost always written by Carter and Craig. In “P.S. I Love You,” they incorporated her last video into kind of a Canadian Behind the Music about the demise of Robin Sparkles, and that was pretty epic for a sitcom. The gigantic “Subway Wars” was another big challenge, in that it had 85 “scenelets.” It was a five-way race around Manhattan to see who could get to a restaurant where Woody Allen was supposedly eating first, with each person using a different mode of transportation. Crazy fun and also extremely challenging to fit into a sitcom schedule.

You started in the business as a classical musician. How does your experience as a professional violinist influence your work as an editor?
I think the biggest thing is having a good feeling for the rhythm of whatever I’m working on. I love to be able to change the tempo and to make something really pop. And when asked to change the pacing or cut sections out while doing various people’s notes, I’m able to embrace that too. Collaborating is a big part of being a musician, and I think that’s helped me a lot in working with different personalities. It’s not unlike responding to a conductor or playing chamber music. Having an understanding of phrasing and the overall structure of a piece is also valuable; even though it was musical phrasing and structure, it’s not all that different.

Obviously, whenever there’s actual music involved, I feel pretty comfortable handling it or choosing the right piece for a scene. If classical music’s involved, I have a great deal of knowledge that can be helpful. For example, in HIMYM, we needed something to be a theme for Barney’s Playbook antics. I tried a few things, and we landed on the Mozart Rondo Alla Turca, which I’ve been hearing lately in the Progresso Soup commercials.

How did you make the transition from the concert hall to the editing room?
It’s a long story! I was playing in the San Francisco Ballet Orchestra and was feeling stuck. I was lucky enough to find an amazing career counseling organization that helped me open my mind to all kinds of possibilities, and they helped me to discover the perfect job for me. It was quite a journey, but the main thing was to be open to anything and identify the things about myself that I wanted to use. I learned that I loved music (but not playing the violin), puzzles, stories and organizing — so editing!

I sold a bow, took the summer off from playing and enrolled in a summer production workshop at USC. I wasn’t quite ready to move to LA, so I went back to San Francisco and began interning at a small commercial editing house. I was answering phones, emptying the dishwasher, getting coffees and watching the editing, all while continuing to play in the Ballet Orchestra. The people were great and gave me opportunities to learn whenever possible. Luckily for me, they were using the Avid before it came to TV and features. Eventually, there was a very rough documentary that one of the editors wanted to cut, but it wasn’t organized. They gave me the key to the office and said, “You want to be an editor? Organize this!” So I did, and they started offering me assistant work on commercials. But I wanted to cut features, so I started to make little trips to LA to meet anybody I could.

Bill Steinberg, an editor working in the Universal Syndication department who I met at USC, got me hooked up with an editor who was to be one of Roger Corman’s first Avid editors. The Avids didn’t arrive right away, but he helped me put my name in the hat to be an assistant the next time. It happened, and I was on my way! I took a sabbatical from the orchestra, went down to LA and worked my tail off for $400 a week on three low-budget features. I was in heaven. I had enough hours to join the union as an assistant, but I needed money to pay the admission fee. So I went back to San Francisco and played one month of Nutcrackers to cover the fee, and then I took another year-long sabbatical. Bill offered me a month-long position in the syndication department to fill in for him and show the film editors what I knew about the Avid.

Eventually Andy Chulack, the editor of Coach, was looking for an Avid assistant, and I was recommended because I knew it. Andy hired me and took me under his wing, and I absolutely loved it. I guess the upshot is, I was fearlessly naive and knew the Avid!

What do you love most about being an editor?
I love the variation of material and people that I get to work with, and I like being able to take time to refine things. I don’t have to play it live anymore!


Rick Anthony named GM of Light Iron New York

Post company Light Iron has named Rick Anthony to the newly created role of general manager in its New York facility. The addition comes after Light Iron added a second floor in 2016, tripling its inventory of editorial suites.

Anthony previously held GM roles at Pac Lab and New York Lab/Postworks/Moving Images, overseeing teams from lab through digital workflows. He began his career at New York film lab, DuArt, where he was a technical supervisor for many years.

Anthony notes several reasons why he joined Light Iron, a Panavision company. “From being at the forefront of color science and workflow to providing bi-coastal client support, this is a unique opportunity. Working together with Panavision, I look forward to serving the dailies, editorial, and finishing needs of any production, be it feature, episodic or commercial.”

Light Iron’s New York facility offers 20 premium editorial suites from its SoHo location, as well as in-house and mobile dailies services, HDR-ready episodic timing bays and a 4K DI theater. The facility recently serviced Panavision’s first US-based feature shot on the new Millennium DXL camera.


Assimilate’s Scratch VR Suite 8.6 now available

Back in February, Assimilate announced the beta version of its Scratch VR Suite 8.6. Well, now the company is back with a final version of the product, incorporating user-requested features and functions.

Scratch VR Suite 8.6 is a realtime post solution and workflow for VR/360 content. With added GPU stitching of 360-video and ambisonic audio support, as well as live streaming, the Scratch VR Suite 8.6 allows VR content creators — DPs, DITs, post artists — a streamlined, end-to-end workflow for VR/360 content.

The Scratch VR Suite 8.6 workflow automatically includes all the basic post tools: dailies, color grading, compositing, playback, cloud-based reviews, finishing and mastering.

New features and updates include:
• 360 stitching functionality: Load the source media of multiple shots from your 360 cameras into Scratch VR and easily wrap them into a stitch node to combine the sources into an equirectangular image.
• Support for various stitch template formats, such as AutoPano, Hugin, PTGui and PTStitch scripts.
• Either render out the equirectangular format first or just continue to edit, grade and composite on top of the stitched nodes and render the final result.
• Ambisonic audio: Load, set and playback ambisonic audio files to complete the 360 immersive experience.
• Video with 360 sound can be published directly to YouTube 360.
• Additional overlay handles to the existing 2D-equirectangular feature for more easily positioning 2D elements in a 360 scene.
• Support for Oculus Rift, Samsung Gear VR, HTC Vive and Google Cardboard.
• Several new features and functions make working in HDR just as easy as SDR.
• Increased format support: Added support for all the latest formats for even greater efficiency in the DIT and post production processes.
• Simplified DIT reporting function: Added features and functions that enable even greater efficiencies in a single, streamlined workflow.
• User Interface: Numerous updates have been made to enhance and simplify the UI for content creators, such as for the log-in screen, matrix layout, swipe sensitivity, Player stack, tool bar and tool tips.


Post vet Katie Hinsen now head of operations at NZ’s Department of Post

Katie Hinsen, who many of you may know as co-founder of the Blue Collar Post Collective, has moved back to her native New Zealand and has been named head of operations at Auckland’s Department of Post.

Most recently at New York City’s Light Iron, Hinsen comes from a technical and operations background, with credits on over 80 major productions. Over a 20-year career she has worked as an engineer, editor, VFX artist, stereoscopic 3D artist, colorist and finishing artist on commercials, documentaries, television, music videos, shorts and feature films. In addition to Light Iron, she has had stints at New Zealand’s Park Road Post Production and Goldcrest in New York.

Hinsen has throughout her career been involved in both production and R&D of new digital acquisition and distribution formats, including stereoscopic/autostereoscopic 3D, Red, HFR, HDR, 4K+ and DCP. Her expertise includes HDR, 4K and 8K workflows.

“I was looking for a company that had the forward-thinking agility to be able to grow in a rapidly changing industry. New Zealand punches well above its weight in talent and innovation, and now is the time to use this to expand our wider post production ecosystem,” says Hinsen.

“Department of Post is a company that has shown rapid growth and great success by taking risks, thinking outside the box, and collaborating across town, across the country and across the world. That’s a model I can work with, to help bring and retain more high-end work to Auckland’s post community. We’ve got an increasing number of large-scale productions choosing to shoot here. I want to give them a competitive reason to stay here through Post.”

Department of Post was started by James Brookes and James Gardner in 2008. They provide offline, online, color, audio and deliverables services to film and television productions, both local and international.


Building a workflow for The Great Wall

Bling Digital, which is part of the SIM Group, was called on to help establish the workflow on Legendary/Universal’s The Great Wall, starring Matt Damon as a European mercenary imprisoned within the wall. While being held he sees exactly why the Chinese built this massive barrier in the first place — and it’s otherworldly. This VFX-heavy mystery/fantasy was directed by Yimou Zhang.

We reached out to Bling’s director of workflow services, Jesse Korosi, to talk us through the process on the film, including working with data from the Arri 65, which at that point hadn’t yet been used on a full-length feature film. Bling Digital is a post technology and services provider that specializes in on-set data management, digital dailies, editorial system rentals and data archiving.

Jesse Korosi

When did you first get involved on The Great Wall and in what capacity?
Bling received our first call from the unit production manager Kwame Parker about providing on-set data management, dailies, VFX and stereo pulls, Avid rentals and a customized process for the digital workflow for The Great Wall in December of 2014.

At this time the information was pretty vague, but outlined some of the bigger challenges, like the film being shot in multiple locations within China, and that the Arri 65 camera may be used, which had not yet been used on a full-length feature. From this point on I worked with our internal team to figure out exactly how we would tackle such a challenge. This also required a lot of communication with the software developers to ensure that they would be ready to provide updated builds that could support this new camera.

A big part of my job, and that of anyone on my workflow team, is to get involved as early as possible, which began with talks with DP Stuart Dryburgh, the studio and a few other members of production. Our role doesn’t begin on day one of principal photography; it begins well before. We want to get in and start testing and communicating with the rest of the crew well ahead of time so that by the first day, the process runs like a well-oiled machine and the client never has to be concerned with “week-one kinks.”

Why did they opt for the Arri 65 camera and what were some of the challenges you encountered?
Many people who we work with love Arri. The cameras are known for recording beautiful images. Anyone who isn’t a huge Arri fan might dislike the lower resolution of some of the cameras, but it is very uncommon for someone to dislike the final look of the recorded files. Enter the Arri 65, a new camera that records 6.5K files (6560×3100) at a whopping 2.8TB per hour.

When dealing with this kind of data consumption, you really need to re-evaluate your pipeline. The cards are not able to be downloaded by traditional card readers — you need to use vaults. Let’s say someone records three hours of footage in a day — that equals 8.7TB of data. If you’re sending that info to another facility even using a 500Mb/s Internet line, that would take 38 hours to send! LTO-ing this kind of media is also dreadfully slow. For The Great Wall we ended up setting up a dedicated LTO area that had eight decks running at any given time.
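The transfer arithmetic above generalizes easily. A minimal Python sketch, assuming decimal units (1TB = 10^12 bytes, 1Mb/s = 10^6 bits/s):

```python
def transfer_hours(terabytes, line_mbps):
    """Hours needed to move `terabytes` of footage over a
    `line_mbps` Internet line, assuming the line is saturated."""
    bits = terabytes * 1e12 * 8          # total payload in bits
    seconds = bits / (line_mbps * 1e6)   # line rate in bits/s
    return seconds / 3600

# ~8.7TB of Arri 65 footage over a 500Mb/s line
print(round(transfer_hours(8.7, 500), 1))  # 38.7 hours
```

Real-world transfers would be slower still, since protocol overhead and contention keep a line from running at its rated speed for 38 straight hours.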

Aside from data consumption, we faced the challenge of having no dailies software that could even read the files. We worked with Colorfront to get a new build-out that could work, and luckily, after having been through this same ordeal recording Arri Open Gate on Warcraft, we knew how to make this happen and set the client at ease.

Were you on set? Near set? Remote?
Our lab was located in the production office, which also housed editorial. Considering all of the traveling this job entailed, from Beijing and Qingdao to Gansu, we were mostly working remotely. We wanted to be as close to production as possible, but still within a controlled environment.

The dailies set-up was right beside editor Craig Wood’s suite, making for a close-knit workflow with editorial, which was great. Craig would often pull our dailies team into his suite to view how the edit was coming along, which really helped when assessing how the dailies color was working and referencing scenes in the cut when timing pickup shots.

How did you work with the director and DP?
At the start of the show we established some looks with the DP Stuart Dryburgh, ASC. The idea was that we would handle all of the dailies color in the lab. The DIT/DMT would note as much valuable information on set about the conditions that day and we would use our best judgment to fulfill the intended look. During pre-production we used a theatre at the China Film Group studio to screen and review all the test materials and dial in this look.

With our team involved from the very beginning of these color talks, we were able to ensure that decisions made on color and data flow would track through each department, all the way to the end of the job. It’s very common for color decisions made at the start of a job to get lost in the shuffle once production has wrapped. Plus, sometimes there isn’t anyone available in the post stage who recognizes why certain decisions were made up front.

Can you talk us through the workflow? 
In terms of workflow, the Arri 65 was recording media onto Codex cards, which were backed up on set with a Vault S. After this media was backed up, the Codex card would be forwarded on to the lab. Within the lab we had a Vault XL that would then be used to back this card up to the internal drive. Unfortunately, you can’t go directly from the card to your working drive; you need to do two separate passes on the card, a “Process” and a “Transfer.”

The Transfer moves the media off the card and onto an internal drive on the Vault. The Process then converts all the native camera files into .ARI files. Once this media was processed and on the internal drive, we were able to move it onto our SAN. From there we were able to run this footage through OSD and make LTO back-ups. We also made additional back-ups to G-Tech G-Speed Studio drives that would be sent back to LA. However, for security purposes as well as efficiency, we encrypted and shipped the bare drives, rather than the entire chassis. This meant that when the drives were received in LA, we were able to mount them into our dock and work directly off of them, i.e., no need to wait on any copies.

Another thing that required a lot of back and forth with the DI facility was ensuring that our color pipeline was following the same path they would take once they hit final color. We ended up having input LUTs for any camera that recorded a non-LogC color space. As for my involvement, during production in China I had a few members of my team on the ground and I was overseeing things remotely. Once things came back to LA and we were working out of Legendary, I became much more hands-on.

What kind of challenges did providing offline editorial services in China bring, and how did that transition back to LA?
We sent a tech to China to handle the set-up of the offline editorial suites and also had local contacts to assist during the run of the project. Our dailies technicians also helped with certain questions or concerns that came up.

Shipping gear for the Avids is one thing; however, the consoles (desks) for the editors would have been far too heavy to ship. This was probably one of the bigger challenges — ensuring the editors were working with the same caliber of workspace they were used to in Los Angeles.

The transition of editorial from China to LA required Dave French, director of post engineering, and his team to mirror the China set-up in LA and have both up and running at the same time to streamline the process. Essentially, the editors needed to stop cutting in China and have the ability to jump on a plane and resume cutting in LA immediately.

Once back in LA, you continued to support VFX, stereo and editorial, correct?
Within the Legendary office we played a major role in building out the technology and workflow behind what was referred to as the Post Hub. This Post Hub was made up of a few different systems all KVM’d into one desk that acted as the control center for VFX and stereo reviews, VFX and stereo pulls and final stereo tweaks. All of this work was controlled by Rachel McIntire, our dailies, VFX and stereo management tech. She was a jack-of-all-trades who played a huge role in making the post workflow so successful.

For the VFX reviews, Rachel and I worked closely with ILM to develop a workflow to ensure that all of the original on-set/dailies color metadata would carry into the offline edit from the VFX vendors. It was imperative that during this editing session we could add or remove the color, make adjustments and match exactly what they saw on set, in dailies and in the offline edit. Automating this process through values from the VFX editor’s EDL was key.
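Dailies color commonly travels between systems as ASC CDL values (slope, offset, power, plus saturation), which CMX3600-style EDLs can carry as comment lines. As a generic illustration of this kind of EDL-driven automation (an assumption on my part, not Bling’s or ILM’s actual tooling), a minimal parser sketch:

```python
import re

# ASC CDL comment lines as they commonly appear in a CMX3600 EDL:
# * ASC_SOP (slope r g b) (offset r g b) (power r g b)
# * ASC_SAT n.nnnnnn
SOP = re.compile(r"\*\s*ASC_SOP\s*\(([^)]+)\)\s*\(([^)]+)\)\s*\(([^)]+)\)")
SAT = re.compile(r"\*\s*ASC_SAT\s*([\d.]+)")

def parse_cdl(edl_text):
    """Return a list of (slope, offset, power, saturation) tuples,
    one per ASC_SOP/ASC_SAT comment pair found in the EDL."""
    cdls, sop = [], None
    for line in edl_text.splitlines():
        m = SOP.search(line)
        if m:
            sop = [tuple(float(v) for v in g.split()) for g in m.groups()]
            continue
        m = SAT.search(line)
        if m and sop:
            cdls.append((*sop, float(m.group(1))))
            sop = None
    return cdls

sample = """\
001  A001C002 V  C  01:00:00:00 01:00:05:00 00:59:58:00 01:00:03:00
* ASC_SOP (1.0200 0.9800 1.0000) (0.0100 -0.0050 0.0000) (1.0000 1.0000 1.0500)
* ASC_SAT 0.900000
"""
print(parse_cdl(sample))
```

Once the per-event values are machine-readable like this, applying or stripping the dailies grade during a review session becomes a lookup rather than a manual match-by-eye exercise.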

Looking back on the work provided, what would you have done differently knowing what you know now?
I think the area I would focus on next time around would be upgrading the jobs database. With any job we manage at Bling, we always ensure we keep a log of every file recorded and any metadata that we track. At the time, this was a little weak. Since then, I have been working on overhauling this database and allowing creatives to access all camera metadata, script metadata, location data, lens data, etc. in one centralized location. We have just used this on our first job in a client-facing capacity and I think it would have done wonders for our VFX and stereo crews on The Great Wall. It is all too often that people are digging around for information already captured by someone else. I want to make sure there is a central repository for that data.


Hollywood’s Digital Jungle moves to Santa Clarita

Digital Jungle, a long-time Hollywood-based post house, has moved its operations to a new facility in Santa Clarita, California, which has become a growing hub for production and post in the suburbs of Los Angeles. The new headquarters is now home to both Digital Jungle Post and its recent off-shoot Digital Jungle Pictures, a feature film development and production studio.

“I don’t mind saying, it was a bit of an experiment moving to Santa Clarita,” explains Digital Jungle president and chief creative Dennis Ho. “With so many filmmakers and productions working out here — including Disney/ABC Studios, Santa Clarita Studios and Universal Locations — this area has developed into a vast untapped market for post production professionals. I decided that now was a good time to tap into that opportunity.”

Digital Jungle’s new facility offers the full complement of digital workflow solutions for HD to 4K. The facility has multiple suites featuring Smoke, DaVinci Resolve, audio recording via Avid’s S6 console and Pro Tools, production offices, a conference area, a full kitchen and a client lounge.

Digital Jungle is well into the process of adding further capabilities with a new high-end luxury DI 4K theater and screening room, greenscreen stage, VFX bullpen, multiple edit bays and additional production offices as part of their phase two build-out.

Digital Jungle Post services include DI/color grading; VFX/motion graphics; audio recording/mixing and sound design; ADR and VO; HD to 4K deliverables for tape and data; DCI and DCDM; promo/bumper design and film/television title design.

Commenting on Digital Jungle Pictures, Ho says, “It was a natural step for me. I started my career by directing and producing promos and interstitials for network TV, studios and distributors. I think that our recent involvement in producing several independent films has enhanced our credibility on the post side. Filmmakers tend to feel more comfortable entrusting their post work to other filmmakers. One example is we recently completed audio post and DI for a new Hallmark film called Love at First Glance.”

In addition to Love at First Glance, Digital Jungle Productions’ recent projects include indie films Day of Days, A Better Place (available now on digital and DVD) and Broken Memories, which was screened at the Sedona Film Festival.


Review: Dell Precision 7910 tower workstation

By Mike McCarthy

While I started my career on Dell Precision workstations, I have spent the last 10 years with HP workstations under my desk. They have served me well, which is why I used them for five generations. At the beginning of 2016, I was given the opportunity to do a complete hardware refresh for director Scott Waugh’s post house, Vasquez Saloon, to gear up our capabilities to edit the first film shot for Barco Escape and edited fully in 6K. This time we ended up with Dell Precision 7910 workstations under our desks. After having a chance to use them for a year, I decided it was time to share some of my experiences with the top-end Precision workstation.

My 7910 has two Xeon E5-2687W V3 processors, each with 10 cores running at 3.1GHz. Regardless of which CPU speed you select, always fill both sockets of a high-end workstation, as that doubles your memory bandwidth and enables the last two PCIe slots. Therefore, choose dual 4-core CPUs instead of a single 8-core CPU, if that is the performance level you are after. It has 128GB of DDR4 memory, divided across eight 16GB sticks. Regardless of size, maximum performance is achieved with at least as many sticks of RAM as there are memory channels. This system has four memory channels per CPU, for a total of eight channels. I would recommend at least 64GB of RAM for most editing systems, with more for larger projects. Since we were cutting an entire feature with 6K source files, 128GB was a reasonable choice that served us well.

Both our systems are usually pretty quiet, which is impressive considering how powerful they are. They do generate heat, and I don’t recommend running them in a room without AC, but that was outside of our control. Air-cooled systems are only as effective as the environment they are in, and our situation wasn’t always optimal.

PCIe SSDs are a huge leap forward for storage throughput. This workstation came with a PCIe x16 Gen3 card that supports up to four M.2 NVMe (https://en.wikipedia.org/wiki/NVM_Express) SSDs at full speed. This allows up to 2500MB/s from each of the four ports, which is enough bandwidth to play back 6K DPXs at 24p in Premiere without dropping frames.

Capacity is still limited with this new, expensive technology, topping out at 1TB for a $700 card. My 512GB card can only store seven minutes of data at maximum throughput, but for smaller data sets, like VFX shots, this allows a system to cache meaningful quantities of data at very high speed without needing a large array of disks to sustain the required I/Os.
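The DPX playback claim is easy to sanity-check. Uncompressed 10-bit RGB DPX packs three components into one 32-bit word, so a frame costs roughly width × height × 4 bytes. Assuming, for illustration, a 6144×3072 raster (the film’s actual 6K frame size isn’t stated here):

```python
def dpx_playback_mbps(width, height, fps, bytes_per_pixel=4):
    """Sustained read rate (decimal MB/s) needed to play uncompressed DPX.
    10-bit RGB DPX packs three components into a 32-bit word,
    hence the default of 4 bytes per pixel."""
    return width * height * bytes_per_pixel * fps / 1e6

# A hypothetical 6144x3072 6K frame at 24p
print(round(dpx_playback_mbps(6144, 3072, 24)))  # 1812 MB/s
```

At roughly 1.8GB/s, a 6K 24p DPX stream sits comfortably inside the 2500MB/s available per port, which is consistent with the no-dropped-frames experience described above.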

Once we open the tool-less case, one of the obvious visual differences between the Dell and HP solutions is that the Precision 7910 splits the PCIe slots, with two above the CPUs and five below. I assume the benefits to this are shorter circuit paths to the CPUs, and better cooling for hot cards. It hasn’t made a big difference to me, but it is worth noting. Like other dual-socket systems, two of the slots are disabled if the second CPU is not installed.

In my case, I have the SSD card in the top slot, and a Red Rocket-X in the next one down. The Thunderbolt 2 card has to be installed in the slot directly below the CPUs. Then I installed my SAS RAID card and the Intel X540 10GbE NIC, leaving space at the bottom for my Quadro GPU.

Another unique feature of the case layout is that the power supply is located behind the motherboard, instead of at the top or bottom of the system. This places the motherboard at the center of the chassis, with components and cards on one side, and power and storage bays on the other. There are a variety of integrated ports, with dual-Gigabit NICs, PS/2, audio, serial, and six USB ports. The only aspect I found limiting was the total of four USB 3.0 ports, one in front and three in back. I have on occasion been using all of them at once for my external drive transfers, but having a USB 3.0 hub in most of Dell’s monitors can help with this issue. Hopefully, we will see USB-C ports with double that bandwidth in the next generation, as well as integrated Thunderbolt 3 support to free up another PCIe slot.

Besides the slim DVD drive, there are four 3.5-inch hard drive bays with tool-less cages, and a 5.25-inch bay, which can be optionally reconfigured to hold four more 2.5-inch drives. The next model down, the Precision 7810, is similar, but without the top two PCIe slots and only two 3.5-inch drive bays. My drive bays are all empty because the PCIe SSD is my only internal storage, but that means that I could easily add four 8TB SAS drives for 32TB of internal storage with no other accessories required. And I may use the 5.25-inch bay for an LTO drive someday, if I don’t end up getting an external one.

If I do get an external SAS drive, it could be connected to one of the two SFF 8643 connectors on the motherboard. These new connectors each support four channels of 12Gb SAS, with one of them hooked to the 3.5-inch drive back plane by default. The integrated SAS controller supports up to eight channels of SAS or SATA data, capable of RAID-0 or -1. Using RAID-5 or -6 requires a separate dedicated card, in my case the Areca 1883x. At least one integrated M.2 slot would be great to see in the next refresh, as those SSDs become more affordable.

Dell also includes their system management software Dell Precision Optimizer to help you get the maximum performance from the system. It allows users to monitor and chart CPU and GPU use as well as memory and disk usage. It can configure system settings like Hyperthreading, Power Usage and V-Sync, using pre-built profiles for various industry applications. It won’t tune your system for video editing as well as an expert who knows what they are doing, but it is better than doing nothing right out of the box.

Real-World Use
Over the last year, we have run two of these workstations on a 6K feature film, taking them right to the limit on a regular basis. It was not uncommon to be encoding R3D dailies to H264 in AME, while rendering a VFX shot in AE, and playing back in Premiere, on both systems simultaneously, pulling data from each other’s local storage arrays over the network. And while I won’t say that they never crashed, stability was not an issue that seriously impacted our workflow or schedule. I have been quite impressed by what we were able to accomplish with them, with very little other infrastructure. The unique split chassis design makes room for a lot of internal storage, and they run reliably and quietly, even when chock full of powerful cards. I am looking forward to getting a couple more solid years of use out of them.


Mike McCarthy is an online editor and workflow consultant with 10 years of experience on feature films and commercials. He has been on the forefront of pioneering new solutions for tapeless workflows, DSLR filmmaking and now multiscreen and surround video experiences. If you want to see more specific details about performance numbers and benchmark tests for this system, check out techwithmikefirst.com.