Author Archives: Randi Altman

DP John Kelleran shoots Hotel Impossible

Director of photography John Kelleran shot season eight of the Travel Channel’s Hotel Impossible, a reality show in which struggling hotels receive an extensive makeover by veteran hotel operator and hospitality expert Anthony Melchiorri and team.

Kelleran, who has more than two decades of experience shooting reality/documentary projects, called on Panasonic VariCam LT 4K cinema camcorders for this series.

Working for New York production company Atlas Media, Kelleran shot a dozen hour-long Hotel Impossible episodes in locations that include Palm Springs, Fire Island, Cape May, Cape Hatteras, Sandusky, Ohio, and Albany, New York. The production, which began last April and wrapped in December 2016, spent five days in each location.

Kelleran liked the VariCam LT’s dual native ISOs of 800/5000. “I tested ISO5000 by shooting in my own basement at night, and had my son illuminated only by a lighter and whatever light was coming through the small basement window, one foot candle at best. The footage showed spectacular light on the boy.”

Kelleran regularly deployed ISO5000 on each episode. “The crux of the show is chasing out problems in dark corners and corridors, which we were able to do like never before. The LT’s extreme low light handling allowed us to work in dark rooms with only motivated light sources like lamps and windows, and absolutely keep the honesty of the narrative.”

Atlas Media is handling the edit, using Avid Media Composer. “We gave post such a solid image that they had to spend very little time or money on color correction, but could rather devote resources to graphics, sound design and more,” concludes Kelleran.

Music house Wolf at the Door opens in Venice

Wolf at the Door has opened in Venice, California, providing original music, music supervision and sound design for the ad industry and, occasionally, films. Founders Alex Kemp and Jimmy Haun have been making music for some time: Kemp was a composer at Chicago-based Catfish Music and Spank, and was formerly creative director of Hum in Santa Monica. Haun spent over 10 years as the senior composer at Elias, in addition to being a session musician.

Between the two of them they’ve been signed to four major labels, written music for 11 Super Bowl spots, and have composed music for top agencies, including W+K, Goodby, Chiat Day, Team One and Arnold, working with directors like David Fincher, Lance Acord, Stacy Wall and Gore Verbinski.

In addition to making music, Kemp linked up with his longtime friend Scott Brown, a former creative director at agencies including Chiat Day, 72andSunny and Deutsch, to start a surf shop and brand featuring hand-crafted surfboards — Lone Wolfs Objets d’Surf.

With the Wolf at the Door recording studio and production office existing directly behind the Lone Wolfs retail store, Kemp and his partners bounce between different creative projects daily: writing music for spots, designing handmade Lone Wolfs surfboards, recording bands in the studio, laying out their own magazine, or producing their own original branded content.

Episodes of their original surf talk show/Web series Everything’s Not Working have featured guest pro surfers, including Dion Agius, Nabil Samadani and Eden Saul.

Wolf at the Door recently worked on an Experian commercial directed by the Malloy Brothers for the Martin Agency, as well as a CenturyLink spot directed by Malcolm Venville for Arnold Worldwide. Kemp worked closely with Venville on the casting and arrangement for the spot, and traveled to Denver to record the duet of singer Kelvin Jones’ “Call You Home” with Karissa Lee, a young singer Kemp found specifically for the project.

“Our approach to music is always driven by who the brand is and what ideas the music needs to support,” says Kemp. “The music provides the emotional context.” Paying attention to messaging is something that goes hand in hand with carving out their own brand and making their own content. “The whole model seemed ready for a reset. And personally speaking, I like to live and work at a place where being inspired dictates the actions we take, rather than the other way around.”

Main Image L-R:  Jimmy Haun and Alex Kemp.

Sound editor/mixer Korey Pereira on 3D audio workflows for VR

By Andrew Emge

As the technologies for VR and 360 video rapidly advance and become more accessible, media creators are realizing the crucial role that sound plays in achieving realism. Sound designers are exploring this new frontier of 3D audio at the same time that tools for the medium are being developed and introduced. When everything is so new and constantly evolving, how does one learn where to start or decide where to invest time and experimentation?

To better understand this process, I spoke with Korey Pereira, a sound editor and mixer based in Austin, Texas. He recently entered the VR/360 audio world and has started developing a workflow.

Can you provide some background about who you are, the work you’ve done, and what you’ve been up to lately?
I’m the owner/creative director at Soularity Sound, an Austin-based post company. We primarily work with indie filmmakers, but also do some television and ad work. In addition to my work at Soularity, I also work as a sound editor and mixer at a few other Austin post facilities, including Soundcrafter. My credits with them include Richard Linklater’s Boyhood and Everybody Wants Some, as well as TV shows such as Shipping Wars and My 600lb Life.

You recently purchased the Pro Sound Effects NYC Ambisonics library. Can you talk about some VR projects you are working on?
In the coming months I plan to start creating audio content for VR with a local content creator, Deepak Chetty. Over the years we have collaborated on a number of projects; most recently, I worked on his stereoscopic 3D sci-fi/action film, Hard Reset, which won the 2016 “Best 3D Live Action Short” award from the Advanced Imaging Society.

Deepak Chetty shooting a VR project.

I love sci-fi as a genre, because there really are no rules. It lets you really go for it as far as sound. Deepak has been shifting his creative focus toward 360 content and we are hoping to start working together in that aspect in the near future.

Deepak is currently working mostly on non-fiction and documentary-based content in 360 — mainly environment capture with a through line of audio storytelling that serves as the backbone of the piece. He is also looking forward to experimenting with fiction-based narratives in the 360 space, especially with the use of spatial audio to enhance immersion for the viewer.

Prior to meeting Deepak, did you have any experience working with VR/3D audio?
No, this is my first venture into the world of VR audio or 3D audio. I have been mixing in surround for over a decade, but I am excited about the additional possibilities this format brings to the table.

What have been the most helpful sources for studying up and figuring out a workflow?
The Internet! There is such a wealth of information out there, and you kind of just have to dive in. The benefit of 360 audio being a relatively new format is that people are still willing to talk openly about it.

Was there anything particularly challenging to get used to or wrap your head around?
In a lot of ways designing audio for VR is not that different from traditional sound mixing for film. You start with a bed of ambiences and then place elements within a surround space. I guess the most challenging part of the transition is anticipating how the audience might hear your mix. If the viewer decides to watch a whole video facing the surrounds, how will it sound?

Can you describe the workflow you’ve established so far? What are some decisions you’ve made regarding DAW, monitoring, software, plug-ins, tools, formats and order of operation?
I am a Pro Tools guy, so my main goal was finding a solution that works seamlessly inside the Pro Tools environment. As I started looking into different options, the Two Big Ears Spatial Workstation really stood out to me as being the most intuitive and easiest platform to hit the ground running with. (Two Big Ears recently joined Facebook, so Spatial Workstation is now available for free!)

Basically, you install a Pro Tools plug-in that works as a 3D audio engine and gives you a Pro Tools project with all the routing and tracks laid out for you. There are object-based tracks that allow you to place sounds within a 3D environment as well as ambience tracks that allow you to add stereo or ambisonic beds as a basis for your mix.

The coolest thing about this platform is that it includes a 3D video player that runs in sync with Pro Tools. There is a binaural preview pathway in the template that lets you hear the shift in perspective as you move the video around in the player. Pretty cool!

In September 2016, another VR audio workflow for Pro Tools entered the market from the Dutch company Audio Ease: the 360 pan suite. Much like the Spatial Workstation, the suite offers an object-based panner (360 pan) that, when placed on each audio track, lets you pan individual elements within the 360-degree field of view. The 360 pan suite also includes the 360 monitor, which allows you to preview head tracking within Pro Tools.

Where the 360 pan suite really stands out is with their video overlay function. By loading a 360 video inside of Pro Tools, Audio Ease adds an overlay on top of the Pro Tools video window, letting you pan each track in real time, which is really useful. For the features it offers, it is relatively affordable. The suite does not come with its own template, but they have a quick video guide to get you up and going fairly easily.
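To make the head-tracking idea concrete, a first-order ambisonic bed can be re-oriented for the listener with nothing more than a rotation of its directional components, which is why these tools can preview head movement in real time. Below is a minimal sketch in Python/NumPy (illustrative only, with an assumed sign convention; tools like the Spatial Workstation and the 360 monitor handle this internally).

```python
import numpy as np

def rotate_yaw(w, x, y, z, yaw_deg):
    """Rotate a first-order B-format sound field around the vertical axis.

    Rotating the scene by -yaw is equivalent to the listener turning their
    head by +yaw. Only the horizontal dipole components (X, Y) change; the
    omni component (W) and the height component (Z) are untouched.
    """
    theta = np.radians(yaw_deg)
    x_rot = np.cos(theta) * x - np.sin(theta) * y
    y_rot = np.sin(theta) * x + np.cos(theta) * y
    return w, x_rot, y_rot, z

# Example: a 1kHz test tone encoded straight ahead, then the scene rotated 90 degrees
sr = 48000
t = np.arange(sr) / sr
tone = 0.1 * np.sin(2 * np.pi * 1000 * t)
w, x, y, z = tone, tone, np.zeros_like(tone), np.zeros_like(tone)  # source at front
w, x, y, z = rotate_yaw(w, x, y, z, 90)  # source now appears 90 degrees to the side
```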

Are there any aspects that you’re still figuring out?
Delivery is still a bit up in the air. You may need to export in multiple formats to be able to upload to Facebook, YouTube, etc. I was glad to see that YouTube is supporting the ambisonic format for delivery, but I look forward to seeing workflows become more standardized across the board.

Any areas in which you see the need for further development, and/or where the tech just isn’t there yet?
I think the biggest limitation with VR is the lack of affordable and easy-to-use 3D audio capture devices. I would love to see a super-portable ambisonic rig that filmmakers can easily use in conjunction with shooting 360 video. Especially as media giants like YouTube are gravitating toward the ambisonic format for delivery, it would be great for them to be able to capture the actual space in the same format.

In January 2017, Røde announced the VideoMic Soundfield — an on-camera ambisonic, 360-degree surround sound microphone — though pricing and release dates have not yet been made public.

One new product I am really excited about is the Sennheiser Ambeo VR mic, which is around $1,650. That’s a bit pricey for the most casual user once you factor in a 4-track recorder, but for the professional user that already has a 788T, the Ambeo VR mic offers a nice turnkey solution. I like that the mic looks a little less fragile than some of the other options on the market. It has a built-in windscreen/cage similar to what you would see on a live handheld microphone. It also comes with a Rycote shockmount and cable to 4-XLR, which is nice.

Some leading companies have recently selected ambisonics as the standard spatial audio format — can you talk a bit about how you use ambisonics for VR?
Yeah, I think this is a great decision. I like the “future proof” nature of the ambisonic format. Even in traditional film mixing, I like having the option to export to stereo, 5.1 or 7.1 depending on the project. Until ambisonic becomes more standardized, I like that the Two Big Ears/FB 360 encoder allows you to export to the .tbe B-Format (FuMa or ambiX/YouTube) as well as quad-binaural.

I am a huge fan of the ambisonic format in general. The Pro Sound Effects NYC Ambisonics Library (and now Chicago and Tokyo as well) was my first experience using the format and I was blown away. In a traditional mixing environment it adds another level of depth to the backgrounds. I really look forward to being able to bring it to the VR format as well.
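For readers curious what the FuMa and ambiX conventions mentioned above actually change, here is a minimal sketch in Python/NumPy of encoding a mono source into first-order ambisonic B-format. The function and its details are illustrative assumptions, not part of any of the tools discussed; encoders such as the FB360 plug-in do this for you.

```python
import numpy as np

def encode_first_order(mono, azimuth_deg, elevation_deg, convention="ambix"):
    """Pan a mono signal into first-order ambisonic B-format.

    azimuth_deg: 0 = straight ahead, positive = to the left
    elevation_deg: 0 = horizon, positive = up
    convention: "ambix" (ACN channel order W,Y,Z,X with SN3D weighting)
                or "fuma" (channel order W,X,Y,Z with W scaled by 1/sqrt(2))
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    w = mono                              # omnidirectional component
    x = mono * np.cos(az) * np.cos(el)    # front/back dipole
    y = mono * np.sin(az) * np.cos(el)    # left/right dipole
    z = mono * np.sin(el)                 # up/down dipole
    if convention == "fuma":
        return np.stack([w / np.sqrt(2), x, y, z])
    return np.stack([w, y, z, x])         # ambiX / ACN ordering

# Example: noise panned 45 degrees to the left and slightly above the horizon
noise = 0.1 * np.random.randn(48000).astype(np.float32)
bformat = encode_first_order(noise, azimuth_deg=45, elevation_deg=10)
print(bformat.shape)  # (4, 48000): W, Y, Z, X channels
```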


Andrew Emge is operations manager at Pro Sound Effects.

Review: GoPro’s Karma Grip and Quik Key

By Brady Betzel

There has been a flood of GoPro-compatible accessories introduced over the last several years, with few having as much impact as handheld stabilizers. Stabilizers have revolutionized videography (more specifically GoPro videography) and they are becoming extremely compact and very reasonably priced.

A while ago, I reviewed a GoPro Hero 3- and 4-compatible handheld stabilizer from Polaroid, which was good but had a few kinks to work out, like a somewhat clumsy way of mounting your camera.

Over the last year, GoPro has ventured into the drone market with the Karma Drone, which unfortunately stumbled out of the gate — it was recalled because of a battery latch issue — but has recently returned to the market.

When I first got my hands on the Karma Drone (the initial release), I immediately saw the benefit of buying GoPro’s drone. Along with the Karma Drone came the Karma Grip, a handheld stabilizer for the newly released Hero 5 action camera. It is really mind-blowing to be flying a drone one minute and, seconds later, remove the Karma Grip from the Karma Drone and be creating beautifully smooth shots. Handheld stabilizers like the GoPro Karma Grip have really helped shooters create more cinematically styled footage at a relatively low cost.

When GoPro sent me the Karma Grip to borrow for a few weeks, I was really excited. I received the Karma Grip between the time they recalled the Karma Drone and when they subsequently re-released it. In addition to the Karma Grip they sent me the Quik Key, a mobile microSD card reader.

In this review I’m going to share my experience with the Karma Grip as well as touch on the Quik Key and why it’s a phenomenal accessory if you want to quickly upload photos from your GoPro action cam.

Jumping In
When testing the Karma Grip I used my GoPro Hero 5 Black Edition, which is important to note because the Hero 5 has a different case build than previous GoPro models. You’ll need to purchase a different harness if you have a Hero 4. Nonetheless, I love the Hero 5. While the Hero 4 and Hero 5 have similar camera sensors, they have some major differences. First, the Hero 5 has some really sweet voice control. I’m not a huge Siri user, so I was initially skeptical when GoPro tried to sell me on the voice control. To my surprise I love it, especially when paired with the Remo waterproof voice-activated remote. To not be a total GoPro fanboy, I will avoid reviewing the Remo for now but it’s something that I really love.

The Hero 5 has a built-in waterproof housing (unlike previous versions that needed a separate waterproof housing), voice activation, easy-to-use touch screen menu system and many other features. What I’m getting at is that the Karma Grip comes out of the box to fit the Hero 5, but you can purchase the Hero 4 Harness for an additional $29.99. The Session mount will be released later in the summer.

What makes the GoPro Karma Grip different from other handheld stabilizers, in my opinion, is its build quality, ease of use and GoPro-focused mounting options. Immediately upon opening the Karma Grip box you get four key components: the removable grip handle ($99.99), the mounting ring ($29.99) and the stabilizer ($249.99) with the Hero 5 harness attached ($29.99). In addition, it all comes in a form-fitted case. The case is sturdy but kind of reminds me of a trombone case; it does the job but is a little unwieldy. When you buy the Karma Grip as a set it retails for $299.99, which is a little pricey, but in my opinion completely worth it — especially if you plan to buy the Karma Drone later, since the drone can be purchased separately.

If you know you are going to buy the Karma Drone, you should probably just go ahead and buy the whole drone package now ($1099.99 Karma Drone with the Grip and Hero 5, $799.99 Karma Drone with the Grip). If you decide later that you want the Karma Drone, you can purchase the Flight Kit for the Karma Grip for $599. For those counting at home, that comes to $899 if you purchase the Grip and the Karma Drone separately, so it’s definitely a better deal to buy it all at once if you can.

Once I opened the form-fitted Karma Grip case, I plugged the USB-C charging cable into the base of the Karma Grip handle. I kind of wish the cable plugged in somewhere other than the base, since I like to rest stabilizers on their base, but it’s not really a big deal if you have your case around. I set the Karma Grip to charge overnight, but the manual says it will take six hours on a standard 1A charger, and one hour and 50 minutes if you use the “Supercharger” — immediately I was like, what the hell is this Supercharger and why don’t I have one? They are $49.99 and can be found here.

So the next day I tried using the Karma Grip in conjunction with a suction cup mount inside of my car on my ride home from work. I wanted to see how the Karma Grip would work when mounted to a windshield (inside my car) to film a driving timelapse. To attach the Karma Grip you have to put a separate mounting ring between the handle and the stabilizer. Like a typical bonehead, I didn’t read the manual, so I tried mounting the ring with the GoPro mount. It took me a few tries to get it on right, but once it is on it actually feels very sturdy.

From there you have to do a typical GoPro mount connecting dance to get everything situated. You can check out my results here.

Admittedly, I probably should have locked the view of the Karma Grip to keep it focused straight forward, but I didn’t. It worked okay, but I definitely would need way more time to perfect this. However, if you can lock in your Karma Grip to something like the side of a train or airplane, your shooting options will become way smoother.

On the Move
Next I wanted to test running around with the Karma Grip. Once you lock your Hero 5 into the harness on the Karma Grip it’s as simple as powering on your Grip and hitting record. You can flip over the Grip to record a ground level view very easily. Flipping the Karma Grip over to a ground level view was the easiest transition on a handheld stabilizer I have ever experienced. Usually you have to either tell the stabilizer that you want to film ground level or you have to do a certain motion to not make the stabilizer flip out. The Karma Grip is incredibly easy to use; it lets you film smoothly with minimal effort.

To go a little further into testing I made a makeshift mount using a pipe and a 2×4 I had lying around. I screwed some sticky GoPro mounts to the 2×4 for mounting. In the end, I wanted to put my Hero 4 mounted alongside my Hero 5 mounted on the Karma Grip to demonstrate just how stable the Karma Grip makes your footage. You can check it out here. After a few hours of using the Karma Grip, I really felt like I had many more options when filming. I saw a staircase and knew I could run up it without my steps being reflected in my video recording; it really opens your creative brain.

One thing I wish was more easily accessible was a mount for an external microphone. In my video, I separated the audio on the left from the GoPro Hero 4 not mounted in the Karma Grip and the Hero 5 mounted in the Karma Grip. I did this so I could hear the difference. Once in the Karma Grip, the Hero 5’s audio becomes pretty muted. I know that GoPros aren’t necessarily supposed to be used with external mics, but with the GoPro’s audio not being high level all the time I sometimes use an external mic mounted on something like the iOgrapher Go or even the Karma Grip. If the Karma Grip could somehow mount a microphone along with possibly integrating a ¼-inch jack instead of having to buy a $49.99 converter I would be very happy.

Quik Key
The Quik Key is a great addition to the GoPro accessory line and is available in Lightning ($29.99), Micro-USB ($19.99) and even USB-C ($19.99) versions. It works directly with the GoPro Capture app on your mobile device to transfer photos and videos without having to hook up your GoPro or microSD card to your computer. Based on support documents, it seems like Android phones are compatible with more formats and resolutions, but since I have an iPhone, the iOS version is what I am dealing with. You can get the specific iPhone resolution compatibility chart here. It’s interesting to note that ProTune footage is specifically not compatible with iOS.

The Quik Key is great for my dad adventures (or dad-ventures!) to Disneyland, Knott’s Berry Farm, hikes, baseball games, etc. If for some odd reason one of my sons takes a nap, I can transfer some videos or images to my phone and upload them to the web while on the run. The Quik Key comes with a carabiner-style clip to hang on, but it’s definitely small enough to keep in your pocket with the Remo remote. I love the Remo for the same dad-ventures with the kids; you can use the button as a shutter release and also change shooting modes from video to photos by just saying “GoPro Photo Mode.”

Summing Up
In the end, while GoPro is digging their way out of the Karma Drone battery latch caper, I continue to love their gear. The GoPro Hero 5 is my favorite camera they’ve made to date and it’s easy to take along since you no longer need an external housing to keep it waterproof. All of the GoPro accessories like the Karma Grip, Hero 5, Hero 5 Session, mounts, three-way mount and practically anything else fit perfectly in my favorite GoPro bag, The Seeker. It’s an incredible bag that even comes with room enough for your CamelBak water bladder.

The Karma Grip is smooth and super easy to use, and it works flawlessly with the Hero 5. Coming in spring 2017 is the Karma Grip extension cable, which allows you to put your Grip handle out of sight and mount the stabilizer inconspicuously, something I bet a lot of television shows will want to use, opening the GoPro creativity door a little more.

I really love GoPro products. Even if there are other options out there, I always know that, for the most part, the GoPro product line is made up of high-quality accessories and cameras that everyone from moms and dads to professional broadcasters rely on. I can even give my GoPro to my sons to run around with and get muddy without a care in the world, letting them capture the world from their own point of view. The GoPro product line, including the Karma Grip, is full of awesome gear that I can’t recommend enough.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Lenovo intros VR-ready ThinkStation P320

Lenovo launched its VR-ready ThinkStation P320 at Develop3D Live, a UK-based conference that puts special focus on virtual reality as a productivity tool in design workflows. The ThinkStation P320 is the latest addition to the Lenovo portfolio of VR-ready certified workstations and is designed for power users looking to balance both performance and their budgets.

The workstation’s pro VR certification allows ThinkStation P320 users to more easily add virtual reality into their workflow without requiring an initial high-end hardware and software investment.

The refreshed workstation will be available in both full-size tower and small form factor (SFF) and comes equipped with Intel’s newest Xeon processors and Core i7 processors — offering speeds of up to 4.5GHz with Turbo Boost (on the tower). Both form factors will also support the latest Nvidia Quadro graphics cards, including support for dual Nvidia Quadro P1000 GPUs in the small form factor.

The ISV-certified ThinkStation P320 supports up to 64GB of DDR4 memory and customization via the Flex Module. In terms of environmental sustainability, the P320 is Energy Star-qualified, as well as EPEAT Gold and Greenguard-certified.

The Lenovo ThinkStation P320 full-size tower and SFF will be available at the end of April.

Chatting with Scorsese’s go-to editor Thelma Schoonmaker

By Iain Blair

Thelma Schoonmaker and Martin Scorsese go together like Lennon and McCartney, or Ben and Jerry. It’s hard to imagine one without the other.

Simply put, Schoonmaker has been Martin Scorsese’s go-to editor and key collaborator over the course of 23 films and half a century, winning Oscars for Raging Bull, The Aviator and The Departed. Now 77, she also recently received a career achievement award at the American Cinema Editors’ 67th Eddie Awards.

She cut Scorsese’s first feature, Who’s That Knocking at My Door, and since Raging Bull has worked on all of his feature films, including such classics as The King of Comedy, After Hours, The Color of Money, The Last Temptation of Christ, New York Stories, GoodFellas (which earned her another Oscar nomination), Cape Fear, The Age of Innocence, Casino, Kundun, Gangs of New York (another Oscar nomination), Shutter Island, Hugo (another Oscar nomination) and The Wolf of Wall Street.

Their most recent collaboration was Silence, Scorsese’s underrated and powerful epic, which is now available via Blu-ray, DVD and On Demand from Paramount Home Media Distribution.

A 28-year passion project that reinforces Scorsese’s place in the pantheon of great directors, Silence tells the story of two Christian missionaries (Adam Driver and Andrew Garfield) who travel to Japan in search of their missing mentor (Liam Neeson) at a time when Christianity was outlawed. When they are captured and imprisoned, both men are plunged into an odyssey that will test their faith, challenge their sanity and, perhaps, risk their very lives.

I recently talked with Schoonmaker about cutting Silence, working with Scorsese, and their long and storied collaboration.

Silence must have been very challenging to cut as it’s very long and could easily have ended up being a bit slow and boring.
(Laughs) You’re right! It was one of the things we were most concerned about from the start, as it’s a very meditative film. It’s nothing like his last films, Hugo and Wolf of Wall Street, and it couldn’t be more different.

Wolf had all the crazy stuff and the wild humor and improvisation, but with Silence Marty wanted to make an entirely different movie from the way most movies are made today. So that was a very brave commitment, I think, and it was difficult to find the right balance and the right pace. We experimented a great deal with just how slow it could be, without losing the audience.

Even the film’s opening scene was a major challenge. It’s very slow and sets the tone before the film even starts, with just the cicadas on the soundtrack. It tells you, slow down from our crazy lives, just feel what’s going on and engage with it. The minimal score is all part of that. It’s not telling the audience what to think, as scores usually do. He wanted the audience to decide what they feel and think, and he was adamant about starting the film off like that, which was also brave.

It feels far closer to The Age of Innocence in terms of its pacing than his more recent films.
Yes, and that was definitely a big part of its appeal for him, as it’s set in another country and also another century, so Marty wanted the film to be very meditative, and the pace of it had to reflect all that. Along with that, he was able to examine his religious concerns and interests, which he couldn’t do so much in other films. They were always there, but here they’re up front.

Did you stay in New York cutting while he shot in Taiwan, or did you visit the set?
I was in Taipei while they shot, working on the dailies, but I didn’t go on set as the locations they used were very arduous — up these steep mountains — and it took two hours just to get up there. There was bad weather and mud, wind, mosquitoes and snakes. Really, I just didn’t have the time to go on set, so I never got to see the great beauty of Taiwan, since I was back in Taipei in my editing room.

I do go on sets sometimes, and I love to visit and watch Marty work with the actors, and it’s always fun to be on the set, but as an editor, I also want to be unbiased when I sit down and watch footage. I don’t want to have my eye prejudiced by what I see on set and how difficult it might be to get a particular shot. That has nothing at all to do with my job.

How long did it take to edit?
Almost a year, but we had a couple of interruptions. Marty had to finish up his show for HBO, Vinyl, and then there was a family illness. But I love having that much time. Most editors simply don’t get to live with a film that long, and you really have to in order to understand it and understand what it’s saying to you. You’re editing the work of 250 people, and you have to respect that. You shouldn’t have to rush it.

Last time we talked, you were using Lightworks to edit. Do you use Avid now?
No, I still use Lightworks, and I still prefer it. It’s what I was trained on during the early days of digital editing, and it’s used a lot in Europe. Our first digital film was Casino, and back then Lightworks sent a computer expert to train me, and I’ve loved it ever since because it has a controller that is like the old flatbed editing machines and I love that — you can customize it very easily. It also has this button that allows me to throw stuff out of sync and experiment more, and that’s not available on Avid. So I’ve been editing on Lightworks ever since Casino.

When I last interviewed Marty, he told me that editing and post are his favorite parts of filmmaking. When you both sit down to edit it must be like having two editors in the room rather than a director and his editor?
It’s exactly like that. I do the first cut, but then once he comes in after the shoot we make every decision together. He’s a brilliant editor, and he taught me everything I know about editing — I knew nothing when we started together. He also thinks like an editor, unlike many directors. When he’s writing and then shooting, he’s always thinking about how it’ll cut together. Some directors shoot a lot of stuff, but does it cut together? Marty knows all that and what coverage he needs. He’s a genius, and such a knowledgeable person to be around every day.

You’ve been Marty’s editor since his very first film, back in 1967 — a 50-year collaboration. What’s the secret?
I think it’s that we’re true collaborators. He’s such an editing director, and we know each other so well by now, but it’s always fresh and interesting. There are no ego battles. Every film’s different, with different challenges, and he’s always curious, always learning, always open to new experiences. I feel very fortunate.

What’s next?
Right now I’m working on the diaries of my husband, famed British director Michael Powell (The Red Shoes, Black Narcissus), and then Marty and I will start The Irishman later in the summer. It’s all about elderly gangsters, with Robert De Niro and Al Pacino. It’s exciting.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Nutmeg ups Drew Hankins to editor

Nutmeg in NYC has promoted Drew Hankins to editor. Hankins, who began his career as a production assistant at Nutmeg, has been an assistant editor at the creative and post house since 2011.

In that role, he supported producers, cut spots and prepared files for various platforms — TV, web, social media and apps — for clients such as Animal Planet, A&E, Cartoon Network, Comedy Central, Discovery, Disney, ESPN, HBO, Nickelodeon, Syfy and Verizon.

Recent projects have increasingly showcased his editorial talents, including several music-video-style remixes for infectious songs from SpongeBob SquarePants, as well as the mini-documentary spoof of VH1’s Behind the Music, How Luna Became the Loudest Loud, all of which were instant viral hits. He edits on Avid Media Composer and Adobe Premiere.

“An editor is one of the last people to touch a film and, ultimately, the person who brings the film to life,” he says. “I’ve been exposed to many amazing movies over the years, but the one that made the biggest impression was Goodfellas. It’s so well-crafted; it’s perfect. It made me say, ‘That’s what I want to do!’”

He was also impressed and inspired by the film Jaws. “Editor Verna Fields was tasked with creating a suspenseful movie with very little usable footage of the malfunctioning mechanical antagonist. She managed to turn that into a plus, creating chills with only glimpses of a fin or ripples in the water. She went on to win the Oscar for Film Editing. As Spielberg famously observed, ‘Had the shark been working, perhaps the film would have made half the money and been half as scary.’”

What gives Hankins a feeling of accomplishment? “Seeing something I cut, out in the wild. Just knowing that others are seeing it makes me feel good.”

Photo credit: Eljay Aguillo

Building a workflow for The Great Wall

Bling Digital, which is part of the SIM Group, was called on to help establish the workflow on Legendary/Universal’s The Great Wall, starring Matt Damon as a European mercenary imprisoned within the wall. While being held he sees exactly why the Chinese built this massive barrier in the first place — and it’s otherworldly. This VFX-heavy mystery/fantasy was directed by Yimou Zhang.

We reached out to Bling’s director of workflow services, Jesse Korosi, to talk us through the process on the film, including working with data from the Arri 65, which at that point hadn’t yet been used on a full-length feature film. Bling Digital is a post technology and services provider that specializes in on-set data management, digital dailies, editorial system rentals and data archiving.

Jesse Korosi

When did you first get involved on The Great Wall and in what capacity?
Bling received our first call from the unit production manager Kwame Parker about providing on-set data management, dailies, VFX and stereo pulls, Avid rentals and a customized process for the digital workflow for The Great Wall in December of 2014.

At this time the information was pretty vague, but it outlined some of the bigger challenges: the film would be shot in multiple locations within China, and the Arri 65, which had not yet been used on a full-length feature, might be the camera. From this point on I worked with our internal team to figure out exactly how we would tackle such a challenge. This also required a lot of communication with the software developers to ensure they would be ready to provide updated builds that could support this new camera.

We then had talks with the DP Stuart Dryburgh, the studio and a few other members of production. A big part of my job, and of anyone’s on my workflow team, is to get involved as early as possible, so our role doesn’t necessarily start on day one of principal photography. We want to get in and start testing and communicating with the rest of the crew well ahead of time so that by the first day, the process runs like a well-oiled machine and the client never has to be concerned with “week-one kinks.”

Why did they opt for the Arri 65 camera and what were some of the challenges you encountered?
Many people who we work with love Arri. The cameras are known for recording beautiful images. Anyone who isn’t a huge Arri fan might dislike the lower resolution of some of the cameras, but it is very uncommon for someone not to like the final look of the recorded files. Enter the Arri 65, a new camera that can record 6.5K files (6560×3100), and every hour recorded is a whopping 2.8TB of data.

When dealing with this kind of data consumption, you really need to re-evaluate your pipeline. The cards are not able to be downloaded by traditional card readers — you need to use vaults. Let’s say someone records three hours of footage in a day — that equals 8.7TB of data. If you’re sending that info to another facility even using a 500Mb/s Internet line, that would take 38 hours to send! LTO-ing this kind of media is also dreadfully slow. For The Great Wall we ended up setting up a dedicated LTO area that had eight decks running at any given time.
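As a quick sanity check on those numbers, the transfer-time math works out as follows (a back-of-envelope sketch in Python; the 8.7TB and 500Mb/s figures are from the interview, the helper function is just for illustration):

```python
def transfer_hours(terabytes, line_mbps):
    """Hours needed to push `terabytes` of data over a `line_mbps` (megabit/s) link."""
    bits = terabytes * 1e12 * 8           # decimal terabytes -> bits
    return bits / (line_mbps * 1e6) / 3600

# Roughly three hours of Arri 65 footage (~8.7TB, per the figures above) over a 500Mb/s line:
print(round(transfer_hours(8.7, 500), 1))  # -> 38.7 hours, i.e. the "38 hours" quoted above
```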

Aside from data consumption, we faced the challenge of having no dailies software that could even read the files. We worked with Colorfront to get a new build-out that could work, and luckily, after having been through this same ordeal recording Arri Open Gate on Warcraft, we knew how to make this happen and set the client at ease.

Were you on set? Near set? Remote?
Our lab was located in the production office, which also housed editorial. Considering all of the traveling this job entailed, from Beijing and Qingdao to Gansu, we were mostly working remotely. We wanted to be as close to production as possible, but still within a controlled environment.

The dailies set-up was right beside editor Craig Wood’s suite, making for a close-knit workflow with editorial, which was great. Craig would often pull our dailies team into his suite to view how the edit was coming along, which really helped when assessing how the dailies color was working and referencing scenes in the cut when timing pickup shots.

How did you work with the director and DP?
At the start of the show we established some looks with the DP, Stuart Dryburgh, ASC. The idea was that we would handle all of the dailies color in the lab. The DIT/DMT would note as much valuable information as possible on set about the conditions that day, and we would use our best judgment to fulfill the intended look. During pre-production we used a theatre at the China Film Group studio to screen and review all the test materials and dial in this look.

With our team involved from the very beginning of these color talks, we were able to ensure that decisions made on color and data flow were going to track through each department, all the way to the end of the job. It’s very common for decisions to be made color wise at the start of a job that get lost in the shuffle once production has wrapped. Plus, sometimes there isn’t anyone available who recognizes why certain decisions were made up front when you‘re in the post stage.

Can you talk us through the workflow? 
In terms of workflow, the Arri 65 was recording media onto Codex cards, which were backed up on set with a Vault S. After this media was backed up, the Codex card would be forwarded on to the lab. Within the lab we had a Vault XL that would then be used to back this card up to the internal drive. Unfortunately, you can’t go directly from the card to your working drive; you need to do two separate passes on the card, a “Process” and a “Transfer.”

The Transfer moves the media off the card and onto an internal drive on the Vault. The Process then converts all the native camera files into .ARI files. Once this media was processed and on the internal drive, we were able to move it onto our SAN. From there we could run the footage through OSD and make LTO back-ups. We also made additional back-ups to G-Tech G-Speed Studio drives that would be sent back to LA. However, for security purposes as well as efficiency, we encrypted and shipped the bare drives rather than the entire chassis. This meant that when the drives were received in LA, we were able to mount them in our dock and work directly off of them, i.e., no need to wait on any copies.

Another thing that required a lot of back and forth with the DI facility was ensuring that our color pipeline was following the same path they would take once they hit final color. We ended up having input LUTs for any camera that recorded a non-LogC color space. As for my involvement, during production in China I had a few members of my team on the ground and I was overseeing things remotely. Once things came back to LA and we were working out of Legendary, I became much more hands-on.

What kind of challenges did providing offline editorial services in China bring, and how did that transition back to LA?
We sent a tech to China to handle the set-up of the offline editorial suites and also had local contacts to assist during the run of the project. Our dailies technicians also helped with certain questions or concerns that came up.

Shipping gear for the Avids is one thing; shipping consoles (desks) for the editors, however, would have been far too heavy. This was probably one of the bigger challenges — ensuring the editors were working with the same caliber of workspace they were used to in Los Angeles.

The transition of editorial from China to LA required Dave French, director of post engineering, and his team to mirror the China set-up in LA and have both up and running at the same time to streamline the process. Essentially, the editors needed to stop cutting in China and have the ability to jump on a plane and resume cutting in LA immediately.

Once back in LA, you continued to support VFX, stereo and editorial, correct?
Within the Legendary office we played a major role in building out the technology and workflow behind what was referred to as the Post Hub. This Post Hub was made up of a few different systems all KVM’d into one desk that acted as the control center for VFX and stereo reviews, VFX and stereo pulls and final stereo tweaks. All of this work was controlled by Rachel McIntire, our dailies, VFX and stereo management tech. She was a jack-of-all-trades who played a huge role in making the post workflow so successful.

For the VFX reviews, Rachel and I worked closely with ILM to develop a workflow to ensure that all of the original on-set/dailies color metadata would carry into the offline edit from the VFX vendors. It was imperative that during these sessions we could add or remove the color, make adjustments and match exactly what they saw on set, in dailies and in the offline edit. Automating this process through values from the VFX editor’s EDL was key.
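The interview doesn’t spell out how that automation was built, but per-shot color is commonly carried in an EDL as ASC CDL comment lines (*ASC_SOP and *ASC_SAT). As a hypothetical illustration only, and not Bling’s or ILM’s actual tooling, pulling those values out of a CMX3600-style EDL in Python might look like this:

```python
import re

# ASC CDL values are commonly carried in EDLs as comment lines, for example:
#   *ASC_SOP (1.0210 0.9860 1.0150)(0.0025 -0.0010 0.0000)(1.0500 1.0000 0.9800)
#   *ASC_SAT 0.9000
SOP_RE = re.compile(r"\*\s*ASC_SOP\s*\(([^)]+)\)\s*\(([^)]+)\)\s*\(([^)]+)\)")
SAT_RE = re.compile(r"\*\s*ASC_SAT\s*([-\d.]+)")

def parse_cdl_events(edl_text):
    """Return one dict per EDL event with any slope/offset/power/saturation found."""
    events, current = [], None
    for line in edl_text.splitlines():
        if re.match(r"^\d{3,}\s", line):         # event lines start with the edit number
            current = {"event": line.split()[0]}
            events.append(current)
        elif current is not None:
            sop = SOP_RE.search(line)
            if sop:
                current["slope"], current["offset"], current["power"] = (
                    [float(v) for v in group.split()] for group in sop.groups()
                )
            sat = SAT_RE.search(line)
            if sat:
                current["saturation"] = float(sat.group(1))
    return events
```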

Looking back on the work provided, what would you have done differently knowing what you know now?
I think the area I would focus on next time around would be upgrading the jobs database. With any job we manage at Bling, we always ensure we keep a log of every file recorded and any metadata that we track. At the time, this was a little weak. Since then, I have been working on overhauling this database and allowing creative to access all camera metadata, script metadata, location data, lens data, etc. in one centralized location. We have just used this on our first job in a client-facing capacity and I think it would have done wonders for our VFX and stereo crews on The Great Wall. It is all too often that people are digging around for information already captured by someone else. I want to make sure there is a central repository for that data.

Origins: The Creative Storage Conference

By Tom Coughlin

I was recently asked how the Creative Storage Conference came to be. So here I am to give you some background.

In 2006, the Storage Visions Conference that my colleagues and I had been organizing just before the CES show in January was in its fifth year. I had been doing more work on digital storage for professional media and entertainment, including a report on this important topic. In order to increase my connections and interaction with both media and entertainment professionals and the digital storage and service companies that support them, it seemed that a conference focusing on digital storage for media and entertainment would be in order.

That same year, my partner Ron Dennison and I participated in the MediaTech Conference in the LA area, working with Bryan Eckus, the director of the group at the time. In 2007, we held the first Creative Storage Conference in conjunction with the MediaTech Conference in Long Beach, California. It featured a dynamite line-up of storage companies and end users.

The conference has grown in size over the years, and we have had a stream of great companies showing their stuff, media and entertainment professional attendees and speakers, informative sessions and insightful keynote talks on numerous topics related to M&E digital storage.

The 2017 Creative Storage Conference
This year, the Creative Storage Conference is taking place on May 24 in Culver City. Attendees can learn more about the use of Flash memory in M&E as well as the growth in VR content in professional video, and how this will drive new digital storage demand and technologies to support the high data rates needed for captured content and cloud-based VR services. This is the 11th year of the conference and we look forward to having you join us.

We are planning for six sessions and four keynotes during the day and a possible reception in the evening on May 24.

Here is a list of the planned sessions:
• Impact of 4K/HDR/VR on Storage Requirements From Capture to Studio
• Collaboration in the Clouds: Storing and Delivering Content Where it is Needed
• Content on the Move: Delivering Storage Content When and Where it is Needed
• Preserving Digital Content — the Challenges, Needs and Options
• Accelerating Workflows: Solid State Storage in Media and Entertainment
• Independent Panel — Protecting the Life of Content

Don’t miss this opportunity to meet giants in the field of VR content capture and post production, and to meet the storage and service companies that can help make your next professional projects a big success.

• Hear how major media equipment suppliers and entertainment industry customers use digital storage technology in all aspects of content creation and distribution.
• Find out the role that digital storage plays in new content distribution and marketing opportunities for a rapidly evolving market.
• See presentations on digital storage in digital acquisition and capture, nonlinear editing and special effects.
• Find out how to convert and preserve content digitally and protect it in long-term dependable archives.
• Learn about new ways to create and use content metadata, making it easier to find and use.
• Discover how to combine and leverage hard disk drives, flash memory, magnetic tape and optical storage technology with new opportunities in the digital media market.
• Be at the juncture of digital storage and the next generation of storage for the professional media market.

Online registration is open until May 23, 2017. As a media and entertainment professional you can register now with a $100 discount using this link:

—–
Thomas Coughlin, president of Coughlin Associates, is a storage analyst and consultant with over 30 years in the data storage industry. He is active with SNIA, SMPTE, IEEE and other professional organizations.

Industry vet Alex Moulton joins NYC’s Trollbäck+Company

New York’s Trollbäck+Company has hired Alex Moulton as chief creative officer. He has been tasked with helping businesses and organizations develop sustainable brands through design-driven strategy and mixed media.

Moulton, who joins the agency from Vice Media, was recently at the helm of NBC Universo’s highly regarded brand refresh, as well as show packaging for ESPN’s The Undefeated In-Depth: Serena With Common.

“Alex brings an invaluable perspective to Trollbäck+Company as both an artist and entrepreneur,” says founder Jakob Trollbäck. “In his short time here, he has already reinvigorated the collective creative energy of our company. This clearly stems from his constant quest to dig deeper as a creative problem solver, which falls perfectly into our philosophy of ‘Discard everything that means nothing.’”

Says Moulton, “My vision for Trollbäck+Company is very clear: design culturally relevant, sustainable brands — from initial strategy and positioning to content and experiential activations —  with a nimble and holistic approach that makes us the ultimate partner for CMOs that care about designing an enduring brand and bringing it to market with integrity.”

Prior to Trollbäck+Company, as senior director, creative and content at Vice, Moulton helped launch digital content channel Live Nation TV (LNTV) — a joint venture for which he led brand creative, content development, production and partnership initiatives.

As executive creative director at advertising agency Eyeball, Moulton led product launches, rebrands and campaigns for major brands, including Amazon, New York Public Radio, Wildlife Conservation Society’s New York Aquarium, A&E, CMT, Disney, E!, Nickelodeon, Oxygen, Ovation and VH1.

An early adopter of audio branding, Moulton founded his own branding agency and record label, Expansion Team, in 2002. As chief creative officer of the company, he created the sonic identities of Aetna, Amazon Studios/Originals, Boeing, JetBlue and Rovi, as well as more than 15 TV networks, including CNN International, Discovery, PBS, Universal and Comedy Central.

A DJ, composer and speaker about topics that combine music and design, Moulton has been featured in Billboard, V Man, Electronic Musician and XLR8R and has performed at The Guggenheim.