Category Archives: Editing

EditShare intros software-only Flow MAM, more at NAB

During NAB 2018, EditShare launched a new standalone version of its Flow MAM software, designed for non-EditShare storage environments such as Avid Nexis, StorageDNA and Amazon S3. Flow adds to an existing storage infrastructure an intelligent media management layer that can manage millions of assets across multiple storage tiers in different locations.

EditShare will spotlight the new Flow version as well as a new family of solutions in its QScan Automated Quality Control (AQC) software line, offering cost-effective compliance and delivery check capabilities and integration across production, post and delivery. In addition, EditShare will unveil its new XStream EFS auditing dashboard, aligned with Motion Picture Association of America (MPAA) best practices to promote security in media-engineered EFS storage platforms.

The Flow suite of apps helps users manage content and associated metadata from ingest through to archive. At the core of Flow are workflow engines that enable collaboration through ingest, search, review, logging, editing and delivery, and a workflow automation engine for automating tasks such as transcoding and delivery. Flow users are able to review content remotely and also edit content on a timeline with voiceover and effects from anywhere in the world.

Along with over 500 software updates, the latest version of Flow features a redesigned and unified UI across web-based and desktop apps. Flow also has new capabilities for remotely viewing Avid Media Composer or Adobe Premiere edits in a web browser; range markers for enhanced logging and review capabilities; and new software licensing with a customer portal and license management tools. A new integration with EditShare’s QScan AQC software makes AQC available at any stage of the post workflow.

Flow caters to the increased demand for remote post workflows by enabling full remote access to content, as well as integration with leading NLEs such as Avid Media Composer and Adobe Premiere. Comments James Richings, EditShare managing director, “We are seeing a huge demand from users to interact and collaborate with each other from different locations. The ability to work from anywhere without incurring the time and cost of physically moving content around is becoming much more desirable. With a simple setup, Flow helps these users track their assets, automate workflows and collaborate from anywhere in the world. We are also introducing a new pay-as-you-go model, making asset management affordable for even the smallest of teams.”

Flow will be available through worldwide authorized sales partners and distributors by the end of May, with monthly pricing starting at $19 per user.

Arvato previews EditMate SaaS for Adobe Premiere Pro

At NAB 2018, Arvato Systems previewed an on-demand, SaaS edition of VPMS EditMate, the company’s production asset management system for Adobe Premiere Pro CC.

According to Arvato, the new SaaS version of EditMate addresses the complexity of today’s editing projects by bringing all projects and media into a single, searchable library and eliminating complex folder structures. EditMate also manages project templates, automatically importing regularly used media assets (IDs, graphics, bugs, etc.) into projects and setting up sequence parameters.

The high bitrates of large media files can make remote editing tedious and, in some cases, impractical. EditMate addresses this with an adaptive streaming engine that uses available bandwidth to deliver a high-quality, full-HD “proxy.” This gives users the full Premiere Pro CC experience and toolset without requiring them to wait for or copy large media files. Once users complete an edit, they can check in sequences or subclips “to air” or for review. All of the rendering and high-res creation happens back in the facility, or wherever the original material is stored.

A trial version of the EditMate on-demand offering will be available through a web portal alongside entry-level packages for small-to-medium-sized workgroups and advanced packages that include the remote editing features. Additional seats can be added as needed on a monthly basis while enterprise-wide and “private cloud” deployments will also be available on request.


Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film himself. If you like what you see in the trailer, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond.

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the scriptwriting stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex it would be to create an idea I was writing into the script. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically, as there were no “fix it in post” scenarios in this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep we actually shot a test scene to see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and cast to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel — with each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact that my DP, Adam Batchelor — who had shot Sync with me, as well as the proof-of-concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out. More importantly, he was able to get cinematic imagery out of those cameras.

Blackmagic was very supportive of the film, and has been supportive of my career since my short films, so the company came on as one of the executive producers. No one had ever shot a full feature film using just Blackmagic cameras, and we also used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccup was that we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think is the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. That was possible because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. It made the handover as simple as exporting my Resolve file; Max would load up that file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more — this would not have been possible without the support of my VFX team who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked down on that first before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff from Battlestar Galactica and The Flash), was greenlit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP and more contracts. The advice I got from other filmmakers was that the right distributor plays a big part in how your film is released, and to me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal due to the documentary narrative. We did first try to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity to enable object tracking to work as accurately as possible. We also shot reference footage from all angles to help with match moving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 was human consciousness in a synthetic shell, breathing didn’t make sense and we began making up for it by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to have used some form of greenscreen or bluescreen, but we didn’t — in fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. We put an actor wearing an astronaut suit in a bright photography room with brightly exposed lighting to give a surreal feeling, then used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.


Editor Dylan Tichenor to headline SuperMeet at NAB 2018

For those of you heading out to Las Vegas for NAB 2018, the 17th annual SuperMeet will take place on Tuesday, April 10 at the Rio Hotel. Speaking this year will be Oscar-nominated film editor Dylan Tichenor (There Will Be Blood, Zero Dark Thirty). Additionally, there will be presentations from Blackmagic, Adobe, Frame.io, HP/Nvidia, Atomos and filmmaker Bradley Olsen, who will walk the audience through his workflow on Off the Tracks, a documentary about Final Cut Pro X.

Blackmagic Resolve designers Paul Saccone, Mary Plummer, Peter Chamberlain and Rohit Gupta will answer questions on all things DaVinci Resolve, Fusion and Fairlight audio.

Adobe Premiere Pro product manager Patrick Palmer will reveal new features across Adobe’s video solutions for editing, color, graphics and audio workflows.

Frame.io CEO Emery Wells will preview the next generation of its collaboration and workflow tool, which will be released this summer.

Atomos’ Jeromy Young will talk about some of their new partners. He says, “It involves software and camera makers alike.”

As always, the evening will round out with the SuperMeet’s “World Famous Raffle,” where the total value of prizes has now reached over $101,000. Part of that total includes a Blackmagic Advanced Control Panel, worth $29,995.

Doors will open at 4:30pm with the SuperMeet Vendor Showcase, which features 23 software and hardware developers. Those attending can enjoy a few cocktails and mingle with industry peers.

To purchase tickets, and for complete daily updates on the SuperMeet, including agenda updates, directions, transportation options and a current list of raffle prizes, visit the SuperMeet website.


NAB: Adobe’s spring updates for Creative Cloud

By Brady Betzel

Adobe has had a tradition of releasing Creative Cloud updates prior to NAB, and this year is no different. The company has been focused on improving existing workflows and adding new features, some based on Adobe’s Sensei technology, as well as VR enhancements.

In this release, Adobe has announced a handful of Premiere Pro CC updates. While I personally don’t think that they are game changing, many users will appreciate the direction Adobe is going. If you are color correcting, Adobe has added the Shot Match function that allows you to match color between two shots. Powered by Adobe’s Sensei technology, Shot Match analyzes one image and tries to apply the same look to another image. Included in this update is the long-requested split screen to compare before and after color corrections.

Motion graphic templates have been improved with new adjustments like 2D position, rotation and scale. Automatic audio ducking has been included in this release as well. You can find this feature in the Essential Sound panel, and once applied it will essentially dip the music in your scene based on dialogue waveforms that you identify.

Still inside of Adobe Premiere Pro CC, but also applicable in After Effects, is Adobe’s enhanced Immersive Environment. This update is for people who use VR headsets to edit and/or process VFX. Team Project workflows have been updated with better version tracking and indicators of who is using bins and sequences in realtime.

New Timecode Panel
Overall, while these updates are helpful, none are barn burners. The thing that does have me excited is the new Timecode Panel — it’s the biggest new update to the Premiere Pro CC app. For years now, editors have been clamoring for more than just one timecode view. You can view sequence timecodes, source media timecodes from the clips on the different video layers in your timeline, and you can even view the same sequence timecode in a different frame rate (great for editing those 23.98 shows to a 29.97/59.94 clock!). And one of my unexpected favorites is the clip name in the timecode window.
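
To see why multiple timecode readouts matter, here is a minimal sketch (my own illustration, not Adobe code) of how the same running time reads as timecode at different frame rates. Real 29.97/59.94 workflows would also apply drop-frame compensation, which this deliberately ignores:

    # A minimal sketch (not Adobe code): the same running time expressed
    # as timecode at different frame rates. Drop-frame compensation for
    # 29.97/59.94 is ignored to keep the example short.
    def seconds_to_tc(seconds: float, fps: float) -> str:
        nominal = round(fps)                   # 23.976 -> 24, 29.97 -> 30, etc.
        total_frames = round(seconds * fps)
        frames = total_frames % nominal
        s = total_frames // nominal
        return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{frames:02d}"

    for rate in (23.976, 29.97, 59.94):
        print(f"{rate}: {seconds_to_tc(90.0, rate)}")  # 90 seconds into the cut

The frame counter in the last field runs at a different pace for each rate, which is exactly the discrepancy the new panel lets you keep an eye on.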

I was testing this feature in a pre-release version of Premiere Pro, and it was a little wonky. First, I couldn’t dock the timecode window. While I could add lines and access the different menus, my changes wouldn’t apply to the row I had selected; I could only right-click and try to change the first row of contents, but it would choose a random row to change. I am assuming the final release has this all fixed. If the wonkiness gets flushed out, this is a phenomenal (and necessary) addition to Premiere Pro.

Codecs, Master Property, Puppet Tool, more
There have been some codec compatibility updates, specifically Sony Raw X-OCN (Venice), Canon Cinema Raw Light (C200) and Red IPP2.

After Effects CC has also been updated with Master Property controls. Adobe said it best during their announcement: “Add layer properties, such as position, color or text, in the Essential Graphics panel and control them in the parent composition’s timeline. Use Master Property to push individual values to all versions of the composition or pull selected changes back to the master.”

The Puppet tool has been given some love with a new Advanced Puppet Engine, which improves the mesh and starch workflows used to animate static objects. Beyond the Add Grain, Remove Grain and Match Grain effects becoming multi-threaded, enhanced disk caching and project management improvements have been added.

My favorite update for After Effects CC is the addition of data-driven graphics. You can drop a CSV or JSON data file and pick-whip data to layer properties to control them. In addition, you can drag and drop data right onto your comp to use the actual numerical value. Data-driven graphics is a definite game changer for After Effects.
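
As a quick illustration (the data and field names below are hypothetical, not from Adobe’s documentation), a JSON file like this could be dropped into a project, with its values pick-whipped to layer properties such as source text or position:

    {
      "title": "Weekly Highs",
      "temps": [
        { "day": "Mon", "high": 72 },
        { "day": "Tue", "high": 68 },
        { "day": "Wed", "high": 75 }
      ]
    }

Swap in a new file with the same structure and the graphic updates itself, which is what makes the feature such a time-saver for chart- and template-heavy work.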

Audition
While Adobe Audition is an audio mixing application, it has some updates that will directly help anyone looking to mix their edit in Audition. In the past, to get audio to a mixing program like Audition, Pro Tools or Fairlight you would have to export an AAF (or if you are old like me possibly an OMF). In the latest Audition update you can simply open your Premiere Pro projects directly into Audition, re-link video and audio and begin mixing.

I asked Adobe whether you could go back and forth between Audition and Premiere, but it seems like it is a one-way trip. They must be expecting you to export individual audio stems once done in Audition for final output. In the future, I would love to see back-and-forth capabilities between apps like Premiere Pro and Audition, much like the Fairlight tab in Blackmagic’s Resolve. There are some other updates, like larger track counts and under-the-hood improvements, which you can read more about at https://theblog.adobe.com/creative-cloud/.

Adobe Character Animator has some cool updates, including improvements to overall character building, but I am not too involved with Character Animator, so you should definitely read about things like the trigger improvements on Adobe’s blog.

Summing Up
In the end, it is great to see Adobe moving forward on updates to its Creative Cloud video offerings. Data-driven animation inside of After Effects is a game-changer. Shot color matching in Premiere Pro is a nice step toward a professional color correction application. Importing Premiere Pro projects directly into Audition is definitely a workflow improvement.

I do have a wishlist though: I would love for Premiere Pro to concentrate on tried-and-true solutions before adding fancy updates like audio ducking. For example, I often hear people complain about how hard it is to export a QuickTime out of Premiere with either stereo or mono/discrete tracks. You need to set up the sequence correctly from the jump, adjust the pan on the tracks, as well as adjust the audio settings and export settings. Doesn’t sound streamlined to me.

In addition, while shot color matching is great, let’s get an Adobe SpeedGrade-style view tab into Premiere Pro so it works like a professional color correction app… maybe Lumetri Pro? I know if the color correction setup was improved I would be way more apt to stay inside of Premiere Pro to finish something instead of going to an app like Resolve.

Finally, consolidating and transcoding used clips with handles is hit or miss inside of Premiere Pro. Can we get a rock-solid consolidate and transcode feature inside of Premiere Pro? Regardless of some of the few negatives, Premiere Pro is an industry staple and it works very well.

Check out Adobe’s NAB 2018 update video playlist for details on each and every update.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


NYC’s Wax adds editor Kate Owen

Kate Owen, an almost 20-year veteran who has cut both spots and films, has joined the editorial roster of New York’s Wax. The UK-born but New York-based Owen has edited projects across fashion, beauty, lifestyle and entertainment for brands such as Gucci, Victoria Beckham, Vogue, Adidas, Sony, Showtime and Virgin Atlantic.

Owen started editing in her teens and subsequently worked with top-tier agencies like Mother, Saatchi NY, McGarryBowen, Grey Worldwide and Y&R. She has also worked at editing houses Marshall Street Editors and Whitehouse Post.

In terms of recognition, Owen has been BAFTA-nominated for her short film Turning and has won multiple industry awards, including One Show, D&AD and BTAA honors, as well as a Gold Cannes Lion for her work on “The Man Who Walked Around the World” campaign for Johnnie Walker.

Owen believes editing is a “fluid puzzle. I create in my mind a somewhat Minority Report wall with all the footage in front of me, where I can scroll through several options to try out and create fluid visual mixes. It’s always the unknown journey at the start of every project, and the fascination that comes with honing and fine-tuning, or tearing an edit upside down and viewing it from a totally different perspective, that is so exciting to me.”

Regarding her new role, she says, “There is a unique opportunity to create a beauty, fashion and lifestyle edit arm at Wax. The combination of my edit aesthetic and the company’s legacy in agency beauty work is really exciting to me.”

Owen calls herself “a devoted lifetime Avid editor.” She says, for her, it’s the most elegant way to work. “I can build walls of thumbnails in my Selects Bins and create living mood boards. I love how I can work in very detailed timelines and speed effects without having to break my workflow.”

She also gives a shout out to the Wax design and VFX team. “If we need to incorporate After Effects or Maxon Cinema 4D, I am able to brief and work with my team and incorporate those elements into my offline. I also love to work with the agency or director to work out a LUT before the shoot so that the offline looks premium right from the start.”


First-time director of Beyond Transport calls on AlphaDogs for post

The new documentary Beyond Transport, directed and produced by Ched Lohr, focuses on technology and how it has brought people together while at the same time creating a huge disconnect in personal relationships. The doc examines this topic from the perspective of cab drivers. Shot on all seven continents, the film includes interviews with drivers who share their accounts of how socializing has changed dramatically in the 21st century.

Eighteen months in the making, Beyond Transport was shot intermittently due to an extensive travel schedule to countries that included Ireland, Cambodia, Tanzania and Australia. An unexpected conversation with a cab driver in Cairns, Australia, and a dive trip to the Great Barrier Reef were what initially inspired Lohr to make the film. “I noticed all the divers were using their personal devices in between dives,” says Lohr. “It seemed like meeting new people and connecting with others has become less of a priority. I thought it would be interesting to interview cab drivers because they have a very unique perspective on people’s behaviors.”

A physician by trade, Lohr had a vision for the documentary but no idea how to go about creating it. With no background in producing, writing or editing, Lohr assembled a team of pros to help guide him through the process, including the team at Burbank’s AlphaDogs to complete post on the film.

AlphaDogs colorist Sean Stack distinguished the climates of the various locations by choosing specific color palettes. This helped bring audiences into the story, giving them a feel for what it might be like to actually be there in person. “The filmmaker talks to cab drivers from a variety of climates, ranging from the searing heat of Tanzania to the frigid temperatures of Antarctica,” describes Stack. “With that in mind, I navigated through the documentary looking for ways to help define the surroundings.”

To accomplish this, Stack added saturated warm colors, such as yellow, tan and brown to locations in South Africa and South America, making even the dirt, cars and buildings radiate a sense of intense heat. In contrast, less saturation was given to the harsher climate of Antarctica, using a series of blue tones for both the sky and the water, which added depth, and also gave a more frigid and crisp appearance. Blackmagic’s DaVinci Resolve Power Windows were used to fix problems with uncontrolled lighting situations present in the interviews with cab drivers. Hand-held footage was also stabilized, with a final touch of film grain added to take away from a videotape feel and give a more inviting texture to the documentary.

In addition, Stack created an end credits section by pulling shots of the cab drivers looking into the camera and smiling. “This accomplished the filmmaker’s goal of having pictures accompany the end credits,” explains Stack. “It also added another element of connection to the drivers who are telling the story. Seeing them one last time reminds the viewer of some of the best moments in the documentary, and hopefully they take those memorable moments away with them.”

AlphaDogs audio engineer Curtis Fritsch completed audio post on the film, which included cleaning up noisy audio files, since almost all of the interviews take place inside a cab. To keep the audio from sounding overprocessed, Fritsch used a very specific combination of CEDAR and iZotope plugins. “We were able to find a really good balance in making the dialogue sound much clearer and more pronounced,” he says. “This was of particular importance in the scene where a muezzin is reciting the adhan (call to prayer). I was able to remove the wind noise so you not only heard the prayer in this dreamlike sequence, but also kept the focus on the music, rather than the VFX.”


Kathrin Lausch joins Uppercut as EP

New York post shop Uppercut has added Kathrin Lausch as executive producer. Lausch has over two decades of experience as an executive producer for top production and post production companies such as MPC, Ntropic, B-Reel, Nice Shoes, Partizan and Compass Films, among others. She has led shops on the front lines at the outset of digital, branded content, reality television and brand-direct production.

“I joined Uppercut after being very impressed with Micah Scarpelli’s clear understanding of the advertising market, its ongoing changes and his proactive approach to offer his services accordingly,” explains Lausch. “The new advertising landscape is offering up opportunities for boutique shops like Uppercut, and interesting conversations and relationships can come out of having a clear and focused offering. It was important to me to be part of a team that embraces change and thrives on being a part of it.”

Half French, half German, Lausch followed dual pursuits in law and art in NYC before finding her way to the world of production. She launched Passport Films, which later became Compass Films. After selling the company, she followed the rise of the digital advertising marketplace, landing with B-Reel. She then made the shift to post production, further embracing the new digital landscape as executive producer at Nice Shoes and Ntropic before landing as head of new business at MPC.


Behind the Title: Arcade Edit’s Ali Mao

NAME: Ali Mao

COMPANY: Arcade Edit in New York City

CAN YOU DESCRIBE YOUR COMPANY?
Arcade is a film and television editorial house with offices located in Los Angeles and New York City.

WHAT’S YOUR JOB TITLE?
Editor

WHAT DOES THAT ENTAIL?
Being an editor is all about storytelling. Whether that means following the script and boards as designed or playing outside the parameters of those guidelines, we set the pace and tone of a piece in hopes that our audience reacts to it. Sometimes it’s super easy and everything just falls into place. Other times it requires a bit more problem solving on my end, but I’m always striving to tell the story the best I can.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
A lot of people who don’t work in the industry think editors just sit in a dark room alone all the time, and we do sometimes! But what I love most about editing is how collaborative a process it is. So much of what we do is working with the director and the creatives to find just the right pieces that help tell their story most effectively.

Aflac

Once in a while the best cuts are not even what was originally boarded or conceived, but what was found through the exploration of editing. When you fall in love with a character, laugh at a joke or cry at an emotional moment, it’s a result of the directing, the acting and the editing all working perfectly in sync with one another.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love going through dailies for the first time and seeing how the director and the cinematographer compose a particular scene or how an actor interprets lines, especially when you pick up on something in a take that you as an editor love – a subtle twitching of an eye or the way the light captures some element of the image – that everyone forgot about until they see it in your edit.

WHAT’S YOUR LEAST FAVORITE?
Not having enough time to really sit with the footage before I start working with the director or agency.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
Early in the morning even though I’m not really a morning person…but in our industry, that’s probably the quietest time of the day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Bumming it at the beach back home in Hawaii.

WHY DID YOU CHOOSE THIS PROFESSION?
During the summer before my junior year of high school, I stumbled upon Vivacious Lady (with Jimmy Stewart and Ginger Rogers) on AMC. I don’t know what it was about that movie, but I stayed up until 2am watching the whole thing.

For the next two years, every Sunday I’d grab the TV guide from the morning newspaper and review the AMC and TCM lineups for the week. Then I’d set my VCR to record every movie I wanted to see, which at the time were mostly musicals and rom-coms. When my dad asked me what I wanted to study in college, I said film, because at 5’4” getting paid to play basketball probably wasn’t going to happen, and those old AMC and TCM movies were my next favorite thing.

When I got to college, I was taught the basics of FCP in a digital filmmaking class and fell in love with editing instantly. I liked how there was a structure to the process of it, while simultaneously having a ton of creative freedom in how to tell the story.

Tide

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
In January I worked with Saatchi on their Tide Super Bowl campaign, editing the television teasers and 15s. This was the second year in a row that I got to work with them for the Super Bowl, and it’s one of my favorite jobs every year. They do some really fun and creative work for their teasers, and there’s so much opportunity to experiment and get a little weird.

There was the Aflac Ski Patrol spot, and I also just finished a Fage campaign with Leo Burnett, which went incredibly well. Matt Lenski from Arts & Science did an incredible job with the shoot and provided me with so many options for how to tell the story of each spot.

DO YOU PUT ON A DIFFERENT HAT WHEN CUTTING FOR A SPECIFIC GENRE?
I think you put on a different hat whenever you start any project, regardless of genre. Every comedy piece or visual piece is unique in its story, rhythm, etc. I definitely try to put myself in the right head space for editing a specific genre, whether that be from chatting with the director/agency or doing a deep dive on the Internet looking for inspiration from films, ads, music videos — anything really.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I worked as an editor on a documentary called Undroppable. It was about the school dropout rate across the US and followed students from different parts of the country, focusing on the challenges of graduating high school.

The film had already been edited by the time I got involved, but the producer felt it needed fresh eyes. I loved a lot of what the previous editors had done, and felt like the one thing I could bring to the film was focus. There were so many compelling stories that it sometimes felt like you never had a chance to really take any of it in. I wanted the audience to not just fall in love with these students and root for them, but to also leave the theater in active pursuit of ways they could be involved in our country’s education system.

As someone who was cutting mostly commercials and short films in Final Cut Pro at the time, doing a feature-length documentary on Avid Media Composer was daunting, but so very, very exciting and gratifying.

WHAT DO YOU EDIT ON THESE DAYS?
Avid Media Composer.

ARE YOU EVER ASKED TO DO MORE THAN EDIT?
Every once in a while I get a job where I’m asked to create an edit that is not in line with the footage that was shot. In those instances, I’ll have to comp takes together in order to get a desired set of performances or a desired shot. I try not to make the comps too clean because I don’t want to put our Flame artist out of a job.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
iPhone, computer, Roomba

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I just had a baby, so coming home to my son and my baby daddy is a great way to end the day. I also play on an all-women’s flag football team in a co-ed league on the weekends. The first game we ever won I QB’d while I was eight weeks pregnant; it was my Serena Williams moment!

Review: Digital Anarchy’s Transcriptive plugin for Adobe Premiere

By Brady Betzel

One of the most time-consuming parts of editing can be dealing with the pre-post work, including organizing scripts and transcriptions of interviews. In the past, I have used and loved Avid’s ScriptSync and PhraseFind. These days, with people becoming more comfortable with other NLEs, such as Adobe Premiere Pro, Apple FCP X and Blackmagic Resolve, there is a need for similar technology inside those apps, and that is where Digital Anarchy’s Transcriptive plugin comes in.

Transcriptive is a Windows- and Mac OS-compatible plugin for Premiere Pro CC 2015.3 and above. It allows the editor to have a sequence or multiple clips transcribed in the cloud, for a price, by either IBM Watson or Speechmatics, with the resulting script downloaded to your system and kept in sync with the clips and sequences. From there you can search for specific words, sort by speaker (including labeling each one) or simply follow an interview along with the transcript.

Avid’s ScriptSync is, in my opinion, an invaluable plugin when working on shows with interviews, especially when combining multiple responses into one cohesive answer covered by b-roll — often referred to as a Frankenbite. Transcriptive comes close to ScriptSync within Premiere Pro, but has a few differences, and is priced at $299 plus the per-minute cost of transcription.

A Deeper Look
Within Premiere, Transcriptive lives under Window > Extensions > Transcriptive. To get access to the online AI transcription services you will obviously need an Internet connection, as well as an account with Speechmatics and/or IBM’s Watson. You’ll really want to follow along with the manual, available from Digital Anarchy’s site; it walks you step by step through setting up the Transcriptive plugin.

It is a little convoluted to get it all set up, but once you do, you are ready to upload a clip and get transcribing. IBM’s Watson gets you going with 1,000 free minutes of transcription a month; after that, pricing runs from $.02/minute down to $.01/minute, depending on how much you need transcribed, with additional languages up-charged at $.03/minute. Speechmatics runs roughly $.08 a minute (I say roughly because the price is in pounds and has fluctuated in the past), and the rate goes down if you do more than 1,000 minutes a month.
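
For a rough sense of how those rates compare at volume, here is a back-of-the-envelope sketch using the approximate figures quoted above (they may have changed, and Speechmatics bills in pounds, so treat these as ballpark numbers):

    # Back-of-the-envelope cost comparison using the approximate rates
    # quoted in this review. Rates are rough and subject to change.
    WATSON_FREE_MINUTES = 1000   # Watson's free tier per month
    WATSON_RATE = 0.02           # $/min after the free tier (drops toward $.01 at volume)
    SPEECHMATICS_RATE = 0.08     # roughly $/min

    def watson_cost(minutes: int) -> float:
        return max(0, minutes - WATSON_FREE_MINUTES) * WATSON_RATE

    def speechmatics_cost(minutes: int) -> float:
        return minutes * SPEECHMATICS_RATE

    for m in (500, 2000, 5000):
        print(f"{m} min/month: Watson ~${watson_cost(m):.2f}, "
              f"Speechmatics ~${speechmatics_cost(m):.2f}")

At 500 minutes a month, Watson is literally free; the question is whether its accuracy holds up, which is where the price difference earns its keep.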

Your first question should be why the disparity in price, and in this instance you get what you pay for. If you aren’t strict on accuracy, then Watson is for you — it doesn’t quite get everything correct and can sometimes fail to notice when a new person is talking, even on a very clear recording. Speechmatics was faster during my testing and more accurate. If free is a good price for you, then Watson might do the job, and you should try it first. But in my opinion, Speechmatics is where you need to be.

When editing interviews, accuracy is extremely important, especially when searching specific key words, and this is where Speechmatics came through. Neither service is completely accurate, and if something is wrong you can’t kick it back like you could with a traditional, human-based transcription service.

The Test
To test Transcriptive, I downloaded a CNN interview between Anderson Cooper and Hillary Clinton, which in theory should have perfect audio. Even with “perfect audio,” Watson had some trouble when one person would talk over the other. Speechmatics seemed to get each person labeled correctly when they spoke, and I would guess it missed only about 5% of the words, so it was about 95% accurate — Watson seemed to be about 70% accurate.
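
Those accuracy figures are my eyeball estimates. If you want to quantify a service’s output yourself, word error rate (WER) is the standard measure; here is a generic sketch (standard practice, not part of Transcriptive):

    # Generic word-error-rate (WER) sketch -- not part of Transcriptive.
    # WER = (substitutions + deletions + insertions) / reference word count,
    # computed here as Levenshtein distance over words.
    def wer(reference: str, hypothesis: str) -> float:
        ref, hyp = reference.split(), hypothesis.split()
        d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            d[i][0] = i
        for j in range(len(hyp) + 1):
            d[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,          # deletion
                              d[i][j - 1] + 1,          # insertion
                              d[i - 1][j - 1] + cost)   # substitution
        return d[len(ref)][len(hyp)] / max(1, len(ref))

    print(wer("the town hall meeting", "the town call meeting"))  # 0.25

Run a hand-corrected minute of transcript against each service’s output and you get a number you can compare, rather than a gut feeling.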

To get your file to these services, you send your media from a sequence, multiple clips or a folder of clips. I tend to favor a specific folder of clips to transcribe, as it forces some organization, and my OCD assistant-editor brain feels a little more at home.

As a plugin, Transcriptive is an extension inside of Premiere Pro, as alluded to earlier. Inside Premiere, you have to have the Transcriptive window active when making edits or simply playing down a clip; otherwise you will be affecting the timeline (meaning if you hit undo, you will be undoing your timeline work, so be careful). Transcriptions also load differently for clips than for sequences. If you transcribe individual clips using the Batch Files command, the transcription is loaded into the infamous Speech Analysis field of the file’s metadata. In this instance, you can search in that metadata field instead of the Transcriptive window.

One feature I really like is the ability to export a transcript as markers to be placed on the timeline. In addition, you can export many different closed-captioning file types, such as SMPTE-TT (an XML file), which can be used inside of Premiere with its built-in caption integration. SRT and VTT are captioning file types to be uploaded alongside your video to services like YouTube, while JSON files let you move transcripts to other machines running the Transcriptive plugin. Besides searching inside of Transcriptive for any lines or speakers you want, you can also edit the transcript. This can be extremely useful if two speakers are combined or if there are some missed words that need to be corrected.
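
For reference, SRT and VTT are simple plain-text formats. A single SRT cue looks like this (the timing and text here are made up):

    1
    00:00:01,000 --> 00:00:04,200
    SPEAKER 1: Thanks for joining us.

The same cue in VTT differs mainly in the file header and the decimal separator:

    WEBVTT

    00:00:01.000 --> 00:00:04.200
    SPEAKER 1: Thanks for joining us.

Because every cue carries its own timing, these files remain useful outside Premiere as well, from YouTube captions to downstream tooling.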

To really explain how Transcriptive works, it is easiest to compare it to Avid’s ScriptSync. If you have used ScriptSync and then give Transcriptive a try, you will likely notice some features that Transcriptive desperately needs in order to be the powerhouse that ScriptSync is — but Transcriptive has the added ability to upload your files and process them in the cloud.

ScriptSync allows the editor or assistant editor to take a bunch of transcriptions and line them up — then, for example, have every clip from a particular person in one transcription file that can be searched or edited from. In addition, there is a physical representation of the transcriptions that can be organized in bins and accessed separately from the clips. These functions would be a huge upgrade to Transcriptive in the future, especially for editors who work on unscripted or documentary projects with multiple interviews from the same people. If you use an external transcription file and want to align it with clips you have in the system, you must use (and pay) Speechmatics, which for a lower per-minute price will align the two files.

Updates Are Coming
After I had finished my initial review, Jim Tierney, president of Digital Anarchy, was kind enough to email me about some updates coming to Transcriptive, as well as a really handy transcription workflow that I had missed the first time around.

He mentioned that they are working on a Power Search function that will allow for a search of all clips and transcripts inside the project. A window will then show all the search results, which can be clicked on to open the corresponding clip in the source window or sequence in the record window. Once that update rolls in, Transcriptive will be much more powerful and easier to use.

The only thing that will be hard to differentiate is when you have multiple interviews from multiple people: for instance, if I wanted to limit a search to only my interviews and to a specific phrase. In the future, a way to Power Search a select folder of clips or sequences would be a great addition, as it would be easier than searching all clips and sequences.

The other tidbit Jim mentioned was using YouTube’s built-in transcription on your own videos. Before you watch the tutorial, keep in mind that this process isn’t flawless. While you can upload your video to YouTube in private mode, the uploading step may still turn away a few people who have security concerns. In addition, you will need to export a low-res proxy version of your clip to upload, which can take time.

If you have the time, or have an assistant editor with time, this process through YouTube might be your saving grace. My two cents: with some upfront bookkeeping like tape naming, plus corrections after transcribing, this could be one of the best solutions if you aren’t worried about security.

Regardless, check out the tutorial if you want a way to get supposedly very accurate transcriptions via YouTube’s transcriber. In the end, it will produce a VTT transcription file that you import back into Transcriptive, where you will need to either leave it alone or spend time adjusting it, since VTT files do not allow for punctuation. The main benefit of the VTT file from YouTube is that its timecode is carried back into Transcriptive, enabling each word to be clicked on so the video lines up to it.
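
That timecode carry-back works because every VTT cue pairs a start/end time with its text. Here is a hypothetical sketch of pulling those pairs out in Python — generic parsing for illustration, not how Transcriptive actually ingests the file:

    # Hypothetical sketch: pull (start time, text) pairs out of a WebVTT
    # file. Generic parsing for illustration, not Transcriptive's import code.
    import re

    CUE = re.compile(
        r"(\d{2}:\d{2}:\d{2}\.\d{3}) --> (\d{2}:\d{2}:\d{2}\.\d{3})\s*\n(.+)"
    )

    def read_cues(path: str):
        with open(path, encoding="utf-8") as f:
            return [(start, text.strip()) for start, _end, text in CUE.findall(f.read())]

    for start, text in read_cues("interview.vtt"):
        print(start, text)

Once each snippet of text is keyed to a start time like this, jumping the playhead to a clicked word is just a lookup.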

Summing Up
All in all, there are only a few options for working with transcriptions inside of Premiere. Transcriptive does a good job at what it does: uploading my file to one of the transcription services, acquiring the transcript and aligning the clip to the timecoded transcript, with identifying markers for speakers that can be changed if needed. Once Power Search gets ironed out and put into a proper release, Transcriptive will get even closer to being the transcription powerhouse you need for Premiere editing.

If you work with tons of interviews or just want clips transcribed for easy search you should definitely download Digital Anarchy’s Transcriptive demo and give it a whirl.

You can also find a ton of good video tutorials on their site. Keep in mind that the Transcriptive plugin runs $299. You get some free transcription through IBM’s Watson, but if you want very accurate transcriptions you will need to pay for Speechmatics, or you can try YouTube’s built-in transcription service, which charges nothing.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.