Arvato previews EditMate SaaS for Adobe Premiere Pro

At NAB 2018, Arvato Systems previewed an on-demand, SaaS edition of VPMS EditMate, the company’s production asset management system for Adobe Premiere Pro CC.

According to Arvato, the new SaaS version of EditMate addresses the complexity of today’s editing projects by bringing all projects and media into a single, searchable library and eliminating complex folder structures. EditMate also manages project templates, automatically importing regularly used media assets (IDs, graphics, bugs, etc.) into projects and setting up sequence parameters.

The high bit rates of large file sizes can make remote editing tedious and in some cases impractical. EditMate addresses this with an adaptive streaming engine that uses available bandwidth to deliver a high-quality, full-HD “proxy.” This gives users the full Premiere Pro CC experience and toolset without requiring them to wait for or copy large media files. Once users complete an edit, they can check in sequences or subclips “to air” or for review. All of the rendering and high-res creation happens back in the facility or wherever the original material is stored.

A trial version of the EditMate on-demand offering will be available through a web portal, alongside entry-level packages for small-to-medium-sized workgroups and advanced packages that include the remote editing features. Additional seats can be added as needed on a monthly basis, while enterprise-wide and “private cloud” deployments will also be available on request.

Atomos at NAB offering ProRes RAW recorders

Atomos is at this year’s NAB showing support for ProRes RAW, a new format from Apple that combines the performance of ProRes with the flexibility of RAW video. The ProRes RAW update will be available free for the Atomos Shogun Inferno and Sumo 19 devices.

Atomos devices are currently the only monitor recorders to offer ProRes RAW, with realtime recording from the sensor output of Panasonic, Sony and Canon cameras.

The new upgrade brings ProRes RAW and ProRes RAW HQ recording, monitoring, playback and tag editing to all owners of an Atomos Shogun Inferno or Sumo 19 device. Once installed, it will allow the capture of RAW images in up to 12-bit RGB — direct from many of our industry’s most advanced cameras onto affordable SSD media. ProRes RAW files can be imported directly into Final Cut Pro 10.4.1 for high-performance editing, color grading and finishing on Mac laptop and desktop systems.
Eight popular cine cameras with a RAW output — including the Panasonic AU-EVA1, Varicam LT, Sony FS5/FS7 and Canon C300mkII/C500 — will be supported with more to follow.

With this ProRes RAW support, filmmakers can work easily with RAW — whether they are shooting episodic TV, commercials, documentaries, indie films or social events.

Shooting ProRes RAW preserves maximum dynamic range, with a 12-bit depth and wide color gamut — essential for HDR finishing. The new format, which is available in two compression levels — ProRes RAW and ProRes RAW HQ — preserves image quality with low data rates and file sizes much smaller than uncompressed RAW.

Through ProRes RAW, Atomos recorders allow for increased flexibility in captured frame rates and resolutions. They can record ProRes RAW at up to 2K at 240 frames per second, or 4K at up to 120 frames per second. Higher resolutions, such as 5.7K from the Panasonic AU-EVA1, are also supported.

Atomos’ OS, AtomOS 9, gives users filming tools that allow them to work efficiently and creatively with ProRes RAW on portable devices. Fast connections in and out and advanced HDR screen processing mean every pixel is accurately and instantly available for on-set creative playback and review. Pull the SSD out and dock it to your Mac over Thunderbolt 3 or USB-C 3.1 for immediate, super-fast post production.

Download the AtomOS 9 update for Shogun Inferno and Sumo 19 at www.atomos.com/firmware.

AlterMedia rolling out rebuild of its Studio Suite 12 at NAB

At this year’s NAB, AlterMedia is showing Studio Suite 12, a ground-up rebuild of its studio, production and post management application. The rebuilt codebase and streamlined interface have made the application lighter, faster and more intuitive; it functions as a web application yet can still be easily customized to adapt to varying workflows.

“We literally started over with a blank slate with this version,” says AlterMedia founder Joel Stoner. “The goal was really to reconsider everything. We took the opportunity to shed tons of old code and tired interface paradigms. That said, we maintained the basic structure and flow so existing users would feel comfortable jumping right in. Although there are countless new features, the biggest is that every user can now access Studio Suite 12 through a browser from anywhere.”

Studio Suite 12 now provides better integration with the Internet ecosystem by connecting with Slack and Twilio (for messaging), as well as Google Calendar, Exchange Calendar, Apple Calendar, IMDb, Google Maps, eBay, QuickBooks and Xero accounting software and more.

Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. Here is the trailer. If you like what you see, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond.

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex it would be to create an idea I was writing into the script. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically, as there were no “fix it in post” scenarios on this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep, we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and other casting to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel — with each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact that my DP, Adam Batchelor — who had shot Sync with me, as well as the proof-of-concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out. But more importantly, he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film, and has been supportive of my career since my short films, so they came on as one of the executive producers. No one had ever shot a full feature film using just Blackmagic cameras, and we also used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccup was that we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come on board to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think makes it the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. This made the workflow as simple as exporting my Resolve file and handing the material over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled the Human 2.0 scenes in the transcode bay, with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in the wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor, who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and we made sure to lock that down before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff of Battlestar Galactica and The Flash), was greenlit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and navigating that process with rights and IP and more contracts, etc. The advice I got from other filmmakers was that getting the right distributor is a big part of how your film will be released. To me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, and therefore I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal due to the documentary narrative. We did first try to do it all in-camera using a built suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly with practical effects would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity to enable object tracking to work as accurately as possible. We also shot reference footage from multiple angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 was a human consciousness in a synthetic shell, breathing didn’t make sense, and we began making up for it by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t. In fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give it a surreal feeling. We used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

Editor Dylan Tichenor to headline SuperMeet at NAB 2018

For those of you heading out to Las Vegas for NAB 2018, the 17th annual SuperMeet will take place on Tuesday, April 10, at the Rio Hotel. Speaking this year will be Oscar-nominated film editor Dylan Tichenor (There Will Be Blood, Zero Dark Thirty). Additionally, there will be presentations from Blackmagic, Adobe, Frame.io, HP/Nvidia, Atomos and filmmaker Bradley Olsen, who will walk the audience through his workflow on Off the Tracks, a documentary about Final Cut Pro X.

Blackmagic Resolve designers Paul Saccone, Mary Plummer, Peter Chamberlain and Rohit Gupta will answer questions on all things DaVinci Resolve, Fusion and Fairlight audio.

Adobe Premiere Pro product manager Patrick Palmer will reveal new features in Adobe’s video solutions for editing, color, graphics and audio workflows.

Frame.io CEO Emery Wells will preview the next generation of its collaboration and workflow tool, which will be released this summer.

Atomos’ Jeromy Young will talk about some of their new partners. He says, “It involves software and camera makers alike.”

As always, the evening will round out with the SuperMeet’s “World Famous Raffle,” where the total value of prizes has now reached over $101,000. Part of that total includes a Blackmagic Advanced Control Panel worth $29,995.

Doors will open at 4:30pm with the SuperMeet Vendor Showcase, which features 23 software and hardware developers. Those attending can enjoy a few cocktails and mingle with industry peers.

To purchase tickets, and for complete daily updates on the SuperMeet, including agenda updates, directions, transportation options and a current list of raffle prizes, visit the SuperMeet website.

NAB: Adobe’s spring updates for Creative Cloud

By Brady Betzel

Adobe has had a tradition of releasing Creative Cloud updates prior to NAB, and this year is no different. The company has been focused on improving existing workflows and adding new features, some based on Adobe’s Sensei technology, as well as VR enhancements.

In this release, Adobe has announced a handful of Premiere Pro CC updates. While I personally don’t think that they are game changing, many users will appreciate the direction Adobe is going. If you are color correcting, Adobe has added the Shot Match function that allows you to match color between two shots. Powered by Adobe’s Sensei technology, Shot Match analyzes one image and tries to apply the same look to another image. Included in this update is the long-requested split screen to compare before and after color corrections.

Motion graphic templates have been improved with new adjustments like 2D position, rotation and scale. Automatic audio ducking has been included in this release as well. You can find this feature in the Essential Sound panel, and once applied it will essentially dip the music in your scene based on dialogue waveforms that you identify.

Still inside of Adobe Premiere Pro CC, but also applicable in After Effects, is Adobe’s enhanced Immersive Environment. This update is for people who use VR headsets to edit and/or process VFX. Team Project workflows have been updated with better version tracking and indicators of who is using bins and sequences in realtime.

New Timecode Panel
Overall, while these updates are helpful, none are barn burners. The thing that does have me excited is the new Timecode Panel — it’s the biggest new update to the Premiere Pro CC app. For years now, editors have been clamoring for more than just one timecode view. You can view sequence timecodes, source media timecodes from the clips on the different video layers in your timeline, and you can even view the same sequence timecode in a different frame rate (great for editing those 23.98 shows to a 29.97/59.94 clock!). And one of my unexpected favorites is the clip name in the timecode window.

I was testing this feature in a pre-release version of Premiere Pro, and it was a little wonky. First, I couldn’t dock the timecode window. While I could add lines and access the different menus, my changes wouldn’t apply to the row I had selected. In addition, I could only right-click and try to change the first row of contents, but it would choose a random row to change. I am assuming the final release has this all fixed. If the wonkiness gets flushed out, this is a phenomenal (and necessary) addition to Premiere Pro.

Codecs, Master Property, Puppet Tool, more
There have been some codec compatibility updates, specifically Raw Sony X-OCN (Venice), Canon Cinema Raw Light (C200) and Red IPP2.

After Effects CC has also been updated with Master Property controls. Adobe said it best during their announcement: “Add layer properties, such as position, color or text, in the Essential Graphics panel and control them in the parent composition’s timeline. Use Master Property to push individual values to all versions of the composition or pull selected changes back to the master.”

The Puppet Tool has been given some love with a new Advanced Puppet Engine, giving access to improved mesh and starch workflows for animating static objects. Beyond that, the Add Grain, Remove Grain and Match Grain effects are now multi-threaded, and enhanced disk caching and project management improvements have been added.

My favorite update for After Effects CC is the addition of data-driven graphics. You can drop a CSV or JSON data file and pick-whip data to layer properties to control them. In addition, you can drag and drop data right onto your comp to use the actual numerical value. Data-driven graphics is a definite game changer for After Effects.
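As a rough illustration of how this could look (the file name and fields below are made up, and the expression reflects how the feature was documented at release), a small JSON file imported into a project might contain:

{
  "show": "Spring Launch",
  "viewers": 87500,
  "accentColor": [0.9, 0.3, 0.1]
}

Once the file is in the project panel, an expression along the lines of footage("stats.json").sourceData.viewers can drive a slider or a text layer’s source text, so swapping in an updated data file refreshes the graphic without re-animating anything.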

Audition
While Adobe Audition is an audio mixing application, it has some updates that will directly help anyone looking to mix their edit in Audition. In the past, to get audio to a mixing program like Audition, Pro Tools or Fairlight, you would have to export an AAF (or, if you are old like me, possibly an OMF). In the latest Audition update, you can simply open your Premiere Pro projects directly in Audition, relink video and audio and begin mixing.

I asked Adobe whether you could go back and forth between Audition and Premiere, but it seems it is a one-way trip. They must be expecting you to export individual audio stems once done in Audition for final output. In the future, I would love to see back-and-forth capabilities between apps like Premiere Pro and Audition, much like the Fairlight tab in Blackmagic’s Resolve. There are some other updates, like larger tracks and under-the-hood improvements, which you can read more about at https://theblog.adobe.com/creative-cloud/.

Adobe Character Animator has some cool updates, like improvements to overall character building, but I am not too involved with Character Animator, so you should definitely read about things like the Trigger Improvements on Adobe’s blog.

Summing Up
In the end, it is great to see Adobe moving forward on updates to its Creative Cloud video offerings. Data-driven animation inside of After Effects is a game-changer. Shot color matching in Premiere Pro is a nice step toward a professional color correction application. Importing Premiere Pro projects directly into Audition is definitely a workflow improvement.

I do have a wishlist though: I would love for Premiere Pro to concentrate on tried-and-true solutions before adding fancy updates like audio ducking. For example, I often hear people complain about how hard it is to export a QuickTime out of Premiere with either stereo or mono/discrete tracks. You need to set up the sequence correctly from the jump, adjust the pan on the tracks, as well as adjust the audio settings and export settings. Doesn’t sound streamlined to me.

In addition, while shot color matching is great, let’s get an Adobe SpeedGrade-style view tab into Premiere Pro so it works like a professional color correction app… maybe Lumetri Pro? I know if the color correction setup was improved I would be way more apt to stay inside of Premiere Pro to finish something instead of going to an app like Resolve.

Finally, consolidating and transcoding used clips with handles is hit or miss inside of Premiere Pro. Can we get a rock-solid consolidate-and-transcode feature inside of Premiere Pro? Regardless of these few negatives, Premiere Pro is an industry staple and it works very well.

Check out Adobe’s NAB 2018 update video playlist for details on each and every update.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Director Kay Cannon on her raunchy comedy Blockers

By Iain Blair

At a time when women are increasingly breaking down barriers in Hollywood, writer/director Kay Cannon is helping lead the charge. The director of Universal’s new film, Blockers, got her start at such comedic training grounds as The Second City, The iO West Theater and The ComedySportz Theatre.

Kay Cannon

While writing and performing around Chicago, she met Tina Fey, a fellow Second City alumna. When Fey began 30 Rock, Cannon joined the creative team and worked her way up from staff writer to supervising producer on the show. She’s a three-time Primetime Emmy-nominated writer, twice for Outstanding Comedy Series and once for Outstanding Writing for a Comedy Series. She has also won three Writers Guild of America Awards, as well as a Peabody, all for her work on 30 Rock.

Cannon, who also served as a co-executive producer on New Girl, a consulting producer on Cristela and co-produced the hit feature Baby Mama, received rave reviews for her debut screenplay for the film Pitch Perfect, and she wrote and co-produced the hit sequels. She served as the executive producer, creator and showrunner of the Netflix series Girlboss, based on Sophia Amoruso’s best-selling autobiography, which starred Britt Robertson.

Now, with the new release Blockers, Cannon — one of only a handful of women ever to direct an R-rated comedy for a big studio — has stepped behind the camera and made an assured and polished directorial debut with this coming-of-age sex comedy that takes one of the most relatable rites of passage and upends a long-held double standard. When three parents discover their daughters’ pact to lose their virginity at prom, they launch a covert one-night operation to stop the teens from sealing the deal.

The film stars Leslie Mann (The Other Woman, This is 40), John Cena (Trainwreck, Sisters) and Ike Barinholtz (Neighbors, Suicide Squad). It is produced by Evan Goldberg, Seth Rogen and James Weaver, under their Point Grey Pictures banner (Neighbors, This is the End), alongside Jon Hurwitz and Hayden Schlossberg (Harold & Kumar) and Chris Fenton (47 Ronin).

Cannon led an accomplished behind-the-scenes team, including director of photography Russ Alsobrook (Forgetting Sarah Marshall, Superbad), production designer Brandon Tonner-Connolly (The Big Sick) and editor Stacey Schroeder (The Disaster Artist).

I recently talked to Cannon about making the bawdy film, which generated huge buzz at SXSW, and her advice for other aspiring women directors.

This is like a long-overdue female take on such raunchy-but-sweet male comedies as American Pie and Superbad. Was that the appeal of this story for you?
When I read the script, I really connected on two levels. I was a teenager who lost her virginity, and I’m also the mother of a daughter, and while she was only two at the time, it made me think about her and what might happen to her in the future. And that’s scary, and I saw how parents can lose their minds.

How did you first envision the film?
I grew up in a small town in the Chicago area, and I was inspired by John Hughes and all his great teen comedies. I could really relate to them, and I felt he was speaking to me, that he really got that world and the way it looked. I wanted to do that too, and show how people really live, and I wanted it to feel real and grounded — but then I was also going to go to a very crazy place and get very silly. (Laughs) That was very important to me, because I wanted to make people laugh really hard, but also feel emotion.

Did you always want to direct?
It wasn’t always my dream. That’s shifted over the years. I started off wanting to be an actor on a sitcom, then writing one and then wanting to have my own show, which happened with Girlboss, so that was my focus for the past few years. To be honest, I’d kind of do movies when TV didn’t work out for me. A pilot didn’t happen, so I wrote Pitch Perfect, and then did Pitch 2 when another pilot didn’t go.

How did you prepare for directing your first film?
Being the showrunner on Girlboss was great training because I could shadow all the directors and watch them work, and I definitely felt ready to direct a film.

What was the biggest surprise of directing for the first time?
I pretty much knew what to expect — and that there would always be surprises on the day and stuff you could never have anticipated. You just have to work through it and keep going.

How tough was the shoot?
It was hard. We shot in Atlanta for nine weeks, and the last five were nights, and that’s very tough. I had a very long script to squeeze into the shoot. But Russ, my DP, was a huge help. We’d worked together before on New Girl, and he’s so experienced; he really guided me through it all.

Where did you do the post?
All in LA. We started at Sunset Gower, and then we took a break and did some reshoots in January, and then finished at Pivotal Post in Burbank.

Do you like post?
When I was making Girlboss, I’d never experienced post before, so I was really afraid and uncomfortable with the whole process. It was so new and a bit daunting to me, especially as a writer. I loved writing and shooting, but it took me a while to get comfortable with post. But once I did, I loved it, and now it’s my favorite thing. I’d spend the night there if I could! As they say, it’s where you actually make the film and where the real magic happens.

Kay Cannon on set directing Leslie Mann and John Cena.

Your editor was Stacey Schroeder (pilot for The Last Man on Earth, for which she got an editing Emmy nom). How did that relationship work?
We’d worked together before on Girlboss, and we have a great partnership. She’s like my right-hand, and we’re automatically on the same page. We very rarely disagree, and what’s so great is that she’s extremely opinionated and has no poker face. I’m the same way. So it’s very refreshing to sit there and discuss material and any problems without taking anything personally. I really appreciate her honesty.

What were the biggest editing challenges?
Trying to balance the raucous comedy stuff with the serious emotions underneath, and dealing with some of the big set pieces. The whole puking scene was difficult as we shot three times the material you see, and there was a whole drug thing, and it was very long and it just wasn’t working. We previewed it a couple of times and it was seen as a poor man’s Bridesmaids. (Laughs) And then I saw Baby Driver and it hit us — what if we put the whole scene to music? And that was so much fun and it suddenly all worked.

Resistance VFX did the visual effects shots, and there seemed to be quite a few, considering it’s a comedy. What was involved?
You’re right. Usually comedies don’t have that many, and we had a significant amount, including the puke scenes, and then all the computer stuff and the emojis. And then they did such a great job with all of Leslie Mann’s tears at the end. I really loved working with VFX, and the fact that they can create all this magic in post. I’d be constantly amazed. “Can you do that?” They’d sigh and go, “Yes Kay, we can do that, no problem.” It was a real education for me.

Where did you do the DI?
At Technicolor, and I was pretty involved along with Russ. I loved that whole process too. Again, it’s the magic of post. (Maxine Gervais was the supervising senior colorist. She used a FilmLight Baselight 5.)

Did it turn out the way you hoped?
Absolutely.

Do you want to direct again?
Definitely, if I get another chance.

What’s next?
I’m writing a movie for Sony — another comedy — and I’ve got a bunch of projects percolating.

What advice would you give to any woman wanting to direct?
Do the work, and don’t quit when it gets hard. I think a lot of women quit before the magic happens, and there were times when I wanted to quit, but you can’t. You have to keep going.

Kay Cannon Photo Credit: Quantrell D. Colbert (c) 2018 Universal. 


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

NYC’s Wax adds editor Kate Owen

Kate Owen, an almost 20-year veteran who has cut both spots and films, has joined the editorial roster of New York’s Wax. The UK-born but New York-based Owen has edited projects across fashion, beauty, lifestyle and entertainment for brands such as Gucci, Victoria Beckham, Vogue, Adidas, Sony, Showtime and Virgin Atlantic.

Owen started editing in her teens and subsequently worked with top-tier agencies like Mother, Saatchi NY, McGarryBowen, Grey Worldwide and Y&R. She has also worked at editing houses Marshall Street Editors and Whitehouse Post.

In terms of recognition, Owen has been BAFTA-nominated for her short film Turning and has won multiple industry awards, including One Show, D&AD and BTAA honors, as well as a Gold Cannes Lion for her work on “The Man Who Walked Around the World” campaign for Johnnie Walker.

Owen believes editing is a “fluid puzzle. I create in my mind a somewhat Minority Report wall with all the footage in front of me, where I can scroll through several options in my mind to try out and create fluid visual mixes. It’s always the unknown journey at the start of every project, and the fascination that comes with honing and fine-tuning, or turning an edit upside down and viewing it from a totally different perspective, that is so exciting to me.”

Regarding her new role, she says, “There is a unique opportunity to create a beauty, fashion and lifestyle edit arm at Wax. The combination of my edit aesthetic and the company’s legacy of agency beauty background is really exciting to me.”

Owen calls herself “a devoted lifetime Avid editor.” She says, for her, it’s the most elegant way to work. “I can build walls of thumbnails in my Selects Bins and create living mood boards. I love how I can work in very detailed timelines and speed effects without having to break my workflow.”

She also gives a shout out to the Wax design and VFX team. “If we need to incorporate After Effects or Maxon Cinema 4D, I am able to brief and work with my team and incorporate those elements into my offline. I also love to work with the agency or director to work out a LUT before the shoot so that the offline looks premium right from the start.”

NextComputing, Z Cam, Assimilate team on turnkey VR studio

NextComputing, Z Cam and Assimilate have teamed up to create a complete turnkey VR studio, designed to cover all aspects of the immersive production process and to help creatives be more creative.

According to Assimilate CEO Jeff Edson, “Partnering with Z Cam last year was an obvious opportunity to bring together the best of integrated 360 cameras with a seamless workflow for both live and post productions. The key is to continue to move the market from a technology focus to a creative focus. Integrated cameras took the discussions up a level of integration away from the pieces. There have been endless discussions regarding capable platforms for 360; the advantage we have is we work with just about every computer maker as well as the component companies, like CPU and GPU manufacturers. These are companies that are willing to create solutions. Again, this is all about trying to help the market focus on the creative as opposed to debates about the technology, and letting creative people create great experiences and content. Getting the technology out of their way and providing solutions that just work helps with this.”

The companies are offering two configurations of the turnkey studio.

The Foundation VR Studio, which costs $8,999 and is available now, includes:
• NextComputing Edge T100 workstation
o CPU: 6-core Intel Core i7-8700K 3.7GHz processor
o Memory: 16GB DDR4 2666MHz RAM
• Z Cam S1 6K professional VR camera
• Z Cam WonderStitch software for offline stitching and profile creation
• Assimilate Scratch VR Z post software and live streaming for Z Cam

Then there is the Power VR Studio, for $10,999, which is also available now. It includes:
• NextComputing Edge T100 workstation
o CPU: 10-core Intel Core i9-7900X 3.3GHz processor
o Memory: 32GB DDR4 2666MHz RAM
• Z Cam S1 6K professional VR camera
• Z Cam WonderStitch software for offline stitching and profile creation
• Assimilate Scratch VR Z post software and live streaming for Z Cam

These companies will be at NAB demoing the systems.

B&H expands its NAB footprint to target multiple workflows

By Randi Altman

In a short time, many in our industry will be making the pilgrimage to Las Vegas for NAB. They will come (if they are smart) with their comfy shoes, Chapstick and the NAB Show app and plot a course for the most efficient way to see all they need to see.

NAB is a big show that spans a large footprint, and typically companies showing their wares need to pick a hall — Central, South Lower, South Upper or North. This year, however, The Studio-B&H made some pros’ lives a bit easier by adding a booth in South Lower in addition to its usual presence in Central Hall.

B&H’s business and services have grown, so it made perfect sense to Michel Suissa, managing director at The Studio-B&H, to grow its NAB presence to include many of the digital workflows the company has been servicing.

We reached out to Suissa to find out more.

This year B&H and its Studio division are in the South Lower. Why was it important for you guys to have a presence in both the Central and South Halls this year?
The Central Hall has been our home for a long time and it remains our home with our largest footprint, but we felt we needed to have a presence in South Hall as well.

Production and post workflows merge and converge constantly, and we need to be knowledgeable in both. The simple fact is that we serve all segments of our industry, not just image acquisition and camera equipment. Our presence in image- and data-centric workflows has grown by leaps and bounds.

This world is a familiar one for you personally.
That’s true. The post and VFX worlds are very dear to me. I was an editor, Flame artist and colorist for 25 years. This background certainly plays a role in expanding our reach and services to these communities. The Studio-B&H team is part of a company-wide effort to grow our presence in these markets. From a business standpoint, the South Hall attendees are also our customers, and we needed to show we are here to assist and support them.

What kind of workflows should people expect to see at both your NAB locations?
At the South Hall, we will present a whole range of solutions that show the breadth and diversity of what we have to offer. That includes VR post workflows, color grading, animation and VFX, editing and high-performance Flash storage.

In addition to the new booth in South Hall, we have two in Central. One is for B&H’s main product offerings, including our camera shootout, which is a pillar of our NAB presence.

This Studio-B&H booth features a digital cinema and broadcast acquisition technology showcase, including hybrid SDI/IP switching, 4K studio cameras, a gyro-stabilized camera car, the most recent full-frame cinema cameras, and our lightweight cable cam, the DynamiCam.

Our other Central Hall location is where our corporate team can discuss all business opportunities with new and existing B2B customers.

How has The Studio-B&H changed along with the industry over the past year or two?
We have changed quite a bit. With our services and tools, we have reinvented our image, going from equipment provider to solution provider.

Our services now range from system design to installation and deployment. One of the more notable examples is our recent collaboration with HBO Sports on World Championship Boxing. The Studio-B&H team was instrumental in deploying our DynamiCam system to cover several live fights in different venues and integrating with NEP’s mobile production team. This is part of an entirely new type of service — something the company had never offered its customers before. It is a true game-changer for our presence in the media and entertainment industry.

What do you expect the “big thing” to be at NAB this year?
That’s hard to say. Markets are in transition with a number of new technology advancements: machine learning and AI, cloud-based environments, momentum for the IP transition, AR/VR, etc.

On the acquisition side, full frame/large sensor cameras have captured a lot of attention. And, of course, HDR will be everywhere. It’s almost not a novelty anymore. If you’re not taking advantage of HDR, you are living in the past.