
Atomos’ new Shogun 7: HDR monitor, recorder, switcher

The new Atomos Shogun 7 is a seven-inch HDR monitor, recorder and switcher that offers an all-new 1500-nit, daylight-viewable, 1920×1200 panel with a 1,000,000:1 contrast ratio and 15+ stops of dynamic range displayed. It also offers ProRes RAW recording and realtime Dolby Vision output. Shogun 7 will be available in June 2019, priced at $1,499.

The Atomos screen uses a combination of advanced LED and LCD technologies which together offer deeper, better blacks that the company says rival OLED screens, “but with the much higher brightness and vivid color performance of top-end LCDs.”

A new 360-zone backlight is combined with this new screen technology and controlled by the Dynamic AtomHDR engine to show millions of shades of brightness and color. It allows Shogun 7 to display 15+ stops of real dynamic range on-screen. The panel, says Atomos, is also incredibly accurate, with ultra-wide color and 105% of DCI-P3 covered, allowing for the same on-screen dynamic range, palette of colors and shades that your camera sensor sees.

Atomos and Dolby have teamed up to create Dolby Vision HDR “live” — a tool that allows you to see HDR live on-set and carry your creative intent from the camera through into HDR post. Dolby has optimized its target display HDR processing algorithm, which Atomos has running inside the Shogun 7. It brings realtime automatic frame-by-frame analysis of the Log or RAW video and processes it for optimal HDR viewing on a Dolby Vision-capable TV or monitor over HDMI. Connect Shogun 7 to the Dolby Vision TV and AtomOS 10 automatically analyzes the image, queries the TV and applies the right color and brightness profiles for the maximum HDR experience on the display.
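To make the idea of frame-by-frame analysis concrete, here is a purely conceptual Python sketch of the kind of per-frame luminance statistics a dynamic tone-mapping pipeline can derive before mapping to a target display. This is not Dolby's or Atomos' actual algorithm, and the frame data is a random placeholder; it only illustrates the analyze-then-map idea described above.

    import numpy as np

    def frame_stats(luma):
        # Per-frame luminance statistics of the kind dynamic mapping relies on.
        return {"min": float(luma.min()),
                "avg": float(luma.mean()),
                "max": float(luma.max())}

    frame = np.random.rand(1080, 1920)   # placeholder for one frame's normalized luminance
    stats = frame_stats(frame)

    # A display-mapping step would combine stats like these with the connected
    # TV's reported capabilities (peak brightness, black level) to choose a
    # tone curve for that frame.
    print(stats)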

Shogun 7 records images up to 5.7kp30, 4kp120 or 2kp240 slow motion from compatible cameras, in RAW/Log or HLG/PQ over SDI/HDMI. Footage is stored directly to AtomX SSDmini or approved off-the-shelf SATA SSD drives. There are recording options for Apple ProRes RAW and ProRes, Avid DNx and Adobe CinemaDNG RAW codecs. Shogun 7 has four SDI inputs plus an HDMI 2.0 input, with both 12G-SDI and HDMI 2.0 outputs. It can record ProRes RAW in up to 5.7kp30, 4kp120 DCI/UHD and 2kp240 DCI/HD, depending on the camera’s capabilities. Also, 10-bit 4:2:2 ProRes or DNxHR recording is available up to 4Kp60 or 2Kp240. The four SDI inputs enable the connection of most quad-link, dual-link or single-link SDI cinema cameras. Pixels are preserved with data rates of up to 1.8Gb/s.
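As a rough, back-of-the-envelope illustration of what that peak data rate means for media planning, the short sketch below estimates record time for a given bit rate and SSD capacity. The 1TB drive size and decimal units are assumptions for the example, not Atomos specifications, and real ProRes RAW rates vary with resolution, frame rate and scene content.

    def record_minutes(capacity_tb, data_rate_gbps):
        # Convert capacity to gigabits (decimal units: 1 TB = 1,000 GB = 8,000 Gb),
        # then divide by the bit rate for seconds, and by 60 for minutes.
        capacity_gbits = capacity_tb * 1000 * 8
        return capacity_gbits / data_rate_gbps / 60

    # A hypothetical 1TB SSD filled at the quoted 1.8Gb/s peak lasts roughly 74 minutes.
    print(round(record_minutes(1.0, 1.8)))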

In terms of audio, Shogun 7 eliminates the need for a separate audio recorder. Users can add 48V stereo mics via an optional balanced XLR breakout cable, or select mic or line input levels, plus record up to 12 channels of 24/96 digital audio from HDMI or SDI. Monitoring selected stereo tracks is via the 3.5mm headphone jack. There are dedicated audio meters, gain controls and adjustments for frame delay.

Shogun 7 features the latest version of the AtomOS 10 touchscreen interface, first seen on the Ninja V.  The new body of Shogun 7 has a Ninja V-like exterior with ARRI anti-rotation mounting points on the top and bottom of the unit to ensure secure mounting.

AtomOS 10 on Shogun 7 has the full range of monitoring tools, including waveform, vectorscope, false color, zebras, RGB parade, focus peaking, pixel-to-pixel magnification, audio level meters and blue-only for noise analysis.

Shogun 7 can also be used as a portable touchscreen-controlled multi-camera switcher with asynchronous quad-ISO recording. Users can switch up to four 1080p60 SDI streams, record each plus the program output as a separate ISO, then deliver ready-for-edit recordings with marked cut-points in XML metadata straight to your NLE. The current Sumo19 HDR production monitor-recorder will also gain the same functionality in a free firmware update.

There is asynchronous switching, plus genlock in and out to connect to existing AV infrastructure. Once the recording is over, users can import the XML file into an NLE and the timeline populates with all the edits in place. XLR audio from a separate mixer or audio board is recorded within each ISO, alongside two embedded channels of digital audio from the original source. The program stream always records the analog audio feed as well as a second track that switches between the digital audio inputs to match the switched feed.
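To give a sense of how a switcher-generated cut list might be inspected before it reaches the NLE, here is a minimal Python sketch that walks a simplified XML edit list and prints each switch point. The tag and attribute names (program, cut, frame, source) are hypothetical placeholders for illustration, not the actual AtomOS schema; a real export would follow the NLE's interchange format.

    import xml.etree.ElementTree as ET

    # Hypothetical, simplified cut list: one entry per switch to a new ISO source.
    SAMPLE = """
    <program fps="60">
      <cut frame="0"    source="ISO1"/>
      <cut frame="720"  source="ISO3"/>
      <cut frame="1440" source="ISO2"/>
    </program>
    """

    root = ET.fromstring(SAMPLE)
    fps = float(root.get("fps"))
    for cut in root.findall("cut"):
        seconds = int(cut.get("frame")) / fps
        print(f"{seconds:7.2f}s  switch to {cut.get('source')}")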

Review: MZed.com’s Directing Color With Ollie Kenchington

By Brady Betzel

I am constantly looking to educate myself, no matter what the source — or subject. Whether I’m learning how to make a transition in Adobe After Effects from an eSports editor on YouTube or picking up color correction in Blackmagic’s DaVinci Resolve from Warren Eagles on FXPHD.com, I’m always beefing up my skills. I even learn from bad tutorials — they teach you what not to do!

But when you come across a truly remarkable learning experience, it is only fair to share it with the rest of the world. Last year I saw an ad for an MZed.com course called “Directing Color With Ollie Kenchington,” and was immediately interested. These days you can pretty much find any technical tutorial you can dream of on YouTube, but truly professional, higher education-like, theory-based education series are very hard to come by. Even the ones you need to pay for aren’t always worth their price of admission, which is a huge letdown.

Ollie sharing his wisdom.

Once I gained access to MZed.com I wanted to watch every educational series they had. From lighting techniques with ASC member Shane Hurlbut to the ARRI Amira Camera Primer, there are over 150 hours of education available from industry leaders. However, I found my way to Directing Color…

I am often asked if I think people should go to college or a film school. My answer? If you have the money and time, you should go to college followed by film school (or do both together, if the college offers it). Not only will you learn a craft, but you will most likely spend hundreds of hours studying and visualizing the theory behind it. For example, when someone asks me about the science behind camera lenses, I can confidently answer them thanks to my physics class on lenses and optics at California Lutheran University (yes, a shameless plug).

In my opinion, a two-, four- or even 10-year education allows me to live in the grey. I am comfortable arguing for both sides of a debate, as well as the options that are in between — the grey. I feel like my post-high school education really allowed me to recognize and thrive in the nuances of debate. It leaves me playing devil’s advocate maybe a little too much, but it also lets me have civil and productive discussions with others without being demeaning or nasty — something we are actively missing these days. So if living in the grey is for you, I really think a college education supplemented by online or film school education is valuable (assuming you decide the debt is worth it, like I did).

However, I know that is not an option for everyone since it can be very expensive — trust me, I know. I am almost done paying off my undergraduate fees while still paying off my graduate ones, which I am still two or three classes away from finishing. That being said, Directing Color With Ollie Kenchington is the only online education series I have seen so far that is on the same level as some of my higher education classes. Not only is the content beautifully shot and color corrected, but Ollie gives confident and accessible lessons on how color can be used to draw the viewer’s attention to multiple parts of the screen.

Ollie Kenchington is a UK-based filmmaker who runs Korro Films. From the trailer of his Directing Color series, you can immediately see the beauty of Ollie’s work and know that you will be in safe hands. (You can read more about his background here.)

The course raises the bar for online education and will elevate the audience’s expectations of professional insight. The first module, “Creating a Palette,” covers the thinking behind creating a color palette for a small catering company. You may even want to start with the last Bonus Module, “Ox & Origin,” to get a look at what Ollie will be creating across the seven modules and roughly an hour and a half of content.

While Ollie goes over “looks,” the beauty of this course is that he walks through his internal thought processes, including deciding on palettes based on color theory. He doesn’t just choose teal and orange because they look good; he builds his color palette on complementary colors.

Throughout the course Ollie covers some technical knowledge, including calibrating monitors and cameras, white balancing and shooting color charts to avoid having the wrong color balance in post. This is so important because if you don’t do these simple steps, your color correction session will be much harder. And wasting time fixing incorrect color balance takes time away from the fun of color grading. All of this is done through easily digestible modules that range from two to 20 minutes.
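Ollie’s focus is on getting these steps right on set; as a rough post-side illustration of why a shot gray card or chart pays off, the snippet below derives white-balance gains from a sampled neutral patch. The RGB values are made-up example numbers, not from the course.

    # Derive white-balance channel gains from a neutral (gray) chart patch.
    # The sampled RGB values below are made-up; in practice you would average
    # pixels from the patch in your grading tool or a small script.
    patch_rgb = (0.48, 0.41, 0.36)        # hypothetical gray-patch sample, slightly warm

    target = sum(patch_rgb) / 3.0         # neutralize toward the patch's own level
    gains = tuple(target / c for c in patch_rgb)

    # Multiplying each channel by its gain pulls the gray patch back to neutral,
    # which is far quicker than eyeballing the correction in the grade.
    print(tuple(round(g, 3) for g in gains))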

The modules include Creating a Palette; Perceiving Color; Calibrating Color; Color Management; Deconstructing Color 1 – 3 and the Bonus Module Ox & Origin.

Without giving away the entire content of Ollie’s catalog, my favorite modules in this course are the on-set modules. Maybe it’s because I am not on set that often, but I found the “thinking out loud” about colors helpful. Knowing why reds represent blood, and why they raise your heart rate a little bit, is fascinating. He even goes through practical examples of color use in films such as Whiplash.

In the final “Deconstructing Color” modules, Ollie goes into a color bay (complete with practical candle backlighting) and dives into Blackmagic’s DaVinci Resolve. He takes the course full circle: because he had to rush through a scene on set, he shows how he can now go into Resolve and add lighting to different sides of someone’s face, and because he took the time to set up proper lighting on set, he can focus on other parts of his commercial.

Summing Up
I want to watch every tutorial MZed.com has to offer, from “Philip Bloom’s Cinematic Masterclass” to Ollie’s other course, “Mastering Color.” Unfortunately, as of my review, you have to pay an additional fee to watch the “Mastering Color” series. It seems like an unfortunate trend in online education to charge a subscription and then, when an extra-special class comes up, charge more, but this class will supposedly be released to standard subscribers in due time.

MZed.com has two subscription models: MZed Pro, which is $299 for one year of streaming the standard courses, and MZed Pro Premium for $399. This includes the standard courses for one year and the ability to choose one “Premium” course.

“Philip Bloom’s Cinematic Masterclass” was the Premium course I was signed up for initially, but you can decide between that one and the “Mastering Color” course. You will not be disappointed regardless of which one you choose. Even their first course, “How to Photograph Everyone,” is chock-full of lighting and positioning instruction that can be applied in many aspects of videography.

I really was impressed with Directing Color With Ollie Kenchington, and if the other courses are this good, MZed.com will definitely become a permanent bookmark for me.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


VFX house Rodeo FX acquires Rodeo Production

Visual effects studio Rodeo FX, whose high-profile projects include Dumbo, Aquaman and Bumblebee, has purchased Rodeo Production and added its roster of photographers and directors to its offerings.

The two companies, whose common name is just a coincidence, will continue to operate as distinct entities. Rodeo Production’s 10-year-old Montreal office will continue to manage photo and video production, but will now also offer Rodeo FX’s post production services and technical expertise.

In Toronto, Rodeo FX plans to open an Autodesk Flame editing suite in the Rodeo Production studio and expand its Toronto roster of photographers and directors with the goal of developing stronger production and post services for clients in the city’s advertising, television and film industries.

“This is a milestone in our already incredible history of growth and expansion,” says Sébastien Moreau, founder/president of Rodeo FX, which has offices in LA and Munich in addition to Montreal.

“I have always worked hard to give our artists the best possible opportunities, and this partnership was the logical next step,” says Rodeo Production’s founder Alexandra Saulnier. “I see this as a fusion of pure creativity and innovative technology. It’s the kind of synergy that Montreal has become famous for; it’s in our DNA.”

Rodeo Production clients include Ikea, Under Armour and Mitsubishi.


Shooting, posting New Republic’s indie film, Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora at 3.2K ProRes 4444 XQ on an Arri Alexa Mini, was posted at Dallas and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production began in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Editor Ennis, who four-walled out of Tango Austin for the first part of post, went to  New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial. The team worked to have certain locked shots finished for the Sundance submission, while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided the effects shots, which ranged from environmental and period-specific cleanup to beauty work such as de-aging, crowd simulation, CG sign creation and more.

(L-R) Tango’s Artie Peña, Connor Adams, Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps being done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects happened simultaneously. It was a nice workflow, as the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film — with the color grade also done in Flame. The trio had prepped shooting for a Kodachrome-style look, especially for the exteriors, but really overall. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tone), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting/exposing to a custom LUT on set that would reflect this kind of Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters “feel” more accessible. The roots of the inspiration remained in the image, but a little more naturalism, a little more softness, served the story better.

Because of that, they monitored on set with Alexa 709, which Valdes-Lora felt would still leave enough room when exposing. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented this approach.

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team also was able to simultaneously approve color sections in Skywalker’s Stag Theater allowing for an ultra-efficient schedule. With final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimally and simply and transitions to Broadway scale almost undetectably — one of the many exciting parts to working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.

 


BlacKkKlansman director Spike Lee

By Iain Blair

Spike Lee has been on a roll recently. Last time we sat down for a talk, he’d just finished Chi-Raq, an impassioned rap reworking of Aristophanes’ “Lysistrata,” which was set against a backdrop of Chicago gang violence. Since then, he’s directed various TV, documentary and video projects. And now his latest film BlacKkKlansman has been nominated for a host of Oscars, including Best Picture, Best Director, Best Adapted Screenplay, Best Film Editing,  Best Original Score and Best Actor in a Supporting Role (Adam Driver).

Set in the early 1970s, the unlikely-but-true story details the exploits of Ron Stallworth (John David Washington), the first African-American detective to serve in the Colorado Springs Police Department. Determined to make a name for himself, Stallworth sets out on a dangerous mission: infiltrate and expose the Ku Klux Klan. The young detective soon recruits a more seasoned colleague, Flip Zimmerman (Adam Driver), into the undercover investigation. Together, they team up to take down the extremist hate group as the organization aims to sanitize its violent rhetoric to appeal to the mainstream. The film also stars Topher Grace as David Duke.

Behind the scenes, Lee reteamed with co-writer Kevin Willmott, longtime editor Barry Alexander Brown and composer Terence Blanchard, along with up-and-coming DP Chayse Irvin. I spoke with the always-entertaining Lee, who first burst onto the scene back in 1986 with She’s Gotta Have It, about making the film, his workflow and the Oscars.

Is it true Jordan Peele turned you onto this story?
Yeah, he called me out of the blue and gave me possibly the greatest six-word pitch in film history — “Black man infiltrates Ku Klux Klan.” I couldn’t resist it, not with that pitch.

Didn’t you think, “Wait, this is all too unbelievable, too Hollywood?”
Well, my first question was, “Is this actually true? Or is it a Dave Chappelle skit?” Jordan assured me it’s a true story and that Ron wrote a book about it. He sent me a script, and that’s where we began, but Kevin Willmott and I then totally rewrote it so we could include all the stuff like Charlottesville at the end.

Iain Blair and Spike Lee

Did you immediately decide to juxtapose the story’s period racial hatred with all the ripped-from-the-headlines news footage?
Pretty much, as the Charlottesville rally happened August 11, 2017 and we didn’t start shooting this until mid-September, so we could include all that. And then there was the terrible synagogue massacre, and all the pipe bombs. Hate crimes are really skyrocketing under this president.

Fair to say, it’s not just a film about America, though, but about what’s happening everywhere — the rise of neo-Nazism, racism, xenophobia and so on in Europe and other places?
I’m so glad you said that, as I’ve had to correct several people who want to just focus on America, as if this is just happening here. No, no, no! Look at the recent presidential elections in Brazil. This guy — oh my God! This is a global phenomenon, and the common denominator is fear. You fire up your base with fear tactics, and pinpoint your enemy — the bogeyman, the scapegoat — and today that is immigrants.

What were the main challenges in pulling it all together?
Any time you do a film, it’s so hard and challenging. I’ve been doing this for decades now, and it ain’t getting any easier. You have to tell the story the best way you can, given the time and money you have, and it has to be a team effort. I had a great team with me, and any time you do a period piece you have added challenges to get it looking right.

You assembled a great cast. What did John David Washington and Adam Driver bring to the main roles?
They brought the weight, the hammer! They had to do their thing and bring their characters head-to-head, so it’s like a great heavyweight fight, with neither one backing down. It’s like Inside Man with Denzel and Clive Owen.

It’s the first time you’ve worked with the Canadian DP Chayse Irvin, who mainly shot shorts before this. Can you talk about how you collaborated with him?
He’s young and innovative, and he shot a lot of Beyonce’s Lemonade long-form video. What we wanted to do was shoot on film, not digital. I talked about all the ‘70s films I grew up with, like French Connection and Dog Day Afternoon. So that was the look I was after. It had to match the period, but not be too nostalgic. While we wanted to make a period film, I also wanted it to feel and look contemporary, and really connect that era with the world we live in now. He really nailed it. Then my great editor, Barry Alexander Brown, came up with all the split-screen stuff, which is also very ‘70s and really captured that era.

How tough was the shoot?
Every shoot’s tough. It’s part of the job. But I love shooting, and we used a mix of practical locations and sets in Brooklyn and other places that doubled for Colorado Springs.

Where did you post?
Same as always, in Brooklyn, at my 40 Acres and a Mule office.

Do you like the post process?
I love it, because post is when you finally sit down and actually make your film. It’s a lot more relaxing than the shoot — and a lot of it is just me and the editor and the Avid. You’re shaping and molding it and finding your way, cutting and adding stuff, flopping scenes, and it never really follows the shooting script. It becomes its own thing in post.

Talk about editing with Barry Alexander Brown, the Brit who’s cut so many of your films. What were the big editing challenges?
The big one was finding the right balance between the humor and the very serious subject matter. They’re two very different tones, and then the humor comes from the premise, which is absurd in itself. It’s organic to the characters and the situations.

Talk about the importance of sound and music, and Terence Blanchard’s spare score that blends funk with classical.
He’s done a lot of my films, and has never been nominated for an Oscar — and he should have been. He’s a truly great composer, trumpeter and bandleader, and a big part of what I do in post. I try to give him some pointers that aren’t restrictive, and then let him do his thing. I always put as much emphasis on sound and music as I do on the acting, editing and cinematography. It’s hugely important, and once we have the score, we have a film.

I had a great sound team. Phil Stockton, who began with me back on School Daze, was the sound designer. David Boulton, Mike Russo and Howard London did the ADR mix, and my longtime mixer Tommy Fleischman was on it. We did it all at C5 in New York. We spent a long time on the mix, building it all up.

Where did you do the DI and how important is it to you?
At Company 3 with colorist Tom Poole, who’s so good. It’s very important but I’m in and out, as I know Tom and the DP are going to get the look I want.

Spike Lee on set.

Did the film turn out the way you hoped?
Here’s the thing. You try to do the best you can, and I can’t predict what the reaction will be. I made the film I wanted to make, and then I put it out in the world. It’s all about timing. This was made at the right time and was made with a lot of urgency. It’s a crazy world and it’s getting crazier by the minute.

How important are industry awards and nominations to you?
They’re very important in that they bring more attention, more awareness to a film like this. One of the blessings from the strong critical response to this has been a resurgence in looking at my earlier films again, some of which may have been overlooked, like Bamboozled and Summer of Sam.

Do you see progress in Hollywood in terms of diversity and inclusion?
There’s been movement, maybe not as fast as I’d like, but it’s slowly happening, so that’s good.

What’s next?
We just finished the second season of She’s Gotta Have It for Netflix, and I have some movie things cooking. I’m pretty busy.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


VFX editor Warren Mazutinec on life, work and Altered Carbon

By Jeremy Presner

Long-time assistant editor Warren Mazutinec’s love for filming began when he saw Star Wars as an eight-year-old in a small town in Edmonton, Alberta. Unlike many other Lucas-heads, however, this one got to live out his dream grinding away in cutting rooms from Vancouver to LA working with some of the biggest editors in the galaxy.

We met back in 1998 when he assisted me on the editing of the Martin Sheen “classic” Voyage of Terror. We remain friends to this day. One of Warren’s more recent projects was Netflix’s VFX-heavy Altered Carbon, which got a lot of love from critics and audiences alike.

My old friend, who is now based in Vancouver, has an interesting story to tell, moving from assistant editor to VFX editor working on films like Underworld 4, Tomorrowland, Elysium and Chappie, so I threw some questions at him. Enjoy!

Warren Mazutinec

How did you get into the business?
I always wanted to work in the entertainment industry, but that was hard to find in Alberta. No film school-type programs were even offered, so I took the closest thing at a local college: audiovisual communications. While there, I studied photography, audio and video, but nothing like actual filmmaking. After that I attended Vancouver Film School. After film school, and with the help of some good friends, I got an opportunity to be a trainee at Shavick Entertainment.

What was it like working at a “film factory” that cranked out five to six pictures a year?
It was fun, but the product ultimately became intolerable. Movies for nine-year-olds can only be so interesting… especially low-budget ones.

What do your parents think of your career option?
Being from Alberta, everyone thought it wasn’t a real job — just a Hollywood dream. It took some convincing; my dad still tells me to look for work between gigs.

How did you learn Avid? Were you self-taught?
I was handed the manual by a post supervisor on day one. I never read it. I just asked questions and played around on any machine available. So I did have a lot of help, but I also went into work during my free time and on weekends to sit and learn what I needed to do.

Over the years I’ve been lucky enough to have cool people to work with and to learn with and from. I did six movies before I had an email address, more before I even owned a computer.

As media strayed away from film into digital, how did your role change in the cutting room? How did you refine your techniques with a changing workflow?
My first non-film movie was Underworld 4. It was shot with a Red One camera. I pretty much lied and said I knew how to deal with it. There was no difference really; just had to say goodbye to lab rolls, Keykode, etc. It was also a 3D stereo project, so that was a pickle, but not too hard to figure out.

How did you figure out the 3D stereo post?
It was basically learning to do everything twice. During production we really only played back in 3D for the novelty. I think most shows are 3D-ified in post. I’m not sure, though; I’ve only done the one.

Do you think VR/AR will be something you work with in the future?
Yes, I want to be involved in VR at some point. It’s going to be big. Even just doing sound design would be cool. I think it’s the next step, and I want in.

Who are some of your favorite filmmakers?
David Lynch is my number one, by far. I love his work in all forms. A real treasure for sure. David Fincher is great too. Scorsese, Christopher Nolan. There are so many great filmmakers working right now.

Is post in your world constantly changing, or have things more or less leveled off?
Both. But usually someone has dailies figured out, so Avid is pretty much the same. We cut in DNx115 or DNx36, so nothing like 4K-type stuff. Conform at the end is always fun, but there are tests we do at the start to figure it all out. We are rarely treading in new water.

What was it like transitioning to VFX editor? What tools did you need to learn to do that role?
FileMaker. And Jesus, son, I didn’t learn it. It’s a tough beast but it can do a lot. I managed to wrangle it to do what I was asked for, but it’s a hugely powerful piece of software. I picked up a few things on Tomorrowland and went from there.

I like the pace of the VFX editor. It’s different than assisting and is a nice change. I’d like to do more of it. I’d like to learn and use After Effects more. On the film I was VFX editor for, I was able to just use the Avid, as it wasn’t that complex. Mostly set extensions, etc.

How many VFX shot revisions would a typical shot go through on Elysium?
On Elysium, the shot version numbers got quite high, but part of that would be internal versioning by the vendor. Director Neil Blomkamp is a VFX guy himself, so he was pretty involved and knew what he wanted. The robots kept looking cooler and cooler as the show went on. Same for Chappie. That robot was almost perfect, but it took a while to get there.

You’ve worked with a vast array of editors, including Walter Murch, Lee Smith, Julian Clarke, Nancy Richardson and Bill Steinkamp. Can you talk about that, and have any of them let you cut material?
I’ll assemble scenes if asked to, just to help the editor out so he isn’t starting from scratch. If I get bored, I start cutting scenes as well. On Altered Carbon, when Julian (Clarke) was busy with Episodes 2 and 3, I’d try to at least string together a scene or two for Episode 8. Not fine-cutting, mind you, just laying out the framework.

Walter asked a lot of us — the workload was massive. Lee Smith didn’t ask for much. Everyone asks for scene cards that they never use, ha!

Walter hadn’t worked on the Avid for five years or so prior to Tomorrowland, so there was a lot of him walking out of his room asking, “How do I?” It was funny because a lot of the time I knew what he was asking, but I had to actually do it on my machine because it’s so second nature.

What is Walter Murch like in the cutting room? Was learning his organizational process something you carried over into future cutting rooms?
I was a bit intimidated prior to meeting him. He’s awesome though. We got along great and worked well together. There was Walter, a VFX editor and four assistants. We all shared in the process. Of course, Walter’s workflow is unlike any other so it was a huge adjustment, but within a few weeks we were a well-oiled machine.

I’d come in at 6:30am to get dailies sorted and would usually finish around lunch. Then we’d screen in our theater and make notes, all of us. I really enjoyed screening the dailies that way. Then he would go into his room and do his thing. I really wish all films followed his workflow. As tough as it is, it all makes sense and nothing gets lost.

I have seen photos with the colored boxes and triangles on the wall. What does all that mean, and how often was that board updated?
Ha. That’s Walter’s own version of scene cards. It makes way better sense. The colors and shapes mean a particular thing — the longer the card the longer the scene. He did all that himself, said it helps him see the picture. I would peek into his room and watch him do this. He seemed so happy doing it, like a little kid.

Do you always add descriptions and metadata to your shots in Avid Media Composer?
We add everything possible. Usually there is a codebook the studios want, so we generate that with FileMaker on almost all the bigger shows. Walter’s is the same just way bigger and better. It made the VFX database look like a toy.

What is your workflow for managing/organizing footage?
A lot of times you have to follow someone else’s procedure, but if left to my own devices I try to make it the simplest it can be so anyone can figure out what was done.

How do you organize your timeline?
It’s specific to the editor, but I like to use as many audio tracks as possible and as few video tracks as possible, but when it’s a VFX-heavy show, that isn’t possible due to stacking various shot versions.

What did you learn from Lee Smith and Julian Clarke?
Lee Smith is a suuuuuper nice guy. He always had great stories from past films and he’s a very good editor. I’m glad he got the Oscar for Dunkirk, he’s done a lot of great work.

Julian is also great to work with. I’ve worked with him on Elysium, Chappie and Altered Carbon. He likes to cut with a lot of sound, so it’s fun to work with him. I love cutting sound, and on Altered Carbon we had over 60 tracks. It was an alternating stereo setup and we used all the tracks possible.

Altered Carbon

It was such a fun world to create sound for. Everything that could make a sound we put in. We also invented signature sounds for the tech we hoped they’d use in the final. And they did for some things.

Was that a 5.1 temp mix? Have you ever done one?
No. I want to do a 5.1 Avid mix. Looks fun.

What was the schedule like on Altered Carbon? How was that different than some of the features you’ve worked on?
It was six-day weeks and 12 hours a day. Usually one week per month I’d trade off with the 2nd assistant and she’d let me have an actual weekend. It was a bit of a grind. I worked on Episodes 2, 3 and 8, and the schedules for those were tight, but somehow we got through it all. We had a great team up here for Vancouver’s editorial. They were also cutting in LA as well. It was pretty much non-stop editing the whole way through.

How involved was Netflix in terms of the notes process? Were you working with the same editors on the episodes you assisted?
Yes, all episodes were with Julian. First it went through Skydance notes, then Netflix. Skydance usually had more as they were the first to see the cuts. There were many versions for sure.

What was it like working with Neil Blomkamp?
It was awesome. He makes cool films, and it’s great to see footage like that. I love shooting guns, explosions, swords and swearing. I beat him in ping-pong once. I danced around in victory and he demanded we play again. I retired. One of the best environments I’ve ever worked in. Elysium was my favorite gig.

What’s the largest your crew has gotten in post?
Usually one or two editors, up to four assistants, a PA, a post super — so eight or nine, depending.

Do you prefer working with a large team or do you like smaller films?
I like the larger team. It can all be pretty overwhelming, and the more people there are to help out, the easier it is to get through. The more the merrier!

Altered Carbon

How do you handle long-ass days?
Long days aren’t bad when you have something to do. On Altered Carbon I kept a skateboard in my car for those times. I just skated around the studio waiting for a text. Recently I purchased a Onewheel (a skateboard with one wheel) and plan to use it to commute to work as much as possible.

How do you navigate the politics of a cutting room?
Politics can be tricky. I usually try to keep out of things unless I’m asked, but I do like to have a sit down or a discussion of what’s going on privately with the editor or post super. I like to be aware of what’s coming, so the rest of us are ready.

Do you prefer features to TV?
It doesn’t matter anymore because the good filmmakers work in both mediums. It used to be that features were one thing and TV was another, with less complex stories. Now that’s different and at times it’s the opposite. Features usually pay more though, but again that’s changing. I still think features are where it’s at, but that’s just vanity talking.

Sometimes your project posts in Vancouver but moves to LA for finishing. Why? Does it ever come back?
Mostly I think it’s because that’s where the director/producers/studio lives. After it’s shot everyone just goes back home. Home is usually LA or NY. I wish they’d stay here.

How long do you think you’ll continue being an AE? Until you retire? What age do you think that’ll be?
No idea; I just want to keep working on projects that excite me.

Would you ever want to be an editor or do you think you’d like to pivot to VFX, or are you happy where you are?
I only hope to keep learning and doing more. I like the VFX editing, I like assisting and I like being creative. As far as cutting goes, I’d like to get on a cool series as a junior editor or at least start doing a few scenes to get better. I just want to keep advancing, I’d love to do some VR stuff.

What’s next for you project wise?
I’m on a Disney show called Timmy Failure. I can’t say anything more at this point.

What advice do you have for other assistant editors trying to come up?
It’s going to take a lot longer than you think to become good at the job. Being the only assistant does not make you a qualified first assistant. It took me 10 years to get there. Also you never stop learning, so always be open to another approach. Everyone does things differently. With Murch on Tomorrowland, it was a whole new way of doing things that I had never seen before, so it was interesting to learn, although it was very intimidating at the start.


Jeremy Presner is an Emmy-nominated film and television editor residing in New York City. Twenty years ago, Warren was AE on his first film. Since then he has cut such diverse projects as Carrie, Stargate Atlantis, Love & Hip Hop and Breaking Amish.


Review: iOgrapher Multi Case for mobile filmmaking

By Brady Betzel

Thanks to the amazing iPhone X, Google Pixel and Samsung Galaxy, almost everyone has a high-end video camera on their person at all times and this is helping to spur on mobile filmmaking and vlogging.

From YouTube to Instagram to movies like Unsane (Steven Soderbergh) or Tangerine (Sean Baker) — and regardless of whether you think a $35,000 camera setup tells a story better than a $1,000 cell phone (looking at you, Apple iPhone XS Max) — mobile filmmaking is here to stay and will only get better.

iOgrapher’s latest release is the iOgrapher Multi Case, a compact mobile filmmaking mounting solution that works with today’s most popular phones. iOgrapher has typically created solutions that were tied to the mobile device being used for filmmaking, such as an iPhone, the latest Samsung Galaxy phones, iPads or even action cameras like a GoPro Hero 7 Black.

With the new iOgrapher Multi Case you can fit any mobile device that measures more than 5 ½” x 2 ¼” and less than 6 ½” x 3 ⅜”. Unfortunately, you won’t be fitting an iPad or a GoPro in the iOgrapher Multi Case, but don’t fret! iOgrapher makes rigs for those as well. On the top of the Multi Case are two cold shoe mounts for lights, microphones or any other device, like a GoPro. To mount gear with ¼”-20 screw threads in the cold shoes you will need a cold shoe to ¼”-20 adapter, which is available on iOgrapher’s accessory page. You can also find these at Monoprice or Amazon for really cheap.

And if you are looking to mount more gear, you may want to order some extra cold shoe adapters for the additional ¼”-20 screw mounts on the handles of the iOgrapher Multi Case. The mounts on the handles are great for adding additional lighting or microphones. I’ve even found that if you are going to be doing some behind-the-scenes filming or need another angle for your shooting, a small camera like a GoPro can be easily mounted and angled. With all this mounting you should assume that you are going to be using the iOgrapher on a sturdy tripod. Just for fun, I mounted the iOgrapher Multi Case onto a GoPro 3-Way Grip, which can also be used as a light tripod. It wasn’t exactly stable, but it worked. I wouldn’t suggest using it for more than an emergency shooting situation, though.

On the flip side (pun intended), the iOgrapher can be solidly mounted vertically with the ¼”-20 screw mounts on the handles. With Instagram making headway with vertical video in its Instagram Stories, iOgrapher took that idea and built it into the Multi Case, despite the grumbling from the old folks who just don’t get vertical video.

Testing
I tried out both a Samsung Galaxy S8+ and an iPhone 7+, with their cases on, inside the iOgrapher Multi Case. Both fit. The iPhone 7+ was stretching the boundaries of the Multi Case, but it did fit and worked well. Phones are inserted into the Multi Case via a spring-loaded bottom piece. You push the bottom of the mobile device into the covered corner slots of the iOgrapher Multi Case, then secure the remaining side (the top, or the left side if you are shooting vertically) under the edge of the Multi Case. It’s really easy.

I was initially concerned with the spring loading of the case; I wasn’t sure if the springs would be resilient enough to handle the constant pulling in and out of the phones, but the springs are high quality and held up beautifully. I even tried inserting my mobile phones tons of times and didn’t notice any issues with the springs or my phones.

Take care when inserting your phone into the Multi Case if you have a protective shield on the screen of your device. If you aren’t extra careful it can pull or snag on the cover — especially with the tight fit of a case. Just pay attention and there will be nothing to worry about. The simple beauty of the iOgrapher is that with a wider grip of your filmmaking device, you have a larger area to distribute any shaking coming from your hands, essentially helping stabilize your filmmaking without the need for a full-fledged gimbal.

If you accidentally drop your iOgrapher you may get a scratch, but for the most part it is built sturdy and can withstand punishment, whether that’s from your four-year-old or from the weather. If you want to get a little fancy, you can buy affordable lights like the Litra Torch (check out my review) to attach to the cold shoe mounts, or even a Rode microphone (don’t forget the TRS to TRRS adapter if you are plugging into an iPhone), and you are off and running.

Summing Up
I have been really intrigued with iOgrapher’s products since day one. They are an affordable and sturdy way to jump into filmmaking using cameras everyone carries with them every day: their phones.

Whether you are a high school student looking to get steady and professional mobile video, or a journalist looking for a quick way to make the most of your shots with just a phone, light, mic and tripod mount, the iOgrapher Multi Case will unlock your mobile filmmaking potential.

The iOgrapher Multi Case is a very durable protective case for your mobile filmmaking devices and a steal at $79.99. If you are a parent looking for an inexpensive way to tease out your child’s interest in video, take a look at www.iographer.com and grab a few accessories, like a Manfrotto light and a Rode VideoMicro, to add some subtle lighting and pick up better-quality audio.

Make sure to check out iOgrapher creator Dave Basulto’s demo of the iOgrapher Multi Case, including trying out the fit of different phones.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Catching up with Aquaman director James Wan

By Iain Blair

Director James Wan has become one of the biggest names in Hollywood thanks to the $1.5 billion-grossing Furious 7, as well as the Saw, Conjuring and Insidious films — three of the most successful horror franchises of the last decade.

Now the Malaysian-born, Australian-raised Wan, who also writes and produces, has taken on the challenge of bringing Aquaman and Atlantis to life. The origin story of half-surface dweller, half-Atlantean Arthur Curry stars Jason Momoa in the title role. Amber Heard plays Mera, a fierce warrior and Aquaman’s ally throughout his journey.

James Wan and Iain Blair

Additional cast includes Willem Dafoe as Vulko, counsel to the Atlantean throne; Patrick Wilson as Orm, the present King of Atlantis; Dolph Lundgren as Nereus, King of the Atlantean tribe Xebel; Yahya Abdul-Mateen II as the revenge-seeking Manta; and Nicole Kidman as Arthur’s mom, Atlanna.

Wan’s team behind the scenes included such collaborators as Oscar-nominated director of photography Don Burgess (Forrest Gump), his five-time editor Kirk Morri (The Conjuring), production designer Bill Brzeski (Iron Man 3), visual effects supervisor Kelvin McIlwain (Furious 7) and composer Rupert Gregson-Williams (Wonder Woman).

I spoke with the director about making the film, dealing with all the effects, and his workflow.

Aquaman is definitely not your usual superhero. What was the appeal of doing it? 
I didn’t grow up with Aquaman, but I grew up with other comic books, and I always was well aware of him as he’s iconic. A big part of the appeal for me was he’d never really been done before — not on the big screen and not really on TV. He’s never had the spotlight before. The other big clincher was this gave me the opportunity to do a world-creation film, to build a unique world we’ve never seen before. I loved the idea of creating this big fantasy world underwater.

What sort of film did you set out to make?
Something that was really faithful and respectful to the source material, as I loved the world of the comic book once I dove in. I realized how amazing this world is and how interesting Aquaman is. He’s bi-racial, half-Atlantean, half-human, and he feels he doesn’t really fit in anywhere at the start of the film. But by the end, he realizes he’s the best of both worlds and he embraces that. I loved that. I also loved the fact it takes place in the ocean so I could bring in issues like the environment and how we treat the sea, so I felt it had a lot of very cool things going for it — quite apart from all the great visuals I could picture.

Obviously, you never got the Jim Cameron post-Titanic memo — never, ever shoot in water.
(Laughs) I know, but to do this we unfortunately had to get really wet, as over two-thirds of the film is set underwater. The crazy irony of all this is when people are underwater they don’t look wet. It’s only when you come out of the sea or pool that you’re glossy and dripping.

We did a lot of R&D early on, and decided that shooting underwater looking wet wasn’t the right look anyway, plus they’re superhuman and are able to move in water really fast, like fish, so we adopted the dry-for-wet technique. We used a lot of special rigs for the actors, along with bluescreen, and then combined all that with a ton of VFX for the hair and costumes. Hair is always a big problem underwater, as like clothing it behaves very differently, so we had to do a huge amount of work in post in those areas.

How early on did you start integrating post and all the VFX?
It’s that kind of movie where you have to start post and all the VFX almost before you start production. We did so much prep, just designing all the worlds and figuring out how they’d look, and how the actors would interact with them. We hired an army of very talented concept artists, and I worked very closely with my production designer Bill Brzeski, my DP Don Burgess and my visual effects supervisor Kelvin McIlwain. We went to work on creating the whole look and trying to figure out what we could shoot practically with the actors and stunt guys and what had to be done with VFX. And the VFX were crucial in dealing with the actors, too. If a body didn’t quite look right, they’d just replace them completely, and the only thing we’d keep was the face.

It almost sounds like making an animated film.
You’re right, as over 90% of it was VFX. I joke about it being an animated movie, but it’s not really a joke. It’s no different from, say, a Pixar movie.

Did you do a lot of previs?
A lot, with people like Third Floor, Day For Nite, Halon, Proof and others. We did a lot of storyboards too, as they are quicker if you want to change a camera angle, or whatever, on the fly. Then I’d hand them off to the previs guys and they’d build on those.

What were the main technical challenges in pulling it all together on the shoot?
We shot most of it Down Under, near Brisbane. We used all nine of Village Roadshow Studios’ soundstages, including the new Stage 9, as we had over 50 sets, including the Atlantis Throne Room and Coliseum. The hardest thing in terms of shooting it was just putting all the actors in the rigs for the dry-for-wet sequences; they’re very cumbersome and awkward, and the actors are also in these really outrageous costumes, and it can be quite painful at times for them. So you can’t have them up there too long. That was hard. Then we used a lot of newish technology, like virtual production, for scenes where the actors are, say, riding creatures underwater.

We’d have it hooked up to the cameras so you could frame a shot and actually see the whole environment and the creature the actor is supposed to be on — even though it’s just the actors and bluescreen and the creature is not there. And I could show the actors — look, you’re actually riding a giant shark — and also tell the camera operator to pan left or right. So it was invaluable in letting me adjust performance and camera setups as we shot, and all the actors got an idea of what they were doing and how the VFX would be added later in post. Designing the film was so much fun, but executing it was a pain.

The film was edited by Kirk Morri, who cut Furious 7, and worked with you on the Insidious and The Conjuring films. How did that work?
He wasn’t on set but he’d visit now and again, especially when we were shooting something crazy and it would be cool to actually see it. Then we’d send dailies and he’d start assembling, as we had so much bluescreen and VFX stuff to deal with. I’d hop in for an hour or so at the end of each day’s shoot to go over things as I’m very hands on — so much so that I can drive editors crazy, but Kirk puts up with all that.

I like to get a pretty solid cut from the start. I don’t do rough assemblies. I like to jump straight into the real cut, and that was so important on this because every shot is a VFX shot. So the sooner you can lock the shot, the better, and then the VFX teams can start their work. If you keep changing the cut, then you’ll never get your VFX shots done in time. So we’d put the scene together, then pass it to previs, so you don’t just have actors floating in a bluescreen, but they’re in Atlantis or wherever.

Where did you do the post?
We did most of it back in LA on the Warner lot.

Do you like the post process?
I absolutely love it, and it’s very important to my filmmaking style. For a start, I can never give up editing and tweaking all the VFX shots. They have to pull it away from me, and I’d say that my love of all the elements of the post process — editing, sound design, VFX, music — comes from my career in suspense movies. Getting all the pieces of post right is so crucial to the end result and success of any film. This post was creatively so much fun, but it was long and hard and exhausting.

James Wan

All the VFX must have been a huge challenge.
(Laughs) Yes, as there’s over 2,500 VFX shots and we had everyone working on it — ILM, Scanline, Base, Method, MPC, Weta, Rodeo, Digital Domain, Luma — anyone who had a computer! Every shot had some VFX, even the bar scene where Arthur’s with his dad. That was a set, but the environment outside the window was all VFX.

What was the hardest VFX sequence to do?
The answer is, the whole movie. The trench sequence was hard, but Scanline did a great job. Anything underwater was tough, and then the big final battle was super-difficult, and ILM did all that.

Did the film turn out the way you hoped?
For the most part, but like most directors, I’m never fully satisfied.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


DevinSuperTramp: The making of a YouTube filmmaker

Devin Graham, aka DevinSuperTramp, made the unlikely journey from BYU dropout to a viral YouTube sensation who has over five million followers. After leaving school, Graham went to Hawaii to work on a documentary. The project soon ran out of money and he was stuck on the island… feeling very much a dropout and a failure. He started making fun videos with his friends to pass the time, and DevinSuperTramp was born. Now he travels, filming his view of the world, taking on daring adventures to get his next shot, and risking life and limb.

Shooting while snowboarding behind a trackhoe with a bunch of friends for a new video.

We recently had the chance to sit down with Graham to hear firsthand what lessons he’s learned along his journey, and how he’s developed into the filmmaker he is today.

Why extreme adventure content?
I grew up in the outdoors — always hiking and camping with my dad, and snowboarding. I’ve always been intrigued by pushing human limits. One thing I love about the extreme thing is that everyone we work with is the best at what they do. Like, we had the world’s best scooter riders. I love working with people who devote their entire lives to this one skillset. You get to see that passion come through. To me, it’s super inspiring to show off their talents to the world.

How did you get DevinSuperTramp off the ground? Pun intended.
I’ve made movies ever since I can remember. I was a little kid shooting Legos and stop-motion with my siblings. In high school, I took photography classes, and after I saw the movie Jurassic Park, I was like, “I want to make movies for a living. I want to do the next Jurassic Park.” So, I went to film school. Actually, I got rejected from the film program the first time I applied, which made me volunteer for every film thing going on at the college — craft service, carrying lights, whatever I could do. One day, my roommate was like, “YouTube is going to be the next big thing for videos. You should get on that.”

And you did.
Well, I started making videos just kind of for fun, not expecting anything to happen. But it blew up. Eight years later, it’s become the YouTube channel we have now, with five million subscribers. And we get to travel around the world creating content that we love creating.

Working on a promo video for Recoil – all the effects were done practically.

And you got to bring it full circle when you worked with Universal on promoting Fallen Kingdom.
I did! That was so fun and exciting. But yeah, I was always making content. I didn’t wait ‘til after I graduated. I was constantly looking for opportunities and networking with people from the film program. I think that was a big part of (succeeding at that time), just looking for every opportunity to milk it for everything I could.

In the early days, how did you promote your work?
I was creating all my stuff on YouTube, which, at that time, had hardly any solid, quality content. There was a lot of content, but it was mostly shot on whatever smartphone people had, or it was just people blogging. There wasn’t really anything cinematic, so right away our stuff stood out. One of the first videos I ever posted ended up getting like a million views right away, and people all around the world started contacting me, saying, “Hey, Devin, I’d love for you to shoot a commercial for us.” I had these big opportunities right from the start, just by creating content with my friends and putting it out on YouTube.

Where did you get the money for equipment?
In the beginning, I didn’t even own a camera. I just borrowed some from friends. We didn’t have any fancy stuff. I was using a Canon 5D Mark II and the Canon T2i, which are fairly cheap cameras compared to what we’re using now. But I was just creating the best content I could with the resources I had, and I was able to build a company from that.

If you had to start from scratch today, do you think you could do it again?
I definitely think it’s 100 percent doable, but I would have to play the game differently. Even now we are having to play the game differently than we did six months ago. Social media is hard because it’s constantly evolving. The algorithms keep changing.

Filming in Iceland for an upcoming documentary.

What are you doing today that’s different from before?
One thing is just using trends and popular things that are going on. For example, a year and a half ago, Pokémon Go was very popular, so we did a video on Pokémon and it got 20 million views within a couple weeks. We have to be very smart about what content we put out — not just putting out content to put out content.

One thing that’s always stayed true since the beginning is consistent content. When we don’t put out a video weekly, it actually hurts how widely our content gets seen. The famous people on YouTube now are the ones putting out daily content. For what we’re doing, that’s impossible, so we’ve sort of shifted platforms away from YouTube, which was our bread and butter. Facebook is where we push our main content now, because Facebook doesn’t favor daily content. It just favors good-quality content.

Teens will be the first to say that grown-ups struggle with knowing what’s cool. How do you chase after topics likely to blow up?
A big one is going on YouTube and seeing what videos are trending. Also, if you go to Google Trends, it shows you the top things that were searched that day, that week, that month. So, it’s being on top of that. Or, maybe, Taylor Swift is coming out with a new album; we know that’s going to be really popular. Just staying current with all that stuff. You can also use Facebook, Twitter and Instagram to get an idea of what people are really excited about.

Can you tell us about some of the equipment you use, and the demands that your workflow puts on your storage needs?
We shoot so much content. We own two Red 8K cameras that we film everything with, and we’re shooting daily for the most part. On an average week, we’re shooting about eight terabytes, and then backing that up — so 16 terabytes a week. Obviously, we need a lot of storage, and we need storage that we can access quickly. We’re not putting it on tape. We need to pull stuff up right there and start editing on it right away.

So, we need the biggest drives that are as fast as possible. That’s why we use G-Tech’s 96TB G-Speed Shuttle XL towers. We have around 10 of those, and we’ve been shooting with those for the last three to four years. We needed something super reliable. Some of these shoots involve forking out a lot of money. I can’t take a hard drive and just hope it doesn’t fail. I need something that never fails on me — like ever. It’s just not worth taking that risk. I need a drive that I can completely trust and that is also super-fast.
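As a rough, purely illustrative sketch of that storage math: the 8TB-a-week shooting rate, the doubling for backup and the 96TB tower capacity are the figures Graham quotes above, while the little Python function itself is just our back-of-the-envelope illustration, not part of his actual workflow.

```python
# Back-of-the-envelope estimate of how long a set of storage towers lasts
# at a given weekly ingest rate. The numbers come from the interview above;
# the function is only an illustration.

def weeks_until_full(tower_capacity_tb: float, num_towers: int, weekly_ingest_tb: float) -> float:
    """Return how many weeks of footage (including backups) the towers can hold."""
    total_capacity_tb = tower_capacity_tb * num_towers
    return total_capacity_tb / weekly_ingest_tb

# 8 TB shot per week, doubled to 16 TB once everything is backed up.
weekly_ingest = 8 * 2
print(weeks_until_full(tower_capacity_tb=96, num_towers=10, weekly_ingest_tb=weekly_ingest))
# -> 60.0 weeks, i.e. a bit over a year of footage before the array fills up.
```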

What’s the one piece of advice that you wish somebody had given you when you were starting out?
In my early days, I didn’t have much of a budget, so I would never back up any of my footage. I was working on two really important projects and had them all on one drive. My roommate knocked that drive off the table, and I lost all that footage. It wasn’t backed up. I only had little bits and pieces still saved on the card — enough to release it, but a lot of people wanted to buy the stock footage and I didn’t have most of the original content. I lost out on a huge opportunity.

Today, we back up every single thing we do, no matter how big or how small it is. So, if I could do my early days over again, even if I didn’t have all the money to fund it, I’d figure out a way to have backup drives. That was something I had to learn the hard way.
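For readers wondering what the “back up every single thing” habit looks like in practice, here is a minimal, hypothetical sketch of a copy-and-verify offload. The paths, function names and checksum step are our assumptions for illustration, not Graham’s actual pipeline.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Compute a SHA-256 checksum so a copy can be verified against its source."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_card(source_dir: Path, backup_dir: Path) -> None:
    """Copy every file from a camera card to a second drive and verify each copy."""
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        dst = backup_dir / src.relative_to(source_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if sha256(src) != sha256(dst):
            raise IOError(f"Checksum mismatch for {src}")

# Hypothetical paths; in practice the backup lives on a physically separate drive.
backup_card(Path("/Volumes/CARD_A"), Path("/Volumes/BackupDrive/CARD_A"))
```

The point of the checksum comparison is simply that a copy you haven’t verified isn’t really a backup — which is the lesson of the knocked-over drive above.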

NAB NY: A DP’s perspective

By Barbie Leung

At this year’s NAB New York show, my third, I was able to wander the aisles in search of tools that fit into my world of cinematography. Here are just a few things that caught my eye…

Blackmagic, which had a large booth at the entrance to the hall, was giving demos of its Resolve 15, among other tools. Panasonic also had a strong presence mid-floor, with an emphasis on the EVA-1 cameras. As usual, B&H attracted a lot of attention, as did Arri, which brought a couple of Arri Trinity rigs to demo.

During the HDR Video Essentials session, colorist Juan Salvo of TheColourSpace talked about the emerging HDR10+ standard proposed by Samsung and Amazon Video. He also touched on the trend of consumer displays getting brighter every year and the impact that has on content creation and grading. Salvo pointed out the affordability of LG’s C7 OLEDs (about 700 nits) for use as client monitors, while Flanders Scientific (which had a booth at the show) remains the expensive standard for grading. It was interesting to note that LG, while being the show’s Official Display Partner, was conspicuously absent from the floor.

Many of the panels and presentations unsurprisingly focused on content monetization — how to monetize faster and cheaper. Amazon Web Services’ stage sessions emphasized various AWS Elemental technologies, including automating the creation of highlight clips for content like sports videos using facial recognition algorithms, generating closed captioning, and improving the streaming experience onboard airplanes. The latter, the sessions suggested, could ultimately streamline content delivery for airlines enough to open this currently untapped space to advertisers.

Editor Janis Vogel, a board member of the Blue Collar Post Collective, spoke at the #galsngear “Making Waves” panel, and noted the progression toward remote work in her field. She highlighted the fact that DaVinci Resolve, which had already made it possible for color work to be done remotely, is now also making it possible for editors to collaborate remotely. The ability to work remotely gives professionals the choice to work outside of the expensive-to-live-in major markets, which is highly desirable given that producers are trying to make more and more content while keeping budgets low.

Speaking at the same panel, director of photography/camera operator Selene Richholt spoke to the fact that crews themselves are being monetized, with content producers either asking production and post pros to provide standard services at substandard rates or expecting more services without paying more.

On a more exciting note, she cited recent 9×16 projects that she has shot with the camera mounted vertically (as opposed to shooting 16×9 and cropping in) in order to take full advantage of the lens properties. She looks forward to a trend of more projects that mix aspect ratios and push aesthetics.

Well, that’s it for this year. I’m already looking forward to next year.



Barbie Leung is a New York-based cinematographer and camera operator working in film, music video and branded content. Her work has played Sundance, the Tribeca Film Festival, Outfest and Newfest. She is also the DCP mastering technician at the Tribeca Film Festival.