
Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. Here is the trailer. If you like what you see, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond.

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex it would be to create this idea I was writing in the script. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically, as there were no “fix it in post” scenarios in this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and cast to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel — with each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film, and has been supportive of my career since my short films, so they came on as one of the executive producers on the film. No one had ever shot a full feature film using just Blackmagic cameras, and we also used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.
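For readers wanting to approximate this kind of proxy pass outside of Resolve, a rough sketch is possible with ffmpeg. To be clear, this is not the film’s actual pipeline — the production used Resolve’s built-in on-the-fly proxies — and the `proxy_cmd` helper, folder names and profile choice below are illustrative assumptions:

```shell
# Hypothetical sketch of an ffmpeg-based proxy pass (the film itself used
# Resolve's on-the-fly proxies; the proxy_cmd helper and clip paths are
# illustrative). Requires ffmpeg built with the prores_ks encoder.
proxy_cmd() {
  # $1 = source clip; print a command that makes a half-resolution
  # ProRes Proxy (profile 0) copy, leaving the audio untouched
  printf 'ffmpeg -i %s -c:v prores_ks -profile:v 0 -vf scale=iw/2:ih/2 -c:a copy proxies/%s\n' \
    "$1" "$(basename "$1")"
}

# Emit one transcode command per camera original
# (pipe this script's output to sh to actually run them)
mkdir -p proxies
for clip in footage/*.mov; do
  [ -e "$clip" ] && proxy_cmd "$clip"
done
```

ProRes profile 0 (Proxy) keeps the offline files light for editorial, while the camera originals stay in ProRes 4444 for conform and finishing.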

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I believe makes it the first indie film to get an HDR treatment.

We had only three to four days of grading with Max, and I was in the room with him the whole time. The short schedule was possible because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. That made the workflow as simple as exporting my Resolve file and handing the material over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked that down first before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film Origin Unknown (starring Katee Sackhoff from Battlestar Galactica and The Flash) was green-lit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP, more contracts and so on. The advice I got from other filmmakers was that the right distributor is a big part of how your film will be released. To me, it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal to fit the documentary narrative. We first tried to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly with practical effects would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity so object tracking would work as accurately as possible. We also shot reference footage from multiple angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 was human consciousness in a synthetic shell, breathing didn’t make sense, so we compensated by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t — in fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give a surreal feeling. We then used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

NYC’s Wax adds editor Kate Owen

Kate Owen, an almost 20-year veteran who has cut both spots and films, has joined the editorial roster of New York’s Wax. The UK-born but New York-based Owen has edited projects across fashion, beauty, lifestyle and entertainment for brands such as Gucci, Victoria Beckham, Vogue, Adidas, Sony, Showtime and Virgin Atlantic.

Owen started editing in her teens and subsequently worked with top-tier agencies like Mother, Saatchi NY, McGarryBowen, Grey Worldwide and Y&R. She has also worked at editing houses Marshall Street Editors and Whitehouse Post.

In terms of recognition, Owen has been BAFTA-nominated for her short film Turning and has won multiple industry awards, including One Show, D&AD and BTAA honors, as well as a Gold Cannes Lion for her work on “The Man Who Walked Around the World” campaign for Johnnie Walker.

Owen believes editing is a “fluid puzzle. I create in my mind a somewhat Minority Report wall with all the footage in front of me, where I can scroll through several options in my mind to try out and create fluid visual mixes. It’s always the unknown journey at the start of every project, and the fascination that comes with honing and fine-tuning — or tearing an edit upside down and viewing it from a totally different perspective — that is so exciting to me.”

Regarding her new role, she says, “There is a unique opportunity to create a beauty, fashion and lifestyle edit arm at Wax. The combination of my edit aesthetic and the company’s legacy in agency beauty work is really exciting to me.”

Owen calls herself “a devoted lifetime Avid editor.” She says, for her, it’s the most elegant way to work. “I can build walls of thumbnails in my Selects Bins and create living mood boards. I love how I can work in very detailed timelines and speed effects without having to break my workflow.”

She also gives a shout out to the Wax design and VFX team. “If we need to incorporate After Effects or Maxon Cinema 4D, I am able to brief and work with my team and incorporate those elements into my offline. I also love to work with the agency or director to work out a LUT before the shoot so that the offline looks premium right from the start.”

B&H expands its NAB footprint to target multiple workflows

By Randi Altman

In a short time, many in our industry will be making the pilgrimage to Las Vegas for NAB. They will come (if they are smart) with their comfy shoes, Chapstick and the NAB Show app and plot a course for the most efficient way to see all they need to see.

NAB is a big show that spans a large footprint, and typically companies showing their wares need to pick a hall — Central, South Lower, South Upper or North. This year, however, The Studio-B&H made some pros’ lives a bit easier by adding a booth in South Lower in addition to their usual presence in Central Hall.

B&H’s business and services have grown, so it made perfect sense to Michel Suissa, managing director at The Studio-B&H, to grow their NAB presence to include many of the digital workflows the company has been servicing.

We reached out to Suissa to find out more.

This year B&H and its Studio division are in the South Lower. Why was it important for you guys to have a presence in both the Central and South Halls this year?
The Central Hall has been our home for a long time and it remains our home with our largest footprint, but we felt we needed to have a presence in South Hall as well.

Production and post workflows merge and converge constantly, and we need to be knowledgeable in both. The simple fact is that we serve all segments of our industry, not just image acquisition and camera equipment. Our presence in image- and data-centric workflows has grown by leaps and bounds.

This world is a familiar one for you personally.
That’s true. The post and VFX worlds are very dear to me. I was an editor, Flame artist and colorist for 25 years. This background certainly plays a role in expanding our reach and services to these communities. The Studio-B&H team is part of a company-wide effort to grow our presence in these markets. From a business standpoint, the South Hall attendees are also our customers, and we needed to show we are here to assist and support them.

What kind of workflows should people expect to see at both your NAB locations?
At the South Hall, we will present a whole range of solutions to show the breadth and diversity of what we have to offer. That includes VR post workflow, color grading, animation and VFX, editing and high-performance Flash storage.

In addition to the new booth in South Hall, we have two in Central. One is for B&H’s main product offerings, including our camera shootout, which is a pillar of our NAB presence.

This Studio-B&H booth features a digital cinema and broadcast acquisition technology showcase, including hybrid SDI/IP switching, 4K studio cameras, a gyro-stabilized camera car, the most recent full-frame cinema cameras, and our lightweight cable cam, the DynamiCam.

Our other Central Hall location is where our corporate team can discuss all business opportunities with new and existing B2B customers.

How has The Studio-B&H changed along with the industry over the past year or two?
We have changed quite a bit. With our services and tools, we have reinvented our image from equipment provider to solution provider.

Our services now range from system design to installation and deployment. One of the more notable recent examples is our collaboration with HBO Sports on World Championship Boxing. The Studio-B&H team was instrumental in deploying our DynamiCam system to cover several live fights in different venues and integrating with NEP’s mobile production team. This is part of an entirely new type of service — something the company had never offered its customers before. It is a true game-changer for our presence in the media and entertainment industry.

What do you expect the “big thing” to be at NAB this year?
That’s hard to say. Markets are in transition with a number of new technology advancements: machine learning and AI, cloud-based environments, momentum for the IP transition, AR/VR, etc.

On the acquisition side, full frame/large sensor cameras have captured a lot of attention. And, of course, HDR will be everywhere. It’s almost not a novelty anymore. If you’re not taking advantage of HDR, you are living in the past.

Light Iron opens in Atlanta, targets local film community

In order to support the thriving Georgia production community, post studio Light Iron has opened a new facility in Atlanta. The expansion is the fourth since Panavision acquired Light Iron in 2015, bringing Light Iron’s US locations to six total, including Los Angeles, New York, New Orleans, Albuquerque and Chicago.

“Light Iron has been supporting Georgia productions for years through our mobile dailies services,” explains CFO Peter Cioni. “Now with a team on the ground, productions can take advantage of our facility-based dailies with talent that brings the finishing perspective into the process.”

Clark Cofer

The company’s Atlanta staff recently provided dailies services to season one of Kevin (Probably) Saves the World, season three of Greenleaf and the features Uncle Drew and Superfly.

With a calibrated theater, the Light Iron Atlanta facility has hosted virtual DI sessions from its LA facility for cinematographers working in Atlanta. The theater is also available for projecting camera and lens tests, as well as private screenings for up to 45 guests.

The theater is outfitted with a TVIPS Nevion TBG480, which allows for a full-bandwidth 2K signal from either their LA or NY facility for virtual DI sessions. For example, if a cinematographer is working on another show in Atlanta, they can still connect with the colorist for the final look of their previous show.

The Light Iron Atlanta dailies team uses Colorfront Express Dailies, which is standard across their facility-based and mobile dailies services worldwide.

Cioni notes that the new location is led by director of business development Clark Cofer, a member of Atlanta’s production and post industry. “Clark brings years of local and state-wide relationships to Light Iron, and we are pleased to have him on our growing team.”

Cofer most recently represented Crawford Media Services, where he drove sales for their renowned content services to companies like Lionsgate, Fox and Marvel. He currently serves as co-president of the Georgia Production Partnership, and is on the board of directors for the DeKalb County Film and Entertainment Advisory Board.

First-time director of Beyond Transport calls on AlphaDogs for post

The new documentary Beyond Transport, directed and produced by Ched Lohr, focuses on technology and how it’s brought people together while at the same time creating a huge disconnect in personal relationships. The doc examines this topic from the perspective of cab drivers. Shot on all seven continents, the film includes interviews with drivers who share their accounts of how socializing has changed dramatically in the 21st century.

Eighteen months in the making, Beyond Transport was shot intermittently due to an extensive travel schedule to countries that included Ireland, Cambodia, Tanzania and Australia. An unexpected conversation with a cab driver in Cairns, Australia, and a dive trip to the Great Barrier Reef were what initially inspired Lohr to make the film. “I noticed all the divers were using their personal devices in between dives,” says Lohr. “It seemed like meeting new people and connecting with others has become less of a priority. I thought it would be interesting to interview cab drivers because they have a very unique perspective on people’s behaviors.”

A physician by trade, Lohr had a vision for the documentary, but no idea on how to go about creating it. With no background in producing, writing or even how to use editing systems, Lohr assembled a team of pros to help guide him through the process, including hiring the team at Burbank’s AlphaDogs to complete post for the film.

AlphaDogs colorist Sean Stack distinguished the climates of the various locations by choosing specific color palettes. This helped bring audiences into the story, giving them a sense of what it might feel like to actually be there in person. “The filmmaker talks to cab drivers from a variety of climates, ranging from the searing heat of Tanzania to the frigid temperatures of Antarctica,” says Stack. “With that in mind, I navigated through the documentary looking for ways to help define the surroundings.”

To accomplish this, Stack added saturated warm colors, such as yellow, tan and brown, to locations in South Africa and South America, making even the dirt, cars and buildings radiate a sense of intense heat. In contrast, less saturation was given to the harsher climate of Antarctica, using a series of blue tones for both the sky and the water, which added depth and gave a more frigid, crisp appearance. Blackmagic DaVinci Resolve’s Power Windows were used to fix problems with the uncontrolled lighting in the cab driver interviews. Hand-held footage was also stabilized, with a final touch of film grain added to take away the videotape feel and give the documentary a more inviting texture.

In addition, Stack created an end credits section by pulling shots of the cab drivers looking into the camera and smiling. “This accomplished the goal of the filmmaker to have pictures accompany the end credits,” explains Stack. “It also added another element of connection to the drivers who are telling the story. Seeing them one last time reminds the viewer of some of the best moments in the documentary and hopefully taking those memorable moments away with them.”

AlphaDogs audio engineer Curtis Fritsch completed audio on the film, including cleanup of noisy audio files, since almost all of the interviews take place inside a cab. To keep the audio from sounding overprocessed, Fritsch used a very specific combination of CEDAR and iZotope plugins. “We were able to find a really good balance in making the dialogue sound much clearer and pronounced,” he says. “This was of particular importance in the scene where a muezzin is reciting the adhan (call to prayer). I was able to remove the wind noise so you not only heard the prayer in this dreamlike sequence but also to keep the focus on the music, rather than the VFX.”

Cutter Mark Burnett returns to his Australian roots and The Editors

Editor Mark Burnett has returned home to Australia and The Editors after nine years of cutting in London, most recently at The Whitehouse. Burnett launched his career in Sydney, working at The Post Office before joining The Editors in 2007; he moved to London in 2009 to edit at Speade, then joined The Whitehouse in 2014.

Burnett’s style and comedic timing have brought him industry recognition with Clios, BTA Arrows, Cannes Lions and APA Crystal Awards. Last year he won a Bronze Kinsale Shark Award for his work on McCain’s We Are Family and his quirky approach has seen him cut for comedy directors such as Jim Hosking, Zach Math and Hamish Rothwell.

Burnett is also behind this year’s Sundance film An Evening With Beverly Luff Linn and the Palm Springs Film Festival 2017 opening film Edmund The Magnificent. He is no stranger to longform and has delivered on past Sundance hits The Greasy Strangler (2016) and the LCD Soundsystem documentary Shut Up and Play The Hits (2012).

On his recent signing, Burnett says, “After nine years in the UK and after many long winters, many teas, many pints, many new friends, a child, a lot of travel and a bit of whinging, the time felt right to head home. It made sense to head back to the company that has always been a home away from home, and I am stoked to be welcomed back to The Editors and to be surrounded by not only amazing talent, but amazing people.”

David Walton Smith joins digital agency Grow as head of film

Norfolk, Virginia-based digital agency Grow has expanded its film and production capabilities with the addition of David Walton Smith, who will take on the newly created role of head of film. Walton Smith will be charged with overseeing all content development and video production for the agency’s clients, which include Google, Spotify, Adidas and Adult Swim.

A multidisciplinary filmmaker and creative, Walton Smith has produced commercials, as well as branded and documentary content, for brands like Google, Volvo, Mass Mutual, Hyundai and Aleve. Prior to joining Grow, he was a director and producer at CNN’s branded content division, Courageous Studio, where he created broadcast and web content for CNN’s global audiences. He was also editor of Born to Explore with Richard Wiese, an Emmy Award-winning show that aired on ABC, as well as creative lead/director at London and Brooklyn-based LonelyLeap, where he spearheaded campaigns for Google and Tylenol.

Grow works with brands including Google, Spotify, NBC, Adidas, Homes.com, Oxygen Network and Adult Swim to create digital experiences, products and interactive installations. Notable recent projects include Window Wonderland for Google Shopping; Madden Giferator for EA Sports, as part of Google’s Art, Copy & Code initiative; and The Pursuit, an interactive crime-thriller game created with Oxygen Media.

The challenges of creating a shared storage ‘spec’

By James McKenna

The specification — used in a bid, tender, RFQ or simply to provide vendors with a starting point — has been the source of frustration for many a sales engineer. Not because we wish that we could provide all the features that are listed, but because we can’t help but wonder what the author of those specs was thinking.

Creating a spec should be like designing your ideal product on paper and asking a vendor to come as close as they can to that ideal. Unlike most other forms of shopping, you avoid the sales process until the salesperson knows exactly what you want. This is good in some ways, but very limiting in others.

I dislike analogies with the auto industry because cars are personal and subjective, but in this way, you can see the difference in spec versus evaluation and research. Imagine writing down all the things you want in a car and showing up at the dealership looking for a match. You want power, beauty, technology, sports-car handling and room for five?

Your chances of finding the exact car you want are slim, unless you’re willing to compromise or adjust your budget. The same goes for facility shared storage. Many customers get hung up on the details and refuse to prioritize important aspects, like usability and sustainability, and as a result end up looking at quotes that are two to three times their cost expectations for systems that don’t perform the day-to-day work any better (and often perform worse).

There are three ways to design a specification:

Based On Your Workflow
By far, this is the best method and will result in the easiest path to getting what you want. Go ahead and plan for years down the road and challenge the vendors to keep up with your trajectory. Keep it grounded in what you believe is important to your business. This should include data security, usable administration and efficient management. Lay out your needs for backup strategy and how you’d like that to be automated, and be sure to prioritize these requests so the vendor can focus on what’s most important to you.

Be sure to clearly state the applications you’ll be using, what they will be requiring from the storage and how you expect them to work with the storage. The highest priority and true test of a successful shared storage deployment is: Can you work reliably and consistently to generate revenue? These are my favorite types of specs.

Based On Committee
Some facilities are victims of their own size or budget. When there’s an active presence from the IT department, or the dollar amounts get too high, it’s not just up to the creative folks to select the right product. The committee can include consultants, system administrators, finance and production management, and everyone wants to justify their existence at the table. People with experience in enterprise storage and “big iron” systems will lean on their past knowledge and add terms like “Five-9s uptime,” “No SPOF,” “single namespace,” “multi-path” and “magic quadrant.”

In the enterprise storage world these would be important, but they don’t force vendors to take responsibility for prioritizing the interactions between the creative applications and the storage, and the usability and sustainability of a solution in the long term. The performance necessary to smoothly deliver a 4K program master, on time and on budget, might not even be considered. I see these types of specifications and I know that there will be a rude awakening when the quotes are distributed, usually leading to some modifications of the spec.
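To put one of those enterprise terms in perspective, here is a quick back-of-the-envelope sketch of what "Five-9s uptime" actually promises in minutes of downtime per year (the 365.25-day year is my assumption for the math, not a figure from the column):

```python
# Translate "N nines" uptime percentages into allowed downtime per year.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960 minutes

def allowed_downtime_minutes(uptime_pct: float) -> float:
    """Minutes of downtime per year permitted by a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_pct / 100)

for label, pct in [("Three 9s", 99.9), ("Four 9s", 99.99), ("Five 9s", 99.999)]:
    print(f"{label} ({pct}%): {allowed_downtime_minutes(pct):.1f} min/year")
```

Five nines works out to roughly five minutes of downtime a year, which illustrates the column's point: that level of guarantee carries enterprise complexity and cost that a post facility may never actually need.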

Based On A Product
The most limiting way to design a spec is by copying the feature list of a single product to create your requirements. I should mention that I have helped our customers to do this on some occasions, so I’m guilty here. When a customer really knows the market, and wants to avoid being bid an inferior product, this can be justified. However, you had better have completed your research beforehand, because there may be something out there that could change your opinion, and you don’t want to find out about it after you’re locked into the status quo. If you choose to do this but want to stay on the lookout for another option, simply prioritize the features list by what’s most important to you.

If you really like something about your storage, prioritize that and see if another vendor has something similar. When I respond to these bid specs, I always provide details on our solution and how we can achieve better results than the one that is obviously being requested. Sometimes it works, sometimes not, but at least now they’re educated.

The primary frustration with specifications that miss the mark is the waste of money and time. Enterprise storage features come with enterprise storage complexity and enterprise storage price tags. This requires training or reliance upon the IT staff to manage, or in some cases completely control the network for you. Cost savings in the infrastructure can be repurposed to revenue-generating workstations and artists can be employed instead of full-time techs. There’s a reason that scrappy, grassroots facilities produce faster growth and larger facilities tend to stagnate. They focus on generating content, invest only where needed and scale the storage as the bigger jobs and larger formats arrive.

Stick with a company that makes the process easy and ensures that you’ll never be without a support person that knows your daily grind.


James McKenna is VP of marketing and sales at shared storage company Facilis.

DigitalFilm Tree’s Ramy Katrib talks trends and keynoting BMD conference

By Randi Altman

Blackmagic, which makes tools for all parts of the production and post workflow, is holding its very first Blackmagic Design Conference and Expo, produced with FMC and NAB Show. This three-day event takes place on February 11-13 in Los Angeles. The event includes a paid conference featuring over 35 sessions, as well as a free expo on February 12, which includes special guests, speakers and production and post companies.

Ramy Katrib, founder and CEO of Hollywood-based post house and software development company DigitalFilm Tree, is the keynote speaker for the conference. FotoKem DI colorist Walter Volpatto and color scientist Joseph Slomka will be keynoting the free expo on the 12th.

We reached out to Katrib to find out what he’ll be focusing on in his keynote, as well as pick his brains about technology and trends.

Can you talk about the theme of your keynote?
Resolve has grown mightily over the past few years, and is the foundation of DigitalFilm Tree’s post finishing efforts. I’ll discuss how Resolve is becoming an essential post tool. And with Resolve 14, folks who are coloring, editing, conforming and doing VFX and audio work are now collaborating on the same timeline, and that is a huge development for TV, film and every media industry creative and technician.

Why was it important for you to keynote this event?
DaVinci was part of my life when I was a colorist 25 years ago, and today BMD is relevant to me while I run my own post company, DigitalFilm Tree. On a personal note, I’ve known Grant Petty since 1999 and work with many folks at BMD who develop Resolve and the hardware products we use, like I/O cards and Teranex converters. This relationship involves us sharing our post production pain points and workflow suggestions, while BMD has provided very relevant software and hardware solutions.

Can you give us a sample of something you might talk about?
I’m looking forward to providing an overview of how Resolve is now part of our color, VFX, editorial, conform and deliverables effort, while having artists provide micro demos on stage.

You alluded to the addition of collaboration in Resolve. How important is this for users?
Resolve 14’s new collaboration tools are a huge development for the post industry, specifically in this golden age of TV where binge delivery of multiple episodes at the same time is commonplace. As the complexity of production and post increases, greater collaboration across multiple disciplines is a refreshing turn — it allows multiple artists and technicians to work in one timeline instead of 10 timelines and round tripping across multiple applications.

Blackmagic has ramped up their NLE offerings with Resolve 14. Do you see more and more editors embracing this tool for editing?
Absolutely. It always takes a little time to ramp up in professional communities. It reminds me of when the editors on Scrubs used Final Cut Pro for the first time and that ushered FCP into the TV arena. We’re already working with scripted TV editors who are in the process of transitioning to Resolve. Also, DigitalFilm Tree’s editors are now using Resolve for creative editing.

What about the Fairlight audio offerings within? Will you guys take advantage of that in any way? Do you see others embracing it?
For simple audio work, like mapping audio tracks and creating multi mixes for 5.1 and 7.1 delivery, we are taking advantage of Fairlight and the audio functionality within Resolve. We’re not an audio house, yet it’s great to have a tool like this for convenience and workflow efficiency.

What trends did you see in 2017 and where do you think things will land in 2018?
Last year was about the acceptance of cloud-based production and post processes; this year is about their wider use. In short, what used to be file-based workflows will give way to cloud-based solutions and products.

postPerspective readers can get $50 off registration for the Blackmagic Design Conference & Expo by using the code POST18. Click here to register.

Made in NY’s free post training program continues in 2018

New York City’s post production industry continues to grow thanks to the creation of New York State’s Post Production Film Tax Credit, which was established in 2010. Since then, over 1,000 productions have applied for the credit, creating almost a million new jobs.

“While this creates more pathways for New York City residents to get into the industry, there is evidence that this growth is not equally distributed among women and people of color. In response to this need, the NYC Mayor’s Office of Media and Entertainment decided to create the Made in New York Post Production Training Program, which built on the success of the Made in New York PA Training Program, which for the last 11 years has trained over 700 production assistants for work on TV and film sets,” explains Ryan Penny, program director of the Made In NY Post Production Training Program.

The Post Production Training Program seeks to diversify New York’s post industry by training low-income and unemployed New Yorkers in the basics of editing, animation and visual effects. Created in partnership with the Blue Collar Post Collective, BRIC Media Arts and Borough of Manhattan Community College, the course is free to participants and consists of a five-week, full-time skills training and job placement program administered by workforce development non-profit Brooklyn Workforce Innovations.

Trainees take part in classroom training covering the history and theory of post production, as well as technical training in Avid Media Composer, Adobe’s Premiere, After Effects and Photoshop, and Foundry’s Nuke. “Upon successful completion of the training, our staff will work with graduates to identify job opportunities for a period of two years,” says Penny.

Ryan Penny, far left, with the most recent graduating class.

Launched in June 2017, the Made in New York Post Production Training Program graduated its second cycle of trainees in January 2018 and is now busy establishing partnerships with New York City post houses and productions who are interested in hiring graduates of the program as post PAs, receptionists, client service representatives, media management technicians and more.

“Employers can expect entry-level employees who are passionate about post and hungry to continue learning on the job,” reports Penny. “As an added incentive, the city has created a work-based learning program specifically for MiNY Post graduates, which allows qualified employers to be reimbursed for up to 80% of the first 280 hours of a trainee’s wages. This results in a win-win for employers and employees alike.”
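The reimbursement incentive Penny describes is simple to quantify. This sketch works through the arithmetic of "up to 80% of the first 280 hours of a trainee's wages"; the $18/hour wage in the example is hypothetical, not a program figure:

```python
# Back-of-the-envelope math for the MiNY Post work-based learning incentive:
# employers may be reimbursed up to 80% of the first 280 hours of wages.
REIMBURSEMENT_RATE = 0.80
CAPPED_HOURS = 280

def employer_reimbursement(hours_worked: float, hourly_wage: float) -> float:
    """Maximum reimbursement for a trainee, capped at the first 280 hours."""
    eligible_hours = min(hours_worked, CAPPED_HOURS)
    return eligible_hours * hourly_wage * REIMBURSEMENT_RATE

# A trainee who works 320 hours at a hypothetical $18/hour: only the first
# 280 hours are eligible, so the employer could recoup up to
# 280 * 18 * 0.80 = $4,032.
print(employer_reimbursement(320, 18.0))
```

In other words, the cap means roughly the first seven 40-hour weeks of a hire are subsidized, which is the "win-win" the program is pitching to employers.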

The Made in New York Post Production Training Program will be conducting further cycles throughout the year, beginning with Cycle 3 planned for spring 2018. More information on the program and how to hire program graduates can be found here.