
HPA issues a call for award entries, adds two new TV categories

The HPA (Hollywood Professional Association) has opened the call for entries in creative categories for the 13th annual HPA Awards. These awards recognize artistic excellence in color grading, editing, sound and visual effects in feature film, television and commercials.

The 13th annual awards presentation will be held at the Skirball Cultural Center in Los Angeles on November 15.

This year, two additional creative categories have been announced to reflect the evolution of the industry — Editing for Television and Visual Effects for Television. The category additions were based upon input on the changing nature of the industry from core creative constituents of the HPA Awards, as well as the editing and visual effects communities.

Entries are now being accepted in the following competitive categories:
•  Outstanding Color Grading – Feature Film
•  Outstanding Color Grading – Television
•  Outstanding Color Grading – Commercial
•  Outstanding Editing – Feature Film
•  Outstanding Editing – Television (30 Minutes and Under)
•  Outstanding Editing – Television (Over 30 Minutes)
•  Outstanding Sound – Feature Film
•  Outstanding Sound – Television
•  Outstanding Sound – Commercial
•  Outstanding Visual Effects – Feature Film
•  Outstanding Visual Effects – Television (13 Episodes and Fewer)
•  Outstanding Visual Effects – Television (Over 13 Episodes)

Changes to visual effects submissions teams were also announced. Complete rules, guidelines and entry information for the creative categories and all of the HPA Awards are available here.

Submissions for consideration in the Creative Categories will be accepted between May 16 and July 13. Early Bird Entries (at a reduced entry fee for the Creative Categories) will be accepted through June 11. To be considered eligible, work must have debuted domestically and/or internationally during the eligibility period — September 6, 2017 through September 4, 2018. Entrants do not need to be members of the Hollywood Professional Association or working in the US.

The call for entries for the HPA Engineering Excellence Award opened last month. Submissions for the Engineering Excellence Award will be accepted until May 25.

Review: HP’s zBook x2 mobile workstation

By Brady Betzel

There are a lot of laptops and tablets on the market these days that can seemingly power a SpaceX Falcon 9 rocket launch and landing. If you work in media and entertainment like I do, these days you might even be asked to edit and color correct that Falcon 9 footage that could have been filmed in some insane resolution like 8K.

So how do you edit that footage on the go? You need the most powerful mobile solution on the market. In my mind, only a few machines can handle editing 8K footage (even if the footage is transcoded into manageable ProRes proxies). Razer offers a 4K/UHD “gaming” laptop, the Razer Blade Pro, which sports a high-end Nvidia GTX 1060 GPU and an i7 processor; Dell’s high-end Precision 7720 mobile workstation allows for a high-end Quadro GPU; and HP offers high-quality mobile workstations via its zBook line.

For this review, I am focusing on the transforming HP zBook x2 mobile workstation, complete with an Intel Core i7 CPU, 32GB of memory, an Nvidia Quadro GPU and much more.

The zBook x2 lets you go from laptop to tablet by removing the keyboard. If you’ve ever used a Wacom Cintiq mobile tablet, you’ve likely enjoyed the matte finish of the display, as well as the ability to draw directly on screen with a stylus. Well, the zBook x2 is a full touchscreen as well as a stylus-enabled matte surface compatible with HP’s own battery-less pen. HP’s pen is based on Wacom’s Electro Magnetic Resonance technology, which essentially allows for cable- and battery-free pens.

In addition, the display bezel has 12 buttons that are programmable for apps like Adobe’s Creative Cloud. For those wondering, HP partnered with Adobe when designing the x2, so you will notice that Creative Cloud comes pre-installed on the system, and the quick access buttons around the bezel are already programmed for use in Adobe’s apps. However, they don’t give you a free subscription with purchase — Hey, HP, this would be a nice touch. Just a suggestion.

Digging In
I was sent the top-of-the-line version of the zBook x2, complete with a DreamColor UHD touchscreen display. Here are the specs under the hood:

– Windows 10 64-bit
– Intel Core i7-8650U (quad-core, 8th gen)
– 4K UHD DreamColor Touch with anti-glare
– 32GB (2×16 GB) DDR4 2133 memory
– Nvidia Quadro M620 (2GB)
– 512GB HP Z-Turbo Drive PCIe
– 70Whr fast charging battery
– Intel vPro WLAN
– Backlit Bluetooth Keyboard
– Fingerprint reader
– One- or three-year warranty, including the battery
– Two Thunderbolt 3 ports
– HDMI 1.4 port
– USB 3.0 charging port
– SD card slot
– Headset/microphone port
– External volume controls

The exterior build quality is as impressive as the technical specs. I’ve got to be honest: when I first received the x2, I was put off by the sharp-edged octagon design. I’m used to either square-shaped tablets or rounded edges, so the octagon sides felt a little strange. After using it for a month, though, I came to appreciate how sturdy and well built this machine is. I kind of miss the octagon shape now that I’ve had to ship the x2 back to HP.

In addition, the zBook x2 I received weighed in at around 5lbs (with the Bluetooth keyboard attached), which isn’t exactly lightweight. Part of that weight is the indestructible-feeling magnesium and aluminum casing that surrounds the x2’s internal components.

I’ve reviewed a few of these stylus-based workstations before, such as Microsoft’s Surface Pro and Wacom’s mobile Cintiq offering, and they each have their positives and negatives. One thing that consistently sticks out to me is the kickstand used to prop these machines up. When you use a stylus on a tablet, you will have a height and angle you like to work at. Some tablets, like the Wacom offering, have a few preset heights, and the Surface Pro has a somewhat limited angle, but the zBook x2 has the strongest and best built-in stand I have used. It stays sturdy when working with the stylus in apps like Adobe Photoshop.

HP’s Wacom-infused stylus is very lightweight. I personally like a stylus with a little heft, like the Wacom Pro Pen, but don’t get me wrong, HP’s pen works well. It offers the 4,096 levels of pressure sensitivity that multimedia pros are used to from Wacom pens, and it includes tilt sensitivity. When using tablets, palm rejection is a very important feature, and the x2’s palm rejection is excellent. HP’s fact sheets and website differ on whether the pen is included with the x2, but when ordering it looks like it is bundled with your purchase, as it should be.

One final note on the build quality of HP’s zBook x2: the detachable Bluetooth keyboard is excellent. The keyboard not only acts like a full-sized keyboard, complete with numerical keypad (a favorite of mine when typing in specific timecodes), but it also folds up to protect the screen when not in use.

If you are looking to purchase the zBook x2, you are probably also comparing it to a Microsoft Surface Pro, a Wacom Cintiq mobile computer and maybe an iPad Pro. In my opinion, there is no contest: the x2 wins hands down. However, you are also going to pay a lot more for it. The x2 can be configured with the latest Intel 8th-gen i7 processors and an Nvidia Quadro GPU built into the tablet (not the keyboard, as on the Microsoft Surface Book systems), and it can be packed with 32GB of RAM as opposed to the 16GB found in other tablets. Most importantly, in my opinion, this system offers a color-accurate UHD 10-bit HP DreamColor display. As I said, it is definitely the beefiest mobile workstation/tablet you will find out there, but it will cost you.

One of my favorite practices that HP is starting to standardize among its mobile workstations is quick charging, which takes the battery to 50% in half an hour and tops up the rest over a few more hours. I can’t tell you how handy this is when you are running around all day and don’t have four hours to charge your computer between appointments. When running apps like Blackmagic’s Resolve 14.3 with UHD video, you can drain the battery fast — something like four hours — but being able to quickly charge back up to 50% is a lifesaver in a lot of circumstances.

In the real world, I use my mobile workstations/tablets all the time. I surf the web, listen to music, edit in Adobe Premiere Pro or color correct in Resolve. This means my systems need high-end processors to keep up. The HP zBook x2 is a great addition to your workstation lineup when you need to take your work on the road and not lose any features, like the HP DreamColor display with 100% Adobe RGB coverage. While it’s not a truly calibrated reference monitor, DreamColor displays will, at the very least, give you a common calibration across all DreamColor monitors that you can rely on for color-critical jobs on the run. In addition, DreamColor displays can show different color spaces like BT.709, DCI-P3 and more.

Putting it to the Test
To test the x2, I ran a few tests using one of the free clips that Red offers for download from http://www.red.com/sample-r3d-files. It is the Red One Mysterium clip, 4096×2304 at 29.97fps. For a mobile workstation, this is a pretty hefty clip to run in Resolve or Premiere. In Premiere, the Red clip would play in realtime when dumbed down to half quality. Half quality isn’t bad to work in, but when spending $3,500 I would like to work with better-quality Red files. Maybe the technology will be there in a year.

If you are into the whole offline/online workflow (a.k.a. the proxy workflow: transcoding to an intraframe codec like DNxHR or ProRes), then you will be able to play back the full 4K clip once it’s transcoded to something like DNxHR HQ. Unfortunately, I couldn’t get a 10-bit DNxHR HQX clip to play in realtime, and with that sweet 10-bit display it would have been a welcome success. To test exporting speed, I trimmed the R3D file (still raw Red) to 10 seconds and exported it as a DNxHR HQX 10-bit QuickTime (in the file’s native resolution and frame rate) and as a highly compressed H.264 at around 10Mb/s.
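
If you want to replicate that kind of proxy transcode outside of Premiere or Resolve, here is a minimal sketch that drives ffmpeg from Python. Note the assumptions: ffmpeg has no native R3D decoder, so this presumes the clip has already been debayered to a format ffmpeg can read, and the file names are hypothetical.

```python
# Minimal proxy-transcode sketch: generate a 10-bit DNxHR HQX QuickTime
# from an already-debayered source (ffmpeg cannot decode R3D natively).
import subprocess

def make_dnxhr_proxy(src: str, dst: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd",             # ffmpeg's DNxHD encoder also covers DNxHR
        "-profile:v", "dnxhr_hqx",   # 10-bit DNxHR HQX profile
        "-pix_fmt", "yuv422p10le",   # 10-bit 4:2:2, required for HQX
        "-c:a", "pcm_s16le",         # uncompressed audio, typical for mezzanines
        dst,
    ], check=True)

# Hypothetical file names for illustration.
make_dnxhr_proxy("source_debayered.mov", "proxy_dnxhr_hqx.mov")
```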

The DNxHR HQX 10-bit QuickTime took 1 minute and 25 seconds to export. I then added a 110% resize and a color grade to really make sure the Quadro GPU kicked in, and unfortunately the export failed. I tried multiple times with different Lumetri color grades, and all of them failed — likely a bug.

Next, I exported an ungraded 10Mb/s H.264 MP4 (a bitrate perfect for YouTube); it took 2 minutes and 41 seconds. I then resized the clip to 110% and performed a color grade using the Lumetri tools inside Premiere Pro, and the MP4 exported in 1 minute and 30 seconds. This was pretty incredible, and it really showed just how important that Nvidia Quadro M620 with 2GB of memory is. And while things like resizing and color correcting make the GPU kick in to help, the HP zBook x2 stayed relatively quiet, with an active cooling system that pushes all of the hot air up and out of the magnesium case.
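
For context, a rough command-line analogue of that resize-and-encode step might look like the sketch below. This is not Premiere’s GPU-accelerated path and it applies no Lumetri-style grade; it is a CPU-based ffmpeg illustration with hypothetical file names.

```python
# A rough stand-in for the H.264 test export: a 110% resize plus a
# ~10Mb/s H.264 encode (CPU-based; no GPU grade applied).
import subprocess

subprocess.run([
    "ffmpeg", "-i", "source_debayered.mov",
    # Scale to 110%, rounding to even dimensions as 4:2:0 H.264 requires.
    "-vf", "scale=trunc(iw*1.1/2)*2:trunc(ih*1.1/2)*2",
    "-c:v", "libx264", "-b:v", "10M",   # roughly the YouTube-friendly bitrate
    "-c:a", "aac",
    "h264_test.mp4",
], check=True)
```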

Inside of Resolve 14.3, I performed the same tests on the same Red clip. I could only play the Red clip at about 16fps, even at 1/16 debayer quality. Not great. For a mobile tablet maybe that’s OK, although I would expect more from a workstation. Exporting the DNxHR HQX 10-bit QuickTime took 2 minutes, and the same clip resized to 110% and color graded also took 2 minutes. The H.264 took 2 minutes and 33 seconds without any color grading or resizing, and it also took 2 minutes and 33 seconds when resized to 110% and color graded. I had all caching and performance modes disabled when performing these tests. I would have thought Resolve would perform better than Premiere Pro, but in this case Adobe wins.

As a bonus, I happen to have a Fusion, GoPro’s 360 video camera, and I ran its footage through Fusion Studio, GoPro’s stitching and exporting software. Keep in mind that 360 video is a huge resource hog that takes lots of time to process. The 30-second test clip I exported in flat color, with image stabilization applied, took an hour to export. The resulting file was a 1.5GB, 4992×2496, 4:2:2 CineForm 10-bit YUV QuickTime with Ambisonic audio. That’s a big file and a long render in my opinion, although it would also take a long time on many computers.
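
To put that file in perspective, a quick back-of-the-envelope calculation (using the figures above, decimal units assumed) shows the data rate involved:

```python
# Back-of-the-envelope data rate for the exported CineForm clip:
# 1.5GB over 30 seconds of 4992x2496 stitched 360 footage.
size_mb = 1.5 * 1000     # 1.5GB in MB (decimal)
duration_s = 30
rate = size_mb / duration_s
print(f"{rate:.0f} MB/s (~{rate * 8:.0f} Mb/s)")  # -> 50 MB/s (~400 Mb/s)
```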

Summing up
In the end, the HP zBook x2 is a high-end mobile workstation that doubles as a stylus-based drawing tablet designed to be used in apps like Photoshop and even video editing apps like Premiere Pro.

The x2 is profoundly sturdy with some high-end components, like the Intel i7 8th gen processor, Nvidia Quadro M620 GPU, 4K/UHD HP DreamColor touchscreen display and 32GB of RAM.

But along with these high-end components comes a high price: the setup in this review retails for around $3,500, which is not cheap. But for a system designed to run 24 hours a day, 365 days a year, it might be the investment you need to make.

Do you want to use the tablet at the office connected to a Thunderbolt 3 dock while also powering a 4K display? The x2 is the only mobile tablet workstation that will do this at the moment. If I had any criticisms of the HP zBook x2, they would be the high cost and the terrible speakers. HP touts the Bang & Olufsen speakers on the x2, but they are not good. My Samsung Galaxy S8+ has better speakers.

So whether you are looking to color correct on the road or want a Wacom-style tablet at the office, the HP zBook x2 is a monster that HP has certified with companies like Adobe through Independent Software Vendor (ISV) verifications to ensure your drivers and software will work as well as possible.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Versus Partner/CD Justin Barnes

NAME: Justin Barnes

COMPANY: Versus (@vs_nyc)

CAN YOU DESCRIBE YOUR COMPANY?
We are “versus” the traditional model of a creative studio. Our approach is design driven and full service. We handle everything from live action to post production, animation and VFX. We often see projects from concept through delivery.

WHAT’S YOUR JOB TITLE?
Partner and Creative Director

WHAT DOES THAT ENTAIL?
I handle the creative side of Versus. From pitching to ideation, thought leadership and working closely with our editors, animators, artists and clients to make our creative — and our clients’ creative vision — the best it can be.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
There’s a lot of business and politics that you have to deal with being a creative.

Adidas

WHAT’S YOUR FAVORITE PART OF THE JOB?
Every day is different, full of new challenges and the opportunity to come up with new ideas and make really great work.

WHAT’S YOUR LEAST FAVORITE?
When I have to deal with the business side of things more than the creative side.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
For me, it’s very late at night; the only time I can work with no distractions.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Anything in the creative world.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
It’s been a natural progression for me to be where I am. Working with creative and talented people in an industry with unlimited possibilities has always seemed like a perfect fit.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
– Re-brand of The Washington Post
– Animated content series for the NCAA
– CG campaign for Zyrtec
– Live-action content for Adidas and Alltimers collaboration

Zyrtec

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I am proud of all the projects we do, but the ones that stick out the most are the projects with the biggest challenges that we have pulled together and made look amazing. That seems like every project these days.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My laptop, my phone and Uber.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I can’t live without Pinterest. It’s a place to capture the huge streams of inspiration that come at us each day.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
We have music playing in the office 24/7, everything from hip-hop to classical. We love it all. When I am writing for a pitch, I need a little more concentration. I’ll throw on my headphones and put on something that I can get lost in.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Working on personal projects is big in helping de-stress. Also time at my weekend house in Connecticut.

Creative editorial and post boutique Hiatus opens in Detroit

Hiatus, a full-service post production studio with in-house creative editorial, original music composition and motion graphics departments, has opened in Detroit. Its creative content offerings cover categories such as documentary, narrative, conceptual, music videos and advertising media for all video platforms.

Led by founder/senior editor Shane Patrick Ford, the new company includes executive producer/partner Catherine Pink and executive producer Joshua Magee, who joins Hiatus from the animation studio Lunar North. Additional talent includes editor Josh Beebe, composer/editor David Chapdelaine and animator James Naugle.

The roots of Hiatus go back to The Factory, a music venue founded by Ford while he was still in college, which gave local Detroit musicians, as well as touring bands, a place to play. Ford, along with a small group of creatives, then formed The Work, a production company focused on commercial and advertising projects. For Ford, the launch of Hiatus is an opportunity to focus solely on his editorial projects and to expand his creative reach, and that of his team, nationally.

Leading up to the launch of Hiatus, the team has worked on projects for brands such as Sony, Ford Motor Company, Acura and Bush’s, as well as recent music videos for Lord Huron, Parquet Courts and the Wombats.

The Hiatus team is also putting the finishing touches on the company’s first original feature film, Dare to Struggle, Dare to Win. The film uncovers STRESS, a Detroit Police decoy unit, and the efforts made to restore civil order in post-rebellion 1970s Detroit. Dare to Struggle, Dare to Win makes its debut at the Indy Film Festival in Indianapolis on Sunday, April 29, and Tuesday, May 1, before it hits the film festival circuit.

“Launching Hiatus was a natural evolution for me,” says Ford. “It was time to give my creative team even more opportunities, to expand our network and to collaborate with people across the country that I’ve made great connections with. As the post team evolved within The Work, we outgrew the original role it played within a production company. We began to develop our own team, culture, offerings and our own processes. With the launch of Hiatus, we are poised to better serve the visual arts community, to continue to grow and to be recognized for the talented creative team we are.”

“Instead of having a post house stacked with people, we’d prefer to stay small and choose the right personal fit for each project when it comes to color, VFX and heavy finishing,” explains Hiatus EP Catherine Pink. “We have a network of like-minded artists that we can call on, so each project gets the right creative attention and touch it deserves. Also, the lower overhead allows us to remain nimble and work with a variety of budget needs and all kinds of clients.”

NAB 2018: My key takeaways

By Twain Richardson

I traveled to NAB this year to check out gear, software, technology and storage. Here are my top takeaways.

Promise Atlas S8+
First up is storage and the Promise Atlas S8+, a network-attached storage solution for small groups that features easy, fast NAS connectivity over Thunderbolt 3 and 10Gb Ethernet.

The Thunderbolt 3 version of the Atlas S8+ offers two Thunderbolt 3 ports, four 1Gb Ethernet ports, five USB 3.0 ports and one HDMI output. The 10GBase-T version swaps in two 10Gb/s Ethernet ports for the Thunderbolt 3 connections. It can be configured up to 112TB. The unit comes empty; you will have to buy hard drives for it. The Atlas S8+ will be available later this year.

Lumaforge Jellyfish Tower
The Jellyfish is designed for one thing and one thing only: collaborative video workflow. That means high bandwidth, low latency and no dropped frames. It features a direct connection, and you don’t need a 10GbE switch.

The great thing about this unit is that it runs quiet, and I mean very quiet. You could place it under your desk and you wouldn’t hear it running. It comes with two 10GbE ports and one 1GbE port. It can be configured for more ports and goes up to 200TB. The unit starts at $27,000 and is available now.

G-Drive Mobile Pro SSD
The G-Drive Mobile Pro SSD is blazing-fast storage with data transfer rates of up to 2800MB/s. The company says you can transfer as much as a terabyte of media in seven minutes or less. That’s fast. Very fast.
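
That terabyte claim holds up arithmetically. Here is a quick sanity check, assuming decimal units and the peak transfer rate quoted above:

```python
# Sanity-check "a terabyte in seven minutes or less" at 2800MB/s.
capacity_mb = 1_000_000           # 1TB in MB (decimal units)
rate_mb_per_s = 2800              # claimed peak transfer rate
minutes = capacity_mb / rate_mb_per_s / 60
print(f"{minutes:.1f} minutes")   # -> ~6.0 minutes, within the claim
```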

It provides up to three-meter drop protection, comes with a single Thunderbolt 3 port and is bus-powered. It also features a 1,000lb crush-proof rating, which makes it ideal for use in the field. It will be available in May with a capacity of 500GB; 1TB and 2TB versions will follow later this year.

OWC Thunderblade
Designed to be rugged and dependable as well as blazing fast, the ThunderBlade has a sleek design and comes with a custom-fit ballistic hard-shell case. With capacities of up to 8TB and data transfer rates of up to 2800MB/s, this unit is ideal for on-set workflows. It is not bus-powered, but you can connect two ThunderBlades to reach speeds of up to 3800MB/s. Now that’s fast.

It starts at $1,199 for the 1TB and is available now for purchase.

OWC Mercury Helios FX External Expansion Chassis
Add the power of a high-performance GPU to your Mac or PC via Thunderbolt 3. Performance is plug-and-play, and upgrades are easy. The unit is quiet and runs cool, making it a great addition to your environment.

It starts at $319 and is available now.

Flanders XM650U
This display is beautiful, absolutely beautiful.

The XM650U is a professional reference monitor designed for color-critical monitoring of 4K, UHD, and HD signals. It features the latest large-format OLED panel technology, offering outstanding black levels and overall picture performance. The monitor also features the ability to provide a realtime downscaled HD resolution output.

The FSI booth showcased the display playing HD, UHD and UHD HDR content, demonstrating how versatile the device is.

The monitor goes for $12,995 and is available for purchase now.

DaVinci Resolve 15
Version 15 is arguably the biggest update yet to Resolve. With the addition of Fusion, it combines editing, color correction, audio and now visual effects in one software tool. Other additions include ADR tools in Fairlight and a sound library. The color and edit pages gain a LUT browser, shared grades, stacked timelines, closed captioning tools and more.

You can get Resolve 15 for free — yes, free — with some restrictions, or you can purchase Resolve 15 Studio for $299. It’s available as a beta at the moment.

Those were my top takeaways from NAB 2018. It was a great show, and I look forward to NAB 2019.


Twain Richardson is a co-founder of Frame of Reference, a boutique post production company located on the beautiful island of Jamaica. Follow the studio and Twain on Twitter: @forpostprod @twainrichardson

NAB 2018: A closer look at Firefly Cinema’s suite of products

By Molly Hill

Firefly Cinema, a French company that produces a full set of post production tools, premiered Version 7 of its products at NAB 2018. I visited with co-founder Philippe Reinaudo and head of business development Morgan Angove at the Flanders Scientific booth. They were knowledgeable and friendly, and they helped me to better understand their software.

Firefly’s suite includes FirePlay, FireDay, FirePost and the brand-new FireVision. All the products share the same database and Éclair color management, making for a smooth and complete workflow. However, Reinaudo says their programs were designed with specific UI/UXs to better support each product’s purpose.

Here is how they break down:
FirePlay: This is an on-set media player that supports almost any format or file. The player is free to use, but there’s a paid option that adds live color grading.

FireDay: Firefly Cinema’s dailies software includes a render tree for multiple versions and supports parallel processing.

FirePost: This is Firefly Cinema’s proprietary color grading software. One of its features is a set of “digital filters,” which are effects with adjustable parameters (not just preset LUTs). I was also excited to see curve controls similar to Adobe Lightroom’s Vibrance setting, which increases the saturation of just the more muted colors.

FireVision: This new product is a cloud-based review platform, with smooth integration into FirePost. Not only do tags and comments automatically move between FirePost and FireVision, but if you make a grading change in the former and hit render, the version in FireVision automatically updates. While other products such as Frame.io have this feature, Firefly Cinema offers all of these in the same package. The process was simple and impressive.

One of the downsides of their software package is its lack of support for HDR, but Reinaudo says that’s a work in progress. I believe this will likely begin with ÉclairColor HDR, as Reinaudo and his co-founder Luc Geunard are both former Éclair employees. It’s also interesting that they have products for every step after shooting except audio and editing, but perhaps, given the popularity of Avid Media Composer, Adobe Premiere and Avid Pro Tools, those are less of a priority for a young company.

Overall, their set of products was professional, comprehensive and smooth to operate, and I look forward to seeing what comes next for Firefly Cinema.


Molly Hill is a motion picture scientist and color nerd, soon-to-be based out of San Francisco. You can follow her on Twitter @mollymh4.

Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film. Here is the trailer. If you like what you see, the film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and The Beyond…

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex an idea in the script would be to create. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed (additional financing came in later, during post production), so I wanted to ensure everything was mapped out technically; there were no “fix it in post” scenarios on this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep, we actually shot a test scene to see if this mockumentary format would really work to tell a grounded sci-fi story. It was also used to attract crew and cast to the project, as well as to get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel. With each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film (and has been supportive of my career since my short films), so they came on as one of the executive producers on the film. No one had ever shot a full feature film using just Blackmagic cameras. We also used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups stemmed from the fact that we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think is the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. It made the workflow as simple as exporting my Resolve file and handing the material over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on and I know how long things take. But on the flip side, that had its advantages: they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and we locked that down before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, because during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff of Battlestar Galactica and The Flash), was greenlit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage while directing too. The cool thing about the way we made this film was that most of the crew had worked on my short films, as had some of the key cast, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of other paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle: getting a reputable distributor on board to sell the film in all worldwide territories, and navigating that process with rights, IP and more contracts. The advice I got from other filmmakers is that the right distributor plays a big part in how your film will be released; to me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, and therefore I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal to fit the documentary narrative. We first tried to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly with practical effects would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity, to let object tracking work as accurately as possible. We also shot plenty of reference footage from all angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came when our actress made very tiny movements due to breathing in close-up shots. Because our Human 2.0 is human consciousness in a synthetic shell, breathing didn’t make sense, so we began compensating by freezing the image or stabilizing, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

Being a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t. In fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London: an actor wearing an astronaut suit in a bright photography room, with brightly exposed lighting to give a surreal feeling. We used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks, like stabilization, basic VFX and pan and scans, as well as establishing temp looks while editing. So, in a way, there was no separate offline and online editorial; it was all part of one workflow. We did all our deliverables out of Resolve 14, too.

NYC’s Wax adds editor Kate Owen

Kate Owen, an almost 20-year veteran who has cut both spots and films, has joined the editorial roster of New York’s Wax. The UK-born but New York-based Owen has edited projects across fashion, beauty, lifestyle and entertainment for brands such as Gucci, Victoria Beckham, Vogue, Adidas, Sony, Showtime and Virgin Atlantic.

Owen started editing in her teens and subsequently worked with top-tier agencies like Mother, Saatchi NY, McGarryBowen, Grey Worldwide and Y&R. She has also worked at editing houses Marshall Street Editors and Whitehouse Post.

In terms of recognition, Owen has been BAFTA-nominated for her short film Turning and has won multiple industry awards, including One Show, D&AD and BTAA honors, as well as a Gold Cannes Lion for her work on “The Man Who Walked Around the World” campaign for Johnnie Walker.

Owen believes editing is a “fluid puzzle. I create in my mind a somewhat Minority Report wall with all the footage in front of me, where I can scroll through several options to try out and create fluid visual mixes. It’s always the unknown journey at the start of every project, and the fascination that comes with honing and fine-tuning, or tearing an edit upside down and viewing it from a totally different perspective, that is so exciting to me.”

Regarding her new role, she says, “There is a unique opportunity to create a beauty, fashion and lifestyle edit arm at Wax. The combination of my edit aesthetic and the company’s legacy of agency beauty background is really exciting to me.”

Owen calls herself “a devoted lifetime Avid editor.” She says, for her, it’s the most elegant way to work. “I can build walls of thumbnails in my Selects Bins and create living mood boards. I love how I can work in very detailed timelines and speed effects without having to break my workflow.”

She also gives a shout out to the Wax design and VFX team. “If we need to incorporate After Effects or Maxon Cinema 4D, I am able to brief and work with my team and incorporate those elements into my offline. I also love to work with the agency or director to work out a LUT before the shoot so that the offline looks premium right from the start.”

B&H expands its NAB footprint to target multiple workflows

By Randi Altman

In a short time, many in our industry will be making the pilgrimage to Las Vegas for NAB. They will come (if they are smart) with their comfy shoes, Chapstick and the NAB Show app and plot a course for the most efficient way to see all they need to see.

NAB is a big show that spans a large footprint, and typically companies showing their wares need to pick a hall — Central, South Lower, South Upper or North. This year, however, The Studio-B&H made some pros’ lives a bit easier by adding a booth in South Lower in addition to their usual presence in Central Hall.

B&H’s business and services have grown, so it made perfect sense to Michel Suissa, managing director at The Studio-B&H, to grow their NAB presence to include many of the digital workflows the company has been servicing.

We reached out to Suissa to find out more.

This year B&H and its Studio division are in the South Lower. Why was it important for you guys to have a presence in both the Central and South Halls this year?
The Central Hall has been our home for a long time and it remains our home with our largest footprint, but we felt we needed to have a presence in South Hall as well.

Production and post workflows merge and converge constantly, and we need to be knowledgeable in both. The simple fact is that we serve all segments of our industry, not just image acquisition and camera equipment. Our presence in image- and data-centric workflows has grown by leaps and bounds.

This world is a familiar one for you personally.
That’s true. The post and VFX worlds are very dear to me. I was an editor, Flame artist and colorist for 25 years. This background certainly plays a role in expanding our reach and services to these communities. The Studio-B&H team is part of a company-wide effort to grow our presence in these markets. From a business standpoint, the South Hall attendees are also our customers, and we needed to show we are here to assist and support them.

What kind of workflows should people expect to see at both your NAB locations?
At the South Hall booth, we will present a whole range of solutions that show the breadth and diversity of what we have to offer, including VR post workflow, color grading, animation and VFX, editing and high-performance flash storage.

In addition to the new booth in South Hall, we have two in Central. One is for B&H’s main product offerings, including our camera shootout, which is a pillar of our NAB presence.

This Studio-B&H booth features a digital cinema and broadcast acquisition technology showcase, including hybrid SDI/IP switching, 4K studio cameras, a gyro-stabilized camera car, the most recent full-frame cinema cameras, and our lightweight cable cam, the DynamiCam.

Our other Central Hall location is where our corporate team can discuss all business opportunities with new and existing B2B customers.

How has The Studio-B&H changed along with the industry over the past year or two?
We have changed quite a bit. With our services and tools, we have re-invented our image from equipment provider to solution provider.

Our services now range from system design to installation and deployment. One of the more notable examples is our recent collaboration with HBO Sports on World Championship Boxing. The Studio-B&H team was instrumental in deploying our DynamiCam system to cover several live fights in different venues, integrating with NEP’s mobile production team. This is an entirely new type of service, something the company had never offered its customers before, and a true game-changer for our presence in the media and entertainment industry.

What do you expect the “big thing” to be at NAB this year?
That’s hard to say. Markets are in transition with a number of new technology advancements: machine learning and AI, cloud-based environments, momentum for the IP transition, AR/VR, etc.

On the acquisition side, full frame/large sensor cameras have captured a lot of attention. And, of course, HDR will be everywhere. It’s almost not a novelty anymore. If you’re not taking advantage of HDR, you are living in the past.

Light Iron opens in Atlanta, targets local film community

In order to support the thriving Georgia production community, post studio Light Iron has opened a new facility in Atlanta. The expansion is the fourth since Panavision acquired Light Iron in 2015, bringing Light Iron’s US locations to six in total: Los Angeles, New York, New Orleans, Albuquerque, Chicago and now Atlanta.

“Light Iron has been supporting Georgia productions for years through our mobile dailies services,” explains CFO Peter Cioni. “Now with a team on the ground, productions can take advantage of our facility-based dailies with talent that brings the finishing perspective into the process.”

Clark Cofer

The company’s Atlanta staff recently provided dailies services to season one of Kevin (Probably) Saves the World, season three of Greenleaf and the features Uncle Drew and Superfly.

With a calibrated theater, the Light Iron Atlanta facility has hosted virtual DI sessions from its LA facility for cinematographers working in Atlanta. The theater is also available for projecting camera and lens tests, as well as private screenings for up to 45 guests.

The theater is outfitted with a TVIPS Nevion TBG480, which allows a full-bandwidth 2K signal to come in from either the LA or NY facility for virtual DI sessions. For example, if a cinematographer is working on another show in Atlanta, they can still connect with the colorist for the final look of their previous show.

The Light Iron Atlanta dailies team uses Colorfront Express Dailies, which is standard across their facility-based and mobile dailies services worldwide.

Cioni notes that the new location is led by director of business development Clark Cofer, a member of Atlanta’s production and post industry. “Clark brings years of local and state-wide relationships to Light Iron, and we are pleased to have him on our growing team.”

Cofer most recently represented Crawford Media Services, where he drove sales for their renowned content services to companies like Lionsgate, Fox and Marvel. He currently serves as co-president of the Georgia Production Partnership, and is on the board of directors for the DeKalb County Film and Entertainment Advisory Board.