Category Archives: Cinematography

Creating the look for Netflix’s The End of the F***ing World

By Adrian Pennington

Content in 8K UHD won’t be broadcast or streamed to screens anytime soon, but the ultra-high-resolution format is already making its mark in production and post. Remarkably, it is high-end TV drama, rather than feature films, that is leading the way. The End of The F***ing World is the latest series to pioneer a workflow that gives its filmmakers a creative edge.

Adapted from the award-winning graphic novels of Charles Forsman, the dark comedy is an eight-part co-production between Netflix and UK broadcaster Channel 4. The series invites viewers into the confused lives of teen outsiders James (Alex Lawther) and Alyssa (Jessica Barden), as they decide to escape from their families and embark on a road trip to find Alyssa’s estranged father.

Executive producer and director Jonathan Entwistle and cinematographer Justin Brown were looking for something special stylistically to bring the chilling yet humorous tale to life. With Netflix specifying a 4K deliverable, the first critical choice was to use 8K as the dominant format. Brown selected the Red Weapon 8K S35 with the Helium sensor.

In parallel, the filmmakers turned to colorist Toby Tomkins, co-founder of East London grading and finishing boutique studio Cheat, to devise a look and a workflow that would maximize the rich, detailed color, as well as the light information from the Red rushes.

“I’ve worked with Justin for about 10 years, since film school,” explains Tomkins. “Four years ago he shot the pilot for The End of The F***ing World with Jon, which is how I first became involved with the show. Because we’d worked together for so long, I kind of already knew what type of thing they were looking for. Justin shot tests on the Red Weapon, and our first job was to create a 3D LUT for the on-set team to refer to throughout shooting.”
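
A 3D LUT of the kind an on-set team loads into monitors is just a lattice of RGB output values sampled across the input color cube, interpolated per pixel. As a minimal illustrative sketch (a hypothetical 2-point identity LUT, not Cheat’s actual look), applying one with trilinear interpolation can be written like this:

```python
def trilerp_lut(lut, n, rgb):
    """Apply an n x n x n 3D LUT to one RGB triple (components in [0, 1])
    using trilinear interpolation, the same scheme .cube LUTs use."""
    # scale each component into lattice coordinates [0, n-1]
    pos = [min(c * (n - 1), n - 1 - 1e-9) for c in rgb]
    i0 = [int(p) for p in pos]          # lower lattice corner
    i1 = [min(i + 1, n - 1) for i in i0]  # upper lattice corner
    f = [p - i for p, i in zip(pos, i0)]  # fractional position in the cell
    out = []
    for ch in range(3):
        acc = 0.0
        # blend the 8 surrounding lattice entries by their trilinear weights
        for dr in (0, 1):
            for dg in (0, 1):
                for db in (0, 1):
                    r = i1[0] if dr else i0[0]
                    g = i1[1] if dg else i0[1]
                    b = i1[2] if db else i0[2]
                    w = ((f[0] if dr else 1 - f[0])
                         * (f[1] if dg else 1 - f[1])
                         * (f[2] if db else 1 - f[2]))
                    acc += w * lut[r][g][b][ch]
        out.append(acc)
    return tuple(out)

# identity LUT, 2 points per axis: lut[r][g][b] = (r, g, b) normalized
n = 2
identity = [[[(r / (n - 1), g / (n - 1), b / (n - 1)) for b in range(n)]
             for g in range(n)] for r in range(n)]
print(trilerp_lut(identity, n, (0.25, 0.5, 0.75)))  # ≈ (0.25, 0.5, 0.75)
```

A real grading LUT simply stores non-identity output triples at each lattice point (typically 17, 33 or 65 per axis), so the creative look rides along as a small table the on-set monitors can apply in realtime.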

An expert at grading commercials, with the feature-length narrative Sixteen (also shot by Justin Brown) under his belt, Tomkins was taking on his first episodic TV drama, and he relished the challenge. “From the beginning, we knew we wanted to work completely RAW at 7K/8K the whole way through and final output at 4K,” he explains. “We conformed the R3D rushes, which were stored on our SSD NAS. This delivered 10Gbps bandwidth to the suite.”
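
As a rough sanity check on that bandwidth figure (the numbers below are illustrative assumptions — sensor photosite count, an assumed 16-bit raw container and an assumed 8:1 REDCODE compression ratio — not Red’s published data rates), a 10Gbps link comfortably covers a single stream of compressed 8K RAW:

```python
# Can a 10 Gb/s NAS link sustain 8K REDCODE RAW playback? (back-of-envelope)
width, height = 8192, 4320   # Red Helium 8K sensor photosites
bits_per_photosite = 16      # assumed raw container depth
fps = 24
compression = 8              # assumed REDCODE ratio (e.g. 8:1)

uncompressed_bps = width * height * bits_per_photosite * fps
stream_gbps = uncompressed_bps / compression / 1e9
print(f"~{stream_gbps:.1f} Gb/s per stream vs a 10 Gb/s link")
```

Under these assumptions a single stream needs well under 2Gb/s, which is why one 10Gbps connection can feed a grading suite with headroom for layered timelines.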

With just 10 days to grade all the episodes, Tomkins needed to develop a rich “Americana” look that would not only complement the dark narrative but would also work across a range of locations and timescales.

“Our references were films such as No Country for Old Men and Revolutionary Road (both lensed by Roger Deakins, BSC, ASC), which have richness and denseness to them, with skin tones almost a leathery red, adding some warmth to the characters,” he says. “Despite being shot at British locations — with British weather — we wanted to emulate something filmic and American in style. To do this we wanted quite a dense film print look, using skin tones you would find on celluloid film and a shadow and highlight roll-off that you would find in films, as opposed to British TV.”

Cheat used its proprietary film emulation to create the look. With virtually the whole series shot in 8K, the Cheat team invested in a Quad GPU Linux Resolve workstation, with dual Xeon processors, to handle the additional processing requirements once in the DaVinci Resolve finishing suite.

“The creative benefits of working in 8K from the Red RAW images are huge,” says Tomkins. “The workstation gave us the ability to use post-shoot exposure and color temperature settings to photorealistically adjust and match shots and, consequently, more freedom to focus on the finer details of the grade.

“At 8K the noise was so fine in size that we could push the image further. It also let us get cleaner keys due to the over-sample, better tracking, and access to high-frequency detail that we could choose to change or adapt as necessary for texture.”

Cheat had to conform more than 50 days of rushes and 100TB of 7K and 8K RAW material spread across 40 drives, a process that was completed by Cheat junior colorist Caroline Morin in Resolve.

“After the first episode, the series becomes a road movie, so almost each new scene is a new location and lighting setup,” Tomkins explains. “I tried to approach each episode as though it was its own short film and to establish a range of material and emotion for each scene and character, while also trying to maintain a consistent look that flowed throughout the series.”

Tomkins primarily adjusted the RAW settings of the material in Resolve and used lift, gamma and gain to adjust the look depending on the lighting ratios and mood of the scenes. “It’s very easy to talk about workflow, tools and approach, but the real magic comes from creative discussions and experimentation with the director and cinematographer. This process was especially wonderful on this show because we had all worked together several times before and had developed a shorthand for our creative discussions.

“The boundaries are changing,” he adds. “The creative looks that you get to work and play with are so much stronger on television now than they ever used to be.”

NAB 2018: A closer look at Firefly Cinema’s suite of products

By Molly Hill

Firefly Cinema, a French company that produces a full set of post production tools, premiered Version 7 of its products at NAB 2018. I visited with co-founder Philippe Reinaudo and head of business development Morgan Angove at the Flanders Scientific booth. They were knowledgeable and friendly, and they helped me to better understand their software.

Firefly’s suite includes FirePlay, FireDay, FirePost and the brand-new FireVision. All the products share the same database and Éclair color management, making for a smooth and complete workflow. However, Reinaudo says their programs were designed with specific UI/UXs to better support each product’s purpose.

Here is how they break down:
FirePlay: This is an on-set media player that supports almost any format or file. The player is free to use, but there’s a paid option that adds live color grading.

FireDay: Firefly Cinema’s dailies software includes a render tree for multiple versions and supports parallel processing.

FirePost: This is Firefly Cinema’s proprietary color grading software. One of its features is a set of “digital filters” — effects with adjustable parameters, not just preset LUTs. I was also excited to see the inclusion of curve controls similar to Adobe Lightroom’s Vibrance setting, which increases the saturation of just the more muted colors.

FireVision: This new product is a cloud-based review platform, with smooth integration into FirePost. Not only do tags and comments automatically move between FirePost and FireVision, but if you make a grading change in the former and hit render, the version in FireVision automatically updates. While other products such as Frame.io have this feature, Firefly Cinema offers all of these in the same package. The process was simple and impressive.

One of the downsides of their software package is its lack of support for HDR, but Reinaudo says that’s a work in progress. I believe this will likely begin with ÉclairColor HDR, as Reinaudo and his co-founder Luc Geunard are both former Éclair employees. It’s also interesting that they have products for every step after shooting except audio and editing, but perhaps given the popularity of Avid Media Composer, Adobe Premiere and Avid Pro Tools, those are less of a priority for a young company.

Overall, their set of products was professional, comprehensive and smooth to operate, and I look forward to seeing what comes next for Firefly Cinema.


Molly Hill is a motion picture scientist and color nerd, soon-to-be based out of San Francisco. You can follow her on Twitter @mollymh4.


Director HaZ Dulull on his sci-fi offering The Beyond

By Randi Altman

Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.

When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.

This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.

HaZ on set

HaZ was so hands-on that he provided some of the film’s visual effects and edited the film himself. The film is available for purchase or rent on most digital platforms.

When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”

It seems he can’t sit still! Let’s find out more about how he works and about The Beyond…

Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.

But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex it would be to create an idea I was writing into the script. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically, as there were no “fix it in post” scenarios in this film — I wouldn’t allow it.

During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.

As part of prep we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and other casting to the project, as well as get distributors primed early on.

Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel — with each of those opportunities, the script was tweaked to make full use of the locations we had.

You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact my DP Adam Batchelor — who had shot Sync with me and the proof of concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out, but more importantly he was able to get cinematic imagery using those cameras.

Blackmagic was very supportive of the film — and has been supportive of my career since my short films — so they came on as one of the executive producers. No one had ever shot a full feature film using just Blackmagic cameras, and we also used a Resolve pipeline through to delivery, so The Beyond is the perfect case study for it.

Can you talk about that workflow? Any hiccups? 
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.

The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.

We stuck with a ProRes 4444 QuickTime workflow for all material, from footage to VFX renders, and enabled proxy on the fly within Resolve. This was great as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.

How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think is the first indie film to have an HDR treatment done to it.

We had three to four days of grading with Max, and I was in the room with him the whole time. This was because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. This made the workflow as simple as exporting my Resolve file and then the material hand-over to Max, who would load up the Resolve file, link up the material and work from there.

Max kept everything photographically like a documentary but with a slight cinematic flair to it. The big challenge was matching all the various sources of material from the various Blackmagic cameras (Ursa Mini Pro, the Production Camera and the Pocket Camera) to the DJI Osmo, drone footage and stock footage.

How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more — this would not have been possible without the support of my VFX team who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use to create my shots in comp.

My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.

Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.

Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.

JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.

Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.

As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked down on that first before proceeding with the shots.

Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film Origin Unknown (starring Katee Sackhoff from Battlestar Galactica, The Flash) was green-lit and that had its own set of challenges. We can talk more about that when the film is released theatrically and VOD later this year via Kew Media.

This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew were on my short films, including some of the key cast too, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.

What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of paperwork required before a distributor will pick up your film.

Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s Don Quixote, came in during the post stage of the film and helped.

The other thing I learned was the whole sales angle — getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP and more contracts. The advice I got from other filmmakers was that the right distributor plays a big part in how your film is released. To me it was important that the distributor was into the film and not just the trailer, and that they had a solid marketing and sales strategy. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.

What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging, because they had to look photoreal due to the documentary narrative. We first tried to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity, to enable object tracking to work as accurately as possible. We also shot reference footage from multiple angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.

Our biggest issue came in close-up shots, where our actress made very tiny movements due to breathing. Because our Human 2.0 was a human consciousness in a synthetic shell, breathing didn’t make sense, so we tried to compensate by freezing the image or doing some stabilization, which ended up being nearly impossible for the very close-up shots.

In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.

With this being a VFX movie, you would expect us to use some form of greenscreen or bluescreen, but we didn’t — in fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. There was an actor wearing an astronaut suit in a bright photography room, and we used brightly exposed lighting to give a surreal feeling. We used VFX to augment it.

As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”

You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.


Director Kay Cannon on her raunchy comedy Blockers

By Iain Blair

At a time when women are increasingly breaking down barriers in Hollywood, writer/director Kay Cannon is helping lead the charge. The director of Universal’s new film, Blockers, got her start at such comedic training grounds as The Second City, The iO West Theater and The ComedySportz Theatre.

Kay Cannon

While writing and performing around Chicago, she met Tina Fey, a fellow Second City alumna. When Fey began 30 Rock, Cannon joined the creative team and worked her way up from staff writer to supervising producer on the show. She’s a three-time Primetime Emmy-nominated writer, twice for Outstanding Comedy Series and once for Outstanding Writing for a Comedy Series. She has also won three Writers Guild of America Awards, as well as a Peabody, all for her work on 30 Rock.

Cannon, who also served as a co-executive producer on New Girl, a consulting producer on Cristela and co-produced the hit feature Baby Mama, received rave reviews for her debut screenplay for the film Pitch Perfect, and she wrote and co-produced the hit sequels. She served as the executive producer, creator and showrunner of the Netflix series Girlboss, based on Sophia Amoruso’s best-selling autobiography, which starred Britt Robertson.

Now, with the new release Blockers, Cannon — one of only a handful of women ever to direct an R-rated comedy for a big studio — has stepped behind the camera and made an assured and polished directorial debut with this coming-of-age sex comedy that takes one of the most relatable rites of passage and upends a long-held double standard. When three parents discover their daughters’ pact to lose their virginity at prom, they launch a covert one-night operation to stop the teens from sealing the deal.

The film stars Leslie Mann (The Other Woman, This is 40), John Cena (Trainwreck, Sisters) and Ike Barinholtz (Neighbors, Suicide Squad). It is produced by Evan Goldberg, Seth Rogen and James Weaver, under their Point Grey Pictures banner (Neighbors, This is the End), alongside Jon Hurwitz and Hayden Schlossberg (Harold & Kumar) and Chris Fenton (47 Ronin).

Cannon led an accomplished behind-the-scenes team, including director of photography Russ Alsobrook (Forgetting Sarah Marshall, Superbad), production designer Brandon Tonner-Connolly (The Big Sick) and editor Stacey Schroeder (The Disaster Artist).

I recently talked to Cannon about making the bawdy film, which generated huge buzz at SXSW, and her advice for other aspiring women directors.

This is like a long-overdue female take on such raunchy-but-sweet male comedies as American Pie and Superbad. Was that the appeal of this story for you?
When I read the script, I really connected on two levels. I was a teenager who lost her virginity, and I’m also the mother of a daughter, and while she was only two at the time, it made me think about her and what might happen to her in the future. And that’s scary, and I saw how parents can lose their minds.

How did you first envision the film?
I grew up in a small town in the Chicago area and I was inspired by John Hughes and all his great teen comedies. I could really relate to them, and I felt he was speaking to me, that he really got that world and the way it looked. I wanted to do that too, and show how people really live, and I wanted it to feel real and grounded — but then I was also going to go to a very crazy place and get very silly. (Laughs) That was very important to me, because I wanted to make people laugh really hard, but also feel emotion.

Did you always want to direct?
It wasn’t always my dream. That’s shifted over the years. I started off wanting to be an actor on a sitcom, then writing one and then wanting to have my own show, which happened with Girlboss, so that was my focus for the past few years. To be honest, I’d kind of do movies when TV didn’t work out for me. A pilot didn’t happen, so I wrote Pitch Perfect, and then did Pitch 2 when another pilot didn’t go.

How did you prepare for directing your first film?
Being the showrunner on Girlboss was great training because I could shadow all the directors and watch them work, and I definitely felt ready to direct a film.

What was the biggest surprise of directing for the first time?
I pretty much knew what to expect — and that there will always be surprises on the day and stuff you could never have anticipated. You just have to work through it and keep going.

How tough was the shoot?
It was hard. We shot in Atlanta for nine weeks, and the last five were nights, and that’s very tough. I had a very long script to squeeze into the shoot. But Russ, my DP, was a huge help. We’d worked together before on New Girl, and he’s so experienced; he really guided me through it all.

Where did you do the post?
All in LA. We started at Sunset Gower, and then we took a break and did some reshoots in January, and then finished at Pivotal Post in Burbank.

Do you like post?
When I was at Girlboss I’d never experienced post before, so I was really afraid and uncomfortable with the whole process. It was so new and a bit daunting to me, especially as a writer. I loved writing and shooting, but it took me a while to get comfortable with post. But once I did, I loved it, and now it’s my favorite thing. I’d spend the night there if I could! As they say, it’s where you actually make the film and where the real magic happens.

Kay Cannon on set directing Leslie Mann and John Cena.

Your editor was Stacey Schroeder (pilot for The Last Man on Earth, for which she got an editing Emmy nom). How did that relationship work?
We’d worked together before on Girlboss, and we have a great partnership. She’s like my right-hand, and we’re automatically on the same page. We very rarely disagree, and what’s so great is that she’s extremely opinionated and has no poker face. I’m the same way. So it’s very refreshing to sit there and discuss material and any problems without taking anything personally. I really appreciate her honesty.

What were the biggest editing challenges?
Trying to balance the raucous comedy stuff with the serious emotions underneath, and dealing with some of the big set pieces. The whole puking scene was difficult as we shot three times the material you see, and there was a whole drug thing, and it was very long and it just wasn’t working. We previewed it a couple of times and it was seen as a poor man’s Bridesmaids. (Laughs) And then I saw Baby Driver and it hit us — what if we put the whole scene to music? And that was so much fun and it suddenly all worked.

Resistance VFX did the visual effects shots, and there seemed to be quite a few, considering it’s a comedy. What was involved?
You’re right. Usually comedies don’t have that many, but we had a significant amount, including the puke scenes, all the computer stuff and the emojis. And then they did such a great job with all of Leslie Mann’s tears at the end. I really loved working with VFX, and the fact that they can create all this magic in post. I’d be constantly amazed: “Can you do that?” They’d sigh and go, “Yes Kay, we can do that, no problem.” It was a real education for me.

Where did you do the DI?
At Technicolor, and I was pretty involved along with Russ. I loved that whole process too. Again, it’s the magic of post. (Maxine Gervais was the supervising senior colorist. She used a FilmLight Baselight 5.)

Did it turn out the way you hoped?
Absolutely.

Do you want to direct again?
Definitely, if I get another chance.

What’s next?
I’m writing a movie for Sony — another comedy — and I’ve got a bunch of projects percolating.

What advice would you give to any woman wanting to direct?
Do the work, and don’t quit when it gets hard. I think a lot of women quit before the magic happens, and there were times when I wanted to quit, but you can’t. You have to keep going.

Kay Cannon Photo Credit: Quantrell D. Colbert (c) 2018 Universal. 


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


B&H expands its NAB footprint to target multiple workflows

By Randi Altman

In a short time, many in our industry will be making the pilgrimage to Las Vegas for NAB. They will come (if they are smart) with their comfy shoes, Chapstick and the NAB Show app and plot a course for the most efficient way to see all they need to see.

NAB is a big show that spans a large footprint, and typically companies showing their wares need to pick a hall — Central, South Lower, South Upper or North. This year, however, The Studio-B&H made some pros’ lives a bit easier by adding a booth in South Lower in addition to their usual presence in Central Hall.

B&H’s business and services have grown, so it made perfect sense to Michel Suissa, managing director at The Studio-B&H, to grow their NAB presence to include many of the digital workflows the company has been servicing.

We reached out to Suissa to find out more.

This year B&H and its Studio division are in the South Lower. Why was it important for you guys to have a presence in both the Central and South Halls this year?
The Central Hall has been our home for a long time and it remains our home with our largest footprint, but we felt we needed to have a presence in South Hall as well.

Production and post workflows merge and converge constantly, and we need to be knowledgeable in both. The simple fact is that we serve all segments of our industry, not just image acquisition and camera equipment. Our presence in image- and data-centric workflows has grown by leaps and bounds.

This world is a familiar one for you personally.
That’s true. The post and VFX worlds are very dear to me. I was an editor, Flame artist and colorist for 25 years. This background certainly plays a role in expanding our reach and services to these communities. The Studio-B&H team is part of a company-wide effort to grow our presence in these markets. From a business standpoint, the South Hall attendees are also our customers, and we needed to show we are here to assist and support them.

What kind of workflows should people expect to see at both your NAB locations?
At the South Hall, we will present a whole range of solutions to show the breadth and diversity of what we have to offer. That includes VR post workflow, color grading, animation and VFX, editing and high-performance Flash storage.

In addition to the new booth in South Hall, we have two in Central. One is for B&H’s main product offerings, including our camera shootout, which is a pillar of our NAB presence.

This Studio-B&H booth features a digital cinema and broadcast acquisition technology showcase, including hybrid SDI/IP switching, 4K studio cameras, a gyro-stabilized camera car, the most recent full-frame cinema cameras, and our lightweight cable cam, the DynamiCam.

Our other Central Hall location is where our corporate team can discuss all business opportunities with new and existing B2B customers.

How has The Studio-B&H changed along with the industry over the past year or two?
We have changed quite a bit. With our services and tools, we have re-invented our image, going from equipment provider to solution provider.

Our services now range from system design to installation and deployment. One of the more notable recent examples is our collaboration with HBO Sports on World Championship Boxing. The Studio-B&H team was instrumental in deploying our DynamiCam system to cover several live fights in different venues and integrating with NEP’s mobile production team. This is part of an entirely new type of service, something the company had never offered its customers before. It is a true game-changer for our presence in the media and entertainment industry.

What do you expect the “big thing” to be at NAB this year?
That’s hard to say. Markets are in transition with a number of new technology advancements: machine learning and AI, cloud-based environments, momentum for the IP transition, AR/VR, etc.

On the acquisition side, full frame/large sensor cameras have captured a lot of attention. And, of course, HDR will be everywhere. It’s almost not a novelty anymore. If you’re not taking advantage of HDR, you are living in the past.


Red’s new Gemini 5K S35 sensor offers low-light and standard mode

Red Digital Cinema’s new Gemini 5K S35 sensor for its Red Epic-W camera leverages dual-sensitivity modes, allowing shooters to use standard mode for well-lit conditions or low-light mode for darker environments.

In low-light conditions, the Gemini 5K S35 sensor allows for cleaner imagery with less noise and better shadow detail. Camera operators can easily switch between modes through the camera’s on-screen menu with no downtime.

The Gemini 5K S35 sensor offers an increased field of view at 2K and 4K resolutions compared to the higher-resolution Red Helium sensor. In addition, the sensor’s 30.72mm x 18mm dimensions allow for greater anamorphic lens coverage than the Helium or Red Dragon sensors.

“While the Gemini sensor was developed for low-light conditions in outer space, we quickly saw there was so much more to this sensor,” explains Jarred Land, president of Red Digital Cinema. “In fact, we loved the potential of this sensor so much, we wanted to evolve it for broader appeal. As a result, the Epic-W Gemini now sports dual-sensitivity modes. It still has the low-light performance mode, but also has a default, standard mode that allows you to shoot in brighter conditions.”

Built on the compact DSMC2 form factor, this new camera and sensor combination captures 5K full-format motion at up to 96fps, with data speeds of up to 275MB per second. Additionally, it supports Red’s IPP2 enhanced image processing pipeline in-camera. Like all of Red’s DSMC2 cameras, the Epic-W can record Redcode RAW simultaneously with Apple ProRes or Avid DNxHD/HR, and it adheres to Red’s “Obsolescence Obsolete” program, which allows current Red owners to upgrade their technology as innovations are unveiled. It also lets them move between camera systems without having to purchase all new gear.
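As a rough sanity check, that 275MB-per-second peak figure translates into record times like this (back-of-envelope math only; the 1TB mag size below is an illustrative assumption, not a Red spec):

```python
# Back-of-envelope math on the 275 MB/s peak data rate quoted above.
# The 1 TB mag size is an illustrative assumption, not a Red spec.
DATA_RATE_MBPS = 275  # megabytes per second

gb_per_minute = DATA_RATE_MBPS * 60 / 1000          # 16.5 GB per minute
minutes_on_1tb = 1000 * 1000 / DATA_RATE_MBPS / 60  # roughly 60.6 minutes
```

In practice, shooting below the peak rate stretches those numbers considerably.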

Starting at $24,500, the new Red Epic-W with Gemini 5K S35 sensor is available for purchase now. Alternatively, Weapon Carbon Fiber and Red Epic-W 8K customers will have the option to upgrade to the Gemini sensor at a later date.


Color plays key role in Ava DuVernay’s A Wrinkle in Time

Color itself plays a significant role in the fantasy feature A Wrinkle in Time. To help get the look she wanted, director Ava DuVernay chose Mitch Paulson of Hollywood’s Efilm to handle final color grading — the two worked together on the Oscar-nominated film Selma. Wrinkle, which was shot by Tobias Schliessler, captures the magical feel of lead character Meg’s (Storm Reid) journey through time and space.

The film has several different looks. The rather gloomy appearance of Meg’s difficult life on Earth is contrasted with the incredibly vibrant appearance of the far-off planets she’s taken to by a trio of magical women — played by Oprah Winfrey, Reese Witherspoon and Mindy Kaling.

Paulson recalls DuVernay’s thinking. “Ava talked a bit about The Wizard of Oz, where the early scenes are in black and white and then it goes into color. She didn’t want to take things that far, but that informed the overall approach. The parts on Earth at the beginning are somewhat desaturated and depressed looking. Meg lives with her mom because her dad has mysteriously disappeared. She has issues at school and is constantly bullied.”

To fine-tune this idea, Paulson built curves inside Autodesk Lustre 2017. These were designed to desaturate many colors, particularly blues and greens, without significantly altering skin tones. Then he went through shot by shot to refine this even further, using Lustre’s Diamond Keyer to isolate certain colors (such as the blue in a row of school lockers) and pull out more saturation. “I keyed almost everything,” he says, “grass, skies, water. I’d have at least three to four keys per shot.”
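The kind of selective desaturation Paulson describes — pulling saturation out of blues and greens while leaving skin tones alone — can be sketched as a simple hue-keyed operation. The function below is purely illustrative: Lustre’s Diamond Keyer works on a chrominance diamond rather than a plain hue band, and the names and thresholds here are invented.

```python
# Hypothetical sketch of hue-keyed desaturation, loosely analogous to the
# curves/keyer approach described above. Thresholds are illustrative only.
import colorsys

def desaturate_hues(rgb, hue_lo, hue_hi, amount):
    """Reduce saturation only where the hue falls in [hue_lo, hue_hi].

    rgb: (r, g, b) floats in 0..1; hues in 0..1 (~0.33 green, ~0.66 blue);
    amount: 0 = no change, 1 = fully desaturated.
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if hue_lo <= h <= hue_hi:
        s *= (1.0 - amount)
    return colorsys.hsv_to_rgb(h, s, v)

# A blue locker color gets muted...
muted_blue = desaturate_hues((0.2, 0.3, 0.9), 0.45, 0.75, 0.6)
# ...while a warm skin tone (hue ~0.06) passes through untouched.
skin = desaturate_hues((0.9, 0.7, 0.6), 0.45, 0.75, 0.6)
```

A production keyer adds softness at the band edges so the transition doesn’t band; this sketch uses a hard cutoff for brevity.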

Then, as Meg and friends travel to the other planets, Paulson says, “We did the opposite and used curves and keying to make things brighter and more saturated. As soon as they jump to the first planet, you feel the difference.” He also points out that the time travelers find themselves in a large grassy field — a scene for which he isolated the real green of the New Zealand location and brought the saturation beyond anything we’d be used to seeing in real life.

“By manipulating the chrominance softness and tolerance diamonds of the keyer, you can quickly and easily isolate the color for a key. I find it more effective than an HSL tracker,” he explains. The colorist also finds the system’s shapes tool very effective. “I use it all the time to isolate a portion of an actor’s face or hair to create a subtle idea of light there that sometimes really helps as a final step to making a VFX shot blend perfectly with the background.”

Not all the planets the characters travel to are happy places, and Paulson worked with the filmmakers to create some variations on the color themes. The planet Camazotz is an evil place, he says. “That’s not obvious at first, but we sort of cue it right away by making it look just a bit off. For example, we took almost all the green out of the plants.”

Besides the standard d-cinema version, Paulson also did trim passes for Dolby Cinema 2D, Dolby Cinema 3D (14 foot-lamberts) and standard 3D (3.5 foot-lamberts), each of which required additional refinement. “Tobias likes the really deep blacks you can get in the Dolby Cinema version, but we didn’t want to push things too far. It’s already so colorful and saturated that when we opened the files in PQ (Dolby’s Perceptual Quantizer), we pulled a lot of it back so that it has an extra pop but still looks very similar to the P3 version.”
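For reference, the foot-lambert targets Paulson mentions convert to the cd/m² (nit) values used in PQ grading via 1 fL ≈ 3.426 cd/m² (a units sketch, not part of Efilm’s actual pipeline):

```python
# Converting the trim-pass luminance targets from foot-lamberts to nits.
FL_TO_NITS = 3.426  # 1 foot-lambert in candelas per square meter (cd/m^2)

dolby_3d_nits = 14 * FL_TO_NITS      # about 48 nits, the standard 2D cinema level
standard_3d_nits = 3.5 * FL_TO_NITS  # about 12 nits, a quarter of the light
```

The gap between those two targets is why a separate trim pass per deliverable, rather than a single grade, is needed.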

Dailies were colored at Efilm by Adrian DeLude on Colorfront OSD. Files were conformed in Autodesk Flame. Deluxe’s Portal service was the tool used by VFX vendors to locate and download camera-original material and upload iterations of shots, which were then integrated onto Paulson’s Lustre timeline as the final grade proceeded.


Video: Red Sparrow colorist David Hussey talks workflow

After film school and a stint as an assistant editor, David Hussey found himself drawn to color grading. He then became an assistant to a colorist, and his path was set.

In a recent video interview, the now-senior colorist at LA’s Company 3 talks about the differences between coloring a short-form project and a long-form film, and walks us through his workflow on Red Sparrow, which stars Jennifer Lawrence as a Russian ballerina-turned-spy.

Please watch…


Netflix’s Altered Carbon: the look, the feel, the post

By Randi Altman

Netflix’s Altered Carbon is a new sci-fi series set in a dystopian future where people are immortal thanks to something called “stacks,” which contain their entire essence — their personalities, their memories, everything. The one catch is that unless you are a Meth (one of the rich and powerful), you need to buy a “sleeve” (a body) for your stack, and it might bear no resemblance to your former self. It could be a different color, a different sex, a different age, a different everything. You have to take what you can get.

Based on a 2002 novel by Richard K. Morgan, it stars Swedish actor Joel Kinnaman.

Jill Bogdanowicz

We reached out to the show’s colorist, Jill Bogdanowicz, as well as post producer Allen Marshall Palmer to find out more about the show’s varied and distinctive looks.

The look has a very Blade Runner-type feel. Was that in homage to the films?
Bogdanowicz: The creators wanted a film noir look. Blade Runner is the same genre, but the show isn’t specifically an homage to Blade Runner.

Palmer: I’ll leave that for fans to dissect.

Jill, can you talk about your process? What tools did you use?
Bogdanowicz: I designed a LUT to create that film noir look before shooting. I actually provided a few options, and they chose my favorite one and used it throughout. After they shot everything and I had all 10 episodes in my bay, I got familiar with the content, wrapped my head around the story and came up with ideas to tell that story with color.

The show covers many different times and places so scenes needed to be treated visually to show audiences where the story is and what’s happened. I colored both HDR (Dolby Vision) and SDR passes using DaVinci Resolve.

I worked very closely with both DPs — Martin Ahlgren and Neville Kidd — in pre-timing the show, and they gave me a nice idea of what they were looking for so I had a great starting point. They were very close knit. The entire team on this project was an absolute pleasure, and it was a great creative collaboration, which comes through in the final product of the show.

The show is shot and posted like a feature and has a feature feel. Was that part of your marching orders?
Bogdanowicz: I’m primarily a features colorist, so I’m very familiar with the film noir look and heavy VFX, and that’s one reason I was included on this project. It was right up my alley.

Palmer: We approached Altered Carbon as a 10-part feature rather than a television series. I coined the term “feature episodic entertainment,” which describes what we were aspiring to — destination viewing instead of something merely disposable. In a world with so many viewing options, we wanted to command the viewer’s full attention, and fans are rewarded for that attention.

We were very concerned about how images, especially VFX, were going to look in HDR so we had weekly VFX approval sessions with Jill, our mastering colorist, in her color timing bay.

The executive producers and studio, along with the VFX and post teams, were able to sit together, adjusting color corrections if needed before giving final approval on shots. This gave us really good technical and creative quality control. Despite our initial concerns about VFX shots in HDR, we found that with vendors like Double Negative and Milk, with their robust 16-bit EXR pipelines, we weren’t “breaking” VFX shots when color correcting for HDR.

How did the VFX affect the workflow?
Bogdanowicz: Because I was brought on so early, the LUT I created was shared with the VFX vendors so they had a good estimation of the show’s contrast. That really helped them visualize the look of the show so that the look of the shots was pretty darn close by the time I got them in my bay.
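A show LUT like the one shared with the vendors is typically a 3D LUT (a .cube file, for example), applied by trilinear interpolation over a lattice of sample colors. The sketch below is a hypothetical illustration of that mechanism, not the actual tooling used on the show:

```python
# Minimal trilinear 3D-LUT application. Illustrative only; real pipelines use
# larger lattices (17, 33 or 65 nodes per axis) and tetrahedral interpolation.
def apply_lut3d(lut, size, rgb):
    """lut: flat list of (r, g, b) output nodes in .cube order (red index
    varies fastest); size: nodes per axis; rgb: input color in 0..1."""
    def node(ri, gi, bi):
        return lut[(bi * size + gi) * size + ri]

    coords = [min(max(c, 0.0), 1.0) * (size - 1) for c in rgb]
    base = [min(int(c), size - 2) for c in coords]   # lower lattice corner
    frac = [c - b for c, b in zip(coords, base)]     # position within the cell
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):                                # blend the 8 cell corners
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                n = node(base[0] + dr, base[1] + dg, base[2] + db)
                for k in range(3):
                    out[k] += w * n[k]
    return tuple(out)

# Sanity check with a tiny 3x3x3 identity LUT, which leaves colors unchanged:
SIZE = 3
identity = [(r / 2, g / 2, b / 2)
            for b in range(SIZE) for g in range(SIZE) for r in range(SIZE)]
graded = apply_lut3d(identity, SIZE, (0.5, 0.25, 1.0))
```

Because the LUT is just data, the same file can be loaded on set, in the VFX vendors’ review tools and in the grading suite, which is what keeps everyone’s contrast estimation consistent.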

Was there a favorite scene or scenes?
Bogdanowicz: There are so many spectacular moments, but the emotional core for me is in episode 104 when we see the beginning of the Kovacs and Quell love story in the past and how that love gives Kovacs the strength to survive in the present day.

Palmer: That’s a tough question! There are so many, it’s hard to choose. I think the episode that really jumps out is the one in which Joel Kinnaman’s character is being tortured and the content skips back and forth in time, changes and alternates between VR and reality. It was fun to create a different visual language for each space.

Can you talk about challenges in the process and how you overcame them?
Bogdanowicz: The show features a lot of VFX and they all need to look as real as possible, so I had to make sure they felt part of the worlds. Fortunately, VFX supervisor Everett Burrell and his team are amazing and the VFX is top notch. Coming up with different ideas and collaborating with producers James Middleton and Laeta Kalogridis on those ideas was a really fun creative challenge. I used the Sapphire VFX plugin for Resolve to heavily treat and texture VR looks in different ways.

Palmer: In addition to the data management challenges on the picture side, we were dealing with mixing in Dolby Atmos. It was very easy to get distracted with how great the Atmos mix sounds — the downmixes generally translated very well, but monitoring in 5.1 and 2.0 did reveal some small details that we wanted to adjust. Generally, we’re very happy with how both the picture and sound are translating into viewers’ homes.

Dolby Vision HDR is great at taking what’s in the color bay into the home viewing environment, but there are still so many variables in viewing setups that you can end up chasing your own tail. It was great to see behind the scenes how dedicated Netflix is to providing the best picture and sound quality through the service.

The look of the AI hotel was so warm. I wanted to live there. Can you talk about that look?
Bogdanowicz: The AI hotel look was mostly done in design and lighting. I saw the warm practical lights and rich details in the architecture and throughout the hotel and ran with it. I just aimed to keep the look filmic and inviting.

What about the look of where the wealthy people lived?
Bogdanowicz: The Meth houses are above the clouds, so we kept the look very clean and cool with a lot of true whites and elegant color separation.

Seems like there were a few different looks within the show?
Bogdanowicz: The same LUT for the film noir look is used throughout the show, but the VR looks are very different. I used Sapphire to come up with different concepts and textures for the different VR looks, from rich quality of the high-end VR to the cheap VR found underneath a noodle bar.

Allen, can you walk us through the workflow from production to post?
Palmer: With the exception of specialty shots, the show was photographed on Alexa 65 — mostly in 5K mode, but occasionally in 6.5K and 4K for certain lenses. The camera is beautiful and a large part of the show’s cinematic look, but it generates a lot of data (about 1.9TB/hour for 5K) so this was the first challenge. The camera dictates using the Codex Vault system, and Encore Vancouver was up to the task for handling this material. We wanted to get the amount of data down for post, so we generated 4096×2304 ProRes 4444XQ “mezzanine” files, which we used for almost all of the show assembly and VFX pulls.
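To put that 1.9TB/hour figure in perspective (back-of-envelope only; the four-hour shooting figure is an illustrative assumption, not a quoted one):

```python
# Rough arithmetic behind the data-management challenge described above,
# using the ~1.9 TB/hour figure quoted for Alexa 65 5K capture.
TB_PER_HOUR = 1.9

mb_per_second = TB_PER_HOUR * 1_000_000 / 3600  # sustained rate, ~528 MB/s
tb_per_4h_day = TB_PER_HOUR * 4                 # ~7.6 TB per 4 hours rolling
```

Numbers like these are why the show stepped down to ProRes 4444XQ mezzanine files for assembly and VFX pulls rather than pushing camera-original data through every stage.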

During production and post, all of our 4K files were kept online at Efilm using their Portal system. This allowed us fast, automated access to the material so we could quickly do VFX pulls, manage color, generate 16-bit EXR frames and send those off to VFX vendors. We knew that time saved there was going to give us more time on the back end to work creatively on the shots, so the Portal was a very valuable tool.

How many VFX shots did you average per episode? Seems like a ton, especially with the AI characters. Who provided those and what were those turnarounds like?
Palmer: There were around 2,300 visual effects shots this season — probably fewer than most people would think, because we built a large Bay City street inside a former newspaper printing facility outside of Vancouver. The shot turnaround varied depending on the complexity and where we were in the schedule. We were lucky that something like episode 1’s “limo ride” sequence was started very early on, because it gave us a lot of time to refine our first grand views of Bay City. Our VFX supervisor Everett Burrell and VFX producer Tony Meagher got us out in front of a lot of challenges, like the amount of 3D work in the last two episodes, by starting that work early; we knew from the script and prep phase that we would need those shots.

Video: Fotokem DI colorist Walter Volpatto on The Last Jedi and color

Last month Blackmagic held its first Expo, and one of the keynote speakers was Fotokem colorist Walter Volpatto. He was born in Italy and grew up on a farm, quite a long way from his current life in Los Angeles.

Volpatto originally got into this industry as a broadcast engineer, but his path continued, and when computers became more a part of this world, he started learning about photography and how computers interact with images.

“I was in the right place at the right moment,” he says. “I was lucky enough to be working with color timers who helped train me and my eye to the color, the image, the feeling and the world they were trying to create. So I was technical first and artistic second and that creates a unique blend.”

And the power of color? Volpatto says, “It’s kind of like when in the 1800s impressionists took over the world of painting; it’s the same now with the colorists. They can create a look that was impossible in-camera, and colorists can now give life to what the camera captured and every shade in between. I’m more on the naturalistic side, but it’s difficult because you have to be able to create what the client wants, but do it in a way that doesn’t step on their photography.”

We were lucky enough to get some quality time with Volpatto — we asked him about his recent high-profile color work on Star Wars: The Last Jedi, how he got started as a colorist and more…