Author Archives: Randi Altman

Sony to ship Venice camera this month, adds capabilities

Sony’s next-gen CineAlta motion picture camera Venice, which won a postPerspective Impact Award for IBC2017, will start shipping this month. As previously announced, V1.0 features support for full-frame 24x36mm recording. In addition, as a result of customer feedback, Sony has added several new capabilities, including a Dual Base ISO mode. With 15+ stops of exposure latitude, Venice will support an additional High Base ISO of 2500 using the sensor’s physical attributes. This takes advantage of the sensor’s low-light performance while preserving high dynamic range — from 6 stops over to 9 stops under 18% middle gray.

This new capability increases usable exposure indexes at higher ISOs for night exteriors, dark interiors, slower lenses or content that needs to be graded in HDR, while maintaining maximum shadow detail. An added benefit is Venice’s built-in 8-step optical ND filter servo mechanism, which can emulate different ISO operating points when in High Base ISO 2500 while maintaining the extremely low noise characteristics of the Venice sensor.
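Those exposure figures are easy to translate into linear light. A minimal sketch (illustrative only; `stops_to_linear` is a hypothetical helper, not part of any Sony toolset): each stop doubles or halves the scene value relative to 18% middle gray.

```python
MIDDLE_GRAY = 0.18  # 18% reflectance, the standard mid-gray reference

def stops_to_linear(stops_from_gray):
    """Linear scene value for a given stop offset from middle gray (each stop doubles)."""
    return MIDDLE_GRAY * (2 ** stops_from_gray)

highlight = stops_to_linear(6)   # 6 stops over middle gray
shadow = stops_to_linear(-9)     # 9 stops under middle gray
contrast = highlight / shadow    # 6 + 9 = 15 stops of latitude
```

Fifteen stops works out to a contrast ratio of 2^15, or 32,768:1, between the brightest highlight and deepest shadow the sensor can hold.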

Venice also features new color science designed to offer a soft, tonal film look, with a natural response in the shadows and mid-tones and preserved dynamic range in the highlights.

Sony has also developed the Venice camera menu simulator. This tool is designed to give camera operators an opportunity to familiarize themselves with the camera’s operational workflow before using Venice in production.

Features and capabilities planned to be available later this year as free firmware upgrades in Version 2 include:
• 25p in 6K full-frame mode
• False Color (moved from Version 3 to Version 2)

Venice has an established workflow with support from Sony’s RAW Viewer 3 and from third-party vendors, including FilmLight Baselight 5, Blackmagic DaVinci Resolve 14.3 and Assimilate Scratch 8.6, among others. Sony continues to work closely with all relevant third parties on workflows, including editing, grading, color management and dailies.

Another often requested feature is support for high frame rates, which Sony is working to implement and make available at a later date.

Venice features include:
• True 36x24mm full frame imaging based on the photography standard that goes back 100 years
• Built-in 8-step optical ND filter servo mechanism
• Dual Base ISO mode, with High Base ISO 2500
• New color science for appealing skin tones and graceful highlights – out of the box
• Aspect ratio freedom: full frame 3:2 (1.5:1), 4K 4:3 full-height anamorphic, spherical 17:9, 16:9
• Lens mount with 18mm flange depth opens up tremendous lens options (PL lens mount included)
• 15+ stops of exposure latitude
• User-interchangeable sensor that requires removal of just six screws
• 6K resolution (6048 x 4032) in full frame mode
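As a sanity check on the list above, the published 6K full-frame resolution matches the 3:2 aspect ratio exactly, and the other aspect ratios can be thought of as crops of that imager. The sketch below is illustrative only; the `max_crop` helper is hypothetical and does not reflect Venice’s actual windowing modes:

```python
from fractions import Fraction

SENSOR_W, SENSOR_H = 6048, 4032  # Venice 6K full-frame mode

# Full frame is exactly 3:2 (1.5:1)
assert Fraction(SENSOR_W, SENSOR_H) == Fraction(3, 2)

def max_crop(aspect_w, aspect_h, w=SENSOR_W, h=SENSOR_H):
    """Largest centered crop of the sensor with the requested aspect ratio."""
    if w * aspect_h >= h * aspect_w:   # sensor is wider than the target ratio
        return (h * aspect_w // aspect_h, h)
    return (w, w * aspect_h // aspect_w)

widescreen = max_crop(16, 9)   # full width, letterboxed height
anamorphic = max_crop(4, 3)    # full height, narrower width
```

A 16:9 extraction uses the sensor’s full width with reduced height, while a 4:3 anamorphic extraction uses the full height with reduced width.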

Review: HP’s ZBook Studio G4 mobile workstation

By Brady Betzel

It seems like each year around this time, I offer my thoughts on an HP mobile workstation and how it serves multimedia professionals. This time I am putting the HP ZBook Studio G4 through its paces. The ZBook Studio line of HP’s mobile workstations fits right in the middle between ease of mobility, durability and power. The ZBook 14u and 15u are the budget-series mobile workstations that run Intel i5/i7 processors with AMD FirePro graphics and top out at around $1,600. The ZBook 15 and 17 are the more powerful mobile workstations in the line, with the added ability to include Intel Xeon processors, ECC memory, higher-end Nvidia Quadro graphics cards and more. But in this review we will take the best of all models, jammed into the light and polished ZBook Studio G4.

The HP ZBook Studio G4 I was sent to test out had the following components:
– Windows 10 64 bit
– Intel Xeon E3-1535M v6 (7th gen) quad-core processor – 3.10GHz with 4.2GHz Turbo Boost
– 4K UHD DreamColor/15.6-inch IPS screen
– 32GB ECC (2x16GB)
– Nvidia Quadro M1200 (4GB)
– 512GB HP Z Turbo Drive PCIe (MLC)
– 92Whr fast charging battery
– Intel vPro WLAN
– Backlit keyboard
– Fingerprint reader

According to the info I was sent directly from HP, the retail price is $3,510 on hp.com (US webstore). I built a very similar workstation on http://store.hp.com and got the price to $3,301.65 before shipping and taxes, and $3,541.02 with taxes and free shipping. So, actually, pretty close.

So, besides the natural processor, memory and hard drive upgrades from previous generations, the ZBook Studio G4 has a few interesting updates, including a higher-capacity battery with fast charge and the HP Sure Start Gen3 technology. The new fast charge is similar to the feature in products like the GoPro Hero 5/6 cameras and Samsung Galaxy phones, which charge quicker than “normal.” The ZBook Studio, as well as the rest of the ZBook line, will charge 50% of your battery in around 30 minutes when in standby mode. Even while using the computer, I was able to charge the first 50% in around 30 minutes, a feature I love. After the initial 50% charge is complete, charging proceeds at a normal rate, which wasn’t half bad either, taking only a few hours to get to about 100%.
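Back-of-the-envelope, the fast-charge claim implies a substantial average charging power. A quick sketch, assuming the charge rate is roughly linear over the 30-minute window and ignoring charging losses:

```python
BATTERY_WH = 92          # battery capacity in watt-hours
FAST_CHARGE_HOURS = 0.5  # 50% charge in about 30 minutes

energy_added = BATTERY_WH * 0.5               # watt-hours delivered in the fast phase
avg_power = energy_added / FAST_CHARGE_HOURS  # average charging power in watts
```

That works out to roughly 92W flowing into the battery on average during the fast-charge phase.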

The battery I was sent was the larger of the two options and gave me an eight-hour day with decent usage. When pushed with an app like Resolve, I would say it lasted more like four hours. Nonetheless, it lasted a while and I was happy with the result. Keep in mind the batteries are not removable, but they do have a three-year warranty, just like the rest of the mobile workstation.

When HP first told me about its Sure Start Gen3, I thought maybe it was just a marketing gimmick, but then I experienced its power — and it’s amazing. Essentially, it is a hardware function, available only with 7th-generation Intel processors, that allows the BIOS to repair itself when it detects malware or corruption. While using the ZBook Studio G4, I was installing some software and had a hard crash (blue screen). I noticed when it restarted that the BIOS was running through the Sure Start protocol, and within minutes I was back up and running. It was reassuring, and it would really set my mind at ease if I were deciding between a workstation-level solution and a retail-store computing solution.

You might be asking yourself why you should buy an enterprise-level mobile workstation when you could buy a cheaper, almost-as-powerful laptop at Best Buy or on Amazon. Technically, what really sets workstation components apart is their ability to run 24/7, 365 days a year, without downtime. This is helped by Intel Xeon processors that support ECC (Error Correcting Code) memory, which means bits don’t get flipped the way they can with non-ECC memory. Or, for laymen like me: ECC memory prevents crashes by fixing errors itself before we see any repercussions.

Another workstation-level benefit is the environmental testing that HP runs the ZBooks through to certify its equipment as military grade, also known as MIL-STD-810G testing. Essentially, they run multiple extreme-condition tests, such as high and low temperatures, salt, fog and even high-vibration testing like gunfire. Check out a more in-depth description on Wikipedia. Finally, HP prides itself on its ISV (Independent Software Vendor) certifications. ISV certification means that HP spends a lot of time working with software vendors like Adobe, Avid, Autodesk and others to ensure compatibility between their products and HP’s hardware, so you don’t have to. They even regularly release certified drivers that help ensure compatibility.

In terms of warranty, HP gives you a three-year limited warranty. This includes on-site service within the Americas, and, as mentioned earlier, it covers the battery, which is a nice bonus. Much like other warranties, it covers problems arising from faulty manufacturing, but not intentional or accidental damage. Luckily for anyone who purchases a ZBook, these systems can take a beating. Physically, the computer weighs in at around 4.6lbs and is 18mm thin. It is machined aluminum that isn’t sharp, but it can start to dig into your wrists when typing for long periods. Around the exterior you get two Thunderbolt 3 ports, an HDMI port, three USB 3.1 ports (one on the left and two on the right), an Ethernet port and a Kensington lock slot. On the right side, you also get the power port — I would love for HP to design some sort of break-away cable like the old MagSafe cables on the MacBook Pros — and there is also a headphone/mic input.

DreamColor Display
Alright, so now I’ll go through some of the post-nerd specs that you might be looking for. Up first is the HP DreamColor display, which is a color-critical viewing solution. With a couple of clicks in the Windows toolbar on the lower right, you will find a colored flower — click on that and you can immediately choose the color space you want to view your work in: AdobeRGB, sRGB, BT.709, DCI-P3 or Native. You can even calibrate the display, or back up your own calibration for later use. While most colorists and editors use an external calibrated monitoring solution and don’t strictly rely on their viewing monitor as the color-critical source, the DreamColor display will get you close to a color-critical display without purchasing additional hardware.

In addition, DreamColor displays can play back true 24fps without frame rate conversion. One of my favorite parts of DreamColor is that if you use an external DreamColor monitor through Thunderbolt 3 (not using an SDI card), you can load your color profile onto the second or third monitor and in theory they should match. The ZBook Studio G4 seems to have been built as a perfect DIT (digital imaging technician) solution for color critical work in any weather-challenged or demanding environment without you having to worry about failure.

Speed & Testing
Now let’s talk about speed and how the system did with speed tests. Running a 24TB (six 4TB drives) G-Speed Shuttle XL with Thunderbolt 3 from G-Technology, configured as RAID-0, I was able to get write speeds of around 1450MB/s and read speeds of 960MB/s in the AJA System Test using a 4GB test file. For comparison, I ran the same test on the internal 512GB HP Z Turbo Drive, which had a write speed of 1310MB/s and a read speed of 1524MB/s. Of course, you need to keep in mind that the internal drive is a PCIe SSD, whereas the RAID is made up of 7200RPM drives. Finally, I ran the standard benchmarking app Cinebench R15, which comes from Maxon, maker of the 3D modeling app Cinema 4D. For those interested, the OpenGL test ran at 138.85fps with a reference match of 99.6%, CPU at 470cb and CPU (single core) at 177cb, with an MP ratio of 2.65x.
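To put those throughput numbers in context, here is how long moving the 4GB test file takes at the measured rates. A rough sketch, assuming a sustained rate and treating 4GB as 4096MB:

```python
TEST_FILE_MB = 4096  # the AJA System Test file size, taken as 4096MB

def seconds_to_move(size_mb, rate_mb_per_s):
    """Time in seconds to read or write size_mb at a sustained rate."""
    return size_mb / rate_mb_per_s

raid_write = seconds_to_move(TEST_FILE_MB, 1450)     # RAID-0 write
internal_read = seconds_to_move(TEST_FILE_MB, 1524)  # Z Turbo Drive read
```

At these rates the whole test file moves in under three seconds either way; the gap between the drives only becomes meaningful over sustained multi-gigabyte media transfers.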

I also wanted to run the ZBook through some practical, real-world tests of rendering and exporting speeds. I chose Blackmagic’s DaVinci Resolve 14.2 because it is widely used and easily accessible for many of today’s multimedia pros. For a non-scientific yet important benchmark, I needed to see how well the ZBook G4 played back R3D files (Red camera files), as well as QuickTime files with typical codecs you would find in a professional environment, such as ProRes and DNxHD. You can find a bunch of great sample R3D files on Red’s website. The R3D I chose was 16 seconds in length, shot on a Red Epic Dragon at 120fps and UHD resolution (3840×2160). To make sure nothing skewed the results, I cleared all optimized media (if there was any), deleted any render cache, unchecked “Use Optimized Media If Available” and unchecked “Performance Mode,” just in case that did any voodoo I wasn’t aware of.

First was a playback test, where I wanted to see at what decode quality I could play back in realtime, without dropping frames, while performing a slight color correction with a power window added. For this clip, I was able to get realtime playback in a 23.98/1080p timeline when the decode quality was set to Half Resolution Good. At Half Resolution Premium I was dropping one or two frames. At Full Resolution Premium, I was dropping five or six frames — playing back at around 17 or 18fps. Half Resolution Good is actually great playback quality for such a high-quality R3D, with all the headroom you get when coloring a raw camera file rather than a transcode. This is also when the fans inside the ZBook really kicked in. I then exported a ProRes 4444 version of the same R3D clip from RedCine-X Pro with the LUT info from the camera baked in. I played the clip back in Resolve with a light color treatment and one power window with no frames dropped. When playing back the ProRes 4444 file, the fans stayed at a low pitch.

The second test was a simple DNxHD 10-bit export from the raw R3D. I used the DNxHD 175x codec — it took about 29 seconds, a little less than double realtime for the 16-second clip. I then added spatial noise reduction on my first node using the following settings: Mode: Better, Radius: Medium, Spatial Threshold (luma/chroma locked): 25. I was able to play back the timeline at around 5fps, and the same DNxHD 175x export took about 1 minute 27 seconds, roughly 5.5 times realtime. Doing the same DNxHD 175x export test with the ProRes 4444 file, it took about 12 seconds without noise reduction, and about 1 minute and 16 seconds with it — roughly 4.75 times realtime. In both cases, when using noise reduction the fans kicked on.

Lastly, I wanted to see how Resolve would handle a simple one-minute, 1080p ProRes QuickTime in various tests. I don’t think it’s a big surprise, but it played back without dropping any frames with one node of color correction, one power window and a parallel node with a qualifier. When adding spatial noise reduction, I got bogged down to about 6fps. The same DNxHD 175x export took about 27 seconds, finishing in less than half the clip’s duration. With the same spatial noise reduction as above, it took about 4 minutes and 21 seconds, about 4.35 times realtime.
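The realtime multiples quoted throughout these tests are simply export time divided by clip duration. A small sketch reproducing the arithmetic (the function name is mine, for illustration):

```python
def realtime_ratio(export_seconds, clip_seconds):
    """How many times the clip's duration the export took (1.0 = exactly realtime)."""
    return export_seconds / clip_seconds

# 16-second R3D clip, DNxHD 175x export with spatial noise reduction (1m27s)
r3d_nr = realtime_ratio(87, 16)
# One-minute 1080p ProRes clip, plain export (27s) and with noise reduction (4m21s)
prores_plain = realtime_ratio(27, 60)
prores_nr = realtime_ratio(261, 60)
```

A ratio below 1.0, like the plain ProRes export’s 0.45, means the export finished faster than realtime.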

Summing Up
The HP ZBook Studio G4 is a lightweight, durable, enterprise-level mobile workstation that packs the punch of a color-critical 4K (UHD — 3840×2160) DreamColor display, powered by an Nvidia Quadro M1200 and brought together by an Intel Xeon processor that will easily power many color, editing or other multimedia jobs. With HP’s MIL-STD-810G certification, you have peace of mind that even with some bumps, bruises and extreme weather, your workstation will work. At under 5lbs and 18mm thin, with a battery that will charge to 50% in 30 minutes, you can bring professional apps like DaVinci Resolve, Adobe Premiere and Avid Media Composer anywhere and be working.

I was able to use the ZBook along with some of my Tangent Element color correction panels in a backpack and have an instant color-critical DIT solution, capable of color correction and transcoding, without the need for a huge cart. The structural design of the ZBook is an incredibly sturdy, machined-aluminum chassis that is lightweight enough to easily go anywhere quickly. The only criticisms: I would often miss the left click of the trackpad, leaving me in a right-click scenario; the Bang & Olufsen speakers sound a little tinny to me; and, finally, it doesn’t have a Touch Bar… just kidding.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

DigitalFilm Tree’s Ramy Katrib talks trends and keynoting BMD conference

By Randi Altman

Blackmagic, which makes tools for all parts of the production and post workflow, is holding its very first Blackmagic Design Conference and Expo, produced with FMC and NAB Show. This three-day event takes place on February 11-13 in Los Angeles. The event includes a paid conference featuring over 35 sessions, as well as a free expo on February 12, which includes special guests, speakers and production and post companies.

Ramy Katrib, founder and CEO of Hollywood-based post house and software development company DigitalFilm Tree, is the keynote speaker for the conference. FotoKem DI colorist Walter Volpatto and color scientist Joseph Slomka will be keynoting the free expo on the 12th.

We reached out to Katrib to find out what he’ll be focusing on in his keynote, as well as pick his brains about technology and trends.

Can you talk about the theme of your keynote?
Resolve has grown mightily over the past few years and is the foundation of DigitalFilm Tree’s post finishing efforts. I’ll discuss how Resolve is becoming an essential post tool. And with Resolve 14, folks who are coloring, editing, conforming and doing VFX and audio work are now collaborating on the same timeline, and that is a huge development for TV, film and every media-industry creative and technician.

Why was it important for you to keynote this event?
DaVinci was part of my life when I was a colorist 25 years ago, and today BMD is relevant to me while I run my own post company, DigitalFilm Tree. On a personal note, I’ve known Grant Petty since 1999 and work with many folks at BMD who develop Resolve and the hardware products we use, like I/O cards and Teranex converters. This relationship involves us sharing our post production pain points and workflow suggestions, while BMD has provided very relevant software and hardware solutions.

Can you give us a sample of something you might talk about?
I’m looking forward to providing an overview of how Resolve is now part of our color, VFX, editorial, conform and deliverables effort, while having artists provide micro demos on stage.

You alluded to the addition of collaboration in Resolve. How important is this for users?
Resolve 14’s new collaboration tools are a huge development for the post industry, specifically in this golden age of TV, where binge delivery of multiple episodes at the same time is commonplace. As the complexity of production and post increases, greater collaboration across multiple disciplines is a refreshing turn — it allows multiple artists and technicians to work in one timeline instead of 10 timelines and round-tripping across multiple applications.

Blackmagic has ramped up their NLE offerings with Resolve 14. Do you see more and more editors embracing this tool for editing?
Absolutely. It always takes a little time to ramp up in professional communities. It reminds me of when the editors on Scrubs used Final Cut Pro for the first time and that ushered FCP into the TV arena. We’re already working with scripted TV editors who are in the process of transitioning to Resolve. Also, DigitalFilm Tree’s editors are now using Resolve for creative editing.

What about the Fairlight audio offerings within? Will you guys take advantage of that in any way? Do you see others embracing it?
For simple audio work, like mapping audio tracks and creating multi-mixes for 5.1 and 7.1 delivery, we are taking advantage of Fairlight and the audio functionality within Resolve. We’re not an audio house, yet it’s great to have a tool like this for convenience and workflow efficiency.

What trends did you see in 2017 and where do you think things will land in 2018?
Last year was about the acceptance of cloud-based production and post process. This year is about the wider use of cloud-based production and post process. In short, what used to be file-based workflows will give way to cloud-based solutions and products.

postPerspective readers can get $50 off registration for the Blackmagic Design Conference & Expo by using code POST18. Click here to register.

Cinesite VFX supervisor Stephane Paris: 860 shots for The Commuter

By Randi Altman

The Commuter once again shows how badass Liam Neeson can be under very stressful circumstances. This time, Neeson plays a mild-mannered commuter named Michael who gets pushed too far by a seemingly benign but not-very-nice Vera Farmiga.

For this Jaume Collet-Serra-directed Lionsgate film, Cinesite’s London and Montreal locations combined to provide over 800 visual effects shots. The studio’s VFX supervisor, Stephane Paris, worked hand in hand with The Commuter’s overall VFX supervisor Steve Begg.

Stephane Paris

The visual effects shots vary, from CG commuters to Neeson’s outfits changing during his daily commute to fog and smog to the climactic huge train crash sequence. Cinesite’s work on the film included a little bit of everything. For more, we reached out to Paris…

How early did Cinesite get involved in The Commuter?
We were involved before principal photography began. I was then on set at Pinewood Studios, just outside London, for about six weeks alongside Steve. They had set up two stages. The first was a single train carriage adapted and dressed to look like multiple carriages — this was used to film all the main action onboard the train. The carriage was surrounded by bluescreen and shot on a hydraulic system to give realistic shake and movement. In one notable shot, the camera pulls back through the entire length of the train, through the carriage walls. A camera rig was set up on the roof and programmed to repeat the same pullback move through each iteration of the carriage — this was subsequently stitched together by the VFX team.

How did you work with the film’s VFX supervisor, Steve Begg?
Cinesite had worked with Steve previously on productions such as Spectre, Skyfall and Inkheart. Having created effects with him for the Bond films, he was confident that Cinesite could create the required high-quality invisible effects for the action-heavy sequences. We interacted with Steve mainly. The client’s approach was to concentrate on the action, performances and story during production, so we lit and filmed the bluescreens carefully, ensuring reflections were minimized and the bluescreens were secure in order to allow creative freedom to Jaume during filming. We were confident that by using this approach we would have what we needed for the visual effects at a later stage.

You guys were the main house on the film, providing a whopping 860 visual effects shots. What was your turnaround like? How did you work for the review and approval process?
Yes, Cinesite was the lead vendor, and in total we worked on The Commuter for about a year, beginning with principal photography in August 2016 and delivering in August 2017. Both our London and Montreal studios worked together on the film. We have worked together previously, notably on San Andreas and more recently on Independence Day: Resurgence, so I had experience of working across both locations. Most of the full CG heavy shots were completed in London, while the environments, some of the full CG shots and 2D backgrounds were completed in Montreal, which also completed the train station sequence that appears early in the film.

My time was split fairly evenly between both locations, so I would spend two to three weeks in London followed by the same amount of time in Montreal. Steve never needed to visit the Montreal studio, but he was very hands-on and involved throughout. He visited our London studio at least twice a week, where we used the RV system to review both the London and Montreal work.

Can you describe the types of shots you guys provided?
We delivered over 860 shots, from train carriage composites right through to entirely CG shots for the spectacular climactic train crash sequence. The crash required the construction of a two-kilometer-long environment asset, complete with station, forest, tracks and industrial detritus. Effects were key, with flying gravel, breaking and deforming tracks, exploding sleepers, fog, dust, smoke and fire, in addition to the damaged train carriages. Other shots required a realistic Neeson digi-double to perform stunts.

The teams also created shots near the film’s opening that demonstrate the repetition of Michael’s daily commute. In a poignant shot at Grand Central Station multiple iterations of Michael’s journey are shown simultaneously, with the crowds gradually accelerating around him while his pace remains measured. His outfit changes, and the mood lighting changes to show the passing of the seasons around him.

The shot was achieved with a combination of multiple motion control passes, creation of the iconic station environment using photogrammetry and, ultimately, by creating the crowd of fellow commuters in CG for the latter part of the shot (a seamless transition was required between the live-action passes and the CG people).

Did you do previs? If so, what tools did you use?
No. London’s Nvizible handled all the initial previs for the train crash. Steve Begg blocked everything out and then sent it to Jaume for feedback initially, but the final train crash layout was done by our team with Jaume at Cinesite.

What did you use tool-wise for the VFX?
We mainly used Houdini’s RBD, particle and fluid simulation toolsets, with some Autodesk Maya for falling pieces of train. The simulated destruction of the train was also created in Houdini, with some in-house setups.

What was the most challenging scene or scenes you worked on? 
The challenge was, strangely enough, more about finding proper references that would fit our action-movie requirements. Footage of derailing trains is difficult to find, and when you do find it, you quickly notice that train carriages are not designed to tear and break the way you would like them to in an action movie. Naturally, they are constructed to be safe, with lots of energy-absorbing compartments, and equipped with auto-triggering safety mechanisms.

Putting reality aside, we devised a visually exciting and dangerous movie train crash for Jaume, complete with lots of metal crumbling, shattering windows and multiple large-scale impact explosions.

As a result, the crew had to ensure they were maintaining the destruction continuity across the sequence of shots as the train progressively derails and crashes. A high number of re-simulations were applied to the train and environment destruction whenever there was a change to one of these in a shot earlier in the sequence. Devising efficient workflows using in-house tools to streamline this where possible was key in order to deliver a large number of effects-heavy destruction shots whilst maintaining accurate continuity and remaining responsive to the clients’ notes during the show.

The A-List: The Big Sick director Michael Showalter

By Iain Blair

If life is stranger than fiction, then the acclaimed Oscar-nominated film The Big Sick is Exhibit A. Based on the unlikely real-life courtship between Pakistani comedian/writer Kumail Nanjiani and writer/producer Emily V. Gordon, it tells the story of Kumail (playing a version of himself), who connects with grad student Emily (Zoe Kazan) after one of his standup sets. However, what they thought would be just a one-night stand blossoms into the real thing, which complicates the life that is expected of Kumail by his traditional Muslim parents.

Michael Showalter on set.

When Emily is beset with a mystery illness, and then placed in a medically induced coma, it forces Kumail to navigate the medical crisis with her parents, Beth and Terry (Holly Hunter and Ray Romano), whom he’s never met, while dealing with the emotional tug-of-war between his family and his heart.

The Big Sick is a crowd-pleasing rom-com, written by Gordon and Nanjiani, and produced by Judd Apatow (Trainwreck, This is 40) and Barry Mendel (Trainwreck, The Royal Tenenbaums). But it also deals with drama, racism and the clash of cultures. It was directed by Michael Showalter, who co-wrote and directed the SXSW Audience Award-winning film Hello, My Name Is Doris, starring Sally Field. He’s a founding member of the comedy groups The State and Stella, and his other film credits include The Baxter, Wet Hot American Summer and They Came Together. He has also co-created numerous television projects, including Wet Hot American Summer: First Day of Camp (Netflix) and Search Party (TBS).

We recently spoke with Showalter about making the film, which has been generating awards buzz (it won AFI’s Movie of the Year award), including an Academy Award nomination for Best Original Screenplay.

Is it true you actually gave Nanjiani his first big TV job, and what did you think when you first read this?
Yes. I’ve known Kumail a long time. I met him in New York in the comedy scene when he first arrived. I love his comedy and sensibility, and I also love him personally. He’s a great guy and we’ve worked together a lot over the years.

We hired him as a staff writer and actor for a Comedy Central series, Michael and Michael Have Issues, and then I cast him in a supporting role in Hello, My Name Is Doris. Then he sent me this script without saying much about it. I didn’t know it was based on their lives and that all this had happened — I just loved it and everything about it.

Kumail Nanjiani as “Kumail” and Zoe Kazan as “Emily” in THE BIG SICK. Photo by Sarah Shatz.

It’s definitely not your usual rom-com.
I kind of knew what sort of film they wanted it to be — more than just the genre, but the feel of it. I knew the tone they were going for, and that I could do that. So I begged them to hire me, and then I met with Judd and Barry and we spent eight months rewriting it — Kumail, Emily, Judd, Barry and me, and then I got hired and off we went.

The structure is very different from a normal rom-com. How challenging was that, and what sort of film did you set out to make?
You’re right. Usually the second act is where the characters fall in love, then they break up, and then they get back together in the third act, but in this film all of that happens right in the first act. Then the love interest isn’t even there for the entire second act — which is pretty challenging — and the film gets a lot darker in the second half. So we had to figure out how to keep it moving forward, and I wanted to make a film that’s very funny, first and foremost — a comedy.

But it’s a comedy that walks the line between comedy and drama, even tragedy, and I wanted to give full weight to both elements and not let it get too sentimental. I love theater and some of my favorite plays — like Angels in America — start off as laugh-out-loud comedies and then get really serious, and I love the way they allow those opposites to co-exist.

How involved was Judd Apatow in developing the film?
Judd was very involved in all aspects — tightening up the screenplay, casting and then editing. He’s so experienced, and a great collaborator.

How was the shoot?
We shot digitally, in New York. It was just 25 days, so pretty tight, but it went great thanks to a great line producer and crew. The biggest issue was that the film is set in cold weather and we shot in a heat wave.

Do you like the post process?
I love it. It’s so creative and, of course, it’s where you actually make the movie.

Tell us about working with editor Robert Nassau, who cut Hello, My Name Is Doris for you and Wanderlust for Judd Apatow. What were the main editing challenges?
As a TV showrunner and film director, my preferred way of working in post is to empower editors, and I rely on them the same way I do a production designer or DP. I’m not big on micromanaging, so I like to give the footage to my editor and then see what they do with it. And I go into production with a very clear game plan; there’s not a lot of figuring it all out on the day. Then I’m very interested in the editor’s interpretation of the footage, and if it’s working, I give notes and we go along like that. I’m not the sort of director who’s in the room all the time, looking over the editor’s shoulder. I’m much more laissez-faire.

Where did you edit and post this?
Rob has his own editing suite at home in New York, so he did the assembly and director’s cut there while I was in LA. Then he came out to LA for the producer’s cut, and on any given day either me or Kumail, Judd and Barry — or all of us — would be there too, going over specific scenes and beats. But Judd had final cut, and once I’d done my cut, all the post became much more of a group endeavor.

How important are sound and music to you?
They’re both crucial elements and we did it all in LA, working with Judd’s sound team, which does most of his projects. We wanted the sound to be very intimate and very clean, so you feel like you’re with the characters all the time and you hear everything they’re saying in these small, intimate places, as opposed to having a rougher, grittier sound design. Then composer Mike Andrews, who’s scored a lot of projects for Judd, like Bridesmaids, came on board and did a score that really mirrored the emotions of the characters, without over-scoring it.

Who was the colorist and where did you do the DI?
We did the digital intermediate and dailies at Technicolor Postworks NY, and Alex Bickel was the colorist. I’m very involved in all that. The color is very important, and we wanted a very warm, authentic look, as opposed to going more muted and drained-out. We experimented a bit and Alex did a great job.

What’s next?
I’ve got a few things I’m working on but I can’t talk about them yet!


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Flight School concept artist Ruby Wang

NAME: Ruby Wang

COMPANY: Dallas-based Flight School Studio

CAN YOU DESCRIBE YOUR COMPANY?
Flight School is a place where everybody gets together and comes up with an awesome idea and then works on that idea to create something that is wonderful. We use AR, VR or other tech to tell stories. We do that with our own ideas and we also work with clients.

WHAT’S YOUR JOB TITLE?
Concept Artist.

WHAT DOES THAT ENTAIL?
As a concept artist, I take the idea of the story and visualize it. We create characters, backgrounds and props to support the story. I work at the very beginning of a project. Sometimes when working on a character, we’ll have to explain the character to the 3D modelers and texture artists so they understand how it functions: the skeleton, textures and clothes. That’s so they can create it in 3D for the rest of the team to use.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I think people expect concept artists to make pretty pictures. As concept artists, we’re not necessarily fine-art illustrators. All the pretty pictures we create have to be functional and explain something to other people. We create a lot of not-so-pretty pictures — it’s all about ideas, not about how pretty the picture is. As a concept artist you can’t be afraid to create crappy drawings. As long as it conveys the idea, that is a good piece of concept work.

WHAT TOOLS DO YOU USE?
Mainly Adobe Photoshop. Sometimes we go back to hand drawings — pencil and paper. It just feels good.

WHAT’S YOUR FAVORITE PART OF THE JOB?
My favorite part of the job is that you can actually create and foresee the final look of the project. You create it and then someone actually makes it happen! It is the most satisfying thing to see your work go through everybody’s hands. We all create something together based on my designs. That is really awesome.

WHAT’S YOUR LEAST FAVORITE?
We have to do a lot of technical and specific drawings to explain stuff and make things functional. That part is so mechanical sometimes. You really have to dig in and make sure everything works for the modeler. That part can sometimes be really boring.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
In the early morning. Super early morning, when the sunlight hasn’t quite come out yet. I like to wake up early and go jogging or take a walk. Then I come into the studio and start my day. If you wake up super early in the morning, you have a lot of time to do other stuff and it’s not even noon yet!

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I think I would be a children’s book illustrator, telling my own stories in different ways. I am very curious, and I like trying to explain what I am learning to other people or even children. But it is hard to convey what’s in my mind, so I use stories to tell other people my ideas and what I’ve learned. To convey a heavy or dark idea, you have to use art to help other people more easily understand.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
Pretty early on. When I was little I would tell stories to my sister through my art. I would sit in a room, drawing my story paper by paper, and I would force her to listen to my stories every weekend. I always loved telling stories. When I grew up, I tried to find ways to tell stories every day. So I studied 3D animation and found out animation is fun, but the visual development part is what I really enjoyed. You can support the story by creating the characters and environments. I could be doing other jobs, but I always want to be telling stories.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I am working on Island Time, a VR game where you crash into an island. It is a really tiny island and you have to survive by catching fish and eating coconuts. You also get to make friends with Carl, a crab who lives there. I had a really great time designing Carl. Designing stuff for Island Time was really fun because the game is very goofy and cartoony. It’s even more fun to see your work created in 3D and in-engine because you can interact with it.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It is really hard to choose. I write stories in my spare time, but I rarely show them to other people. The Flight School team and I are working on a project right now that is based around my original story. I am really excited about it because I get to do more than just concept art. It is the first time I’ve shared my stories with a larger team, and it is exciting to craft the world and tell a story from my own perspective. I wish I could talk more about it!

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
One thing is Google Maps – I am super bad at directions. I need Google Maps to survive.

My camera. Sometimes I’ll see some lighting that is awesome in nature, or a texture I’ve never seen before, and want to study it but can’t memorize it. So I need a camera to capture the moment so I can really study it.

My iPhone — I have an iPhone 6, and I use it to connect with other people. I used to live in San Francisco and I had a lot of friends. Now that I’m in Texas, I get lonely and my friends take turns calling me each night to keep me company.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Yes, but not only music. I also listen to audiobooks and podcasts. For audiobooks, it is usually a book I’ve read in English already so I don’t have to focus too much on the narrator talking. The Harry Potter audiobooks are awesome. Podcasts are whatever my other colleagues recommend to me. Dirty John and S-Town are two I have really enjoyed.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
My work is creating art and drawings, but the way I de-stress is by creating art and drawings… for my own pleasure. I create stuff for myself. You don’t have to make it functional – it’s just pure fun. If I don’t draw, I feel weird. It’s like when you are addicted to something and you have to do it every day or you feel like something is wrong with you!

Panavision intros Millennium DXL2 camera with Red Monstro 8K sensor

Panavision is at the BSC Expo 2018 showing its new Millennium DXL2 8K camera. The large-format camera is the heart of a complete imaging ecosystem designed from filmmakers’ perspectives, seamlessly incorporating Panavision’s unmatched optics and camera architecture, the Red Monstro 8K VV sensor and Light Iron color science (LiColor2). The DXL2 builds on the success of the Millennium DXL and benefits from Panavision’s partnership with cinematographers, whose real-world experience and input are manifested in the DXL2’s new offerings.

The Red Monstro 8K VV sensor in the DXL2 offers 16-plus stops of dynamic range with improvements in image quality and shadow detail, a native ISO setting of 1600 and ProRes 4K up to 60 fps. Images are presented on the camera in log format using Light Iron color science. An integrated PX-Pro color spectrum filter custom-made for the DXL offers a significant increase in color separation and dramatically higher color precision to the image. Built-in Preston MDR, 24v power and expanded direct-to-edit features are also standard equipment on the DXL2. An anamorphic flare attachment (AFA) offers a convenient, controllable method of introducing flare with spherical lenses.

New to the DXL2, LiColor2 streamlines the 8K pipeline, smoothly handling the workflow and offering convenient and quick access to high-quality RAW images, accommodating direct-to-edit without delays.

Since its introduction, the DXL has been used on over 20 feature films, and countless television shows, commercials and music videos. Oscar-nominee John Schwartzman, ASC, photographed two features on the DXL and is among those who have tested the DXL2, providing input that has guided the design. He’s currently planning to shoot his next feature with it.

DXL2 cameras are available to rent exclusively from Panavision.

Z Cam, Assimilate reduce price of S1 VR camera/Scratch VR bundle

The Z Cam S1 VR camera/WonderStitch/Assimilate Scratch VR Z bundle, an integrated VR production workflow offering, is now $3,999, down from $4,999.

The Z Cam S1/Scratch VR Z bundle provides acquisition via Z Cam’s S1 pro VR camera, stitching via the WonderStitch software and a streamlined VR post workflow via Assimilate’s realtime Scratch VR Z tools.

Here are some details:
If streaming live 360 from the Z Cam S1 through Scratch VR Z, users can take advantage of realtime features such as inserting/compositing graphics/text overlays, including animations, and keying for elements like greenscreen — all streaming live to Facebook Live 360.

Scratch VR Z can be used for live camera preview prior to shooting with the S1. During the shoot, Scratch VR Z is used for dailies and data management, including metadata; the camera connects directly to the PC via a high-speed Ethernet port. Stitching of the imagery is done in Z Cam’s WonderStitch, now integrated into Scratch VR Z. From there, the workflow moves through traditional editing, color grading, compositing and multichannel audio (from the S1, or by adding external ambisonic sound), then finishing and publishing to all final online or stand-alone 360 platforms.

The Z Cam S1/Scratch VR Z bundle is available now.

Neil Anderson upped to colorist at Lucky Post, talks inspiration

Neil Anderson has been promoted to colorist at Dallas’ Lucky Post after joining the company in 2013 right out of film school. Anderson’s projects include national brands such as Canada Dry, Costa, TGI Friday’s, The Salvation Army and YETI. His latest feature, Augustine Frizzell’s comedy Never Goin’ Back, screened at the 2018 Sundance Film Festival. He works on Blackmagic Resolve 14.

Anderson’s interest in cameras and color science inspired his career as a colorist, but he says his inspiration changes all the time, depending on where his mind is at. “Sometimes I’ll see a commercial on TV and think, ‘Wow. There was great care put into that piece, I wonder how they did that?’ Then I’ll go back and rewatch it over and over again trying to pick it apart and see what I can glean. Or if I’m developing a specific workflow/look and I’m struggling to get exactly what I’m after, I’ll find interesting frames from films that pop into my head for guidance.”

In terms of colorists who inspire him, Anderson points to Peter Doyle (who most recently colored Darkest Hour). “He’s incredibly technical, and he exploits his thorough knowledge of color science to guide films through a color pipeline in an almost algorithmic fashion. I’m in awe of his expertise and, in a way, use him as a model of how I want to approach projects.

“I also admire Steven Scott for maybe the opposite reason. While technical like Peter, to me he approaches projects with a painter’s eye first. I’ve heard him say the best inspiration is to simply pay attention to the world around us. His work and approach remind me to branch out artistically just as much as I try technically.”

When he thinks about cinematographers, Roger Deakins comes to mind. “He’s a DP that really captures almost the entire look of the film in-camera, and the color grading is supposedly very simple and minor in the end. This is because he and his colorist work hand in hand before the shoot, developing a look they’ll see and use on set,” explains Anderson. “This workflow is a critical tool for modern colorists, and Roger is a reminder of the importance of having a good relationship with your DP.”

Tim Nagle, a Lucky Post finishing artist, describes Anderson as a “quiet and ardent observer of life’s design, from light and shadow on a city street to bold color blocks in a Wong Kar-wai film. His attention to detail and process are impeccable.”

“Color is like magic to most people; the process feels like happenstance and you don’t realize how it’s supporting the narrative until it’s not,” concludes Anderson. “I love the challenge of each project and mining through color theory to achieve the best results for our clients.”

Quick Chat: ArsenalCreative’s new VFX supervisor Mike Wynd

VFX supervisor Mike Wynd has joined ArsenalCreative from MPC, where he spent eight years in a similar role. Over the years, Wynd has worked on many high-profile projects for directors such as Rupert Sanders, Noam Murro and Adam Berg. He has also won a number of industry awards, including a Silver Clio and a Gold British Arrow, as well as a VES Award nomination.

Wynd started his career in Melbourne, Australia, working for Computer Pictures before landing at Images Post in Auckland, New Zealand. Eight years later, he headed back to Australia to serve as head of 3D at Garner MacLennan Design, where he worked on many high-end animations and effects, including the first Lord of the Rings movie. After that studio was bought out, Wynd joined Digital Pictures. Next, he assisted in establishing a new 3D/design team at FSM. After that he relocated to Los Angeles, where he worked for Moving Pixels. Later, he took on the role of VFX supervisor for MPC.

We reached out to Wynd to ask him a few questions about being a VFX supervisor:

What drew you to VFX supervision?
The thing I enjoy most about VFX supervision is the problem solving: from how best to shoot what we require, to seamlessly integrating our effects, through to the actual approach and tools we’ll employ in post production. We’ve always got a finite amount of time and money with which to produce our work, and a little bit of alternative thinking can go a long way toward achieving higher quality and more efficient results.

How early do you like being brought onto a project?
I’d prefer to be brought in on a project from day one. Especially on a complex VFX project, being involved alongside production means that we, as a team, can troubleshoot many aspects of the job that, in the long run, will mean savings in cost and time as well as higher-quality results. It also gives time for relationships to form between VFX and production, so that on the shoot the VFX team is seen as an asset rather than a hindrance.

Do you go on set? Why is that so important?
I do go on set… a lot! I have been very lucky over the years to travel to some incredible locations all over the world. It’s so important because this is where the foundations are laid for a successful job. Being able to see how and why footage is shot the way it is goes a long way toward finding solutions to post issues.

Actually seeing the environment of a scene can offer clues that may help in significantly reducing any issues that may arise once the footage is back in the studio. And, of course, there’s the nuts and bolts of capturing set information, along with color and lighting references critical to the project. And probably the most important reason to be on-set is to act as the conduit connecting production and post. The two parties often act so separately from one another, yet each is only doing half the job.

Have you worked on anything at ArsenalCreative yet?
It’s early days for me at ArsenalCreative, but thus far I’ve worked on a Chevy presentation for the motor shows and a series of pod shots for Lexus.

If you had one piece of advice for someone about to embark on a project that involves VFX, what would it be?
Ha! Get VFX involved from day one!