
The A-List: Director Marc Webb on The Only Living Boy in New York

By Iain Blair

Marc Webb has directed movies both big and small. He made his feature film debut in 2009 with the low-budget indie rom-com (500) Days of Summer, which was nominated for two Golden Globes. He then went on to helm the two recent Amazing Spider-Man blockbusters, the fourth and fifth films in the multi-billion-dollar-grossing franchise.

Webb isn’t just about the big screen. He directed and executive produced the TV series Limitless for CBS, based on the film starring Bradley Cooper, and is currently an executive producer and director of the CW’s Golden Globe-winning series Crazy Ex-Girlfriend.

Marc Webb

Now Webb, whose last film was the drama Gifted, released earlier this year, has again returned to his indie roots with the film The Only Living Boy in New York, starring Jeff Bridges, Kate Beckinsale, Pierce Brosnan, Cynthia Nixon, Callum Turner and Kiersey Clemons.

Set in New York City, the sharp and witty coming-of-age story focuses on a privileged young man, Thomas Webb (Turner) — the son of a publisher and his artistic wife — who has just graduated from college. After moving from his parents’ Upper West Side apartment to the Lower East Side, he befriends his neighbor W.F. (Bridges), an alcoholic writer who dispenses worldly wisdom alongside healthy shots of whiskey.

Thomas’ world begins to shift when he discovers that his long-married father (Brosnan) is having an affair with a seductive younger woman (Beckinsale). Determined to break up the relationship, Thomas ends up sleeping with his father’s mistress, launching a chain of events that will change everything he thinks he knows about himself and his family.

Collaborating with Webb from behind the scenes were director of photography Stuart Dryburgh (Gifted, The Secret Life of Walter Mitty, Alice Through the Looking Glass) and editor Tim Streeto (The Squid and the Whale, Greenberg, Vinyl).

I recently talked with Webb about making the film, and about whether there is another superhero movie in his future.

What was the appeal of making another small film on the heels of Gifted?
They were both born out of a similar instinct, an impulse to simplify after doing two blockbusters. I had them lined up after Spider-Man and the timing worked out.


What sort of themes were you interested in exploring through this?
I think of it as a fable, with a very romantic image of New York as the backdrop, and on some levels it’s an examination of honesty or coming clean. I think people often cover up a lot in trying to protect others, and that’s important in life, where you have various degrees of truth-telling. But at some point you have to come clean, and that can be very hard. So it’s about that journey for Thomas, and regardless of the complex nature of his desires, he tries to be honest with himself and those close to him.

Can you talk about the look of New York in this film and working with your DP, who also shot your last film?
It was the same DP, but we had the opposite approach and philosophy on this. Gifted was very naturalistic with a diverse color palette and lots of hand-held stuff, and we mostly kept the camera at eye level, as if it was a documentary. This one has more panache and “style” and more artifice. We restrained the color palette since New York has a lot of neutral tones and people wear a lot of black, and I wanted to create a sort of tribute to the classic New York films I love. So we used a lot of blacks and grays, and almost no primary colors, to create an austere look. I wanted to push that but without becoming too stylized; that way when you do see a splash of red or some bright color, it has more impact and it becomes meaningful and significant. We also tried to do a lot of fun shots, like high-angle stuff that gives you this objective POV of the city, making it a bit more dramatic.

Why did you shoot 35mm rather than digital?
I’ve always loved film and shooting in film, and it also suited this story as it’s a classic medium. And when you’re projecting digital, sometimes there’s an aliasing in the highlights that bothers me. It can be corrected, but aesthetically I just prefer film. And everyone respects film on set. The actors know you’re not just going to redo takes indefinitely. They feel a little pressure about the money.

Doesn’t that affect the post workflow nowadays?
Yes, it does, as most post people are now used to working in a purely digital format, but I think shooting analog still works better for a smaller film like this, and I’ve had pretty good experiences with film and the labs. There are more labs now than there were two years ago, and there are still a lot of films being shot on film. TV is almost completely digital now, with the odd exception of Breaking Bad. So the post workflow for film is still very accessible.

Where did you do the post?
We did the editing at Harbor Picture Company, and all the color correction at Company 3 with Stefan Sonnenfeld, who uses Blackmagic Resolve. C5’s Ron Bochar was the supervising sound editor and did a lot of it at Harbor. (For the mix at Harbor he employed D-Command using Avid Pro Tools as a mix engine.)

Do you like the post process?
I really love post… going through all the raw footage and then gradually molding it and shaping it. And because of my music video background I love working on all the sound and music in particular.  I started off as an editor, and my very first job in the business was re-cutting music videos for labels and doing documentaries and EPKs. Then I directed a bunch of music videos and shorts, so it’s a process that I’m very familiar with and understand the power of. I feel very much at home in an edit bay, and I edit the movie in my head as I shoot.

You edited with Tim Streeto. Tell us how it worked.
I loved his work on The Squid and the Whale, and I was anxious to work with him. We had a cool relationship. He wasn’t on the set, and he began assembling as I shot, as we had a fairly fast post schedule. I knew what I wanted, so it wasn’t particularly dramatic. We made some changes as we went, but it was pretty straightforward. We had our cut in 10 weeks, and the whole post was just three or four months.

What were the main challenges of editing this?
Tracking the internal life of the character and making sure the tone felt playful. We tried several different openings to the film before we settled on the voiceover that had this organic raison d’être, and that all evolved in the edit.

The Spider-Man films obviously had a huge number of very complex visual effects shots. Did you do many on this film?
Very few. Phosphene in New York did them. We had the opening titles and then we did some morphing of actors from time to time in order to speed things up. (Says Phosphene CEO/EP Vivian Connolly, “We designed and animated the graphic opening sequence of the film — using Adobe Photoshop and After Effects — which was narrated by Jeff Bridges. We commissioned original illustrations by Tim Hamilton, and animated them to help tell the visual story of the opening narration of the film.”)

It has a great jazzy soundtrack. Can you talk about the importance of music and sound?
The score had to mingle with all the familiar sounds of the concrete jungle, and we used a bit of reverb on some of the sounds to give it more of a mystical quality. I really love the score by Rob Simonsen, and my favorite bit is the wedding toast sequence. We’d temped in waltzes, but it never quite worked. Then Rob came up with this tango, and it all just clicked.

I also used some Dave Brubeck, some Charles Mingus and some Moondog — he was this well-known blind New York street musician I’ve been listening to a lot lately — and together it all evoked the mood I wanted. Music is so deeply related to how I started off making movies, so music immediately helps me understand a scene and how to tell it the best way, and it’s a lot of fun for me.

How about the DI? What look did you go for?
It was all about getting a very cool look and palette. We’d sometimes dial up a bit of red in a background, but we steered away from primary colors and kept it a bit darker than most of my films. Most of the feel comes from the costumes and sets and locations, and Stefan did a great job, and he’s so fast.

What’s next? Another huge superhero film?
I’m sure I’ll do another at some point, but I’ve really enjoyed these last two films. I had a ball hanging out with the actors. Smaller movies are not such a huge risk, and you have more fun and can be more experimental.

I just did a TV pilot, Instinct, for CBS, which was a really fun murder mystery, and I’ll probably do more TV next.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Epic Games, Nvidia team on enterprise solutions for VR app developers

Epic Games and Nvidia have teamed up to offer enterprise-grade solutions to help app developers create more immersive VR experiences.

To help ease enterprise VR adoption, Epic has integrated Nvidia Quadro professional GPUs into the test suite for Unreal Engine 4, the company’s realtime toolset for creating applications across PC, console, mobile, VR and AR platforms. This ensures Nvidia technologies integrate seamlessly into developers’ workflows, delivering results for everything from CAVEs and multi-projection systems through to enterprise VR and AR solutions.

“With our expanding focus on industries outside of games, we’ve aligned ourselves ever more closely with Nvidia to offer an enterprise-grade experience,” explains Marc Petit, GM of the Unreal Engine Enterprise business. “Nvidia Quadro professional GPUs empower artists, designers and content creators who need to work unencumbered with the largest 3D models and datasets, tackle complex visualization challenges and deliver highly immersive VR experiences.”

The Human Race

One project that has driven this effort is Epic’s collaboration with GM and The Mill on The Human Race, a realtime short film and mixed reality experience featuring a configurable Chevrolet Camaro ZL1, which was built using Nvidia Quadro pro graphics.

Says Bob Pette, VP of professional visualization at Nvidia, “Unreal, from version 4.16, is the first realtime toolset to meet Nvidia Quadro partner standards. Our combined solution provides leaders in these markets the reliability and performance they require for the optimum VR experience.”

Aubrey Woodiwiss joins Carbon LA as lead colorist

Full-service creative studio Carbon has added Aubrey Woodiwiss as senior colorist/director of color grading to its LA roster. He comes to Carbon with a portfolio that includes spots for Dulux, NBA 2K17, Coors and Honda, and music videos for Beyoncé’s Formation, Jay-Z’s On to the Next One and the Calvin Harris/Rihanna song This Is What You Came For.

“I’m always prepared to bend and shape myself around the requirements of the project at hand, but always with a point of view,” says Woodiwiss, who honed his craft at The Mill and Electric Theatre Collective during his career.

“I am fortunate to have been able to collate various experiences within life and work, and have been able to reapply them back into the work I do. I vary my approach and style as required, and never bring a labored or autonomous look to anything. Communication is key, and a large part of what I do as well,” he adds.

Woodiwiss’ focus on creativity began during his adolescence, when he experimented with editing films on VHS and later directed and cut homemade music videos. Woodiwiss started his pro career in the early 2000s at Framestore, first as a runner and then as a digital lab operator, helping to pioneer film scanning and digital film tech on Harry Potter, Love Actually, Bridget Jones’s Diary and Troy.

While he’s traversed creative mediums — film, commercials and music videos, on over 3,000 projects — he maintains a consistent mindset when it comes to each one. “I approach them similarly in that I try to realize the vision set by the creators of the project,” says Woodiwiss, who was co-creative director of the immersive mixed media art exhibition and initiative mentl, with Pulse Films director Ben Newman and producer Craig Newman (Radiohead, Nick Cave).

Carbon’s addition of the FilmLight Baselight color system and Woodiwiss as senior colorist to its established VFX/design services hammers home the studio’s move toward a complete post solution in Los Angeles. Plans are in the works to offer remote grading capabilities from any of the Carbon offices in NY, Chicago and Los Angeles.

Behind the Title: Milk VFX supervisor Jean-Claude Deguara

NAME: Jean-Claude Deguara

COMPANY: Milk Visual Effects (@milkvfx)

CAN YOU DESCRIBE YOUR COMPANY?
Milk is an independent visual effects company. We create complex sequences for high-end television and feature films, and we have studios in London and Cardiff, Wales. We launched four years ago and we pride ourselves on our friendly working culture and ability to nurture talent.

WHAT’S YOUR JOB TITLE?
VFX Supervisor

WHAT DOES THAT ENTAIL?
Overseeing the VFX for feature films, television and digital content — from the initial concept development right through to delivery. This includes on-set supervision and supervising teams of artists.

HOW DID YOU TRANSITION TO VFX?
I started out as a runner at London post house Soho 601, and got my first VFX role at The Hive. Extinct was my very first animation job — a Channel 4 dinosaur program. Then I moved to Mill Film to work on Harry Potter.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
Over 20 years.

HOW HAS THE VFX INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING?
In London, the industry has grown from what was a small cottage industry in the late 1990s, pre-Harry Potter. More creative freedom has come with the massive technology advances.

When I started out, TV was all done on Digi Beta, but now, with the quality of cameras, television VFX has caught up with film.

Dinosaurs in the Wild

Being able to render huge amounts of data in the cloud as we did recently on our special venue project Dinosaurs in the Wild means that smaller companies can compete better.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
Ray Harryhausen’s films inspired me as a child. We’d watch them at Christmas in awe!

I was also massively inspired by Spitting Image. I applied for a job only to find they were about to close down.

DID YOU GO TO FILM SCHOOL?
No, I went to Weston-super-Mare College of Art (Bristol University) and studied for an art and design diploma. Then I went straight into the film/TV industry as a runner.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The creative planning and building of shots and collaborating with all the other departments to try to problem solve in order to tell the best possible story visually, within the budget.

WHAT’S YOUR LEAST FAVORITE?
Answering emails, and the traveling.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
If I didn’t do this, I’d like to be directing.

Sherlock

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Sherlock (BBC/Hartswood), Dinosaurs in the Wild, Jonathan Strange & Mr Norrell (BBC) and Beowulf (ITV). I am currently VFX supervisor on Good Omens (BBC/Amazon).

WHAT IS THE PROJECT/S THAT YOU ARE MOST PROUD OF?
It’s really hard to choose, but the problem solving on Sherlock has been very satisfying. We’ve created invisible effects across three series.

WHAT TOOLS DO YOU USE DAY TO DAY?
I previz in Autodesk Maya.

WHERE DO YOU FIND INSPIRATION NOW?
Scripts. I get creative “triggers” when I’m reading scripts or discussing a new scene or idea, which, for me, pushes it to the next level. I also get a lot of inspiration working with my fellow artists at Milk. They’re a talented bunch.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’d go to the gym, but the pub tends to get in the way!

Red intros Monstro 8K VV, a full-frame sensor

Red Digital Cinema has a new cinematic full-frame sensor for its Weapon cameras called the Monstro 8K VV. Monstro evolves beyond the Dragon 8K VV sensor with improvements in image quality including dynamic range and shadow detail.

This newest camera and sensor combination, Weapon 8K VV, offers full-frame lens coverage, captures 8K full-format motion at up to 60fps, produces ultra-detailed 35.4 megapixel stills and delivers incredibly fast data speeds — up to 300MB/s. And like all of Red’s DSMC2 cameras, Weapon offers simultaneous recording of RedCode RAW and Apple ProRes or Avid DNxHD/HR. It also adheres to the company’s Obsolescence Obsolete — its operating principle that allows current Red owners to upgrade their technology as innovations are unveiled and move between camera systems without having to purchase all new gear.

The new Weapon is priced at $79,500 (for the camera brain), with upgrades for carbon fiber Weapon customers available for $29,500. Monstro 8K VV will replace the Dragon 8K VV in Red’s line-up, and customers who had previously placed an order for a Dragon 8K VV sensor are being offered the new sensor now. New orders will start being fulfilled in early 2018.

Red has also introduced a service offering for all carbon fiber Weapon owners called Red Armor-W. Red Armor-W offers enhanced and extended protection beyond Red Armor, and also includes one sensor swap each year.

According to Red president Jarred Land, “We put ourselves in the shoes of our customers to see how we can improve the way we support them. Red Armor-W builds upon the foundation of our original extended warranty program and includes giving customers the ability to move between sensors based upon their shooting needs.”

Additionally, Red has made its enhanced image processing pipeline (IPP2) available in-camera with the company’s latest firmware release (V.7.0) for all cameras with Helium and Monstro sensors. IPP2 offers a completely overhauled workflow experience, featuring enhancements such as smoother highlight roll-off, better management of challenging colors, an improved demosaicing algorithm and more.

Editing 360 Video in VR (Part 2)

By Mike McCarthy

In the last article I wrote on this topic, I looked at the options for shooting 360-degree video footage, and what it takes to get footage recorded on a Gear 360 ready to review and edit on a VR-enabled system. The remaining steps in the workflow will be similar regardless of which camera you are using.

Previewing your work is important, so if you have a VR headset you will want to make sure it is installed and functioning with your editing software. I will be basing this article on using an Oculus Rift to view my work in Adobe Premiere Pro 11.1.2 on a Thinkpad P71 with an Nvidia Quadro P5000 GPU. Premiere requires an extra set of plugins to interface with the Rift headset. Adobe acquired Mettle’s Skybox VR Player plugin back in June, and has made it available to Creative Cloud users upon request, which you can do here.

Skybox VR player

Skybox can project the Adobe UI to the Rift, as well as the output, so you could leave the headset on when making adjustments, but I have not found that to be as useful as I had hoped. Another option is to use the GoPro VR Player plugin to send the Adobe Transmit output to the Rift, which can be downloaded for free here (use the 3.0 version or above). I found this to have slightly better playback performance, but fewer options (no UI projection, for example). Adobe is expected to integrate much of this functionality into the next release of Premiere, which should remove the need for most of the current plugins and increase the overall functionality.

Once our VR editing system is ready to go, we need to look at the footage we have. In the case of the Gear 360, the dual spherical image file recorded by the camera is not directly usable in most applications and needs to be processed to generate a single equirectangular projection, stitching the images from both cameras into a single continuous view.

There are a number of ways to do this. One option is to use the application Samsung packages with the camera: Action Director 360. You can download the original version here, but will need the activation code that came with the camera in order to use it. Upon import, the software automatically processes the original stills and video into equirectangular 2:1 H.264 files. Instead of exporting from that application, I pull the temp files that it generates on media import and use them in Premiere. By default, they should be located in C:\Users\[Username]\Documents\CyberLink\ActionDirector\1.0\360. While this is the simplest solution for PC users, it introduces an extra transcoding step to H.264 (after the initial H.265 recording), and I frequently encountered an issue where there was a black hexagon in the middle of the stitched image.

Action Director

Activating Automatic Angle Compensation in the Preferences->Editing panel gets around this bug, while trying to stabilize your footage to some degree. I later discovered that Samsung had released a separate Version 2 of Action Director, available for Windows or Mac, which solves this issue. But I couldn’t get the stitched files to work directly in the Adobe apps, so I had to export them, which was yet another layer of video compression. You will need a Samsung activation code that came with the Gear 360 to use any of the versions, and both versions took twice as long to stitch a clip as the clip’s run time on my P71 laptop.

An option that gives you more control over the stitching process is to do it in After Effects. Adobe’s recent acquisition of Mettle’s SkyBox VR toolset makes this much easier, but it is still a process. Currently you have to manually request and install your copy of the plugins as a Creative Cloud subscriber. There are three separate installers, and while this stitching process only requires Skybox Suite AE, I would install both the AE and Premiere Pro versions for use in later steps, as well as the Skybox VR player if you have an HMD to preview with. Once you have them installed, you can use the Skybox Converter effect in After Effects to convert from the Gear 360’s fisheye files to the equirectangular assets that Premiere requires for editing VR.

Unfortunately, Samsung’s format is not one of the default conversions supported by the effect, so it requires a little more creativity. The two sensor images have to be cropped into separate comps, with the plugin applied to each of them. Setting the input to fisheye and the output to equirectangular for each image will give the desired distortion. A feathered mask is then applied to the circle to adjust the seam, and the overlap can be adjusted with the FOV and re-orient camera values.

Since this can be challenging to set up, I have posted an AE template that is already configured for footage from the Gear 360. The included directions should be easy to follow, and the projection, overlap and stitch can be further tweaked by adjusting the position, rotation and mask settings in the sub-comps, and the re-orientation values in the Skybox Converter effects. Hopefully, once you find the correct adjustments for your individual camera, they should remain the same for all of your footage, unless you want to mask around an object crossing the stitch boundary. More info on those types of fixes can be found here. It took me five minutes to export 60 seconds of 360 video using this approach, and there is no stabilization or other automatic image analysis.
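If you are curious what these converters are doing under the hood, the core of any dual-fisheye-to-equirectangular conversion is an inverse mapping: for each pixel of the output frame, compute the corresponding viewing ray, then look up where that ray lands in one of the two fisheye circles. The Python sketch below is my own illustration of that math, not code from Skybox or any other tool mentioned here; the 195-degree field of view, the side-by-side 1920x1920 layout and the ideal equidistant lens model are assumptions loosely based on the Gear 360.

import numpy as np

FISHEYE_SIZE = 1920          # each fisheye circle is 1920x1920 (assumed)
FOV = np.radians(195.0)      # per-lens field of view (assumed)
OUT_W, OUT_H = 3840, 1920    # 2:1 equirectangular output

def dual_fisheye_to_equirect(dual):
    # dual: (1920, 3840, 3) image array, front lens in the left half (assumed)
    # Longitude/latitude for every output pixel.
    x = (np.arange(OUT_W) + 0.5) / OUT_W * 2 * np.pi - np.pi
    y = np.pi / 2 - (np.arange(OUT_H) + 0.5) / OUT_H * np.pi
    lon, lat = np.meshgrid(x, y)
    # Unit viewing ray per pixel; the front lens looks down +z.
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = np.cos(lat) * np.cos(lon)
    back = dz < 0                                   # rays behind us hit lens 2
    theta = np.arccos(np.where(back, -dz, dz))      # angle off the lens axis
    phi = np.arctan2(dy, np.where(back, -dx, dx))   # mirror x for the back lens
    # Equidistant fisheye model: image radius grows linearly with theta.
    r = theta / (FOV / 2) * (FISHEYE_SIZE / 2)
    u = FISHEYE_SIZE / 2 + r * np.cos(phi)
    v = FISHEYE_SIZE / 2 + r * np.sin(phi)
    u = np.clip(u, 0, FISHEYE_SIZE - 1).astype(int)
    v = np.clip(v, 0, FISHEYE_SIZE - 1).astype(int)
    u += np.where(back, FISHEYE_SIZE, 0)            # back lens is the right half
    return dual[v, u]                               # nearest-neighbor lookup

Nearest-neighbor sampling keeps the sketch short; real stitchers add per-camera lens calibration, filtered sampling and seam blending, which is exactly why the tools above produce such different results.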

Video Stitch Studio

Orah makes Video-Stitch Studio, which is a similar product but with a slightly different feature set and approach. One limitation I couldn’t find a way around is that the program expects the various fisheye source images to be in separate files, and unlike Kolor’s Autopano Video Pro (AVP), I couldn’t get the source cropping tool to work without rendering the dual fisheye images into separate square video source files. There should be a way to avoid that step, but I couldn’t find one. (You can use the crop effect to remove 1920 pixels on one side or the other to make the conversions in Media Encoder relatively quickly.) Splitting the source file and rendering separate fisheye spheres adds a workflow step and render time, and my one-minute clip took 11 minutes to export. This is a slower option, which might be significant if you have hours of footage to process instead of minutes.
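As a possible alternative to that Media Encoder crop step, a command-line tool like ffmpeg can split a 3840x1920 dual-fisheye file into the two square sources with its crop filter. This is my own suggestion rather than anything VideoStitch documents, and the file names are placeholders:

ffmpeg -i dual_fisheye.mp4 -filter:v "crop=1920:1920:0:0" -c:a copy fisheye_left.mp4
ffmpeg -i dual_fisheye.mp4 -filter:v "crop=1920:1920:1920:0" -c:a copy fisheye_right.mp4

The crop parameters are width:height:x:y, so the second command grabs the right-hand circle.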

Clearly, there are a variety of ways to get your raw footage stitched for editing. The results vary greatly between the different programs, so I made a video to compare the different stitching options on the same source clip. My first attempt was with a locked-off shot in the park, but that shot was too simple to see the differences, and it didn’t allow for comparison of the stabilization options available in some of the programs. So I shot some footage from a moving vehicle to see how well the motion and shake would be handled by the various programs. The result is now available on YouTube, fading between each of the five labeled options over the course of the minute-long clip. I would categorize this as testing how well the various applications can handle non-ideal source footage, which happens a lot in the real world.

I didn’t feel that any of the stitching options were perfect solutions, so hopefully we will see further developments in that regard in the future. You may want to explore them yourself to determine which one best meets your needs. Once your footage is correctly mapped to equirectangular projection, ideally in a 2:1 aspect ratio, and the projects are rendered and exported (I recommend Cineform or DNxHR), you are ready to edit your processed footage.
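If you ever need to generate those Cineform or DNxHR intermediates outside the Adobe applications, ffmpeg can also write DNxHR. A sketch of such a transcode, with placeholder file names (check that your ffmpeg build includes the dnxhd encoder, which handles the DNxHR profiles):

ffmpeg -i stitched_equirect.mov -c:v dnxhd -profile:v dnxhr_hq -pix_fmt yuv422p -c:a pcm_s16le stitched_dnxhr.mov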

Launch Premiere Pro and import your footage as you normally would. If you are using the Skybox Player plugin, turn on Adobe Transmit with the HMD selected as the only dedicated output (in the Skybox VR configuration window, I recommend setting the hot corner to top left, to avoid accidentally hitting the start menu, desktop hide or application close buttons during preview). In the playback monitor, you may want to right click the wrench icon and select Enable VR to preview a pan-able perspective of the video, instead of the entire distorted equirectangular source frame. You can cut, trim and stack your footage as usual, and apply color corrections and other non-geometry-based effects.

In version 11.1.2 of Premiere, there is basically one VR effect (VR Projection), which allows you to rotate the video sphere along all three axes. If you have the Skybox Suite for Premiere installed, you will have some extra VR effects. The Skybox Rotate Sphere effect is basically the same. You can add titles and graphics and use the Skybox Project 2D effect to project them into the sphere where you want. Skybox also includes other effects for blurring and sharpening the spherical video, as well as denoise and glow. If you have Kolor AVP installed, that adds two new effects as well. GoPro VR Horizon is similar to the other sphere rotation ones, but allows you to drag the image around in the monitor window to rotate it, instead of manually adjusting the axis values, so it is faster and more intuitive. The GoPro VR Reframe effect is applied to equirectangular footage to extract a flat perspective from within it. The field of view can be adjusted and rotated around all three axes.

Most of the effects are pretty easy to figure out, but Skybox Project 2D may require some experimentation to get the desired results. Avoid placing objects near the edges of the 2D frame that you apply it to, to keep them facing toward the viewer. The rotate projection values control where the object is placed relative to the viewer. The rotate source values rotate the object at the location it is projected to. Personally, I think they should be placed in the reverse order in the effects panel.

Encoding the final output is not difficult; just send it to Adobe Media Encoder using either the H.264 or H.265 format. Make sure the “Video is VR” box is checked at the bottom of the Video Settings pane, and in this case that the frame layout is set to monoscopic. There are presets for some of the common frame sizes, but I would recommend lowering the bitrates, at least if you are using Gear 360 footage. Also, if you have ambisonic audio, set the channels to 4.0 in the audio pane.

Once the video is encoded, you can upload it directly to Facebook. If you want to upload to YouTube, exports from AME with the VR box checked should work fine, but for videos from other sources you will need to modify the metadata with this app here.  Once your video is uploaded to YouTube, you can embed it on any webpage that supports 2D web videos. And YouTube videos can be streamed directly to your Rift headset using the free DeoVR video player.
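The metadata app linked above is presumably Google’s Spatial Media Metadata Injector, the usual tool for flagging a file as spherical for YouTube (an assumption on my part; follow whatever the link actually points to). It has a simple GUI, and the same project can be run from the command line. Assuming that tool, a typical injection for a monoscopic equirectangular file looks like this, with placeholder file names:

python spatialmedia -i input_360.mp4 input_360_injected.mp4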

That should give you a 360-video production workflow from start to finish. I will post more updated articles as new software tools are developed, and as I get new 360 cameras with which to test and experiment.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Deb Oh joins Nylon Studios from Y&R

Music and sound boutique Nylon Studios, which has offices in NYC and Sydney, has added Deb Oh as senior producer. A classically trained musician, Oh has almost a decade of experience in the commercial music space, working as a music supervisor and producer on both the agency and studio sides.

She comes to Nylon from Y&R, where she spent two years working as a music producer for Dell, Xerox, Special Olympics, Activia and Optum, among others. Outside of the studio, Oh has continued to pursue music, regularly writing and performing with her band Deb Oh & The Cavaliers and serving as music supervisor for the iTunes podcast series, “Limetown.”

A lifelong musician, Oh grew up learning classical piano and singing at a very early age. She began writing and performing her own music in high school and kept up her musical endeavors while studying political science at NYU. Following graduation, she made the leap to follow her passion for music full time, landing as a client service coordinator at Headroom. She was then promoted to music supervisor. After five years with the audio shop, she moved to the agency side to broaden her skillset and gain perspective on the landscape of vendors, labels and publishers in the commercial music industry.


Digging Deeper: The Mill Chicago’s head of color Luke Morrison

A native Londoner, Morrison started his career at The Mill, where he worked on music videos and commercials. In 2013, he moved to the Midwest to head up The Mill Chicago’s color department.

Since then, Morrison has worked on campaigns for Beats, Prada, Jeep, Miller, Porsche, State Farm, Wrigley’s Extra Gum and a VR film for Jack Daniel’s.

Let’s find out more about Morrison.

How early on did you know color would be your path?
I started off, like so many at The Mill, as a runner. I initially thought I wanted to get into 3D, and after a month of modeling a photoreal screwdriver I realized that wasn’t the path for me. Luckily, I poked my nose into the color suites and saw them working with neg and lacing up the Spirit telecine. I was immediately drawn to it. It resonated with me and with my love of photography.

You are also a photographer?
Yes, I actually take pictures all the time. I always carry some sort of camera with me. I’m fortunate to have a father who is a keen photographer and he had a darkroom in our house when I was young. I was always fascinated with what he was doing up there, in the “red room.”

Photography for me is all about looking at your surroundings and capturing or documenting life and sharing it with other people. I started a photography club at The Mill, S35, because I wanted to share that part of my passion with people. I find as a ‘creative’ you need to have other outlets to feed into other parts of you. S35 is about inspiring people — friends, colleagues, clients — to go back to the classic, irreplaceable practice of using 35mm film and start to consider photography in a different way than the current trends.

State Farm

In 2013, you moved from London to Chicago. Are the markets different and did anything change?
Yes and no. I personally haven’t changed my style to suit or accommodate the different market. I think it’s one of the things that appeals to my clients. Chicago, however, is quite a different market than the UK. Here, post production is more agency-led and directors aren’t always involved in the process. In that kind of environment, there is a bigger role for the colorist to play in carrying the director’s vision through or setting the tone of the “look.”

I still strive to keep that collaboration with the director and DP in the color session whether it’s a phone call to discuss ahead of the session, doing some grade tests or looping them in with a remote grade session. There is definitely a difference in the suite dynamics, too. I found very quickly I had to communicate and translate the client’s and my creative intent differently here.

What sort of content do you work on?
We work on commercials, music promos, episodics and features, but always have an eye on new ways to tell narratives. That’s where the pioneering work in the emerging technology field comes into play. We’re no longer limited and are constantly looking for creative ways to remain at the forefront of creation for VR, AR, MR and experiential installations. It’s really exciting to watch it develop and to be a part of it. When Jack Daniel’s and DFCB Chicago approached us to create a VR experience taking the viewer to the Jack Daniel’s distillery in Tennessee, we leapt at the chance.

Do you like a variety of projects?
Who doesn’t? It’s always nice to be working on a variety, keeping things fresh and pushing yourself creatively. We’ve moved into grading more feature projects and episodic work recently, which has been an exciting way to be creatively and technically challenged. Most recently, I’ve had a lot of fun grading some comedy specials, one for Jerrod Carmichael and one for Hasan Minhaj. This job is ever-changing, be it thanks to evolving technology, new clients or challenging projects. That’s one of the many things I love about it.

Toronto Maple Leafs

You recently won two AICE awards for best color for your grade on the Toronto Maple Leafs’ spot Wise Man. Can you talk about that?
It was such a special project to collaborate on. I’ve been working with Ian Pons Jewell, who directed it, for many years now. We met way back in the day in London, when I was a color assistant. He would trade me deli meats and cheeses from his travels to do grades for him! That shared history made the AICE awards all the more special. It’s incredible to have continued to build that relationship and see how each of us have grown in our careers. Those kinds of partnerships are what I strive to do with every single client and job that comes through my suite.

When it comes to color grading commercials, what are the main principles?
For me, it’s always important to understand the idea, the creative intent and the tone of the spot. Once you understand that, it influences your decisions, dictates how you’ll approach the grade and what options you’ll offer the client. Then, it’s about crafting the grade appropriately and building on that.

You use FilmLight Baselight. What do your clients like most about what you can provide with that system?
Clients are always impressed with the speed at which I’m able to address their comments and react to things almost before they’ve said them. The tracker always gets a few “ooooooh’s” or “ahhhh’s.” It’s like they’re watching fireworks or something!

How do you keep current with emerging technologies?
That’s the amazing thing about working at The Mill: we’re makers and creators for all media. Our Emerging Technologies team is constantly looking for new ways to tell stories and collaborate with our clients, whether it’s branded content or passion projects, using all technologies at our disposal: anything is at our fingertips, even a Pop Llama.

Name three pieces of technology you can’t live without.
Well, I’ve got to have my Contax T2; an alarm clock, otherwise I’d never be anywhere on time; and my bicycle.

Would you say you are a “technical” colorist or would you rather prioritize instincts?
It’s all about instincts! I’m into the technical side, but I’m mostly driven by my instincts. It’s all about feeling and that comes from creating the correct environment in the suite, having a good kick off chat with clients, banging on the tunes and spinning the balls.

Where do you find inspiration?
I find a lot of inspiration from just being outside. It might sound like a cliché but travel is massive for me, and that goes hand in hand with my photography. I think it’s important to change your surroundings, be it traveling to Japan or just taking a different route to the studio. The change keeps me engaged in my surroundings, asking questions and stimulating my imagination.

What do you do to de-stress from it all?
Riding my bike is my main thing. I usually do a 30-mile ride a few mornings a week and then 50 to 100 miles at the weekend. Riding keeps you constantly focused on that one thing, so it’s a great way to de-stress and clear your mind.

What’s next for you?
I’ve got some great projects coming up that I’m excited about. But outside of the suite, I’ll be riding in this year’s 10th annual Fireflies West ride. For the past 10 years, Fireflies West participants have embarked on a journey from San Francisco to Los Angeles in support of City of Hope. This year’s ride has the added challenge of an extra day tacked on, making the ride 650 miles in total over seven days, so… I best get training! (See postPerspective’s recent coverage of the ride.)

Neill Blomkamp’s Oats Studios uses Unity 2017 on ADAM shorts

Academy Award-nominated director Neill Blomkamp (District 9) is directing the next installments in the ADAM franchise — ADAM: The Mirror and ADAM: The Prophet — using the latest version of Unity, which launches today.

Created and produced by Blomkamp’s Oats Studios, these short films show the power of working within an integrated realtime environment — allowing the team to build, texture, animate, light and render all in Unity to deliver high-quality graphics at a fraction of the cost and time of a normal film production cycle.

ADAM: The Mirror will premiere during the live stream of the Unite Austin 2017 keynote, which begins at 4pm Pacific tonight, and will be available on the Oats YouTube channel shortly after. ADAM: The Prophet will follow before the end of 2017.

“Ever since I started making films I’ve dreamed of a virtual sandbox that would let me build, shoot and edit photorealistic worlds all in one place. Today that dream came true thanks to the power of Unity 2017,” said Neill Blomkamp, founder of Oats Studios. “The fact that we could achieve near-photorealistic visuals in half the average time of our production cycles is astounding. The future is here, and I can’t wait to see what our fans think.”

Neill Blomkamp

The original ADAM was released in 2016 as a short film to demonstrate technical innovations in Unity. It won a Webby Award and was screened at several film festivals, including the Annecy Film Festival and the Nashville Film Festival. ADAM: The Mirror picks up after the events of ADAM, as the cyborg hero discovers a clue about what and who he is. ADAM: The Prophet gives viewers their first glimpse of one of the villains in the ADAM universe.

Using the power of realtime rendering, Oats used Unity 2017 to help them create photorealistic graphics and lifelike digital humans. This was achieved through a combination of Unity’s advanced high-end graphics power, new materials using the Custom Render Texture feature, advanced photogrammetry techniques, Alembic-streamed animations for facial and cloth movement and Unity’s Timeline feature.

The innovations in these short films will be highlighted in the coming months via a series of behind-the-scenes videos and articles on the Unity website. They include:
• Lifelike digital humans in realtime: Oats created the best-looking human ever in Unity using custom sub-surface scattering shaders for skin, eyes and hair.
• Alembic-based realtime facial performance capture: Oats has created a new facial performance capture technique that streams 30 scanned heads per second for lifelike animation, all without the use of morph targets or rigs.
• Virtual worlds via photogrammetry: Staying true to their live-action background, Oats shot more than 35,000 photos of environments and props and, after the initial photogrammetry solve, imported these into Unity using its de-lighting tool. This allowed them to quickly create rich, complex materials without needing to spend time building high-resolution models by hand.
• Rapid streamlined iteration in realtime: Working with realtime rendering lets artists and designers “shoot” the story as if on a set, with a live responsiveness that allows room to experiment and make creative decisions anywhere in the process.
• Unity’s Timeline backbone for collaboration: Unity’s Timeline feature, a visual sequencing tool that allows artists to orchestrate scenes without additional programming, combined with Multi-Scene Authoring, allowed a team of 20 artists to collaborate on the same shot simultaneously.

The A-List: Victoria & Abdul director Stephen Frears

By Iain Blair

Much like the royal subjects of his new film Victoria & Abdul and his 2006 offering, The Queen (which won him his second Oscar nomination), British director Stephen Frears has long been considered a national treasure. Of course, the truth is that he’s an international treasure.

The director, now 76 years old, has had a long and prolific career that spans some five decades and that has embraced a wide variety of styles, themes and genres. He cut his teeth at the BBC, where he honed his abilities to work with tight budgets and schedules. He made his name in TV drama, working almost exclusively for the small screen in the first 15 years of his career.

Stephen Frears with writer Iain Blair.

In the mid-1980s, Frears turned to the cinema, shooting The Hit, which starred Terence Stamp, John Hurt and Tim Roth. The following year he made My Beautiful Laundrette for Channel 4, which crossed over to big screen audiences and altered the course of his career.

Since then, he’s made big Hollywood studio pictures, such as the Oscar-nominated Florence Foster Jenkins, The Grifters and Dangerous Liaisons, as well as Mary Reilly and Hero. But he’s probably as well-known for smaller, grittier vehicles, such as the Oscar-nominated Philomena, Muhammad Ali’s Greatest Fight, Chéri, Dirty Pretty Things, High Fidelity, Prick Up Your Ears and The Snapper, films that provided a rich palette for Frears to explore stories with a strong social and political conscience.

His latest film, Victoria & Abdul, is a drama (spiced with a good dash of comedy) about the unlikely but real-life relationship between Queen Victoria (Judi Dench) and her Muslim Indian servant Abdul Karim (Ali Fazal).

I recently spoke with Frears about making the film, which is already generating a lot of Oscar buzz, especially for Dench.

This seems to be a very timely film, with its race relations, and religious and class issues. Was that part of its appeal?
Absolutely. When I read it I immediately thought it was quite provocative and a very interesting story, and I always look for interesting stories, and the whole relationship was part of the fun. I thought it was a brilliant script, and it’s got so much going on – the personal story about them, all the politics and global stuff about the British Empire.

You’ve worked with Judi Dench before, but she had already portrayed Victoria in Mrs. Brown back in 1997. Did you have to twist her arm to revisit the character?
I said I’d only make this with her, as she’s a brilliant actress and she looks a bit like Victoria, but I think initially she passed. I’m actually not quite sure since I never had a conversation with her about it. What happened was, we organized a reading and she came to that and listened to it, and then she was on board.

What did she bring to the role?
Complete believability. You absolutely believe in her as Victoria. She can do all that, playing the most powerful woman in the world, and then she was also human, which is why she was so fond of Abdul. It’s the same as directing someone like Meryl Streep. She’s just so skillful and so intelligent, and their sense of their role and its direction is very, very strong, and they’re so skilled at telling the story.

This doesn’t look like your usual heavy, gloomy Victorian period piece. How did you approach this visually?
I have a wonderful production designer, Alan MacDonald, who has worked with me on many films, including Florence Foster Jenkins, Philomena and The Queen. And we shot this with DP Danny Cohen, who is so inventive. From the start we wanted it to feel period but do it in a more modern way in order to get away from that lugubrious feeling and the heavy Victoriana. When we got to Osborne House, which was her holiday home on the Isle of Wight, it’s anything but heavy and lugubrious. It’s this light and airy villa.

Fair to say the film starts dark and gets lighter in tone and color as it goes on — while the story starts lighter and more comical, and gets darker as it goes along?
Yes, because at the start she’s depressed, she’s dressed all in black, and then it’s like Cinderella, and she’s woken up… by Abdul’s kiss on her feet.

Did that really happen?
Yes, I think it did, and I think both servants kissed her feet — but it wasn’t under a table full of jellies (laughs).

You shot all over England, Scotland and India in many of the original locations. It must have been a challenge mixing all the locations with sets?
It was. The big coup was shooting in Osborne House, which no one has ever done before. That was a big thrill but also a relief. England is full of enormous country homes, so you just go down the list finding the best ones. I’ve done Balmoral twice now, so I know how you do it, and Windsor Castle, which is Gothic. But of course, they’re not decorated in the Victorian manner, so we had to dress all the rooms appropriately. Then you mix all the sets and locations, like putting a big puzzle together.

How was shooting in India?
We shot in Agra, by the Taj Mahal. The original statue of Victoria there was taken down after independence, but we were allowed to make a copy and put it back up.

Where did you do the post? How long was the process?
It was about five months, all in London, and we cut it at Goldcrest, where I’ve done the post work on most of my last few films — Philomena was not done there. It all depends on the budget.

Do you like the post process?
I love being on location and I enjoy shooting, but it’s always hard and full of problems. Post is so calm by comparison, and so different from all the money and time pressures and chaos of the shoot. It’s far more analytic and methodical, and it’s when you discover the good choices you made as well as your mistakes. It’s where you actually make your film with all the raw elements you’ve amassed along the way.

You worked with a new editor, Melanie Ann Oliver, who cut Les Mis and The Danish Girl for director Tom Hooper and Anna Karenina for director Joe Wright. How did that relationship work?
She wasn’t on set, but we talked every day about it, and she became the main conduit for it all, like all editors. She’s the person you’re talking to all the time, and we spent about three months editing. The main challenge was trying to find the right tone and the balance between all the comedy, jokes and the subtext — what was really going on. We went in knowing it would be very comedic at the start, and then it gets very serious and emotional by the end.

Who did the visual effects and how many visual effects shots are there?
I always use the same team. Union VFX did them all, and Adam Gascoyne, who did Florence Foster Jenkins and Philomena with me, was the VFX supervisor. The big VFX shots were of all the ships crossing the ocean, and a brilliant one of Florence. And as it’s a period piece, there’s always a lot of adding stuff and clean up, and we probably had several hundred VFX shots or so in the end, but I never know just how many.

Iain Blair and Judi Dench

How important are sound and music to you?
They’re both hugely important, even though I don’t really know much about music or sound mixing and just depend on my team, which includes supervising sound editor Becki Ponting. We mixed all the music by Thomas Newman at Abbey Road, and then we did the final mix at Twickenham Studios. The thing with composers like Thomas Newman and Alexandre Desplat who did The Queen and Florence is that they read me really well. When Alexandre was hired to score The Queen, they asked him to write a very romantic score, and he said, “No, no, I know Stephen’s films. They’re witty, so I’ll write you a witty score,” and it was perfect and won him an Oscar nomination. Same with this. Tom read it very, very well.

Did you do a DI?
Yes, at Goldcrest as usual, with Danny and colorist Adam Glasman. They’re very clever, and I’m not really involved. Danny does it. He gets me in and shows me stuff but I just don’t pretend to be technically clever enough about the DI as mine is a layman’s approach to it, so they do all the work and show me everything, and then I give any suggestions I might have. The trick with any of this is to surround yourself with the best technicians and the best actors, tell them what you want, and let them do their jobs.

Having made this film, what do you think about Victoria now?
I think she was far more humane than is usually shown. I never really studied her at school, but there was this enduring image of an old battleaxe, and I think she was far more complex than that image. She learned Urdu from Abdul. That tells you a lot.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.