

MovieLabs, film studios release ‘future of media creation’ white paper

MovieLabs (Motion Pictures Laboratories), a nonprofit technology research lab that works jointly with member studios Sony, Warner Bros., Disney, Universal and Paramount, has published a new white paper presenting an industry vision for the future of media creation technology by 2030.

The paper, co-authored by MovieLabs and technologists from Hollywood studios, paints a bold picture of future technology and discusses the need for the industry to work together now on innovative new software, hardware and production workflows to support and enable new ways to create content over the next 10 years. The white paper is available today for free download on the MovieLabs website.

The 2030 Vision paper lays out key principles that will form the foundation of this technological future, with examples and a discussion of the broader implications of each. The key principles envision a future in which:

1. All assets are created or ingested straight to the cloud and do not need to move.
2. Applications come to the media.
3. Propagation and distribution of assets is a “publish” function.
4. Archives are deep libraries with access policies matching speed, availability and security to the economics of the cloud.
5. Preservation of digital assets includes the future means to access and edit them.
6. Every individual on a project is identified and verified and their access permissions are efficiently and consistently managed.
7. All media creation happens in a highly secure environment that adapts rapidly to changing threats.
8. Individual media elements are referenced, tracked, interrelated and accessed using a universal linking system.
9. Media workflows are non-destructive and dynamically created using common interfaces, underlying data formats and metadata.
10. Workflows are designed around realtime iteration and feedback.
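Several of these principles describe a workflow architecture in which media stays in one place and is referenced rather than copied. As a rough illustration of principles 1, 3 and 8 only, here is a minimal sketch; every class name, field and URI below is a hypothetical invention for illustration, not anything specified in the MovieLabs paper:

```python
# Hypothetical sketch of a cloud-native asset registry in the spirit of the
# 2030 Vision: assets live in cloud storage and are addressed by universal
# IDs, applications resolve links instead of moving files, and distribution
# is a "publish" operation rather than a transfer.
import uuid

class AssetRegistry:
    def __init__(self):
        self._assets = {}  # universal ID -> asset record

    def ingest(self, cloud_uri, metadata):
        """Register an asset already sitting in cloud storage; nothing moves."""
        asset_id = str(uuid.uuid4())
        self._assets[asset_id] = {"uri": cloud_uri, "meta": metadata, "published": False}
        return asset_id

    def resolve(self, asset_id):
        """Applications 'come to the media': they resolve a link, not a copy."""
        return self._assets[asset_id]["uri"]

    def publish(self, asset_id):
        """Distribution is a 'publish' function, not a file shipment."""
        self._assets[asset_id]["published"] = True

registry = AssetRegistry()
aid = registry.ingest("s3://studio-bucket/scene42/takes/a001.mxf", {"scene": 42})
registry.publish(aid)
print(registry.resolve(aid))  # s3://studio-bucket/scene42/takes/a001.mxf
```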

Rich Berger

“The next 10 years will bring significant opportunities, but there are still major challenges and inherent inefficiencies in our production and distribution workflows that threaten to limit our future ability to innovate,” says Richard Berger, CEO of MovieLabs. “We have been working closely with studio technology leaders and strategizing how to integrate new technologies that empower filmmakers to create ever more compelling content with more speed and efficiency. By laying out these principles publicly, we hope to catalyze an industry dialog and fuel innovation, encouraging companies and organizations to help us deliver on these ideas.”

The publication of the paper will be supported by a panel discussion at the IBC Conference in Amsterdam. The panel, “Hollywood’s Vision for the Future of Production in 2030,” will include senior technology leaders from the five major Hollywood motion picture studios. It will take place on Sunday, September 15, at 2:15pm in the Forum room of the RAI. postPerspective’s Randi Altman will moderate the panel, made up of Sony’s Bill Baggelaar, Disney’s Shadi Almassizadeh, Universal’s Michael Wise and Paramount’s Anthony Guarino. More details can be found here.

“Sony Pictures Entertainment has a deep appreciation for the role that current and future technologies play in content creation,” says CTO of Sony Pictures Don Eklund. “As a subsidiary of a technology-focused company, we benefit from the power of Sony R&D and Sony’s product groups. The MovieLabs 2030 document represents the contribution of multiple studios to forecast and embrace the impact that cloud, machine learning and a range of hardware and software will have on our industry. We consider this a living document that will evolve over time and provide appreciated insight.”

According to Wise, SVP/CTO at Universal Pictures, “With film production experiencing unprecedented growth, and new innovative forms of storytelling capturing our audiences’ attention, we’re proud to be collaborating across the industry to envision new technological paradigms for our filmmakers so we can efficiently deliver worldwide audiences compelling entertainment.”

For those not familiar with MovieLabs, their stated goal is “to enable member studios to work together to evaluate new technologies and improve quality and security, helping the industry deliver next-generation experiences for consumers, reduce costs and improve efficiency through industry automation, and derive and share the appropriate data necessary to protect and market the creative assets that are the core capital of our industry.”

Digital Arts expands team, adds Nutmeg Creative talent

Digital Arts, an independently owned New York-based post house, has added several former Nutmeg Creative talent and production staff members to its roster — senior producer Lauren Boyle, sound designer/mixers Brian Beatrice and Frank Verderosa, colorist Gary Scarpulla, finishing editor/technical engineer Mark Spano and director of production Brian Donnelly.

“Growth of talent, technology, and services has always been part of the long-term strategy for Digital Arts, and we’re fortunate to welcome some extraordinary new talent to our staff,” says Digital Arts owner Axel Ericson. “Whether it’s long-form content for film and television, or working with today’s leading agencies and brands creating dynamic content, we have the talent and technology to make all of our clients’ work engaging, and our enhanced services bring their creative vision to fruition.”

Brian Donnelly, Lauren Boyle and Mark Spano.

As part of this expansion, Digital Arts will unveil additional infrastructure featuring an ADR stage/mix room. The current facility boasts several state-of-the-art audio suites, a 4K finishing theater/mixing dubstage, four color/finishing suites and expansive editorial and production space, which is spread over four floors.

The former Nutmeg team has hit the ground running, working with their long-time ad agency, network, animation and film studio clients. Gary Scarpulla worked on color for HBO’s Veep and Los Espookys, while Frank Verderosa has been working with agency Ogilvy on several Ikea campaigns. Beatrice mixed spots for Tom Ford’s cosmetics line.

In addition, Digital Arts’ in-house theater/mixing stage has proven to be a valuable resource for some of the most popular TV productions, including recording recent commentary sessions for the legendary HBO series Game of Thrones and the final season of Veep.

Especially noteworthy is colorist Ericson and finishing editor Mark Spano’s collaboration with Oscar-nominated directors Karim Amer and Jehane Noujaim to bring the Netflix documentary The Great Hack to fruition.

Digital Arts also recently expanded its offerings to include production services. The company has already delivered projects for agencies Area 23, FCB Health and TCA.

“Digital Arts’ existing infrastructure was ideally suited to leverage itself into end-to-end production,” Donnelly says. “Now we can deliver from shoot to post.”

Tools employed across post include Avid Pro Tools with D-Control ES and S3 control surfaces for audio post, and Avid Media Composer, Adobe Premiere and Blackmagic Resolve for editing. Color grading is via Resolve.

Main Image: (L-R) Frank Verderosa, Brian Beatrice and Gary Scarpulla

 


Dick Wolf’s television empire: his production and post brain trust

By Iain Blair

The TV landscape is full of scripted police procedurals and true crime dramas these days, but the indisputable and legendary king of that crowded landscape is Emmy-winning creator/producer Dick Wolf, whose name has become synonymous with high-quality drama.

Arthur Forney

Since it burst onto the scene back in 1990, his Law & Order show has spawned six dramas and four international spinoffs, while his “Chicago” franchise gave birth to another four series: the hugely popular Chicago Med, Chicago Fire and Chicago P.D., plus Chicago Justice, which was cancelled after one season.

Then there are his “FBI” shows, as well as the more documentary-style Cold Justice. If you’ve seen Cold Justice — and you should — you know that this is the real deal, focusing on real crimes. It’s all the more fascinating and addictive because of it.

Produced by Wolf and Magical Elves, the real-life crime series follows veteran prosecutor Kelly Siegler, who gets help from seasoned detectives as they dig into small-town murder cases that have lingered for years without answers or justice for the victims. Together with local law enforcement from across the country, the Cold Justice team has successfully helped bring about 45 arrests and 20 convictions. No case is too cold for Siegler, as the new season delves into new unsolved homicides while also bringing updates to previous cases. No wonder Wolf calls it “doing God’s work.” Cold Justice airs on true crime network Oxygen.

I recently spoke with Emmy-winning Arthur Forney, executive producer of all Wolf Entertainment’s scripted series (he’s also directed many episodes), about posting those shows. I also spoke with Cold Justice showrunner Liz Cook and EP/head of post Scott Patch.

Chicago Fire

Dick Wolf has said that, as head of post, you are “one of the irreplaceable pieces of the Wolf Films hierarchy.” How many shows do you oversee?
Arthur Forney: I oversee all of Wolf Entertainment’s scripted series, including Law & Order: Special Victims Unit, Chicago Fire, Chicago P.D., Chicago Med, FBI and FBI: Most Wanted.

Where is all the post done?
Forney: We do it all at NBCUniversal StudioPost in LA.

How involved is Dick Wolf?
Forney: Very involved, and we talk all the time.

How does the post pipeline work?
Forney: All footage is shot on location and then sent back and streamed into the lab. From there we do all our color corrections, and then it’s downloaded into Avid Media Composer for editing.

What are the biggest challenges of the post process on the shows?
Forney: Delivering high-quality programming with a shortened post schedule.

Chicago Med

What are the editing challenges involved?
Forney: Trying to find the right way of telling the story, finding the right performances, shaping the show and creating intensity that results in high-quality television.

What about VFX? Who does them?
Forney: All of our visual effects are done by Spy Post in Santa Monica. All of the action enhancement is done by them.

Where do you do the color grading?
Forney: Coloring/grading is all done at NBCUniversal StudioPost.

Now let’s talk to Cook and Patch about Cold Justice:

Liz and Scott, I recently saw the finale to Season 5 of Cold Justice. That was a long season.
Liz Cook: Yes, we did 26 episodes, so it was a lot of very long days and hard work.

It seems that there’s more focus than ever on drug-related cases now.
Cook: I don’t think that was the intention going in, but as we’ve gone on, you can’t help but recognize the huge drug problem in America now. Meth and opioids pop up in a lot of cases, and it’s obviously a crisis, and even if they aren’t the driving force in many cases, they’re definitely part of many.

L-R: Kelly Siegler, Dick Wolf, Scott Patch and Liz Cook. Photo by Evans Vestal Ward

How do you go about finding cases for the show?
Cook: We have a case-finding team, and they get the cases various ways, including cold-calling. We have a team dedicated to that, calling every day, and we get most of them that way. A lot come through agencies and sheriff’s departments that have worked with us before and want to help us again. And we get some from family members and some from hits on the Facebook page we have.

I assume you need to work very closely with local law agencies as you need access to their files?
Cook: Exactly. That’s the first part of the whole puzzle. They have to invite us in. The second part is getting the family involved. I don’t think we’d ever take on a case that the family didn’t want us to do.

What’s involved for you, and do you like being a showrunner?
Cook: It’s a tough job and pretty demanding, but I love it. We go through a lot of steps and stuff to get a case approved, and to get the police and family on board, and then we get the case read by one of our legal readers to evaluate it and see if there’s a possibility that we can solve it. At that point we pitch it to the network, and once they approve it and everyone’s on board, then if there are certain things like DNA and evidence that might need testing, we get all that going, along with ballistics that need researching, and stuff like phone records and so on. And it actually moves really fast – we usually get all these people on board within three weeks.

How long does it take to shoot each show?
Cook: It varies, as each show is different, but around seven or eight days, sometimes longer. We have a case coming up with cadaver dogs, and that stuff will happen before we even get to the location, so it all depends. And some cases will have 40 witnesses, while others might have over 100. So it’s flexible.

Cold Justice

Where do you post, and what’s the schedule like?
Scott Patch: We do it all at the Magical Elves offices here in Hollywood — the editing, sound, color correction. The online editor and colorist is Pepe Serventi, and we have it all on one floor, and it’s really convenient to have all the post in house. The schedule is roughly two months from the raw footage to getting it all locked and ready to air, which is quite a long time.

Dailies come back to us and the story team and editors do an initial pass, whittling all the footage down. It takes us a couple of weeks just to look at all the footage, as we usually have about 180 hours of it, and it takes a while to turn all that into something the editors can deal with. Then it goes through about three network passes with notes.

What about dealing with all the legal aspects?
Patch: That makes it a different kind of show from most of the others, so we have legal people making sure all the content is fine, and then sometimes we’ll also get notes from local law agencies, as well as internal notes from our own producers. That’s why it takes two months from start to finish.

Cook: We vet it through local law, and they see the cuts before it airs to make sure there are no problems. The biggest priority for us is that we don’t hurt the case at all with our show, so we always check it all with the local D.A. and police. And we don’t sensationalize anything.

Cold Justice

Patch: That’s another big part of editing and post – making sure we keep it authentic. That can be a challenge, but these are real cases with real people being accused of murder.

Cook: Our instinct is to make it dramatic, but you can’t do that. You have to protect the case, which might go to trial.

Talk about editing. You have several editors, I assume because of the time factor. How does that work?
Patch: Some of these cases have been cold for 25 or 30 years, so when the field team gets there, they really stand back and let the cops talk about the case, and we end up with a ton of stuff that you couldn’t fit into the time slot however hard you tried. So we have to decide what needs to be in, what doesn’t.

Cook: On day one, our “war room” day, we meet with the local law and everyone involved in the case, and that’s eight hours of footage right there.

Patch: And that gets cut down to just four or five minutes. We have a pretty small but tight team, with 10 editors who split up the episodes. Once in a while they’ll cross over, but we like to have each team and the producers stay with each episode as long as they can, as it’s so complicated. When you see the finished show, it doesn’t seem that complicated, but there are so many ways you could handle the footage that it really helps for each team to really take ownership of that particular episode.

How involved is Dick Wolf in post?
Cook: He loves the whole post process, and he watches all the cuts and has input.

Patch: He’s very supportive and obviously so experienced, and if we’re having a problem with something, he’ll give notes. And for the most part, the network gives us a lot of flexibility to make the show.

What about VFX on the show?
Patch: We have some, but nothing too fancy, and we use an outside VFX/graphics company, LOM Design. We have a lot of legal documents on the show, and that stuff gets animated, and we’ll also have some 3D crime scene VFX. The only other outside vendor is our composer, Robert ToTeras.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Quick Chat: Bonfire Labs’ Mary Mathaisell

Over the course of nearly 30 years, San Francisco’s Bonfire Labs has embraced change, evolving from an editorial and post house into a design and creative content studio that leverages the best aspects of the agency and production company models without adhering to either one.

This hybrid model has worked well for product launches for Google, Facebook, Salesforce, Logitech and many others.

The latest change is in the company’s ownership, with the last of the original founders stepping down and a new management partnership taking over — led by executive producer Mary Mathaisell, managing director Jim Bartel and head of strategy and creative Chris Weldon.

We spoke with Mathaisell to get a better sense of Bonfire Labs’ past, present and future.

Can you give us some history of Bonfire Labs? When did you join the company? How/why did you first get into producing?
I’ve been with Bonfire Labs for seven years. I started here as head of production. After being at several large digital agencies working on campaigns and content for brands like Target, Gap, LG and PayPal, I wanted to build something more sustainable than just another campaign and was thrilled that Bonfire was interested in growing into a full-service creative company with integrated production.

Prior to working at AKQA and Publicis, I worked in VFX and production as well as design for products and interfaces, but my primary focus and love has always been commercial production.

The studio has evolved from a traditional post studio to creative strategy and content company. What were the factors that drove those changes?
Bonfire Labs has always been smart about staying small and strategic about the kind of work and clients to focus on. We have been able to change based on both the kind of work we want to be doing and what the market needs. With a giant need for content, especially video content, we have decided to staff and service clients as experts across all the phases of creative development and production and finishing. Instead of going to an agency and a production company and post houses, our clients can work directly with us on everything from concept to finishing.

Silicon Valley is clearly a big client base for you. What are they generally coming to you for? Are the content needs in high tech different from other business sectors?
Our clients usually have a new product, feature or brand that they want the world to know about. We work on product launches, brand awareness campaigns, product education, event content and social content. Most of our work is for technology companies, but every company these days has a technology component. I would say that speed to market is one key differentiator for our clients. We are often building stories as we are in production, so we get a lot done with our clients through creative collaboration and by not following the traditional rules of an agency or a production company.

Any specific trends that you’re seeing recently from your clients? New areas that Bonfire is looking to explore, either new markets for your talents or technology you’re looking to explore further?
Rapid brand prototyping is a new service we are offering to much excitement. Because we have experience across so many technology brands and work closely with our clients, we can develop a language and brand voice faster than most traditional agencies. Technology brands are evolving so quickly that we often start working on content creation before a brand has defined itself or transitioned to its next phase. Rapid brand prototyping allows brands to test content and grow the brand simultaneously.

Blade Shadow

Can you talk about some projects that you have done recently that challenged you and the team?
We rolled out a launch film for a new start-up client called Blade Shadow. We are working with Salesforce to develop trailblazer stories and anthem films for its .org branch, which focuses on NGOs, education and philanthropy.

The company is undergoing a transition with some of the original partners. Can you talk about that a bit as well?
The original founders have passed the torch to the group of people who have been managing and producing the work over the past five to 15 years. We have six new owners, three managing partners and three associate partners. Jim Bartel is the managing director; Chris Weldon is the head of strategy and creative, and I’m the executive producer in charge of content development and production. The three of us make up the management team.

Sheila Smith (head of production), Robbie Proctor (head of editorial) and Phil Spitler (creative technology lead) are associate partners, as they contribute to and lead so much of our work and process and have each been part of the company for over 10 years.

 


Envoi’s end-to-end cloud solution for data migration, post 

Envoi launched a cloud-based data migration and post production workflow solution at the AWS M&E Symposium on June 18 in Los Angeles. Enabled by Cantemo and Teradici PCoIP technology, Envoi is offering this as a media “production-to-payment” platform.

Envoi is a cloud-based content management, distribution and monetization platform, giving broadcasters and video content providers a complete secure end-to-end video management and distribution system. Available on Amazon Web Services (AWS) Marketplace, Envoi is designed to provide simple and efficient data migration to the cloud and between services in the workflow.

Thanks to a partnership with Cantemo, Envoi has been integrated with Cantemo’s media asset management solution, Cantemo Portal and its cloud video management hub, Cantemo Iconik. Iconik makes it easy to collaborate on media, regardless of geographic location. Advanced Artificial Intelligence simplifies content discovery by improving metadata collection. By combining Envoi with Cantemo Portal, media companies of virtually all sizes can now monetize their video libraries within 48 hours.

Envoi also enables post in the cloud with the integration of Iconik with Teradici-enabled workstations on AWS. These workstations are configured to support a wide range of editing and post production tools. By supporting the entire post process on AWS, Envoi says it is providing a solution that increases the security, performance and collaboration potential within the creative process. Delivering the solution through AWS Marketplace simplifies procurement, delivery and deployment for Envoi’s customers.

 


Phil Kubel named director of HPA

The Hollywood Professional Association (HPA) has appointed Phil Kubel as the organization’s director. He will be the Burbank-based presence of the HPA management team, managing the organization’s day-to-day business as well as supporting strategic planning, membership development and program development.

After his graduation from USC, Kubel worked in a number of production-related positions. In 2003 he became one of the founding members of HRTV, a national television network that featured equestrian and horse racing content. Kubel was instrumental in the design, engineering and production build of the studios and broadcast facility at Santa Anita Park in Arcadia, California. He went on to oversee day-to-day operations of all digital media, production and technology initiatives at HRTV, including creating the subscription-based HRTV.com.

In addition to Kubel’s technical portfolio, he served as VP of post production for HRTV and was the creative force behind the documentary series Inside Information, which earned 10 Emmy wins.

In 2015, Kubel was named VP/EP for a new digital media initiative for The Stronach Group. Under Stronach Digital, he oversaw the launch of XBTV, which is now an industry-leading multi-media horse racing product that provides insight and analysis for wagering customers.

“It’s an exciting time to be joining HPA,” notes Kubel. “We have a rare opportunity to use our accumulated knowledge and relationships to support industry growth by connecting the players and leading the conversation. I look forward to continuing the vision of HPA and developing it as a world-class resource for production professionals.”

He will report to HPA’s executive director, Barbara Lange.


Cutters Studios promotes Heather Richardson, Patrick Casey

Cutters Studios has promoted Heather Richardson to executive producer and Patrick Casey to head of production. Richardson’s oversight will expand into managing and recruiting talent, and in maintaining and building the company’s client base. Casey will focus on optimizing workflows, project management and bidding processes.

Richardson joined Cutters in 2015, after working as a producer for visual effects studio A52 in LA and for editorial company Cosmo Street in both LA and New York for more than 10 years. On behalf of Cutters, she has produced Super Bowl spots for Lifewtr, Nintendo and WeatherTech, and campaigns including Capital One, FCA North America (Fiat, Dodge Ram, and Jeep), Gatorade, Google, McDonald’s and Modelo.

“I’ve been fortunate to have worked with some excellent executive producers during my career, and I’m honored and excited for the opportunity to expand the scope of my role on behalf of Cutters Studios, and alongside Patrick Casey,” says Richardson. “Patrick’s kindness and thoughtfulness in addition to his intelligence and experience are priceless.”

In addition to leading Cutters editors, Casey produced the groundbreaking Always “#LikeAGirl” campaign, Budweiser’s Harry Caray’s Last Call and Whirlpool’s “Care Counts” campaign that won top Cannes Lions, Clio, Effie and Adweek Project Isaac Awards.


Atomos’ new Shogun 7: HDR monitor, recorder, switcher

The new Atomos Shogun 7 is a seven-inch HDR monitor, recorder and switcher that offers an all-new 1500-nit, daylight-viewable 1920×1200 panel with a 1,000,000:1 contrast ratio, displaying 15+ stops of dynamic range. It also offers ProRes RAW recording and realtime Dolby Vision output. Shogun 7 will be available in June 2019, priced at $1,499.

The Atomos screen uses a combination of advanced LED and LCD technologies which together offer deeper, better blacks that the company says rival OLED screens, “but with the much higher brightness and vivid color performance of top-end LCDs.”

A new 360-zone backlight is combined with this new screen technology and controlled by the Dynamic AtomHDR engine to show millions of shades of brightness and color. It allows Shogun 7 to display 15+ stops of real dynamic range on-screen. The panel, says Atomos, is also incredibly accurate, with ultra-wide color and 105% of DCI-P3 covered, allowing for the same on-screen dynamic range, palette of colors and shades that your camera sensor sees.

Atomos and Dolby have teamed up to create Dolby Vision HDR “live” — a tool that allows you to see HDR live on-set and carry your creative intent from the camera through into HDR post. Dolby has optimized its target display HDR processing algorithm, which Atomos runs inside the Shogun 7. It brings realtime automatic frame-by-frame analysis of the Log or RAW video and processes it for optimal HDR viewing on a Dolby Vision-capable TV or monitor over HDMI. Connect Shogun 7 to a Dolby Vision TV and AtomOS 10 automatically analyzes the image, queries the TV and applies the right color and brightness profiles for the maximum HDR experience on the display.

Shogun 7 records images up to 5.7kp30, 4kp120 or 2kp240 slow motion from compatible cameras, in RAW/Log or HLG/PQ over SDI/HDMI. Footage is stored directly to AtomX SSDmini or approved off-the-shelf SATA SSD drives. There are recording options for Apple ProRes RAW and ProRes, Avid DNx and Adobe CinemaDNG RAW codecs. Shogun 7 has four SDI inputs plus an HDMI 2.0 input, with both 12G-SDI and HDMI 2.0 outputs. It can record ProRes RAW in up to 5.7kp30, 4kp120 DCI/UHD and 2kp240 DCI/HD, depending on the camera’s capabilities. Also, 10-bit 4:2:2 ProRes or DNxHR recording is available up to 4Kp60 or 2Kp240. The four SDI inputs enable the connection of most quad-link, dual-link or single-link SDI cinema cameras. Pixels are preserved with data rates of up to 1.8Gb/s.
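That quoted peak data rate makes it easy to estimate how much footage fits on a drive. A back-of-envelope sketch (the 1.8Gb/s figure is from the spec above; the 1TB drive size is an arbitrary example, not an Atomos requirement):

```python
# Back-of-envelope storage math for the Shogun 7's quoted peak data rate.
# Assumes the 1.8 Gb/s figure from the spec; the drive capacity is an example.
PEAK_RATE_GBPS = 1.8                            # gigabits per second (quoted peak)
bytes_per_second = PEAK_RATE_GBPS * 1e9 / 8     # bits -> bytes: 225,000,000 B/s
gb_per_minute = bytes_per_second * 60 / 1e9     # 13.5 GB of footage per minute

drive_gb = 1000  # example: a 1 TB SATA SSD
minutes = drive_gb / gb_per_minute
print(f"{gb_per_minute:.1f} GB/min, ~{minutes:.0f} min on a {drive_gb} GB drive")
# -> 13.5 GB/min, ~74 min on a 1000 GB drive
```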

In terms of audio, Shogun 7 eliminates the need for a separate audio recorder. Users can add 48V stereo mics via an optional balanced XLR breakout cable, or select mic or line input levels, plus record up to 12 channels of 24/96 digital audio from HDMI or SDI. Monitoring selected stereo tracks is via the 3.5mm headphone jack. There are dedicated audio meters, gain controls and adjustments for frame delay.

Shogun 7 features the latest version of the AtomOS 10 touchscreen interface, first seen on the Ninja V. The new body of Shogun 7 has a Ninja V-like exterior with ARRI anti-rotation mounting points on the top and bottom of the unit to ensure secure mounting.

AtomOS 10 on Shogun 7 has the full range of monitoring tools, including Waveform, Vectorscope, False Color, Zebras, RGB parade, Focus peaking, Pixel-to-pixel magnification, Audio level meters and Blue only for noise analysis.

Shogun 7 can also be used as a portable touchscreen-controlled multi-camera switcher with asynchronous quad-ISO recording. Users can switch up to four 1080p60 SDI streams, record each plus the program output as a separate ISO, then deliver ready-for-edit recordings with marked cut-points in XML metadata straight to your NLE. The current Sumo19 HDR production monitor-recorder will also gain the same functionality in a free firmware update.

There is asynchronous switching, plus genlock in and out to connect to existing AV infrastructure. Once the recording is over, users can import the XML file into an NLE and the timeline populates with all the edits in place. XLR audio from a separate mixer or audio board is recorded within each ISO, alongside two embedded channels of digital audio from the original source. The program stream always records the analog audio feed as well as a second track that switches between the digital audio inputs to match the switched feed.
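The idea of cut-points carried as XML metadata can be pictured with a small sketch. The element names below are invented for illustration only; Atomos' actual schema is not documented here:

```python
# Hypothetical sketch of reading switcher cut-points from an XML edit list
# and rebuilding the program timeline, in the spirit of the workflow above.
# The <switch-log>/<cut> element names are invented for illustration.
import xml.etree.ElementTree as ET

EDIT_LIST = """
<switch-log>
  <cut frame="0"   source="iso1"/>
  <cut frame="240" source="iso3"/>
  <cut frame="510" source="iso2"/>
</switch-log>
"""

def timeline_from_xml(xml_text):
    """Return the program timeline as (frame, source) pairs in cut order."""
    root = ET.fromstring(xml_text)
    cuts = [(int(c.get("frame")), c.get("source")) for c in root.iter("cut")]
    cuts.sort()  # guarantee chronological order
    return cuts

for frame, source in timeline_from_xml(EDIT_LIST):
    print(f"frame {frame:>4}: switch to {source}")
```

An NLE importing such a file would do essentially this, placing each ISO clip on the timeline at its recorded cut frame.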


Review: Mzed.com’s Directing Color With Ollie Kenchington

By Brady Betzel

I am constantly looking to educate myself, no matter the source or subject. Whether I am learning how to make a transition in Adobe After Effects from an eSports editor on YouTube or watching Warren Eagles teach color correction in Blackmagic’s DaVinci Resolve on FXPHD.com, I’m always beefing up my skills. I even learn from bad tutorials — they teach you what not to do!

But when you come across a truly remarkable learning experience, it is only fair to share it with the rest of the world. Last year I saw an ad for an MZed.com course called “Directing Color With Ollie Kenchington” and was immediately interested. These days you can pretty much find any technical tutorial you can dream of on YouTube, but truly professional, theory-based series on the level of higher education are very hard to come by. Even ones you need to pay for aren’t always worth their price of admission, which is a huge letdown.

Ollie sharing his wisdom.

Once I gained access to MZed.com I wanted to watch every educational series they had. From lighting techniques with ASC member Shane Hurlbut to the ARRI Amira Camera Primer, there are over 150 hours of education available from industry leaders. However, I found my way to Directing Color…

I am often asked if I think people should go to college or a film school. My answer? If you have the money and time, you should go to college followed by film school (or do both together, if the college offers it). Not only will you learn a craft, but you will most likely spend hundreds of hours studying and visualizing the theory behind it. For example, when someone asks me about the science behind camera lenses, I can confidently answer them thanks to my physics class based on lenses and optics from California Lutheran University (yes, a shameless plug).

In my opinion, a two-, four- or even 10-year education allows me to live in the grey. I am comfortable arguing both sides of a debate, as well as the options in between — the grey. I feel like my post-high school education really allowed me to recognize and thrive in the nuances of debate. It left me playing devil’s advocate maybe a little too much, but also having civil and productive discussions with others without being demeaning or nasty — something we are actively missing these days. So if living in the grey is for you, I really think a college education supplemented by online or film school education is valuable (assuming you decide, as I did, that the debt is worth it).

However, I know that is not an option for everyone since it can be very expensive — trust me, I know. I am almost done paying off my undergraduate fees while still paying off my graduate ones, which I am still two or three classes away from finishing. That being said, Directing Color With Ollie Kenchington is the only online education series I have seen so far that is on the same level as some of my higher education classes. Not only is the content beautifully shot and color corrected, but Ollie gives confident and accessible lessons on how color can be used to draw the viewer’s attention to multiple parts of the screen.

Ollie Kenchington is a UK-based filmmaker who runs Korro Films. From the trailer of his Directing Color series, you can immediately see the beauty of Ollie’s work and know that you will be in safe hands. (You can read more about his background here.)

The course raises the bar for online education and will elevate the audience’s expectations of professional insight. The first module, “Creating a Palette,” covers the thinking behind creating a color palette for a small catering company. You may even want to start with the last module, the bonus “Ox & Origin,” to get a look at what Ollie will be creating throughout the seven modules and roughly 90 minutes of content.

While Ollie goes over “looks,” the beauty of this course is that he walks through his internal thought processes, including deciding on palettes based on color theory. He doesn’t just choose teal and orange because they look good; he chooses his color palette based on complementary colors.

Throughout the course Ollie covers some technical knowledge, including calibrating monitors and cameras, white balancing and shooting color charts to avoid wrong color balance in post. This is so important because if you skip these simple steps, your color correction session will be much harder. And wasting time fixing incorrect color balance takes time away from the fun of color grading. All of this is done through easily digestible modules that range from two to 20 minutes.

The modules include Creating a Palette; Perceiving Color; Calibrating Color; Color Management; Deconstructing Color 1 – 3 and the Bonus Module Ox & Origin.

Without giving away the entire content of Ollie’s catalog, my favorite modules in this course are the on-set ones. Maybe it’s because I am not on set that often, but I found the “thinking out loud” about colors helpful. Knowing why reds represent blood, and why they raise your heart rate a little, is fascinating. He even goes through practical examples of color use in films such as Whiplash.

In the final “Deconstructing Color” modules, Ollie goes into a color bay (complete with practical candle backlighting) and dives into Blackmagic’s DaVinci Resolve. He takes the course full circle: because he took the time to set up proper lighting on set, even when rushing through a scene, he can now go into Resolve, relight different sides of someone’s face and focus on other parts of his commercial.

Summing Up
I want to watch every tutorial MZed.com has to offer, from “Philip Bloom’s Cinematic Masterclass” to Ollie’s other course, “Mastering Color.” Unfortunately, as of this review, you have to pay an additional fee to watch the “Mastering Color” series. It seems like an unfortunate trend in online education to charge a subscription and then charge more when an extra-special class comes up, but this class will supposedly be released to standard subscribers in due time.

MZed.com has two subscription models: MZed Pro, which is $299 for one year of streaming the standard courses, and MZed Pro Premium for $399. This includes the standard courses for one year and the ability to choose one “Premium” course.

“Philip Bloom’s Cinematic Master Class” was the Premium course I was signed up for initially, but you can decide between this one and the “Mastering Color” course. You will not be disappointed regardless of which one you choose. Even their first course, “How to Photograph Everyone,” is chock-full of lighting and positioning instruction that can be applied to many aspects of videography.

I really was impressed with Directing Color With Ollie Kenchington, and if the other courses are this good, MZed.com will definitely become a permanent bookmark for me.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

VFX house Rodeo FX acquires Rodeo Production

Visual effects studio Rodeo FX, whose high-profile projects include Dumbo, Aquaman and Bumblebee, has purchased Rodeo Production and added its roster of photographers and directors to its offerings.

The two companies, whose shared name is just a coincidence, will continue to operate as distinct entities. Rodeo Production’s 10-year-old Montreal office will continue to manage photo and video production, but will now also offer Rodeo FX’s post production services and technical expertise.

In Toronto, Rodeo FX plans to open an Autodesk Flame suite in the Rodeo Production studio and expand its Toronto roster of photographers and directors, with the goal of developing stronger production and post services for clients in the city’s advertising, television and film industries.

“This is a milestone in our already incredible history of growth and expansion,” says Sébastien Moreau, founder/president of Rodeo FX, which has offices in LA and Munich in addition to Montreal.

“I have always worked hard to give our artists the best possible opportunities, and this partnership was the logical next step,” says Rodeo Production’s founder Alexandra Saulnier. “I see this as a fusion of pure creativity and innovative technology. It’s the kind of synergy that Montreal has become famous for; it’s in our DNA.”

Rodeo Production clients include Ikea, Under Armour and Mitsubishi.

Shooting, posting New Republic’s Indie film, Sister Aimee

After a successful premiere at the Sundance Film Festival, New Republic Studios’ Sister Aimee screened at this month’s SXSW. The movie tells the story of an infamous American evangelist of the 1920s, Sister Aimee Semple McPherson, who gets caught up in her lover’s dreams of Mexico and finds herself on a road trip toward the border.

Sister Aimee shot at the newly renovated New Republic Studios near Austin, Texas, over two and a half weeks. “Their crew used our 2,400-square-foot Little Bear soundstage, our 3,000-square-foot Lone Wolf soundstage, our bullpen office space and numerous exterior locations in our backlot,” reports New Republic Studios president Mindy Raymond, adding that the Sister Aimee production also had access to two screening rooms with 5.1 surround sound, HDMI hookups to 4K monitors and theater-style leather chairs to watch dailies. The film also hit the road, shooting in the New Mexico desert.

L-R: Directors Samantha Buck, Marie Schlingmann at SXSW. Credit: Harrison Funk

Co-written and co-directed by Samantha Buck and Marie Schlingmann, the movie takes some creative license with the story of Aimee. “We don’t look for factual truth in Aimee’s journey,” they explain. “Instead we look for a more timeless truth that says something about female ambition, the female quest for immortality and, most of all, the struggle for women to control their own narratives. It becomes a story about storytelling itself.”

The film, shot by cinematographer Carlos Valdes-Lora in 3.2K ProRes 4444 XQ on an Arri Alexa Mini, was posted at Dallas- and Austin-based Charlieuniformtango.

We reached out to the DP and the post team to find out more.

Carlos, why did you choose the package of the Alexa and Cooke Mini S4 Primes?
Carlos Valdes-Lora: In early conversations with the directors, we all agreed that we didn’t want Sister Aimee to feel like a traditional period movie. We didn’t want to use softening filters or vintage lenses. We aimed instead for clear images, deep focus and a rich color palette that remains grounded in the real world. We felt that this would lend the story a greater sense of immediacy and draw the viewer closer to the characters. Following that same thinking, we worked very extensively with the 25mm and 32mm, especially in closeups and medium closeups, emphasizing accessibility.

The Cooke Mini S4s are a beautiful and affordable set (relative to our other options). We like the way they give deep dimensionality and warmth to faces, and how they create a slightly lower-contrast image compared to the other modern lenses we looked at. They quickly became the right choice for us, striking the right balance between quality, size and value.

The Cookes paired with the Alexa Mini gave us a lightweight camera system with a very contained footprint, and we needed to stay fast and lean due to our compressed shooting schedule and often tight shooting quarters. The Chapman Cobra dolly was a big help in that regard as well.

What was the workflow to post like?
Charlieuniformtango producers Bettina Barrow, Katherine Harper, David Hartstein: Post took place primarily between Charlieuniformtango’s Dallas and Austin offices. Post strategizing started months before the shoot, and active post truly began when production began in July 2018.

Tango’s Evan Linton handled dailies brought in from the shoot, working alongside editor Katie Ennis out of Tango’s Austin studio, to begin assembling a rough cut as shooting continued. Ennis continued to cut at the studio through August with directors Schlingmann and Buck.

Editorial then moved back to the directors’ home state of New York to finish the cut for Sundance. (Editor Ennis, who four-walled out of Tango Austin for the first part of post, went to New York with the directors, working out of a rented space.)

VFX and audio work started early at Tango, with continuously updated timelines coming from editorial. The team worked to have certain locked shots finished for the Sundance submission, while saving much of the cleanup and other CG-heavy shots for the final picture lock.

Tango audio engineer Nick Patronella also tackled the dialogue edit, sound design and mix for the submission out of the Dallas studio.

Can you talk about the VFX?
Barrow, Harper, Hartstein: The cut was locked in late November, and the heavy lifting really began. With delivery looming, Tango’s Flame artists Allen Robbins, Joey Waldrip, David Hannah, David Laird, Artie Peña and Zack Smith divided the effects shots, which ranged from environmental and period-specific cleanup and beauty work such as de-aging to crowd simulation, CG sign creation and more.

(L-R) Tango’s Artie Peña, Connor Adams, Allen Robbins in one of the studio’s Flame suites.

3D artist Connor Adams used Houdini, Mixamo and Maya to create CG elements and crowds, with final comps being done in Nuke and sent to Flame for final color. Over 120 VFX shots were handled in total, and Flame was the go-to for effects. Color and much of the effects work happened simultaneously. It was a nice workflow, as the project didn’t have major VFX needs that would have impacted color.

What about the color grade?
Barrow, Harper, Hartstein: Directors Buck and Schlingmann and DP Valdes-Lora worked with Tango colorist Allen Robbins to craft the final look of the film, with the color grade also done in Flame. The trio had prepped to shoot for a Kodachrome-style look, especially for the exteriors but really throughout. They found important reference in selections of Robert Capa photographs.

Buck, Schlingmann and Valdes-Lora responded mostly to Kodachrome’s treatment of blues, browns, tans, greens and reds (while staying true to skin tones), but also to its gamma values, not being afraid of deep shadows and contrast wherever appropriate. Valdes-Lora wanted to avoid lighting and exposing to a custom LUT on set that reflected this Kodachrome look, in case they wanted to change course during the process. With the help of Tango, however, they discovered that dialing back the Capa look grounded the film a little more and made the characters “feel” more accessible. The roots of the inspiration remained in the image, but a little more naturalism, a little more softness, served the story better.

Because of that, they monitored on set in Alexa 709, which Valdes-Lora felt he could expose for while still leaving enough room to push the grade later. Production designer Jonathan Rudak (another regular collaborator with the directors) was on the same page during prep in terms of reflecting this Capa color style, and the practical team did what they could to make sure the set elements complemented the approach.

What about the audio post?
Barrow, Harper, Hartstein: With the effects and color almost complete, the team headed to Skywalker Ranch for a week of final dialogue edit, mix, sound design and Foley, led by Skywalker’s Danielle Dupre, Kim Foscato and E. Larry Oatfield. The team was also able to simultaneously approve color sections in Skywalker’s Stag Theater, allowing for an ultra-efficient schedule. With the final mix in hand, the film was mastered just after Christmas so that DCP production could begin.

Since a portion of the film was musical, how complex was the audio mix?
Skywalker sound mixer Dupre: The musical number was definitely one of the most challenging but rewarding scenes to design and mix. It was such a strong creative idea that played so deeply into the main character. The challenge was in striking a balance between tying it into the realism of the film while also leaning into the grandiosity of the musical to really sell the idea.

It was really fun to play with a combination of production dialogue and studio recordings to see how we could make it work. It was also really rewarding to create a soundscape that starts off minimal and simple and transitions to Broadway scale almost undetectably — one of the many exciting parts of working with creative and talented filmmakers.

What was the biggest challenge in post?
Barrow, Harper, Hartstein: Finishing a film in five to six weeks during the holidays was no easy feat. Luckily, we were able to have our directors hands-on for all final color, VFX and mix. Collaborating in the same room is always the best when you have no time to spare. We had a schedule where each day was accounted for — and we stuck to it almost down to the hour.

 

BlacKkKlansman director Spike Lee

By Iain Blair

Spike Lee has been on a roll recently. Last time we sat down for a talk, he’d just finished Chi-Raq, an impassioned rap reworking of Aristophanes’ “Lysistrata,” which was set against a backdrop of Chicago gang violence. Since then, he’s directed various TV, documentary and video projects. And now his latest film BlacKkKlansman has been nominated for a host of Oscars, including Best Picture, Best Director, Best Adapted Screenplay, Best Film Editing, Best Original Score and Best Actor in a Supporting Role (Adam Driver).

Set in the early 1970s, the unlikely-but-true story details the exploits of Ron Stallworth (John David Washington), the first African-American detective to serve in the Colorado Springs Police Department. Determined to make a name for himself, Stallworth sets out on a dangerous mission: infiltrate and expose the Ku Klux Klan. The young detective soon recruits a more seasoned colleague, Flip Zimmerman (Adam Driver), into the undercover investigation. Together, they team up to take down the extremist hate group as the organization aims to sanitize its violent rhetoric to appeal to the mainstream. The film also stars Topher Grace as David Duke.

Behind the scenes, Lee reteamed with co-writer Kevin Willmott, longtime editor Barry Alexander Brown and composer Terence Blanchard, along with up-and-coming DP Chayse Irvin. I spoke with the always-entertaining Lee, who first burst onto the scene back in 1986 with She’s Gotta Have It, about making the film, his workflow and the Oscars.

Is it true Jordan Peele turned you onto this story?
Yeah, he called me out of the blue and gave me possibly the greatest six-word pitch in film history — “Black man infiltrates Ku Klux Klan.” I couldn’t resist it, not with that pitch.

Didn’t you think, “Wait, this is all too unbelievable, too Hollywood?”
Well, my first question was, “Is this actually true? Or is it a Dave Chappelle skit?” Jordan assured me it’s a true story and that Ron wrote a book about it. He sent me a script, and that’s where we began, but Kevin Willmott and I then totally rewrote it so we could include all the stuff like Charlottesville at the end.

Iain Blair and Spike Lee

Did you immediately decide to juxtapose the story’s period racial hatred with all the ripped-from-the-headlines news footage?
Pretty much, as the Charlottesville rally happened August 11, 2017 and we didn’t start shooting this until mid-September, so we could include all that. And then there was the terrible synagogue massacre, and all the pipe bombs. Hate crimes are really skyrocketing under this president.

Fair to say, it’s not just a film about America, though, but about what’s happening everywhere — the rise of neo-Nazism, racism, xenophobia and so on in Europe and other places?
I’m so glad you said that, as I’ve had to correct several people who want to just focus on America, as if this is just happening here. No, no, no! Look at the recent presidential elections in Brazil. This guy — oh my God! This is a global phenomenon, and the common denominator is fear. You fire up your base with fear tactics, and pinpoint your enemy — the bogeyman, the scapegoat — and today that is immigrants.

What were the main challenges in pulling it all together?
Any time you do a film, it’s so hard and challenging. I’ve been doing this for decades now, and it ain’t getting any easier. You have to tell the story the best way you can, given the time and money you have, and it has to be a team effort. I had a great team with me, and any time you do a period piece you have added challenges to get it looking right.

You assembled a great cast. What did John David Washington and Adam Driver bring to the main roles?
They brought the weight, the hammer! They had to do their thing and bring their characters head-to-head, so it’s like a great heavyweight fight, with neither one backing down. It’s like Inside Man with Denzel and Clive Owen.

It’s the first time you’ve worked with the Canadian DP Chayse Irvin, who mainly shot shorts before this. Can you talk about how you collaborated with him?
He’s young and innovative, and he shot a lot of Beyonce’s Lemonade long-form video. What we wanted to do was shoot on film, not digital. I talked about all the ‘70s films I grew up with, like French Connection and Dog Day Afternoon. So that was the look I was after. It had to match the period, but not be too nostalgic. While we wanted to make a period film, I also wanted it to feel and look contemporary, and really connect that era with the world we live in now. He really nailed it. Then my great editor, Barry Alexander Brown, came up with all the split-screen stuff, which is also very ‘70s and really captured that era.

How tough was the shoot?
Every shoot’s tough. It’s part of the job. But I love shooting, and we used a mix of practical locations and sets in Brooklyn and other places that doubled for Colorado Springs.

Where did you post?
Same as always, in Brooklyn, at my 40 Acres and a Mule office.

Do you like the post process?
I love it, because post is when you finally sit down and actually make your film. It’s a lot more relaxing than the shoot — and a lot of it is just me and the editor and the Avid. You’re shaping and molding it and finding your way, cutting and adding stuff, flopping scenes, and it never really follows the shooting script. It becomes its own thing in post.

Talk about editing with Barry Alexander Brown, the Brit who’s cut so many of your films. What were the big editing challenges?
The big one was finding the right balance between the humor and the very serious subject matter. They’re two very different tones, and then the humor comes from the premise, which is absurd in itself. It’s organic to the characters and the situations.

Talk about the importance of sound and music, and Terence Blanchard’s spare score that blends funk with classical.
He’s done a lot of my films and has never been nominated for an Oscar — and he should have been. He’s a truly great composer, trumpeter and bandleader, and a big part of what I do in post. I try to give him some pointers that aren’t restrictive, and then let him do his thing. I always put as much emphasis on sound and music as I do on the acting, editing and cinematography. It’s hugely important, and once we have the score, we have a film.

I had a great sound team. Phil Stockton, who began with me back on School Daze, was the sound designer. David Boulton, Mike Russo and Howard London did the ADR mix, and my longtime mixer Tommy Fleischman was on it. We did it all at C5 in New York. We spent a long time on the mix, building it all up.

Where did you do the DI and how important is it to you?
At Company 3 with colorist Tom Poole, who’s so good. It’s very important but I’m in and out, as I know Tom and the DP are going to get the look I want.

Spike Lee on set.

Did the film turn out the way you hoped?
Here’s the thing. You try to do the best you can, and I can’t predict what the reaction will be. I made the film I wanted to make, and then I put it out in the world. It’s all about timing. This was made at the right time and was made with a lot of urgency. It’s a crazy world and it’s getting crazier by the minute.

How important are industry awards and nominations to you?
They’re very important in that they bring more attention, more awareness to a film like this. One of the blessings from the strong critical response to this has been a resurgence in looking at my earlier films again, some of which may have been overlooked, like Bamboozled and Summer of Sam.

Do you see progress in Hollywood in terms of diversity and inclusion?
There’s been movement, maybe not as fast as I’d like, but it’s slowly happening, so that’s good.

What’s next?
We just finished the second season of She’s Gotta Have It for Netflix, and I have some movie things cooking. I’m pretty busy.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

VFX editor Warren Mazutinec on life, work and Altered Carbon

By Jeremy Presner

Longtime assistant editor Warren Mazutinec’s love of film began when he saw Star Wars as an eight-year-old in a small town outside Edmonton, Alberta. Unlike many other Lucas-heads, however, this one got to live out his dream grinding away in cutting rooms from Vancouver to LA, working with some of the biggest editors in the galaxy.

We met back in 1998 when he assisted me on the editing of the Martin Sheen “classic” Voyage of Terror. We remain friends to this day. One of Warren’s more recent projects was Netflix’s VFX-heavy Altered Carbon, which got a lot of love from critics and audiences alike.

My old friend, who is now based in Vancouver, has an interesting story to tell, moving from assistant editor to VFX editor working on films like Underworld 4, Tomorrowland, Elysium and Chappie, so I threw some questions at him. Enjoy!

Warren Mazutinec

How did you get into the business?
I always wanted to work in the entertainment industry, but that was hard to find in Alberta. No film school-type programs were even offered, so I took the closest thing at a local college: audiovisual communications. While there, I studied photography, audio and video, but nothing like actual filmmaking. After that I attended Vancouver Film School. After film school, and with the help of some good friends, I got an opportunity to be a trainee at Shavick Entertainment.

What was it like working at a “film factory” that cranked out five to six pictures a year?
It was fun, but the product ultimately became intolerable. Movies for nine-year-olds can only be so interesting… especially low-budget ones.

What do your parents think of your career option?
Being from Alberta, everyone thought it wasn’t a real job — just a Hollywood dream. It took some convincing; my dad still tells me to look for work between gigs.

How did you learn Avid? Were you self-taught?
I was handed the manual by a post supervisor on day one. I never read it. I just asked questions and played around on any machine available. So I did have a lot of help, but I also went into work during my free time and on weekends to sit and learn what I needed to do.

Over the years I’ve been lucky enough to have cool people to work with and to learn with and from. I did six movies before I had an email address, more before I even owned a computer.

As media strayed away from film into digital, how did your role change in the cutting room? How did you refine your techniques with a changing workflow?
My first non-film movie was Underworld 4. It was shot with a Red One camera. I pretty much lied and said I knew how to deal with it. There was no difference really; just had to say goodbye to lab rolls, Keykode, etc. It was also a 3D stereo project, so that was a pickle, but not too hard to figure out.

How did you figure out the 3D stereo post?
It was basically learning to do everything twice. During production we really only played back in 3D for the novelty. I think most shows are 3D-ified in post. I’m not sure though, I’ve only done the one.

Do you think VR/AR will be something you work with in the future?
Yes, I want to be involved in VR at some point. It’s going to be big. Even just doing sound design would be cool. I think it’s the next step, and I want in.

Who are some of your favorite filmmakers?
David Lynch is my number one, by far. I love his work in all forms. A real treasure for sure. David Fincher is great too. Scorsese, Christopher Nolan. There are so many great filmmakers working right now.

Is post in your world constantly changing, or have things more or less leveled off?
Both. But usually someone has dailies figured out, so Avid is pretty much the same. We cut in DNxHD 115 or DNxHD 36, so nothing like 4K-type stuff. Conform at the end is always fun, but there are tests we do at the start to figure it all out. We are rarely in uncharted waters.

What was it like transitioning to VFX editor? What tools did you need to learn to do that role?
FileMaker. And Jesus, son, I didn’t learn it. It’s a tough beast but it can do a lot. I managed to wrangle it to do what I was asked for, but it’s a hugely powerful piece of software. I picked up a few things on Tomorrowland and went from there.

I like the pace of the VFX editor. It’s different than assisting and is a nice change. I’d like to do more of it. I’d like to learn and use After Effects more. On the film I was VFX editor for, I was able to just use the Avid, as it wasn’t that complex. Mostly set extensions, etc.

How many VFX shot revisions would a typical shot go through on Elysium?
On Elysium, the shot version numbers got quite high, but part of that would be internal versioning by the vendor. Director Neil Blomkamp is a VFX guy himself, so he was pretty involved and knew what he wanted. The robots kept looking cooler and cooler as the show went on. Same for Chappie. That robot was almost perfect, but it took a while to get there.

You’ve worked with a vast array of editors, including Walter Murch, Lee Smith, Julian Clarke, Nancy Richardson and Bill Steinkamp. Can you talk about that, and have any of them let you cut material?
I’ll assemble scenes if asked to, just to help the editor out so he isn’t starting from scratch. If I get bored, I start cutting scenes as well. On Altered Carbon, when Julian (Clarke) was busy with Episodes 2 and 3, I’d try to at least string together a scene or two for Episode 8. Not fine-cutting, mind you, just laying out the framework.

Walter asked a lot of us — the workload was massive. Lee Smith didn’t ask for much. Everyone asks for scene cards that they never use, ha!

Walter hadn’t worked on the Avid for five years or so prior to Tomorrowland, so there was a lot of him walking out of his room asking, “How do I?” It was funny because a lot of the time I knew what he was asking, but I had to actually do it on my machine because it’s so second nature.

What is Walter Murch like in the cutting room? Was learning his organizational process something you carried over into future cutting rooms?
I was a bit intimidated prior to meeting him. He’s awesome though. We got along great and worked well together. There was Walter, a VFX editor and four assistants. We all shared in the process. Of course, Walter’s workflow is unlike any other so it was a huge adjustment, but within a few weeks we were a well-oiled machine.

I’d come in at 6:30am to get dailies sorted and would usually finish around lunch. Then we’d screen in our theater and make notes, all of us. I really enjoyed screening the dailies that way. Then he would go into his room and do his thing. I really wish all films followed his workflow. As tough as it is, it all makes sense and nothing gets lost.

I have seen photos with the colored boxes and triangles on the wall. What does all that mean, and how often was that board updated?
Ha. That’s Walter’s own version of scene cards. It makes way better sense. The colors and shapes mean a particular thing — the longer the card the longer the scene. He did all that himself, said it helps him see the picture. I would peek into his room and watch him do this. He seemed so happy doing it, like a little kid.

Do you always add descriptions and metadata to your shots in Avid Media Composer?
We add everything possible. Usually there is a codebook the studios want, so we generate that with FileMaker on almost all the bigger shows. Walter’s is the same, just way bigger and better. It made the VFX database look like a toy.
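To make the codebook idea concrete: it is essentially a table with one row per shot, exported from the cutting room’s database for the studio. The real pipeline described here used FileMaker; the sketch below just illustrates the shape of such an export in Python, and every field name is invented for illustration rather than taken from any studio template:

```python
import csv
import io

# Hypothetical shot records of the kind a codebook might contain.
# Field names are invented for this illustration.
shots = [
    {"shot_id": "AC_108_0040", "episode": "108", "vendor": "Tango",
     "description": "Set extension, skyline", "status": "final"},
    {"shot_id": "AC_108_0050", "episode": "108", "vendor": "Tango",
     "description": "Wire removal", "status": "in progress"},
]

def write_codebook(records, fileobj):
    """Write shot records as a CSV codebook, one row per shot."""
    fields = ["shot_id", "episode", "vendor", "description", "status"]
    writer = csv.DictWriter(fileobj, fieldnames=fields)
    writer.writeheader()
    writer.writerows(records)

buf = io.StringIO()
write_codebook(shots, buf)
print(buf.getvalue().splitlines()[0])  # header row of the export
```

Keeping every shot’s metadata in one queryable place like this is what lets a single database drive both the studio deliverable and the day-to-day VFX tracking.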

What is your workflow for managing/organizing footage?
A lot of times you have to follow someone else’s procedure, but if left to my own devices I try to make it the simplest it can be so anyone can figure out what was done.

How do you organize your timeline?
It’s specific to the editor, but I like to use as many audio tracks as possible and as few video tracks as possible, but when it’s a VFX-heavy show, that isn’t possible due to stacking various shot versions.

What did you learn from Lee Smith and Julian Clarke?
Lee Smith is a suuuuuper nice guy. He always had great stories from past films, and he’s a very good editor. I’m glad he got the Oscar for Dunkirk; he’s done a lot of great work.

Julian is also great to work with. I've worked with him on Elysium, Chappie and Altered Carbon. He likes to cut with a lot of sound, so it's fun to work with him. I love cutting sound, and on Altered Carbon we had over 60 tracks. It was an alternating stereo setup, and we used all the tracks possible.

Altered Carbon

It was such a fun world to create sound for. Everything that could make a sound we put in. We also invented signature sounds for the tech we hoped they’d use in the final. And they did for some things.

Was that a 5.1 temp mix? Have you ever done one?
No. I want to do a 5.1 Avid mix. Looks fun.

What was the schedule like on Altered Carbon? How was that different than some of the features you’ve worked on?
It was six-day weeks and 12 hours a day. Usually one week per month I'd trade off with the 2nd assistant and she'd let me have an actual weekend. It was a bit of a grind. I worked on Episodes 2, 3 and 8, and the schedules for those were tight, but somehow we got through it all. We had a great editorial team up here in Vancouver, and they were also cutting in LA. It was pretty much non-stop editing the whole way through.

How involved was Netflix in terms of the notes process? Were you working with the same editors on the episodes you assisted?
Yes, all episodes were with Julian. First it went through Skydance notes, then Netflix. Skydance usually had more as they were the first to see the cuts. There were many versions for sure.

What was it like working with Neill Blomkamp?
It was awesome. He makes cool films, and it’s great to see footage like that. I love shooting guns, explosions, swords and swearing. I beat him in ping-pong once. I danced around in victory and he demanded we play again. I retired. One of the best environments I’ve ever worked in. Elysium was my favorite gig.

What’s the largest your crew has gotten in post?
Usually one or two editors, up to four assistants, a PA, a post super — so eight or nine, depending.

Do you prefer working with a large team or do you like smaller films?
I like the larger team. It can all be pretty overwhelming, and the more people you have there to help out, the easier it is to get through. The more the merrier!

Altered Carbon

How do you handle long-ass days?
Long days aren’t bad when you have something to do. On Altered Carbon I kept a skateboard in my car for those times. I just skated around the studio waiting for a text. Recently I purchased a Onewheel (a skateboard with one wheel) and plan to use it to commute to work as much as possible.

How do you navigate the politics of a cutting room?
Politics can be tricky. I usually try to keep out of things unless I’m asked, but I do like to have a sit down or a discussion of what’s going on privately with the editor or post super. I like to be aware of what’s coming, so the rest of us are ready.

Do you prefer features to TV?
It doesn’t matter anymore because the good filmmakers work in both mediums. It used to be that features were one thing and TV was another, with less complex stories. Now that’s different, and at times it’s the opposite. Features usually pay more, though again that’s changing. I still think features are where it’s at, but that’s just vanity talking.

Sometimes your project posts in Vancouver but moves to LA for finishing. Why? Does it ever come back?
Mostly I think it’s because that’s where the director/producers/studio lives. After it’s shot everyone just goes back home. Home is usually LA or NY. I wish they’d stay here.

How long do you think you’ll continue being an AE? Until you retire? What age do you think that’ll be?
No idea; I just want to keep working on projects that excite me.

Would you ever want to be an editor or do you think you’d like to pivot to VFX, or are you happy where you are?
I only hope to keep learning and doing more. I like the VFX editing, I like assisting and I like being creative. As far as cutting goes, I’d like to get on a cool series as a junior editor or at least start doing a few scenes to get better. I just want to keep advancing, I’d love to do some VR stuff.

What’s next for you project wise?
I’m on a Disney show called Timmy Failure. I can’t say anything more at this point.

What advice do you have for other assistant editors trying to come up?
It’s going to take a lot longer than you think to become good at the job. Being the only assistant does not make you a qualified first assistant. It took me 10 years to get there. Also you never stop learning, so always be open to another approach. Everyone does things differently. With Murch on Tomorrowland, it was a whole new way of doing things that I had never seen before, so it was interesting to learn, although it was very intimidating at the start.


Jeremy Presner is an Emmy-nominated film and television editor residing in New York City. Twenty years ago, Warren was AE on his first film. Since then he has cut such diverse projects as Carrie, Stargate Atlantis, Love & Hip Hop and Breaking Amish.

Review: iOgrapher Multi Case for mobile filmmaking

By Brady Betzel

Thanks to the amazing iPhone X, Google Pixel and Samsung Galaxy, almost everyone has a high-end video camera on their person at all times, and this is helping spur on mobile filmmaking and vlogging.

From YouTube to Instagram to movies like Unsane (Steven Soderbergh) or Tangerine (Sean Baker) — and regardless of whether you think a $35,000 camera setup tells a story better than a $1,000 cell phone (looking at you, Apple iPhone XS Max) — mobile filmmaking is here to stay and will only get better.

iOgrapher’s latest release is the iOgrapher Multi Case, a compact mobile filmmaking mounting solution that works with today’s most popular phones. iOgrapher has typically created solutions that were tied to the mobile device being used for filmmaking, such as an iPhone, the latest Samsung Galaxy phones, iPads or even action cameras like a GoPro Hero 7 Black.

With the new iOgrapher Multi Case you can fit any mobile device that measures more than 5½” x 2¼” and less than 6½” x 3⅜”. Unfortunately, you won’t be fitting an iPad or a GoPro in the iOgrapher Multi Case, but don’t fret! iOgrapher makes rigs for those as well. On the top of the Multi Case are two cold shoe mounts for lights, microphones or any other device, like a GoPro. To mount anything with a ¼”-20 screw thread in the cold shoes, you will need a cold shoe to ¼”-20 adapter, which is available on iOgrapher’s accessory page. You can also find these at Monoprice or Amazon for cheap.

And if you are looking to order more mounts, you may want to order some extra cold shoe adapters that can be mounted on the handles of the iOgrapher Multi Case in the additional ¼”-20 screw mounts. The mounts on the handles are great for adding additional lighting or microphones. I’ve even found that if you are doing some behind-the-scenes filming or need another angle for your shooting, a small camera like a GoPro can be easily mounted and angled. With all this mounting, you should assume you’ll be using the iOgrapher on a sturdy tripod. Just for fun, I mounted the iOgrapher Multi Case onto a GoPro 3-Way Grip, which can also be used as a light tripod. It wasn’t exactly stable, but it worked. I wouldn’t suggest using it for more than an emergency shooting situation though.

On the flip side (all pun intended), the iOgrapher can be solidly mounted vertically with the ¼”-20 screw mounts on the handles. With Instagram making headway with vertical video in its Instagram Stories, iOgrapher took that idea and built it into the Multi Case, no doubt prompting more grumbling from the old folks who just don’t get vertical video.

Testing
I tried out both a Samsung Galaxy S8+ and an iPhone 7+, with their cases on, inside the iOgrapher Multi Case. Both fit. The iPhone 7+ was stretching the boundaries of the Multi Case, but it did fit and worked well. Phones are held in place by a spring-loaded bottom piece: you push the bottom of the device into the covered corner slots until the top (or, if you are shooting vertically, the left side) can be secured under the opposite edge of the Multi Case. It’s really easy.

I was initially concerned with the spring loading of the case; I wasn’t sure if the springs would be resilient enough to handle the constant pulling in and out of the phones, but the springs are high quality and held up beautifully. I even tried inserting my mobile phones tons of times and didn’t notice any issues with the springs or my phones.

Take care when inserting your phone into the Multi Case if you have a protective shield on the screen of your device. If you aren’t extra careful it can pull or snag on the cover — especially with the tight fit of a case. Just pay attention and there will be nothing to worry about. The simple beauty of the iOgrapher is that with a wider grip of your filmmaking device, you have a larger area to distribute any shaking coming from your hands, essentially helping stabilize your filmmaking without the need for a full-fledged gimbal.

If you accidentally drop your iOgrapher you may get a scratch, but for the most part they are built sturdy and can withstand punishment, whether it’s from your four-year-old or from the weather. If you want to get a little fancy, you can buy affordable lights like the Litra Torch (check out my review) to attach to the cold shoe mounts, or even a Rode microphone (don’t forget the TRS to TRRS adapter if you are plugging into an iPhone), and you are off and running.

Summing Up
I have been really intrigued with iOgrapher’s products since day one. They are an affordable and sturdy way to jump into filmmaking using cameras everyone carries with them every day: their phones.

Whether you are a high school student looking to get steady and professional mobile video, or a journalist looking for a quick way to make the most of your shots with just a phone, light, mic and tripod mount, the iOgrapher Multi Case will unlock your mobile filmmaking potential.

The iOgrapher Multi Case is a very durable protective case for your mobile filmmaking devices and a steal at $79.99. If you are a parent who is looking for an inexpensive way to spark your child’s interest in video, take a look at www.iographer.com and grab a few accessories, like a Manfrotto light and a Rode VideoMicro, to add some subtle lighting and pick up the best quality audio.

Make sure to check out iOgrapher creator Dave Basulto’s demo of the iOgrapher Multi Case, including trying out the fit of different phones.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Catching up with Aquaman director James Wan

By Iain Blair

Director James Wan has become one of the biggest names in Hollywood thanks to the $1.5 billion-grossing Furious 7, as well as the Saw, Conjuring and Insidious films — three of the most successful horror franchises of the last decade.

Now the Malaysian-born, Australian-raised Wan, who also writes and produces, has taken on the challenge of bringing Aquaman and Atlantis to life. The origin story of half-surface dweller, half-Atlantean Arthur Curry stars Jason Momoa in the title role. Amber Heard plays Mera, a fierce warrior and Aquaman’s ally throughout his journey.

James Wan and Iain Blair

Additional cast includes Willem Dafoe as Vulko, counselor to the Atlantean throne; Patrick Wilson as Orm, the present King of Atlantis; Dolph Lundgren as Nereus, King of the Atlantean tribe Xebel; Yahya Abdul-Mateen II as the revenge-seeking Manta; and Nicole Kidman as Arthur’s mom, Atlanna.

Wan’s team behind the scenes included such collaborators as Oscar-nominated director of photography Don Burgess (Forrest Gump), his five-time editor Kirk Morri (The Conjuring), production designer Bill Brzeski (Iron Man 3), visual effects supervisor Kelvin McIlwain (Furious 7) and composer Rupert Gregson-Williams (Wonder Woman).

I spoke with the director about making the film, dealing with all the effects, and his workflow.

Aquaman is definitely not your usual superhero. What was the appeal of doing it? 
I didn’t grow up with Aquaman, but I grew up with other comic books, and I always was well aware of him as he’s iconic. A big part of the appeal for me was he’d never really been done before — not on the big screen and not really on TV. He’s never had the spotlight before. The other big clincher was this gave me the opportunity to do a world-creation film, to build a unique world we’ve never seen before. I loved the idea of creating this big fantasy world underwater.

What sort of film did you set out to make?
Something that was really faithful and respectful to the source material, as I loved the world of the comic book once I dove in. I realized how amazing this world is and how interesting Aquaman is. He’s bi-racial, half-Atlantean, half-human, and he feels he doesn’t really fit in anywhere at the start of the film. But by the end, he realizes he’s the best of both worlds and he embraces that. I loved that. I also loved the fact it takes place in the ocean so I could bring in issues like the environment and how we treat the sea, so I felt it had a lot of very cool things going for it — quite apart from all the great visuals I could picture.

Obviously, you never got the Jim Cameron post-Titanic memo — never, ever shoot in water.
(Laughs) I know, but to do this we unfortunately had to get really wet, as over two-thirds of the film is set underwater. The crazy irony of all this is that when people are underwater they don’t look wet. It’s only when you come out of the sea or pool that you’re glossy and dripping.

We did a lot of R&D early on, and decided that shooting underwater looking wet wasn’t the right look anyway, plus they’re superhuman and are able to move in water really fast, like fish, so we adopted the dry-for-wet technique. We used a lot of special rigs for the actors, along with bluescreen, and then combined all that with a ton of VFX for the hair and costumes. Hair is always a big problem underwater, as like clothing it behaves very differently, so we had to do a huge amount of work in post in those areas.

How early on did you start integrating post and all the VFX?
It’s that kind of movie where you have to start post and all the VFX almost before you start production. We did so much prep, just designing all the worlds and figuring out how they’d look, and how the actors would interact with them. We hired an army of very talented concept artists, and I worked very closely with my production designer Bill Brzeski, my DP Don Burgess and my visual effects supervisor Kelvin McIlwain. We went to work on creating the whole look and trying to figure out what we could shoot practically with the actors and stunt guys and what had to be done with VFX. And the VFX were crucial in dealing with the actors, too. If a body didn’t quite look right, they’d just replace them completely, and the only thing we’d keep was the face.

It almost sounds like making an animated film.
You’re right, as over 90% of it was VFX. I joke about it being an animated movie, but it’s not really a joke. It’s no different from, say, a Pixar movie.

Did you do a lot of previs?
A lot, with people like Third Floor, Day For Nite, Halon, Proof and others. We did a lot of storyboards too, as they are quicker if you want to change a camera angle, or whatever, on the fly. Then I’d hand them off to the previs guys and they’d build on those.

What were the main technical challenges in pulling it all together on the shoot?
We shot most of it Down Under, near Brisbane. We used all nine of Village Roadshow Studios’ soundstages, including the new Stage 9, as we had over 50 sets, including the Atlantis Throne Room and Coliseum. The hardest thing in terms of shooting it was just putting all the actors in the rigs for the dry-for-wet sequences; they’re very cumbersome and awkward, and the actors are also in these really outrageous costumes, and it can be quite painful at times for them. So you can’t have them up there too long. That was hard. Then we used a lot of newish technology, like virtual production, for scenes where the actors are, say, riding creatures underwater.

We’d have it hooked up to the cameras so you could frame a shot and actually see the whole environment and the creature the actor is supposed to be on — even though it’s just the actors and bluescreen and the creature is not there. And I could show the actors — look, you’re actually riding a giant shark — and also tell the camera operator to pan left or right. So it was invaluable in letting me adjust performance and camera setups as we shot, and all the actors got an idea of what they were doing and how the VFX would be added later in post. Designing the film was so much fun, but executing it was a pain.

The film was edited by Kirk Morri, who cut Furious 7, and worked with you on the Insidious and The Conjuring films. How did that work?
He wasn’t on set but he’d visit now and again, especially when we were shooting something crazy and it would be cool to actually see it. Then we’d send dailies and he’d start assembling, as we had so much bluescreen and VFX stuff to deal with. I’d hop in for an hour or so at the end of each day’s shoot to go over things as I’m very hands on — so much so that I can drive editors crazy, but Kirk puts up with all that.

I like to get a pretty solid cut from the start. I don’t do rough assemblies. I like to jump straight into the real cut, and that was so important on this because every shot is a VFX shot. So the sooner you can lock the shot, the better, and then the VFX teams can start their work. If you keep changing the cut, then you’ll never get your VFX shots done in time. So we’d put the scene together, then pass it to previs, so you don’t just have actors floating in a bluescreen, but they’re in Atlantis or wherever.

Where did you do the post?
We did most of it back in LA on the Warner lot.

Do you like the post process?
I absolutely love it, and it’s very important to my filmmaking style. For a start, I can never give up editing and tweaking all the VFX shots. They have to pull it away from me, and I’d say that my love of all the elements of the post process — editing, sound design, VFX, music — comes from my career in suspense movies. Getting all the pieces of post right is so crucial to the end result and success of any film. This post was creatively so much fun, but it was long and hard and exhausting.

James Wan

All the VFX must have been a huge challenge.
(Laughs) Yes, as there’s over 2,500 VFX shots and we had everyone working on it — ILM, Scanline, Base, Method, MPC, Weta, Rodeo, Digital Domain, Luma — anyone who had a computer! Every shot had some VFX, even the bar scene where Arthur’s with his dad. That was a set, but the environment outside the window was all VFX.

What was the hardest VFX sequence to do?
The answer is, the whole movie. The trench sequence was hard, but Scanline did a great job. Anything underwater was tough, and then the big final battle was super-difficult, and ILM did all that.

Did the film turn out the way you hoped?
For the most part, but like most directors, I’m never fully satisfied.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

DevinSuperTramp: The making of a YouTube filmmaker

Devin Graham, aka DevinSuperTramp, made the unlikely journey from BYU dropout to a viral YouTube sensation who has over five million followers. After leaving school, Graham went to Hawaii to work on a documentary. The project soon ran out of money and he was stuck on the island… feeling very much a dropout and a failure. He started making fun videos with his friends to pass the time, and DevinSuperTramp was born. Now he travels, filming his view of the world, taking on daring adventures to get his next shot, and risking life and limb.

Shooting while snowboarding behind a trackhoe with a bunch of friends for a new video.

We recently had the chance to sit down with Graham to hear firsthand what lessons he’s learned along his journey, and how he’s developed into the filmmaker he is today.

Why extreme adventure content?
I grew up in the outdoors — always hiking and camping with my dad, and snowboarding. I’ve always been intrigued by pushing human limits. One thing I love about the extreme thing is that everyone we work with is the best at what they do. Like, we had the world’s best scooter riders. I love working with people who devote their entire lives to this one skillset. You get to see that passion come through. To me, it’s super inspiring to show off their talents to the world.

How did you get DevinSuperTramp off the ground? Pun intended.
I’ve made movies ever since I can remember. I was a little kid shooting Legos and stop-motion with my siblings. In high school, I took photography classes, and after I saw the movie Jurassic Park, I was like, “I want to make movies for a living. I want to do the next Jurassic Park.” So, I went to film school. Actually, I got rejected from the film program the first time I applied, which made me volunteer for every film thing going on at the college — craft service, carrying lights, whatever I could do. One day, my roommate was like, “YouTube is going to be the next big thing for videos. You should get on that.”

And you did.
Well, I started making videos just kind of for fun, not expecting anything to happen. But it blew up. Eight years later, it’s become the YouTube channel we have now, with five million subscribers. And we get to travel around the world creating content that we love creating.

Working on a promo video for Recoil – all the effects were done practically.

And you got to bring it full circle when you worked with Universal on promoting Fallen Kingdom.
I did! That was so fun and exciting. But yeah, I was always making content. I didn’t wait ‘til after I graduated. I was constantly looking for opportunities and networking with people from the film program. I think that was a big part of succeeding at that time: just looking for every opportunity and milking it for everything I could.

In the early days, how did you promote your work?
I was creating all my stuff on YouTube, which, at that time, had hardly any solid, quality content. There was a lot of content, but it was mostly shot on whatever smartphone people had, or it was just people blogging. There wasn’t really anything cinematic, so right away our stuff stood out. One of the first videos I ever posted ended up getting like a million views right away, and people all around the world started contacting me, saying, “Hey, Devin, I’d love for you to shoot a commercial for us.” I had these big opportunities right from the start, just by creating content with my friends and putting it out on YouTube.

Where did you get the money for equipment?
In the beginning, I didn’t even own a camera. I just borrowed some from friends. We didn’t have any fancy stuff. I was using a Canon 5D Mark II and the Canon T2i, which are fairly cheap cameras compared to what we’re using now. But I was just creating the best content I could with the resources I had, and I was able to build a company from that.

If you had to start from scratch today, do you think you could do it again?
I definitely think it’s 100 percent doable, but I would have to play the game differently. Even now we are having to play the game differently than we did six months ago. Social media is hard because it’s constantly evolving. The algorithms keep changing.

Filming in Iceland for an upcoming documentary.

What are you doing today that’s different from before?
One thing is just using trends and popular things that are going on. For example, a year and a half ago, Pokémon Go was very popular, so we did a video on Pokémon and it got 20 million views within a couple weeks. We have to be very smart about what content we put out — not just putting out content to put out content.

One thing that’s always stayed true since the beginning is consistent content. When we don’t put out a video weekly, it actually hurts our content being seen. The famous people on YouTube now are the ones putting out daily content. For what we’re doing, that’s impossible, so we’ve sort of shifted platforms from YouTube, which was our bread and butter. Facebook is where we push our main content now, because Facebook doesn’t favor daily content. It just favors good-quality content.

Teens will be the first to say that grown-ups struggle with knowing what’s cool. How do you chase after topics likely to blow up?
A big one is going on YouTube and seeing what videos are trending. Also, if you go to Google Trends, it shows you the top things that were searched that day, that week, that month. So, it’s being on top of that. Or, maybe, Taylor Swift is coming out with a new album; we know that’s going to be really popular. Just staying current with all that stuff. You can also use Facebook, Twitter and Instagram to get an idea of what people are really excited about.

Can you tell us about some of the equipment you use, and the demands that your workflow puts on your storage needs?
We shoot so much content. We own two Red 8K cameras that we film everything with, and we’re shooting daily for the most part. On an average week, we’re shooting about eight terabytes, and then backing that up — so 16 terabytes a week. Obviously, we need a lot of storage, and we need storage that we can access quickly. We’re not putting it on tape. We need to pull stuff up right there and start editing on it right away.

So, we need the biggest drives that are as fast as possible. That’s why we use G-Tech’s 96TB G-Speed Shuttle XL towers. We have around 10 of those, and we’ve been shooting with those for the last three to four years. We needed something super reliable. Some of these shoots involve forking out a lot of money. I can’t take a hard drive and just hope it doesn’t fail. I need something that never fails on me — like ever. It’s just not worth taking that risk. I need a drive I can completely trust and is also super-fast.

What’s the one piece of advice that you wish somebody had given you when you were starting out?
In my early days, I didn’t have much of a budget, so I would never back up any of my footage. I was working on two really important projects and had them all on one drive. My roommate knocked that drive off the table, and I lost all that footage. It wasn’t backed up. I only had little bits and pieces still saved on the card — enough to release it, but a lot of people wanted to buy the stock footage and I didn’t have most of the original content. I lost out on a huge opportunity.

Today, we back up every single thing we do, no matter how big or how small it is. So, if I could do my early days over again, even if I didn’t have all the money to fund it, I’d figure out a way to have backup drives. That was something I had to learn the hard way.

NAB NY: A DP’s perspective

By Barbie Leung

At this year’s NAB New York show, my third, I was able to wander the aisles in search of tools that fit into my world of cinematography. Here are just a few things that caught my eye…

Blackmagic, which had a large booth at the entrance to the hall, was giving demos of its Resolve 15, among other tools. Panasonic also had a strong presence mid-floor, with an emphasis on the EVA-1 cameras. As usual, B&H attracted a lot of attention, as did Arri, which brought a couple of Arri Trinity rigs to demo.

During the HDR Video Essentials session, colorist Juan Salvo of TheColourSpace talked about the emerging HDR10+ standard proposed by Samsung and Amazon Video. Also mentioned was the trend of consumer displays getting brighter every year and the impact that has on content creation and content grading. Salvo pointed out the affordability of LG’s C7 OLEDs (about 700 nits) for use as client monitors, while Flanders Scientific (which had a booth at the show) remains the expensive standard for grading. It was interesting to note that LG, while being the show’s Official Display Partner, was conspicuously absent from the floor.

Many of the panels and presentations unsurprisingly focused on content monetization — how to monetize faster and cheaper. Amazon Web Services’ stage sessions emphasized various AWS Elemental technologies, including using facial recognition algorithms to automate the creation of highlight clips for content like sports video, generating closed captioning, and improving the streaming experience onboard airplanes. The latter will ultimately make content delivery a streamlined enough process for airlines that it could enable advertisers to enter this currently untapped space.

Editor Janis Vogel, a board member of the Blue Collar Post Collective, spoke at the #galsngear “Making Waves” panel, and noted the progression toward remote work in her field. She highlighted the fact that DaVinci Resolve, which had already made it possible for color work to be done remotely, is now also making it possible for editors to collaborate remotely. The ability to work remotely gives professionals the choice to work outside of the expensive-to-live-in major markets, which is highly desirable given that producers are trying to make more and more content while keeping budgets low.

Speaking at the same panel, director of photography/camera operator Selene Richholt spoke to the fact that crews are being squeezed, with content producers either asking production and post pros to provide standard service at substandard rates or asking for more services without paying more.

On a more exciting note, she cited recent 9×16 projects that she has shot with the camera mounted vertically (as opposed to shooting 16×9 and cropping in) in order to take full advantage of lens properties. She looks forward to the trend of more projects that can mix aspect ratios and push aesthetics.

Well, that’s it for this year. I’m already looking forward to next year.

 


Barbie Leung is a New York-based cinematographer and camera operator working in film, music video and branded content. Her work has played Sundance, the Tribeca Film Festival, Outfest and Newfest. She is also the DCP mastering technician at the Tribeca Film Festival.

Telestream’s Wirecast now integrated in BoxCast platform

BoxCast has completed the integration of Telestream Wirecast with its BoxCast platform. Telestream Wirecast is live video production software for Mac or Windows that helps create high-quality live webcasts from sources ranging from webcams and screen shares to multi-camera setups with graphics and media for live events.

As a result of the BoxCast/Wirecast integration, users can now easily stream high-quality video using BoxCast’s advanced, cloud-based platform. With unlimited streaming, viewership and destinations, BoxCast manages the challenging part of live video streaming.

The BoxCast Live Streaming Platform provides Wirecast users access to a number of features, including:
• Single Source Simulcasting
• Ticketed Monetization
• Password Protection
• Video Embedding
• Cloud Transcoding
• Live Support

How does it work? Using BoxCast’s RTMP video ingestion option, users can select BoxCast as a streaming destination from within Wirecast. This allows Wirecast to stream directly to BoxCast: the computer encodes the video and audio, and Wirecast transmits the stream over RTMP.
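For readers unfamiliar with RTMP ingest, the flow described above is the same one any RTMP-capable encoder follows: encode the video and audio locally, then push the stream to the platform’s ingest URL with a stream key. As a generic, hypothetical sketch (the server URL and stream key below are placeholders, not real BoxCast endpoints — the actual values come from the streaming platform’s dashboard), the same kind of push could be done with ffmpeg:

```shell
# Hypothetical example: push a local file as a live RTMP stream.
# The ingest URL and stream key are placeholders; substitute the
# values your streaming platform provides.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -preset veryfast -b:v 4500k -g 60 \
  -c:a aac -b:a 128k -ar 44100 \
  -f flv "rtmp://ingest.example.com/live/your-stream-key"
```

The `-re` flag throttles file reading to realtime so the stream plays at normal speed, and `-f flv` is needed because RTMP carries FLV-muxed streams.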

The setup can be used with either a single-use RTMP channel or a static RTMP channel. However, in both cases, the setup must be done within 10 minutes of a scheduled broadcast.

Another way to stream from Wirecast is to send the Wirecast program output to a secondary HDMI or SDI output that is plugged into the BoxCaster or BoxCaster Pro. The BoxCaster’s hardware encoding relieves your computer of encoding the video and audio, while taking advantage of specially designed communication protocols to optimize your available network connectivity.

BoxCast integration with Telestream Wirecast is available immediately.

Review: OConnor camera assistant bag

By Brady Betzel

After years and years of gear acquisition, I often forget to secure proper bags and protection for my equipment. I’ve used everything from Pelican cases to the cheapest camera bags, and a truly high-quality bag will extend the life of your equipment.

In this review I am going to go over a super-heavy-duty assistant camera bag by OConnor, which is part of the Vitec Group. While the Vitec Group provides many different products — from LED lighting to robotic camera systems — OConnor is typically known for their professional fluid heads and tripods. This camera bag is made to fit not only their products, but also other gear, such as pan bars and ARRI plates. The OConnor AC bag is a no-nonsense camera and accessory bag with Velcro-secured, repositionable inserts that will accommodate most cameras and accessories you have.

As soon as I opened the box and touched the AC bag I could tell it was high quality. The bag exterior is waterproof and easily wipeable. But, more importantly, there is an internal water- and dust-proof liner that allows the lid to stay hinged open, keeping equipment close at hand while the liner is fully zipped. This internal waterproofing is resistant up to a 1.2m/4-ft. column of water. Once I got past the quality of materials, my second inspection focused on the zippers. If I have a camera bag with bad zippers or snaps, it usually gets given away or tossed, but the AC bag has strong, easy-gliding zippers.

On the lid and inside of the front pockets are extremely tough, see-through mesh pockets for everything from batteries to memory cards. On the front is a business card/label holder. Around the outside are multiple pockets with fixing points for carabiner hooks. In addition, there are D-rings for the included leather strap if you want to carry this bag over your shoulder instead of using the handles. The bag comes with five dividers that Velcro to the inside, including two right-angle dividers. The dividers are made to securely tie down all OConnor heads and accessories. Finally, the AC bag comes with a separate pouch for quick access on set.

Summing Up
In the end, the OConnor AC bag is a well-made and roomy bag that will protect your camera gear and accessories from dust as well as water for $375. The inside measures 18x12x10.5 inches, while the outside measures 22×14.5×10.5 inches and has been designed to fit inside a Pelicase 1620. You can check out the OConnor AC bag on the company’s website and find a dealer in your area.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Review: The Litra Torch for pro adventure lighting

By Brady Betzel

If you are on Instagram you’ve definitely seen your fair share of “adventure” photography and video. Typically, it’s those GoPro-themed action-adventure shots of someone cliff diving off a million-mile-high waterfall. I definitely get jealous. Nonetheless, one thing I love about GoPro cameras is their size. They are small enough to fit in your pocket, and they will reliably produce a great image. Where those actioncams suffer is low-light performance. While it is getting better every day, you just can’t pull a reliably clean, noise-free image from a camera sensor so small. This is where actioncam lights come into play as a perfect companion, including the Litra Torch.

The Litra Torch is an 800-Lumen, 1.5-by-1.5-inch magnetic light. I first started seeing the tiny-light trend on Instagram, where people were shooting slow-shutter photos at night while painting certain objects with a tiny bit of light. Check out Litra on Instagram (@litragear) to see some of the incredible images people are producing with this tiny light. I saw an action sports person showing off some incredible nighttime pictures using the GoPro Hero. He mentioned in the post that he was using the Litra Torch, so I immediately contacted Litra, and here I am reviewing the light. Litra sent me the Litra Paparazzi Bundle, which retails for $129.99. The bundle includes the Litra Torch, along with a filter kit and cold shoe mount.

The Litra Torch has four modes, all accessible by clicking the button on top of the light: 800-Lumen brightness, 450 Lumens, 100 Lumens and flashing. The Torch has a consistent color temperature of 5700 Kelvin; essentially, the light is a crisp white, right in between blue and yellow. The rechargeable lithium-ion battery charges via the micro USB cable and will last 30 minutes or more depending on the brightness selected. With a backup battery attached, you could be going for hours.

Over a month with intermittent use I only charged it once. One night I had to check out something under the hood of my car and used the Litra Torch to see what I was doing. It is very bright and when I placed the light onto the car I realized it was magnetic! Holy cow. Why doesn’t GoPro put magnets into their cameras for mounting! The Torch also has two ¼-20 camera screw mounts so you can mount them just about anywhere. The construction of the Torch is amazing — it is drop-proof, waterproof and made of a highly resilient aluminum. You can feel the high quality of the components the first time you touch the Torch.

In addition to the Torch itself, the cold shoe mount and diffuser, the Paparazzi Bundle comes with the photo filter kit. The kit includes five frames to mount the color filters onto the Torch; three sets of Rosco tungsten 4600K filters; three sets of Rosco tungsten 3200K filters; one white diffuser filter; and one each of a red, yellow and green color filter. Essentially, they give you a cheap way to change white balance temperatures, plus some awesome color filters to play around with. I can really see the benefit of having at least two, if not three, Litra Torches in your bag with the filter sets; you could easily set up a properly lit product shoot or even a headshot session with nothing more than three tiny Torch lights.

Putting It To The Test
To test out the light in action, I asked my son to set up a Lego scene for me. One hour later I had some Lego models to help me out. I always love seeing people’s Lego scenes on Instagram, so I figured this would also be a good way to show off the light and the extra color filters sent in the Paparazzi Bundle. One thing I discovered is that I would love to have a slide-in filter holder built onto the light; it would definitely help me avoid wasting time popping filters into frames.

All in all, this light is awesome. The only problem is I wish I had three so I could do a full three-point lighting setup. However, with some natural light and one Litra Torch I had enough to pull off some cool lighting. I really liked the Torch as a colored spotlight; you can get that blue or red shade on different objects in a scene quickly.

Summing Up
In the end, the Litra Torch is an amazing product. In the future, I would really love to see multiple white balance temperatures built into the Torch without having to use photo filters. A really exciting, but probably expensive, prospect would be building in a Bluetooth connection and multiple colors. Better yet, make this light a full-color-spectrum, app-enabled light… oh wait, they just announced the Litra Pro on Kickstarter. You should definitely check that out as well, with its advanced options and color profile.

I am spoiled by all of those at-home lights, like the LIFX brand, that change to any color you want, so I’m greedy and want those in a sub-$100 light. But those are just wishes — the Litra Torch is a must-have for your toolkit in my opinion. From mounting it on top of my Canon DSLR using the cold shoe mount, to sticking it magnetically in unique places, to using the screw mount to attach it to a tripod — the Litra Torch is a game changer for anyone used to lugging around a 100-pound light kit, which makes the new Litra Pro Kickstarter so enticing.

Check out their website for more info on the Torch and new Litra Pro, as well as a bunch of accessories. This is a must-have for any shooter looking to carry a tiny but powerful light anywhere, especially for summer and the outdoors!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Behind the Title: Steelhead MD Ted Markovic

NAME: Ted Markovic

COMPANY: LA-based Steelhead

CAN YOU DESCRIBE YOUR COMPANY?
We are a content studio and cross-platform production company. You can walk through our front door with a script and out the back with a piece of content. We produce everything from social to Super Bowl.

WHAT’S YOUR JOB TITLE?
Managing Director

WHAT DOES THAT ENTAIL?
I am responsible for driving the overall culture and financial health of the organization. That includes building strong client relationships, new business development, operational oversight, marketing, recruiting and retaining talent and managing the profits and losses of all departments.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
We all have a wide range of responsibilities and wear many hats. I occasionally find myself replacing the paper towels in the bathrooms because some days that’s what it takes.

WHAT’S YOUR FAVORITE PART OF THE JOB?
We are a very productive group that produces great work. I get a sense of accomplishment almost every day.

WHAT’S YOUR LEAST FAVORITE?
Replacing the paper towels in the bathrooms.

WHAT IS YOUR MOST PRODUCTIVE TIME OF THE DAY?
I get a lot more done while everyone else is busy eating their lunch or driving home.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Solving the traffic problem in Los Angeles. I see a lot of opportunities there.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I am a third-generation post production executive, and essentially grew up in a film lab in New York. I suspect the profession chose me.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I am currently working on a Volkswagen Tier 2 project where we are shooting six cars over seven days on our stage at Steelhead. We’re incorporating dynamic camera shots of cars on a cyc with kinetic typography, motion graphics and VFX. It’s a great example of how we can do it all under one roof.

We recently worked with Nintendo and Interogate to bring the new Switch games to life in a campaign called Close Call. On set with rams, air mortars, lighting effects and lots of sawed-in-half furniture, we were able to create real weight in-camera to layer with our VFX. We augmented the practical effects with HDR light maps, fire and debris simulations, as well as procedurally generated energy beams, 3D models and 2D compositing to create a synergy between the practical and visual effects that really sells the proximity and sense of danger we were looking to create.

While the coordination of practical and post was no small chore, another interesting challenge we had to overcome was creating the CG weapons to mesh with the live-action plates. We started with low-resolution models directly from the games themselves, converted them and scrubbed in a good layer of detail and refined them to make them photoreal. We also had to conceptualize how some of the more abstract weapons would play with real-world physics.

Another project worth mentioning was a piece we created for Volkswagen called Strange Terrains. The challenge was to create 360-degree timelapse video from day to night — something that had never been done before. To get this unique footage, we had to build an equally unique rigging system. We partnered with Supply Frame to design and build a custom-milled aluminum head to support four 50.6-megapixel Canon EOS 5DS cameras.

The “holy grail” of timelapse photography is getting the cameras to ramp the exposure over broad light changes. This was especially challenging here due to the massive exposure changes in the sky and the harshness of the white salt. After capturing approximately 2,000 frames per camera — 9TB of working storage — we spent countless hours stitching, compositing, computing and rendering to get a fluid final product.
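The ramping problem described above comes down to smoothing per-frame exposure so brightness glides rather than steps. A minimal sketch, with made-up exposure values (this is the general technique, not Steelhead's actual pipeline):

```python
# "Holy grail" ramping, reduced to its core idea: smooth a per-frame exposure
# curve (in EV) so a day-to-night timelapse ramps fluidly instead of stepping.
# The EV numbers below are invented for illustration.

def smooth_exposure(evs, window=5):
    """Centered moving average over the exposure curve."""
    half = window // 2
    smoothed = []
    for i in range(len(evs)):
        lo, hi = max(0, i - half), min(len(evs), i + half + 1)
        smoothed.append(sum(evs[lo:hi]) / (hi - lo))
    return smoothed

# A crude day-to-night curve: bright (EV 15) stepping down to dark (EV -3)
# as the camera's metering jumps between exposure settings.
raw = [15] * 4 + [12] * 4 + [8] * 4 + [3] * 4 + [-3] * 4
ramped = smooth_exposure(raw)
```

Real deflicker/ramping tools fit much more sophisticated curves and compensate in the raw develop, but the principle is the same.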

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
About eight years ago, I created a video for my parents’ 40th wedding anniversary. My mother still cries when she watches it.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The wheel is a pretty essential piece of technology that I’m not sure I could live without. My smartphone, as expected, as well as my Sleepwell device for apnea. That device changed my life.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
I can work listening to anything but reggae.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Exercise.

Behind the Title: Sim LA’s VP of Post LA Greg Ciaccio

Name: Greg Ciaccio

Company: Sim

Can you describe your company?
We’re a full-service company providing studio space, lighting and grip, cameras, dailies and finishing in Los Angeles, New York, Toronto, Vancouver and Atlanta with outposts in New Mexico and Texas.

What’s your job title?
VP, Post Los Angeles

What does that entail?
Essentially, I’m the GM of our dailies, rentals and finishing businesses — the second and third floors of our building, formerly Kodak Cinesite. The first floor houses our camera rental business.

What would surprise people the most about what falls under that title?
I coproduce our SimLab industry events with Bill Russell in our camera department.

What’s your favorite part of the job?
Having camera, dailies, editorial and finishing under one roof — the workflows that tie them all together provide meaningful solutions for our clients.

What’s your least favorite?
Like most facility heads, business constraints. There aren’t many, which is great, but running any successful company relies on managing the magic.

What is your favorite time of the day?
The early mornings when I can power through management work so I can spend time with staff and clients.

If you didn’t have this job, what would you be doing instead?
Probably a post sound mixer. I teach post production management one night a week at CSUN, so that provides a fresh perspective on my role in the industry.

How early on did you know this would be your path?
I really started back in the 4th grade in lighting. I then ran and designed lighting in high school and college, moving into radio-TV-film halfway through. I then moved into production sound. The move from production to post came out of a desire for (fairly) regular hours and consistent employment.

Can you name some recent projects you have worked on?
TV series: Game of Thrones, The Gifted, Krypton, The Son, Madam Secretary, Jane the Virgin. On the feature dailies and DI side: Amy Poehler’s Wine Country.

We’re also posting Netflix’s Best Worst Weekend Ever in ACES (Academy Color Encoding System) in UHD/Dolby Vision HDR.

Game of Thrones

What is the project that you are most proud of?
Game of Thrones. The quality bar which HBO has set is evident in the look of the show. It’s so well-produced — the production design, cinematography, editing and visual effects are stunning.

Name three pieces of technology that you can’t live without.
My iPhone X, my Sony Z9D HDR TV and my Apple Watch.

What social media channels do you follow?
Instagram for DP and other creative photography interests; LinkedIn for general social/influencer-driven news; Facebook for peripheral news and personal insights; and channels including ETCentric (USC ETC), ACES Central for ACES-related community info, and Digital Cinema Society for industry events.

Do you listen to music while you work? Care to share your favorite music to work to?
I listen to Pandora. The Thievery Corporation station.

What do you do to de-stress from it all?
Getting out for lunch and walking when possible. I visit our staff and clients throughout the day. Morning yoga. And the music helps!

Understanding and partnering on HDR workflows

By Karen Moltenbrey

Every now and then a new format or technology comes along that has a profound effect on post production. Currently, that tech is high dynamic range, or HDR, which offers a heightened visual experience through a greater dynamic range of luminosity.
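That "greater dynamic range of luminosity" is most commonly encoded with the SMPTE ST 2084 Perceptual Quantizer (PQ), the transfer function behind HDR10 and Dolby Vision. As a concrete reference point, here is the standard PQ decode from code value to absolute luminance:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a nonlinear code value in [0, 1] to absolute
# luminance in cd/m^2 (nits). These constants come from the standard itself.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Decode a PQ code value (0.0-1.0) to luminance in nits."""
    e = code ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)
```

Code value 1.0 decodes to the full 10,000-nit ceiling, while a mid-scale value of 0.5 lands around 92 nits — an illustration of how much of the signal PQ devotes to shadows and midtones.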

Michel Suissa

So why is HDR important to the industry? “That is a massive question to answer, but to make a pretty long story relatively short, it is by far one of the recent technologies to emerge with the greatest potential to change how images are affecting audiences,” says Michel Suissa, manager of professional solutions at The Studio–B&H. “Regardless of the market and the medium used to distribute programming, irrelevant to where and how these images are consumed, it is a clearly noticeable enhancement, and at the same time a real marketing gold mine for manufacturers as well as content producers, since a premium can be attached to offering HDR as a feature.”

And he should know. Suissa has been helping a multitude of post studios navigate the HDR waters in their quest for the equipment necessary to meet their high dynamic range needs.

Suissa started seeing a growing appetite for HDR roughly three years ago, both in the consumer and professional markets and at about the same time. “Three years ago, if someone had said they were creating HDR content, a very small percentage of the community would have known what they were talking about,” he notes. “Now, if you don’t know what HDR is and you’re in the industry, then you are probably behind the times.”

Nevertheless, HDR is demanding in terms of the knowledge one needs to create HDR content and distribute it, as well as make sure people can consume it in a way that’s satisfying, Suissa points out. “And there’s still a lot of technical requirements that people have to carefully navigate through because it is hardly trivial,” he says.

How does a company like B&H go about helping a post studio select the right tools for their individual workflow needs? “The basic yet critically important task is understanding their workflow, their existing tool set and what is expected of them in terms of delivery to their clients,” says Suissa.

To assist studios and content creators working in post, The Studio–B&H team follows a blueprint that’s based on engaging customers about the nature of the work they do, asking questions like: Which camera material do they work from? In which form is the original camera material used? What platform do they use for editing? What is the preferred application to master HDR images? What is the storage and network infrastructure? What are the master delivery specifications they must adhere to (what flavor of HDR)?

“People have the most difficulty understanding the nature of the workflow: Do the images need to be captured differently from a camera? Do they need to be ingested in the post system differently? Do they need to be viewed differently? Do they need to be formatted differently? Do they need to be mastered differently? All those things created a new set of specifications that people have to learn, and this is where it has changed the way people handle post production,” Suissa contends. “There’s a lot of intricacies, and you have to understand what it is you’re looking at in order to make sure you’re making the correct decisions — not just technically, but creatively as well.”

When adding an HDR workflow, studios typically approach B&H looking for equipment across their entire pipeline. However, Suissa states that similar parameters apply for HDR work as for other high-performance environments. People will continue to need decent workstations, powerful GPUs, professional storage for performance and increased capacity, and an excellent understanding of monitoring. “Other aspects of a traditional pipeline can sometimes remain in play, but it is truly a case-by-case analysis,” he says.

The most critical aspect of working with HDR is the viewing experience, Suissa says, so selecting an appropriate monitoring solution is vital — as is knowing the output specifications that will be used for final delivery of the content.

Without question, Suissa has seen an increase in the number of studios asking about HDR equipment of late. “Generally speaking, the demand by people wanting to at least understand what they need in order to deliver HDR content is growing, and that’s because the demand for content is growing,” he says.

Yes, there are compromises that studios are making in terms of HDR that are based on budget. Nevertheless, there is a tipping point that can lead to the rejection of a project if it is not up to HDR standards. In fact, Suissa foresees in the next six months or so the tightening of standards on the delivery side, whether for Amazon, Netflix or the networks, and the issuance of mandates by over-the-air distribution channels in order for content to be approved as HDR.

B&H/Light Iron Collaboration
Among the studios that have purchased HDR equipment from B&H is Light Iron, a Panavision company with six facilities spanning the US that offer a range of post solutions, including dailies and DI. According to Light Iron co-founder Katie Fellion, the number of their clients requesting HDR finishing has increased in the past year. She estimates that one out of every three clients is considering HDR finishing, and in some cases, they are doing so even if they don’t have distribution in place yet.

Suissa and Light Iron SVP of innovation Michael Cioni gradually began forging a fruitful collaboration during the last few years, partnering a number of times at various industry events. “At the same time, we doubled up on our relationship of providing technology to them,” Suissa adds, whether for demonstrations or for Light Iron’s commercial production environment.

Katie Fellion

For some time, Light Iron has been moving toward HDR, purchasing equipment from various vendors along the way. In fact, Light Iron was one of the very first vendors to become involved with HDR finishing when Amazon introduced HDR-10 mastering for the second season of one of its flagship shows, Transparent, in 2015.

“Shortly after Transparent, we had several theatrical releases that also began to remaster in both HDR-10 and Dolby Vision, but the requests were not necessarily the norm,” says Fellion. “Over the last three years, that has steadily changed, as more studios are selling content to platforms that offer HDR distribution. Now, we have several shows that started their Season 1 with a traditional HD finish, but then transitioned to 4K HDR finishes in order to accommodate these additional distribution platform requirements.”

Some of the more recent HDR-finished projects at Light Iron include Glow (Season 2) and 13 Reasons Why (Season 2) for Netflix, Uncle Drew for Lionsgate, Life Itself for Amazon, Baskets (Season 3) and Better Things (Season 2) for FX and Action Point for Paramount.

Without question, HDR is important to today’s finishing, but one cannot just step blindly into this new, highly detailed world. There are important factors to consider. For instance, the source requirements for HDR mastering — 4K 16-bit files — require more robust tools and storage. “A show that was previously shot and mastered in 2K or HD may now require three or four times the amount of storage in a 4K HDR workflow. Since older post facilities had been previously designed around a 2K/HD infrastructure, newer companies that had fewer issues with legacy infrastructure were able to adopt 4K HDR faster,” says Fellion. Light Iron was designed around a 4K+ infrastructure from day one, she adds, allowing the post house to much more easily integrate HDR at a time when other facilities were still transitioning from 2K to 4K.

Nevertheless, this adoption required changes to the post house’s workflow. Fellion explains: “In a theatrical world, because HDR color is set in a much larger color gamut than P3, the technically correct way to master is to start with the HDR color first and then trim down for P3. However, since HDR theatrical exhibition is still in its infancy, there are not options for most feature films to monitor in a projected environment — which, in a feature workflow, is an expected part of the finishing process. As a result, we often use color-managed workflows that allow us to master first in a P3 theatrical projection environment and then to version for HDR as a secondary pass.”
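In a color-managed workflow like the one Fellion describes, moving between gamuts ultimately reduces to linear-light matrix transforms. As one concrete, well-documented instance (the Rec.709-to-Rec.2020 matrix from ITU-R BT.2087 — the P3 versioning above follows the same pattern with different coefficients):

```python
# Gamut conversion as a 3x3 matrix on linear-light RGB. These coefficients are
# the Rec.709 -> Rec.2020 matrix published in ITU-R BT.2087; P3 conversions
# use the same mechanism with different values.

REC709_TO_REC2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert_rgb(rgb, matrix=REC709_TO_REC2020):
    """Apply a 3x3 gamut matrix to a linear-light RGB triple."""
    return tuple(sum(row[j] * rgb[j] for j in range(3)) for row in matrix)

# White is preserved: each matrix row sums to 1.0.
white = convert_rgb((1.0, 1.0, 1.0))
```

In practice this sits inside a larger color-management chain (transfer functions in and out, tone mapping for the trim pass), but the gamut step itself is this simple.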

Light Iron NY colorist Steven Bodner grading the music video Picture Day in HDR on a Sony BVM X300.

In the episodic world, if a project is delivering in HDR, unless creative preference determines otherwise, Light Iron will typically start with the HDR version first and then trim down for the SDR Rec.709 versions.

For either, versioning and delivery have to be considered. For Dolby Vision, this starts with an analysis of the timeline to output an XML for the 709 derivative, explains Fellion of Light Iron’s workflow. And then from that 709 derivative, the colorist will review and tweak the XML values as necessary, sometimes going back to the HDR version and re-analyzing if a larger adjustment needs to be made for the Rec.709 version. For an HDR-10 workflow, this usually involves a different color pass and delivered file set, as well as analysis of the final HDR sequence, to create metadata values, she adds.

Needless to say, embracing HDR is not without challenges. Currently, HDR is only used in the final color process, since there aren’t many workflows that support HDR throughout the dailies or editorial process, says Fellion. “This can certainly be a challenge to creatives who have spent the past few months staring at images in SDR only to have a different reaction when they first view them in HDR.” Also, in HDR there may be elements on screen that weren’t previously visible in SDR dailies or offline (such as outside a window, or production cables under a table), which creates new VFX requirements to adjust those elements.

“As more options are developed for on-set monitoring — such as Light Iron’s HDR Video Village System — productions are given an opportunity to see HDR earlier in the process and make mental and physical adjustments to help accommodate for the final HDR picture,” Fellion says.

Having an HDR monitor on set can aid in flagging potential issues that might not be seen in SDR. Currently, however, for dailies and editorial, HDR monitoring is not really used, according to Fellion, who hopes to see that change in the future. Conversely, in the finishing world, “an HDR monitor capable of a minimum 1,000-nit display, such as the Sony [BVM] X300, as well as a consumer-grade HDR UHD TV for client reviews, are part of our standard tool set for mastering,” she notes.

In fact, several months ago, Light Iron purchased new high-end HDR mastering monitors from B&H. The studio also sourced AJA Hi5 4K Plus converter boxes from B&H for its HDR workflow.

And, no doubt, there will be additional HDR equipment needs in Light Iron’s future, as delivery of HDR content continues to ramp up. But there’s a hefty cost involved in moving to HDR. Depending on whether a facility’s DI systems already had the capacity to play back 4K 16-bit files — a key requirement for HDR mastering — the cost can range from a few thousand dollars for a consumer-grade monitor to tens of thousands for professional reference monitoring, DI system, storage and network upgrades, as well as licensing and training for the Dolby Vision platform, according to Fellion.
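Why is 4K 16-bit playback the gating requirement? Some back-of-the-envelope math (my assumptions for frame size and frame rate, not Light Iron's figures) shows the sustained throughput involved:

```python
# Rough data-rate math for uncompressed 4K 16-bit RGB mastering files.
# Frame dimensions and frame rate are illustrative assumptions.

WIDTH, HEIGHT = 4096, 2160           # 4K DCI frame
CHANNELS, BYTES_PER_CHANNEL = 3, 2   # RGB at 16 bits per channel
FPS = 24

bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BYTES_PER_CHANNEL
gb_per_second = bytes_per_frame * FPS / 1e9

print(f"{bytes_per_frame / 1e6:.1f} MB/frame, {gb_per_second:.2f} GB/s at {FPS}fps")
```

Over a gigabyte per second of sustained, uninterrupted playback is well beyond what a 2K/HD-era SAN was ever provisioned for, which is why the storage and network upgrades Fellion mentions dominate the cost.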

That is one reason why it’s important for suppliers and vendors to form relationships. But there are other reasons, too. “Those leading the charge [in HDR] are innovators and people you want to be associated with,” Suissa explains. “You learn a lot by associating yourself with professionals on the other side of things. We provide technology. We understand it. We learn it. But we also practice it differently than people who create content. The exchange of knowledge is critical, and it enables us to help our customers better understand the technology they are purchasing.”

Main Image: Netflix’s Glow


Karen Moltenbrey is a longtime technical writer with more than two decades of experience in segments of the CG and post industries.

Color for Feature Films

By Karen Maierhofer

Just as with episodic series, making the right color choices can greatly impact a film and its storytelling. While the look and mood of a project is set by the director and DP, colorists face creative decisions in delivering those desired results, even when nature or other factors prevent the intended look from being captured on set.

As a result of their work, colorists help set the atmosphere, tone, emotion and depth of a project. They help guide storylines and audiences’ reactions to what is playing out on screen. They can make us happy, sad, scared or thrilled. And, they can make us fall in love, or out of love, with a character.

Here we look at three tent-pole films and their color process.

Deadpool 2
Like the original film, Deadpool 2 is colorful, especially when it comes to the overall tone of the character and action. That tone, however, was the domain of the writers. Deluxe Efilm colorist Skip Kimball was concerned with the visual look of the movie, one that delivered a filmic style for the over-the-top destruction and gore playing out on the screen.

Amid the movie’s chaos, Kimball used understated saturation and limited contrast, with minimal stylization to preserve the on-set lighting choices of DP Jonathan Sela.

Skip Kimball

The working relationship between Kimball and Sela dates back nearly 15 years and spans several projects, including The Omen, Die Hard 5 and Max Payne, resulting in an informal shorthand of sorts between the two that enables them to dial in looks quickly. “Jonathan’s work is consistently great, and that makes my job easier. I simply help his on-set choices shine further,” says Kimball.

Despite the popularity of the original Deadpool, which Kimball did not work on, there was no directive to use that film as a guide for the sequel. Kimball attacked Deadpool 2 using Blackmagic Resolve, working with the raw camera footage whenever possible, as long as it was not a visual effects shot. “I get what the DP had exposed onto my screen, and then the DP and director come in and we discuss the look and feel of their project. Then I just kind of make things happen on the screen,” Kimball says, noting he prefers to work alongside the DP and director in the same room, as he can pick up on certain body language, “so I am making a change before they ask for it.”

At times, the DP and director will provide stills of examples they have in mind for certain shots, although mostly Kimball gets his direction from discussions they have. And that is exactly how they proceeded with Deadpool 2 — through discussions with the DP mostly. “It was kind of desaturated and low contrast in spots, while other shots had a lot more chroma in them, depending on the scene,” says Kimball.

One sequence Kimball particularly likes in the film is the prison scene with Deadpool and the young mutant Firefist. “It’s just a different look, with lots of cyans and greens. It’s not a typical look,” he says. “We were trying to make it feel uncomfortable, not a pleasant place to be.”

According to Kimball, the biggest challenge he faced on Deadpool 2 was managing all the VFX drop-ins. This required him to start with plates in his timeline, then update it accordingly as VFX shots were delivered from multiple vendors. In some instances, Kimball blended multiple versions of the effects to achieve director David Leitch’s vision. “There were a lot of VFX houses working on various shots, and part of my job is to help get them all to flow and look [unified],” he adds.

One of those VFX vendors was Efilm’s sister company, Method Studios, which provided approximately 300 VFX shots. As Kimball points out, it is more convenient when the VFX are done in-house with the coloring. “You can walk down the hall and bring [the VFX team] in to show them what you’re doing with their shots,” he says. “When it’s done out of house and you want to grade something a certain way and have to push it so far that it breaks the visual effect, then you have to get them on the phone and ask them to come in, or send them examples of where the scene is going.”

In addition to Deadpool 2’s overall cinematic style, the film contains unique flashback and afterlife sequences that are differentiated from the main action through varied light and color. A lot of the afterlife glow was accomplished on set through in-camera filters and angled light rays, though Kimball augmented that further through additional glow, warm sepia tones and light VFX within Resolve.

“They wanted it to stand out and the audience to recognize immediately that it is a flashback,” he explains. “It was fun to create because that was all done in Resolve, with color correction and power windows, along with the OpenFX plug-ins.” Kimball explains he blurred unimportant scene elements and used a tilt lens effect. “For color, they went with a desaturated cyan feel and warmth in the highlights to create a dreamy quality that’s also a bit spooky,” he adds.

This film required many output formats: UHD, HD, HDR10 and IMAX. In addition, Kimball color graded all the promotional trailers, the home entertainment release and the related music video for Celine Dion’s Ashes.

When asked what sets this project apart from many of the others he has done, Kimball pondered the answer before responding, “It’s hard to say because it is all instinctual to me.”

Fans have many favorite scenes in the film, but for Kimball, it’s not so much the individual sequences that make the movie memorable as bringing it all together and making everything flow. He adds, “Executing the vision of the director, you know.”

Black Panther
One of the hottest movies of the year so far is Marvel’s Black Panther, a film about a prince who, after the death of his father, returns home to the African nation of Wakanda to take his rightful place as king. His path isn’t easy, though, and he must fight for the right to lead his people. Technicolor colorist Maxine Gervais was charged with creating a distinctive look as the movie jumped from conventional cities to the isolated, yet technologically advanced, nation of Wakanda. To handle the huge workload, her team called on a network of six or more FilmLight Baselight color grading workstations, operating simultaneously.

Maxine Gervais

“We knew that this was a fantasy movie with big themes and a strong story,” says Gervais, adding that since the film wasn’t an established franchise but a completely new departure, it gave the team more creative freedom. “On most Marvel movies you have a sequel to match: characters’ wardrobes, skin colors, sets. But on Black Panther everything was new, so we didn’t have to match a particular aesthetic. We were creating a new world. The only scene where we needed to somewhat match in tones was a flashback of Black Panther’s father’s death, which ties to Captain America: Civil War. Everything else was literally a ‘blank’ canvas in some ways: rich warm tones, colorful, darker filmic scenes.”

Gervais worked very closely with Oscar-nominated cinematographer Rachel Morrison, ASC, (Mudbound) to create colors that would follow the film’s story. “We wanted the film and photography to feel real, unlike most superhero movies,” explains Morrison. “Our aim was to highlight the beauty of Africa. And like all of our work, we were hoping for a subjectivity and clear point of view.”

“Black Panther has very distinct settings and looks,” added Gervais. “Wakanda is this magical, futuristic African nation, with a lush colorful world the audience has never experienced. Then you have the darker reality of cityscapes in Oakland, plus the lab scenes, which have a more sterile look with cooler colors and tones.”

According to Gervais, for her, the most demanding part of the grade was the jungle scenes. “It was shot at night, so to keep all the detail we needed to see, and to make it feel organic, I ended up grading in multiple levels.” Cinematographer Morrison agrees: “The jungle scene was the biggest challenge. It was shot interior on a sound stage and had a bit of a ‘set’ feel to it. We knocked everything down and then really worked to amplify the contrast in the background.”

“We were both looking for a high sensitivity for contrast, deep blacks and shadows and a strong, rich image. I think we achieved that very well,” says Gervais. “The way we did this was almost reverse engineering. We isolated a different part of the image to bring it up or down, add contrast or remove it. You don’t want the cars to be shiny; you want minimum light reflection on cars, but you do want a bit of moonlight hitting foliage, etc. You want to see faces, but everything should still be very dark, as it is deep in a forest. We took down strong highlights, but we also added highlights where they were mostly absent. I followed Rachel’s directions on this and worked it until she was happy with it.”

Looking back on how it started, Gervais says, “We first looked at an Avid output of the movie with Ryan (Coogler), Rachel and executives. Some of the VFX had a CDL applied from Ryan’s notes. As the movie played we could all call out comments, ideas. I wrote down everything to have a general feel for what was being said, and for my first pass Rachel gave me some notes about specific scenes where she was after a rich contrast look. This was very much a team effort. Before any supervised session with director, DP and executives, I would sit with 3D supervisor Evan Jacobs and VFX supervisor Geoffrey Baumann and review my first pass with notes that were taken from session to session. This way, we could make sure we were all going down the right path. Ryan and Rachel are wonderful to work with. They are both passionate and have a strong vision of what they want. I really enjoyed working with them — we were all new to the Marvel world.”

When it came to deliverables, multiple variations were required: 2D and 3D, laser projector as well as standard digital cinema. It is also available in IMAX, and of course there are multiple home video versions as well. “To complete all the work within the tight deadline, we extended the team for the first time in my career,” explains Gervais. “My assistant colorist Jeff Pantaleo and I went on to rotoscope a lot of the shots and tried to avoid using too many mattes so it would simplify other deliveries like 3D. Then we had a team dedicated to offsetting all the shapes for 3D. Thankfully, Baselight 5.0 includes tools to speed up the way shapes are translated, so this helped a great deal. We ended up with a huge number of layers and shapes.”

Creating the futuristic scenes and superhero action inevitably meant that the movie was highly reliant on VFX, featuring 2,500 shots within 134 minutes. Ensuring that the large team could keep track of VFX required extensions to Baselight’s Categories function, which made it immediately obvious which shots were temporary and which were final on the client monitor. This proved essential to keeping the project on track.

Overall, Gervais loved her first Marvel movie, and all the challenges it brought. “It was an amazing experience to work with all these talented people,” she says. “On Black Panther, I used way more composite grading than I have ever done before, blending many layers. I had to push the technology and push myself to find ways to make it work. And I think it turned out pretty good.”

Gervais has also employed Baselight on some upcoming titles, including Albert Hughes’ Alpha and director Robert Zemeckis’ Welcome to Marwen.

Solo: A Star Wars Story
One of the most revered movie series in history is Star Wars. Fans are not simply fans, they are superfans who hold dearly all tenets associated with the franchise — from the details of the ships to the glow of the lasers to the nuances of the characters and more. So, when color grading a film in the Star Wars universe, the colorist has to appease not only the DP and director, but also has to be cognizant of the galaxy of fans with their ultra-critical eye.

Joe Gawler

Such was the pressure facing Joe Gawler when color grading the recent Solo: A Star Wars Story, one of the two stand-alone Star Wars features. Directed by Ron Howard, with cinematography by Bradford Young, Solo follows the antics of young Han Solo and his gang of smugglers as they plan to steal coaxium from the planet Kessel.

While on the project, Gawler was immersed in the lore of Star Wars from many fronts, including working out of the famed Skywalker Ranch. “The whole creative team was at the Ranch for four weeks to get the color done,” he says, attributing the film’s large amount of visual effects for the extended timeframe. “As the new shots were rolling in from ILM, we would add them into the timeline and continue color grading.”

Harbor Picture Company’s Gawler, who usually works out of the studio’s New York office, stepped into this production during its early stages, visiting the London set, where he and Young helped finalize the show’s look-up table (LUT), through which the movie would be lit on set and dailies would be created. Meanwhile, on set, any changes made by dailies colorist Darren Rae were passed through to VFX and to final color as a CDL (color decision list) file.

In fact, Solo introduced a number of unique factors to Gawler’s typical workflow. Among them was working on a film with so many visual effects — a hallmark of any Star Wars feature, but far more than any production he has color corrected in the past. Also, while he and Young participated in tweaking the LUT, it was created by ILM senior image and process engineer J. Schulte. Indeed, the film’s color pipeline was both developed and managed through ILM, where those fabled visual effects were crafted.

“That was something new to me,” Gawler says about the pipeline establishment. “There were some specific lasers, lights and things that are all part of the Star Wars world that were critical to ILM, and we had to make sure we got just the right hue and level of saturation. Those kinds of colors can get a little crazy if they’re not managed properly through the color science,” he explains. “But the way they managed the color and the way the shots came in from ILM was so smooth and the work so good that it moved like principal photography through the process, which isn’t always the case with visual effects, in my experience.”

So, by the time Gawler was at Skywalker Ranch, he had an informed timeline and CDL values, such as the actual dailies and decisions made for the production, already sitting inside his color correction, ready for him to decide what to use. He then spent a few days balancing out the shots before Young joined him and they dug in. “We’ve been working together for such a long time, and there’s a level of trust between us,” Gawler says of his relationship with the DP.

The pair started working together on an indie project called Pariah — which won the Excellence in Cinematography: Dramatic at Sundance in 2011 — and continued to do so as their resumes grew. Last year, they worked together on Arrival (2016), which led to a Best Cinematography Academy Award nomination for Young. “And now, holy cow, he is shooting a Star Wars film,” says Gawler. “It’s been one of those special relationships everyone dreams of having, where you find a director of photography you connect with, and you go places together.”

Gawler used Resolve for his color grading. He and Young would work alongside each other for a few days, then would meet with Howard. “It is such a big movie, and I was really pleasantly surprised at what a creatively collaborative experience it was,” he notes. “Ron respects Bradford, his editors, his sound mixers and me as a colorist, so he would take in whatever we were presenting to him and then comment. Everyone had such a wonderful energy on the show. It felt like every single person on the VFX team, editorial team, director, producers, Bradford and I were all rowing the boat in the same direction.”

The work Gawler does with Young is kept as natural as possible, with the light that is available. “His work is so good that we generally refrain from doing too much power windowing and secondaries. We only do that when absolutely necessary,” he says. “We try to keep more of a photo-chemical feel to the images, like you would have if you printed on film.”

Young, Gawler contends, is known for a dark, underlit aesthetic. But on this particular film, they didn’t want to go too dark — though it does have Young’s classic underlit, subtle hue. “We were making an effort to print up the image, so it almost felt like it had been flashed in processing,” he explains. “We had to find that balance of having it bright enough to see things we needed to see clearly, without compromising how Bradford shot the movie to begin with. The image is very committed; it’s not the most flexible thing to make his photography look like 20 different things.”

As a result, plenty of time was spent with the on-set lighting. “So, a lot of the work was just staying true to what was done on the day of the shoot,” he adds.

Solo is like most Star Wars films, with diverse locations and setups, though there are a few scenes that stand out in Gawler’s mind, including the one at the beginning of the film with the underground lair of Lady Proxima, which shows tunnels spanning the city. The sequence was shot with a blacklight, with lots of blues and purples. “We had a very narrow bandwidth of color to work with, but we wanted to back away from it feeling too electric to something that felt more organic,” he explains. “We spent a lot of time homing in on what kind of saturation and formality it would have.”

The scene Gawler spent the most time on, though, was the heist aboard a special train that weaves through snow-capped mountains. “That’s the biggest, longest, most cutty action sequence in the entire movie,” he says. “We had all these exterior plates shot in the Dolomites [in Italy]. We spent a tremendous amount of time just trying to get everything to match just right on the cut.”

All told, Gawler estimates the sequence alone contains 600 to 700 cuts. And he had to create a progression, wherein the characters drop down on top of the train before dawn’s first light, when it’s dark and cool, and the heist occurs during sunrise as the train rounds a bend. “We made sure they were happy with how every shot cut from one to the next and how it progressed [time-wise]. It was probably our biggest challenge and our biggest success,” he says. “It really gets the audience going.”

Most of Solo’s scenes were shot on stage, in highly controlled environments. However, scenes that occur on the planet Savareen were filmed in the Canary Islands, where wind and weather became factors, with shifting clouds and light. “I felt that it was one of the few spots in the movie where it was up to the colorist to try and pull all these different types of shots together,” notes Gawler, “and it was beautiful. It felt a little like a Western, with this standoff. It comes right after a chase with the TIE fighters and Millennium Falcon in space, and then Boom! You’re on this desert-like planet with a blaring sun and sand and dust everywhere.”

Another standout for Gawler was the large number of deliverables. Once the master was locked and approved (the grade was done in 4K) with support from Efilm in Hollywood, they had to sit with an IMAX colorist to make sure the work translated properly to that format. Then they moved to Dolby Vision, whose laser projector has a much greater range of contrast and brightness than a standard xenon digital cinema projector. “I give credit to J. Schulte at ILM. He had these output display lookup tables for each flavor of delivery. So, it wasn’t a heavy lift for me to go from what we did at the Ranch to sitting in the Dolby cinema theater, where we spent maybe another three days tweaking everything,” he adds.

And then there was a 3D version and a Dolby 3D version of Solo, along with those for home video, 3D for home video, RealD 3D, and Dolby Vision’s home theater. “Being a colorist from New York, I don’t generally get a lot of tent-pole films with so many different flavors of deliverables,” Gawler says.

But this is not just any tent-pole. It’s Star Wars.

Throughout the project, that fact was always in the back of Gawler’s mind. “This is a real part of culture — pop culture, film culture. There’s all this lore. You work on other projects and hope the film is going to find an audience. But with Star Wars, there’s no doubt millions of people are going to see it,” he adds.


Karen Maierhofer is a longtime technical writer with more than two decades of experience in segments of the CG and post industries.

Luke Scott to run newly created Ridley Scott Creative Group

Filmmaker Ridley Scott has brought all of his RSA Films-affiliated companies together in a multi-business restructure to form the Ridley Scott Creative Group. The Ridley Scott Creative Group aims to strengthen the network across the related companies to take advantage of emerging opportunities across all entertainment genres, as well as their existing work in film, television, branded entertainment, commercials, VR, short films, documentaries, music video, design and animation, and photography.

Ridley Scott

Luke Scott will assume the role of global CEO, working with founder Ridley Scott and partners Jake and Jordan Scott to oversee the future strategic direction of the newly formed group.

“We are in a new golden age of entertainment,” says Ridley Scott. “The world’s greatest brands, platforms, agencies, new entertainment players and studios are investing hugely in entertainment. We have brought together our talent, capabilities and creative resources under the Ridley Scott Creative Group, and I look forward to maximizing the creative opportunities we now see unfolding with our executive team.”

The companies that make up the RSCG will continue to operate autonomously but will now offer clients synergy under the group offering.

The group includes commercial production company RSA Films, which produced such ads as Apple’s 1984, Budweiser’s Super Bowl favorite Lost Dog and, more recently, Adidas Originals’ Original is Never Finished campaign, as well as branded content for Johnnie Walker, HBO, Jaguar, Ford, Nike and the BMW Films series; Black Dog Films, the music video production company founded by Jake Scott (Justin Timberlake, Maroon 5, Nicki Minaj, Beyoncé, Coldplay, Björk and Radiohead); the entertainment marketing company 3AM; commercial production company Hey Wonderful, founded by Michael Di Girolamo; newly founded UK commercial production company Darling Films; and film and television production company Scott Free (Gladiator, Taboo, The Martian, The Good Wife), which continues to be led by David W. Zucker, president, US television; Kevin J. Walsh, president, US film; and Ed Rubin, managing director, UK television/film.

“Our Scott Free Films and Television divisions have an unprecedented number of movies and shows in production,” reports Luke Scott. “We are also seeing a huge appetite for branded entertainment from our brand and agency partners to run alongside high-quality commercials. Our entertainment marketing division 3AM is extending its capabilities to all our partners, while Black Dog is moving into short films and breaking new, world-class talent. It is a very exciting time to be working in entertainment.”
Sim and the ASC partner on educational events, more

During Cine Gear recently, Sim announced a 30-year sponsorship with the American Society of Cinematographers (ASC). Sim offers end-to-end solutions for creatives in film and television, and the ASC is a nonprofit focusing on the art of cinematography. As part of the relationship, the ASC Clubhouse courtyard will now be renamed Sim Plaza.

Sim and the ASC have worked together frequently on events that educate industry professionals on current technology and its application to their evolving craft. As part of this sponsorship, Sim will expand its involvement with the ASC Master Classes, SimLabs, and conferences and seminars in Hollywood and beyond.

During an official ceremony, a commemorative plaque was unveiled and embedded into the walkway of what is now Sim Plaza in Hollywood. Sim will also host a celebration of the ASC’s 100th anniversary in 2019 at Sim’s Hollywood location.

What else does this partnership entail?
• The two organizations will work together closely over the next 30 years on educational events for the cinematography community. Sim’s sponsorship will help fund society programs and events to educate industry professionals (both practicing and aspiring) on current technology and its application to the evolving craft.
• The ASC Master Class program, SimLabs and other conferences and seminars will continue on over these 30 years with Sim increasing its involvement. Sim is not telling the ASC what kind of initiatives they should be doing, but is rather lending a helping hand to drive visual storytelling forward. For example, they have already hosted ASC Master Class sessions in Toronto and Hollywood, sponsored the annual ASC BBQ for the last couple of years, and founder Rob Sim himself is an ASC associate member.

How will the partnership increase programming and resources to support the film and television community for the long term?
• It has a large focus on three things: financial resources, programming assistance and facility support.
• It will provide access and training with world-class technology in film and television.
• It will offer training directly from industry leaders in Hollywood and beyond.
• It will develop new programs for people who can’t attend ASC Master Class sessions, such as an online experience, which is something ASC and Sim are working on together.
• It will expand SimLabs beyond Hollywood, with the potential to bring it to Vancouver, Atlanta, New York and Toronto, creating new avenues for people who are associated with the ASC and who know they can call on Sim.
• It will bring volunteers. Sim has many volunteers on ASC committees, including the Motion Imaging Technology Council and its Lens committee.

Main Image: L-R: Sim president/CEO James Haggarty, Sim founder and ASC associate member Rob Sim, ASC events coordinator Patty Armacost and ASC president Kees van Oostrum.

Testing large format camera workflows

By Mike McCarthy

In the last few months, we have seen the release of the Red Monstro, Sony Venice, Arri Alexa LF and Canon C700 FF, all of which have larger or full-frame sensors. Full frame comes from DSLR terminology: it is equivalent to the entire 35mm film area, the way film was used horizontally in still cameras. All SLRs used to be full frame with 35mm film, so there was no need for the term until manufacturers started saving money on digital image sensors by making them smaller than 35mm film exposures. Super35mm motion picture cameras, on the other hand, ran the film vertically, resulting in a smaller exposure area per frame, but this was still much larger than most video imagers until the last decade, when 2/3-inch chips were considered premium imagers. The options have grown a lot since then.

L-R: 1st AC Ben Brady, DP Michael Svitak and Mike McCarthy on the monitor.

Most of the top-end cinema cameras released over the last few years have advertised their Super35mm sensors as a huge selling point, as that allows use of any existing S35 lens on the camera. These S35 cameras include the Epic, Helium and Gemini from Red, Sony’s F5 and F55, Panasonic’s VaricamLT, Arri’s Alexa and Canon’s C100-500. On the top end, 65mm cameras like the Alexa65 have sensors twice as wide as Super35 cameras, but very limited lens options to cover a sensor that large. Full frame falls somewhere in between and allows, among other things, use of any 35mm still film lenses. In the world of film, this was referred to as VistaVision, but the first widely used full-frame digital video camera was Canon’s 5D MkII, the first serious HDSLR. That format has suddenly surged in popularity, and thanks to this I recently had the opportunity to be involved in a test shoot with a number of these new cameras.

Keslow Camera was generous enough to give DP Michael Svitak and myself access to pretty much all their full-frame cameras and lenses for the day in order to test the cameras, workflows and lens options for this new format. We also had the assistance of first AC Ben Brady to help us put all that gear to use, and Mike’s daughter Florendia as our model.

First off was the Red Monstro, which, while technically not the full 24mm height of true full frame, uses the same size lenses due to the width of its 17×9 sensor. It offers the highest resolution of the group at 8K. It records compressed RAW to R3D files, as well as ProRes and DNxHR up to 4K, all saved to Red mags. As with the rest of the group, smaller portions of the sensor can be used at lower resolutions to pair with smaller lenses. The Red Helium sensor has the same resolution but in a much smaller Super35 size, allowing a wider selection of lenses to be used. But larger pixels allow more light sensitivity, with individual pixels up to 5 microns wide on the Monstro and Dragon, compared to Helium’s 3.65-micron pixels.

Next up was Sony’s new Venice camera with a 6K full-frame sensor, allowing 4K S35 recording as well. It records XAVC to SxS cards, or compressed RAW in the X-OCN format with the optional AXS-R7 external recorder, which we used. It is worth noting that both full-frame recording and integrated anamorphic support require additional special licenses from Sony, but Keslow provided us with a camera that had all of that functionality enabled. With a 36x24mm 6K sensor, the pixels are 5.9 microns, and footage shot at 4K in the S35 mode should be similar to shooting with the F55.

We unexpectedly had the opportunity to shoot on Arri’s new AlexaLF (Large Format) camera. At 4.5K, this had the lowest resolution, but that also means the largest sensor pixels at 8.25 microns, which can increase sensitivity. It records ArriRaw or ProRes to Codex XR capture drives with its integrated recorder.

Another new option is the Canon C700 FF, with a 5.9K full-frame sensor recording RAW, ProRes or XAVC to CFast cards or Codex drives. That gives it 6-micron pixels, similar to the Sony Venice. But we did not have the opportunity to test that camera this time around; maybe in the future.
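The pixel sizes quoted for all of these cameras follow from the same arithmetic: sensor width divided by horizontal photosite count. As a rough sanity check, here is a minimal sketch; the 6048-photosite figure for the Venice's 6K sensor is an assumption based on published specs, not something measured on this shoot:

```python
def pixel_pitch_microns(sensor_width_mm: float, horizontal_pixels: int) -> float:
    """Approximate photosite pitch: sensor width (mm) over photosite count,
    converted to microns."""
    return sensor_width_mm * 1000.0 / horizontal_pixels

# Sony Venice: 36mm-wide full-frame sensor, ~6048 photosites across (assumed)
print(round(pixel_pitch_microns(36.0, 6048), 2))  # prints 5.95
```

The result lines up with the quoted 5.9-micron figure, and the same division reproduces the larger pitches of the lower-resolution, same-width sensors.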

One more factor in all of this is the rising popularity of anamorphic lenses. All of these cameras support modes that use the part of the sensor covered by anamorphic lenses and can desqueeze the image for live monitoring and preview. In the digital world, anamorphic essentially cuts your overall resolution in half, until the unlikely event that we start seeing anamorphic projectors or cameras with rectangular sensor pixels. But the prevailing attitude appears to be, “We have lots of extra resolution available so it doesn’t really matter if we lose some to anamorphic conversion.”
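The "resolution cut in half" point can be made concrete: a 2x anamorphic lens squeezes the horizontal field of view onto the sensor, and desqueezing in post stretches the image back out without adding any new horizontal detail. A minimal sketch, with illustrative frame sizes that are not from this shoot:

```python
def desqueezed_size(rec_w: int, rec_h: int, squeeze: float = 2.0):
    """Display size after desqueeze: width is stretched by the squeeze factor,
    height is unchanged. Horizontal detail per display pixel drops accordingly."""
    return int(rec_w * squeeze), rec_h

# Illustrative 4:3 recorded area with a 2x anamorphic lens
w, h = desqueezed_size(4096, 3072, 2.0)
print(w, h, round(w / h, 2))  # prints 8192 3072 2.67
```

The desqueezed frame displays at roughly 2.67:1, but only 4096 recorded columns are spread across those 8192 display columns, which is the "lost resolution" the prevailing attitude shrugs off.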

Post Production
So what does this mean for post? In theory, sensor size has no direct effect on the recorded files (beyond their content), but resolution does. We also have a number of new formats to handle, and then there are anamorphic images to deal with during finishing.
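Resolution's impact on post is easy to quantify in storage terms. A rough uncompressed-equivalent estimate follows; this is a sketch only, since real RAW files are compressed and bit depths vary by camera:

```python
def uncompressed_gb_per_min(width: int, height: int, bits_per_sample: int = 16,
                            samples_per_pixel: int = 1, fps: float = 24.0) -> float:
    """Rough uncompressed data rate in GB per minute. RAW is one (Bayer)
    sample per pixel; debayered RGB would be three samples per pixel."""
    bytes_per_frame = width * height * samples_per_pixel * bits_per_sample / 8
    return bytes_per_frame * fps * 60 / 1e9

# 8K (8192x4320) vs. 4K (4096x2160) 16-bit Bayer RAW: four times the data
print(round(uncompressed_gb_per_min(8192, 4320), 1))
print(round(uncompressed_gb_per_min(4096, 2160), 1))
```

Doubling resolution in each dimension quadruples the data rate, which is why 8K material stresses storage, playback and debayering hardware far more than 4K does.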

Ever since I got my hands on one of Dell’s new UP3218K monitors with an 8K screen, I have been collecting 8K assets to display on there. When I first started discussing this shoot with DP Michael Svitak, I was primarily interested in getting some more 8K footage to use to test out new 8K monitors, editing systems and software as it got released. I was anticipating getting Red footage, which I knew I could playback and process using my existing software and hardware.

The other cameras and lens options were added as the plan expanded, and by the time we got to Keslow Camera, they had filled a room with lenses and gear for us to test with. I also had a Dell 8K display connected to my ingest system, and the new 4K DreamColor monitor as well. This allowed me to view the recorded footage in the highest resolution possible.

Most editing programs, including Premiere Pro and Resolve, can handle anamorphic footage without issue, but new camera formats can be a bigger challenge. Any RAW file requires info about the sensor pattern in order to debayer it properly, and new compression formats are even more work. Sony’s new compressed RAW format for Venice, called X-OCN, is supported in the newest 12.1 release of Premiere Pro, so I didn’t expect that to be a problem. Its other recording option is XAVC, which should work as well. The Alexa, on the other hand, uses ArriRaw files, which have been supported in Premiere for years, but each new camera shoots a slightly different “flavor” of the file based on the unique properties of that sensor. Shooting ProRes instead would virtually guarantee compatibility, but at the expense of the RAW properties. (Maybe someday ProRes RAW will offer the best of both worlds.) The Alexa also has the challenge of recording to Codex drives that can only be offloaded in OS X or Linux.

Once I had all of the files on my system, after using a MacBook Pro to offload the media cards, I tried to bring them into Premiere. The Red files came in just fine but didn’t play back smoothly over 1/4 resolution. They played smoothly in RedCineX with my Red Rocket-X enabled, and they export respectably fast in AME, (a five-minute 8K anamorphic sequence to UHD H.265 in 10 minutes), but for some reason Premiere Pro isn’t able to get smooth playback when using the Red Rocket-X. Next I tried the X-OCN files from the Venice camera, which imported without issue. They played smoothly on my machine but looked like they were locked to half or quarter res, regardless of what settings I used, even in the exports. I am currently working with Adobe to get to the bottom of that because they are able to play back my files at full quality, while all my systems have the same issue. Lastly, I tried to import the Arri files from the AlexaLF, but Adobe doesn’t support that new variation of ArriRaw yet. I would anticipate that will happen soon, since it shouldn’t be too difficult to add that new version to the existing support.

I ended up converting the files I needed to DNxHR in DaVinci Resolve so I could edit them in Premiere, and I put together a short video showing off the various lenses we tested with. Eventually, I need to learn how to use Resolve more efficiently, but the type of work I usually do lends itself to the way Premiere is designed — inter-cutting and nesting sequences with many different resolutions and aspect ratios. Here is a short clip demonstrating some of the lenses we tested with:

This is a web video, so even at UHD it is not meant to be an analysis of the RAW image quality, but instead a demonstration of the field of view and overall feel with various lenses and camera settings. The combination of the larger sensors and the anamorphic lenses leads to an extremely wide field of view. The table was only about 10 feet from the camera, and we could usually see all the way around it. We also discovered that when recording anamorphic on the Alexa LF, we were recording a wider image than was displaying on the monitor output. You can see in the frame grab below that the live display visible on the right side of the image isn’t showing the full content that got recorded, which is why we didn’t notice that we were recording with the wrong settings, with so much vignetting from the lens.

We only discovered this after the fact, from this shot, so we didn’t get the opportunity to track down the issue to see if it was the result of a setting in the camera or in the monitor. This is why we test things before a shoot, but we didn’t “test” before our camera test, so these things happen.

We learned a lot from the process, and hopefully some of those lessons are conveyed here. A big thanks to Brad Wilson and the rest of the guys at Keslow Camera for their gear and support of this adventure and, hopefully, it will help people better prepare to shoot and post with this new generation of cameras.

Main Image: DP Michael Svitak


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

HPA Tech Retreat: The production budget vs. project struggle

“Executive producers often don’t speak tech language,” said Aaron Semmel, CEO and head of BoomBoomBooya, in addressing the HPA Tech Retreat audience in Palm Springs in late February. “When people come to us with requests and spout all sorts of tech mumbo jumbo, it’s very easy for us to say no,” he continued. “Trust me, you need to speak to us in our language.”

Semmel was part of a four-person HPA panel that included Cirina Catania, The Catania Group; Larry O’Connor, OWC Digital; and Jeff Stansfield, Advantage Video Systems. Moderated by Andy Marken of Marken Communications, the panel explored solutions that can bring the executive and line producers and the production/post teams closer together to implement the right solutions for every project and satisfy everyone, including accounting.

An executive and co-producer on more than a dozen film and TV series projects, Semmel said his job is to bring together the money and then work with the best creative people possible. He added that the team’s job was to make certain the below-the-line items — actual production and post production elements — stay on or below budget.

Semmel noted that most executive producers often work off of the top sheet of the budget, typically an overview of the budget. He explained that executive producers may go through all of the budget and play with numbers here and there but leave the actual handling of the budget to the line producer and supervising producer. In this way, they can “back into” a budget number set by the executive producer.

“I understand the technologies at a higher level and could probably take a highlighter and mark budget areas where we could reduce our costs, but I also know I have very experienced people on the team who know the technologies better than I do to make effective cuts.

L-R: Jeff Stansfield, Aaron Semmel, Cirina Catania

“For example, in talking with many of you in the audience here at the Retreat, I learned that there’s no such thing as an SSD hard drive,” he said. “I now know there are SSDs and there are hard drives and they’re totally different.”

Leaning into her mic, Catania got a laugh when she said, “One of the first things we all have to do is bring our production workflows into the 21st century. But seriously, the production and post teams are occasionally not consulted during the lengthy budgeting process. Our keys can make some valuable contributions if they have a seat at the table during the initial stages. In terms of technology, we have some exciting new tools we’d like to put to work on the project that could save you valuable time, help you organize your media and metadata, and have a direct and immediate positive impact on the budget. What if I told you that you could save endless hours in post if you had software that helped your team enter metadata and prep for post during the early phase — and hardware that worked much faster, more securely and more reliably?”

With wide agreement from the audience, Catania emphasized that it is imperative for all departments involved in prep/production/post and distribution to be involved in the budget process from the outset.

“We know the biggest part of your budget might be above-the-line costs,” she continued. “But production, post and distribution are where much of the critical work also gets done. And if we’re involved at the outset, and that includes people like Jeff (Stansfield), who can help us come up with creative workflow and financing options that save you and the investors money, we will surely turn a profit.”

Semmel said the production/post team could probably be of assistance in the early budget stages to pinpoint where work could be done more efficiently to actually improve the overall quality and ensure EPs do what they need to do for their reputation… deliver the best and be under budget.

The Hatfields and the McCoys via History Channel

“But for some items, there seem to be real constraints,” he emphasized. “For example, we were shooting America’s Feud: Hatfields & McCoys, a historical documentary, in Romania — yes, Romania,” he grinned, “and we were behind schedule. We shot the farmhouse attack on day one, shot the burning of the house on day two, and on day three we received our dailies to review for day one’s work. We were certain we had everything we needed, so we took a calculated risk and burned the building,” he recalled. “But no one exhaled until we had a chance to go through the dailies.”

“What if I told you there’s a solution that will transfer your data at 2800MB/s and enable you to turn around your dailies in a couple of hours instead of a couple of days?” O’Connor asked.

Semmel replied, “I don’t understand the 2800MB/s stuff, but you clearly got my attention by saying dailies in a couple of hours instead of days. If there had been anything wrong with the content we had shot, we would have been faced with the huge added expense of rebuilding and reshooting everything,” he added. “Even accounting can understand the savings in hours vs. days.”
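O’Connor’s hours-versus-days claim is easy to sanity-check with back-of-the-envelope arithmetic. The 8TB shoot size and the slow-drive rate below are illustrative assumptions, not figures from the panel:

```python
# Rough offload-time comparison for a day's footage.
# The 8 TB shoot size and the 160 MB/s slow-drive rate are
# hypothetical illustrations, not numbers from the panel.
def transfer_hours(size_tb, rate_mb_per_s):
    """Hours to move size_tb terabytes at rate_mb_per_s megabytes/second."""
    size_mb = size_tb * 1_000_000  # 1 TB = 1,000,000 MB (decimal units)
    return size_mb / rate_mb_per_s / 3600

fast = transfer_hours(8, 2800)  # the 2800 MB/s figure O'Connor cited
slow = transfer_hours(8, 160)   # a single spinning disk (assumption)

print(f"2800 MB/s: {fast:.1f} h, 160 MB/s: {slow:.1f} h")
# → 2800 MB/s: 0.8 h, 160 MB/s: 13.9 h
```

Under those assumptions, the same day’s media moves in under an hour instead of most of a working day, which is the gap Semmel is reacting to.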

Semmel pointed out that because films and TV shows start and end digital, there’s always a concern about frames and segments being lost when you’re on location and a long distance from the safety net of your production facilities.

“No one likes that risk, including production/post leaders, integrators or manufacturers,” said O’Connor. “In fact, a lot of crews go to extraordinary lengths to ensure nothing is lost; and frankly, I don’t blame them.”

He recalled a film crew going to Haiti to shoot a documentary that was told by the airline they were over their limit on baggage for the trip.

“They put their clothes in an airport locker and put their three RAID storage systems in their backpacks. They wanted to make certain they could store, back up and back up their work again to ensure they had all of the content they needed when they got back to their production/post facility.”

Stansfield and Catania said they had seen and heard of similar gut-level decisions made by executive and line producers. They encouraged the production/post audience not to simply accept the line-item budgets they are given to work with, but to be involved at the beginning of the project to explore and define all of the below-the-line budget, minimize risk and provide alternative plans in case unexpected challenges arise.

“An EP and line producer’s mantra for TV and film projects is you only get two out of three things: time, money and quality,” Semmel said. “If you can deliver all three, then we’ll listen, but you have to approach it from our perspective.

“Our budgets aren’t open purses,” he continued. “You have to make recommendations and deliver products and solutions that enable us to stay under budget, because no matter how neat they are or how gee-whiz technical they are, they aren’t going to be accepted otherwise. We have two very fickle masters — finance and the viewer — so you have to give us the tools and solutions that satisfy both of them. Don’t give us bits, bytes and specs; just focus on meeting our needs in words we can understand.

“When you do that, we all win; and we can all work on the next project together,” Semmel concluded. “We only surround ourselves with people who will help us through the project. People who deliver.”

Peter Doyle on coloring Churchill’s England for Darkest Hour

By Daniel Restuccio

Technicolor supervising digital colorist Peter Doyle is pretty close to being a legend in the movie industry. He’s color graded 12 of the 100 top box office movies, including Peter Jackson’s Lord of the Rings trilogy, six Harry Potter films, Aleksander Sokurov’s Venice Golden Lion-winning Faust, Joel and Ethan Coen’s Inside Llewyn Davis, The Ballad of Buster Scruggs and most recently the Golden Globe-nominated Darkest Hour.

Grading Focus Features’ Darkest Hour — which focuses on Winston Churchill’s early days as Prime Minister of the United Kingdom during WWII — represents a reunion for Doyle. He previously worked with director Joe Wright (Pan) and director of photography Bruno Delbonnel (Inside Llewyn Davis). (Darkest Hour picked up a variety of Oscar nominations, including Best Picture and Best Cinematography for Delbonnel.)

Peter Doyle

The vibe on Darkest Hour, according to Doyle, was very collaborative and inspiring. “Joe is an intensely visual director and has an extraordinary aesthetic… visually, he’s very considerate and very aware. It was just great to throw out ideas, share them and work to find what would be visually appropriate with Bruno in terms of his design of light, and what this world should look like.”

All the time, says Doyle, they worked to creatively honor Joe’s overall vision of where the film should be from both the narrative and the visual viewpoint.

The creative team, he continues, was focused on what they hoped to achieve in terms of “the emotional experience with the visuals,” what did they want this movie to look like and, technically, how could they get the feeling of that imagery onto the screen?

Research and Style Guide
They set about building a philosophy of what the on-screen vision of the film would be. That turned into a “style guide” manifesto for actually getting it on screen. Knowing the film was set in the 1940s during World War II, they logically examined newsreels and the cameras and lenses used at the time. One of the things that came out of the discussions with Joe and Bruno was the choice of the 1.85:1 aspect ratio. “It’s quite an ensemble cast, and 2.35:1 would let you spread the cast across the screen, but the wide 1.85:1 felt most appropriate for that.”

Doyle also did some research at the Victoria and Albert Museum’s very large photographic collection and dug into his own collection of photographic prints made with alternate color processes. Sepia and black and white got ruled out. They investigated the color films of the time and settled in on the color work of Edward Steichen.

Delbonnel chose Arri Alexa SXT cameras and Cooke S4s and Angenieux zoom lenses. They mastered in ArriRaw 3.2K. Technicolor has technology that allowed Doyle to build a “broad stroke” color-model-based emulation of what the color processes were like in the ’40s and apply that to the Alexa. “The idea,” explains Doyle, “was to take the image from the Alexa camera and mold it into an approximation of what the color film stocks would have looked like at the time. Then, having got into that world, tweak it slightly, because that’s quite a strong look,” and they still needed it to be “sensitive to the skin tones of the actors.”

Color Palette and Fabrics
There was an “overall arc” to this moment in history, says Doyle. The film’s setting was London during WWII, and outside it was hot and sunny. Inside, all lights were dimmed filaments, and that created a scenario where visually they would have extremely high-contrast images. All the colors were natural-based dyes, he explains, and the fabrics were various kind of wools and silks. “The walls and the actual environment that everyone would have been in would be a little run down. There would have been quite a patina and texture on the walls, so a lot of dirt and dust. These were kind of the key points that they gave me in order to work something out.”

Doyle’s A-ha Moment
“I took some hero shots of Kristin Scott Thomas (Clementine Churchill) and Gary Oldman (Winston Churchill), along with a few of the other actors, from Bruno’s rushes,” explains Doyle, adding that those shots became his reference.

From those images he devised different LUTs (Look Up Tables) that reflected different kinds of color manipulation processes of the time. It also meant that during principal photography they could keep referencing how the skin tones were working. There are a lot of close-ups and medium close-ups in Darkest Hour that gave easy access to the performance, but it also required them to be very aware of the impact of lighting on prosthetics and makeup.

Doyle photographed test charts on both the Alexa and 120 Ektachrome reversal film he had sitting in his freezer from the late ’70s. “The ‘a-ha moment’ was when we ran a test image through both. It was just staggering how different the imagery really looked. It gave us a good visual reference of the differences between film and digital, but more accurately the difference between reversal film and digital. It allowed us to zero in on the reactions of the two imaging methods and build the show LUTs and the emulation of the Steichen look.”

One Word
When Doyle worked on Inside Llewyn Davis, Delbonnel and the Coen brothers defined the look of the film with one word: “sad.” For Darkest Hour, the one word used was “contrast,” but as a multi-level definition, not just in the context of lights and darks in the image. “It just seemed to be echoed across all the various facets of this film,” says Doyle. “Certainly, Darkest Hour is a story of contrasting opinions. In terms of story and moments, there are soldiers at war in trenches, whilst there are politicians drinking champagne — certainly contrast there. Contrast in terms of the environment with the extreme intense hot summer outside and the darkness and general dullness on the inside.”

A good example, he says, is “the Parliament House speech that’s being delivered with amazing shafts of light that lit up the environment.”

The DP’s Signature
Doyle feels that digital cinematography tends to “remove the signature” of the director of photography, and that it’s his job to put it back. “In those halcyon days of film negative, there were quite a lot of processes that a DP would use in the lab that would become part of the image.” A classic example, he says, is Terrence Malick’s Days of Heaven, which was shot mostly during sunrise and sunset by Nestor Almendros, with “the extraordinary lightness of the image.” Or Stanley Kubrick’s Barry Lyndon, which was shot by John Alcott with scenes lit entirely by candles “that have a real softness.” The looks of those movies are a combination of the cinematographer’s lighting and work with the lab.

“A digital camera is an amazing recording device. It will faithfully reproduce what it records on set,” says Doyle. “What I’ve done with Bruno in the testing stage is bring back the various processes that you would possibly do in the lab, or at least the concept of what you would do in the laboratory. We’re really bending and twisting the image. Everyone sees the film the way that the DP intends, and then everyone’s relationship with that film is via this grade.”

This is why it’s so important to Doyle to have input from day-one rushes through to the end: he’s making sure the DP’s “signature” stays consistent through to the final grade. On Darkest Hour they tested, built and agreed on a look for the film’s rushes. Colorist Mel Kangleon worked with Delbonnel on a daily basis to make sure all the exposures were correct from a technical viewpoint and, aesthetically, that the grade and look were not being lost.

“The grades that we were doing were what was intended by Bruno, and we made sure the actual imagery on the screen was how he wanted it to be,” explains Doyle. “We were making sure that the signature was being carried through.”

Darkest Hour and HDR
On Darkest Hour, Doyle built the DCI grade for the Xenon projector, at 14 foot-lamberts, as the master color-corrected deliverable. “Then we took what was pretty much the LAD gray-card value of that DCI grade. So a very classic 18% gray was translated across to the 48-, the 108-, the 1,000- and the 4,000-nit grades. We essentially parked the LAD gray (18% gray) at what we just felt was an appropriate brightness. There is not necessarily a lot of color science to that, other than saying, ‘this feels about right.’ That’s (also) very dependent on the ambient light levels.”

The DCI projector, notes Doyle, doesn’t really have “completely solid blacks; they’re just a little gray.” Doyle wished that the Xenon could’ve been brighter, but that is what the theatrical distribution chain is at the moment, he says.

When they did the HDR (High Dynamic Range) version, which Doyle calls a “new language” of color correction, they took the opportunity to add extra contrast and dial the blacks down to true black. “I was able to get some more detail in the lower shadows, but then have absolutely solid blacks — likewise on the top end. We opened up the highlights to be even more visceral in their brightness. Joe Wright says he fell in love with the Dolby Vision.”

If you’re sitting in a Dolby Vision Cinema, says Doyle, you’re sitting in a black box. “Therefore, you don’t necessarily need to have the image as bright as a Rec 709 grade or LAD gray, which is typically for a lounge room where there are some lights on. There is a definite ratio between the presumed ambient light level of a room and where they park that LAD,” explains Doyle.

Knowing where they want the overall brightness of the film to be, they translate the tone curve to maintain exactly what they did in the DCI grade. Then perceptually it appears the same in the various mediums. Next they custom enhance each grade for the different display formats. “I don’t really necessarily call it a trim pass; it’s really adding a flare pass,” elaborates Doyle. “A DCI projector has quite a lot of flare, which means it’s quite organic and reactive to the image. If you project something on a laser, it doesn’t necessarily have anywhere near that amount of flare, and that can be a bit of a shock. Suddenly, your highlights are looking incredibly harsh. We went through and really just made sure that the smoothness of the image was maintained and emulated on the other various mediums.”

Doyle also notes that Darkest Hour benefited from his work with Technicolor color scientists Josh Pines and Chris Kutchka on new color modeling tools, and from being able “to build 3D LUTs that you can edit and that are cleaner. That can work in a little more containable way.”

Advice and Awards
In the bright new world of color correction, what questions would Doyle suggest asking directors? “What is their intent emotionally with the film? How do they want to reinforce that with color? Is it to be approached in a very literal way, or should we think about coming up with some kind of color arc that might be counterintuitive? This will give you a feel for the world that the director has been thinking of, and then see if there’s a space to come at it from a slightly unexpected angle.”

I asked Doyle if we have reached the point where awards committees should start thinking about an Academy Award category for color grading.

Knowing what an intensely collaborative process color grading is, Doyle responded that it would be quite challenging. “The pragmatist in me says it could be tricky to break it down in terms of the responsibilities. It depends on the relationship between the colorist, the DP and the director. It really does change with the personalities and the crew. That relationship could make the breakdown a little tricky just to work out whose idea was it to actually make it, for example, blue.”

Because this interview was conducted in December, I asked Doyle what he would ask Santa to bring him for Christmas. His response? “I really think the new frontier is gamut mapping and gamut editing — that world of fitting one color space into another. I think being able to edit those color spaces with various color models that are visually more appropriate is pretty much the new frontier.”


Daniel Restuccio is a producer and teacher based in Southern California.

AICP and AICE to merge January 1

The AICP and AICE are celebrating the New Year in a very special way — they are merging into one organization. These two associations represent companies that produce and finish the majority of advertising and marketing content in the moving image. Post merger, AICP and AICE will function as a single association under the AICP brand. They will promote and advocate for independent production and post companies when it comes to producing brand communications for advertising agencies, advertisers and media companies.

The merger comes after months of careful deliberations on the part of each association’s respective boards and final votes of approval by their memberships. Under the newly merged association’s structure, executive director of AICE Rachelle Madden will assume the title of VP, post production and digital production affairs of AICP. She will report to president/CEO of AICP Matt Miller. Madden is now tasked with taking the lead on AICP’s post production offerings, including position papers, best practices, roundtables, town halls and other educational programs. She will also lead a post production council, which is being formed to advise the AICP National Board on post matters.

Former AICE members will be eligible to join the General Member Production companies of AICP, with access to all benefits starting in 2018. These include: Participation in the Producers’ Health Benefits Plan (PHBP); the AICP Legal Initiative (which provides legal advice on contracts with agencies and advertisers); and access to position papers, guidelines and other tools as they relate to business affairs and employment issues. Other member benefits include access to attend meetings, roundtables, town halls and seminars, as well as receiving the AICP newsletter, member discounts on services and a listing in the AICP membership directory on the AICP website.

All AICP offerings — including its AICP Week Base Camp for thought leadership — will reflect the expanded membership to include topics and issues pertaining to post production. Previously created AICE documents, position papers and forms will now live on aicp.com.

The AICP was founded in 1972 to protect the interests of independent commercial producers, crafting guidelines and best practice in an effort to help its members run their businesses more effectively. Through its AICP Awards, the organization celebrates creativity and craft in marketing communications.

AICE was founded in 1998, when three independent groups representing editing companies in Chicago, Los Angeles and New York formed a national association to discuss issues and undertake initiatives affecting post production on a broader scale. In addition to editing, the association represents the full range of post production disciplines, including color correction, visual effects, audio mixing, and music and sound design.

From AICP’s perspective, says Miller, merging the two organizations has benefits for members of both groups. “As we grow more closely allied, it makes more sense than ever for the organizations to have a unified voice in the industry,” he notes. He points out that there are numerous companies that are members of both organizations, reflecting the blurring of the lines between production and post that’s been occurring as media platforms, technologies and client needs have changed.

For Madden, AICE’s members will be joining an organization that provides them with a firm footing in terms of resources, programs, benefits and initiatives. “There are many reasons why we moved forward on this merger, and most of them involve amplifying the voice of the post production industry by combining our interests and advocacy with those of AICP members. We now become part of a much larger group, which gives us a strength in numbers we didn’t have before while adding critical post production perspectives to key discussions about business practices and industry trends.”

Main Image: Matt Miller and Rachelle Madden

Making 6 Below for Barco Escape

By Mike McCarthy

There is a new movie coming out this week that is fairly unique. Telling the true story of Eric LeMarque surviving eight days lost in a blizzard, 6 Below: Miracle on the Mountain is the first film shot and edited in its entirety for the new Barco Escape theatrical format. If you don’t know what Barco Escape is, you are about to find out.

This article is meant to answer just about every question you might have about the format and how we made the film, on which I was post supervisor, production engineer and finishing editor.

What is Barco Escape?
Barco Escape is a wraparound visual experience — it consists of three projection screens filling the width of the viewer’s vision, with a total aspect ratio of 7.16:1. The exact field of view varies depending on where you are sitting in the auditorium, but it is usually 120-180 degrees. Similar to IMAX, it is not about filling the entire screen with your main object, but about leaving that in front of the audience and letting the rest of the image surround them and fill their peripheral vision for a more immersive experience. Three separate 2K scope theatrical images play at once, resulting in 6144×858 pixels of imagery to fill the room.
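The pixel math behind that canvas is simple to verify, since a 2K scope container is 2048×858:

```python
# Barco Escape canvas: three 2K scope (2048x858) images side by side.
screen_w, screen_h = 2048, 858   # one 2K scope theatrical container
total_w = 3 * screen_w           # three screens across
aspect = total_w / screen_h      # overall aspect ratio of the canvas

print(total_w, screen_h, round(aspect, 2))
# → 6144 858 7.16
```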

Is this the first Barco Escape movie?
Technically, four other films have screened in Barco Escape theaters, the most popular one being last year’s release of Star Trek Beyond. But none of these films used the entire canvas offered by Escape throughout the movie. They had up to 20 minutes of content on the side screens, but the rest of the film was limited to the center screen that viewers are used to. Every shot in 6 Below was framed with the surround format in mind, and every pixel of the incredibly wide canvas is filled with imagery.

How are movies created for viewing in Escape?
There are two approaches that can be used to fill the screen with content. One is to place different shots on each screen in the process of telling the story. The other is to shoot a wide enough field of view and high enough resolution to stretch a single image across the screens. For 6 Below, director Scott Waugh wanted to shoot everything at 6K, with the intention of filling all the screens with main image. “I wanted to immerse the viewer in Eric’s predicament, alone on the mountain.”

Cinematographer Michael Svitak shot with the Red Epic Dragon. He says, “After testing both spherical and anamorphic lens options, I chose to shoot Panavision Primo 70 prime lenses because of their pristine quality across the entire imaging frame.” He recorded in 6K-WS (a 2.37:1 aspect ratio at 6144×2592), framing with both the 7:1 Barco Escape canvas and a 2.76:1 4K extraction in mind. Red does have 8:1 and 4:1 options that could work if Escape were your only deliverable. But since there are very few Escape theaters at the moment, you would literally be painting yourself into a corner. Having more vertical resolution available in the source footage opens up all sorts of workflow possibilities.
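As a sketch of the framing math, the window sizes below are inferred from the aspect ratios quoted above; they are not the production’s actual crop settings:

```python
# Extraction windows from the 6K-WS source (6144x2592, ~2.37:1).
# Sizes are inferred from the stated aspect ratios, not production data.
src_w, src_h = 6144, 2592

escape_h = 858                      # full-width band for the 7.16:1 Escape canvas
escape_headroom = src_h - escape_h  # rows left over to slide the crop vertically

four_k_w = 4096
four_k_h = round(four_k_w / 2.76)   # height of a 2.76:1 4K extraction

print(escape_headroom, four_k_h)
# → 1734 1484
```

The point of the extra vertical resolution is visible in the numbers: the full-width Escape band uses only 858 of the 2592 source rows, leaving plenty of room to reframe shots vertically in post.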

This still left a few challenges in post: to adjust the framing for the most comfortable viewing and to create alternate framing options for other deliverables that couldn’t use the extreme 7:1 aspect ratio. Other projects have usually treated the three screens separately throughout the conform process, but we treated the entire canvas as a single unit until the very last step, breaking out three 2K streams for the DCP encode.

What extra challenges did Barco Escape delivery pose for 6 Below’s post workflow?
Vashi Nedomansky edited the original 6K R3D files in Adobe Premiere Pro, without making proxies, on some maxed-out Dell workstations. We did the initial edit with curved ultra-wide monitors and 4K TVs. “Once Mike McCarthy optimized the Dell systems, I was free to edit the source 6K Red RAW files and not worry about transcodes or proxies,” he explains. “With such a quick turnaround every day, and so much footage coming in, it was critical that I could jump on the footage, cut my scenes, see if they were playing well and report back to the director that same day if we needed additional shots. This would not have been possible time-wise if we were transcoding and waiting for footage to cut. I kept pushing the hardware and software, but it never broke or let me down. My first cut was 2 hours and 49 minutes long, and we played it back on one Premiere Pro timeline in realtime. It was crazy!”

All of the visual effects were done at the full shooting resolution of 6144×2592, as was the color grade. Once Vashi had the basic cut in place, there was no real online conform, just some cleanup work to do before sending it to color as an 8TB stack of 6K frames. At that point, we started examining it from the three-screen perspective, using three TVs to preview it in realtime, courtesy of the Mosaic functionality built into Nvidia’s Quadro GPU cards. Shots were realigned to avoid having important imagery in the seams, and some areas were stretched to compensate for the angle of the side screens from the audience’s perspective.

DP Michael Svitak and director Scott Waugh

Once we had the final color grade completed (via Mike Sowa at Technicolor using Autodesk Lustre), we spent a day in an Escape theater analyzing the reflections between the screens and their effect on contrast. We made a lot of adjustments to keep the luminance of the side screens from washing out the darks on the center screen, which you can’t simulate on TVs in the edit bay. “It was great to be able to make the final adjustments to the film in realtime in that environment. We could see the results immediately on all three screens and how they impacted the room,” says Waugh.

Once we added the 7.1 mix, we were ready to export assets for delivery in many different formats and aspect ratios. Making the three streams for Escape playback was as simple as using the crop tool in Adobe Media Encoder to trim the sides in 2K increments.
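The bookkeeping behind those crops can be sketched as three 2048-pixel windows across the master; this illustrates the arithmetic, not the actual Media Encoder export settings:

```python
# Split the 6144x858 Escape master into left/center/right 2K streams.
# Each stream is a 2048-pixel-wide window at a 2048-pixel offset.
master_w, screen_w = 6144, 2048

windows = [(x, x + screen_w) for x in range(0, master_w, screen_w)]
for name, (left, right) in zip(("left", "center", "right"), windows):
    print(f"{name}: columns {left}-{right - 1}")
# → left: columns 0-2047
#   center: columns 2048-4095
#   right: columns 4096-6143
```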

How can you see movies in the Barco Escape format?
Barco maintains a list of theaters that have Escape screens installed, which can be found at ready2escape.com. But for readers in the LA area, the only opportunity to see a film in Barco Escape in the foreseeable future is to attend one of the Thursday night screenings of 6 Below at the Regal LA Live Stadium or the Cinemark XD at Howard Hughes Center. There are other locations available to see the film in standard theatrical format, but as a new technology, Barco Escape is only available in a limited number of locations. Hopefully, we will see more Escape films and locations to watch them in the future.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Sim Group purchases Vancouver’s The Crossing Studios

Sim Group, a family of companies offering production and post services across TV, feature film and commercials, has strengthened its place in the industry with the acquisition of Vancouver-based The Crossing Studios. This full-service studio and production facility adds approximately 400,000 square feet to Sim’s footprint.

With proximity to downtown Vancouver, the city’s international airport and all local suppliers, The Crossing Studios has been home to many television series, specials and feature films. In addition to providing full-service studio rentals, mill/paint/lockup space and production office space, The Crossing Studios also offers post production services, including Avid suite rentals, dailies, color correction and high-speed connectivity.

The Crossing Studios was founded by Dian Cross-Massey in 2015 and is the second-largest studio facility in the Lower Mainland, comprising nine buildings in Vancouver, all located just 30 minutes from downtown. Cross-Massey has over 25 years of experience in the industry, having worked as a writer, executive producer, visual effects supervisor, director, producer and production manager. Thanks to this experience, Cross-Massey prides herself on knowing first-hand how to anticipate client needs and contribute to the success of her clients’ projects.

“When I was a producer, I worked with Sim regularly and always felt they had the same approach to fair, honest work as I did, so when the opportunity presented itself to combine resources and support our shared clients with more offerings, the decision to join together felt right,” says Cross-Massey.

The Crossing Studios clients include Viacom, Fox, Nickelodeon, Lifetime, Sony Pictures, NBCUniversal and ABC.

“The decision to add The Crossing Studios to the Sim family was a natural one,” says James Haggarty, CEO, Sim Group. “Through our end-to-end services, we pride ourselves on delivering streamlined solutions that simplify the customer experience. Dian and her team are extremely well respected within the entertainment industry, and together, we’ll not only be able to support the incredible growth in the Vancouver market, but clients will have the option to package everything they need from pre-production through post for better service and efficiencies.”