Category Archives: production

Luke Scott to run newly created Ridley Scott Creative Group

Filmmaker Ridley Scott has brought all of his RSA Films-affiliated companies together in a multi-business restructure to form the Ridley Scott Creative Group. The group aims to strengthen the network across the related companies to take advantage of emerging opportunities across all entertainment genres, alongside their existing work in film, television, branded entertainment, commercials, VR, short films, documentaries, music videos, design and animation, and photography.

Ridley Scott

Luke Scott will assume the role of global CEO, working with founder Ridley Scott and partners Jake and Jordan Scott to oversee the future strategic direction of the newly formed group.

“We are in a new golden age of entertainment,” says Ridley Scott. “The world’s greatest brands, platforms, agencies, new entertainment players and studios are investing hugely in entertainment. We have brought together our talent, capabilities and creative resources under the Ridley Scott Creative Group, and I look forward to maximizing the creative opportunities we now see unfolding with our executive team.”

The companies that make up the RSCG will continue to operate autonomously but will now offer clients synergy under the group offering.

The group includes commercial production company RSA Films, which produced such ads such as Apple’s 1984, Budweiser’s Super Bowl favorite Lost Dog and more recently, Adidas Originals’ Original is Never Finished campaign, as well as branded content for Johnnie Walker, HBO, Jaguar, Ford, Nike and the BMW Films series; the music video production company founded by Jake Scott, Black Dog Films (Justin Timberlake, Maroon 5, Nicki Minaj, Beyoncé, Coldplay, Björk and Radiohead); the entertainment marketing company 3AM; commercial production company Hey Wonderful founded by Michael Di Girolamo; newly founded UK commercial production company Darling Films; and film and television production company Scott Free (Gladiator, Taboo, The Martian, The Good Wife), which continues to be led by David W. Zucker, president, US television; Kevin J. Walsh, president, US film; and Ed Rubin-Managing, director, UK television/film.

“Our Scott Free Films and Television divisions have an unprecedented number of movies and shows in production,” reports Luke Scott, global CEO of the Ridley Scott Creative Group. “We are also seeing a huge appetite for branded entertainment from our brand and agency partners to run alongside high-quality commercials. Our entertainment marketing division 3AM is extending its capabilities to all our partners, while Black Dog is moving into short films and breaking new, world-class talent. It is a very exciting time to be working in entertainment.”


Sim and the ASC partner on educational events, more

During Cine Gear recently, Sim announced a 30-year sponsorship with the American Society of Cinematographers (ASC). Sim offers end-to-end solutions for creatives in film and television, and the ASC is a nonprofit focusing on the art of cinematography. As part of the relationship, the ASC Clubhouse courtyard will now be renamed Sim Plaza.

Sim and the ASC have worked together frequently on events that educate industry professionals on current technology and its application to their evolving craft. As part of this sponsorship, Sim will expand its involvement with the ASC Master Classes, SimLabs, and conferences and seminars in Hollywood and beyond.

During an official ceremony, a commemorative plaque was unveiled and embedded into the walkway of what is now Sim Plaza in Hollywood. Sim will also host a celebration of the ASC’s 100th anniversary in 2019 at Sim’s Hollywood location.

What else does this partnership entail?
• The two organizations will work together closely over the next 30 years on educational events for the cinematography community. Sim’s sponsorship will help fund society programs and events to educate industry professionals (both practicing and aspiring) on current technology and its application to the evolving craft.
• The ASC Master Class program, SimLabs and other conferences and seminars will continue over these 30 years, with Sim increasing its involvement. Sim is not telling the ASC what kind of initiatives it should pursue, but is rather lending a helping hand to drive visual storytelling forward. For example, Sim has already hosted ASC Master Class sessions in Toronto and Hollywood, has sponsored the annual ASC BBQ for the last couple of years, and founder Rob Sim himself is an ASC associate member.

How will the partnership increase programming and resources to support the film and television community for the long term?
• It has a large focus on three things: financial resources, programming assistance and facility support.
• It will provide access and training with world-class technology in film and television.
• It will offer training directly from industry leaders in Hollywood and beyond.
• It will develop new programs for people who can’t attend ASC Master Class sessions, such as an online experience, which is something ASC and Sim are working on together.
• It will expand SimLabs beyond Hollywood — with the potential to bring it to Vancouver, Atlanta, New York and Toronto — with the goal of creating new avenues for people who are associated with the ASC and who know they can call on Sim.
• It will bring volunteers. Sim has many volunteers on ASC committees, including the Motion Imaging Technology Council and its Lens committee.

Main Image: L-R: Sim President/CEO James Haggarty, Sim founder and ASC associate member Rob Sim, ASC events coordinator Patty Armacost and ASC president Kees van Oostrum.


Testing large format camera workflows

By Mike McCarthy

In the last few months, we have seen the release of the Red Monstro, Sony Venice, Arri Alexa LF and Canon C700 FF, all of which have larger or full-frame sensors. “Full frame” comes from DSLR terminology and refers to a sensor covering the entire 35mm film area, the way film was used horizontally in still cameras. All SLRs used to be full frame with 35mm film, so there was no need for the term until manufacturers started saving money on digital image sensors by making them smaller than 35mm film exposures. Super35mm motion picture cameras, on the other hand, ran the film vertically, resulting in a smaller exposure area per frame, but this was still much larger than most video imagers until the last decade, when 2/3-inch chips were considered premium imagers. The options have grown a lot since then.

L-R: 1st AC Ben Brady, DP Michael Svitak and Mike McCarthy on the monitor.

Most of the top-end cinema cameras released over the last few years have advertised their Super35mm sensors as a huge selling point, as that allows use of any existing S35 lens on the camera. These S35 cameras include the Epic, Helium and Gemini from Red, Sony’s F5 and F55, Panasonic’s VariCam LT, Arri’s Alexa and Canon’s C100-500. On the top end, 65mm cameras like the Alexa 65 have sensors twice as wide as Super35 cameras, but very limited lens options to cover a sensor that large. Full frame falls somewhere in between and allows, among other things, use of any 35mm still film lenses. In the world of film, this was referred to as VistaVision, but the first widely used full-frame digital video camera was Canon’s 5D MkII, the first serious HDSLR. That format has surged in popularity recently, and thanks to this I had the opportunity to be involved in a test shoot with a number of these new cameras.

Keslow Camera was generous enough to give DP Michael Svitak and myself access to pretty much all their full-frame cameras and lenses for the day in order to test the cameras, workflows and lens options for this new format. We also had the assistance of first AC Ben Brady to help us put all that gear to use, and Mike’s daughter Florendia as our model.

First off was the Red Monstro, which while technically not the full 24mm height of true full frame, uses the same size lenses due to the width of its 17×9 sensor. It offers the highest resolution of the group at 8K. It records compressed RAW to R3D files, with options for ProRes and DNxHR up to 4K, all saved to Red mags. Like the rest of the group, smaller portions of the sensor can be used at lower resolution to pair with smaller lenses. The Red Helium sensor has the same resolution but in a much smaller Super35 size, allowing a wider selection of lenses to be used. But larger pixels allow more light sensitivity, with individual pixels up to 5 microns wide on the Monstro and Dragon, compared to Helium’s 3.65-micron pixels.
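Those micron figures follow directly from sensor width divided by horizontal photosite count. As a rough sanity check (the sensor widths below are commonly published spec-sheet figures, assumed here for illustration rather than taken from this shoot):

```python
def pixel_pitch_microns(sensor_width_mm, horizontal_pixels):
    """Approximate photosite pitch: sensor width divided by pixel count, in microns."""
    return sensor_width_mm / horizontal_pixels * 1000  # mm -> microns

# Assumed published full-width specs, both 8192 photosites across:
monstro = pixel_pitch_microns(40.96, 8192)  # Red Monstro, VistaVision-width sensor
helium = pixel_pitch_microns(29.90, 8192)   # Red Helium, Super35-width sensor

print(round(monstro, 2))  # 5.0 microns
print(round(helium, 2))   # 3.65 microns
```

Same 8K pixel count, very different pitch, which is where the sensitivity difference mentioned above comes from.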

Next up was Sony’s new Venice camera with a 6K full-frame sensor, allowing 4K S35 recording as well. It records XAVC to SxS cards or compressed RAW in the X-OCN format with the optional AXS-R7 external recorder, which we used. It is worth noting that both full-frame recording and integrated anamorphic support require additional special licenses from Sony, but Keslow provided us with a camera that had all of that functionality enabled. With a 36x24mm 6K sensor, the pixels are 5.9 microns, and footage shot at 4K in the S35 mode should be similar to shooting with the F55.

We unexpectedly had the opportunity to shoot on Arri’s new Alexa LF (Large Format) camera. At 4.5K, this had the lowest resolution, but that also means the largest sensor pixels at 8.25 microns, which can increase sensitivity. It records ArriRaw or ProRes to Codex XR capture drives with its integrated recorder.

Another new option is the Canon C700 FF, with a 5.9K full-frame sensor recording RAW, ProRes or XAVC to CFast cards or Codex drives. That gives it 6-micron pixels, similar to the Sony Venice. But we did not have the opportunity to test that camera this time around; maybe in the future.

One more factor in all of this is the rising popularity of anamorphic lenses. All of these cameras support modes that use the part of the sensor covered by anamorphic lenses and can desqueeze the image for live monitoring and preview. In the digital world, anamorphic essentially cuts your overall resolution in half, until the unlikely event that we start seeing anamorphic projectors or cameras with rectangular sensor pixels. But the prevailing attitude appears to be, “We have lots of extra resolution available so it doesn’t really matter if we lose some to anamorphic conversion.”
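To see why a 2x squeeze roughly halves resolution, consider a hypothetical 2x anamorphic capture on a 4096×3456 sensor region (illustrative numbers, not from this shoot):

```python
def desqueezed_aspect(width_px, height_px, squeeze):
    """Delivered aspect ratio after horizontally stretching a squeezed capture."""
    return (width_px * squeeze) / height_px

# Hypothetical 2x anamorphic capture region
aspect = desqueezed_aspect(4096, 3456, 2.0)
print(round(aspect, 2))  # ~2.37 -- close to the 2.39:1 'Scope frame

# A spherical lens delivering that same frame at the same height would need
# 4096 * 2 = 8192 real photosites across; the anamorphic capture records only
# 4096 horizontal samples and stretches them to cover the delivered width.
print(4096 * 2.0 / 4096)  # 2.0 -- half the horizontal sampling per delivered pixel
```

The desqueeze happens in monitoring and in post; the missing horizontal samples never come back, which is the trade-off the “lots of extra resolution” attitude is accepting.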

Post Production
So what does this mean for post? In theory, sensor size has no direct effect on the recorded files (besides their content), but resolution does. We also have a number of new formats to deal with, and then we have to handle anamorphic images during finishing.

Ever since I got my hands on one of Dell’s new UP3218K monitors with an 8K screen, I have been collecting 8K assets to display on it. When I first started discussing this shoot with DP Michael Svitak, I was primarily interested in getting some more 8K footage to test out new 8K monitors, editing systems and software as they were released. I was anticipating getting Red footage, which I knew I could play back and process using my existing software and hardware.

The other cameras and lens options were added as the plan expanded, and by the time we got to Keslow Camera, they had filled a room with lenses and gear for us to test with. I also had a Dell 8K display connected to my ingest system, and the new 4K DreamColor monitor as well. This allowed me to view the recorded footage in the highest resolution possible.

Most editing programs, including Premiere Pro and Resolve, can handle anamorphic footage without issue, but new camera formats can be a bigger challenge. Any RAW file requires info about the sensor pattern in order to debayer it properly, and new compression formats are even more work. Sony’s new compressed RAW format for Venice, called X-OCN, is supported in the newest 12.1 release of Premiere Pro, so I didn’t expect that to be a problem. Its other recording option is XAVC, which should work as well. The Alexa on the other hand uses ArriRaw files, which have been supported in Premiere for years, but each new camera shoots a slightly different “flavor” of the file based on the unique properties of that sensor. Shooting ProRes instead would virtually guarantee compatibility but at the expense of the RAW properties. (Maybe someday ProResRAW will offer the best of both worlds.) The Alexa also has the challenge of recording to Codex drives that can only be offloaded in OS X or Linux.

Once I had all of the files on my system, after using a MacBook Pro to offload the media cards, I tried to bring them into Premiere. The Red files came in just fine but didn’t play back smoothly over 1/4 resolution. They played smoothly in RedCineX with my Red Rocket-X enabled, and they export respectably fast in AME (a five-minute 8K anamorphic sequence to UHD H.265 in 10 minutes), but for some reason Premiere Pro isn’t able to get smooth playback when using the Red Rocket-X. Next I tried the X-OCN files from the Venice camera, which imported without issue. They played smoothly on my machine but looked like they were locked to half or quarter res, regardless of what settings I used, even in the exports. I am currently working with Adobe to get to the bottom of that, because they are able to play back my files at full quality while all my systems have the same issue. Lastly, I tried to import the Arri files from the Alexa LF, but Adobe doesn’t support that new variation of ArriRaw yet. I would anticipate that will happen soon, since it shouldn’t be too difficult to add the new version to the existing support.

I ended up converting the files I needed to DNxHR in DaVinci Resolve so I could edit them in Premiere, and I put together a short video showing off the various lenses we tested with. Eventually, I need to learn how to use Resolve more efficiently, but the type of work I usually do lends itself to the way Premiere is designed — inter-cutting and nesting sequences with many different resolutions and aspect ratios. Here is a short clip demonstrating some of the lenses we tested with:

This is a web video, so even at UHD it is not meant to be an analysis of the RAW image quality, but instead a demonstration of the field of view and overall feel with various lenses and camera settings. The combination of the larger sensors and the anamorphic lenses leads to an extremely wide field of view. The table was only about 10 feet from the camera, and we could usually see all the way around it. We also discovered that when recording anamorphic on the Alexa LF, we were recording a wider image than was displaying on the monitor output. You can see in the frame grab below that the live display visible on the right side of the image isn’t displaying the full content that got recorded, which is why we didn’t notice that we were recording with the wrong settings with so much vignetting from the lens.

We only discovered this after the fact, from this shot, so we didn’t get the opportunity to track down the issue to see if it was the result of a setting in the camera or in the monitor. This is why we test things before a shoot, but we didn’t “test” before our camera test, so these things happen.

We learned a lot from the process, and hopefully some of those lessons are conveyed here. A big thanks to Brad Wilson and the rest of the guys at Keslow Camera for their gear and support of this adventure and, hopefully, it will help people better prepare to shoot and post with this new generation of cameras.

Main Image: DP Michael Svitak


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


HPA Tech Retreat: The production budget vs. project struggle

“Executive producers often don’t speak tech language,” said Aaron Semmel, CEO and head of BoomBoomBooya, in addressing the HPA Tech Retreat audience in Palm Springs in late February. “When people come to us with requests and spout all sorts of tech mumbo jumbo, it’s very easy for us to say no,” he continued. “Trust me, you need to speak to us in our language.”

Semmel was part of a four-person HPA panel that included Cirina Catania, The Catania Group; Larry O’Connor, OWC Digital; and Jeff Stansfield, Advantage Video Systems. Moderated by Andy Marken of Marken Communications, the panel explored solutions that can bring the executive and line producers and the production/post teams closer together to implement the right solutions for every project and satisfy everyone, including accounting.

An executive and co-producer on more than a dozen film and TV series projects, Semmel said his job is to bring together the money and then work with the best creative people possible. He added that the team’s job was to make certain the below-the-line items — actual production and post production elements — stay on or below budget.

Semmel noted that executive producers often work off of the top sheet, typically an overview of the budget. He explained that executive producers may go through all of the budget and play with numbers here and there but leave the actual handling of the budget to the line producer and supervising producer. In this way, they can “back into” a budget number set by the executive producer.

“I understand the technologies at a higher level and could probably take a highlighter and mark budget areas where we could reduce our costs, but I also know I have very experienced people on the team who know the technologies better than I do to make effective cuts.

L-R: Jeff Stansfield, Aaron Semmel, Cirina Catania

“For example, in talking with many of you in the audience here at the Retreat, I learned that there’s no such thing as an SSD hard drive,” he said. “I now know there are SSDs and there are hard drives and they’re totally different.”

Leaning into her mic, Catania got a laugh when she said, “One of the first things we all have to do is bring our production workflows into the 21st century. But seriously, the production and post teams are occasionally not consulted during the lengthy budgeting process. Our keys can make some valuable contributions if they have a seat at the table during the initial stages. In terms of technology, we have some exciting new tools we’d like to put to work on the project that could save you valuable time, help you organize your media and metadata, and have a direct and immediate positive impact on the budget. What if I told you that you could save endless hours in post if you had software that helped your team enter metadata and prep for post during the early phase — and hardware that worked much faster, more securely and more reliably?”

With wide agreement from the audience, Catania emphasized that it is imperative for all departments involved in prep/production/post and distribution to be involved in the budget process from the outset.

“We know the biggest part of your budget might be above-the-line costs,” she continued. “But production, post and distribution are where much of the critical work also gets done. And if we’re involved at the outset — and that includes people like Jeff (Stansfield), who can help us come up with creative workflow and financing options that will save you and the investors money — we will surely turn a profit.”

Semmel said the production/post team could probably be of assistance in the early budget stages to pinpoint where work could be done more efficiently to actually improve the overall quality and ensure EPs do what they need to do for their reputation… deliver the best and be under budget.

The Hatfields and the McCoys via History Channel

“But for some items, there seem to be real constraints,” he emphasized. “For example, we were shooting America’s Feud: Hatfields & McCoys, a historical documentary, in Romania — yes, Romania,” he grinned, “and we were behind schedule. We shot the farmhouse attack on day one, shot the burning of the house on day two, and on day three we received our dailies to review for day one’s work. We were certain we had everything we needed, so we took a calculated risk and burned the building,” he recalled. “But no one exhaled until we had a chance to go through the dailies.”

“What if I told you there’s a solution that will transfer your data at 2800MB/s and enable you to turn around your dailies in a couple of hours instead of a couple of days?” O’Connor asked.

Semmel replied, “I don’t understand the 2800MB/s stuff, but you clearly got my attention by saying dailies in a couple of hours instead of days. If there had been anything wrong with the content we had shot, we would have been faced with the huge added expense of rebuilding and reshooting everything,” he added. “Even accounting can understand the savings in hours vs. days.”
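O’Connor’s 2800MB/s figure translates directly into turnaround time. A back-of-envelope sketch (the 2TB-per-shoot-day footage volume is an assumed figure for illustration, not from the panel):

```python
def offload_hours(footage_gb, rate_mb_per_s):
    """Time to copy a day's media at a sustained transfer rate, in hours."""
    seconds = footage_gb * 1000 / rate_mb_per_s  # GB -> MB, then divide by MB/s
    return seconds / 3600

day_of_footage_gb = 2000  # assumed 2TB shoot day

fast = offload_hours(day_of_footage_gb, 2800)  # the 2800MB/s solution cited
slow = offload_hours(day_of_footage_gb, 160)   # e.g., a single spinning disk

print(round(fast, 2))  # ~0.2 hours -- minutes, not days
print(round(slow, 1))  # ~3.5 hours per copy, before any transcode or QC
```

Multiply the slow path by verification passes and backup copies and the “days vs. hours” gap Semmel reacted to is easy to see.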

Semmel pointed out that because films and TV shows start and end digital, there’s always a concern about frames and segments being lost when you’re on location and a long distance from the safety net of your production facilities.

“No one likes that risk, including production/post leaders, integrators or manufacturers,” said O’Connor. “In fact, a lot of crews go to extraordinary lengths to ensure nothing is lost; and frankly, I don’t blame them.”

He recalled a film crew traveling to Haiti to shoot a documentary who were told by the airline that they were over their baggage limit for the trip.

“They put their clothes in an airport locker and put their three RAID storage systems in their backpacks. They wanted to make certain they could store their work, back it up and back it up again to ensure they had all of the content they needed when they got back to their production/post facility.”

Stansfield and Catania said they had seen and heard of similar gut-level decisions made by executive and line producers. They encouraged the production/post audience not to simply accept the line item budgets they are given to work with but be more involved at the beginning of the project to explore and define all of the below-the-line budget to minimize risk and provide alternative plans just in case unexpected challenges arise.

“An EP and line producer’s mantra for TV and film projects is you only get two out of three things: time, money and quality,” Semmel said. “If you can deliver all three, then we’ll listen, but you have to approach it from our perspective.

“Our budgets aren’t open purses,” he continued. “You have to make recommendations and deliver products and solutions that enable us to stay under budget; otherwise, no matter how neat they are or how gee-whiz technical they are, they aren’t going to be accepted. We have two very fickle masters — finance and viewer — so you have to give us the tools and solutions that satisfy both of them. Don’t give us bits, bytes and specs; just focus on meeting our needs in words we can understand.

“When you do that, we all win; and we can all work on the next project together,” Semmel concluded. “We only surround ourselves with people who will help us through the project. People who deliver.”


Peter Doyle on coloring Churchill’s England for Darkest Hour

By Daniel Restuccio

Technicolor supervising digital colorist Peter Doyle is pretty close to being a legend in the movie industry. He’s color graded 12 of the 100 top box office movies, including Peter Jackson’s Lord of the Rings trilogy, six Harry Potter films, Aleksander Sokurov’s Venice Golden Lion-winning Faust, Joel and Ethan Coen’s Inside Llewyn Davis, The Ballad of Buster Scruggs and most recently the Golden Globe-nominated Darkest Hour.

Grading Focus Features’ Darkest Hour — which focuses on Winston Churchill’s early days as Prime Minister of the United Kingdom during WWII — represents a reunion for Doyle. He previously worked with director Joe Wright (Pan) and director of photography Bruno Delbonnel (Inside Llewyn Davis). (Darkest Hour picked up a variety of Oscar nominations, including Best Picture and Best Cinematography for Delbonnel.)

Peter Doyle

The vibe on Darkest Hour, according to Doyle, was very collaborative and inspiring. “Joe is an intensely visual director and has an extraordinary aesthetic… visually, he’s very considerate and very aware. It was just great to throw out ideas, share them and work to find what would be visually appropriate with Bruno in terms of his design of light, and what this world should look like.”

All the time, says Doyle, they worked to creatively honor Joe’s overall vision of where the film should be from both the narrative and the visual viewpoint.

The creative team, he continues, was focused on what they hoped to achieve in terms of “the emotional experience with the visuals,” what did they want this movie to look like and, technically, how could they get the feeling of that imagery onto the screen?

Research and Style Guide
They set about to build a philosophy of what the on-screen vision of the film would be. That turned into a “style guide” manifesto of actually how to get that on screen. They knew it was the 1940s during World War II, so logically they examined newsreels and the cameras and lenses that were used at the time. One of the things that came out of the discussions with Joe and Bruno was the choice of the 1.85:1 aspect ratio. “It’s quite an ensemble cast and the 2.35:1 would let you spread the cast across the screen, but wide 1.85:1 felt most appropriate for that.”

Doyle also did some research at the Victoria and Albert Museum’s very large photographic collection and dug into his own collection of photographic prints made with alternate color processes. Sepia and black and white got ruled out. They investigated the color films of the time and settled in on the color work of Edward Steichen.

Delbonnel chose Arri Alexa SXT cameras and Cooke S4s and Angenieux zoom lenses. They mastered in ArriRaw 3.2K. Technicolor has technology that allowed Doyle to build a “broad stroke” color-model-based emulation of what the color processes were like in the ’40s and apply that to the Alexa. “The idea,” explains Doyle, “was to take the image from the Alexa camera and mold it into an approximation of what the color film stocks would have looked like at the time. Then, having got into that world, tweak it slightly, because that’s quite a strong look,” and they still needed it to be “sensitive to the skin tones of the actors.”

Color Palette and Fabrics
There was an “overall arc” to this moment in history, says Doyle. The film’s setting was London during WWII, and outside it was hot and sunny. Inside, all lights were dimmed filaments, and that created a scenario where visually they would have extremely high-contrast images. All the colors were natural-based dyes, he explains, and the fabrics were various kinds of wools and silks. “The walls and the actual environment that everyone would have been in would be a little run down. There would have been quite a patina and texture on the walls, so a lot of dirt and dust. These were kind of the key points that they gave me in order to work something out.”

Doyle’s A-ha Moment
“I took some hero shots of Kristin Scott Thomas (Clementine Churchill) and Gary Oldman (Winston Churchill), along with a few of the other actors, from Bruno’s rushes,” explains Doyle, adding that those shots became his reference.

From those images he devised different LUTs (Look Up Tables) that reflected different kinds of color manipulation processes of the time. It also meant that during principal photography they could keep referencing how the skin tones were working. There are a lot of close-ups and medium close-ups in Darkest Hour that gave easy access to the performance, but it also required them to be very aware of the impact of lighting on prosthetics and makeup.

Doyle photographed test charts on both the Alexa and 120 Ektachrome reversal film he had sitting in his freezer from the late ’70s. “The ‘a-ha moment’ was when we ran a test image through both. It was just staggering how different the imagery really looked. It gave us a good visual reference of the differences between film and digital, but more accurately the difference between reversal film and digital. It allowed us to zero in on the reactions of the two imaging methods and build the show LUTs and emulation of the Steichen look.”

One Word
When Doyle worked on Inside Llewyn Davis, Delbonnel and the Coen brothers defined the look of the film with one word: “sad.” For Darkest Hour, the one word used was “contrast,” but as a multi-level definition, not just in the context of lights and darks in the image. “It just seemed to be echoed across all the various facets of this film,” says Doyle. “Certainly, Darkest Hour is a story of contrasting opinions. In terms of story and moments, there are soldiers at war in trenches, whilst there are politicians drinking champagne — certainly contrast there. Contrast in terms of the environment with the extreme intense hot summer outside and the darkness and general dullness on the inside.”

A good example, he says, is “the Parliament House speech that’s being delivered with amazing shafts of light that lit up the environment.”

The DP’s Signature
Doyle feels that digital cinematography tends to “remove the signature” of the director of photography, and that it’s his job to put it back. “In those halcyon days of film negative, there were quite a lot of processes that a DP would use in the lab that would become part of the image.” A classic example, he says, is Terrence Malick’s Days of Heaven, which was shot mostly during sunrise and sunset by Nestor Almendros, with “the extraordinary lightness of the image.” Or Stanley Kubrick’s Barry Lyndon, shot by John Alcott with scenes lit entirely by candles “that have a real softness.” The looks of those movies are a combination of the cinematographer’s lighting and work with the lab.

“A digital camera is an amazing recording device. It will faithfully reproduce what it records on set,” says Doyle. “What I’ve done with Bruno in the testing stage is bring back the various processes that you would possibly do in the lab, or at least the concept of what you would do in the laboratory. We’re really bending and twisting the image. Everyone sees the film the way that the DP intends, and then everyone’s relationship with that film is via this grade.”

This is why it’s so important to Doyle to have input from day-one rushes through to the end. He’s making sure the DP’s “signature” is consistent through to the final grade. On Darkest Hour they tested, built and agreed on a look for the film’s rushes. Colorist Mel Kangleon worked with Delbonnel on a daily basis to make sure all the exposures were correct from a technical viewpoint, and aesthetically to make sure the grade and look were not being lost.

“The grades that we were doing were what was intended by Bruno, and we made sure the actual imagery on the screen was how he wanted it to be,” explains Doyle. “We were making sure that the signature was being carried through.”

Darkest Hour and HDR
On Darkest Hour, Doyle built the DCI grade for the Xenon projector, 14 foot-lambert, as the master color corrected deliverable. “Then we took what was pretty much the LAD gray-card value of that DCI grade. So a very classic 18% gray that was translated across to the 48-, the 108-, the 1,000- and the 4,000-nit grade. We essentially parked the LAD gray (18% gray) at what we just felt was an appropriate brightness. There is not necessarily a lot of color science to that, other than saying, ‘this feels about right.’ That’s (also) very dependent on the ambient light levels.”
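The two theatrical numbers Doyle cites are the same target expressed in different units: one foot-lambert is about 3.426 nits (cd/m²), so the 14 foot-lambert Xenon grade is the 48-nit level in his list of deliverables. A quick conversion:

```python
FL_TO_NITS = 3.42626  # 1 foot-lambert in candelas per square meter (nits)

def fl_to_nits(foot_lamberts):
    """Convert luminance from foot-lamberts to nits (cd/m^2)."""
    return foot_lamberts * FL_TO_NITS

print(round(fl_to_nits(14)))  # 48 -- the standard theatrical reference white
```

The 108-, 1,000- and 4,000-nit grades he mentions are then progressively brighter targets for Dolby Cinema and HDR home deliverables, each with the 18% gray parked where it feels right for that viewing environment.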

The DCI projector, notes Doyle, doesn’t really have “completely solid blacks; they’re just a little gray.” Doyle wished that the Xenon could’ve been brighter, but that is what the theatrical distribution chain is at the moment, he says.

When they did the HDR (High Dynamic Range) version, which Doyle calls a “new language” of color correction, they took the opportunity to add extra contrast and dial down the blacks to true black. “I was able to get some more detail in the lower shadows, but then have absolutely solid blacks — likewise on the top end. We opened up the highlights to be even more visceral in their brightness. Joe Wright says he fell in love with the Dolby Vision.”

If you’re sitting in a Dolby Vision Cinema, says Doyle, you’re sitting in a black box. “Therefore, you don’t necessarily need to have the image as bright as a Rec 709 grade or LAD gray, which is typically for a lounge room where there are some lights on. There is a definite ratio between the presumed ambient light level of a room and where they park that LAD,” explains Doyle.

Knowing where they want the overall brightness of the film to be, they translate the tone curve to maintain exactly what they did in the DCI grade. Then perceptually it appears the same in the various mediums. Next they custom enhance each grade for the different display formats. “I don’t really necessarily call it a trim pass; it’s really adding a flare pass,” elaborates Doyle. “A DCI projector has quite a lot of flare, which means it’s quite organic and reactive to the image. If you project something on a laser, it doesn’t necessarily have anywhere near that amount of flare, and that can be a bit of a shock. Suddenly, your highlights are looking incredibly harsh. We went through and really just made sure that the smoothness of the image was maintained and emulated on the other various mediums.”

Doyle also notes that Darkest Hour benefited from his work with Technicolor color scientists Josh Pines and Chris Kutchka on new color modeling tools, and from being able “to build 3D LUTs that you can edit and that are cleaner. That can work in a little more containable way.”

Advice and Awards
In the bright new world of color correction, what questions would Doyle suggest asking directors? “What is their intent emotionally with the film? How do they want to reinforce that with color? Is it to be approached in a very literal way, or should we think about coming up with some kind of color arc that might be maybe counterintuitive? This will give you a feel for the world that the director has been thinking of, and then see if there’s a space to come at it in a slightly unexpected way.”

I asked Doyle if we have reached the point where awards committees should start thinking about an Academy Award category for color grading.

Knowing what an intensely collaborative process color grading is, Doyle responded that it would be quite challenging. “The pragmatist in me says it could be tricky to break it down in terms of the responsibilities. It depends on the relationship between the colorist, the DP and the director. It really does change with the personalities and the crew. That relationship could make the breakdown a little tricky just to work out whose idea it was to actually make it, for example, blue.”

Because this interview was conducted in December, I asked Doyle what he would ask Santa to bring him for Christmas. His response? “I really think the new frontier is gamut mapping and gamut editing — that world of fitting one color space into another. I think being able to edit those color spaces with various color models that are visually more appropriate is pretty much the new frontier.”


Daniel Restuccio is a producer and teacher based in Southern California.


AICP and AICE to merge January 1

The AICP and AICE are celebrating the New Year in a very special way — they are merging into one organization. These two associations represent companies that produce and finish the majority of advertising and marketing content in the moving image. Post-merger, AICP and AICE will function as a single association under the AICP brand. They will promote and advocate for independent production and post companies when it comes to producing brand communications for advertising agencies, advertisers and media companies.

The merger comes after months of careful deliberations on the part of each association’s respective boards and final votes of approval by their memberships. Under the newly merged association’s structure, executive director of AICE Rachelle Madden will assume the title of VP, post production and digital production affairs of AICP. She will report to president/CEO of AICP Matt Miller. Madden is now tasked with taking the lead on AICP’s post production offerings, including position papers, best practices, roundtables, town halls and other educational programs. She will also lead a post production council, which is being formed to advise the AICP National Board on post matters.

Former AICE members will be eligible to join the General Member Production companies of AICP, with access to all benefits starting in 2018. These include: Participation in the Producers’ Health Benefits Plan (PHBP); the AICP Legal Initiative (which provides legal advice on contracts with agencies and advertisers); and access to position papers, guidelines and other tools as they relate to business affairs and employment issues. Other member benefits include access to attend meetings, roundtables, town halls and seminars, as well as receiving the AICP newsletter, member discounts on services and a listing in the AICP membership directory on the AICP website.

All AICP offerings — including its AICP Week Base Camp for thought leadership — will reflect the expanded membership to include topics and issues pertaining to post production. Previously created AICE documents, position papers and forms will now live on aicp.com.

The AICP was founded in 1972 to protect the interests of independent commercial producers, crafting guidelines and best practice in an effort to help its members run their businesses more effectively. Through its AICP Awards, the organization celebrates creativity and craft in marketing communications.

AICE was founded in 1998 when three independent groups representing editing companies in Chicago, Los Angeles and New York formed a national association to discuss issues and undertake initiatives affecting post production on a broader scale. In addition to editing, it represents the full range of post production disciplines, including color correction, visual effects, audio mixing, and music and sound design.

From AICP’s perspective, says Miller, merging the two organizations has benefits for members of both groups. “As we grow more closely allied, it makes more sense than ever for the organizations to have a unified voice in the industry,” he notes. He points out that there are numerous companies that are members of both organizations, reflecting the blurring of the lines between production and post that’s been occurring as media platforms, technologies and client needs have changed.

For Madden, AICE’s members will be joining an organization that provides them with a firm footing in terms of resources, programs, benefits and initiatives. “There are many reasons why we moved forward on this merger, and most of them involve amplifying the voice of the post production industry by combining our interests and advocacy with those of AICP members. We now become part of a much larger group, which gives us a strength in numbers we didn’t have before while adding critical post production perspectives to key discussions about business practices and industry trends.”

Main Image: Matt Miller and Rachelle Madden


Making 6 Below for Barco Escape

By Mike McCarthy

There is a new movie coming out this week that is fairly unique. Telling the true story of Eric LeMarque surviving eight days lost in a blizzard, 6 Below: Miracle on the Mountain is the first film shot and edited in its entirety for the new Barco Escape theatrical format. If you don’t know what Barco Escape is, you are about to find out.

This article is meant to answer just about every question you might have about the format and how we made the film, on which I was post supervisor, production engineer and finishing editor.

What is Barco Escape?
Barco Escape is a wraparound visual experience — it consists of three projection screens filling the width of the viewer’s vision with a total aspect ratio of 7.16:1. The exact field of view will vary depending on where you are sitting in the auditorium, but is usually 120-180 degrees. Similar to IMAX, it is not about filling the entire screen with your main object, but leaving that in front of the audience and letting the rest of the image surround them and fill their peripheral vision in a more immersive experience. Three separate 2K scope theatrical images play at once, resulting in 6144×858 pixels of imagery to fill the room.
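The canvas geometry above is simple arithmetic: three 2K scope images side by side. This short sketch is purely illustrative, re-deriving the numbers the article states:

```python
# Illustrative arithmetic only: deriving the Barco Escape canvas size
# from three side-by-side 2K scope theatrical images, per the article.

SCOPE_W, SCOPE_H = 2048, 858   # one 2K scope theatrical image
SCREENS = 3                    # left, center and right screens

canvas_w = SCOPE_W * SCREENS   # total width in pixels
aspect = canvas_w / SCOPE_H    # total aspect ratio

print(f"canvas: {canvas_w}x{SCOPE_H}, aspect {aspect:.2f}:1")
# canvas: 6144x858, aspect 7.16:1
```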

Is this the first Barco Escape movie?
Technically, four other films have screened in Barco Escape theaters, the most popular one being last year’s release of Star Trek Beyond. But none of these films used the entire canvas offered by Escape throughout the movie. They had up to 20 minutes of content on the side screens, but the rest of the film was limited to the center screen that viewers are used to. Every shot in 6 Below was framed with the surround format in mind, and every pixel of the incredibly wide canvas is filled with imagery.

How are movies created for viewing in Escape?
There are two approaches that can be used to fill the screen with content. One is to place different shots on each screen in the process of telling the story. The other is to shoot a wide enough field of view and high enough resolution to stretch a single image across the screens. For 6 Below, director Scott Waugh wanted to shoot everything at 6K, with the intention of filling all the screens with main image. “I wanted to immerse the viewer in Eric’s predicament, alone on the mountain.”

Cinematographer Michael Svitak shot with the Red Epic Dragon. He says, “After testing both spherical and anamorphic lens options, I chose to shoot Panavision Primo 70 prime lenses because of their pristine quality across the entire imaging frame.” He recorded in 6K-WS (2.37:1 aspect ratio at 6144×2592), framing with both 7:1 Barco Escape and a 2.76:1 4K extraction in mind. Red does have an 8:1 option and a 4:1 option that could work if Escape were your only deliverable. But since there are very few Escape theaters at the moment, you would literally be painting yourself into a corner. Having more vertical resolution available in the source footage opens up all sorts of workflow possibilities.
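The framing headroom Svitak describes can be seen in the aspect arithmetic. This sketch computes the crop height each deliverable implies at the full 6K width; the actual extraction geometry used in post isn’t specified in the article, so treat the numbers as illustrative:

```python
# Illustrative sketch: crop heights implied by framing the 6144x2592
# Red 6K-WS frame for the deliverables mentioned above. The real
# extraction geometry used on the film may differ.

FULL_W, FULL_H = 6144, 2592            # Red 6K-WS recording

def crop_height(aspect, width=FULL_W):
    """Height of a crop with the given aspect ratio at the given width."""
    return round(width / aspect)

print(round(FULL_W / FULL_H, 2))       # native aspect: 2.37
print(crop_height(6144 / 858))         # Escape canvas (~7.16:1): 858
print(crop_height(2.76))               # 2.76:1 extraction: 2226
```

With only 858 of 2592 lines used by the Escape crop, the remaining vertical resolution is what keeps the other deliverables open.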

This still left a few challenges in post: to adjust the framing for the most comfortable viewing and to create alternate framing options for other deliverables that couldn’t use the extreme 7:1 aspect ratio. Other projects have usually treated the three screens separately throughout the conform process, but we treated the entire canvas as a single unit until the very last step, breaking out three 2K streams for the DCP encode.

What extra challenges did Barco Escape delivery pose for 6 Below’s post workflow?
Vashi Nedomansky edited the original 6K R3D files in Adobe Premiere Pro, without making proxies, on some maxed-out Dell workstations. We did the initial edit with curved ultra-wide monitors and 4K TVs. “Once Mike McCarthy optimized the Dell systems, I was free to edit the source 6K Red RAW files and not worry about transcodes or proxies,” he explains. “With such a quick turnaround every day, and so much footage coming in, it was critical that I could jump on the footage, cut my scenes, see if they were playing well and report back to the director that same day if we needed additional shots. This would not have been possible time-wise if we were transcoding and waiting for footage to cut. I kept pushing the hardware and software, but it never broke or let me down. My first cut was 2 hours and 49 minutes long, and we played it back on one Premiere Pro timeline in realtime. It was crazy!”

All of the visual effects were done at the full shooting resolution of 6144×2592, as was the color grade. Once Vashi had the basic cut in place, there was no real online conform, just some cleanup work to do before sending it to color as an 8TB stack of 6K frames. At that point, we started examining it from the three-screen perspective with three TVs to preview it in realtime, courtesy of the Mosaic functionality built into Nvidia’s Quadro GPU cards. Shots were realigned to avoid having important imagery in the seams, and some areas were stretched to compensate for the angle of the side screens from the audience’s perspective.

DP Michael Svitak and director Scott Waugh

Once we had the final color grade completed (via Mike Sowa at Technicolor using Autodesk Lustre), we spent a day in an Escape theater analyzing the reflections between the screens and their effect on the contrast. We made a lot of adjustments to keep the luminance of the side screens from washing out the darks on the center screen, which you can’t simulate on TVs in the edit bay. “It was great to be able to make the final adjustments to the film in realtime in that environment. We could see the results immediately on all three screens and how they impacted the room,” says Waugh.

Once we added the 7.1 mix, we were ready to export assets for our delivery in many different formats and aspect ratios. Making the three streams for Escape playback was as simple as using the crop tool in Adobe Media Encoder to trim the sides in 2K increments.
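That crop-in-2K-increments step is just offset arithmetic over the single 6144-pixel canvas. This Python sketch (not the actual Media Encoder setup) shows the crop window each of the three screens would get:

```python
# Illustrative sketch of splitting the 6144x858 Escape canvas into
# three 2K scope streams by cropping in 2K increments, as described
# above. The production workflow used Adobe Media Encoder's crop tool.

CANVAS_W, CANVAS_H = 6144, 858   # full Escape canvas (three 2K screens)
SCREEN_W = 2048                  # width of one 2K scope screen

def escape_crops(canvas_w=CANVAS_W, screen_w=SCREEN_W):
    """Return (x_offset, width) crop windows for each screen, left to right."""
    assert canvas_w % screen_w == 0, "canvas must be a whole number of screens"
    return [(x, screen_w) for x in range(0, canvas_w, screen_w)]

for name, (x, w) in zip(("left", "center", "right"), escape_crops()):
    print(f"{name}: crop {w}x{CANVAS_H} at x={x}")
```

Treating the canvas as one unit until this final split is what let the conform and grade stay seam-aware, as described earlier in the piece.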

How can you see movies in the Barco Escape format?
Barco maintains a list of theaters that have Escape screens installed, which can be found at ready2escape.com. But for readers in the LA area, the only opportunity to see a film in Barco Escape in the foreseeable future is to attend one of the Thursday night screenings of 6 Below at the Regal LA Live Stadium or the Cinemark XD at Howard Hughes Center. There are other locations available to see the film in standard theatrical format, but as a new technology, Barco Escape is only available in a limited number of locations. Hopefully, we will see more Escape films and locations to watch them in the future.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.


Sim Group purchases Vancouver’s The Crossing Studios

Sim Group, a family of companies offering production and post services across TV, feature film and commercials, has strengthened its place in the industry with the acquisition of Vancouver-based The Crossing Studios. This full-service studio and production facility adds approximately 400,000 square feet to Sim’s footprint.

With proximity to downtown Vancouver, the city’s international airport and all local suppliers, The Crossing Studios has been home to many television series, specials and feature films. In addition to providing full-service studio rentals, mill/paint/lockup space and production office space, The Crossing Studios also offers post production services, including Avid suite rentals, dailies, color correction and high-speed connectivity.

The Crossing Studios was founded by Dian Cross-Massey in 2015 and is the second-largest studio facility in the Lower Mainland, comprising nine buildings, all located just 30 minutes from downtown Vancouver. Cross-Massey has over 25 years of experience in the industry, having worked as a writer, executive producer, visual effects supervisor, director, producer and production manager. Thanks to this experience, Cross-Massey prides herself on knowing first-hand how to anticipate client needs and contribute to the success of her clients’ projects.

“When I was a producer, I worked with Sim regularly and always felt they had the same approach to fair, honest work as I did, so when the opportunity presented itself to combine resources and support our shared clients with more offerings, the decision to join together felt right,” says Cross-Massey.

The Crossing Studios clients include Viacom, Fox, Nickelodeon, Lifetime, Sony Pictures, NBCUniversal and ABC.

“The decision to add The Crossing Studios to the Sim family was a natural one,” says James Haggarty, CEO, Sim Group. “Through our end-to-end services, we pride ourselves on delivering streamlined solutions that simplify the customer experience. Dian and her team are extremely well respected within the entertainment industry, and together, we’ll not only be able to support the incredible growth in the Vancouver market, but clients will have the option to package everything they need from pre-production through post for better service and efficiencies.”