Tag Archives: Arri

Colorist Christopher M. Ray talks workflow for Alexa 65-shot Alpha

By Randi Altman

Christopher M. Ray is a veteran colorist whose varied resume spans many television and feature projects, including Tomorrowland, Warcraft, The Great Wall, The Crossing, Orange Is the New Black, Quantico, Code Black and Alpha. These projects have taken Ray all over the world, including remote places throughout North America, Europe, Asia and Africa.

We recently spoke with Ray, who is on staff at Burbank’s Picture Shop, to learn more about his workflow on the feature film Alpha, which focuses on a young man trying to survive alone in the wilderness after he’s left for dead during his first hunt with his Cro-Magnon tribe.

Ray was dailies colorist on the project, working with supervising DI colorist Maxine Gervais. Gervais of Technicolor won an HPA Award for her work on Alpha in the Outstanding Color Grading — Feature Film category.

Let’s find out more….

Chris Ray and Maxine Gervais at the HPA Awards.

How early did you get involved in Alpha?
I was approached about working on Alpha right before the start of principal photography. From the beginning I knew that it was going to be a groundbreaking workflow. I was told that we would be working with the ARRI Alexa 65 camera, mainly working in an on-set color grading trailer and we would be using FilmLight’s Daylight software.

Once I was on board, our main focus was to design a comprehensive workflow that could accommodate on-set grading and Daylight software while adapting to the ever-changing challenges that the industry brings. Being involved from the start was actually a huge perk for me. It gave us the time we needed to design and really fine-tune the extensive workflow.

Can you talk about working with the final colorist Maxine Gervais and how everyone communicated?
It was a pleasure working with Maxine. She’s really dialed in to the demands of our industry. She was able to fly to Vancouver for a few days while we were shooting the hair/makeup tests, which is how we were able to form in-person communication. We were able to sit down and discuss creative approaches to the feature right away, which I appreciated as I’m the type of person that likes to dive right in.

At the film’s conception, we set in motion a plan to incorporate a Baselight Linked Grade (BLG) color workflow from FilmLight. This would allow my color grades in Daylight to transition smoothly into Maxine’s Baselight software. We knew from the get-go that there would be several complicated “day for night” scenes that Maxine and I would want to bring to fruition right away. Using the BLG workflow, I was able to send her single “Arriraw” frames that gave that “day for night” look we were searching for. She was able to then send them back to me via a BLG file. Even in remote locations, it was easy for me to access the BLG grade files via the Internet.

[Maxine Gervais weighs in on working with Ray: “Christopher was great to work with. As the workflow on the feature was created from scratch, he implemented great ideas. He was very keen on the whole project and was able to adapt to the ever-changing challenges of the show. It is always important to have on-set color dialed in correctly, as it can be problematic if it is not accurately established in production.”]

How did you work with the DP? What direction were you given?
Being on set, it was very easy for DP Martin Gschlacht to come over to the trailer and view the current grade I was working on. Like Maxine, Martin already had a very clear vision for the project, which made it easy to work with him. Oftentimes, he would call me over on set and explain his intent for the scene. We would brainstorm ways I could assist him in making his vision come to life. Audiences rarely see raw camera files, or realize how much color can influence the story being told.

It also helps that Martin is a master of aesthetic. The content being captured was extremely striking; he has this natural intuition about what look is needed for each environment that he shoots. We shot in lush rain forests in British Columbia and arid badlands in Alberta, which each inspired very different aesthetics.

Whenever I had a bit of down time, I would walk over to set and just watch them shoot, like a fly on the wall quietly observing and seeing how the story was unfolding. As a colorist, it’s so special to be able to observe the locations on set. Seeing the natural desaturated hues of dead grass in the badlands or the vivid lush greens in the rain forest with your own eyes is an amazing opportunity many of us don’t get.

You were on set throughout? Is that common for you?
We were on set throughout the entire project as a lot of our filming locations were in remote areas of British Columbia and Alberta, Canada. One of our most demanding shooting locations was Dinosaur Provincial Park in Brooks, Alberta. The park is a UNESCO World Heritage site that no one had been allowed to film at prior to this project. I needed easy access to the site in order to communicate with the film’s executive team and production crew. They were able to screen footage in their trailer and we had this seamless back-and-forth workflow. This also allowed them to view high-quality files in a comfortable and controlled environment. Also, the ability to flag any potential issues and address them immediately on set was incredibly valuable on a film of this size and complexity.

Alpha was actually the first time I worked in an on-set grading trailer. In the past I usually worked out of the production office. I have heard of other films working with an on-set trailer, but I don’t think I would say that it is overly common. Sometimes, I wish I could be stationed on set more often.

The film was shot mostly with the Alexa 65, but included footage from other formats. Can you talk about that workflow?
The film was mostly shot on the Alexa 65, but several other formats were used as well. For most of the shoot there was a second unit shooting with Alexa XT and Red Weapon cameras, with a splinter unit shooting B-roll footage on Canon 1D, 5D and Sony A7S. In addition to these, there were units in Iceland and South Africa shooting VFX plates on a Red Dragon.

By the end of the shoot, there were several different camera formats and over 10 different resolutions. We used the 6.5K Alexa 65 resolution as the master resolution and mapped all the others into it.
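To give a concrete sense of what mapping other formats into a 6.5K master involves, here is a minimal Python sketch of a center-fit conform calculation. The 6560 x 3100 master dimensions and the scale-to-fit policy are assumptions for illustration only, not the actual conform settings used on Alpha.

```python
# Minimal sketch of fitting an arbitrary source resolution into a 6.5K master
# canvas. The master dimensions and center-fit policy are assumptions for
# illustration, not the production's actual conform settings.

MASTER_W, MASTER_H = 6560, 3100  # assumed 6.5K master canvas

def fit_into_master(src_w, src_h):
    """Return the scale factor, scaled size and centered offsets needed to
    place a source frame inside the master canvas without cropping."""
    scale = min(MASTER_W / src_w, MASTER_H / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    off_x, off_y = (MASTER_W - out_w) // 2, (MASTER_H - out_h) // 2
    return scale, (out_w, out_h), (off_x, off_y)

# Example: a UHD (3840 x 2160) B-roll frame mapped into the 6.5K master
print(fit_into_master(3840, 2160))
```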

The Alexa 65 camera cards were backed up to 8TB “sled” transfer drives using a Codex Vault S system. The 8TB transfer drives were then sent to the trailer where I had two Codex Vault XL systems — one was used for ingesting all of the footage into my SAN and the second was used to prepare footage for LTO archival. All of the other unit footage was sent to the trailer via shuttle drives or Internet transfer.
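For readers unfamiliar with verified ingest, here is a minimal Python sketch of the general idea: copy a camera file from a transfer drive to the SAN and confirm the source and destination hashes match before the source is cleared. MD5 and the example paths are placeholders for illustration; the actual Codex Vault tooling and verification method aren’t detailed here.

```python
import hashlib
import shutil
from pathlib import Path

# Illustrative sketch of a checksum-verified copy from a transfer drive to a SAN.
# MD5 and the example paths are placeholders; the actual Codex Vault tooling and
# verification method are not described in the article.

def file_hash(path, chunk_size=8 * 1024 * 1024):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verified_copy(src, dst):
    """Copy src to dst and return True only if both files hash identically."""
    dst = Path(dst)
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dst)
    return file_hash(src) == file_hash(dst)

# Hypothetical usage:
# ok = verified_copy("/transfer_drive/A001C001.ari", "/san/alpha/A001C001.ari")
```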

After the footage was successfully ingested to the SAN with a checksum verification, it was ready to be colored, processed, and then archived. We had eight LTO6 decks running 24/7, as the main focus was to archive the exorbitant amounts of high-res camera footage that we were receiving. Just the Alexa 65 alone was about 2.8TB per hour for each camera.
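To put that 2.8TB-per-hour figure in perspective, here is a back-of-the-envelope Python calculation of daily data volume and LTO-6 tape counts. The camera count, shooting hours and the 2.5TB native LTO-6 capacity are illustrative assumptions, not production figures.

```python
import math

# Back-of-the-envelope estimate built around the quoted 2.8TB/hour Alexa 65 rate.
# Camera count, shooting hours and LTO-6 capacity are illustrative assumptions.

TB_PER_HOUR_PER_CAMERA = 2.8   # quoted Alexa 65 figure
LTO6_NATIVE_TB = 2.5           # assumed native (uncompressed) LTO-6 capacity

def daily_archive_estimate(cameras, shooting_hours):
    """Return (terabytes generated, LTO-6 tapes needed) for one shooting day."""
    data_tb = cameras * shooting_hours * TB_PER_HOUR_PER_CAMERA
    tapes = math.ceil(data_tb / LTO6_NATIVE_TB)
    return data_tb, tapes

# Example: two Alexa 65 bodies rolling for four hours in a shooting day
print(daily_archive_estimate(2, 4))  # ~22TB of footage, nine LTO-6 tapes
```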

Had you worked with Alexa 65 footage previously?
Many times. A few years ago, I was in China for seven months working on The Great Wall, which was one of the first films to shoot with the Alexa 65. I had a month of in-depth pre-production with the camera, testing, shooting and honing the camera’s technology. Working very closely with Arri and Codex technicians during this time, I was able to design the most efficient workflow possible. Even as the shoot progressed, I continued to communicate closely with both companies. As new challenges arose, we developed and implemented solutions that kept production running smoothly.

The workflow we designed for The Great Wall was very close to the workflow we ended up using on Alpha, so it was a great advantage that I had previous experience working in-depth with the camera.

What were some of the challenges you faced on this film?
To be honest, I love a challenge. As colorists, we are thrown into tricky situations every day. I am thankful for these challenges; they improve my craft and make me more efficient at problem solving. One of the biggest challenges I faced on this particular project was working with so many different units: the sheer number of them shooting, the volume of footage and the dozens of format types that had to be handled.

We had to be accessible around the clock, most of us working 24 hours a day. Needless to say, I made great friends with the transportation driving team and the generator operators. I think they would agree that my grading trailer was one of their largest challenges on the film since I constantly needed to be on set and my work was being imported/exported in such high resolutions.

In the end, as I was watching this absolutely gorgeous film in the theater it made sense. Working those crazy hours was absolutely worth it — I am thankful to have worked with such a cohesive team and the experience is one I will never forget.

Helicopter Film Services intros Titan ultra-heavy lifting drone

Helicopter Film Services (HFS) has launched an ultra-heavy lift drone, known as the Titan, that pairs a large, capable airframe with the ARRI SRH-3 stabilized head. The SRH-3 enables easy integration of existing ARRI lens motors and other functionality directly with the ARRI Alexa 65 and LF cameras.

HFS developed the large drone in response to requests from some legendary DPs and VFX supervisors to enable filmmakers to fly large-format digital or 35mm film packages.

“We have trialed other heavy-lift machines, but all of them have been marginal in terms of performance when carrying the larger cameras and lenses that we’re asked to fly,” says Alan Perrin, chief UAV pilot at HFS. “What we needed, and what we’ve designed, is a system that will capably and safely operate with the large-format cameras and lenses that top productions demand.”

The Titan combines triple redundancy on flight controls and double redundancy on power supply and ballistic recovery into an aircraft that can deploy and operate easily on any production involving a substantial flight duration. The drone can easily fly a 35mm film package such as an ARRI 435 with a 400-foot magazine.

Here are some specs:
• Optimized for large-format digital and 35mm film cameras
• Max payload up to 30 kilograms
• Max take-off mass — 80 kilograms
• Redundant flight control systems
• Ballistic recovery system (parachute)
• Class-leading stability
• Flight duration up to 15 minutes (subject to payload weight and configuration)
• HD video downlink
• Gimbal: ARRI SRH-3 or Movi XL

Final payload-proving flights are taking place now, and the company is in the process of planning first use on major productions. HFS is also exploring the ability to fly a new 65mm film camera on the Titan.

New CFast 2.0 card for ARRI Alexa Mini and Amira cameras

ARRI has introduced the ARRI Edition AV Pro AR 256 CFast 2.0 card by Angelbird, which has been designed and certified for use in the ARRI Alexa Mini and Amira camera systems and can be used for ProRes and MXF/ARRIRAW recording. (Support for new CFast 2.0 cards is currently not planned for ALEXA XT, SXT(W) and LF cameras.)

ARRI has worked closely with Angelbird Technologies, based in Vorarlberg, Austria. Angelbird is no stranger to film production, and some of their gear can be found at ARRI Rental European locations.

For the ARRI Edition CFast card, the Angelbird team developed an ARRI-specific card that uses a combination of thermally conductive material and so-called underfill to provide superior heat dissipation from the chips and to secure the electronic components against mechanical damage.

The result, according to ARRI, is a rock-solid 256 GB CFast 2.0 card with stable recording performance all the way across the storage space. The ARRI Edition AV PRO AR 256 memory card is available from ARRI and other sales channels offering ARRI products.

Academy honors 18 Scientific and Technical achievements

The Academy of Motion Picture Arts and Sciences has announced that 18 scientific and technical achievements represented by 34 individual award recipients, as well as five organizations, will be honored at its annual Scientific and Technical Awards Presentation on February 11.

“This year we are particularly pleased to be able to honor not only a wide range of new technologies, but also the pioneering digital cinema cameras that helped facilitate the widespread conversion to electronic image capture for motion picture production,” says Ray Feeney, Academy Award recipient and chair of the Scientific and Technical Awards Committee. “With their outstanding, innovative work, these technologists, engineers and inventors have significantly expanded filmmakers’ creative choices for moving image storytelling.” 

Unlike other Academy Awards to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during 2016. Rather, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.

The Academy Awards for scientific and technical achievements are: 

 Technical Achievement Awards (Academy Certificates)

Thomson Grass Valley for the design and engineering of the pioneering Viper FilmStream digital camera system. The Viper camera enables frame-based logarithmic encoding, which provides uncompressed camera output suitable for importing into existing digital intermediate workflows.

Larry Gritz for the design, implementation and dissemination of Open Shading Language (OSL). OSL is a highly-optimized runtime architecture and language for programmable shading and texturing that has become a de facto industry standard. It enables artists at all levels of technical proficiency to create physically plausible materials for efficient production rendering.

Carl Ludwig, Eugene Troubetzkoy and Maurice van Swaaij for the pioneering development of the CGI Studio renderer at Blue Sky Studios. CGI Studio’s groundbreaking ray-tracing and adaptive sampling techniques, coupled with streamlined artist controls, demonstrated the feasibility of ray-traced rendering for feature film production.

Brian Whited for the design and development of the Meander drawing system at Walt Disney Animation Studios. Meander’s innovative curve-rendering method faithfully captures the artist’s intent, resulting in a significant improvement in creative communication throughout the production pipeline.

Mark Rappaport for the concept, design and development, Scott Oshita for the motion analysis and CAD design, Jeff Cruts for the development of the faux-hair finish techniques, and Todd Minobe for the character articulation and drive-train mechanisms of the Creature Effects Animatronic Horse Puppet. The Animatronic Horse Puppet provides increased actor safety, close integration with live action, and improved realism for filmmakers.

Glenn Sanders and Howard Stark for the design and engineering of the Zaxcom Digital Wireless Microphone System. The Zaxcom system has advanced the state of wireless microphone technology by creating a fully digital modulation system with a rich feature set, which includes local recording capability within the belt pack and a wireless control scheme providing realtime transmitter control and timecode distribution.

David Thomas, Lawrence E. Fisher and David Bundy for the design, development and engineering of the Lectrosonics Digital Hybrid Wireless Microphone System. The Lectrosonics system has advanced the state of wireless microphone technology by developing a method to digitally transmit full-range audio over a conventional analog FM radio link, reducing transmitter size, and increasing power efficiency.

Parag Havaldar for the development of expression-based facial performance-capture technology at Sony Pictures Imageworks. This pioneering system enables large-scale use of animation rig-based facial performance-capture for motion pictures, combining solutions for tracking, stabilization, solving and animator-controllable curve editing.

Nicholas Apostoloff and Geoff Wedig for the design and development of animation rig-based facial performance-capture systems at ImageMovers Digital and Digital Domain. These systems evolved through independent, then combined, efforts at two different studios, resulting in an artist-controllable, editable, scalable solution for the high-fidelity transfer of facial performances to convincing digital characters.

Kiran Bhat, Michael Koperwas, Brian Cantwell and Paige Warner for the design and development of the ILM facial performance-capture solving system. This system enables high-fidelity facial performance transfer from actors to digital characters in large-scale productions while retaining full artistic control, and integrates stable, rig-based solving and the resolution of secondary detail in a controllable pipeline.

Scientific and Engineering Awards (Academy Plaques)

Arri for the pioneering design and engineering of the Super 35 format Alexa digital camera system. With an intuitive design and appealing image reproduction achieved through close collaboration with filmmakers, Arri’s Alexa cameras were among the first digital cameras widely adopted by cinematographers.

Red Digital Cinema for the pioneering design and evolution of the Red Epic digital cinema cameras with upgradeable full-frame image sensors. Red’s design and innovative manufacturing process have helped facilitate the wide adoption of digital image capture in the motion picture industry.

Sony for the development of the F65 CineAlta camera with its pioneering high-resolution imaging sensor, excellent dynamic range and full 4K output. Sony’s photosite orientation and true RAW recording deliver exceptional image quality.             

Panavision and Sony for the conception and development of the Genesis digital motion picture camera. Using a familiar form factor and accessories, the design features of the Genesis allowed it to become one of the first digital cameras to be adopted by cinematographers.

Marcos Fajardo for the creative vision and original implementation of the Arnold Renderer, and to Chris Kulla, Alan King, Thiago Ize and Clifford Stein for their highly-optimized geometry engine and novel ray-tracing algorithms which unify the rendering of curves, surfaces, volumetrics and subsurface scattering as developed at Sony Pictures Imageworks and Solid Angle SL. Arnold’s scalable and memory-efficient single-pass architecture for path tracing, its authors’ publication of the underlying techniques, and its broad industry acceptance were instrumental in leading a widespread adoption of fully raytraced rendering for motion pictures.

Vladimir Koylazov for the original concept, design and implementation of V-Ray from Chaos Group. V-Ray’s efficient production-ready approach to raytracing and global illumination, its support for a wide variety of workflows, and its broad industry acceptance were instrumental in the widespread adoption of fully ray-traced rendering for motion pictures.

Luca Fascione, J.P. Lewis and Iain Matthews for the design, engineering and development of the FACETS facial performance capture and solving system at Weta Digital. FACETS was one of the first reliable systems to demonstrate accurate facial tracking from an actor-mounted camera, combined with rig-based solving, in large-scale productions. This system enables animators to bring the nuance of the original live performances to a new level of fidelity for animated characters.

Steven Rosenbluth, Joshua Barratt, Robert Nolty and Archie Te for the engineering and development of the Concept Overdrive motion control system. This user-friendly hardware and software system creates and controls complex interactions of real and virtual motion in hard realtime, while safely adapting to the needs of on-set filmmakers. 

The A-List: Director Tom Tykwer on ‘A Hologram for the King’

By Iain Blair

Tom Tykwer, the multi-faceted German director/writer/composer/producer, first burst onto the international scene with his 1998 thriller Run Lola Run. Since then he’s directed such diverse films as Heaven, Perfume: The Story of a Murderer, The Princess and the Warrior, Cloud Atlas (with the Wachowskis) and The International. His latest is A Hologram for the King from Roadside Attractions.

Based on Dave Eggers’ novel, A Hologram for the King is set in recession-ravaged 2010. It stars Tom Hanks as Alan Clay, an American businessman who, broke, depressed and freshly divorced, arrives in Jeddah, Saudi Arabia, to close what he hopes will be the deal of a lifetime: selling a state-of-the-art holographic teleconferencing system to the Saudi government.

But, of course, nothing goes as planned. Adrift and alone in an unfamiliar land, Alan befriends a taxi driver who chauffeurs him through the desert to the “King’s Metropolis of Economy and Trade,” a surreal ghost town of vacant skyscrapers and half-completed construction projects. Baffled by the bureaucratic reception he gets at the so-called “Welcome Center,” Alan struggles to figure out why his small IT support team is being forced to spend its days in a sweltering tent as it preps for the big presentation. Worse, because of the Saudi way of doing business, he’s unclear if the king will ever show up for the long-scheduled meeting.

Back in Jeddah, the stressed-out salesman winds up in the hospital, where he is treated by a beautiful and empathetic Muslim doctor (Sarita Choudhury). As Alan gets to know his new Saudi friends better, cultural barriers break down and he begins to contemplate the possibility of a fresh start in a land where tradition and modernity meet in perplexing ways.

I recently caught up with Tykwer to talk about his process on the film.

What do you look for in a project and what was the appeal of making this?
I always look for something different — something that fits my sensibilities. I never want to repeat myself, and that can happen so easily if you’re not careful. I think this was a surprise for me too. I love Dave Eggers’ writing.  I actually tried to turn an earlier book of his into a TV show but it didn’t happen.  When I read this book I immediately felt I knew how to shoot it, so we met and I told him my ideas. I felt that as bleak and dark as the book is, there’s a lightness and sense of hope and a lot of comedy in the attempts of the characters to bridge two very different cultures. And despite all the cultural and political and religious barriers, there is communication. We can reach out to others.

How did you deal with all the restrictions of shooting in Saudi Arabia?
We shot some stuff there, but we couldn’t take the actors there; we ended up shooting most of it in Morocco. The biggest challenge for me was recreating the abstract “King’s Metropolis of Economy and Trade,” this sort of ghost town in the middle of nowhere in the Saudi Arabian desert. I went there and to Jeddah and took photos of all the locations, and then we recreated some of it in the Western Sahara, the southernmost part of Morocco, where there’s absolutely nothing — no film infrastructure at all. Plenty of films have been shot in Morocco near the cities in the north, but not down there, so we had to ship in everything — the crew, all the equipment and so on. We didn’t have a huge budget, so it was very challenging. We all stayed together in a little hotel, which had power just two hours a day. It was like camping.

Do you like the post process?
I love it, and if you feel confident about the material it’s heaven, since there’s none of the pressure of the shoot, the money worries and so on. But I do feel post isn’t as relaxed as it used to be. In the old days you could spend a year on post and no one would complain, but now everyone wants you to hurry up. I look at post and the editing as very similar to writing. You constantly reshape and re-phrase as you do post.

Where did you post?
I always do post in Berlin, and we also shot some of the interiors on stages there.

The film was edited by Alex Berner, whose credits include Jupiter Ascending, and who worked with you on Cloud Atlas. Tell us about that relationship and how it worked.
He wasn’t on the set, although he did visit us a couple of times. We sent him dailies and he would start cutting and assembling and send me stuff to look at. But I’m so busy on a shoot that I barely have time to look at anything, so I rely on him. I shoot a lot — it was anamorphic 35mm, for probably the last time — and he’ll get four to five hours of material and then start boiling it down to three or four minutes.

After the shoot, we spent about three months going through all the material, and then I took a two-month break to work on Sense8, this sci-fi show for Netflix, and that break was a real gift. You step back and see it more objectively. So then we cut for another two or three months and had a two-hour cut. This felt a little slow, so we trimmed it down to under 100 minutes and did some test screenings, which I actually like. It shows you very quickly where the film drags and the bits that only interest you, not the audience.

Obviously, there are a lot of VFX. How many?
Quite a few hundred shots, done by Rise VFX and Arri, who worked together. The big thing was creating the ghost city. The tent was real, and so were various bits of road we put in. But there were no buildings at all, so we took this very modern office building in Rabat and scanned it in. Then we used this big empty construction site we found in Casablanca and scanned that in too. I like working with VFX, and I like it when the VFX and art department collaborate closely. It should always be one vision.

Can you talk about the importance of music and sound in the film?
It’s so important to me, especially as I compose with Johnny Klimek prior to shooting. It’s very different from the usual way of doing it as we compose most, if not all, the music before the shoot, and the editor has all that to use as he cuts. We never have to use temp music. As usual, we composed the music at home, and then recorded it in Leipzig, and we did the final sound mix at Arri in Berlin.

What’s next?
I’m doing this big TV series called Babylon Berlin — it’s 16 episodes, all set in the 1920s, which is the equivalent of eight films in terms of running time. I love the details of post, and there’s going to be a lot as there’s nothing left in Berlin from that period now. I want a street movie look, so it’ll be hard to do, but it’s an exciting challenge.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

DP John Seale on capturing ‘Mad Max: Fury Road’

This film vet goes digital for the first time with Alexa cameras and Codex recorders

Mad Max: Fury Road is the fourth movie in writer/director George Miller’s post-apocalyptic action franchise and a prequel to the first three. It is also the first digital film for Australian cinematographer John Seale ASC, ACS, whose career spans more than 30 years and includes such titles as The English Patient (for which he won an Oscar), The Mosquito Coast, Witness, Dead Poets Society and Rain Man.

Facing difficult conditions, intense action scenes and the need to accommodate a massive number of visual effects, Seale and his crew chose to shoot principal photography with Arri Alexa cameras and capture ArriRaw on Codex onboard recorders, a workflow that has become standard among filmmakers due to its ruggedness and easy integration with post.

Warner Bros.’ Fury Road, which takes place in a post-apocalyptic wasteland, was shot in Namibia. The coastal deserts of that African country are home to sand dunes measuring 1,000 feet high and 20 miles long. Frequent sandstorms and intense heat required special precautions by the camera crew.

“I’d shot plenty of film-negative films in deserts and jungles under severe conditions, but never digital,” notes Seale. “So I was a bit worried, but I had a fantastic crew of people who had done that… had worked with digital cameras in jungles, deserts, dry, heat, wet, moist, whatever. They were ready and put together full precaution kits of rain covers, dust covers and even heat covers to take the heat off the cameras in the middle of the day.

“We were using a lot of new gear,” Seale adds. “Everything that our crew did in pre-production in Sydney and took to Namibia worked very, very well for the entire time. Our time loss through equipment was minimal.”

Seale’s crew was outfitted with six Arri Alexas and a number of Canon 5Ds, with the latter used in part as crash-cams in action sequences. The Alexas were supported by 11 Codex on-board recorders. The relatively large number of cameras and recorders helped the camera crew to remain nimble. While one scene was being shot, the next was being prepped.

“We kept two kick cameras built the whole time and two ultra-high vehicles rigged the whole time,” explains camera coordinator Michelle Pizanis. “When we drove up (to a location) we could start shooting, rather than break down the camera at one site and rebuild it at the next.”

John Seale on location shooting Mad Max: Fury Road.

The original Mad Max is remembered for its gritty look. Fury Road took a different route due to the film’s heavy use of visual effects. “The DI and the post work is so explicit; almost every shot is going to be manipulated in some way,” Seale explains. “Our edict was ‘just shoot it.’ Continuity of light wasn’t really a question. We knew that the film would be cut very quickly, so there wouldn’t be time to analyze every shot. Intercutting between overcast and full sun wasn’t going to be a problem. On this film, the end result controlled the execution.”

In order to provide maximum image quality and flexibility for the post team, Seale and his crew chose to record ArriRaw with the Alexa cameras. That, the cinematographer notes, made Codex an obvious choice as only Codex recorders were capable of reliably capturing ArriRaw.

“The choice to go with Codex was definitely for the quality of the recording and post-production considerations,” says Seale. “Once again, we were a little worried about desert heat and desert cold. It changes so much from night to day. And during the day, we had dust storms, dust flying everywhere. We sometimes had moisture in the air. But the Codex systems didn’t fail us.”

Shooting digitally with Codex offered an advantage over shooting on film as it avoided the need to reload cameras with film negative in the blowing winds of the desert. “There is a certain amount of paraphernalia needed to shoot digitally,” Seale says. “But our crew was used to that. They built special boxes to put everything in. They had little fans. They had inlet and outlet areas to keep air circulation going. Those boxes were complete. Cables came out and went to the camera. If we were on the move, the boxes were bolted down so that they were out of the way and didn’t fall off. Sometimes we sat on them to get our shot.”

RF interfaces were used with the Alexa cameras to transmit images to a command vehicle for monitoring by director George Miller, who was not only able to review shots, he could edit material to determine what further coverage was needed. “For George, it was a godsend,” says Seale. “That refined the film shooting and made it a lot quicker than the normal procedures.”

It was that sort of flexibility that made shooting with Alexa and Codex so appealing, adds Seale. “I was a great advocate of digital 10 or 15 years ago when it started to come in. Film negative is a beautiful image recording process, but it’s 120 years old and you get scratches and dead flies caught in the reels. It’s pretty archaic.

“I think the way digital has caught on is extraordinary. Its R&D is vertical, where film development has stopped. The ability of digital to record images, coupled with the DI where you can change it and manipulate it, allows you to do anything you like. I know with Mad Max, it won’t look anything like a ‘good film image’ and it won’t look anything like a ‘good digital image’ — it will look like its own image. I think that’s the wonder of it.”

Director George Miller recently appeared at Comic-Con and seems to agree with Seale. “It was very familiar,” he said about returning to the Mad Max world. “A lot of time has passed. Technology has changed. It was an interesting thing to do. Crazy, but interesting.”

NAB: sweet, sweet pictures

By Tim Spitzer

I had the very enjoyable experience of seeing footage captured on two of the newest large-sensor cameras being introduced to the marketplace: Panasonic’s VariCam 35 and Arri’s Alexa 65.

Starting with the latter, a “for rental only” camera that captures 6.5K images only in ArriRaw — these are the most beautiful images I have seen captured on a digital sensor. In a brilliantly inspired demo, close-up images of the faces of Arri employees, shot without make-up…

Arri updating Amira software to include 4K UHD, MPEG-2 MXF recording

Arri has released Software Update Packet (SUP) 2.0 for its Amira cameras and announced SUP 3.0, which is scheduled for release in mid-2015. The former unlocks 4K UHD recording for high-resolution pipelines, while the latter enables MPEG-2 MXF recording for streamlined, broadcast-friendly workflows.

The Amira targets productions ranging from documentaries, news reporting and corporate films to TV and low-budget movies. These two major software updates respond specifically to customer requests and industry trends.

The key new feature of Amira SUP 3.0 is the ability to record MPEG-2 422P@HL at 50 Mbit/s in an MXF wrapper. This XDCAM-compatible MPEG-2 recording format allows television productions to take advantage of Amira’s image quality and ergonomics, while using a low-bandwidth codec that can easily be integrated into typical broadcast environments and workflows.

Recording MPEG-2 MXF with Amira ensures 100 percent compatibility with the format already used by many low-budget or time-pressured television productions, for which a streamlined workflow through ingest, editing and post is key. This cost-efficient format not only minimizes the number of memory cards needed on set, but also reduces post and archiving costs through lower data rates and seamless integration with standard tools.
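For a rough sense of scale, here is a quick Python calculation of what a 50 Mbit/s video stream amounts to per hour of recording (video essence only; audio and MXF wrapper overhead are ignored). The numbers are simple arithmetic on the bit rate quoted above, not Arri specifications.

```python
# Quick data-rate arithmetic for a 50 Mbit/s MPEG-2 stream. Video essence only;
# audio and MXF wrapper overhead are ignored for simplicity.

def gb_per_hour(mbit_per_s):
    """Gigabytes (decimal) generated per hour at a given megabit-per-second rate."""
    return mbit_per_s * 1e6 * 3600 / 8e9

print(gb_per_hour(50))  # ~22.5 GB per hour at 50 Mbit/s
```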

To help further integration of Amira into television production environments, a new audio accessory will be released. Taking the form of an extension to the back of the camera body, it will equip Amira with a slot for a portable audio tuner/receiver. This will allow signals to be received wirelessly from either the sound recordist’s mixer or straight from radio microphones, accommodating the needs of ENG-style productions that capture audio directly in-camera, but value the most cable-free configuration possible.

‘Marvel’s Agent Carter’ TV series using ArriRaw/Codex workflow

ABC’s Agent Carter, the newest television series from Marvel, has been getting great reviews and lots of eyeballs. It is also using a distinctive production workflow — footage is captured on Arri Alexa XTs, and ArriRaw is being used for main unit photography.

In order to enable this workflow, Codex has provided digital recording for director of photography Gabriel Beristain’s cameras, and has consulted with visual effects supervisor Sheena Duggal on the lens mapping, to assist VFX production.

The show, which is set in the 1940s, focuses on Agent Peggy Carter (Hayley Atwell), a secretary who has been recruited by Howard Stark to take on secret missions. One episode is produced over the course of eight days, with roughly half shot on stages and half at Los Angeles locations that double for the show’s ’40s New York setting.

The use of ArriRaw on Agent Carter is typical of Beristain (Magic City, Dolores Claiborne, The Spanish Prisoner, The Ring Two, Blade: Trinity), who was also among the first to pair vintage glass and Alexa XT digital cameras for a television series. On Marvel’s Agent Carter, Beristain worked without a DIT, saying that the Codex/ArriRaw workflow has allowed him to focus on aesthetics and stay involved with the cast.

“It’s analogous to the film system in some ways, where I know how my negative is going to behave,” says Beristain. “It’s going back to a system that always worked really well for us, and we’re getting phenomenal results. Codex recording technology provides us with the technology to capture everything, and get the best possible image.”

The Codex/ArriRaw workflow also helps with post and the VFX shots. Over the course of the eight-episode season, an estimated 1,000 visual effects shots will be created. ILM, Base Effects and Double Negative are working on the show.

“It was always our intention that the VFX should look photorealistic and seamless and, since we had already done a Marvel One-Shot short, the bar was set to a high standard,” explains Duggal (Thor: The Dark World, Iron Man 3, The Hunger Games). “The challenge was how to create large volumes of photorealistic VFX shots, at Marvel-feature-quality, but on a network TV post schedule, which ranges from 16 to 20 days, once the picture is locked.

“Gabby decided that we should shoot ArriRaw to capture the best quality images, something that had not been done for network TV before, to my knowledge,” continues Duggal. “And when it came to camera shooting formats, we decided together that we would like to shoot open gate for the VFX plates and 16:9 for the non-VFX shots. I consulted with Codex and we came up with camera graticules and a VFX workflow for the image extraction. I had also been working on a lens mapping initiative with Codex, and camera rental house Otto Nemenz, to map the lenses for VFX, and I’m happy to say that we implemented this for the first time on Marvel’s Agent Carter.”
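As a rough illustration of the open-gate-plus-16:9-extraction approach Duggal describes, here is a minimal Python sketch that computes a centered 16:9 extraction rectangle from a larger open-gate frame. The example frame dimensions are assumptions for illustration, not the show’s actual graticule or extraction specs.

```python
# Illustrative sketch: compute the largest centered 16:9 extraction that fits
# inside an open-gate frame. Example dimensions are assumptions, not the
# show's actual graticule or extraction specs.

def centered_16x9_extraction(gate_w, gate_h):
    """Return (x, y, w, h) of the largest centered 16:9 rectangle that fits."""
    target = 16 / 9
    if gate_w / gate_h >= target:      # gate wider than 16:9 -> trim the sides
        h = gate_h
        w = round(h * target)
    else:                              # gate taller than 16:9 -> trim top/bottom
        w = gate_w
        h = round(w / target)
    x = (gate_w - w) // 2
    y = (gate_h - h) // 2
    return x, y, w, h

# Example with assumed open-gate dimensions:
print(centered_16x9_extraction(3424, 2202))
```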

Arri’s Franz Kraus discusses the Alexa 65

By Randi Altman

During IBC 2014 in Amsterdam, I was offered the opportunity to sit down with Franz Kraus, managing director of Arri. There was some breaking news he was willing to share — a new camera that had been whispered about here and there on the show floor. This was one week before the Cinec show in Munich, where the camera company introduced its newest offering, the Alexa 65.

How could I turn down that kind of opportunity?! So I headed out of the RAI Convention Center, went straight into my “this is how a native New Yorker walks” mode and got to a neighboring hotel just in time to wipe my brow, put my handy iOgrapher iPad mini rig onto a tripod and hit record.
