
Senior colorist Nicholas Hasson joins Light Iron’s LA team

Post house Light Iron has added senior colorist Nicholas Hasson to its roster. He will be based in the company’s Los Angeles studio.

Hasson colored the upcoming Tiffany Haddish feature Nobody’s Fool and Season 2 of HBO’s Room 104. Additional past credits include Boo 2! A Madea Halloween, Masterminds, All About Nina and commercial campaigns for Apple, Samsung and Google. He worked most recently at Technicolor, but his long career has included time at ILM, Company 3 and Modern VideoFilm.

“Nicholas has a wealth of experience that makes him a great fit with our team,” says Light Iron GM Peter Cioni. “His background in color, online and VFX ensures success in meeting clients’ creative objectives and enables flexibility in working across both episodic and feature projects.”

Like Light Iron’s other LA-based colorists, led by Ian Vertovec, Hasson is able to support cinematographers working in other regions through virtual DI sessions in Panavision’s network of connected facilities. (Light Iron is a Panavision company.)

Hasson joins Light Iron during a time of high-profile streaming releases including Netflix’s Maniac and Facebook’s Sorry For Your Loss, as well as feature releases garnering awards buzz, such as Can You Ever Forgive Me? and What They Had.

“This is a significant time of growth for Panavision’s post production creative services,” concludes Cioni. “We are thrilled to have Nicholas with us as we enter this next chapter of expansion.”

Panavision, Sim, Saban Capital agree to merge

Saban Capital Acquisition Corp., a publicly traded special purpose acquisition company, Panavision and Sim Video International have agreed to combine their businesses to create a premier global provider of end-to-end production and post production services to the entertainment industry. Under the terms of the business combination agreement, Panavision and Sim will become wholly owned subsidiaries of Saban Capital Acquisition Corp. Upon completion, Saban Capital Acquisition Corp. will change its name to Panavision Holdings Inc. and is expected to continue to trade on the Nasdaq stock exchange. Kim Snyder, president and chief executive officer of Panavision, will serve as chairman and chief executive officer. Bill Roberts, chief financial officer of Panavision, will serve in that role for the combined company.

Panavision designs, manufactures and provides high-precision optics and camera technology for the entertainment industry and is a leading global provider of production equipment and services. Sim is a leading provider of production and post production solutions with facilities in Los Angeles, Vancouver, Atlanta, New York and Toronto.

“This acquisition will leverage the best of Panavision’s and Sim’s resources by providing comprehensive products and services to best address the ever-adapting needs of content creators globally,” says Snyder.

“We’re combining the talent and integrated services of Sim with two of the biggest names in the business, Panavision and Saban,” adds James Haggarty, president and CEO of Sim. “The resulting scale of the new combined enterprise will better serve our clients and help shape the content-creation landscape.”

The respective boards of directors of Saban Capital Acquisition Corp., Panavision and Sim have unanimously approved the merger with completion subject to Saban Capital Acquisition Corp. stockholder approval, certain regulatory approvals and other customary closing conditions. The parties expect that the process will be completed in the first quarter of 2019.

Our Virtual Production Roundtable

By Randi Altman

Evolve or die. That old adage, while very dramatic, fits well with the state of our current production workflows. While most productions are now shot digitally, the warmth of film is still in the back of pros’ minds. Camera makers and directors of photography often look for ways to retain that warmth in digital, whether it’s through lighting, vintage lenses, color grading, newer technology or all of the above.

There is also the question of setting looks on-set and how 8K and HDR are affecting the picture and workflows. And let’s not forget shooting for OTT series. There is a lot to cover!

In an effort to get a variety of perspectives, we reached out to a few cinematographers and some camera manufacturers to talk trends and technology. Enjoy!

Claudio Miranda, ASC

Claudio Miranda is a Chilean cinematographer who won an Oscar for his work on Life of Pi. He also worked on The Curious Case of Benjamin Button, the first movie nominated for a cinematography Oscar that was shot entirely on digital. Other films include Oblivion, Tomorrowland and the upcoming Top Gun: Maverick.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Seems like everyone is shooting large format. Chris Nolan and Quentin Tarantino shot 65mm film for their last projects. New digital cameras such as the Alexa LF and Sony Venice cater to this demand. People seem to like the shallow depth of field of these larger format lenses.

How is HDR affecting the way things are being shot these days? Are productions shooting/monitoring HDR on-set?
For me, too much grain in HDR can be distracting. This must be moderated in the camera acquisition format choice and DI. Panning in a high-contrast environment can cause painful strobing. This can be helped in the DI and set design. HDR done well is more important than 8K or even 3D.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K can be important for VFX plates. For me, creatively it is not important, 4K is enough. The positive of 8K is just more K. The downside is that I would rather the camera companies focus on dynamic range, color latitude, sensitivity and the look and feel of the captured image instead of trying to hit a high K number. Also, there are storage and processing issues.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
I have not shot for a streaming service. I do think we need to pay attention to all deliverables and make adjustments accordingly. In the DI, I am there for the standard cinema pass, HDR pass, IMAX pass, home video pass and other formats that arise.

Is the availability of all those camera resolutions a help or a hindrance?
I choose the camera that will fit the job. It is my job in prep to test and pick the camera that best serves the movie.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
On set, I am able to view HDR or 709. I test the pipeline and make sure the LUT is correct and make modifications if needed. I do not play with many LUTs on set, I normally just have one. I treat the camera like a film stock. I know I will be there in the DI to finalize the look. On set is not the place for futzing with LUTs on the camera. My plate is full enough as it is.

If not already covered, how has production changed in the last two years?
I am not sure production has changed, but there are many new tools to use to help make work more efficient and economical. I feel that I have always had to be mindful of the budget, no matter how large the show is. I am always looking for new solutions.

Daryn Okada, ASC
Daryn Okada is known for his work on films such as Mean Girls, Anna Karenina and Just Like Heaven. He has also worked on many TV series, such as Scandal, Grey’s Anatomy and Castle. He served as president of the ASC from 2006 to 2009.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses? 

Modern digital cinema cameras, with the proper workflows and techniques, can achieve a level of quality that allows a story’s visual identity to evolve in ways that parallel explorations of shooting on film. Larger image sensors, state-of-the-art lenses and mining historic optics enable cinematographers to use their experience and knowledge of the past to paint rich visual experiences for today’s audience.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
HDR is a creative and technical medium just as shooting and projecting 65mm film would be. It’s up to the director and the cinematographer to decide how to orchestrate the use of HDR for their particular story.

Can you address 8K? What are the positives, and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
8K is working its way into production like 65mm and 35mm VistaVision did, by providing more technical resolution for use in VFX or special-venue exhibition. The enormous amount of data, and the cost to handle it, must be justified by its financial return and by whether it benefits a particular story. Latitude and color depth are paramount to creating a motion picture’s palette and texture. Trying to use a format just because it’s technically possible may be distracting to an audience’s acceptance of a story or creative concept.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?

I think the delivery specifications of OTT have generally raised the bar, making 4K and wide color gamut the norm. For cinematographers who have spent years photographing features, we are accustomed to creating images with detail for a big screen and a wide color palette. It’s a natural creative process to shoot for 4K and HDR in that respect.

Is the availability of all those camera resolutions a help or a hindrance?
Having the best imaging available is always welcomed. Even if a camera is not fully exploited technically, subtler, richer images are possible through the smoother transitions and blending of color, contrast and detail that come from originating at higher resolutions and with a wider color range.

Can you talk about color management from the sensor/film to the screen? How do you ensure correct color management from the set into dailies and post, the DI and final delivery?
As cinematographers, we are still involved in workflows for dailies and post production to ensure that everyone’s creative efforts on the final production are maintained for the immediate viewer and preserved for audiences in the future.

How has production changed over the last two years?
There are more opportunities to produce content with creative high-quality cinematography thanks to advancements in cameras and cost-effective computing speed combined with demands of high quality displays and projection.

Vanja Černjul, ASC
This New York-based DP recently worked on the huge hit Crazy Rich Asians. In addition to feature film work, Černjul has shot TV shows (The Deuce’s Season 1 finale and two seasons of Marco Polo), as well as commercials for Panasonic and others.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
One interesting trend I noticed is the comeback of image texture. In the past, cinematographers used to expose film stock differently according to the grain texture they desired. Different exposure zones within the same frame had different grain character, which produced additional depth of the image. We lost that once we switched to digital. Crude simulations of film grain, such as overall filters, couldn’t produce the dimensionality we had with film.

Today, I am noticing new ways of bringing the texture back as a means of creative expression. The first one comes in the form of new, sophisticated post production tools designed to replicate the three-dimensional texturing that occurs naturally when shooting film, such as the realtime texturing tool LiveGrain. Monitoring the image on the set with a LiveGrain texture applied can impact lighting, filtration or lens choices. There are also new ways to manipulate texture in-camera. With the rise of super-sensitive, dual-native ISO sensors we can now shoot at very low-light levels and incorporate so-called photon shot noise into the image. Shot noise has organic character, very much like film grain.
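
To illustrate what Černjul is describing, here is a minimal Python sketch (an illustration, not anything from his productions) that models photon shot noise with a Poisson process: as the photon count per pixel drops at low light levels, the relative noise grows, which is the organic, exposure-dependent texture he compares to film grain.

```python
import numpy as np

rng = np.random.default_rng(1)

def expose(scene, photons_at_white):
    """Simulate photon shot noise: photon counts are Poisson-distributed,
    so relative noise grows as the photons collected per pixel drop."""
    counts = rng.poisson(scene * photons_at_white)
    return counts / photons_at_white

scene = np.full((512, 512), 0.18)                # flat 18% gray patch
bright = expose(scene, photons_at_white=10000)   # well-exposed pixel
dim = expose(scene, photons_at_white=200)        # low-light exposure
print(bright.std() / 0.18, dim.std() / 0.18)     # dim is roughly 7x noisier
```

The roughly 7x difference simply reflects the square-root relationship between photon count and shot noise.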

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?

The creative potential of HDR technology is far greater than that of added resolution. Unfortunately, it is hard for cinematographers to take full advantage of HDR because it is still far from being the standard way the audience sees our images. We can’t have two completely different looks for a single project, and we have to make sure the images are working on SDR screens. In addition, it is still impractical to monitor in HDR on the set, which makes it difficult to adjust lighting and lens choices to expanded dynamic range. Once HDR screens become a standard, we will be able to really start creatively exploring this new territory.

Crazy Rich Asians

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
Additional resolution adds more available choices regarding relationship of optical systems and aspect ratios. I am now able to choose lenses for their artifacts and character regardless of the desired aspect ratio. I can decide to shoot one part of the film in spherical and the other part in anamorphic and crop the image to the project’s predetermined aspect ratio without fear of throwing away too much information. I love that freedom.

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices and workflows, if at all?
For me, the only practical difference between shooting high-quality content for cable or streaming is the fact that Netflix requires its projects to be captured in true 4K RAW. I like the commitment to higher technical standards, even though this may be an unwelcome restriction for some projects.

Is the availability of all those camera resolutions a help or a hindrance?
I like choices. As large format lenses become more available, shooting across formats and resolutions will become easier and simpler.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The key for correct color management from the set to final color grading is in preproduction. It is important to take the time to do proper tests and establish the communication between DIT, the colorist and all other people involved as early as possible. This ensures that original ideas aren’t lost in the process.

Adjusting and fine-tuning the LUT to the lenses, lighting gels and set design and then testing it with the colorist is very important. Once I have a bulletproof LUT, I light and expose all the material for it specifically. If this part of the process is done correctly, the time in final color grading can be spent on creative work rather than on fixing inconsistencies.

I am very grateful for ACES workflow, which offers long-overdue standardization. It is definitely a move in the right direction.
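
As a rough illustration of what “applying the LUT” means once it leaves the set, here is a minimal Python/NumPy sketch of how a 3D LUT maps code values, using a hypothetical identity LUT in place of a real show LUT. Dailies and grading tools use trilinear or tetrahedral interpolation; the nearest-neighbor lookup below is only meant to show the indexing idea.

```python
import numpy as np

def apply_lut_3d(rgb, lut):
    """Apply a 3D LUT (shape N x N x N x 3, values 0-1) to an image
    (H x W x 3, values 0-1) using nearest-neighbor lookup for brevity."""
    n = lut.shape[0]
    idx = np.clip((rgb * (n - 1)).round().astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# Identity LUT for demonstration: output equals input (to grid precision).
n = 33
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)

frame = np.random.rand(4, 4, 3)          # stand-in for a log-encoded frame
graded = apply_lut_3d(frame, identity_lut)
assert np.allclose(frame, graded, atol=1 / (n - 1))
```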

How has production changed over the last two years?
With all the amazing post tools that are becoming more available and affordable, I am seeing negative trends of further cutting of preproduction time, and lack of creative discipline on the set. I sincerely hope this is just a temporary confusion due to recalibration of the process.

Kate Reid, DP
Kate Reid is a UK-based DP working in TV and film. Her recent work includes the TV series Hanna (Amazon) and Marcella 2 (Netflix), as well as additional photography on the final season of Game of Thrones for HBO. She is currently working on Press for the BBC.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format cameras are being used increasingly on drama productions to satisfy the requirement for additional resolution from certain distribution platforms. And, of course, the choice to use large format cameras in drama gives DPs another aesthetic tool: deciding whether the increased depth-of-field fall-off, added clarity in the image and so on enhance the particular story they wish to portray on screen.

Like many other DPs, I have always enjoyed using older lenses to help make the digital image softer, more organic and less predictable. But much of this older glass, designed for 35mm-sized sensors, may not cover the increased sensor size of larger format cameras, so newer lenses designed for large format may become popular by necessity, alongside older large-format glass that is enjoying a renaissance.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
I have yet to shoot a show that requires HDR delivery. It hasn’t yet become the default in drama production in the UK.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and frame rate more important currently?
I don’t inherently find an ultra sharp image attractive. Through older glass and diffusion filters on the lens, I am usually looking to soften and break down my image, so I personally am not all about the extra Ks. How the camera’s sensor reproduces color and handles highlights and shadows is of more interest to me, and I believe has more impact on the picture.

Of primary importance is how practical a camera is to work with — size and how comfortable the camera is to handle would supersede excessive resolution — as the first requirement of any camera has got to be whether it allows you to achieve the shots you have in mind, because a story isn’t told through its resolution.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices, and workflows, if at all?
The major change is the requirement by Netflix for true 4K resolution, determining which cameras cinematographers are allowed to shoot on. For many cinematographers the Arri Alexa was their digital camera of choice, which was excluded by this rule, and therefore we have had to look to other cameras for such productions. Learning a new camera, its sensor, how it handles highlights, produces color, etc., and ensuring the workflow through to the post facility is something that requires time and testing, which has certainly added to a DP’s workload.

From a creative perspective, however, I found shooting for OTTs (I shot two episodes of the TV series Hanna made by Working Title TV and NBC Universal for Amazon) has been more liberating than making a series for broadcast television as there is a different idea and expectation around what the audience wants to watch and enjoy in terms of storytelling. This allowed for a more creative way of filming.

Is the availability of all those camera resolutions a help or a hindrance?
Where work is seen now can vary from a mobile phone screen to a digital billboard in Times Square, so it is good for DPs to have a choice of cameras and their respective resolutions so we can use the best tool of each job. It only becomes a hindrance if you let the technology lead your creative process rather than assist it.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Ideally, I will have had the time and opportunity to shoot tests during prep and then spend half a day with the show’s colorist to create a basic LUT I can work with on set. In practice, I have always found that I tweak this LUT during the first days of production with the DIT, and this is what serves me throughout the rest of the show.

I usually work with just one LUT that will be some version of a modified Rec. 709 (unless the look of the show drastically requires something else). It should then be straightforward in that the DIT can attach a LUT to the dailies, and this is the same LUT applied by editorial, so that exactly what you see on set is what is being viewed in the edit.

However, where this fails is that the dailies uploaded to FTP sites — for viewing by the execs, producers and other people who have access to the work — are usually very compressed with low resolution, so it bears little resemblance to how the work looked on set or looks in the edit. This is really unsatisfying as for months, key members of production are not seeing an accurate reflection of the picture. Of course, when you get into the grade this can be restored, but it’s dangerous if those viewing the dailies in this way have grown accustomed to something that is a pale comparison of what was shot on set.

How has production changed over the last two years?
There is less differentiation between film and television in how productions are being made and, critically, where they are being seen by audiences, especially with online platforms now making award-winning feature films. The high production values we’ve seen on Netflix and Amazon’s biggest shows have seen UK television dramas pushing to up their game, which does put pressure on productions, shooting schedules and HODs, as the budgets to help achieve this aren’t there yet.

So, from a ground-level perspective, for DPs working in drama this looks like more pressure to produce work of the highest standard in less time. However, it’s also a more exciting place to be working, as the ideas about how you film something for television versus cinema no longer need apply. The perceived ideas of what an audience is interested in, or expects, are being blown out of the water by the success of new original online content, which flies in the face of more traditional storytelling. Broadcasters are noticing this and, hopefully, this will lead to more exciting and cinematic mainstream television in the future.

Blackmagic’s Bob Caniglia
In addition to its post and broadcast tools, Blackmagic offers many different cameras, including the Pocket Cinema Camera, Pocket Cinema Camera 4K, Micro Studio Camera 4K, Micro Cinema Camera, Studio Camera, Studio Camera 4K, Ursa Mini Pro, Ursa Mini 4.6K and Ursa Broadcast.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Lens freedom is on everyone’s mind right now… having the freedom to shoot in any style. This is bringing about things like seeing projects shot on 50-year-old glass because the DP liked the feel of a commercial back in the ’60s.

We actually just had a customer test out actual lenses that were used on The Godfather, The Shining and Casablanca, and it was amazing to see the mixing of those with a new digital cinema camera. And so many people are asking for a camera to work with anamorphic lenses. The trend is really that people expect their camera to be able to handle whatever look they want.

For large format use, I would say that both Hollywood and indie filmmakers are using them more often. Or, at least, they’re trying to get the general large format look by using anamorphic lenses to get a shallow depth of field.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Right now, HDR is definitely more of a concern for DPs in Hollywood, but also with indie filmmakers and streaming service content creators. Netflix and Hulu have some amazing HDR shows right now. And there is plenty of choice when it comes to the different HDR formats and shooting and monitoring on set. All of that is happening every day, while 8K still needs the industry to catch up with the various production tools.

As for impacting shooting, HDR is about more immersive colors, and a DP needs to plan for it. It gives viewers a whole new level of image detail in what they shoot. They have to be much more aware of every surface or lighting impact so that the viewer doesn’t get distracted. Attention to detail gets even higher in HDR, and DPs and colorists will need to keep a close eye on every shot, including when an image in a sideview mirror’s reflection is just a little too sharp and needs a tweak.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? Or are latitude and framerate more important currently?
You can never have enough Ks! Seriously. It is not just about getting a beautiful 8K TV, it is about giving the production and post pros on a project as much data as possible. More data means more room to be creative, and is great for things like keying.

Latitude and framerate are important as well, and I don’t think any one is more important than another. For viewers, the beauty will be in large displays. You’re already seeing 8K displays in Times Square, and though you may not need 8K on your phone, 8K on the side of a building or highway will be very impactful.

I do think one of the ways 8K is changing production practices is that people are going to be much more storage conscious. Camera manufacturers will need to continue to improve workflows as the images get larger in an effort to maximize storage efficiencies.
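
To put that storage consciousness in rough numbers, here is an illustrative back-of-the-envelope calculation in Python. The figures assume an uncompressed 16-bit single-channel readout of a hypothetical 8192x4320 sensor at 24fps; real cameras record compressed RAW, so actual data rates are considerably lower than this ceiling.

```python
# Rough, illustrative math only: an uncompressed upper bound, not a spec.
width, height = 8192, 4320        # assumed "8K" photosite count
bits_per_photosite = 16           # assumed single-channel sample depth
fps = 24
seconds_per_hour = 3600

bytes_per_frame = width * height * bits_per_photosite / 8
gb_per_hour = bytes_per_frame * fps * seconds_per_hour / 1e9
print(f"~{bytes_per_frame / 1e6:.0f} MB/frame, "
      f"~{gb_per_hour / 1e3:.1f} TB/hour uncompressed")
```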

Can you talk about how shooting streaming content, for OTTs like Netflix/Amazon, has changed production practices, and workflows, if at all?
For streaming content providers, shoots have definitely been impacted and are forcing productions to plan for shooting in a wider number of formats. Luckily, companies like Netflix have been very good about specifying up front the cameras they approve and which formats are needed.

Is the availability of all those camera resolutions a help or a hindrance?
While it can be a bit overwhelming, it does give creatives some options, especially if they have a smaller delivery size than the acquisition format. For instance, if you’re shooting in 4K but delivering in HD, you can do dynamic zooms from the 4K image that look like an optical zoom, or you can get a tight shot and wide shot from the same camera. That’s a real help on a limited budget of time and/or money.
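
The dynamic punch-in Caniglia mentions is easy to reason about as a crop window. The short Python sketch below (a generic illustration, not any vendor’s tool) computes a crop region in a UHD frame that scales to an HD deliverable.

```python
def crop_window(src_w, src_h, dst_w, dst_h, zoom=1.0, center=(0.5, 0.5)):
    """Return (x, y, w, h) of a crop window in the source frame that,
    when scaled to dst_w x dst_h, reads like an optical punch-in.
    zoom=1.0 is a 1:1 pixel crop (tightest framing with no upscaling);
    larger values crop a bigger window and scale it down."""
    w = min(src_w, round(dst_w * zoom))
    h = min(src_h, round(dst_h * zoom))
    x = min(max(round(center[0] * src_w - w / 2), 0), src_w - w)
    y = min(max(round(center[1] * src_h - h / 2), 0), src_h - h)
    return x, y, w, h

# A 3840x2160 source delivering 1920x1080: framing can range from the
# full frame (zoom=2.0, the wide shot) down to a 1:1 crop (the tight shot).
print(crop_window(3840, 2160, 1920, 1080, zoom=1.0))
print(crop_window(3840, 2160, 1920, 1080, zoom=2.0))
```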

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Have the production and post people plan together from the start, and create the look everyone should be working toward right up front.

Set the LUTs you want before a single shot is done and manage the workflow from camera to final post. Also, choose post software that can bring color correction on-set, near-set and off-set. That lets you collaborate remotely. Definitely choose a camera that works directly with any post software, and avoid transcoding.

How has production changed in the last two years?
Beyond the rise of HDR, one of the other big changes is that more productions are thinking live and streaming more than ever before. CNN’s Anderson Cooper now does a daily Facebook Live show. AMC has the live Talking Dead-type formats for many of their shows. That trend is going to keep happening, so cinematographers and camera people need to be thinking about being able to jump from scripted to live shooting.

Red Digital Cinema’s Graeme Nattress
Red Digital Cinema manufactures professional digital cameras and accessories. Red’s DSMC2 camera offers three sensor options — Gemini 5K S35, Helium 8K S35 and Monstro 8K VV.

Can you talk about some camera trends you’ve been seeing?
Industry camera trends continue to push image quality in all directions. Sensors are getting bigger, with higher resolutions and more dynamic range. Filmmakers continue to innovate, making new and amazing images all the time, which drives our fascination for advancing technology in service to the creative.

How is HDR affecting the way things are being shot these days?
One of the benefits of a primary workflow based on RAW recording is that HDR is not an added extra, but a core part of the system. Filmmakers do consider HDR important, but there’s some concern that HDR doesn’t always look appealing, and that it’s not always an image quality improvement. Cinematography has always been about light and shade and how they are controlled to shape the image’s emotional or storytelling intent. HDR can be a very important tool in that it greatly expands the display canvas to work on, but a larger canvas doesn’t mean a better picture. The increased display contrast of HDR can make details more visible, and it can also make motion judder more apparent. Thus, more isn’t always better; it’s about how you use what you have.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
Without resolution, we don’t have an image. Resolution is always going to be an important image parameter. What we must keep in mind is that camera resolution is based on input resolution to the system, and that can — and often will — be different to the output resolution on the display. Traditionally, in video the input and output resolutions were one and the same, but when film was used — which had a much higher resolution than a TV could display — we were taking a high-resolution input and downsampling it to the display, the TV screen.

As with any sampled system, in a digital cinema camera there are some properties we seek to protect and others to diminish. We want a high level of detail, but we don’t want sharpening artifacts and we don’t want aliasing. The only way to achieve that is through a high-resolution sensor, properly filtered (optical low-pass), that can see a large amount of real, un-enhanced detail. So yes, 8K can give you lots of fine detail should you want it, but the imaging benefits extend beyond fine detail when you downsample to 4K or 2K: 8K makes for an incredibly robust image, noise is reduced, and what noise remains takes on more of a texture, which is much more aesthetically pleasing.
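
The noise-reduction point follows directly from averaging: downsampling 8K to 4K combines blocks of samples, and averaging independent noise reduces its standard deviation. A small NumPy sketch with a synthetic noisy frame (illustrative values only, not camera data) shows the effect.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic gray frame with additive noise of sigma = 0.05 (stand-in values).
frame_hi = 0.5 + rng.normal(0.0, 0.05, size=(2160, 4096))

# 2x2 box-filter downsample: average each 2x2 block of samples.
frame_lo = frame_hi.reshape(1080, 2, 2048, 2).mean(axis=(1, 3))

print(round(frame_hi.std(), 4))  # ~0.05
print(round(frame_lo.std(), 4))  # ~0.025: averaging 4 samples halves the noise
```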

One challenge of 8K is an increase in the amount of sensor data to be recorded, but that can be addressed through quality compression systems like RedCode.

Addressing dynamic range is very important because dynamic range and resolution work together to produce the image. It’s easy to think that high resolutions have a negative impact upon dynamic range, but improved pixel design means you can have dynamic range and resolution.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
Color management is vitally important and so much more than just keeping color control from on-set through to delivery. Now with the move to HDR and an increasing amount of mobile viewing, we have a wide variety of displays, all with their own characteristics and color gamuts. Color management allows content creators to display their work at maximum quality without compromise. Red cameras help in multiple ways. On camera, one can monitor in both SDR and HDR simultaneously with the new IPP2 image processing pipeline’s output independence, which also allows you to color via CDL and creative 3D LUT in such a way as to have those decisions represented correctly on different monitor types.

In post and grading, the benefits of output independence continue, but now it’s critical that scene colors, which can so easily go out of gamut, are dealt with tastefully. Through the metadata support in the RedCode format, all the creative decisions taken on set follow through to dailies and post, but never get in the way of producing the correct image output, be it for VFX, editorial or grading.

Panavision’s Michael Cioni 
Panavision designs and manufactures high-precision camera systems, including both film and digital cameras, as well as lenses and accessories for the motion picture and television industries.

Can you talk about some camera trends you’ve been seeing?
With the evolution of digital capture, one of the most interesting things I’ve noticed in the market is the new trends emerging from the optics side of cinematography. At a glance, it can appear as if there is a desire for older or vintage lenses based on the increasing resolution of large format digital cameras. While resolution is certainly a factor, I’ve noticed that the larger contributor to vintage glass is the quality of sensors, not the resolution itself. As sensors increase in resolution, they simultaneously show improvements in clarity, low-light capability, color science and signal-to-noise ratio.

The compounding effect of all these elements is improving images far beyond what was possible with analog film technology, which explains why the same lens behaves differently on film, S35 digital capture and large-format digital capture. As these looks continue to become popular, Panavision is responding through our investments both in restoring classic lenses and in designing new lenses with classic characteristics and textures that are optimized for large format photography on super sensors.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look?
Creating images is not always about what component is better, but rather how they elevate images by working in concert. HDR images are a tool that increases creative control alongside high resolution and 16-bit color. These components work really well together because a compelling image can make use of more dynamic range, more color and more clarity. Its importance is only amplified by the amalgamation of high-fidelity characteristics working together to increase overall image flexibility.

Today, the studios are still settling into an HDR world because only a few groups, led by OTT, are able to distribute in HDR to wide audiences. On-set tools capable of HDR, 4K and 16-bit color are still in their infancy and currently cost-prohibitive. 4K/HDR on the set is going to become a standard practice by 2021. 4K wireless transmitters are the first step — they are going to start coming online in 2019. Smaller OLED displays capable of 750 nits+ will follow in 2020, creating an excellent way to monitor higher quality images right on set. In 2021, editorial will start to explore HDR and 4K during the offline process. By 2024, all productions will be HDR from set to editorial to post to mobile devices. Early adopters that work out the details today will find themselves ahead of the competition and having more control as these trends evolve. I recommend cinematographers embrace the fundamentals of HDR, because understanding the tools and trends will help prevent images from appearing artificial or overdone.

Can you address 8K? What are the positives and the negatives? Do we just have too many Ks these days? What’s more important, resolution or dynamic range?
One of the reasons we partnered with Red is because the Monstro 8K VV sensor makes no sacrifice in dynamic range while still maintaining ultra high smoothness at 16 bits. The beauty of technology like this is that we can finally start to have the best from all worlds — dynamic range, resolution, bit depth, magnification, speed and workflow — without having to make quality sacrifices. When cinematographers have all these elements together, they can create images previously never seen before, and 8K is as much part of that story as any other element.

One important way to view 8K is not solely as a thermometer for high-resolution sharpness. A sensor with 35 million pixels is necessary in order to increase the image size, similar to trends in professional photography. 8K large format creates a larger, more magnified image with a wider field of view and less distortion, like the difference in images captured by 70mm film. The biggest positive I’ve noticed is that DXL2’s 8K large-format Red Monstro sensor is so good in terms of quality that it isn’t impacting images themselves. Lower quality sensors can add a “fingerprint” to the image, which can distort the original intention or texture of a particular lens.

With sensors like Monstro capable of such high precision, the lenses behave exactly as the lens maker intended. The same Panavision lenses on a lower grade sensor, or even 35mm film, are exhibiting characteristics that we weren’t able to see before. This is literally breathing new life into lenses that previously didn’t perform the same way until Monstro and large format.

Is the availability of so many camera formats a help or a hindrance?
You don’t have to look far to identify individuals who are easily fatigued by having too many choices. Some of these individuals cope with choices by finding ways to regulate them, and they feel fewer choices means more stability and perhaps more control (creative and economic). As an entrepreneur, I find the opposite to be true: I believe regulating our world, especially with regards to the arts and sciences, is a recipe for protecting the status quo. I fully admit there are situations in which people are fatigued by too many complex choices.

I find that the failure is not of the technology itself; rather, it’s the fault of the manufacturers who have not provided the options in easy-to-consume ways. Having options is exactly what creatives need in order to explore something new and improved. But it’s also up to manufacturers to deliver the message in ways everyone can understand. We’re still learning how to do that, and with each generation the process changes a bit. And while I am not always certain which are the best ways to help people understand all the options, I am certain that the pursuit of new art will motivate us to go out of our comfort zones and try something previously thought not possible.

Have you encountered any examples of productions that have shot streaming content (i.e. for Netflix/Amazon) and had to change production practices and workflows for this format/deliverable?
Netflix and Amazon are exceptional examples of calculated risk takers. While most headlines discuss their investment in the quantity of content, I find the most interesting investment they make is in relationships. Netflix and Amazon are heavily invested in standards groups, committees, outreach, panels and constant communication. The model of the past and present (incumbent studios) are content creators with technology divisions. The model of the future (Netflix, Amazon, Hulu, Apple, Google and YouTube) are all the technology companies with the ability to create content. And technology companies approach problems from a completely different angle by not only embracing the technology, they help invent it. In this new technological age, those who lead and those who follow will likely be determined by the tools and techniques used to deliver. What I call “The Netflix Effect” is the impact Netflix has on traditional groups and how they have all had to strategically pivot based on Netflix’s impact.

How do you ensure correct color management from the set into dailies and post production, the DI and final delivery?
The DXL2 has an advanced color workflow. In collaboration with LiveGrade by Pomfort, DXL2 can capture looks wirelessly from DITs in the form of CDLs and LUTs, which are not only saved into the metadata of the camera, but also baked into in-camera proxy files in the form of Apple ProRes or Avid DNx. These files now contain visual references of the exact looks viewed on monitors and can be delivered directly to post houses, or even editors. This improves creative control because it eliminates the guess work in the application of external color decisions and streamlines it back to the camera where the core database is kept with all the other camera information. This metadata can be traced throughout the post pipeline, which also streamlines the process for all entities that come in contact with camera footage.
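
For readers unfamiliar with what a CDL actually encodes, here is a simplified Python sketch of the published ASC CDL math (per-channel slope, offset and power, plus a saturation term computed around Rec. 709 luma). It illustrates the transfer functions only; it is not Panavision’s or Pomfort’s implementation, and the clamping shown is a simplification of the spec.

```python
import numpy as np

def apply_asc_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply ASC CDL-style math to an image (H x W x 3, values nominally 0-1):
    out = (in * slope + offset) ** power per channel, then saturation
    applied around Rec. 709 luma. Simplified clamping for illustration."""
    slope, offset, power = (np.asarray(v, dtype=float) for v in (slope, offset, power))
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    luma = np.sum(out * np.array([0.2126, 0.7152, 0.0722]), axis=-1, keepdims=True)
    return np.clip(luma + saturation * (out - luma), 0.0, 1.0)

frame = np.random.rand(4, 4, 3)  # stand-in for a proxy frame
graded = apply_asc_cdl(frame,
                       slope=(1.1, 1.0, 0.95),
                       offset=(0.01, 0.0, -0.01),
                       power=(1.0, 1.0, 1.05),
                       saturation=0.9)
```

Because a CDL is just these few numbers per shot, it travels easily as metadata alongside LUT references, which is what makes the render-free dailies approach described above practical.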

How has production changed over the last two years?
Sheesh. A lot!

ARRI’s Stephan Ukas-Bradley
The ARRI Group manufactures and distributes motion picture cameras, digital intermediate systems and lighting equipment. Their camera offerings include the Alexa LF, Alexa Mini, Alexa 65, Alexa SXT W and the Amira.

Can you talk about some camera trends you’ve been seeing? Such as large format? The use of old/vintage lenses?
Large format opens some new creative possibilities, using a shallow depth of field to guide the audience’s view and provide a wonderful bokeh. It also conveys a perspective truer to the human eye, resulting in a seemingly increased dimensional depth. The additional resolution combined with our specially designed large format Signature Primes result in beautiful and emotional images.

Old and vintage lenses can enhance a story. For instance, Gabriel Beristain, ASC, used Bausch & Lomb Super Baltar lenses on the Starz show Magic City, and Bradford Young used detuned DNA lenses in conjunction with the Alexa 65 on Solo: A Star Wars Story. Certain characteristics like flares, reflections, distortions and focus fall-off are very difficult to recreate organically in post, so vintage lenses provide an easy way to create a unique look for a specific story and a way for the director of photography to maintain creative control.

How is HDR affecting the way things are being shot these days? Do you find HDR more important than 8K at the moment in terms of look? Are productions shooting/monitoring HDR on-set?
Currently, things are not done much differently on set when shooting HDR versus SDR. While it would be very helpful to monitor in both modes on set, HDR reference monitors are still very expensive, and very few productions have the luxury to do that. One has to be aware of certain challenges when shooting for an HDR finish. High-contrast edges can result in a more pronounced stutter/strobing effect when panning the camera, and windows that are blown out in SDR might retain detail in the HDR pass, so all of a sudden a ladder or grip stand is visible.

In my opinion, HDR is more important than higher resolution. HDR is resolution-independent in regard to viewing devices like phone/tablets and gives the viewer a perceived increased sharpness, and it is more immersive than increased resolution. Also, let’s not forget that we are working in the motion picture industry and that we are either capturing moving objects or moving the camera, and with that introducing motion blur. Higher resolution only makes sense to me in combination with higher frame rates, and that in return will start a discussion about aesthetics, as it may look hyper-real compared to the traditional 24fps capture. Resolution is one aspect of the overall image quality, but in my opinion extended dynamic range, signal/noise performance, sensitivity, color separation and color reproduction are more important.

Can you talk about how shooting streaming content for OTTs, like Netflix/Amazon, has changed production practices and workflows, if at all?
Shooting streaming content has really not changed production practices or workflows. At ARRI, we offer very flexible and efficient workflows and we are very transparent documenting our ARRIRAW file formats in SMPTE RDD 30 (format) and 31 (processing) and working with many industry partners to provide native file support in their products.

Is the availability of all those camera resolutions a help or a hindrance?
I would look at all those different camera types and resolutions as different film stocks and recommend to creatives to shoot their own test and select the camera systems based on what suits their project best.

We offer the ARRI Look Library for Amira, Alexa Mini and Alexa SXT (SUP 3.0), which is a collection of 87 looks, each of them available in three different intensities provided in Rec. 709 color space. Those looks can either be recorded or only used for monitoring. These looks travel with the picture, embedded in the metadata of the ARRIRAW file, QuickTime Atom or HD/SDI stream in form of the actual LUT and ASC CDL. One can also create a look dynamically on set, feeding the look back to the camera and having the ASC CDL values embedded in the same way.

More commonly, one would record in either ARRIRAW or ProRes LogC while applying a standard Rec. 709 look for monitoring. The “C” in LogC stands for Cineon, a film-like response very much like that of a scanned film image. Colorists and post pros are very familiar with film, so color grading LogC images is easy and quick.
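
As a concrete illustration of that Cineon-style log response, the Python sketch below decodes LogC code values back to scene-linear light. The constants are the commonly published LogC3 (EI 800) curve parameters and should be treated as assumptions; consult ARRI’s LogC white paper for the authoritative formula and values at other exposure indices.

```python
import numpy as np

# Commonly published ALEXA LogC3 (EI 800) curve constants (assumed values;
# see ARRI's LogC documentation for the authoritative parameters).
CUT, A, B = 0.010591, 5.555556, 0.052272
C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

def logc_to_linear(t):
    """Decode LogC3 (EI 800) code values (0-1) to scene-linear reflectance."""
    t = np.asarray(t, dtype=float)
    return np.where(t > E * CUT + F,
                    (10.0 ** ((t - D) / C) - B) / A,
                    (t - F) / E)

print(logc_to_linear(0.391))   # ~0.18: mid gray sits near a 39% code value
```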

How has production changed over the last two years?
I don’t have the feeling that production has changed a lot in the past two years, but with the growing demand from OTTs and increased production volume, it is even more important to have a reliable and proven system with flexible workflow options.

Main Image: DP Kate Reid.

Panavision Millennium DXL2’s ecosystem grows with color science, lenses, more

Panavision’s Millennium DXL2 8K camera was on display at Cine Gear last week featuring a new post-centric firmware upgrade, along with four new large-format lens sets, a DXL-inspired accessories kit for Red DSMC2 cameras and a preview of custom advancements in filter technology.

DXL2 incorporates technology advancements based on input from cinematographers, camera assistants and post production groups. The camera offers 16 stops of dynamic range with improved shadow detail, a native ISO setting of 1600 and 12-bit ProRes XQ up to 120fps. New to the DXL2 is version 1.0 of a direct-to-edit (D2E) workflow. D2E gives DITs wireless LUT and CDL look control and records all color metadata into camera-generated proxy files for instant and render-free dailies.

DXL2, which is available to rent worldwide, also incorporates an updated color profile: Light Iron Color 2 (LiColor2). This latest color science provides cinematographers and DITs with a film-inspired tonal look that makes the DXL2 feel more cinematic and less digital.

Panavision also showcased their large-format spherical and anamorphic lenses. Four new large-format lens sets were on display:
• Primo X is a cinema lens designed for use on drones and gimbals. It’s fully sealed, weatherproof and counterbalanced to be aerodynamic and it’s able to easily maintain a proper center of gravity. Primo X lenses come in two primes – 14mm (T3.1) and 24mm (T1.6) – and one 24-70mm zoom (T2.8) and will be available in 2019.

• H Series is a traditionally designed spherical lens set with a rounded, soft roll-off, giving what the company calls a “pleasing tonal quality to the skin.” Created with vintage glass and coating, these lenses offer slightly elevated blacks for softer contrast. High speeds separate subject and background with a smooth edge transition, allowing the subject to appear naturally placed within the depth of the image. These lenses are available now.
• Ultra Vista is a series of large-format anamorphic optics. Using a custom 1.6x squeeze, Ultra Vista covers the full height of the 8K sensor in the DXL and presents an ultra-widescreen 2.76:1 aspect ratio along with a classic elliptical bokeh and Panavision horizontal flare. Ultra Vista lenses will be available in 2019.
• PanaSpeed is a large-format update of the classic Primo look. At T1.4, PanaSpeed is a fast large-format lens. It will be available in Q3 of 2018.

Panavision also showed an adjustable liquid crystal neutral density (LCND) filter. LCND adjusts up to six individual stops with a single click or ramp — a departure from traditional approaches to front-of-lens filters, which require carrying a set and manually swapping individual NDs based on changing light. LCND densities run from 0.3 through 0.6, 0.9, 1.2 and 1.5 up to 1.8. It will be available in 2019.
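
The six-stop figure follows from how optical density relates to stops: each 0.3 of density cuts the light roughly in half. A tiny Python helper makes the conversion explicit.

```python
import math

def nd_stops(density):
    """Light loss in stops for a neutral density filter of the given
    optical density: transmission = 10**-density, stops = log2(1/T)."""
    return density * math.log2(10)

for d in (0.3, 0.6, 0.9, 1.2, 1.5, 1.8):
    print(f"ND {d}: {nd_stops(d):.2f} stops")
# ND 0.3 is ~1 stop and ND 1.8 is ~6 stops, the range described above.
```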

Following up on the DXL1 and DXL2, Panavision launched the latest in its cinema line-up with the newly created DXL-M accessory kit. Designed to work with Red DSMC2 cameras, DXL-M marries the quality and performance of DXL with the smaller size and weight of the DSMC2. DXL-M brings popular features of DXL to Red Monstro, Gemini and Helium sensors, such as the DXL menu system (via an app for the iPhone), LiColor2, motorized lenses, wireless timecode (ACN) and the Primo HDR viewfinder. It will be available in Q4 of 2018.

Light Iron opens in Atlanta, targets local film community

In order to support the thriving Georgia production community, post studio Light Iron has opened a new facility in Atlanta. The expansion is the fourth since Panavision acquired Light Iron in 2015, bringing Light Iron’s US locations to six total, including Los Angeles, New York, New Orleans, Albuquerque and Chicago.

“Light Iron has been supporting Georgia productions for years through our mobile dailies services,” explains CFO Peter Cioni. “Now with a team on the ground, productions can take advantage of our facility-based dailies with talent that brings the finishing perspective into the process.”

Clark Cofer

The company’s Atlanta staff recently provided dailies services to season one of Kevin (Probably) Saves the World, season three of Greenleaf and the features Uncle Drew and Superfly.

With a calibrated theater, the Light Iron Atlanta facility has hosted virtual DI sessions from its LA facility for cinematographers working in Atlanta. The theater is also available for projecting camera and lens tests, as well as private screenings for up to 45 guests.

The theater is outfitted with a TVIPS Nevion TBG480, which allows for a full-bandwidth 2K signal from either the LA or NY facility for virtual DI sessions. For example, if a cinematographer is working on another show in Atlanta, they can still connect with the colorist for the final look of their previous show.

The Light Iron Atlanta dailies team uses Colorfront Express Dailies, which is standard across their facility-based and mobile dailies services worldwide.

Cioni notes that the new location is led by director of business development Clark Cofer, a member of Atlanta’s production and post industry. “Clark brings years of local and state-wide relationships to Light Iron, and we are pleased to have him on our growing team.”

Cofer most recently represented Crawford Media Services, where he drove sales for their renowned content services to companies like Lionsgate, Fox and Marvel. He currently serves as co-president of the Georgia Production Partnership, and is on the board of directors for the DeKalb County Film and Entertainment Advisory Board.

Panavision intros Millennium DXL2 camera with Red Monstro 8K sensor

Panavision is at the BSC Expo 2018 showing its new Millennium DXL2 8K camera. The large-format camera is the heart of a complete imaging ecosystem designed from filmmakers’ perspectives, seamlessly incorporating Panavision’s unmatched optics and camera architecture, the Red Monstro 8K VV sensor and Light Iron color science (LiColor2). The DXL2 builds on the success of the Millennium DXL and benefits from Panavision’s partnership with cinematographers, whose real-world experience and input are manifested in the DXL2’s new offerings.

The Red Monstro 8K VV sensor in the DXL2 offers 16-plus stops of dynamic range with improvements in image quality and shadow detail, a native ISO setting of 1600 and ProRes 4K up to 60 fps. Images are presented on the camera in log format using Light Iron color science. An integrated PX-Pro color spectrum filter custom-made for the DXL offers a significant increase in color separation and dramatically higher color precision to the image. Built-in Preston MDR, 24v power and expanded direct-to-edit features are also standard equipment on the DXL2. An anamorphic flare attachment (AFA) offers a convenient, controllable method of introducing flare with spherical lenses.

New to the DXL2, LiColor2 streamlines the 8K pipeline, smoothly handling the workflow and offering convenient and quick access to high-quality RAW images, accommodating direct-to-edit without delays.

Since its introduction, the DXL has been used on over 20 feature films, and countless television shows, commercials and music videos. Oscar-nominee John Schwartzman, ASC, photographed two features on the DXL and is among those who have tested the DXL2, providing input that has guided the design. He’s currently planning to shoot his next feature with it.

DXL2 cameras are available to rent exclusively from Panavision.

Panavision Hollywood names Dan Hammond VP/GM

Panavision has named Dan Hammond, a longtime industry creative solutions technologist, as vice president and general manager of Panavision Hollywood. He will be responsible for overseeing daily operations at the facility and working with the Hollywood team on camera systems, optics, service and support.

Hammond is a Panavision veteran, who worked at the company between 1989 and 2008 in various departments, including training, technical marketing and sales. Most recently he was at Production Resource Group (PRG), expanding his technical services skills. He is active with industry organizations, and is an associate member of the American Society of Cinematographers (ASC), as well as a member of the Academy of Television Arts and Sciences (ATAS) and Association of Independent Commercial Producers (AICP).

Evoking the beauty and power of Dunkirk with 65mm

FotoKem worked to keep Christopher Nolan’s 65mm source natively photochemical and to provide the truest-to-film digital cinema version possible

By Adrian Pennington

Tipped for Oscar glory, Christopher Nolan’s intense World War II masterpiece, Dunkirk, has pushed the boundaries further than any film before it. Having shot sequences of his previous films (including Interstellar) on IMAX, this time the director made the entire picture on 65mm negative. Approximately 75% of the film was captured on 65mm/15-perf IMAX (1.43:1) and the rest on 65mm/5-perf (2.2:1) on Panavision cameras.

Christopher Nolan on set.

Nolan’s vision and passion for the true film experience was carried out by Burbank-based FotoKem in what became the facility’s biggest and most complex large format project to date. In addition to the array of services that went into creating two 65mm master negatives and 70mm release prints in both 15p and 5p formats, FotoKem also provided the movie’s DCP deliverables based on in-house color science designed to match the film master. With the unique capability to project 70mm film (on a Century JJ projector) side by side with the digital projection of 65mm scans, FotoKem meticulously replicated the organic film look shot by Hoyte van Hoytema, ASC, NSC, FSF, and envisioned by Nolan.

In describing the large format film process, Andrew Oran, FotoKem’s VP of large format services, explains, “Hoyte was in contact with FotoKem’s Dan Muscarella (the movie’s color timer) throughout production, providing feedback on the 70mm contact and 35mm reduction dailies being screened on location. The pipeline was devised so that the IMAX (65mm/15p) footage was timed on a customized 65mm Colormaster by FotoKem color timer Kristen Zimmermann, under Muscarella’s supervision. Her timing lights were provided to IMAX Post, who used those for producing 35mm reduction prints. Those prints were screened in Los Angeles by IMAX, Muscarella and editorial, who in turn provided feedback to production on location. Prints and files travelled securely back and forth between FotoKem and IMAX throughout each day by in-house delivery personnel and via FotoKem’s proprietary globalDATA e-delivery platform.”

A similar route was taken for the Panavision (65mm/5p) footage — also under Muscarella’s keen eye — prior to FotoKem producing 70mm/5p contact daily prints. A set of both prints (35mm and 70mm) was transported for screening in a trailer on location thousands of miles away in England, France (including shooting on Dunkirk beach itself) and the Netherlands. Traveling with editorial during principal photography was a 70mm projector on which editor Lee Smith, ACE, and Nolan could view dailies in 70mm/5-perf. A 35mm Arri LocPro was also used to watch reduction prints on location.

Oran adds, “Zimmermann also applied color timing lights to the 65mm/5p negatives for contact printing to 70mm at FotoKem. Ultimately, prints from every reel of film negative in both formats were screened by Dan at FotoKem before shipping to production. This way, Dan ensured that the color was as Nolan and Hoytema envisioned. Later, the goal for the DCP was to give the audience the same feel as if they were watching the film version.”

HD deliverables for editorial and studio viewing were created on a customized Millennium telecine. Warner Bros. and Nolan required the quality be high at this step of the process — which can be challenging for 65mm formats. To do this, FotoKem made improvements to the 65mm Millennium telecine machine’s optical and light path, and fed the scans through a custom keycode and metadata workflow in the company’s nextLAB media management platform. Scans for the film’s digital cinema mastering were done at 8K on FotoKem’s Imagica 65mm scanners.


Then, to produce the DCPs, FotoKem’s principal color scientist, Joseph Slomka, says, “We created color modeling tools using the negative, interpositive and print process to match the digital image to the film as precisely as technically possible. We sat down with film prints and verified that the modeling data matched a print from the original negative in our DI suite with side-by-side projection.”
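
For readers curious about the mechanics of that verification step, the sketch below is a minimal illustration, not FotoKem’s actual tooling: it compares a scan of a film print with the digitally modeled frame and reports per-channel error statistics, assuming both frames have already been brought into a common color space.

```python
# A minimal sketch (not FotoKem's actual tooling) of how a print-match
# verification could be quantified: compare a scan of the film print with the
# digitally modeled frame and report per-channel error statistics.
import numpy as np

def match_report(film_scan: np.ndarray, digital_model: np.ndarray) -> dict:
    """Both inputs are float arrays of shape (H, W, 3) in the same color space."""
    diff = digital_model.astype(np.float64) - film_scan.astype(np.float64)
    return {
        "mean_error_rgb": diff.mean(axis=(0, 1)),    # per-channel bias
        "max_abs_error": float(np.abs(diff).max()),  # worst-case pixel
        "rmse": float(np.sqrt((diff ** 2).mean())),  # overall closeness
    }

# Synthetic usage: a reference frame and a near-identical modeled frame.
reference = np.random.rand(540, 1024, 3)
modeled = reference + np.random.normal(0.0, 0.002, reference.shape)
print(match_report(reference, modeled))
```

The point is simply that “as precisely as technically possible” can be tracked numerically rather than judged by eye alone.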

Walter Volpatto

This is where FotoKem colorist Walter Volpatto says he determined “how much” and “how close” to match the colors. “We did this by using a special machine — called a Harrahscope Minimax Comparator Projector, developed by Mark Harrah and on loan from the Walt Disney Studios — to project still IMAX frames on the screen,” Volpatto elaborates. “We did this for 400 images from the movie, looking at single frames of digital (projected from a Barco 4K DLP) versus film from the Harrahscope and comparing them using the data created by the modeling tools.”

Volpatto worked mainly with RGB offsets in Resolve after each single-frame verification to maintain a similarity to traditional color timing. “We also modified the DLP white point settings of the projector to maintain the closest match,” he says. “Then, once all the tweaks were made with the stills, we moved to motion picture film reels. Everything described in printer lights at the film stage was translated to digital based on the modeling data.”
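
For readers unfamiliar with printer lights, the sketch below shows the general idea of translating printer points into digital RGB offsets. The 0.025 log-exposure step per point and the 25/25/25 neutral are common lab conventions used here as assumptions, not FotoKem’s actual mapping.

```python
# Hedged sketch of the general idea behind translating printer lights into
# digital RGB offsets. The 0.025 log-exposure step per point and the 25/25/25
# neutral are common lab conventions used here as assumptions; sign conventions
# vary by lab and by whether the data is scene- or display-referred.
import numpy as np

POINT_SIZE = 0.025                      # assumed log-exposure change per point
NEUTRAL = np.array([25.0, 25.0, 25.0])  # assumed "no correction" printer light

def printer_lights_to_offsets(lights_rgb):
    """Return additive per-channel offsets for a log-encoded image."""
    return (np.asarray(lights_rgb, dtype=np.float64) - NEUTRAL) * POINT_SIZE

def apply_offsets(log_image, lights_rgb):
    """log_image: float array (H, W, 3) in a log encoding such as Cineon."""
    return log_image + printer_lights_to_offsets(lights_rgb)

# Example: one point up in red, one point down in blue relative to neutral.
print(printer_lights_to_offsets([26, 25, 24]))  # -> [ 0.025  0.    -0.025]
```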

In addition to working with Muscarella on the film screenings to see the quality he would need to match, Volpatto says that his work on Interstellar also informed how he approached this process. “It’s about getting the look that Nolan wants — I just had to replicate it with tremendous accuracy on Dunkirk.”

Joseph Slomka

Aside from the standard DCP, two further digital masters were created for distribution: a digital IMAX master built from the IMAX scans, and a Dolby Digital Cinema HDR master from the same source material.

“For the Dolby pass, we had to create another set of color science tools — that still represented Nolan’s vision — to replicate the look of film exactly in HDR,” says Slomka. “Because we had all the computer modeling tools used earlier in the process to identify how the film behaved, we were able to build on that for the HDR version.”
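
While FotoKem’s HDR tools are proprietary, the final stage of any HDR cinema master typically relies on a standard transfer function. The sketch below shows the SMPTE ST 2084 (PQ) encode such a master would commonly use; the film-matching model described above is assumed to have produced the display-light values upstream.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) encode that an HDR cinema master
# typically relies on. The film-matching model described above is assumed to
# have produced the absolute display-light values upstream; this is only the
# standard curve, not FotoKem's proprietary tooling.
import numpy as np

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Map luminance in cd/m^2 (0..10,000) to a nonlinear PQ signal in 0..1."""
    y = np.clip(np.asarray(nits, dtype=np.float64) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

# 48 nits (typical SDR cinema white) and 108 nits (Dolby Cinema peak white).
print(pq_encode([48, 108]))
```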

Adds Volpatto, “The whole pipeline was designed to preserve the original viewing experience of print film – everything had to integrate purely and unnoticeably. Having this film and color science knowledge here at FotoKem, it’s hard to see that anybody else could achieve what we did at this level.”

Millennium Digital XL camera: development to delivery

By Lance Holte and Daniel Restuccio

Panavision’s Millennium DXL 8K may be one of today’s best digital cinema cameras, but it might also be one of the most misunderstood. Conceived and crafted in the exacting tradition of the company whose cameras captured such films as Lawrence of Arabia and Inception, the Millennium DXL challenges expectations. We recently sat down with Panavision to examine the history, workflow and new features, and how it all fits into the 2017 moviemaking ecosystem.

Announced at Cine Gear 2016, and released for rent through Panavision in January 2017, the Millennium DXL stepped into the digital large format field as, at first impression, a competitor to the Arri Alexa 65. The DXL was the collaborative result of a partnership of three companies: Panavision developed the optics, accessories and some of the electronics; Red Digital Cinema designed the 8K VV (VistaVision) sensor; and Light Iron provided the features, color science and general workflow for the camera system.

The collaboration for the camera first began when Light Iron was acquired by Panavision in 2015. According to Michael Cioni, Light Iron president/Millennium DXL product manager, the increase in 4K and HDR television and theatrical formats like Dolby Vision and Barco Escape created the perfect environment for the three-company partnership. “When Panavision bought Light Iron, our idea was to create a way for Panavision to integrate a production ecosystem into the post world. The DXL rests atop Red’s best tenets, Panavision’s best tenets and Light Iron’s best tenets. We’re partners in this — information can flow freely between post, workflow, color, electronics and data management into cameras, color science, ergonomics, accessories and lenses.”

HDR OLED viewfinder

Now, one year after the first announcement, with projects like the Lionsgate feature adventure Robin Hood, the Fox Searchlight drama Can You Ever Forgive Me?, the CBS crime drama S.W.A.T. and a Samsung campaign shot by Oscar-winner Linus Sandgren under the DXL’s belt, the camera sports an array of new upgrades, features and advanced tools. They include an HDR OLED viewfinder (which Panavision says is a first), wireless control software for iOS and a new series of lenses. According to Panavision, the new DXL offers “unprecedented development in full production-to-post workflow.”

Preproduction Considerations
With so many high-resolution cameras on the market, why pick the DXL? According to Cioni, cinematographers and their camera crews are no longer the only people who directly interact with cameras. Panavision examined the impact a camera has on each production department — camera assistants, operators, data managers, DITs, editors and visual effects supervisors. In response to that feedback, they designed the DXL to offer custom toolsets for every department. In addition, Panavision wanted to leverage the benefits of its heritage lenses and make the same glass that photographed ‘Lawrence of Arabia’ available to a wider range of today’s filmmakers on the DXL.

When Arri first debuted the Alexa 65 in 2014, there were questions about whether such a high-resolution, data-heavy image was necessary or beneficial. But cinematographers jumped on it and have leaned on large format sensors and glass on pictures ranging from Doctor Strange to Rogue One to deliver greater immersiveness, detail and range. The large format trend is only accelerating, particularly among filmmakers interested in the optical magnification, depth of field and field-of-view characteristics that only large format photography offers.

Kramer Morgenthau

“I think large format is the future of cinematography for the big screen,” says cinematographer Kramer Morgenthau, who shot with the DXL in 2016. “[Large format cinematography] gives more of a feeling of the way human vision is. And so, it’s more cinematic. Same thing with anamorphic glass — anamorphic does a similar thing, and that’s one of the reasons why people love it. The most important thing is the glass, and then the support, and then the user-friendliness of the camera to move quickly. But these are all important.”

The DXL comes to market offering a myriad of creative choices for filmmakers. Among large format cameras, the Millennium DXL aims to be the crème de la crème — it’s built around a 46mm 8192×4320 Red VV sensor and custom Panavision large format spherical and anamorphic lenses, wrapped in camera-department-friendly electronics and using proprietary color science — all of which complements a mixed-camera environment.
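
Some back-of-the-envelope arithmetic helps explain both the 46mm figure and the large format look. The 5-micron photosite pitch and the Super 35 diagonal in the sketch below are assumptions for illustration, not published DXL specifications.

```python
# Back-of-the-envelope sketch of the sensor geometry. The 5-micron photosite
# pitch and the Super 35 diagonal below are assumptions for illustration, not
# published DXL specs, but they show where a ~46mm figure comes from and why
# the field of view differs from Super 35 for the same focal length.
import math

H_PX, V_PX = 8192, 4320
PITCH_MM = 0.005                        # assumed photosite pitch (5 microns)

width = H_PX * PITCH_MM                 # ~41.0 mm
height = V_PX * PITCH_MM                # ~21.6 mm
diagonal = math.hypot(width, height)    # ~46.3 mm

S35_DIAGONAL = 28.5                     # approximate Super 35 capture diagonal
crop = diagonal / S35_DIAGONAL          # ~1.6x difference in image circle

print(f"sensor ~{width:.1f} x {height:.1f} mm, diagonal ~{diagonal:.1f} mm")
print(f"a 50mm lens here frames roughly like a {50 / crop:.0f}mm on Super 35")
```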

“The beauty of digital, and this camera in particular, is that DXL actually stands for ‘digital extra light.’ With a core body weight of only 10 pounds, and with its small form factor, I’ve seen DXL used in the back seat of a car as well as to capture the most incredible helicopter scenes,” Cioni notes.

With the help of Light Iron, Panavision developed a tool to match DXL footage to Panavised Red Weapon cameras. Guardians of the Galaxy Vol. 2 used Red Weapon 8K VV Cameras with Panavision Primo 70 lenses. “There are shows like Netflix’s 13 Reasons Why [Season Two] that combined this special matching of the DXL and the Red Helium sensor based on the workflow of the show,” Cioni notes. “They’re shooting [the second season] with two DXLs as their primary camera, and they have two 8K Red cameras with Helium sensors, and they match each other.”

If you are thinking the Millennium DXL will bust your budget, think again. Like many Panavision cameras, the DXL is available exclusively as a rental through Panavision, but Cioni says they’re happy to help filmmakers build the right package and workflow. “A lot of budgetary expense can be avoided with a more efficient workflow. Once customers learn how DXL streamlines the entire imaging chain, a DXL package might not be out of reach. We always work with customers to build the right package at a competitive price,” he says.

Using the DXL in Production
The DXL could be perceived as a classic dolly Panavision camera, especially with the large format moniker. “Not true,” says Morgenthau, who shot test footage with the camera slung over his shoulder in the back seat of a car.

He continues, “I sat in the back of a car and handheld it — in the back of a convertible. It’s very ergonomic and user-friendly. I think what’s exciting about the Millennium: its size and integration with technology, and the choice of lenses that you get with the Panavision lens family.”

Panavision’s fleet of large format lenses, many of which date back to the 1950s, made the company uniquely equipped to begin development on the new series of large format optics. Due to be available by the end of 2017, the Primo Artiste lenses are a full series of T1.8 primes — the fastest optics available for large format cinematography — with a completely internalized motor and built-in metadata capture. Additionally, the Primo Artiste lenses can be outfitted with an anamorphic glass attachment that retains the spherical nature of the base lens yet induces anamorphic artifacts like directional flares and distorted bokeh.

Another new addition to the DXL is Panavision’s earlier-mentioned HDR OLED Primo viewfinder. Offering 600-nit brightness, image smoothing and optics to limit eye fatigue, the viewfinder also boasts a theoretical contrast ratio of 1,000,000:1. Like other elements on the camera, the Primo viewfinder was the result of extensive polling and camera operator feedback. “Spearheaded by Panavision’s Haluki Sadahiro and Dominick Aiello, we went to operators and asked them everything we could about what makes a good viewfinder,” notes Cioni. “Guiding an industry game-changing product meant we went through multiple iterations. We showed the first Primo HDR prototype version in November 2016, and after six months of field testing, the final version is both better and simpler, and it’s all thanks to user feedback.”

Michael Cioni

In response to the growing popularity of HDR delivery, Light Iron also provides a powerful on-set HDR viewing solution. The HDR Village cart is built around a 4K HDR Sony monitor with numerous video inputs. The system can simultaneously display A and B camera feeds in high dynamic range and standard dynamic range across four split quadrants. This lets cinematographers evaluate their images and better prepare for multi-format color grading in post, given that most HDR projects must also deliver in SDR.

Post Production
The camera captures R3D files, the same as any other Red camera, but does have metadata that is unique to the DXL, ranging from color science to lens information. It also uses Light Iron’s set of color matrices designed specifically for the DXL: Light Iron Color.

Designed by Light Iron supervising colorist Ian Vertovec, Light Iron Color deviates from traditional digital color matrices by following in the footsteps of film stock philosophy instead of direct replication of how colors look in nature. Cioni likens Light Iron Color to Kodak’s approach to film. “Kodak tried to make different film stocks for different intentions. Since one film stock cannot satisfy every creative intention, DXL is designed to allow look transforms that users can choose, export and integrate into the post process. They come in the form of cube lookup tables and are all non-destructive.”
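
To see what a non-destructive cube LUT amounts to mechanically, the sketch below parses a .cube file and applies it with trilinear interpolation. This is not Light Iron’s implementation, and the file name shown is hypothetical; it simply illustrates why a look stored this way can be applied, swapped or removed without touching the camera-original R3D.

```python
# Hedged sketch of what a non-destructive cube LUT amounts to mechanically:
# parse a .cube file and apply it with trilinear interpolation. Not Light
# Iron's implementation; the "look.cube" name in the final comment is
# hypothetical.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def load_cube(path):
    """Read a 3D .cube file; returns (size, table) with table indexed [b, g, r]."""
    size, rows = None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line and not line.startswith(("#", "TITLE", "DOMAIN")):
                rows.append([float(v) for v in line.split()])
    return size, np.array(rows).reshape(size, size, size, 3)

def apply_cube(image, size, table):
    """image: float array (H, W, 3) with values in 0..1."""
    grid = np.linspace(0.0, 1.0, size)
    # Cube files increment red fastest, so query points are ordered (b, g, r).
    pts = image[..., ::-1].reshape(-1, 3)
    out = np.stack(
        [RegularGridInterpolator((grid, grid, grid), table[..., c])(pts)
         for c in range(3)],
        axis=-1,
    )
    return out.reshape(image.shape)

# Demo with a synthetic identity LUT (no file needed): output equals input.
n = 17
g = np.linspace(0.0, 1.0, n)
b_ax, g_ax, r_ax = np.meshgrid(g, g, g, indexing="ij")
identity = np.stack([r_ax, g_ax, b_ax], axis=-1)    # table[b, g, r] = (r, g, b)
frame = np.random.rand(8, 8, 3)
assert np.allclose(apply_cube(frame, n, identity), frame, atol=1e-6)
# Real use would look like: graded = apply_cube(frame, *load_cube("look.cube"))
```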

Light Iron Color can be adjusted and tweaked by the user or by Light Iron, which Cioni says has been done on many shows. The ability to adjust Light Iron Color to fit a particular project is also useful on shows that shoot with multiple camera types. Though Light Iron Color was designed specifically for the Millennium DXL, Light Iron has used it on other cameras — including the Sony A7, and Reds with Helium and Dragon sensors — to ensure that all the footage matches as closely as possible.

While it’s possible to cut high-resolution media online with a blazing-fast workstation and storage solution, it’s a lot trickier to work with 8K media in a post production environment that often involves multiple editors, assistants, VFX editors, post PAs and more. The good news is that the DXL records onboard low-bitrate proxy media (ProRes or DNx) for offline editorial while simultaneously recording R3Ds, without requiring an external recorder.

Cioni’s optimal camera recording setup for editorial is 5:1 compression for the R3Ds alongside 2K ProRes LT files. He explains, “My rule of thumb is to record super high and super low. And if I have high-res and low-res and I need to make something else, I can generate that somewhere in the middle from the R3Ds. But as long as I have the bottom and the top, I’m good.”

Storage is also a major post consideration. 8192×4320 R3Ds at 23.976fps run in the 1TB-per-hour range (the exact figure varies with the R3D compression setting), but compared to an hour of 6560×3100 Arriraw footage, which lands at 2.6TB, the Millennium DXL’s lighter R3D workflow can be very attractive.
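
Those figures are easy to sanity-check. The arithmetic below is a rough sketch: the 16-bit-per-photosite container and the flat 5:1 ratio are assumptions for illustration, since actual R3D sizes vary with the compression setting and scene content.

```python
# Rough sanity check of those data rates. The 16-bit-per-photosite container
# and the flat 5:1 ratio are assumptions for illustration; real R3D sizes vary
# with the compression setting and scene content.
width, height, fps = 8192, 4320, 23.976
bytes_per_photosite = 2        # assumed 16-bit raw container per photosite
compression = 5                # assumed REDCODE 5:1

raw_bytes_per_second = width * height * bytes_per_photosite * fps
tb_per_hour = raw_bytes_per_second * 3600 / compression / 1e12

print(f"~{tb_per_hour:.2f} TB per hour")   # lands near the quoted ~1TB/hour figure
```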

Conform and Delivery
One significant aspect of the Millennium DXL workflow is that even though the camera’s sensor, body, glass and other pipeline tools are all recently developed, R3D conform and delivery workflows remain tried and true. The onboard proxy media exactly matches the R3Ds by name and timecode, and since Light Iron Color is non-destructive, the conform and color-prep process is simple and adjustable, whether the conform is done with Adobe, Blackmagic, Avid or other software.
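
As a purely conceptual sketch of that name-based relink, the snippet below checks that every offline proxy clip has a camera-original R3D counterpart. The directory names are hypothetical, and a real conform would also verify timecode and reel metadata inside the files.

```python
# Conceptual sketch of the relink check described above: confirm every offline
# proxy clip has a camera-original R3D counterpart by matching clip-name stems.
# Directory names are hypothetical; a real conform would also verify timecode
# and reel metadata inside the files.
from pathlib import Path

def clip_stems(folder, ext):
    """Map upper-cased clip names to paths for every file with the extension."""
    return {p.stem.upper(): p for p in Path(folder).rglob(f"*{ext}")}

def check_relink(proxy_dir="offline_proxies", camera_dir="camera_originals"):
    proxies = clip_stems(proxy_dir, ".mov")
    originals = clip_stems(camera_dir, ".r3d")
    missing = sorted(set(proxies) - set(originals))
    for stem in missing:
        print(f"no camera-original R3D found for proxy clip {stem}")
    return not missing

if __name__ == "__main__":
    print("ready to conform" if check_relink() else "resolve missing clips first")
```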

Additionally, since Red media can be imported into almost all major visual effects applications, it’s possible to work with the raw R3Ds as VFX plates. This retains the lens and camera metadata for better camera tracking and optical effects, provides the flexibility of working with Light Iron Color turned on or off, and the 8K R3Ds are still lighter than the 4K DPX or EXR plates that are the current VFX norm. The resolution also affords enormous space for opticals and stabilization in a 4K master.

4K is the increasingly common delivery resolution among studios, networks and over-the-top content distributors, but in a world of constant remastering and ever-increasing television and display resolutions, the benefit of future-proofing a picture is easily apparent. Baselight, Resolve, Rio and other grading and finishing applications can handle 8K resolutions, and even if the final project is only rendered at 4K now, conforming and grading in 8K ensures the picture will be future-proofed for some time. It’s a simple task to re-export a 6K or 8K master when those resolutions become the standard years down the line.

Having played with DXL footage provided by Light Iron, we were surprised by how straightforward the workflow is. For a very small production, the trickiest part is the need for a powerful workstation — or set of workstations — to conform and play 8K Red media alongside a mix of (likely) 4K VFX shots, graphics and overlays. Cioni notes, “[Everyone] already knows a RedCode workflow. They don’t have to learn it. I could show the DXL to anyone who has a Red Raven, and in 30 seconds they’ll confidently say, ‘I got this.’”

Rick Anthony named GM of Light Iron New York

Post company Light Iron has named Rick Anthony to the newly created role of general manager in its New York facility. The addition comes after Light Iron added a second floor in 2016, tripling its inventory of editorial suites.

Anthony previously held GM roles at Pac Lab and New York Lab/Postworks/Moving Images, overseeing teams from lab through digital workflows. He began his career at the New York film lab DuArt, where he was a technical supervisor for many years.

Anthony notes several reasons why he joined Light Iron, a Panavision company. “From being at the forefront of color science and workflow to providing bi-coastal client support, this is a unique opportunity. Working together with Panavision, I look forward to serving the dailies, editorial, and finishing needs of any production, be it feature, episodic or commercial.”

Light Iron’s New York facility offers 20 premium editorial suites from its Soho location, as well as in-house and mobile dailies services, HDR-ready episodic timing bays and a 4K DI theater. The facility recently serviced Panavision’s first US-based feature shot on the new Millennium DXL camera.