
Review: The Litra Torch for pro adventure lighting

By Brady Betzel

If you are on Instagram you’ve definitely seen your fair share of “adventure” photography and video. Typically, it’s those GoPro-themed action-adventure shots of someone cliff diving off a million-mile-high waterfall. I definitely get jealous. Nonetheless, one thing I love about GoPro cameras is their size. They are small enough to fit in your pocket, and they will reliably produce a great image. Where those actioncams suffer is in low-light performance. While it is getting better every day, you just can’t pull a reliably clean and noise-free image from a camera sensor that small. This is where actioncam lights come into play as a perfect companion, including the Litra Torch.

The Litra Torch is an 800-lumen, 1.5-by-1.5-inch magnetic light. I first started seeing the tiny-light trend on Instagram, where people were shooting slow-shutter photos at night while painting certain objects with a tiny bit of light. Check out Litra on Instagram: @litragear to see some of the incredible images people are producing with this tiny light. I saw an action sports person showing off some incredible nighttime pictures using the GoPro Hero. He mentioned in the post that he was using the Litra Torch, so I immediately contacted Litra, and here I am reviewing the light. Litra sent me the Litra Paparazzi Bundle, which retails for $129.99. The bundle includes the Litra Torch, along with a filter kit and cold shoe mount.

The Litra Torch has four modes, all accessible by clicking the button on top of the light: 800 lumens, 450 lumens, 100 lumens and flashing. The Torch has a consistent color temperature of 5700K; essentially, the light is a crisp white, right in between blue and yellow. The rechargeable lithium-ion battery charges via the included micro USB cable and lasts 30 minutes or more, depending on the brightness selected. With a backup battery attached, you could keep going for hours.

Over a month of intermittent use I only charged it once. One night I had to check out something under the hood of my car and used the Litra Torch to see what I was doing. It is very bright, and when I placed the light onto the car I realized it was magnetic! Holy cow. Why doesn’t GoPro put magnets into their cameras for mounting? The Torch also has two ¼-20 camera screw mounts, so you can mount it just about anywhere. The construction of the Torch is amazing — it is drop-proof, waterproof and made of a highly resilient aluminum. You can feel the high quality of the components the first time you touch the Torch.

In addition to the Torch itself, the cold shoe mount and diffuser, the Paparazzi Bundle comes with the photo filter kit. The photo filter kit includes five frames to mount the color filters onto the Torch; three sets of Rosco tungsten 4600K filters; three sets of Rosco tungsten 3200K filters; one white diffuser filter; and one each of a red, yellow and green color filter. Essentially, they give you a cheap way to change white balance temperatures, plus some awesome color filters to play around with. I can really see the benefit of having at least two if not three Litra Torches in your bag with the filter sets; you can easily set up a properly lit product shoot or even a headshot session with nothing more than three tiny Torch lights.

Putting It To The Test
To test out the light in action I asked my son to set up a Lego scene for me. One hour later I had some Lego models to help me out. I always love seeing people’s Lego scenes on Instagram, so I figured this would also be a good way to show off the light and the extra color filters sent in the Paparazzi Bundle. One thing I discovered is that I would love to have a slide-in filter holder built onto the light; it would definitely save me the time spent popping filters into frames.

All in all, this light is awesome. The only problem is I wish I had three so I could do a full three-point lighting setup. However, with some natural light and one Litra Torch I had enough to pull off some cool lighting. I really liked the Torch as a colored spotlight; you can get that blue or red shade on different objects in a scene quickly.

Summing Up
In the end, the Litra Torch is an amazing product. In the future I would really love to see multiple white balance temperatures built into the Torch without having to use photo filters. Another exciting, but probably expensive, prospect would be building in a Bluetooth connection and multiple colors. Better yet, make this light a full-color-spectrum app-enabled light… oh wait, they just announced the Litra Pro on Kickstarter. You should definitely check that out as well, with its advanced options and color profile.

I am spoiled by all of those at-home lights, like the LIFX brand, that change to any color you want, so I’m greedy and want those in a sub-$100 light. But those are just wishes — the Litra Torch is a must-have for your toolkit in my opinion. From mounting it on top of my Canon DSLR using the cold shoe mount, to sticking it magnetically in unique places, to using the screw mount to attach it to a tripod — the Litra Torch is a mind-melting game changer for anyone having to lug around a 100-pound light kit, which makes the new Litra Pro Kickstarter so enticing.

Check out their website for more info on the Torch and new Litra Pro, as well as a bunch of accessories. This is a must-have for any shooter looking to carry a tiny but powerful light anywhere, especially for summer and the outdoors!


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Our Virtual Color Roundtable

By Randi Altman

The number of things you can do with color in today’s world is growing daily. It’s not just about creating a look anymore; it’s about using color to tell or enhance a story. And because filmmakers recognize this power, they are getting colorists involved in the process earlier than ever before. And while the industry is excited about HDR and all it offers, this process also creates its own set of challenges and costs.

To find out what those in the trenches are thinking, we reached out to makers of color gear as well as hands-on colorists with the same questions, all in an effort to figure out today’s trends and challenges.

Company 3 Senior Colorist Stephen Nakamura
Company 3 is a global group of creative studios specializing in color and post services for features, TV and commercials. 

How has the finishing of color evolved most recently?
By far, the most significant change in the work that I do is the requirement to master for all the different exhibition mediums. There’s traditional theatrical projection at 14 footlamberts (fL) and HDR theatrical projection at 30fL. There’s IMAX. For home video, there’s UHD and different flavors of HDR. Our task with all of these is to master the movie so it feels and looks the way it’s supposed to feel and look on all the different formats.
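The footlambert and nit figures used throughout this discussion sit on a fixed conversion (1 fL ≈ 3.426 cd/m², i.e. nits). As a quick illustrative sketch:

```python
# Convert footlamberts (fL) to nits (cd/m^2): 1 fL ~= 3.4263 cd/m^2.
FL_TO_NITS = 3.4262591

def fl_to_nits(fl: float) -> float:
    return fl * FL_TO_NITS

# Traditional theatrical projection: 14 fL ~= 48 nits.
# HDR theatrical projection: 30 fL ~= 103 nits.
print(round(fl_to_nits(14)))  # 48
print(round(fl_to_nits(30)))  # 103
```

This is why the 14fL cinema standard is often quoted as roughly 48 nits, a tiny fraction of what home HDR displays can reach.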

There’s no one-size-fits-all approach. The colorist’s job is to work with the filmmakers and make those interpretations. At Company 3 we’re always creating custom LUTs. There are other techniques that help us get where we need to be to get the most out of all these different display types, but there’s no substitute for taking the time and interpreting every shot for the specific display format.

How has laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not too long ago, a cinematographer could expose an image specifically for one display format — a film print projected at 14fL. They knew exactly where they could place their highlights and shadows to get a precise look onscreen. Today, they’re thinking in terms of the HDR version, where if they don’t preserve detail in the blacks and whites it can really hurt the quality of the image in some of the newer display methods.

I work frequently with Dariusz Wolski (Sicario: Day of the Soldado, All the Money in the World). We’ve spoken about this a lot, and he’s said that when he started shooting features, he often liked to expose things right at the edge of underexposure because he knew exactly what the resulting print would be like. But now, he has to preserve the detail and fine-tune it with me in post because it has to work in so many different display formats.

There are also questions about how the filmmakers want to use the different ways of seeing the images. Sometimes they really like the qualities of the traditional theatrical standard and don’t want the HDR version to look very different; other times they want to make the most of the dynamic range. If we have more dynamic range, more light, to work with, it means that in essence we have a larger “canvas” to work on. But you need to take the time to individually treat every shot if you want to get the most out of that “canvas.”

Where do you see the industry moving in the near future?
The biggest change I expect to see is the development of even brighter, higher-contrast exhibition mediums. At NAB, Sony unveiled this wall of LED panels that are stitched together without seams and can display up to 1000 nits. It can be the size of a screen in a movie theater. If that took off, it could be a game changer. If theatrical exhibition gets better with brighter, higher-contrast screens, I think the public will enjoy it, provided that the images are mastered appropriately.

Sicario: Day of the Soldado

What is the biggest challenge you see for the color grading process now and beyond?
As there are more formats, there will be more versions of the master. From P3 to Rec.709 to HDR video in PQ — they all translate color information differently. It’s not just the brightness and contrast but the individual colors. If there’s a specific color blue the filmmakers want for Superman’s suit, or red for Spiderman, or whatever it is, there are multiple layers of challenges involved in maintaining those across different displays. Those are things you have to take a lot of care with when you get to the finishing stage.
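The PQ curve mentioned above (SMPTE ST 2084) really does translate brightness differently from Rec.709's gamma, which is part of why colors don't carry over automatically between masters. As a sketch, here is the standard PQ inverse EOTF, mapping absolute luminance in nits to a normalized signal value:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: linear luminance in nits -> signal in [0, 1].
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    y = max(nits, 0.0) / 10000.0   # PQ is defined up to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# SDR reference white (100 nits) lands around half the PQ signal range,
# leaving the upper half of the code values for highlights.
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```

So a color value that reads one way in a Rec.709 master occupies a very different part of the signal range in a PQ master, and the translation has to be judged shot by shot.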

What’s the best piece of work you’ve seen that you didn’t work on?
I know it was 12 years ago now, but I’d still say 300, which was colored by Company 3 CEO Stefan Sonnenfeld. I think that was enormously significant. Everyone who has seen that movie is aware of the graphic-novel-looking imagery that Stefan achieved in color correction working with Zack Snyder and Larry Fong.

We could do a lot in a telecine bay for television, but a lot of people still thought of digital color correction for feature films as an extension of the timing process from the photochemical world. But the look in 300 would be impossible to achieve photo-chemically, and I think that opened a lot of people’s minds about the power of digital color correction.

Alt Systems Senior Product Specialist Steve MacMillian
Alt Systems is a systems provider, integrating compositing, DI, networking and storage solutions for the media and entertainment industry.

How has the finishing of color evolved most recently?
Traditionally, there has been a huge difference between the color finishing process for television production versus cinematic release. It used to be that a target format was just one thing, and finishing for TV was completely different than finishing for the cinema.

Colorists working on theatrical films will spend most of their effort on grading for projection, and only afterward do a detailed trim pass to make a significantly different version for the small screen. Television colorists, who are usually under much tighter schedules, will often only be concerned with making Rec.709 look good on a standard broadcast monitor. Unless a great deal of care is taken to preserve the color and dynamic range of the digital negative throughout the process, the Rec.709 grade will not be suitable for translation to other expanded formats like HDR.

Now, there is an ever-growing number of distribution formats with different color and brightness requirements. And with the expectation of delivering to all of these on ever-tighter production budgets, it has become important to use color management techniques so that the work is not duplicated. If done properly, this allows for one grade to service all of these requirements with the least amount of trimming needed.

How has laser projection and HDR impacted the work?
HDR display technology, in my opinion, has changed everything. The biggest impact on color finishing is the need for monitoring in both HDR and SDR in different color spaces. Also, there is a much larger set of complex delivery requirements, along with the need for greater technical expertise and capabilities. Much of this complexity can be reduced by having the tools that make the various HDR image transforms and complex delivery formats as automatic as possible.

Color management is more important than ever. Efficient and consistent workflows are needed for dealing with multiple sources with unique color sciences, integrating visual effects and color grading while preserving the latitude and wide color gamut of the image.

The color toolset should support remapping to multiple deliverables in a variety of color spaces and luminance levels, and include support for dynamic HDR metadata systems like Dolby Vision and HDR10+. As HDR color finishing has evolved, so has the way it is delivered to studios. Most commonly it is delivered in an HDR IMF package. It is common for Rec.2020 HDR deliverables to be color-constrained to the P3 color volume, and for light-level histograms and HDR QC reports to be delivered.
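The idea of constraining a Rec.2020 deliverable to the P3 volume can be illustrated with a simple chromaticity check. This is a hypothetical sketch (chromaticity only; a full gamut-volume check also considers luminance) that tests whether a CIE (x, y) point falls inside the triangle formed by the DCI-P3 primaries:

```python
# Point-in-triangle test on CIE xy chromaticities: is a color inside DCI-P3?
P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B primaries

def _cross(o, a, b):
    # 2D cross product: which side of edge o->a does point b fall on?
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside_p3(xy):
    # Inside if the point is on the same side of all three triangle edges.
    signs = [_cross(P3[i], P3[(i + 1) % 3], xy) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(inside_p3((0.3127, 0.3290)))  # D65 white point -> True
print(inside_p3((0.170, 0.797)))    # Rec.2020 green primary -> False
```

The Rec.2020 green primary failing the test is exactly the case a P3 constraint guards against: colors that are legal in the Rec.2020 container but outside what P3-calibrated displays can actually reproduce.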

Do you feel DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Not as much as you would think. Two things are working against this. First, film and high-end digital cameras themselves have for some time been capturing latitude suitable for HDR production. Proper camera exposure is all that is needed to ensure that an image with a wide enough dynamic range is recorded. So from a capture standpoint, nothing needs to change.

The other is cost. There are currently only a small number of suitable HDR broadcast monitors, and most of these are extremely expensive and not designed well for the set. I’m sure HDR monitoring is being used on-set, but not as much as expected for productions destined for HDR release.

Also, it is difficult to truly judge HDR displays in a bright environment, and cinematographers may feel that monitoring in HDR is not needed full time. Traditionally with film production, cinematographers became accustomed to not being able to monitor accurately on-set, and they rely on their experience and other means of judging light and exposure. I think the main concern for cinematographers is the effect of lighting choices and apparent resolution, saturation and contrast when viewed in HDR.

Highlights in the background can potentially become distracting when displayed at 1000 nits versus being clamped at 100. Framing and lighting choices are informed by proper HDR monitoring. I believe we will see more HDR monitoring on-set as more suitable displays become available.

Colorfront’s Transkoder

Where do you see the industry moving in the near future?
Clearly HDR display technology is still evolving, and we will see major advances in HDR emissive displays for the cinema in the very near future. This will bring new challenges and require updated infrastructure for post as well as the cinema. It’s also likely that color finishing for the cinema will become more and more similar to the production of HDR for the home, with only relatively small differences in overall luminance and the ambient light of the environment.

Looking forward, standard dynamic range will eventually go away in the same way that standard definition video did. As we standardize on consumer HDR displays, and high-performance panels become cheaper to make, we may not need the complexity of HDR dynamic remapping systems. I expect that headset displays will continue to evolve and will become more important as time goes on.

What is the biggest challenge you see for the color grading process now and beyond?
We are experiencing a period of change that can be compared to the scope of change from SD to HD production, except it is happening much faster. Even if HDR in the home is slow to catch on, it is happening. And nobody wants their production to be dated as SDR-only. Eventually, it will be impossible to buy a TV that is not HDR-capable.

Aside from the changes in infrastructure, colorists used to working in SDR have some new skills to learn. I think it is a mistake to do separate grading versions for every major delivery format. Even though we have standards for HDR formats, they will continue to evolve, so post production must evolve too. The biggest challenge is meeting all of these different delivery requirements on budgets that are not growing as fast as the formats.

Northern Lights Flame Artist and Colorist Chris Hengeveld
NY- and SF-based Northern Lights, along with sister companies Mr. Wonderful for design, SuperExploder for composing and audio post, and Bodega for production, offers one-stop-shop services.

How has the finishing of color evolved most recently?
It’s interesting that you use the term “finishing of color.” In my clients’ world, finishing and color now go hand in hand. My commercial clients expect not only a great grade but seamless VFX work in finalizing their spots. Both are now often handled by the same artist. Work has been pushed beyond straight finishing with greenscreen, product replacement and the like to include grades on par with some of the higher-end coloring studios. Pricing is pushing vastly separate disciplines into a single finishing session.

Clients now expect to have a rough look ready not only of the final VFX, but also of the color pass before they attend the session. I usually only do minor VFX tweaks when clients arrive. Sending QuickTimes back and forth between studio and client usually gets us to a place where our client, and their client, are satisfied with at least the direction if not the final composites.

Color, as a completely subjective experience, is best enjoyed with the colorist in the room. We do grade some jobs remotely, but my experience has clearly been that from both time and creativity standpoints, it’s best to be in the grading suite. Unfortunately, recently due to time constraints and budget issues, even higher-end projects are being evaluated on a computer/phone/tablet back at the office. This leads to more iterations and less “the whole is greater than the sum of the parts” mentality. Client interaction, especially at the grading level, is best enjoyed in the same room as the colorist. Often the final product is markedly better than what either could envision separately.

Where do you see the industry moving in the near future?
I see the industry continuing to coalesce around multi-divisional companies that are best suited to fulfill many clients’ needs at once. Most projects that come to us have diverse needs that center around one creative idea. We’re all just storytellers. We do our best to tell the client’s story with the best talent we offer, in a reasonable timeframe and at a reasonable cost.

The future will continue to evolve, putting more pressure on the editorial staff to deliver near perfect rough cuts that could become finals in the not-too-distant future.

Invisalign

The tools continue to level the playing field. More generalists will be trained in disciplines including video editing, audio mixing, graphic design, compositing and color grading. This is not to say that the future of singularly focused creatives is over. It’s just that those focused creatives are assuming more and more responsibilities. This is a continuation of the consolidation of roles that has been going on for several years now.

What is the biggest challenge you see for the color grading process now and beyond?
The biggest challenge going forward is both technical and budgetary. Many new formats have emerged, including the new ProRes RAW. New working color spaces have also emerged. Many of us work without on-staff color scientists and must find our way through the morass of HDR, ACES, Scene Linear and Rec.709. Working with materials that round trip in-house is vastly easier than dealing with multiple shops all with their own way of working. As we collaborate with outside shops, it behooves us to stay at the forefront of technology.

But truth be told, perhaps the biggest challenge is keeping the creative flow and putting the client’s needs first. Making sure the technical challenges don’t get in the way. Clients need to see a seamless view without technical hurdles.

What’s the best piece of work you’ve seen that you didn’t work on?
I am constantly amazed at the quality of work coming out of Netflix. Some of the series are impeccably graded. Early episodes of Bloodline, which was shot with the Sony F65, come to mind. The visuals were completely absorbing, both daytime and nighttime scenes.

Codex VP Business Development Brian Gaffney
Codex designs tools for color, dailies creation, archiving, review and networked attached storage. Their offerings include the new Codex ColorSynth with Keys and the MediaVault desktop NAS.

How has the finishing of color evolved most recently?
While it used to be a specialized suite in a post facility, color finishing has evolved tremendously over the last 10 years with low-cost access to powerful systems like Resolve, used everywhere from on-set commercial finishing to final DI color grading. These systems have evolved into more than just color tools. Now they are editorial, sound mixing and complete finishing platforms.

How has laser projection and HDR impacted the work?
Offering brighter images in the theatre and the home with laser projection, OLED walls and HDR displays will certainly change the viewers’ experience, and it has helped create more work in post, offering up another pass for grading.

However, brighter images also show off image artifacts and can bring attention to highlights that may already be clipping. Shadow detail that was graded in SDR may now look milky in HDR. These new display mediums require that you spend more time optimizing the color correction for both display types. There is no magic one grade fits all.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
I think cinematographers are still figuring this out. Much like color correction between SDR and HDR, lighting for the two is different. A window that was purposely blown out in SDR, to hide a lighting rig outside, may show up in HDR, exposing the rig itself. Color correction might be able to correct for this, but unless a cinematographer can monitor in HDR on-set, these issues will come up in post. To do it right, lighting optimization between the two spaces is required, plus SDR and HDR monitoring on-set and near-set and in editorial.

Where do you see the industry moving in the near future?
It’s all about content. With the traditional studio infrastructure and broadcast television market changing to Internet Service Providers (ISPs), the demand for content, both original and HDR remastered libraries, is helping prop up post production and is driving storage- and cloud-based services.

Codex’s ColorSynth and Media Vault

In the long term, if the competition in this space continues and the demand for new content keeps expanding, traditional post facilities will become “secure data centers” and managed service providers. With cloud-based services, the talent no longer needs to be in the facility with the client. Shared projects with realtime interactivity from desktop and mobile devices will allow more collaboration among global-based productions.

What is the biggest challenge you see for the color grading process now and beyond?
Project management — sharing color set-ups among different workstations. Monitoring the color with properly calibrated displays in both SDR and HDR, in support of multiple deliverables, is always a challenge. New display technologies, like laser projection and new Samsung and Sony videowalls, may not be cost effective for the creative community to access for final grading. Only certain facilities may wind up specializing in this type of grading experience, limiting global access for directors and cinematographers who want to fully visualize what their product will look like on these new display mediums. It’s a cost that may not get the needed ROI, so in the near future many facilities may not be able to support the full demand of deliverables properly.

Blackmagic Director of Sales/Operations Bob Caniglia
Blackmagic creates DaVinci Resolve, a solution that combines professional offline and online editing, color correction, audio post production and visual effects in one software tool.

How has the finishing of color evolved most recently?
The ability to work in 8K, and in whatever flavor of HDR you choose, is happening. But if you are talking evolution, it is about the ability to collaborate with everyone in the post house, and the ability to do high-quality color correction anywhere. Editors, colorists, sound engineers and VFX artists should not be kept apart or kept from collaborating on the same project at the same time.

New collaborative workflows will speed up post production because you will no longer need to import, export or translate projects between different software applications.

How has laser projection and HDR impacted the work?
The most obvious impact has been on the need for colorists to be using software that can finish a project in whatever HDR format the client asks for. That is the same with laser projection. If you do not use software that is constantly updating to whatever new format is introduced, being able to bid on HDR projects will be hard.

HDR is all about more immersive colors. Any colorist should be ecstatic to be able to work with images that are brighter, sharper and with more data. This should allow them to be even more creative with telling a story with color.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
As for cinematographers, HDR gives viewers a whole new level of image detail. But that hyper reality could draw the viewer away from the intended target in a shot. The beautiful details shining back on a coffee pot in a tracking shot may not be worth worrying about in SDR, but in HDR every shot will create more work for the colorist to make sure the viewer doesn’t get distracted by the little things. For DPs, it means they are going to have to be much more aware of lighting, framing and planning the impact of every possible item and shadow in an image.

Where do you see the industry moving in the near future?
Peace in our time amongst all of the different post silos, because those silos will finally be open. And there will be collaboration between all parts of the post workflow. Everyone — audio, VFX, editing and color correction — can work together on the same project seamlessly.

For example, in our Resolve tool, post pros can move between them all. This is what we see happening with colorists and post houses right now, as each member of the post team can be much more creatively flexible because anyone can explore new toolsets. And with new collaboration tools, multiple assistants, editors, colorists, sound designers and VFX artists can all work on the same project at the same time.

Resolve 15

For a long-term view, you will always have true artists in each of the post areas. People who have mastered the craft and can separate themselves as being color correction artists. What is really going to change is that everyone up and down the post workflow at larger post houses will be able to be much more creative and efficient, while small boutique shops and freelancers can offer their clients a full set of post production services.

What is the biggest challenge you see for the color grading process now and beyond?
Speed and flexibility. Because with everyone now collaborating and the colorist being part of every stage of the post process, you will be asked to do things immediately… and in any format. So if you are not able to work in real time, or with whatever footage format is thrown at you, they will find someone who can.

This also comes with the challenge of changing the old notion that the colorist is one of the last people to touch a project. You will be asked to jump in early and often. Because every client would love to show early edits that are graded to get approvals faster.

FilmLight CEO Wolfgang Lempp
FilmLight designs, creates and manufactures color grading systems, image processing applications and workflow tools for the film and television industry.

How has the finishing of color evolved recently?
When we started FilmLight 18 years ago, color management was comparatively simple: Video looked like video, and digital film was meant to look like film. And that was also the starting point for the DCI — the digital cinema standard tried to make digital projection look exactly like conventional cinema. This understanding lasted for a good 10 years, and even ACES today is very much built around film as the primary reference. But now we have an explosion of new technologies, new display devices and new delivery formats.

There are new options in resolution, brightness, dynamic range, color gamut, frame rate and viewing environments. The idea of a single deliverable has gone: There are just too many ways of getting the content to the viewer. That is certainly affecting the finishing process — the content has to look good everywhere. But there is another trend visible, too, which here in the UK you can see best on TV. The color and finishing tools are getting more powerful and the process is getting more productive. More programs than ever before are getting a professional color treatment before they go out, and they look all the better for it.

Either way, there is more work for the colorist and finishing house, which is of course something we welcome.

How has laser projection and HDR impacted the work?
Laser projection and HDR for cinema and TV are examples of what I described above. We have the color science and the tools to move comfortably between these different technologies and environments, in that the color looks “right,” but that is not the whole story.

The director and DP will choose to use a format that will best suit their story, and will shoot for their target environment. In SDR, you might have a bright window in an interior scene, for example, which will shape the frame but not get in the way of the story. But in HDR, that same window will be too bright, obliterate the interior scene and distract from the story. So you would perhaps frame it differently, or light up the interior to restore some balance. In other words, you have to make a choice.

HDR shouldn’t be an afterthought, it shouldn’t be a decision made after the shoot is finished. The DP wants to keep us on the edge of our seats — but you can’t be on the edge in HDR and SDR at the same time. There is a lot that can be done in post, but we are still a long way from recreating the multispectral, three-dimensional real world from the output of a camera.

HDR, of course, looks fantastic, but the industry is still learning how to shoot for best effect, as well as how to serve all the distribution formats. It might well become the primary mastering format soon, but SDR will never go away.

Where do you see the industry moving in the future?
For me, it is clear that as we have pushed resolution, frame rate, brightness and color gamut, it has affected the way we tell stories. Less is left to the imagination. Traditional “film style” gave a certain pace to the story, because there was the expectation that the audience was having to interpret, to think through, to fill in the black screen in between.

Now technology has made things more explicit and more immersive. We now see true HDR cinema technology emerging with a brightness of 600 nits and more. Technology will continue to surge forward, because that is how manufacturers sell more televisions or projectors — or even phones. And until there is a realistic simulation of a full virtual reality environment, I don’t see that process coming to a halt. We have to be able to master for all these new technologies, but still ensure compatibility with existing standards.

What is the biggest challenge for color grading now and in the future?
Color grading technology is very much unfinished business. There is so much that can be done to make it more productive, to make the content look better and to keep us entertained.

Blackboard

As much as we might welcome all the extra work for our customers, generating an endless stream of versions for each program is not what color grading should be about. So it will be interesting to see how this problem will be solved. Because one way or another, it will have to be. But while this is a big challenge, it hopefully isn’t what we put all our effort into over the coming years.

The real challenge is to understand what makes us appreciate certain images over others. How composition and texture, how context, noise and temporal dynamics — not just color itself — affect our perception.

It is interesting that film as a capture medium is gaining popularity again, especially large-format capture. It is also interesting that the “film look” is still precious when it comes to color grading. It puts all the new technology into perspective. Filmmaking is storytelling. Not just a window to the world outside, replaced by a bigger and clearer window with new technology, but a window to a different world. And the colorist can shape that world to a degree that is limited only by her imagination.

Olympusat Entertainment Senior DI Colorist Jim Wicks
A colorist since 2007, Jim has been a senior DI colorist at Olympusat Entertainment since 2011. He has color restored hundreds of classic films and is very active in the color community.

How has the finishing of color evolved most recently?
The phrase I’m keying in on in your question is “most recently.” I believe the role of a colorist has been changing exponentially for the last several years, maybe longer. I would say that we are becoming, if we haven’t already, more like finishing artists. Color is now just one part of what we do. Because technologies are changing more rapidly than at any time I’ve witnessed, we now have a lot to understand in addition to just color. There is ACES, HDR, changing color spaces, integrating VFX workflows into our timelines, laser projection and so on. The list isn’t endless, but it is growing.

How have laser projection and HDR impacted the work?
For the time being, they do not impact my work. I am currently required to deliver in Rec.709. However, within that confine I am grading a wider range of media than ever before, such as 2K and 4K uncompressed DPX; Phantom Digital Video Files; Red Helium 8K in the IPP2 workspace; and much more. Laser projection and HDR are something that I continue to study at symposiums, or wherever I can find that information. I believe laser projection and HDR are important to know now. When the opportunity to work with laser projection and HDR is available to me, I plan to be ready.

Do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Of course! At the very heart of every production, the cinematographer is the creator and author of the image. It is her creative vision. The colorist is the protector of that image. The cinematographer entrusts us with her vision. In this respect, the colorist needs to be in sync with the cinematographer as never before. As cinematographers move because of technology, so we move. It’s all about the deliverable and how it will be displayed. I see no benefit for the colorist and the cinematographer to not be on the same page because of changing technology.

Where do you see the industry moving in the near future and the long-range future?
In the near future: HDR, laser projection, 4K and larger and larger formats.

In the long-range future: I believe we only need to look to the past to see the changes that are inevitably ahead of us.

Technological changes forced film labs, telecine and color timers to change and evolve. In the nearly two decades since O Brother, Where Art Thou?, we no longer color grade movies the way we did back when the Coen classic was released in 2000. I believe it is inevitable: Change begets change. Nothing stays the same.

In keeping with the types of changes that came before, it is only a matter of time before today’s colorist is forced to change and evolve just as those before us were forced to do so. In this respect I believe AI technology is a game-changer. After all, we are moving towards driverless cars. So, if AI advances the way we have been told, will we need a human colorist in the future?

What is the biggest challenge you see for the color grading process now and beyond?
Not to sound like a “get off my lawn” rant, but education is the biggest challenge, and it’s a two-fold problem. Firstly, at many fine film schools in the US, color grading is not taught as a degree-granting course, or at all.

Secondly, the glut of for-profit websites that teach color grading courses have no standardized curriculum, which wouldn’t be a problem, but at present there is no way to measure how much anyone actually knows. I have personally encountered individuals who claim to be colorists and yet do not know how to color grade. As a manager I have interviewed them — their resumes look strong, but their skills are not there. They can’t do the work.

What’s the best piece of work you’ve seen that you didn’t work on?
Just about anything shot by Roger Deakins. I am a huge fan of his work. Mitch Paulson and his team at Efilm did great work on protecting Roger’s vision for Blade Runner 2049.

Colorist David Rivero
This Madrid-born colorist is now based in China. He color grades and supervises the finishing of feature films and commercials, normally all versions, and often the trailers associated with them.

How has the finishing of color evolved most recently?
The line between strictly color grading and finishing is getting blurrier by the year. Although it is true there is still a clearer separation in the commercial world, on the film side the colorist has become the “de facto” finishing or supervising finishing artist. I think it is another sign of the bigger role color grading is starting to play in post.

In the last two to three years I’ve noticed that fewer clients are looking at it as an afterthought, or as simply “color matching.” I’ve seen how the very same people went from a six- to seven-day DI schedule five years ago to a 20-day schedule now. The idea that spending a relatively small amount of extra time and budget on the final step can get you a far superior result is finally sinking in.

The tools and technology are finally moving into a “modern age” of grading:
– HDR is a game changer on the image-side of things, providing a noticeable difference for the audience and a different approach on our side on how to deal with all that information.

– The eventual acceptance by all color systems of what were traditionally compositing or VFX tools is also a turning point, although a controversial one. There are many who think that colorists should focus on grading. However, I think that rather than colorists becoming compositors, it is the color grading concept and mission that is (still) evolving.

How have laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
Well, on my side of the world (China), the laser and HDR technologies are just starting to get to the public. Cinematographers are not really changing how they work yet, as it is a very small fraction of the whole exhibition system.

As for post, it requires a more careful way of handling the image: it needs higher-quality plates, composites, CG and VFX, and a more careful grade, and you can’t get away with as many tricks as you did when it was just SDR. The bright side is the marvelous images, and how different they can be from each other. I believe HDR is totally compatible with every style you could do in SDR, while opening the doors to new ones. There are also different approaches to shooting and lighting for cinematographers and CG artists.

Goldbuster

The biggest challenge it has created has been on the exhibition side in China. Although Dolby cinemas (Vision+Atmos) are controlled and require a specific pass and DCP, there are other laser projection theaters that show the same DCP being delivered to common (xenon lamp) theaters. This creates a frustrating environment. For example, during the 3D grading, you not only need to consider the very dark theaters at 3FL-3.5FL, but also the new laser rooms that are racking up their lamps to 7FL-8FL to show off why they charge higher ticket prices.

Where do you see the industry moving in the near future and the long-range future?
I hope to see the HDR technologies settling and becoming the new standard within the next five to six years, with HDR used as the reference master from which all other deliveries are created. I also expect all these relatively new practices and workflows (involving ACES, EXRs with the VFX/CG passes, non-LUT deliveries) to become more standardized and controlled.

In the long term, I could imagine two main changes happening, closely related to each other:
– The concept of grading and colorist, especially in films or long formats, evolving in importance and relationship within the production. I believe the separation or independence between photography and grading will get wider (and necessary) as tools evolve and the process is more standardized. We might get into something akin to how sound editors and sound mixers relate and work together on the sound.

– The addition of (serious) compositing in essentially all the main color systems is the first step towards the possibilities of future grading. A feature like the recent FaceRefinement in Resolve is one of the things I dreamed about five or six years ago.

What is the biggest challenge you see for the color grading process now and beyond?
Nowadays one of the biggest challenges is possibly the multi-mastering environment, with several versions on different color spaces, displays and aspect ratios. It is becoming easier, but it is still more painful than it should be.

Shrinking margins are something that also hurts the whole industry. We all work thanks to those margins, and cutting budgets while expecting the same results is not something that is going to happen.

What’s the best piece of work you’ve seen that you didn’t work on?
The Revenant, Mad Max, Fury and 300.

Carbon Colorist Aubrey Woodiwiss
Full-service creative studio Carbon has offices in New York, Chicago and Los Angeles.

How has the finishing of color evolved most recently?
It is always evolving, and the tools are becoming ever more powerful, and camera formats are becoming larger with more range and information in them. Probably the most significant evolution I see is a greater understanding of color science and color space workflows.

How have laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
These elements impact how footage is viewed and dealt with in post. As far as I can see, it isn’t affecting how things are shot.

Where do you see the industry moving in the near future? What about in the long-range future?
I see formats becoming larger, viewing spaces and color gamuts becoming wider, and more streaming- and laptop-based technologies and workflows.

What is the biggest challenge you see for the color grading process now and beyond?
The constant challenge is integrating the space you traditionally color grade in to how things are viewed outside of this space.

What’s the best piece of work you’ve seen that you didn’t work on?
Knight of Cups, directed by Terrence Malick with cinematography by Emmanuel Lubezki.

Ntropic Colorist Nick Sanders
Ntropic creates and produces work for commercials, music videos, and feature films as well as experiential and interactive VR and AR media. They have locations in San Francisco, Los Angeles and New York City.

How has the finishing of color evolved most recently?
SDR grading in Rec.709 and 2.4 Gamma is still here, still looks great, and will be prominent for a long time. However, I think we’re becoming more aware of how exciting grading in HDR is, and how many creative doors it opens. I’ve noticed a feeling of disappointment when switching from an HDR to an SDR version of a project, and wondered for a second if I’m accidentally viewing the ungraded raw footage, or if my final SDR grade is actually as flat as it appears to my eyes. There is a dramatic difference between the two formats.

HDR is incredible because you can make the highlights blisteringly hot, saturate a color to nuclear levels or keep things mundane and save those heavier-handed tools in your pocket for choice moments in the edit where you might want some extra visceral impact.

How have laser projection and HDR impacted the work? And do you feel that DPs are working differently now that laser projection and the home HDR experiences are becoming more relevant?
In one sense, cinematographers don’t need to do anything differently. Colorists are able to create high-quality SDR and HDR interpretations of the exact same source footage, so long as it was captured in a high-bit-depth raw format and exposed well. We’re even seeing modern HDR reimaginings of classic films. Movies as varied in subject matter as Saving Private Ryan and the original Blade Runner are coming back to life because the latitude of classic film stocks allows it. However, HDR has the power to greatly exaggerate details that may have otherwise been subtle or invisible in SDR formats, so some extra care should be taken in projects destined for HDR.

Extra contrast and shadow detail mean that noise is far more apparent in HDR projects, so ISO and exposure should be adjusted on-set accordingly. Also, the increased highlight range has some interesting consequences in HDR. For example, large blown-out highlights, such as overexposed skies, can look particularly bad. HDR can also retain more detail and color in the upper ranges in a way that may not be desirable. An unremarkable, desaturated background in SDR can become a bright, busy and colorful background in HDR. It might prove distracting to the point that the DP may want to increase his or her key lighting on the foreground subjects to refocus our attention on them.

Panasonic “PvP”

Where do you see the industry moving in the near future? What about the long-range future?
I foresee more widespread adoption of HDR — in a way that I don’t with 3D and VR — because there’s no headset device required to feel and enjoy it. Having some HDR nature footage running on a loop is a great way to sell a TV in Best Buy. Where the benefits of another recent innovation, 4K, are really only detectable on larger screens and begin to deteriorate with the slightest bit of compression in the image pipeline, HDR’s magic is apparent from the first glance.

I think we’ll first start to see HDR and SDR orders on everything, then a gradual phasing out of the SDR deliverables as the technology becomes more ubiquitous, just like we saw with the transition from standard definition to HD.

For the long-range, I wouldn’t be surprised to see a phasing out of projectors as LED walls become more common for theater exhibitions due to their deeper black levels. This would effectively blur the line between technologies available for theater and home for good.

What is the biggest challenge you see for the color grading process now and beyond?
The lack of a clear standard makes workflow decisions a little tricky at the moment. One glaring issue is that consumer HDR displays don’t replicate the maximum brightness of professional monitors, so there is a question of mastering one’s work for the present, or for the near future when that higher capability will be more widely available. And where does this evolution stop? 4,000 nits? 10,000 nits?

Maybe a more pertinent creative challenge in the crossover period is which version to grade first, SDR or HDR, and how to produce the other version. There are a couple of ways to go about it, from using LUTs to initiate and largely automate the conversion to starting over from scratch and regrading the source footage in the new format.

What’s the best piece of work you’ve seen that you didn’t work on?
Chef’s Table on Netflix was one of the first things I saw in HDR; I still think it looks great!

Main Image: Courtesy of Jim Wicks.

DG 7.9.18

Netflix’s Lost in Space: mastering for Dolby Vision HDR, Rec.709

There is a world of difference between Netflix’s ambitious science-fiction series Lost in Space (recently renewed for another 10 episodes) and the beloved but rather low-tech, tongue-in-cheek 1960s show most fondly remembered for the repartee between persnickety Dr. Smith and the rather tinny-looking Robot. This series, starring Molly Parker, Toby Stephens and Parker Posey (in a very different take on Dr. Smith), is a very modern, VFX-intensive adventure show with more deeply wrought characters and elaborate action sequences.

Siggy Ferstl

Colorist Siggy Ferstl of Company 3 devoted a significant amount of his time and creative energy to the 10-episode release over the five and a half months it was in the facility. While Netflix’s approach of dropping all 10 episodes at once, rather than the traditional series schedule of an episode a week, fuels excitement and binge-watching among viewers, it also requires a different kind of workflow, with cross-boarded shoots across multiple episodes and different parts of episodes coming out of editorial for color grading throughout the story arc. “We started on episode one,” Ferstl explains, “but then we’d get three and portions of six and back to four, and so on.”

Additionally, the series was mastered both for Dolby Vision HDR and Rec.709, which brought additional facets to the grading process over shows delivered exclusively for Rec.709.

Ferstl’s grading theater also served as a hub where the filmmakers, including co-producer Scott Schofield, executive producer Zack Estrin and VFX supervisor Jabbar Raisani could see iterations of the many effects sequences as they came in from vendors (Cinesite, Important Looking Pirates and Image Engine, among others).

Ferstl himself made use of some new tools within Resolve to create a number of effects that might once have been sent out of house or completed during the online conform. “The process was layered and very collaborative,” says Ferstl. “That is always a positive thing when it happens but it was particularly important because of this series’ complexity.”

The Look
Shot by Sam McCurdy, the show has an aesthetic designed “to have a richness and realness to the look,” Ferstl explains. “It’s a family show but it doesn’t have that vibrant and saturated style you might associate with that. It has a more sophisticated kind of look.”

One significant alteration to the look involves changes to the environment of the planet onto which the characters crash land. The filmmakers wanted the exteriors to look less Earthlike with foliage a bit reddish, less verdant than the actual locations. The visual effects companies handled some of the more pronounced changes, especially as the look becomes more extreme in later episodes, but for a significant amount of this work, Ferstl was able to affect the look in his grading sessions — something that until recently would likely not have been achievable.

Ferstl, who has always sought out and embraced new technology to help him do his job, made use of some features that were then brand new to Resolve 14. In the case of the planet’s foliage, he made use of the Color Compressor tool within the OpenFX tab on the color corrector. “This allowed me to take a range of colors and collapse that into a single vector of color,” he explains. “This lets you take your selected range of colors, say yellows and greens in this case, and compress them in terms of hue, saturation and luminance.” Sometimes touted as a tool to give colorists more ability to even out flesh tones, Ferstl applied the tool to the foliage and compressed the many shades of green into a narrower range prior to shifting the resulting colors to the more orange look.

“With foliage you have light greens and darker greens and many different ranges within the color green,” Ferstl explains. “If we’d just isolated those ranges and turned them orange individually, it wouldn’t give us the same feel. But by limiting the range and latitude of those greens in the Color Compressor and then changing the hue we were able to get much more desirable results.” Of course, Ferstl also used multiple keys and windows to isolate the foliage that needed to change from the elements of the scenes that didn’t.
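Resolve’s actual Color Compressor implementation isn’t public, but the basic idea Ferstl describes (pulling a spread of hues toward a single target hue before shifting them) can be sketched in a few lines. This is a hypothetical simplification; the function, its parameters and the chosen values are illustrative, not Resolve’s:

```python
import colorsys

def hue_distance(a, b):
    """Shortest signed distance between two hues, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def compress_hue(rgb, center=120.0, width=60.0, strength=0.7):
    """Pull any hue within +/-width degrees of `center` toward it
    by `strength`, narrowing e.g. many shades of green into a
    tighter band before a subsequent hue shift. rgb is (r, g, b)
    with components in 0..1."""
    r, g, b = rgb
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    h_deg = h * 360.0
    d = hue_distance(h_deg, center)
    if abs(d) <= width:
        h_deg = center + d * (1.0 - strength)  # compress toward center
    return colorsys.hsv_to_rgb((h_deg % 360.0) / 360.0, s, v)
```

With these illustrative values, a yellow-green at 90 degrees of hue moves to 111 degrees, much closer to pure green at 120, while saturation and luminance are untouched; a real grade would of course qualify the selection with keys and windows as Ferstl describes.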

He also made use of the Camera Shake function, which was particularly useful in a scene in the second episode in which an extremely heavy storm of sharp hail-like objects hits the planet, endangering many characters. The storm itself was created at the VFX houses, but the additional effect of camera shake on top of that was introduced and fine-tuned in the grade. “I suggested that we could add the vibration, and it worked very well,” he recalls. By doing the work during color grading sessions, Ferstl and the filmmakers in the session could see that effect as it was being created, in context and on the big screen, and could fine-tune the “camera movement” right then and there.

Fortunately, the colorist notes, the production afforded the time to go back and revise color decisions as more episodes came into Company 3. “The environment of the planet changes throughout. But we weren’t coloring episodes one after the other. It was really like working on a 10-hour feature.

“If we start at episode one and jump to episode six,” Ferstl notes, “exactly how much should the environment have changed in-between? So it was a process of estimating where the look should land but knowing we could go back and refine those decisions if it proved necessary once we had the surrounding episodes for context.”

Dolby Vision Workflow
As most people reading this know, mastering in high dynamic range (Dolby Vision in this case) opens up the possibility of working within a significantly expanded contrast range and wider color gamut than the Rec.709 standard for traditional HD. Lost in Space was mastered concurrently for both, which required Ferstl to use Dolby’s workflow. This involves making all corrections for the HDR version and then allowing the Dolby hardware/software to analyze the images to bring them into the Rec.709 space for the colorist to do a standard dynamic range pass.

Ferstl, who worked with two Sony X-300 monitors, one calibrated for Rec.709 and the other for HDR, explains, “Everyone is used to looking at Rec. 709. Most viewers today will see the show in Rec.709 and that’s really what the clients are most concerned with. At some point, if HDR becomes the dominant way people watch television, then that will probably change. But we had to make corrections in HDR and then wait for the analysis to show us what the revised image looked like for standard dynamic range.”

He elaborates that while the Dolby Vision spec allows the brightest whites to read at 4000 nits, he and the filmmakers preferred to limit that to 1000 nits. “If you let highlights go much further than we did,” he says, “some things can become hard to watch. They become so bright that visual fatigue sets in after too long. So we’d sometimes take the brightest portions of the frame and slightly clamp them,” he says of the technique of holding the brightest areas of the frame to levels below the maximum the spec allows.
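The “slight clamp” Ferstl describes can be thought of as a soft knee rather than a hard clip. As an illustration only (the knee and ceiling values here are hypothetical; on the show these choices were made shot by shot, by eye), a smooth roll-off in nits might look like:

```python
import math

def soft_clamp(nits, knee=800.0, ceiling=1000.0):
    """Roll luminance above `knee` smoothly toward `ceiling`
    instead of hard-clipping at the format's maximum. Below the
    knee the signal is untouched; above it, values approach (but
    never reach) the ceiling, preserving some highlight gradation."""
    if nits <= knee:
        return nits
    span = ceiling - knee
    return knee + span * (1.0 - math.exp(-(nits - knee) / span))
```

A specular highlight metered at 4,000 nits would land just under 1,000 nits, while everything below the knee passes through unchanged, which keeps midtones and faces intact.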

“Sometimes HDR can be challenging to work with and sometimes it can be amazing,” he allows. Take the vast vistas and snowcapped mountains we first see when the family starts exploring the planet. “You have so much more detail in the snow and a greater range in the highlights than you could ever display in Rec.709,” he says.

“In HDR, the show conveys the power and majesty of these vast spaces beyond what viewers are used to seeing. There are quite a few sections that lend themselves to HDR,” he continues. But as with all such tools, it’s not always appropriate to the story to use the extremes of that dynamic range. Some highlights in HDR can pull the viewer’s attention to a portion of the frame in a way that simply can’t be replicated in Rec. 709 and, likewise, a bright highlight from a practical or a reflection in HDR can completely overpower an image that tells the story perfectly in standard dynamic range. “The tools can re-map an image mathematically,” Ferstl notes, “but it still requires artists to interpret an image’s meaning and feel from one space to the other.”

That brings up another question: How close do you want the HDR and the Rec.709 to look to each other when they can look very different? Overall, the conclusion of all involved on the series was to constrain the levels in the HDR pass a bit in order to keep the two versions in the same ballpark aesthetically. “The more you let the highlights go in HDR,” he explains, “the harder it is to compress all that information for the 100-nit version. If you look at scenes with the characters in space suits, for example, they have these small lights that are part of their helmets and if you just let those go in HDR, those lights become so distracting that it becomes hard to look at the people’s faces.”

Such decisions were made in the grading theater on a case by case basis. “It’s not like we looked at a waveform monitor and just said, ‘let’s clamp everything above this level,’” he explains, “it was ultimately about the feeling we’d get from each shot.”


A colorist weighs in on ‘the new world’ of HDR

By Maxine Gervais

HDR is on many people’s minds these days. Some embrace it, some are hesitant and some simply do not like the idea.

But what is HDR really? I find that manufacturers often use the term too loosely. Anything that offers higher dynamic range can fall into the HDR category, but let’s focus on the luminance and greater contrast ratio brought by HDR.

We have come a long way in the last 12 years — from film prints to digital projection. This was a huge shift, and one could argue it happened relatively fast. Since then, technology has been on fast forward.

Film allows incredible capture of detail in large formats, and when digital was first introduced we couldn’t say the same. At the time, cameras were barely capable of capturing true 2K and wide dynamic range. Many would shoot film and scan it into digital files hoping to preserve more of the dynamic range offered by film. Eventually, cameras got better and film started to disappear, mostly for convenience and cost reasons.

Through all this, target devices (projectors and monitors) stayed pretty much the same. Monitors went from CRT to plasma to LCD, but kept the same characteristics. For monitors, everything was in the Rec.709 color space at a luminance of 100 nits. Projectors used the P3 color space, but at a lower luminance of about 48 nits.

Maxine at work on the FilmLight Baselight.

Philosophically, one could argue that all creative intent was in some ways limited by the display. The files might have held much more information than the display was able to show. So, the aesthetics we learned to love were a direct result of the displays’ limitations.

What About Now?
Now, we are at the start of a revolution in these displays. With the introduction of OLED monitors and laser projection for theaters, contrast ratios, color spaces and luminance are all larger than before. It is now possible to see the detail captured by cameras or film. This allows for greater artistic freedom: with fewer limitations, one can push the aesthetic to a new level.

However, that doesn’t mean everything is suddenly brighter and more colorful. It is very easy to create the same aesthetic one used to love, but it is now possible to bring to the screen details in shadows and highlights that were never an option before. It even means better color separation. What creatives can do with “HDR” is still very much in their control.

The more difficult part is that HDR has not yet taken over theaters or homes. If someone has set their look in a P3 48-nit world and is now asked to take that look to a 4,000-nit P3 PQ display, it might be difficult to decide how to approach it. How do we maintain the original intent yet embrace what HDR has to offer? There are many ways to go about it, and no one of them is better than another. You can redefine your look for the new displays, and in some ways have a new look that becomes its own entity, or you can mimic your original look, taking advantage of only a few elements of HDR.
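The 48-nit and 4,000-nit figures are absolute luminances, because PQ displays encode light with the SMPTE ST 2084 curve, which maps 0 to 10,000 cd/m² into a 0-to-1 signal. A direct transcription of the published constants (the function name is mine) shows how the encoding works:

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in
    cd/m^2 (nits) mapped to a nonlinear signal value in [0, 1].
    Constants are the ones published in the standard."""
    m1 = 2610.0 / 16384.0
    m2 = 2523.0 / 4096.0 * 128.0
    c1 = 3424.0 / 4096.0
    c2 = 2413.0 / 4096.0 * 32.0
    c3 = 2392.0 / 4096.0 * 32.0
    y = max(nits, 0.0) / 10000.0  # normalize to the 10,000-nit peak
    yp = y ** m1
    return ((c1 + c2 * yp) / (1.0 + c3 * yp)) ** m2
```

pq_encode(100.0) comes out near 0.51, so an entire 100-nit SDR grade occupies only about the lower half of the PQ signal range — one reason simply reusing SDR values in a PQ container looks dim and flat, and why the creative remapping discussed here is unavoidable.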

The more we start using brighter luminance, bigger contrast ratios and a larger color cube as our starting point, the more we will be able to future-proof and protect the creative intent. Bringing HDR in as an afterthought, when it was never planned for, is still difficult and in some cases controversial.

The key is to have those philosophical discussions with creatives ahead of time and come up with a workflow that will have the expected results.

Main Image: Maxine Gervais working with director Albert Hughes on his upcoming film, Alpha.


Maxine Gervais is a senior supervising colorist at Technicolor Hollywood. Her past credits include Black Panther, The 15:17 to Paris, Pitch Perfect 3 and American Sniper.


Sim and the ASC partner on educational events, more

During Cine Gear recently, Sim announced a 30-year sponsorship with the American Society of Cinematographers (ASC). Sim offers end-to-end solutions for creatives in film and television, and the ASC is a nonprofit focusing on the art of cinematography. As part of the relationship, the ASC Clubhouse courtyard will now be renamed Sim Plaza.

Sim and the ASC have worked together frequently on events that educate industry professionals on current technology and its application to their evolving craft. As part of this sponsorship, Sim will expand its involvement with the ASC Master Classes, SimLabs, and conferences and seminars in Hollywood and beyond.

During an official ceremony, a commemorative plaque was unveiled and embedded into the walkway of what is now Sim Plaza in Hollywood. Sim will also host a celebration of the ASC’s 100th anniversary in 2019 at Sim’s Hollywood location.

What else does this partnership entail?
• The two organizations will work together closely over the next 30 years on educational events for the cinematography community. Sim’s sponsorship will help fund society programs and events to educate industry professionals (both practicing and aspiring) on current technology and its application to the evolving craft.
• The ASC Master Class program, SimLabs and other conferences and seminars will continue over these 30 years, with Sim increasing its involvement. Sim is not telling the ASC what initiatives it should pursue, but rather lending a helping hand to drive visual storytelling forward. For example, Sim has already hosted ASC Master Class sessions in Toronto and Hollywood, has sponsored the annual ASC BBQ for the last couple of years, and founder Rob Sim is himself an ASC associate member.

How will the partnership increase programming and resources to support the film and television community for the long term?
• It has a large focus on three things: financial resources, programming assistance and facility support.
• It will provide access and training with world-class technology in film and television.
• It will offer training directly from industry leaders in Hollywood and beyond.
• It will develop new programs for people who can’t attend ASC Master Class sessions, such as an online experience, which is something ASC and Sim are working on together.
• It will expand SimLabs beyond Hollywood, potentially to Vancouver, Atlanta, New York and Toronto, creating new avenues for people associated with the ASC who know they can call on Sim.
• It will bring volunteers. Sim has many volunteers on ASC committees, including the Motion Imaging Technology Council and its Lens committee.

Main Image: L-R: Sim President/CEO James Haggarty, Sim founder and ASC associate member Rob Sim, ASC events coordinator Patty Armacost and ASC president Kees van Oostrum.


Sony updates Venice to V2 firmware, will add HFR support

At CineGear, Sony introduced new updates and developments for its Venice CineAlta camera system, including Version 2 firmware, which will be available in early July.

Sony also showed the new Venice Extension System, which features expanded flexibility and enhanced ergonomics. Also announced was Sony’s plan for high frame rate support for the Venice system.

Version 2 adds new features and capabilities specifically requested by production pros to deliver more recording capability, customizable looks, exposure tools and greater lens freedom. Highlights include:

• With 15+ stops of exposure latitude, Venice will support a high base ISO of 2500 in addition to the existing ISO 500, taking full advantage of Sony’s sensor for superb low-light performance, with dynamic range from +6 stops to -9 stops as measured at 18% middle gray. This increases exposure indexes at higher ISOs for night exteriors, dark interiors, working with slower lenses, or content that needs to be graded in high dynamic range while maintaining maximum shadow detail.
• Select FPS (off-speed) shooting in individual frame increments, from 1 to 60.
• Several new imager modes, including 25p in 6K full-frame, 25p in 4K 4:3 anamorphic, 6K 17:9, 1.85:1 and 4K 6:5 anamorphic.
• User-uploadable 3D LUTs allow users to customize their own looks and save them directly into the camera.
• Wired LAN remote control allows users to remotely control and change key functions, including camera settings, fps, shutter, EI, iris (Sony E-mount lenses), record start/stop and built-in optical ND filters.
• The E-mount option allows users to remove the PL mount and use a wide assortment of native E-mount lenses.
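As a quick sanity check on those numbers (the stop figures are Sony's, the arithmetic below is ours):

```python
# Venice exposure latitude, as quoted: +6 stops above and -9 stops below
# 18% middle gray. Total latitude is the sum of the two ranges.
stops_above = 6
stops_below = 9
total_latitude = stops_above + stops_below  # 15, matching "15+ stops"

# Each stop doubles the light, so the sensor spans a contrast ratio of 2^15.
contrast_ratio = 2 ** total_latitude
print(total_latitude, contrast_ratio)  # 15 32768
```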

The Venice Extension System is a full-frame tethered extension system that allows the camera body to detach from the image sensor block, with no degradation in image quality at up to 20 feet of separation. The system is the result of Sony’s long-standing collaboration with James Cameron’s Lightstorm Entertainment.

“This new tethering system is a perfect example of listening to our customers, gathering strong and consistent feedback, and then building that input into our product development,” said Peter Crithary, marketing manager for motion picture cameras, Sony. “The Avatar sequels will be among the first feature films to use the new Venice Extension System, but it also has tremendous potential for wider use with handheld stabilizers, drones, gimbals and remote mounting in confined places.”

Also at CineGear, Sony shared the details of a planned optional upgrade to support high frame rate — targeting speeds up to 60fps in 6K, up to 90fps in 4K and up to 120fps in 2K. It will be released in North America in the spring of 2019.


NAB: Imagine Products and StorageDNA enhance LTO and LTFS

By Jonathan S. Abrams

That’s right, we are still talking NAB. There was a lot to cover!

So, the first appointment I booked for NAB Show 2018, both in terms of my show schedule (10am Monday) and the vendors I was in contact with, was with StorageDNA’s Jeff Krueger, VP of worldwide sales. Weeks later, I found out that StorageDNA was collaborating with Imagine Products on myLTOdna, so I extended my appointment. Doug Hynes, senior director of business development for StorageDNA, and Michelle Maddox, marketing director of Imagine Products, joined me to discuss what they had ready for the show.

The introduction of LTFS at NAB 2010 allowed LTO tape to be accessed as if it were a hard drive. But since LTO tape is linear, executing multiple operations at once and treating it like a hard drive makes performance fall off a cliff. It can also cause the drive to engage in shoeshining, shuttling the tape back and forth over the same section.
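To see why, here is a toy Python model of head travel on a linear medium. It is purely illustrative (no real LTFS implementation works this way), but it shows how interleaving reads from two files multiplies the distance the head must shuttle:

```python
# Toy model: on a linear tape, the head must physically travel to each
# block's position. Reading two files' blocks interleaved forces constant
# back-and-forth travel -- the "shoeshining" described above.
def travel(positions, start=0):
    """Total head travel (arbitrary units) to visit block positions in order."""
    total, here = 0, start
    for p in positions:
        total += abs(p - here)
        here = p
    return total

file_a = list(range(0, 100))     # blocks 0..99
file_b = list(range(100, 200))   # blocks 100..199

# Read each file straight through...
sequential = travel(file_a) + travel(file_b, start=file_a[-1])
# ...versus alternating one block from each file, hard-drive style.
interleaved = travel([b for pair in zip(file_a, file_b) for b in pair])
print(sequential, interleaved)  # interleaved travel is roughly 100x sequential
```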

Imagine Products’ main screen.

Eight years later, these performance and operation issues have been addressed by StorageDNA’s creation of HyperTape, its enhanced Linear File Transfer System, which is part of Imagine Products’ myLTOdna application. My first question was, “Is HyperTape yet another tape format?” Fortunately for me and other users, the answer is “No.”

What is HyperTape? It is a workflow powered by dnaLTFS. The word “enhanced” in the description of HyperTape as an enhanced Linear File Transfer System refers to middleware in the myLTOdna application for Mac OS. Three commands put an LTO drive into read-only, write-only or training mode. Putting the drive into one of these “only” modes allows it to achieve up to 300MB/s of throughput; this is where the “Hyper” in HyperTape comes from. The modes can also be engaged from the command line.

Training mode allows for analyzing the files stored on an LTO tape and then storing that information in a Random Access Database (RAD). The creation of the RAD can be automated using Imagine Products’ PrimeTranscoder. Otherwise, each file on the tape must be opened in order to train myLTOdna and create a RAD.

As for shoeshining, it is avoided by intelligently writing files to LTO tape. This intelligence is proprietary and built into the back end of the software. The result is that you can load a clip into Avid’s Media Composer, Blackmagic’s DaVinci Resolve or Adobe’s Premiere Pro and then load a subclip from that content into your project. You still should not load a clip from tape and just press play. Remember, this is LTO tape you are reading from.

The target customer for myLTOdna is a DIT with camera masters who wants to reduce the time it takes to back up footage. Previously, DITs would transfer the camera card’s contents to a hard drive using an application such as Imagine Products’ ShotPut Pro. Once the footage had been transferred to a hard drive, it could then be transferred to LTO tape. Using myLTOdna in write-only mode allows a DIT to bypass the hard drive and go straight from the camera card to an LTO tape. Because the target customer is already using ShotPut Pro, the UI for myLTOdna was designed to be comfortable and easy to use and understand.
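A rough sketch of the time saved by skipping the intermediate hard drive. The 300MB/s figure comes from StorageDNA's quoted "only mode" throughput; the card size and hard-drive rate are illustrative assumptions:

```python
# Rough comparison of the two workflows described above. The 300MB/s LTO
# figure is from the article; the 1TB card and 250MB/s disk are assumptions.
CARD_GB = 1000   # assumed camera-card payload
LTO_MBS = 300    # myLTOdna "only mode" throughput, per StorageDNA
DISK_MBS = 250   # assumed hard-drive transfer rate

def hours(gigabytes, mb_per_s):
    """Transfer time in hours for a given payload and sustained rate."""
    return gigabytes * 1000 / mb_per_s / 3600

two_step = hours(CARD_GB, DISK_MBS) + hours(CARD_GB, LTO_MBS)  # card->disk->tape
direct = hours(CARD_GB, LTO_MBS)                               # card->tape
print(f"two-step: {two_step:.1f} h, direct: {direct:.1f} h")
```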

The licensing for dnaLTFS is tied to the serial number of an LTO drive. StorageDNA’s Krueger explained, “dnaLTFS is the drive license that works with standalone Mac LTO drives today.” Purchasing a license for dnaLTFS allows the user to later upgrade to StorageDNA’s DNAevolution M Series product if they need automation and scheduling features, without having to purchase another drive license if the same LTO drive is used.

Krueger went on to say, “We will have (dnaLTFS) integrated into our DNAevolution product in the future.” DNAevolution’s cost of entry is $5,000. A single LTO drive license starts at $1,250. Licensing is perpetual, and updates are available without a support contract. myLTOdna, like ShotPut Pro and PrimeTranscoder, is a one-time purchase (perpetual license). It will phone home on first launch. Remote support is available for $250 per year.

I also envision myLTOdna being useful outside of the DIT market. Indeed, this was the thinking when the collaboration between Imagine Products and StorageDNA began. If you do not mind doing manual work and want to keep your costs low, myLTOdna is for you. If you later need automation and can budget for the efficiencies that you get with it, then DNAevolution is what you can upgrade to.


Jonathan S. Abrams is the Chief Technical Engineer at Nutmeg, a creative marketing, production and post resource, located in New York City.


Display maker TVLogic buys portable backup storage company Nexto DI

TVLogic, a designer and manufacturer of LCD and OLED high-definition displays, has acquired Nexto DI, a provider of portable field backup storage for digital cameras. Both companies are based in South Korea.

Nexto DI products use the company’s patented “X-copy” technology, while its M-copy feature (copying to multiple drives simultaneously), according to the company, guarantees 100% data safety, even in worst-case circumstances.

We reached out to TVLogic’s Denny An to find out more…

Why did it make sense for TVLogic to acquire Nexto DI?
TVLogic develops and manufactures broadcast and pro monitors that work in concert with other equipment. Because we compete on a global scale with large organizations that supply other products in addition to monitors, such as cameras, switchers and more, we realized we had to extend our offerings to better serve our customers and stay competitive. After a thorough search for companies that provide complementary products, we found the perfect technology partner in Nexto DI.

We will continue our efforts to become a comprehensive broadcast and professional equipment company by searching for products and companies that can create synergy with our monitor technology.

How do you feel this fits in with what you already provide for the industry?
TVLogic has over 90 distributors and service networks around the world that can now also promote, sell and provide the same great quality service for the Nexto DI product line. Although data backup is the main feature of the Nexto DI products, they also support image and video preview features. We’re confident that the combined technologies of TVLogic and Nexto DI will result in new monitor products with built-in recording features in the near future.


HPA Tech Retreat: The production budget vs. project struggle

“Executive producers often don’t speak tech language,” said Aaron Semmel, CEO and head of BoomBoomBooya, in addressing the HPA Tech Retreat audience in Palm Springs in late February. “When people come to us with requests and spout all sorts of tech mumbo jumbo, it’s very easy for us to say no,” he continued. “Trust me, you need to speak to us in our language.”

Semmel was part of a four-person HPA panel that included Cirina Catania, The Catania Group; Larry O’Connor, OWC Digital; and Jeff Stansfield, Advantage Video Systems. Moderated by Andy Marken of Marken Communications, the panel explored solutions that can bring the executive and line producers and the production/post teams closer together to implement the right solutions for every project and satisfy everyone, including accounting.

An executive and co-producer on more than a dozen film and TV series projects, Semmel said his job is to bring together the money and then work with the best creative people possible. He added that the team’s job was to make certain the below-the-line items — actual production and post production elements — stay on or below budget.

Semmel noted that executive producers often work off the top sheet of the budget, which is typically an overview. He explained that executive producers may go through the full budget and play with numbers here and there, but leave the actual handling of the budget to the line producer and supervising producer. In this way, they can “back into” a budget number set by the executive producer.

“I understand the technologies at a higher level and could probably take a highlighter and mark budget areas where we could reduce our costs, but I also know I have very experienced people on the team who know the technologies better than I do to make effective cuts.

L-R: Jeff Stansfield, Aaron Semmel, Cirina Catania

“For example, in talking with many of you in the audience here at the Retreat, I learned that there’s no such thing as an SSD hard drive,” he said. “I now know there are SSDs and there are hard drives and they’re totally different.”

Leaning into her mic, Catania got a laugh when she said, “One of the first things we all have to do is bring our production workflows into the 21st century. But seriously, the production and post teams are occasionally not consulted during the lengthy budgeting process. Our keys can make some valuable contributions if they have a seat at the table during the initial stages. In terms of technology, we have some exciting new tools we’d like to put to work on the project that could save you valuable time, help you organize your media and metadata, and have a direct and immediate positive impact on the budget. What if I told you that you could save endless hours in post if you had software that helped your team enter metadata and prep for post during the early phase, and hardware that worked much faster, more securely and more reliably?”

With wide agreement from the audience, Catania emphasized that it is imperative for all departments involved in prep/production/post and distribution to be involved in the budget process from the outset.

“We know the biggest part of your budget might be above-the-line costs,” she continued. “But production, post and distribution are where much of the critical work also gets done. And if we’re involved at the outset, and that includes people like Jeff (Stansfield), who can help us come up with creative workflow and financing options that will save you and the investors money, we will surely turn a profit.”

Semmel said the production/post team could probably be of assistance in the early budget stages to pinpoint where work could be done more efficiently to actually improve the overall quality and ensure EPs do what they need to do for their reputation… deliver the best and be under budget.

The Hatfields and the McCoys via History Channel

“But for some items, there seem to be real constraints,” he emphasized. “For example, we were shooting America’s Feud: Hatfields & McCoys, a historical documentary, in Romania — yes, Romania,” he grinned, “and we were behind schedule. We shot the farmhouse attack on day one, shot the burning of the house on day two, and on day three we received our dailies to review for day one’s work. We were certain we had everything we needed, so we took a calculated risk and burned the building,” he recalled. “But no one exhaled until we had a chance to go through the dailies.”

“What if I told you there’s a solution that will transfer your data at 2800MB/s and enable you to turn around your dailies in a couple of hours instead of a couple of days?” O’Connor asked.

Semmel replied, “I don’t understand the 2800MB/s stuff, but you clearly got my attention by saying dailies in a couple of hours instead of days. If there had been anything wrong with the content we had shot, we would have been faced with the huge added expense of rebuilding and reshooting everything,” he added. “Even accounting can understand the savings in hours vs. days.”
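O'Connor's hours-versus-days claim is easy to sanity-check. The 2800MB/s rate is his figure; the daily footage volume and the slower comparison rate are our assumptions:

```python
# How "dailies in a couple of hours instead of days" pencils out.
# 2800MB/s is the quoted rate; 8TB/day of camera originals and the
# 150MB/s single-disk comparison are illustrative assumptions.
FOOTAGE_TB = 8
FAST_MBS = 2800   # quoted transfer rate
SLOW_MBS = 150    # assumed single spinning-disk rate, for contrast

def transfer_hours(tb, mb_per_s):
    """Hours to move `tb` terabytes at a sustained rate in MB/s."""
    return tb * 1_000_000 / mb_per_s / 3600

print(f"fast: {transfer_hours(FOOTAGE_TB, FAST_MBS):.1f} h")  # under an hour
print(f"slow: {transfer_hours(FOOTAGE_TB, SLOW_MBS):.1f} h")  # most of a day
```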

Semmel pointed out that because films and TV shows start and end digital, there’s always a concern about frames and segments being lost when you’re on location and a long distance from the safety net of your production facilities.

“No one likes that risk, including production/post leaders, integrators or manufacturers,” said O’Connor. “In fact, a lot of crews go to extraordinary lengths to ensure nothing is lost; and frankly, I don’t blame them.”

He recalled a film crew going to Haiti to shoot a documentary that was told by the airline they were over their limit on baggage for the trip.

“They put their clothes in an airport locker and put their three RAID storage systems in their backpacks. They wanted to make certain they could store, back up and back up their work again, to ensure they had all of the content they needed when they got back to their production/post facility.”

Stansfield and Catania said they had seen and heard of similar gut-level decisions made by executive and line producers. They encouraged the production/post audience not to simply accept the line-item budgets they are given, but to be more involved at the beginning of the project to explore and define all of the below-the-line budget, minimize risk and provide alternative plans in case unexpected challenges arise.

“An EP and line producer’s mantra for TV and film projects is you only get two out of three things: time, money and quality,” Semmel said. “If you can deliver all three, then we’ll listen, but you have to approach it from our perspective.

“Our budgets aren’t open purses,” he continued. “You have to make recommendations and deliver products and solutions that enable us to stay under budget, because no matter how neat they are or how gee-whiz technical they are, if they don’t, they aren’t going to be accepted. We have two very fickle masters — finance and viewer — so you have to give us the tools and solutions that satisfy both of them. Don’t give us bits, bytes and specs, just focus on meeting our needs in words we can understand.

“When you do that, we all win; and we can all work on the next project together,” Semmel concluded. “We only surround ourselves with people who will help us through the project. People who deliver.”

AJA intros new 2TB Pak 2000 SSD recording media

AJA has expanded its line of Pak SSD media with the new 2TB Pak 2000 for Ki Pro Ultra and Ki Pro Ultra Plus recording and playback systems. The company also announced new ordering options for the entire Pak drive family, including HFS+ formatting for Mac OS users and exFAT for PC and universal use.

“With productions embracing high-resolution, high-frame-rate and multi-cam workflows, media storage is a key concern. Pak 2000 introduces a high-capacity recording option at a lower cost per GB,” says AJA president Nick Rashby. “Our new HFS+ and exFAT options give customers greater flexibility, with formatting upon ordering that fits their workflow demands.”

Pak 2000 offers the longer recording capacity required for documentaries, news, sports programming and live events, making it suitable for multi-camera HD workflows with the Ki Pro Ultra Plus’s multi-channel HD recording capabilities. The high-capacity drive can hold more than four hours of 4K/UltraHD ProRes (HQ), three hours of ProRes 4444 at 30p, and up to two hours of ProRes (HQ) or 90 minutes of ProRes 4444 at 60p.

Users can get double that length with two Pak drives and rollover support in Ki Pro Ultra and Ki Pro Ultra Plus.
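For a rough feel of where such figures come from, here is the raw-bitrate math using approximate UHD target rates from Apple's ProRes documentation. Note that estimates like these come out somewhat higher than AJA's more conservative real-world numbers above, since they ignore filesystem and formatting overhead:

```python
# Approximate recording time for a 2TB Pak at typical Apple ProRes UHD
# target rates (megabits per second; treat all figures as estimates).
CAPACITY_TB = 2.0

rates_mbps = {
    "ProRes 422 HQ @ 30p": 707,
    "ProRes 4444 @ 30p": 1061,
    "ProRes 422 HQ @ 60p": 1414,
    "ProRes 4444 @ 60p": 2122,
}

for name, mbps in rates_mbps.items():
    hours = CAPACITY_TB * 8e6 / mbps / 3600  # TB -> megabits, then seconds
    print(f"{name}: {hours:.1f} h")
```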

The Pak 2000 and all Pak SSD media modules are now available to order in the following formats and at the following prices:
– Pak 2000-R0 2TB HFS+: $1,795
– Pak 2000-X0 2TB exFAT: $1,795
– Pak 1000-R0 1TB HFS+: $1,495
– Pak 1000-X0 1TB exFAT: $1,495
– Pak 512-R1 512GB HFS+: $995
– Pak 512-X1 512GB exFAT: $995
– Pak 256-R1 256GB HFS+: $495
– Pak 256-X1 256GB exFAT: $495

(For HFS+, order R models; for exFAT, order X models.)