

Quick Chat: SGO CEO Miguel Angel Doncel

By Randi Altman

When I first happened upon Spanish company SGO, they were giving demos of their Mistika system on a small stand in the back of the post production hall at IBC. That was about eight years ago. Since then, the company has grown its Mistika DI finishing system, added a new product called Mamba FX, and brought them both to the US and beyond.

With NAB fast approaching, I thought I would check in with SGO CEO Miguel Angel Doncel to find out how the company began, where they are now and where they are going. I also checked in about some industry trends.

Can you talk about the genesis of your company and the Mistika product?
SGO was born out of a technically oriented mentality to find the best ways to use open architectures and systems to improve media content creation processes. That is not a challenging concept today, but it was an innovative view in 1993 when most of the equipment used in the industry was proprietary hardware. The idea of using computers to replace proprietary solutions was the reason SGO was founded.

It seems you guys were ahead of the curve in terms of one product that could do many things. Was that your goal from the outset?
Ten years ago, most manufacturers approached the industry with a set of different solutions addressing different parts of the workflow. That gave us an opportunity to capitalize on improving the workflow, since disjointed solutions force an inefficient, strictly sequential pipeline.

We always thought that by improving the workflow, our technology would be able to play in all those arenas without having to change tools. Making the workflow parallel saves time when a problem is detected: instead of going backwards in the pipeline, we can focus on moving forward.

I think that after so many years the industry is saying we were right, and everyone is going in that direction.

How is SGO addressing HDR?
We are excited about HDR, as it really improves the visual experience, but at the same time it is a big challenge to define a workflow that can work in both HDR and SDR in a smooth way. Our solution to that challenge is the four-dimensional grading that is implemented with our 4th ball. This allows the colorist to work not only in the three traditional dimensions — R, G and B — but also to work in the highlights as a parallel dimension.
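As a rough illustration of the idea (a hypothetical sketch, not Mistika's actual implementation), a fourth grading dimension can be modeled as a separate gain applied only above a highlight pivot, so HDR highlights are trimmed without disturbing the SDR portion of the image:

```python
# Hypothetical sketch of grading with a fourth "highlights" dimension:
# a standard RGB gain plus a separate gain applied only above a pivot,
# so HDR highlights can be trimmed without touching the SDR range.
def grade(pixel, rgb_gain=(1.0, 1.0, 1.0), highlight_gain=1.0, pivot=0.8):
    """pixel: (r, g, b) in linear light; values may exceed 1.0 for HDR."""
    graded = []
    for value, gain in zip(pixel, rgb_gain):
        v = value * gain                  # ordinary three-ball grade
        if v > pivot:                     # fourth dimension: highlights only
            v = pivot + (v - pivot) * highlight_gain
        graded.append(v)
    return tuple(graded)
```

Here `grade((0.5, 0.5, 1.2), highlight_gain=0.5)` pulls a hot blue channel down from 1.2 to 1.0 while leaving the midtone channels untouched.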

What about VR?
VR pieces together all the requirements of the most demanding 3D with the requirements of 360. Considering what SGO already offers in stereo 3D production, we feel we are well positioned to provide a 360/VR solution. For that reason, we want to introduce a specific workflow for VR that helps customers to work on VR projects, addressing the most difficult requirements, such as discontinuities in the poles, or dealing with shapes.

The new VR mode we are preparing for Mistika 8.7 will be much more than a VR visualization tool. It will allow users to work in VR environments the same way they would work in a normal production, without having to worry about circles ending up as highly distorted ellipses and so forth.
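The pole problem he describes follows directly from the equirectangular (lat-long) projection most 360 footage uses: every row of the image spans the full 360 degrees of longitude, so features are stretched horizontally by 1/cos(latitude), and a circle painted near a pole reads as a wide ellipse. A small illustrative sketch of that stretch factor:

```python
import math

def horizontal_stretch(latitude_deg):
    """Horizontal stretch factor of an equirectangular (lat-long) projection.

    Each image row covers 360 degrees of longitude regardless of latitude,
    so features are stretched by 1/cos(latitude); a circle near the pole
    therefore appears as a wide ellipse.
    """
    return 1.0 / math.cos(math.radians(latitude_deg))
```

At the equator the factor is 1.0; at 60 degrees of latitude a circle must be painted twice as wide to look round after projection.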

What do you see as the most important trends happening in post and production currently?
The industry is evolving in many different directions at the moment — 8K realtime, 4K/UHD, HDR, HFR, dual-stream stereo/VR. These innovations improve and enhance the audience’s experience in many different ways. They are all interesting individually, but the most vital aspect for us is that they all have something in common: they all require a very smart way of dealing with increasing bandwidth. We believe that a variety of content will use different types of innovation relevant to the genre.

Where do you see things moving in the future?
I personally envision a lot more UHD, HDR and VR material in the near future. The technology is evolving in a direction that can really make the entertainment experience very special for audiences, leaving a lot of room to still evolve. An example is the Quantum Break game from Remedy Studios/Microsoft, where the actual users’ experience is part of the story. This is where things are headed.

I think the immersive aspect is the challenge and goal. The reason why we all exist in this industry is to make people enjoy what they see, and all these tools and formulas combined together form a great foundation on which to build realistic experiences.

Digging Deeper: NASA TV UHD executive producer Joel Marsden

It’s hard to deny the beauty of images of Earth captured from outer space. And NASA and partner Harmonic agree, boldly going where no one has gone before — creating NASA TV UHD, the first non-commercial consumer UHD channel in North America. Leveraging the resolution of ultra high definition, the channel gives viewers a front row seat to some gorgeous views captured from the International Space Station (ISS), other current NASA missions and remastered historical footage.

We recently reached out to Joel Marsden, executive producer of NASA TV UHD, to find out how this exciting new endeavor reached “liftoff.”

Joel Marsden


This was obviously a huge undertaking. How did you get started and how is the channel set up?
The new channel was launched with programming created from raw video footage and imagery supplied by NASA. Since that time, Harmonic has also shot and contributed 4K footage, including video of recent rocket launches. They provide the end-to-end UHD video delivery system and post production services while managing operations. It’s all hosted at a NASA facility managed by Encompass Digital Media in Atlanta, which is home to the agency’s satellite and NASA TV hubs.

Like the current NASA TV channels, and on the same transponder, NASA TV UHD is transmitted via the SES AMC-18C satellite, in the clear, with a North American footprint. The channel is delivered at 13.5Mbps, as compared with many of the UHD demo channels in the industry, which have required between 50 and 100 Mbps. NASA’s ability to minimize bandwidth use is based on a combination of encoding technology from Harmonic in conjunction with the next-generation H.265 HEVC compression algorithm.
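Those bitrates translate into a big difference in delivered data. A quick back-of-envelope calculation, using the figures quoted above:

```python
def gigabytes_per_hour(megabits_per_second):
    """Data volume for one hour of streaming at a given bitrate."""
    bits = megabits_per_second * 1e6 * 3600   # bits in one hour
    return bits / 8 / 1e9                     # -> gigabytes

# NASA TV UHD at 13.5 Mbps vs. a 50 Mbps UHD demo channel:
# 13.5 Mbps -> ~6.1 GB/hour; 50 Mbps -> 22.5 GB/hour
```

At the low end of the 50-100 Mbps demo-channel range, the HEVC-based channel delivers roughly a quarter of the data for the same hour of UHD programming.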

Can you talk about how the footage was captured and how it got to you for post?
When the National Aeronautics and Space Act of 1958 was created, one of the legal requirements of NASA was to keep the public apprised of its work in the most efficient means possible and with the ultimate goal of bringing everyone on Earth as close as possible to being in space. Over the years, NASA has used imagery as the primary means of demonstration. The group in charge of these efforts, the NASA Imagery Experts Program, provides the public with a wide array of digital television, web video and still images based on the agency’s activities. Today, NASA’s broadcast offerings via NASA TV include an HD consumer channel, an HD media channel and an SD education channel.

In 2015, the agency introduced NASA TV UHD. Naturally, NASA archives provide remastered footage from historical missions and shots from NASA’s development and training processes, all of which are used for production of broadcast programming. In fact, before the agency launched NASA TV, it had already begun production of its own documentary series, based on footage collected during missions.

Just five or six years ago, NASA also began documenting major events in 4K resolution or higher. The agency has been using 6K Red Dragon digital cinema cameras for some time. NASA TV UHD video content is sourced from high-resolution images and video generated on the ISS, Hubble Space Telescope and other current NASA missions. The raw content files are then sent to Harmonic for post.

Can you walk us through the workflow?
Raw video files are mailed on physical discs or sent via FTP from a variety of NASA facilities to Harmonic’s post studio in San Jose and stored on the Harmonic MediaGrid system, which supports an edit-in-place workflow with Final Cut Pro and other third-party editing tools.

During the content processing phase, Harmonic uses Adobe After Effects to paint out dead pixels that result from the impact of cosmic radiation on camera sensors. They have built bad-pixel maps that they use in post production to remove the distracting white dots from the picture. The detail of UHD means that the footage also shows scratches on the windows of the ISS through which the camera is shooting, but these are left in for authenticity.
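As a minimal stand-in for that bad-pixel-map cleanup (the actual work is done in After Effects; this sketch only illustrates the idea), each mapped pixel can be replaced with the average of its valid neighbors:

```python
def paint_out_bad_pixels(frame, bad_pixels):
    """Replace known-bad sensor positions with the mean of their in-bounds
    good neighbors -- a toy stand-in for bad-pixel-map cleanup.

    frame: 2D list of luminance values; bad_pixels: set of (row, col).
    Assumes every bad pixel has at least one good neighbor.
    """
    height, width = len(frame), len(frame[0])
    fixed = [row[:] for row in frame]
    for r, c in bad_pixels:
        neighbors = [
            frame[r + dr][c + dc]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)
            and 0 <= r + dr < height and 0 <= c + dc < width
            and (r + dr, c + dc) not in bad_pixels
        ]
        fixed[r][c] = sum(neighbors) / len(neighbors)
    return fixed
```

Production tools use more sophisticated spatial and temporal interpolation, but the principle — a static map of sensor defects applied to every frame — is the same.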


Blackmagic’s DaVinci Resolve is used to color grade the footage, and Maxon’s Cinema 4D Studio is used to create animations from images. Final Cut Pro X and Adobe Creative Suite are used to set the video to music and add text and graphics, along with the programming name, logo and branding.

Final programs are then transferred in HD back to the NASA teams for review, and in UHD to the Harmonic team in Atlanta to be loaded onto the Spectrum X for playout.

————

You can check out NASA TV’s offerings here.


Atomos brings 4K HDR monitors/recorders on set

Atomos, maker of the Shogun and Ninja on-set systems, has introduced new products targeting 4K HDR and offering brightness and detail simultaneously: the field monitor/recorders Shogun Flame and Ninja Flame.

The Atomos Flame Series features a calibrated 7-inch field monitor that displays 10 stops of LOG luminance detail with 10-bit HDR color accuracy for post production. The AtomHDR engine resolves HDR brightness detail (dynamic range) with 10-bit color accuracy, delivering 64 times more color information than traditional 8-bit panels. For Rec 709 standard dynamic range scenes, the 1500-nit brightness aids outdoor shooting, as does the upgraded continuous power management system that allows users to shoot longer in the field. The Flame Series, a cost-effective answer to the growing demand for HDR image capture and on-set viewing, also features pro 4K/HD Apple ProRes/DNxHR recording, playback and editing.
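The "64 times more color information" figure is straightforward arithmetic: 10-bit panels offer four times as many levels per channel as 8-bit ones, and that advantage compounds across the three channels:

```python
# 10-bit vs. 8-bit RGB: 4x the levels per channel, cubed across R, G, B.
levels_8bit = 2 ** 8          # 256 levels per channel
levels_10bit = 2 ** 10        # 1,024 levels per channel
ratio = (levels_10bit ** 3) / (levels_8bit ** 3)
print(ratio)                  # 64.0
```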

We threw a few questions at Atomos president Matt Cohen (who many of you know from Tekserve) regarding the new gear…

What’s the most important thing people should know about Atomos Flame Series?
Atomos Flame Series products empower realtime visualization of HDR on set and in post, using LOG from the camera and our 10-bit AtomHDR processing. We explain it here.

Are we really looking at 10 different stops? That’s a ton of bandwidth. How can you accomplish that?
It’s magic! Well, not really… it’s math. LOG is the trick; it basically takes the range and squishes it. That’s why a LOG signal looks all washed out: it has transformed the brightest parts, and likewise the darkest areas, into something representable with current technology. Basically, it makes the changes in the image much more subtle, so they can then be interpreted back into the true levels they represent. We do that in realtime with AtomHDR.
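A generic log curve (purely illustrative — not Atomos’ or any camera vendor’s actual transfer function) shows the “squishing” he describes: each stop of exposure gets an equal slice of the signal range, which is why midtones sit high and the picture looks washed out until it is decoded back to linear:

```python
import math

def log_encode(linear, max_stops=10):
    """Map linear light in [1/2**max_stops, 1] onto [0, 1] logarithmically,
    so each stop of exposure gets an equal slice of the signal range.
    A generic illustration, not any specific camera's LOG curve."""
    return 1.0 + math.log2(linear) / max_stops

def log_decode(encoded, max_stops=10):
    """Inverse: recover linear light from the log-encoded value."""
    return 2.0 ** ((encoded - 1.0) * max_stops)
```

Linear 0.5 — only one stop below peak — encodes to 0.9, near the top of the range, which is the washed-out look; decoding inverts the curve and recovers the true levels, which is the kind of mapping AtomHDR-style processing performs in realtime.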

The Series works with all cameras, as long as they have a live video tap?
Our products have always been compatible with cameras that have a clean output, meaning there is no menu data overlaid or degradation to the output. We get the pristine image from the sensor before any of the compression or degradation that occurs recording internally to the camera.

In how many different ways can this be used on set? Focus, Composition, Color, etc?
All the Atomos monitoring tools are still available. We have very accurate scopes and focus tools, so you can use this on set for all the aspects you describe. There are calibrated monitoring, focus and exposure tools, including waveform, vectorscope and RGB parade. We have graticule support and even support for de-squeezing anamorphic footage. We also feel confident the Flame tools will be very valuable in post production, enabling most systems to work with HDR and UWG (Ultra Wide Gamut) content.

Here are some key features of the Flame Series:
– AtomHDR monitors, which offer a dynamic range to match that of 10-bit camera LOG footage, provide the detail in highlights and shadows usually clipped on traditional monitors.

– Even in non-HDR scenarios, it is an advanced field monitor, with 1500 nits of brightness for outdoor shooting, native full HD resolution and optional calibration so that natural LCD color drift can be corrected over time.

– Users can record directly from the sensor in 4K UHD (up to 30p) or record high-frame-rate HD (up to 120p).

– Along with the high pixel density of 4K, the Ninja and Shogun Flame also record higher-precision 10-bit color information with efficient 4:2:2 color encoding.

– Recording to the visually lossless, edit-ready Apple ProRes and Avid DNxHR codecs allows users to capture full individual frames, like film, giving more flexibility and creativity in post.

– The Series features armor protection, a hot-swappable dual-battery continuous power system and included accessories, such as a new fast charger and a snap-fast sun hood.

– Atomos’ hot-swappable dual battery system for continuous power is backed up with the included power accessories (2 x 4-cell batteries, D-Tap adaptor and fast battery charger).

– There are focus and exposure tools, 3D custom Looks, waveforms (LUMA and RGB) and vectorscopes.

– XLR audio is available via breakout cable on the Shogun Flame, and 3.5mm line-level input with audio delay, level adjustment and dedicated audio meters with channel selection on the Ninja Flame.

– The Flame Series supports affordable, readily available SSDs.

Shogun Flame and Ninja Flame will be available for sale on March 28.


A glimpse at what Sony has in store for NAB

By Fergus Burnett

I visited Sony HQ in Manhattan for their pre-NAB Show press conference recently. In a board room with tiny muffins, mini bagels and a great view of New York, we sat pleasantly for a few hours to learn about the direction the company is taking in 2016.

Sony announced details for a slew of 4K- and HDR-capable broadcast cameras and workflow systems, all backward compatible with standard HD to ease the professional and consumer transition to Ultra HD.

As well as broadcast and motion picture, Sony’s Pro division has a finger in the corporate, healthcare, education and faith markets. They have been steadily pushing their new products and systems into universities, private companies, hospitals and every other kind of institution. Last year, they helped to fit out the very first 4K church.

I work as a DIT/dailies technician in the motion picture industry rather than broadcast, so many of these product announcements were outside my sphere of professional interest, but it was fascinating to gain an understanding of the immense scale and variety of markets that Sony is working in.

There were only a handful of new additions to the CineAlta (pictured) line: firmware updates for the F5 and F55 and a new 4K recording module. These two cameras have really endured in popularity since their introduction in 2012.

The new AXS-R7 recording module (right) offers a few improvements over its predecessor the AXS-R5. It’s capable of full 4K up to 120fps and has a nifty 30-second cache capability, which is going to be really useful for shooting water droplets in slow motion. The AXS-R7 uses a new kind of high-speed media card that looks like a slightly smaller SxS — it’s called AXSM-S48. Sony is really on fire with these names!

A common and unfortunate problem when I am dealing with on-set dailies is sketchy card readers. This is something that ALL motion picture camera companies are guilty of producing. USB 3.0 is just not fast enough when copying huge chunks of critical camera data to multiple drives, and I’ve found the power connector on the current AXS card reader to be touchy on separate occasions with different readers, causing the card to eject in the midst of offloading. Though there are no details yet, I was assured that the AXSM-S48 reader would use a faster connection than USB 3.0. I certainly hope so; it’s a weak point in what is otherwise a fairly trouble-free camera ecosystem.

Looming at the top of the CineAlta lineup, the F65 is still Sony’s flagship camera for cinema production. Its specs were outrageous four years ago and still are, but it never became a common sight on film sets. The 8K resolution was mostly unnecessary even for top-tier productions. I inquired where Sony saw the F65 sitting among its competition, from Arri and Red, as well as their own F55 which has become a staple of TV drama.

Sony sees the F65 as their true cinema camera, ideally suited for projection on large screens. They admitted that while uptake of the camera was slow after its introduction, rentals have been increasing as more DPs gain experience with the camera, enjoying its low-light capabilities, color gamut and sheer physical bulk.

Sony manufactures a gigantic fleet of sensible, soberly named cameras for every conceivable purpose. They are very capable production tools, but it’s only a small part of Sony’s overall strategy.

With 4K HDR delivery fast becoming standard and expected, we are headed for a future world where pictures are more appealing than reality. From production to consumption, Sony could well be set to dominate that world. We already watch Sony-produced movies shot on Sony cameras playing on Sony screens, and we listen to Sony musicians on Sony stereos as we make our way to worship the God of sound and vision in a 4K church.

Enjoy NAB everyone!


Sony gives Rita Hayworth, Gene Kelly a 4K make-over in ‘Cover Girl’

Sony Pictures Entertainment has completed an all-new 4K restoration of Cover Girl, director Charles Vidor’s 1944 Technicolor musical starring Rita Hayworth and Gene Kelly. The restoration, completed under the supervision of Sony’s Grover Crisp, premiered at New York’s Museum of Modern Art during Preserve and Project, its 13th international festival of film preservation.

Cover Girl was Columbia Pictures’ first big film shot in the Technicolor three-strip process. For the new 4K restoration, the team went back to the original 3-strip nitrate camera negatives.

“There was a preservation initiative with this film in the 1990s that involved making some positive intermediate elements for video transfer, but our current process dictates that we source the most original materials possible to come up with the best visual result for our 4K workflow,” recalls Crisp, who is EVP of asset management, film restoration and digital mastering at Sony Pictures. “The technical capabilities that we have now allow us to digitally recombine the three separate black and white negatives to create a color image that is virtually free of the fuzzy registration issues inherent in the traditional analog work, in addition to the usual removal of scratches and other physical flaws in the film.”
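Conceptually, the digital recombination Crisp describes is simple once the three records are registered. A toy sketch (illustrative only, and far from the real pipeline) of stacking the three black-and-white records into color pixels:

```python
def combine_three_strip(red_record, green_record, blue_record):
    """Digitally recombine scans of the three black-and-white Technicolor
    records into a single color frame.

    Each record is a 2D list of grayscale values; the records must already
    be registered (aligned) -- the step the analog optical process could
    only approximate.
    """
    return [
        [(r, g, b) for r, g, b in zip(red_row, green_row, blue_row)]
        for red_row, green_row, blue_row
        in zip(red_record, green_record, blue_record)
    ]
```

The hard part is the registration and cleanup; the stacking itself is trivial once each record is aligned.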

Crisp says they tried to stay as true to the Technicolor look as possible. “That specific kind of look is impossible to match exactly as it was in the original work from the 1940s and 1950s for a variety of reasons. With original sources for reference, however, it gives us a good target to aim for.”

The greater color range facilitated the recreation of a Technicolor look that is as authentic as possible, especially where original dye transfer prints were available as reference points.

In terms of challenges, Crisp says that aside from the usual number of torn frames, scratches and dirt embedded in the emulsion of the film, there is always the issue of color breathing when working with three-strip Technicolor films. “It is an inconsistent problem and can be very difficult to address,” he explains. “Kevin Manbeck at MTI Film has developed algorithms to compensate and correct for this problem, and that is a big advancement.”

The film was scanned at Cineric in New York City on their proprietary 4K wetgate scanner.

“Working with our colorist, Sheri Eisenberg, we strived to get the colors, with deep blacks and vibrant reds, right.”

She called on the FilmLight Baselight 8 for the color at Deluxe (formerly ColorWorks) in Culver City. “It is a very robust color correction system, and one that we have used for years on our work,” says Crisp. “The lion’s share of the image restoration was done at L’Immagine Ritrovata, a film restoration and conservation facility in Bologna, Italy.  They use a variety of software for image cleanup, though much of this kind of work is manual. This means a lot of individuals sitting at digital workstations working on one frame at a time.  At MTI Film, here in Los Angeles, some of the final image restoration was completed, mostly for the removal of gate hairs in numerous shots, something that is very difficult to achieve without leaving digital artifacts.”