Tag Archives: UHD

The Path's post path to UHD

By Randi Altman

On a recent visit to the Universal Studios lot in Los Angeles, I had the pleasure of meeting the post team behind Hulu's The Path, which stars Aaron Paul and Hugh Dancy. The show is about a cult — or as its members refer to it, a movement — that on the outside looks like do-gooders preaching peace and love, but on the inside has some freaky goings-on.

The first time I watched The Path, I was taken with how gorgeous the picture looked, and when I heard the show was posted and delivered in UHD, I understood why.

“At the time we began to prep season one — including the pilot — Hulu had decided they would like all of their original content shows to deliver in UHD,” explains The Path producer Devin Rich. “They were in the process of upgrading their streaming service to that format so the viewers at home, who had the capability, could view this show in its highest possible quality.”

For Rich (Parenthood, American Odyssey, Deception, Ironside), the difference that UHD made to the picture was significant. “There is a noticeable difference,” he says. “For lack of better words, the look is more crisp and the colors pop. There, of course, is a larger amount of information in a UHD file, which gives us a wider range to make it look how we want it to, or at least closer to how we want it to look.”

L-R: Tauzhan Kaiser, Craig Budrick (both standing), Jacqueline LeFranc and Joe Ralston.

While he acknowledges that as a storyteller UHD “doesn’t make much of a difference” because scripts won’t change, his personal opinion is that “most viewers like to feel as if they are living within the scene rather than being a third-party to the scene.” UHD helps get them there, as does the team at NBCUniversal StudioPost, which consists of editor Jacqueline LeFranc, who focuses on the finishing, adding titles, dropping in the visual effects and making the final file; colorist Craig Budrick; lead digital technical operations specialist Joe Ralston, who focuses on workflow; and post production manager Tauzhan Kaiser.

They were all kind enough to talk to us about The Path’s path to UHD.

Have you done a UHD workflow on any other shows?
Ralston: We have a lot of shows that shoot UHD or high resolution, but The Path was our first television show that finished UHD all the way through.

What is it shot on?
Ralston: They shoot Red at 3840×2160, and they also shoot 4800×2700, so almost 5K. UHD is technically twice the height and twice the width of HD, so while it's still 16×9, it's double the resolution in each dimension, which works out to four times the pixels.
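For a quick sense of scale, the pixel math behind those numbers looks like this (my own back-of-the-envelope figures, not something from the production):

```python
# Pixel counts for HD, UHD and the show's near-5K capture format.
hd = 1920 * 1080        # 2,073,600 pixels
uhd = 3840 * 2160       # 8,294,400 pixels
near_5k = 4800 * 2700   # 12,960,000 pixels

print(uhd / hd)         # 4.0   -> UHD carries four times the pixels of HD
print(near_5k / uhd)    # ~1.56 -> extra room to reframe before the UHD finish
```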

From an infrastructure perspective, were you guys prepared to deal with all that data?
Ralston: Yes. At the facility here at NBCUniversal StudioPost, not only do we do TV work, but there’s remastering work — all the centennial titles, for example.

Kaiser: We we’ve done Spartacus. All Quiet on the Western Front, The Birds, Buck Privates, Dracula (1931), Frankenstein, Out of Africa, Pillow Talk, The Sting, To Kill a Mockingbird, Touch of Evil, Double Indemnity, Holiday Inn and King of Jazz.

Ralston: The infrastructure, as far as storage and monitoring, was already in place here. We knew that this was coming, so the facility has slowly been preparing and gearing up for it. We had been ready, but this was really the first show that requested end-to-end UHD. Usually, we do a show that's shot UHD or 5K but finishes in HD, so when we leave the editorial room, we're in an HD world. In this case, we were not.

LeFranc: Joe's group, which is digital tech ops, doesn't really exist in other places that I know of. They develop, train and work with everybody else in the facility to develop these kinds of workflows in order to get ahead of it. So we are prepared, adequately trained and aware of all the pitfalls and any other concerns there might be. That's a great thing for us, because it's knowledge.

Other shows have gone UHD, but some did it in season two and were playing catch-up in terms of workflow.
Ralston: We'd been thinking about it for a long time. Like I said, the difference with this show, versus some of the others that do it, is that everyone else went to HD when they got to color. This one stayed UHD all the way through from color on out.

So, that was really the big difference for a show like this. The big challenges for this one were — and Jacqueline can go into it a little bit more — when you get into things like titling or creating electronic titles, there’s not a lot of gear out there that does that.

Jacqueline, can you elaborate on that?
LeFranc: There were obstacles that I encountered when trying to establish the initial workflow. So, for example, the character generator that is used to create the titles has an option for outputting 4K, but after testing it I realized it wasn’t 4K. It looked like it was just up-rezed.

So I came up with a workflow where, in the character generator, we would make the title larger than we needed it to be and then size it down in Flame. Then we needed a new UHD monitor, the Sony BVM-X300. The old broadcast monitor didn't work anymore, because if you want to see UHD in RGB, you need a quad-link connection.
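The quad-link requirement follows from simple bandwidth math. A rough active-picture calculation (my own figures, ignoring blanking and SDI framing overhead, and assuming a 24fps signal):

```python
# Approximate picture payload for UHD RGB 10-bit versus a single 3G-SDI link.
width, height = 3840, 2160
bits_per_pixel = 3 * 10   # RGB 4:4:4, 10 bits per channel
fps = 24                  # assumed frame rate

payload_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"~{payload_gbps:.1f} Gb/s of picture data")  # ~6.0 Gb/s
# A single 3G-SDI link tops out around 3 Gb/s, so UHD RGB needs
# multiple links -- hence the quad-link feed into the BVM-X300.
```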

Craig, did your color process change at all?
Budrick: No, there wasn’t really any change for me in color. The creative process is still the creative process. The color corrector supports a higher resolution file, so it wasn’t an issue of needing new equipment or anything like that.

What systems do you use?
Budrick: We are primarily an Autodesk facility, so we use Flame, Flame Premium and Lustre for color. We also have Avids.

Can you walk us through the workflow?
Ralston: We don't do the dailies on this project here. It's all done in New York at Bling. We receive all the camera master files. While they do use drones and a couple of other cameras, a large percentage of the show is shot on the Red Epic Dragon at 3840×2160.

We get all those camera master files and load them onto our server. Then we receive an Avid bin or sequence from the client and bring that into our Avid in here and we link to those camera master files on the SAN. Once they’re linked, we then have a high-res timeline we can play through. We take the low-res offline version that they gave us and we split it — our editor goes through it and makes sure that everything’s there and matched.

Once that part is complete, we transcode that out to the Avid codec DNxHR 444, which is basically a 440Mb/s UHD file coming out of the Avid. Once we have that UHD file out of the Avid, we flip the UHD DNxHR MXF file into a 3840×2160 DPX sequence. That's where Craig would pick up on color. He would take that DPX sequence and color from there.
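To give a rough sense of the data volumes at this stage, here is some back-of-the-envelope Python based on the 440Mb/s figure above; the frame rate and DPX packing assumptions are mine, not the facility's:

```python
# Rough storage math for the formats named above (illustrative only;
# actual rates depend on frame rate and DPX packing).
DNXHR_444_UHD_MBPS = 440   # DNxHR 444 at UHD, per the interview
FPS = 23.976               # assumed episodic frame rate

mxf_gb_per_hour = DNXHR_444_UHD_MBPS / 8 * 3600 / 1000
print(f"DNxHR 444 UHD: ~{mxf_gb_per_hour:.0f} GB per hour")  # ~198 GB/hr

# 10-bit RGB DPX packs three 10-bit samples into each 32-bit word,
# so a UHD frame is roughly 3840 * 2160 * 4 bytes plus a small header.
dpx_frame_mb = 3840 * 2160 * 4 / 1e6
dpx_tb_per_hour = dpx_frame_mb * FPS * 3600 / 1e6
print(f"UHD DPX: ~{dpx_frame_mb:.0f} MB per frame, ~{dpx_tb_per_hour:.1f} TB per hour")
```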

Craig, in terms of the look of the show, what direction were you given?
Budrick: They shoot in New York, so the DP Yaron Orbach is in New York. Because of that distance, I had a phone conversation with them to start the look of the show. Then I do a first-day pass, and then he receives the file. Then, he just gives me notes via email on each scene. Then he gets the second file, and hopefully I’m there.

Can you give me an example of a note that he has given?
Budrick: It just might be, you know, let’s add some saturation, or let’s bring this scene down. Maybe make it more moody. Bring down the walls.

Overall, as the show has gone along and the stories have developed, it's gotten a little darker and more twisted. It's leaned more toward a moody look and not a whole lot of happy.

Ralston: Because of the distance between us and the DP, we shipped a color-calibrated Sony HD monitor to New York. We wanted to make sure that what he was looking at was an exact representation of what Craig was doing.

Jacqueline, any challenges from your perspective other than the titles and stuff?
LeFranc: Just the differences that I noticed — the render time takes a little longer, obviously, because the files are a little bigger. We have to use certain SAN volumes, because some have larger bandwidths.

Ralston: We have 13 production volumes here, and for this particular show — like the feature mastering that we do — the volume is a 156TB Quantum that is tuned for 4K. So, in other words, it performs better with these larger files on it.

Did you experiment at all at the beginning?
Ralston: For the first three episodes we had a parallel workflow. Everything we did in UHD, we did in HD as well — we didn’t want the producer showing up to a screening and running into a bandwidth issue. In doing this, we realized we weren’t experiencing bandwidth issues. We kind of underestimated what our SAN could do. So, we abandoned the HD.

Do you think finishing in UHD will be the norm soon?
Ralston: We were unofficially told that this time next year we should plan on doing network shows this way.

Review: BenQ’s 4K/UHD monitor

By Brady Betzel

If you have been dabbling in multimedia production at resolutions higher than 1920×1080, you have likely been investigating a color-accurate and affordable 4K/UHD monitoring solution.

If you’ve Googled the Dolby PRM-4220 professional reference monitor you probably had a heart attack when you saw the near $40K price tag. This monitor is obviously not for the prosumer, or even the work-at-home professional. You may have found yourself in the forum-reading rabbit-hole where Flanders Scientific, Inc. (FSI) comes up a lot — unfortunately, if you aren’t able to shell out between $2K and $8K then you have been left in the dark.

While Dolby, FSI and others, like Sony, have amazing reference monitor solutions, they come with price tags that are a hard stop for anyone on a work-at-home budget. This is where the BenQ PV3200PT 32-inch LED-backlit LCD IPS monitor comes in.
BenQ has been around for a while. You may remember them being in stores like Best Buy or Circuit City (if you are that old). When I worked at Best Buy, BenQ was the option next to Sony, but I remember thinking, “I've never heard of BenQ!” Now, after playing around with the PV3200PT monitor, I know the name BenQ, and I won't forget it.

Digging In
The 32-inch PV3200PT monitor is a professional multimedia monitor. Not only is it a gigantic and gorgeous 32-inch 10-bit display, it has some great technology running it — including 100% Rec. 709 color accuracy. If you don't deal with the tech spec-nerdom behind color science, Rec. 709 is the international technical standard for high-definition color issued by the ITU Radiocommunication Sector (the body you've probably heard referred to as the CCIR, if you've heard of it at all).

Simply put, it's the standard for how color is broadcast on television in a high-definition environment, and if you produce video or other multimedia for television you want your Rec. 709 color to be as close to 100% accurate as possible. That way you can be confident that what you are creating on your monitor is technically what most people will see when it is broadcast… whew, that is long and boring, but essential to me saying that the PV3200PT monitor is 100% Rec. 709 accurate.

But it is not Rec. 2020 accurate, which is the newer standard applied to ultra-high-definition television — think 4K UHD (2160p) and the imminent 8K UHD (4320p). So while you are accurate color-wise in HD space, you can't necessarily rely on it for the wider range of color values offered in UHD's Rec. 2020. This is not necessarily a bad thing, but something to be aware of. And once you see the price you probably won't care anyway. As I write this review, it is being sold on BenQ's website for $1,299. This is a really, really great price for the punch this monitor packs.

As a video editor, I love large, color-accurate monitors. Who doesn’t? I want my whites properly exposed (if possible) and my blacks detailed and dark; it’s a lot to ask for but it’s what I want and what I need when color correcting footage. While using the BenQ PV3200PT, I was not disappointed with its output.

Rotation
I am also testing an HP Z1 G3 all-in-one workstation at the moment, so I opened the BenQ box, plugged the PV3200PT right into the Z1 G3's Mini DisplayPort and was off and running. I noticed immediately how many ways I could physically move the display around to match the environment I was in, including rotating it 90 degrees for some sweet Adobe Photoshop vertical work, visiting www.postperspective.com to read all the articles at once, or even using it to display your Media Pool in Blackmagic's DaVinci Resolve 12.5 (and, yes, it does work with the vertical display!). Using the PV3200PT vertically in Resolve was really eye-opening and could become a great way to use that much screen real estate.

To get the PV3200PT to rotate the image vertically I tried using the BenQ-provided software, Display Pilot, but eventually realized that I had to use Nvidia's Control Panel. That did get me into using Display Pilot to break up the BenQ (and the other monitor, for that matter) into quadrants to display multiple windows at once easily and efficiently.

I put Adobe Premiere on my left screen and set up the BenQ PV3200PT to have it split three ways: a large left column with Adobe Media Encoder and two right rows with Internet browsers. I really liked that feature, especially because I love to watch tutorials, and this type of set-up allows me to watch and create at the same time. It’s an awesome feature.

When using the PV3200PT I didn't notice any lag time or smearing, which you can sometimes see on lower-priced monitors. I also noticed that they shipped the monitor with Brightness set to 100% and Color Mode set to Standard, so if you don't want your eyes to bug out of your head after 10 hours of work and you want that Rec. 709 color, you need to change those settings yourself. Luckily, the menu on the monitor is easy to navigate, which isn't always the case with monitors, so I wanted to make sure to point that out. It isn't a touchscreen monitor, so don't be a dummy like me and poke at your monitor wondering why the menus aren't working.

I hand-picked a few tech specs below for the BenQ PV3200PT monitor that I felt are important, but you can see the entire list here under Specs:
– Resolution: 3840×2160 (UHD – NOT true 4K)
– Native Contrast: 1000:1
– Panel Type: IPS
– Response Time: 5ms
– Display Colors: 1.07 billion
– Color Gamut: 100% Rec. 709
– Color Bit Depth: 10 bits
– Input Connectors: HDMI 1.4, DisplayPort 1.2, Mini DisplayPort 1.2
– Weight: 27 to 33 pounds depending on the mounting option installed

Summing Up
In the end I really loved this monitor, not only for its price but for the technology inside of it, from the beautiful 32-inch IPS real estate to the built-in SD card reader and two USB 3.0 ports. I learned that I love the vertical feature and may have to incorporate it into my daily color correction and editing style.

One thing I didn't mention earlier is the included external OSD Controller, which allows you to quickly select between Rec. 709, EBU and SMPTE-C color spaces. Also included is BenQ's proprietary Palette Master Element calibration software, which allows for custom calibration with devices like the Datacolor Spyder.

I would recommend taking a look at this beautiful display if you are in the market for a UHD, 100% Rec. 709 color accurate, adjustable display for around $1,299, if you are lucky enough to get in on that price.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Dave Cole joins FotoKem as senior colorist

FotoKem has hired Dave Cole as senior colorist, strengthening its DI talent offerings. One of Cole's first projects for Burbank-based FotoKem will be Legendary's upcoming Kong: Skull Island.

Cole’s career began in his native Australia, where he was a telecine operator and technical director, quickly segueing to colorist. His early work includes collaborating with director Peter Jackson and cinematographer Andrew Lesnie, ASC, on color for The Lord of the Rings: The Fellowship of the Ring in 2001 at The PostHouse AG and King Kong in 2004 at Weta Digital.

In 2006, he moved to Los Angeles and joined LaserPacific Media, where he was colorist on the Oscar-nominated The Ides of March, The Savages, Tron: Legacy, the Alvin and the Chipmunks series, and the Best Cinematography Academy Award-winning Life of Pi.

Most recently, at Modern VideoFilm, Cole was supervising colorist on titles such as The Book of Life, Eye in the Sky, Alvin and the Chipmunks: The Road Chip, An Ordinary Man and created looks for TV series such as Sleepy Hollow, Reign and Scorpion.

In addition to his colorist duties, Cole has been helping in the development of emerging HDR technologies for manufacturers and studios, as well as providing HDR grading for several major home theater releases.

Cole joins a FotoKem colorist team that includes Alastor Arnold, John Daro, Mark Griffith, George Koran, Kostas Theodosiou and Walter Volpatto. These colorists have worked on such titles as San Andreas, The Boxtrolls, Palo Alto, The D Train, Interstellar, The Conjuring 2, Independence Day: Resurgence and Central Intelligence. The team calls on Blackmagic DaVinci Resolve and SGO's Mistika.

New England SMPTE holding free session on UHD/HDR/HFR, more

The New England Section of SMPTE is holding a free day-long “New Technologies Boot Camp” that focuses on working with high resolution (UHD, 4K and beyond), high-dynamic-range (HDR) imaging and higher frame rates (HFR). In addition, they will discuss how to maintain resolution independence on screens of every size, as well as how to leverage IP and ATSC 3.0 for more efficient movement of this media content.

The boot camp will run from 9am to 9pm on May 19 at the Holiday Inn in Dedham, Massachusetts.

“These are exciting times for those of us working on the technical side of broadcasting, and the array of new formats and standards we’re facing can be a bit overwhelming,” says Martin P. Feldman, chair of SMPTE New England Section. “No one wants — or can afford — to be left behind. That’s why we’re gathering some of the industry’s foremost experts for a free boot camp designed to bring engineers up to speed on new technologies that enable more efficient creation and delivery of a better broadcast product.”

Boot camp presentations will include:

• “High-Dynamic-Range and Wide Color Gamut in Production and Distribution” by Hugo Gaggioni, chief technical officer at Sony Electronics.
• “4K/UHD/HFR/HDR — HEVC H.265 — ATSC 3.0” by Karl Kuhn of Tektronix.
• “Where Is 4K (UHD) Product Used Today — 4K Versus HFR — 4K and HFR Challenges” by Bruce Lane of Grass Valley.
• “Using MESH Networks” by Al Kornak of JVC Kenwood Corporation.
• “IP in Infrastructure-Building (Replacing HD-SDI Systems and Accommodating UHD)” by Paul Briscoe of Evertz Microsystems.
• “Scripted Versus Live Production Requirements” by Michael Bergeron of Panasonic.
• “The Transition from SDI to IP, Including IP Infrastructure and Monitoring” by John Shike of SAM (formerly Snell/Quantel).
• “8K, High-Dynamic-Range, OLED, Flexible Displays” by consultant Peter Putman.
• “HDR: The Great, the Okay, and the WTF” by Mark Schubin, engineer-in-charge at the Metropolitan Opera, Sesame Street and Great Performances (PBS).

The program will conclude with a panel discussion by the program’s presenters.

No RSVP is required, and both SMPTE members and non-members are welcome.

Quick Chat: SGO CEO Miguel Angel Doncel

By Randi Altman

When I first happened upon Spanish company SGO, they were giving demos of their Mistika system on a small stand in the back of the post production hall at IBC. That was about eight years ago. Since then, the company has grown its Mistika DI finishing system, added a new product called Mamba FX, and brought them both to the US and beyond.

With NAB fast approaching, I thought I would check in with SGO CEO Miguel Angel Doncel to find out how the company began, where they are now and where they are going. I also checked in about some industry trends.

Can you talk about the genesis of your company and the Mistika product?
SGO was born out of a technically oriented mentality to find the best ways to use open architectures and systems to improve media content creation processes. That is not a challenging concept today, but it was an innovative view in 1993 when most of the equipment used in the industry was proprietary hardware. The idea of using computers to replace proprietary solutions was the reason SGO was founded.

It seems you guys were ahead of the curve in terms of one product that could do many things. Was that your goal from the outset?
Ten years ago, most of the manufacturers approached the industry with a set of different solutions to address different parts of the workflow. This gave us an opportunity to capitalize on improving the workflow, as disjointed solutions imply inefficient workflows due to their linear, sequential nature.

We always thought that by improving the workflow, our technology would be able to play in all those arenas without having to change tools. Making the workflow parallel saves time when a problem is detected, because it avoids going backwards in the pipeline and lets us focus on moving forward.

I think after so many years, the industry is saying we were right, and all are going in that direction.

How is SGO addressing HDR?
We are excited about HDR, as it really improves the visual experience, but at the same time it is a big challenge to define a workflow that can work in both HDR and SDR in a smooth way. Our solution to that challenge is the four-dimensional grading that is implemented with our 4th ball. This allows the colorist to work not only in the three traditional dimensions — R, G and B — but also to work in the highlights as a parallel dimension.

What about VR?
VR pieces together all the requirements of the most demanding stereo 3D with the requirements of 360 video. Considering what SGO already offers in stereo 3D production, we feel we are well positioned to provide a 360/VR solution. For that reason, we want to introduce a specific workflow for VR that helps customers work on VR projects, addressing the most difficult requirements, such as discontinuities at the poles or dealing with shapes.

The new VR mode we are preparing for Mistika 8.7 will be much more than a VR visualization tool. It will allow users to work in VR environments the same way they would work in a normal production, without having to worry about circles ending up as highly distorted ellipses and so forth.
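As a rough illustration of the pole problem Doncel describes (this is generic lat-long projection math, not Mistika's implementation): in an equirectangular frame, anything near the poles gets stretched horizontally by roughly 1/cos(latitude), which is exactly how circles turn into badly distorted ellipses.

```python
import math

def horizontal_stretch(latitude_deg: float) -> float:
    """In an equirectangular (lat-long) frame, a small circle on the sphere
    is stretched horizontally by about 1/cos(latitude) in the flat image."""
    return 1.0 / math.cos(math.radians(latitude_deg))

for lat in (0, 45, 75, 85):
    print(f"{lat:>2} deg latitude -> ~{horizontal_stretch(lat):.1f}x wider")
# 0 -> 1.0x, 45 -> 1.4x, 75 -> 3.9x, 85 -> 11.5x
```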

What do you see as the most important trends happening in post and production currently?
The industry is evolving in many different directions at the moment — 8K realtime, 4K/UHD, HDR, HFR, dual-stream stereo/VR. These innovations improve and enhance the audience's experience in many different ways. They are all interesting individually, but the most vital aspect for us is that they all have something in common: they all require a very smart way of dealing with increasing bandwidth. We believe that a variety of content will use different types of innovation relevant to the genre.

Where do you see things moving in the future?
I personally envision a lot more UHD, HDR and VR material in the near future. The technology is evolving in a direction that can really make the entertainment experience very special for audiences, leaving a lot of room to still evolve. An example is the Quantum Break game from Remedy Studios/Microsoft, where the actual users’ experience is part of the story. This is where things are headed.

I think the immersive aspect is the challenge and goal. The reason why we all exist in this industry is to make people enjoy what they see, and all these tools and formulas combined together form a great foundation on which to build realistic experiences.

Digging Deeper: NASA TV UHD executive producer Joel Marsden

It’s hard to deny the beauty of images of Earth captured from outer space. And NASA and partner Harmonic agree, boldly going where no one has gone before — creating NASA TV UHD, the first non-commercial consumer UHD channel in North America. Leveraging the resolution of ultra high definition, the channel gives viewers a front row seat to some gorgeous views captured from the International Space Station (ISS), other current NASA missions and remastered historical footage.

We recently reached out to Joel Marsden, executive producer of NASA TV UHD, to find out how this exciting new endeavor reached “liftoff.”

Joel Marsden

This was obviously a huge undertaking. How did you get started and how is the channel set up?
The new channel was launched with programming created from raw video footage and imagery supplied by NASA. Since that time, Harmonic has also shot and contributed 4K footage, including video of recent rocket launches. They provide the end-to-end UHD video delivery system and post production services while managing operations. It’s all hosted at a NASA facility managed by Encompass Digital Media in Atlanta, which is home to the agency’s satellite and NASA TV hubs.

Like the current NASA TV channels, and on the same transponder, NASA TV UHD is transmitted via the SES AMC-18C satellite, in the clear, with a North American footprint. The channel is delivered at 13.5Mbps, as compared with many of the UHD demo channels in the industry, which have required between 50 and 100Mbps. NASA's ability to minimize bandwidth use is based on a combination of encoding technology from Harmonic and the next-generation H.265 HEVC compression standard.
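Some quick arithmetic (mine, not NASA's or Harmonic's) puts those bitrates in perspective as per-hour data volumes:

```python
# Rough per-hour data volumes at the bitrates mentioned above.
def gb_per_hour(mbps: float) -> float:
    return mbps / 8 * 3600 / 1000  # megabits per second -> gigabytes per hour

for label, rate in [("NASA TV UHD", 13.5), ("UHD demo channel, low end", 50),
                    ("UHD demo channel, high end", 100)]:
    print(f"{label}: ~{gb_per_hour(rate):.0f} GB per hour")
# NASA TV UHD: ~6 GB; demo channels: ~23 to ~45 GB per hour
```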

Can you talk about how the footage was captured and how it got to you for post?
When the National Aeronautics and Space Act of 1958 was created, one of the legal requirements of NASA was to keep the public apprised of its work in the most efficient means possible and with the ultimate goal of bringing everyone on Earth as close as possible to being in space. Over the years, NASA has used imagery as the primary means of demonstration. The group in charge of these efforts, the NASA Imagery Experts Program, provides the public with a wide array of digital television, web video and still images based on the agency’s activities. Today, NASA’s broadcast offerings via NASA TV include an HD consumer channel, an HD media channel and an SD education channel.

In 2015, the agency introduced NASA TV UHD. Naturally, NASA archives provide remastered footage from historical missions and shots from NASA’s development and training processes, all of which are used for production of broadcast programming. In fact, before the agency launched NASA TV, it had already begun production of its own documentary series, based on footage collected during missions.

Just five or six years ago, NASA also began documenting major events in 4K resolution or higher. The agency has been using 6K Red Dragon digital cinema cameras for some time. NASA TV UHD video content is sourced from high-resolution images and video generated on the ISS, Hubble Space Telescope and other current NASA missions. The raw content files are then sent to Harmonic for post.

Can you walk us through the workflow?
Raw video files are mailed on physical discs or sent via FTP from a variety of NASA facilities to Harmonic’s post studio in San Jose and stored on the Harmonic MediaGrid system, which supports an edit-in-place workflow with Final Cut Pro and other third-party editing tools.

During the content processing phase, Harmonic uses Adobe After Effects to paint out dead pixels that result from the impact of cosmic radiation on camera sensors. They have built bad-pixel maps that they use in post production to remove the distracting white dots from the picture. The detail of UHD means that the footage also shows scratches on the windows of the ISS through which the camera is shooting, but these are left in for authenticity.
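As a generic illustration of the bad-pixel-map idea (Harmonic's actual cleanup is done in After Effects; this sketch and its function name are hypothetical), flagged sensor positions can simply be replaced with the median of their immediate neighbors:

```python
import numpy as np

def apply_bad_pixel_map(frame: np.ndarray, bad_map: np.ndarray) -> np.ndarray:
    """Replace flagged pixels with the median of their 3x3 neighborhood.

    frame:   (H, W, 3) image array
    bad_map: (H, W) boolean array, True where a sensor photosite is known bad
    """
    fixed = frame.copy()
    for y, x in zip(*np.nonzero(bad_map)):
        y0, y1 = max(y - 1, 0), min(y + 2, frame.shape[0])
        x0, x1 = max(x - 1, 0), min(x + 2, frame.shape[1])
        neighborhood = frame[y0:y1, x0:x1].reshape(-1, frame.shape[2])
        # The bad pixel itself is included, but one hot value rarely moves the median.
        fixed[y, x] = np.median(neighborhood, axis=0)
    return fixed
```

The map itself is just a boolean image the size of the sensor, marking positions that are known to read hot.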


Blackmagic's DaVinci Resolve is used to color grade footage, and Maxon Cinema 4D Studio is used to create animations of images. Final Cut Pro X and Adobe Creative Suite are used to set the video to music and add text and graphics, along with the programming name, logo and branding.

Final programs are then transferred in HD back to the NASA teams for review, and in UHD to the Harmonic team in Atlanta to be loaded onto the Spectrum X for playout.

————

You can check out NASA TV’s offerings here.

Why fast file transfers are critical to video production, post


By Katie Staveley

Accelerated file transfer software is not new. It’s been around for many years and has been used by the world’s largest media brands. For those teams of content producers, it has been a critical piece of their workflow architecture, but it wasn’t until recently that this kind of software has become more accessible to every size company, not just the largest. And just in time.

It goes without saying that the process of producing and delivering content is ever-evolving. New problems and, as a result, new solutions arise all the time. However, a few challenges in particular seem to define the modern media landscape, including support for a globally distributed team, continuous demand for high-resolution content and managing the cost of production.


These challenges can be thought of from many different angles, and likewise resolved in different ways. One aspect that is often overlooked is how organizations are moving their precious video content around as part of the pre-production, post and distribution phases of the workflow. The impact of distributed teams, higher resolution content and increasing costs are driving organizations of all sizes to rethink how they are moving content. Solutions that were once “good enough” to get the job done — like FTP or shipping physical media — are rapidly being replaced with purpose-built file transfer tools.

Here are some of the reasons why:

1. Distributed teams require a new approach
Bringing final content to market very rarely happens under one roof or in one location anymore. More and more teams of media professionals are working around the globe. Obviously, production teams work remotely when they are filming on location. And now, with the help of technology, media organizations can build distributed teams and get access to the talent they need regardless of where they’re located, giving them a competitive advantage. In order to make this work well, organizations need to consider implementing a fast file transfer solution that is not only accessible globally, but moves large files fast, especially when bandwidth speeds are less than optimal.

2. File sizes are growing
The demand for higher-resolution content is driving innovation in production technology like cameras, audio equipment and software. While HD and even Ultra HD (UHD) content is becoming more mainstream, media professionals have to think about how their entire toolset is helping them meet those demands. High-resolution content means dramatically larger file sizes. Moving large files around within the prepro and post workflows, or distributing final content to clients, can be especially difficult when you don't have the right tools in place. If your team is delivering HD or UHD content today, or plans to in the future, implementing a fast file transfer solution that will help you send content of any size without disrupting your business is no longer a nice-to-have. It's business critical.

3. You can’t afford delays
When it comes to getting your files where they need to be, hope is not a strategy. The reality is that production will often finish up later than you hoped. Deadlines are hard, and you still need to get your content out the door. Any number of factors can cause you to miss deadlines, but transferring content files shouldn't be your biggest delay. You can't afford slow transfer times or, even worse, interruptions that force you to start the transfer all over again. Implementing a solution that gives you reliable, fast file transfer and predictability around when your files will arrive is a strategy. Not only will it enable your employees and partners to focus on producing the content, it will help you create a positive experience for your customers, whether they are reviewing pre-release content or receiving the final cut.

4. Customer experience matters
Any time your customers are interacting with your brand they are forming an opinion of you. In today's highly competitive world, it's imperative that you delight your customers with the content you're producing and their experience working with you. Your file transfer solution is part of building that positive experience. The solution needs to be reliable and fast, and it shouldn't leave your customers disappointed because a file didn't arrive when they expected, or frustrated because it was too painful to use. They should be able to focus on your content, not on how you're delivering it to them — your solution should just work. It's a necessary part of today's media business to have a cost-efficient, low-maintenance way to send and share content that ensures a delightful customer experience.

5. Your business is growing
Moving digital video content has been part of the media business for over a decade, and there have been solutions that have worked well enough for many organizations. But when you consider the rapid growth in file sizes, the increased distribution of teams and the importance of customer experience, you'll find that those solutions are not built to scale as your business grows. Planning for the future means finding a solution that offers flexibility of deployment, is easy to manage and maintain, and has a cost of expansion proportional to your size. Growth is hard, but managing your file transfer tools doesn't have to be.

Managing cost and keeping profit margins healthy is as imperative as always. Fortunately, the days when every technology purchase required significant capital investment are waning. The availability of cloud-hosted solutions and other advancements has given rise to powerful solutions that are accessible to companies of every size. As a result, media professionals have affordable access to the technology they need to stay competitive without breaking the bank, which includes fast file transfer software. Investing today in the right solution will make a big impact on your business now and into the future.

Katie Staveley is VP of marketing at Signiant.

Colorfront demos UHD HDR workflows at SMPTE 2015

Colorfront used the SMPTE 2015 Conference in Hollywood to show off the capabilities of its upcoming 2016 products supporting UHD/HDR workflows. New products include the Transkoder 2016 and On-Set Dailies 2016. Upgrades allow for faster, more flexible processing of the latest UHD HDR camera, color, editorial and deliverables formats for digital cinema, high-end episodic TV and OTT Internet entertainment channels.

Colorfront’s Bruno Munger filled us in on some of the highlights:

More details:
·   Transkoder and On-Set Dailies feature Colorfront Engine, an ACES-compliant, HDR-managed color pipeline, enabling on-set look creation and ensuring color fidelity of UHD/HDR materials and metadata through the camera-to-post chain. Colorfront Engine supports the full dynamic range and color gamut of the latest digital camera formats and mapping into industry-standard deliverables such as the latest IMF specs, AS-11 DPP and HEVC, at a variety of brightness, contrast and color ranges on current display devices.
·   The mastering toolset for Transkoder 2016 is enhanced with new statistical analysis tools for immediate HDR data graphing. Highlights include MaxCLL and MaxFALL calculations (see the sketch after this list), as well as HDR mastering tools with tone and gamut mapping for a variety of target color spaces, including Rec. 2020, P3D65, XYZ, PQ curve and BBC-NHK Hybrid Log Gamma.
·    New for Transkoder 2016 are tools to concurrently color grade HDR and SDR UHD versions, cutting down the complexity, time and cost of delivering multiple masters at once.
·    Transkoder 2016 will output simultaneous, realtime grades on 4K 60p material to dual Sony OLED BVM-X300 broadcast monitors — concurrently processing HDR 2084 PQ Rec. 2020 at 1000nits and SDR Rec. 709 at 100nits — while visually graphing MaxFALL/MaxCLL light values per frame.
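For readers unfamiliar with those two measurements, here is a minimal sketch of the standard definitions (a generic illustration, not Colorfront's code): MaxCLL is the brightest single pixel anywhere in the program, and MaxFALL is the highest per-frame average light level, both in nits.

```python
import numpy as np

def maxcll_maxfall(frames):
    """Compute MaxCLL and MaxFALL from per-pixel light levels in nits.

    frames: iterable of (H, W) arrays, each holding the brightest color
    component per pixel, already converted to linear light in cd/m2 (nits).
    """
    max_cll = 0.0   # brightest single pixel anywhere in the program
    max_fall = 0.0  # highest frame-average light level
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall
```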

Advanced dailies toolsets enhancements include:
·    Support for the latest camera formats, including full Panasonic Varicam35 VRAW, AVC Intra 444, 422 and LT support, Canon EOS C300 Mark II with new Canon Log2 Gamma, ARRI Alexa 65 and Alexa SXT, Red Weapon, Sony XAVC and the associated image metadata from all of these.
·    The new Multi-view Dailies capability for On-Set Dailies 2016, which allows concurrent, realtime playback and color grading of all cameras and camera views.
·    Transwrapping, which allows video essence data (the RAW, compressed audio/video and metadata inside a container such as MXF or MOV) to be passed through the transcoding process without re-encoding, enabling frame-accurate insert editing on closed digital deliverables. This workflow can be a great time saver in day-to-day production, allowing Transkoder users to quickly generate new masters based on changes and versioning of content in the major mastering formats, like IMF, DCI and ProRes, and to efficiently trim camera-original media for VFX pulls and final conform from Arri, Red and Sony cameras.

IBC 2015 Blog: Rainy days but impressive displays, solutions

By Robert Keske

While I noted in my first post that we were treated to beautiful weather in Amsterdam during the first days of IBC 2015, the weather on day four was not quite as nice… it was full of rain and thunderstorms, the latter of which were heard eerily through the RAI Exhibition Centre.


The next-gen Clipster

I spent day three exploring content delivery and automation platforms.

Rohde & Schwarz’s next-gen Clipster is finally here and is a standout — built on an entirely new hardware platform. It’s seamless, simplified, faster and looks to have a hardware and software future that will not require a forklift upgrade. 

Colorfront, also a leader in on-set dailies solutions, has hit the mark with its Transkoder product. The new HDR mathematical node is nothing less than impressive, which is nothing less than expected from Colorfront engineering.

Colorfront Transkoder

UHD and HDR were also forefront at the show as the need for higher quality content continues to grow, and I spent day four examining these emerging display and delivery technologies. Both governments and corporate entities are leading the global community towards delivery of UHD to households starting in 2015, so I was especially interested in seeing how display and content providers would be raising the standards in display tech.

Sony, Samsung and Panasonic (our main image) all showcased impressive results to support UHD and HDR, and I’m looking forward to seeing what further developments and improvements the industry has to offer for both professional and consumer adoption.

Overall, while it seemed like a smaller show this year, I've been impressed by the quality of technology on display. IBC never fails to deliver a showcase of imagination and innovation, and this year was no different.

New York-based Robert Keske is CIO/CTO at Nice Shoes (@NiceShoesOnline).

New firmware and Atomos Shogun support for AJA’s CION camera

AJA has released version 1.2 firmware for its CION 4K/UltraHD and 2K/HD professional production cameras.

CION v.1.2 updates offer improved white balance performance for overexposed image portions; improved video levels, with higher IRE values available for various EI, gamma and color correction combinations; additional gamma and color correction options for higher EI 800 and EI 1000 values; and a new internal video gamma LUT for external monitoring of the “expanded 1” or “disabled” gamma selections when an external LUT device is not in use.

In addition, an automatic white-balance alarm notifies users if an image does not contain a sufficient value of white or grey to perform an appropriate white balance, which in turn triggers the software to revert to the unity setting. AJA has also added an interval record (timelapse) indicator for the superimposed monitoring overlay, and SMPTE or full RGB range values can now be selected for the main SDI outputs.
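As a generic sketch of how that kind of safeguard can work (an illustration of the concept only, not AJA's algorithm, and every name in it is hypothetical): gather near-neutral pixels, and if there are not enough of them, fall back to unity gains rather than guess.

```python
import numpy as np

def auto_white_balance_gains(rgb, neutral_tolerance=0.1, min_fraction=0.01):
    """Derive R/B gains from near-neutral pixels, or fall back to unity
    gains when the frame holds too little white or grey to trust.

    rgb: (H, W, 3) linear-light image with values in 0..1.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    brightness = rgb.mean(axis=-1)
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)
    # Reasonably bright pixels whose channels sit close together read as neutral.
    neutral = (spread < neutral_tolerance) & (brightness > 0.2)
    if neutral.mean() < min_fraction:
        return 1.0, 1.0  # not enough white/grey in frame: revert to unity
    r_gain = g[neutral].mean() / r[neutral].mean()
    b_gain = g[neutral].mean() / b[neutral].mean()
    return r_gain, b_gain
```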

CION v.1.2 firmware is field-upgradeable and can be uploaded to the camera via a built-in Web interface with a standard Web browser. The free update is available for download from https://www.aja.com/en/products/cion#support.

In other CION camera news, the Atomos Shogun portable recorder now supports external recording of AJA Raw files captured by the CION camera. Atomos Shogun users can record CION’s AJA Raw files at 4K and UltraHD resolutions at up to 60fps.