Category Archives: UHD

Review: Sony’s a6300 E-mount camera

By Brady Betzel

It’s fair to say that the still and motion camera market isn’t boring. Canon and Nikon have been the big players in the market, even more so a few years ago when Canon introduced the landscape-changing 7D and full-frame 5D cameras. The 5D was the magic camera for the filmmaking community. Over the years, other companies have been chipping away at the 5D mold (Blackmagic with its Pocket Cinema Camera, for example), but once the filmmaking community started lusting after higher frame rates at resolutions above 1920×1080, along with a Log or Log-like color space, the field really began to open up.

It seems that’s when Sony started taking the prosumer camera market seriously and doubled down on the Sony a6000 mirrorless E-mount camera, which eventually led to the 4K (technically UHD) recording-capable a6300 and a6500.

Once people started seeing the images and video that the APS-C-based a6300 produced, combined with its awesome low-light capability and the wide dynamic range available through picture profiles like S-Log2/3, it was clear Sony had a bona fide hit on its hands. And if you still want more sensor size than the a6300’s crop sensor can provide, there are the full-frame A7SII and A7RII (and, hopefully, soon an A7R/S III).

In this review, I am going to cover the Sony a6300 and explain why it’s a good value for anyone looking to make great content, or even just shoot top-notch 4K home videos. The image fidelity that comes from the Sony a6300 is truly incredible. It’s a little hard for me to quantify, but the a6300 has a look out of the sensor that is unique for a handheld camera. The Sony a6300 delivers a top-notch product for around $949.99 (not including lenses) or $1,049.99 with a 16-50mm lens included… but more on pricing later.

Technically, the Sony a6300 is a handheld camera with an interchangeable E-mount lens system. Since this is an APS-C crop-sensor camera, it is not full frame. The sensor records stills at 24.2 megapixels and video at up to UHD (3840×2160) resolution, and since I work in video I am focusing on that aspect of the a6300. It records in the Sony-created xvYCC color space, essentially an extended-gamut color space that allows for more saturation while remaining compatible with the existing YCC color space. Short answer: more saturation. It accepts Sony Memory Stick Duo or SD memory cards for recording, but do some research on your memory card, as not all will allow for UHD recording at the full 100Mb/s. In movie mode you have an ISO range of 100-25600, which really shines at the high end when filming in low light.

In terms of video recording formats, the Sony a6300 stays in the family with XAVC S, AVCHD and MP4, all of which are 8-bit out of the camera. Keep in mind that if you edit a lot of footage in XAVC- or AVCHD-based codecs, your computer will need to be on the higher end and/or you will want to create proxy media to edit with before finishing and color correcting. The XAVC and AVCHD codecs allow pretty good-quality video to be recorded, but they really stress editing systems because of the way interframe codecs work. If you notice your system can’t play back your clips smoothly, it might be time to think about transcoding them to a more edit-friendly codec like ProRes, Cineform or DNxHR.
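
As a rough illustration of that proxy step (this is a sketch, not something from the review), here is how a batch transcode might look from Python, assuming ffmpeg with the prores_ks encoder is installed and on the PATH; the folder names are hypothetical.

```python
# Hypothetical proxy-transcode sketch: batch-convert camera clips to ProRes Proxy
# for smoother offline editing. Assumes ffmpeg (with prores_ks) is installed.
import subprocess
from pathlib import Path

def make_proxy(src: Path, dst_dir: Path) -> Path:
    """Transcode one XAVC S/AVCHD clip to an HD ProRes Proxy QuickTime."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    dst = dst_dir / (src.stem + "_proxy.mov")
    subprocess.run([
        "ffmpeg", "-y", "-i", str(src),
        "-c:v", "prores_ks",     # ffmpeg's ProRes encoder
        "-profile:v", "0",       # 0 = Proxy; bump to 3 (HQ) for finishing-grade media
        "-vf", "scale=1920:-2",  # downscale UHD to HD for lighter playback
        "-c:a", "pcm_s16le",     # uncompressed audio, typical for edit proxies
        str(dst),
    ], check=True)
    return dst

if __name__ == "__main__":
    for clip in sorted(Path("camera_card").glob("*.MP4")):  # hypothetical source folder
        print("wrote", make_proxy(clip, Path("proxies")))
```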

When recording in XAVC S 4K/UHD (3840×2160) you can shoot at multiple frame rates and bitrates — 24p @ 100/60Mbps; 30p @ 100/60Mbps; and in XAVC S HD (1920×1080), 60p/30p/24p @ 50Mbps and 120p @ 100/50Mbps — as well as many other options, but for this review those are the ones that really matter. The real beauty of the Sony a6300 is its ability to shoot in a Log color space, which in very basic terms produces grayish, flat-looking video that allows for advanced color correction in post production because there is more information to pull out of the shadows and highlights, aka dynamic range.

S-Log 2 split screen.

To enable the Log color spaces, find the Picture Profile menu under menu five and select PP7, PP8 or PP9. This is where you will find the Gamma menu, with S-Log2 or S-Log3 enabled by default. There are more options, but the next one that concerns a lot of people is the Color Mode, which can be changed to S-Gamut, S-Gamut3.Cine, S-Gamut3 and more. These are a little tricky, and my best piece of advice is to try each combination in different lighting environments, like sunset, a bright blue sky with gradations and low light, to see which works best for each situation. I got a good amount of noise in S-Log3/S-Gamut3.Cine, but I really tried to push the low light in that mode. I fixed excessive shadow noise in color correction using the Denoiser in Red Giant’s Magic Bullet Suite (read my review).

I noticed S-Log2 left me a little more detail in the highlights, while S-Log3 gave a little more detail in the shadows; that may have just been my experience, but that is what I noticed. In addition, when shooting in S-Log2/3 I noticed some macro-blocking/banding in shots that had color gradations, like a blue sky turning into white or very bright lights — this will look like square digital artifacts or bands arcing across the gradient. I even saw a dead pixel flash when shooting some really low-light footage. The real test is to watch this footage on a huge TV or output monitor above 32 inches, because that is where you will really start to see the noise and banding. I did some testing with noise removal, and with a little elbow grease you can get a great picture. Overall, I am very impressed with the Log-type images I was able to pull out of the a6300 and how well they held up in color correction. Typically, a camera that can pull this type of image would cost at least $5,000-$6,000, plus lenses, so the a6300 is a steal.

After all that S-Log talk you might be asking, “What if I just want to shoot great video and not worry about Logs and Gamuts?” Well, you can set the Picture Profile to 1-6 and get a great image with little to no color correction needed. Specifically, Picture Profile 1 is really the automatic setting to use; it is described by Sony as being the “Movie Gamma,” which basically means your video will look good.

For more descriptions of the Picture Profiles on the Sony a6300, check out Sony’s help guide. You will need to test out all of the Picture Profiles, though, as they all have different characteristics, such as more detail in the shadows but less accurate color in the highlights. It’s worth taking a few hours to test them out.

More Cool Stuff
The internal microphone on the a6300 is OK, but it probably shouldn’t be used as your primary audio source. I would suggest something like the Røde VideoMic Pro. Unfortunately, without the ability to monitor your audio through headphones, you will definitely need to test your external microphone to check whether you need a pre-amp, or whether something like the VideoMic Pro’s +20dB boost will be enough or too loud.

One thing that really stuck out to me was how fast the autofocus is on the Sony a6300. I am used to a Canon EOS Rebel T2i, and the Sony a6300 is lightning fast, almost instantaneous. It really impressed me. I was visiting Disneyland while I had the a6300, taking some stills and video around the park. When I took a picture of my son, the a6300 caught a bubble in the autofocus instead and very clearly took a picture of that bubble. It was accidentally incredible.

In addition to the camera, Sony let me borrow a few lenses when I tested out the a6300, including the 50mm f1.8 ($249.99), E 35mm f1.8 ($449.99) and E PZ 16-50mm f3.5-5.6 ($349.99). While the 35mm and 50mm are great, I felt that the 16-50mm zoom lens did the job for me overall. In low-light situations it definitely helped to have the f1.8 prime lenses in my bag, but during daylight, and even at dusk, the zoom lens was great. However, when taking portraits or footage where I wanted a nice bokeh background, the prime lenses were what I had to use.

If I were going to buy this camera for myself, I would weigh the idea of spending a little more money and grabbing a really nice lens, whether a prime or a zoom. The only problem is that most of the upgraded lenses are made for full-frame cameras, which brings me to my next point: Would I just go all the way to a full-frame Sony A7RII or A7SII? In my mind, if I have enough money to get a full-frame camera, I do it. The quality, in my eyes, is far superior. However, you are going to be paying an extra $1,000 to $2,500, depending on the lenses and whether you buy new or used. So a middle ground might be to buy full-frame lenses, like the G Master series, for the a6300. That way, when you find the right Sony body you don’t have to upgrade lenses, since the full-frame lenses will work on the a6300. Keep in mind you will have a crop factor of 1.5, which means a full-frame 50mm lens will act like a 75mm lens (the quick sketch below runs the numbers). That might be more confusing than helpful, but it is a constant fight for Sony a6300 owners after they see what the Sony cameras can do.

Another option is to take a look at Craigslist or eBay and see if anyone is selling a used a6300 or A7SII. I did a cursory search when writing this article and found a Sony a6300 with four lenses and extra accessories for $1,300, and another a6300 for sale with one lens for $700, so there are options for used cameras at a great price.
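
For anyone who wants to sanity-check that crop-factor math, here is a tiny, purely illustrative snippet applying the 1.5x APS-C factor mentioned above.

```python
# Field-of-view equivalence on an APS-C body: multiply the lens focal length
# by the 1.5x crop factor to get the full-frame equivalent framing.
CROP_FACTOR = 1.5  # Sony APS-C (a6300)

def full_frame_equivalent(focal_length_mm: float, crop: float = CROP_FACTOR) -> float:
    return focal_length_mm * crop

for fl in (16, 35, 50):
    print(f"{fl}mm on the a6300 frames like ~{full_frame_equivalent(fl):.0f}mm on full frame")
# 50mm -> 75mm, matching the example in the text
```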

So what didn’t I like about the Sony a6300? There is no headphone jack to monitor your audio. That’s a big one, but one possible solution is the micro-HDMI port: if you use an external monitoring solution, like an Atomos or SmallHD monitor, you can use its audio monitoring. Also, I just can’t get used to Sony’s menu and button setup. Maybe it’s because I’ve been used to Canon’s menu, button and wheel setup for a while, but Sony’s layout throws me off. I feel like I have to go into one or two extra menus before I get to the settings I want.

Summing Up
The bottom line is that the Sony a6300 is an incredible UHD-capable camera that can be purchased with a lens for around $1,000. It lacks things like proper audio monitoring, but it gives you great control over your color correction when filming in S-Log2 or 3, and with a little noise reduction you will have clean low-light footage in the palm of your hand.

There is a newer version of this camera, the a6500, which has the following upgrades over the a6300: 5-axis in-body image stabilization, a touchscreen LCD (swipe to change focus on an object or touch to set focus) and an improved menu system. The a6500 costs $1,399.99 for the body only. The image stabilization is what really sells the a6500, since you can use any lens (with adapters) while still benefitting from stabilization. Either way, the a6300 is the best bet for a great UHD-capable camera at a great price, especially if you can find someone selling a used one with a bunch of lenses and batteries. The video that comes from the Sony line of cameras is unmistakable, and it will add a level of professionalism to anyone’s videography arsenal.

You can see my Sony a6300 Slow Motion SLog 2/SLog 3 test as well as my UHD tests on YouTube.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Designed for large file sizes, Facilis TerraBlock 7 ships

Facilis, makers of shared storage solutions for collaborative media production networks, is now shipping TerraBlock Version 7. The new Facilis Hub Server, a performance aggregator that can be added to new and existing TerraBlock systems, is also available now. Version 7 includes a new browser-based, mobile-compatible Web Console that delivers enhanced workflow and administration from any connected location.

With ever-increasing media file sizes and 4K, HDR and VR workflows continually putting pressure on facility infrastructure, the Facilis Hub Server is aimed at future-proofing customers’ current storage while offering new systems that can handle these types of files. The Facilis Hub Server uses a new architecture to optimize drive sets and increase the bandwidth available from standard TerraBlock storage systems. New customers will get customized Hub Server Stacks with enhanced system redundancy and data resiliency, plus near-linear scalability of bandwidth when expanding the network.

According to James McKenna, VP of marketing/pre-sales at Facilis, “The Facilis Hub Server gives current and new customers a way to take advantage of advanced bandwidth aggregation capabilities, without rendering their existing hardware obsolete.”

The company describes the Web Console as a modernized browser-based and mobile-compatible interface designed to increase the efficiency of administrative tasks and improve the end-user experience.

Easy client setup, upgraded remote volume management and a more integrated user database are among the additional improvements. The Web Console also supports Remote Volume Push to remotely mount volumes on any client workstation.

Asset Tracking
As the number of files and the amount of storage continue to increase, organizations are realizing they need some type of asset tracking system to help them move and find files in their workflow. Many hesitate to invest in traditional MAM systems due to complexity, cost and potential workflow impact.

McKenna describes the FastTracker asset tracking software as the “right balance for many customers. Many administrators tell us they are hesitant to invest in traditional asset management systems because they worry it will change the way their editors work. Our FastTracker is included with every TerraBlock system. It’s simple but comprehensive, and doesn’t require users to overhaul their workflow.”

V7 is available immediately for eligible TerraBlock servers.

Check out our interview with McKenna during NAB:


Pixelogic acquires Sony DADC NMS’ creative services unit

Pixelogic, a provider of localization and distribution services, has completed the acquisition of the creative services business unit of Sony DADC New Media Solutions, which specializes in 4K, UHD, HDR and IMF workflows for features and episodics. The move expands Pixelogic’s services to the media and entertainment industry and adds capabilities, including experienced staff, proprietary technology and an extended footprint.

According to John Suh, co-president of Pixelogic, the acquisition “expands our team of expert media engineers and creative talent, extends our geographic reach by providing a fully established London operation and further adds to our capacity and capability within an expansive list of tools, technologies, formats and distribution solutions.”

Seth Hallen

Founded less than a year ago, Pixelogic currently employs over 240 people worldwide and is led by industry veterans Suh and Rob Seidel. While the company is headquartered in Burbank, California, it has additional operations in Culver City, California, as well as London and Cairo.

Sony DADC NMS Creative Services was under the direction of Seth Hallen, who joins Pixelogic as senior VP of business development and strategy. All Sony DADC NMS Creative Services staff, technology and operations are now part of Pixelogic. “Our business model is focused on the deep integration of localization and distribution services for movies and television products,” says Hallen. “This supply chain will require significant change in order to deliver global day and date releases with collapsed distribution windows, and by partnering closely with our customers we are setting out to innovate and help lead this change.”


The Path‘s post path to UHD

By Randi Altman

On a recent visit to the Universal Studios lot in Los Angeles, I had the pleasure of meeting the post team behind Hulu’s The Path, which stars Aaron Paul and Hugh Dancy. The show is about a cult — or as their members refer to it, a movement — that on the outside looks like do-gooders preaching peace and love, but on the inside there are some freaky goings-on.

The first time I watched The Path, I was taken with how gorgeous the picture looked, and when I heard the show was posted and delivered in UHD, I understood why.

“At the time we began to prep season one — including the pilot — Hulu had decided they would like all of their original content shows to deliver in UHD,” explains The Path producer Devin Rich. “They were in the process of upgrading their streaming service to that format so the viewers at home, who had the capability, could view this show in its highest possible quality.”

For Rich (Parenthood, American Odyssey, Deception, Ironside), the difference that UHD made to the picture was significant. “There is a noticeable difference,” he says. “For lack of better words, the look is more crisp and the colors pop. There, of course, is a larger amount of information in a UHD file, which gives us a wider range to make it look how we want it to, or at least closer to how we want it to look.”

L-R: Tauzhan Kaiser, Craig Budrick (both standing), Jacqueline LeFranc and Joe Ralston.

While he acknowledges that as a storyteller UHD “doesn’t make much of a difference” because scripts won’t change, his personal opinion is that “most viewers like to feel as if they are living within the scene rather than being a third-party to the scene.” UHD helps get them there, as does the team at NBCUniversal StudioPost, which consists of editor Jacqueline LeFranc, who focuses on the finishing, adding titles, dropping in the visual effects and making the final file; colorist Craig Budrick; lead digital technical operations specialist Joe Ralston, who focuses on workflow; and post production manager Tauzhan Kaiser.

They were all kind enough to talk to us about The Path’s path to UHD.

Have you done a UHD workflow on any other shows?
Ralston: We have a lot of shows that shoot UHD or high resolution, but The Path was our first television show that finished UHD all the way through.

What is it shot on?
Ralston: They shoot Red 3840×2160, and they also shoot 4800×2700, so almost 5K. UHD is technically twice the height and twice the width of HD, so while it’s still 16×9, resolution-wise it’s double.

From an infrastructure perspective, were you guys prepared to deal with all that data?
Ralston: Yes. At the facility here at NBCUniversal StudioPost, not only do we do TV work, but there’s remastering work — all the centennial titles, for example.

Kaiser: We’ve done Spartacus, All Quiet on the Western Front, The Birds, Buck Privates, Dracula (1931), Frankenstein, Out of Africa, Pillow Talk, The Sting, To Kill a Mockingbird, Touch of Evil, Double Indemnity, Holiday Inn and King of Jazz.

Ralston: The infrastructure, as far as storage and monitoring, was already in place here. We knew that this was coming, so the facility has slowly been preparing and gearing up for it. We had been ready, but this was really the first show that requested end-to-end UHD. Usually, we do a show that is maybe shot UHD or 5K but finishes in HD, so when we leave the editorial room, we’re in an HD world. In this case, we were not.

LeFranc: Joe’s group, which is digital tech ops, doesn’t really exist in other places that I know of. They develop, train and work with everybody else in the facility to develop these kinds of workflows in order to get ahead of it. So we are prepared, adequately trained and aware of all the pitfalls and any other concerns there might be. That’s a great thing for us, because it’s knowledge.

Other shows have gone UHD, but some in season two, and they were playing catch up in terms of workflow.
Ralston: We’d been thinking about it for a long time. Like I said, the difference with this show, versus some of the other ones that do it, is that everyone else went to HD when it got to color. This one stayed UHD all the way through from there on out.

So, that was really the big difference for a show like this. The big challenges for this one were — and Jacqueline can go into it a little bit more — when you get into things like titling or creating electronic titles, there’s not a lot of gear out there that does that.

Jacqueline, can you elaborate on that?
LeFranc: There were obstacles that I encountered when trying to establish the initial workflow. So, for example, the character generator that is used to create the titles has an option for outputting 4K, but after testing it I realized it wasn’t 4K. It looked like it was just up-rezed.

So I came up with a workflow where, in the character generator, we would make the title larger than we needed it to be and then size it down in Flame. Then we needed a new UHD monitor, the Sony BVM-X300. The broadcast monitor didn’t work anymore, because if you want to see UHD in RGB, it has to have a quad-link output.

Craig, did your color process change at all?
Budrick: No, there wasn’t really any change for me in color. The creative process is still the creative process. The color corrector supports a higher resolution file, so it wasn’t an issue of needing new equipment or anything like that.

What systems do you use?
Budrick: We are primarily an Autodesk facility, so we use Flame, Flame Premium and Lustre for color. We also have Avids.

Can you walk us through the workflow?
Ralston: We don’t do the dailies on this project here; it’s all done in New York at Bling. We receive all the camera master files. While they do use drones and a couple of other cameras, a large percentage of the show is shot on the Red Epic Dragon at 3840×2160.

We get all those camera master files and load them onto our server. Then we receive an Avid bin or sequence from the client and bring that into our Avid in here and we link to those camera master files on the SAN. Once they’re linked, we then have a high-res timeline we can play through. We take the low-res offline version that they gave us and we split it — our editor goes through it and makes sure that everything’s there and matched.

Once that part is complete, we transcode that out of the Avid as DNxHR 444, which is basically 440Mb/s, so the Avid is outputting a UHD file. Once we get that UHD file out of the Avid, we flip that DNxHR MXF file into a DPX sequence at UHD 3840×2160. That’s where Craig picks up on color. He takes that DPX sequence and colors from there.
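
That MXF-to-DPX hand-off happens inside the facility’s Avid/Flame toolset, but as a rough, hypothetical illustration of the same step, here is how a UHD DNxHR MXF could be unwrapped into a numbered DPX sequence with ffmpeg (assuming ffmpeg is installed; the file names are made up).

```python
# Illustrative only: convert a UHD DNxHR MXF into a 16-bit RGB DPX image sequence
# that a color corrector can load as a timeline. Assumes ffmpeg is on the PATH.
import subprocess
from pathlib import Path

def mxf_to_dpx(src_mxf: str, out_dir: str = "dpx_plates") -> None:
    Path(out_dir).mkdir(exist_ok=True)
    subprocess.run([
        "ffmpeg", "-i", src_mxf,
        "-pix_fmt", "rgb48le",          # 16-bit RGB per frame for grading headroom
        f"{out_dir}/frame_%07d.dpx",    # one numbered DPX file per frame
    ], check=True)

mxf_to_dpx("the_path_ep101_uhd.mxf")    # hypothetical episode master
```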

Craig, in terms of the look of the show, what direction were you given?
Budrick: They shoot in New York, so the DP, Yaron Orbach, is in New York. Because of that distance, I had a phone conversation with him to start the look of the show. Then I do a first-day pass and he receives the file. He gives me notes via email on each scene, then he gets the second file, and hopefully I’m there.

Can you give me an example of a note that he has given?
Budrick: It just might be, you know, let’s add some saturation, or let’s bring this scene down. Maybe make it more moody. Bring down the walls.

Overall, as the show has gone along and the stories have developed, it’s gotten a little darker and more twisted, and it’s leaned more toward a moody look and not a whole lot of happy.

Ralston: Because of the distance between us and the DP, we shipped a color-calibrated Sony HD monitor to New York. We wanted to make sure that what he was looking at was an exact representation of what Craig was doing.

Jacqueline, any challenges from your perspective other than the titles and stuff?
LeFranc: Just the differences that I noticed — the render time takes a little longer, obviously, because the files are a little bigger. We have to use certain SAN volumes, because some have larger bandwidths.

Ralston: We have 13 production volumes here, and for this particular show — like the feature mastering that we do — the volume is a 156TB Quantum that is tuned for 4K. In other words, it performs better with these larger files on it.

Did you experiment at all at the beginning?
Ralston: For the first three episodes we had a parallel workflow. Everything we did in UHD, we did in HD as well — we didn’t want the producer showing up to a screening and running into a bandwidth issue. In doing this, we realized we weren’t experiencing bandwidth issues. We kind of underestimated what our SAN could do. So, we abandoned the HD.

Do you think finishing in UHD will be the norm soon?
Ralston: We were unofficially told that this time next year we should plan on doing network shows this way.


Netflix’s ‘Unbreakable Kimmy Schmidt’ gets crisper look via UHD

NYC’s Technicolor PostWorks created a dedicated post workflow for the upgrade.

Having compiled seven Emmy Award nominations in its debut season, Netflix’s Unbreakable Kimmy Schmidt returned in mid-April with 13 new episodes in a form that is, quite literally, bigger and better.

The sitcom, from co-creators Tina Fey and Robert Carlock, features the ever-cheerful and ever-hopeful Kimmy Schmidt, whose spirit refuses to be broken, even after being held captive during her formative years. This season the series has boosted its delivery format from standard HD to the crisper, clearer, more detailed look of Ultra High Definition (UHD).

L-R: Pat Kelleher and Roger Doran

As with the show’s first season, post finishing was done at Technicolor PostWorks New York. Online editor Pat Kelleher and colorist Roger Doran once again served as the finishing team, working under the direction of series producer Dara Schnapper, post supervisor Valerie Landesberg and director of photography John Inwood. Almost everything else, however, was different.

The first season had been shot by Inwood on the Arri Alexa, capturing in 1080p, and finished in ProRes 4444. The new episodes were shot with the Red Dragon, capturing in 5K, and needed to be finished in UHD. That meant the hardware and workflow used by Kelleher and Doran had to be retooled to efficiently manage UHD files four times larger than the first season’s ProRes files.

“It was an eye opener,” recalls Kelleher of the change. “Obviously, the amount of drive space needed for storage is huge. Everyone from our data manager through to the people who did the digital deliveries had to contend with the higher volume of data. The actual hands-on work is not that different from an HD show, but you need the horses to do it.”

Before post work began, engineers from Technicolor PostWorks’ in-house research unit, The Test Lab, analyzed the workflow requirements of UHD and began making changes. They built an entirely new hardware system for Kelleher to use, running Autodesk’s Flame Premium. It consisted of an HP Z820 workstation with Nvidia Quadro K6000 graphics, 64GB of RAM and dual Intel Xeon E5-2687W processors (20M cache, 3.10GHz, 8.00 GT/s Intel QPI). Kelleher described its performance in handling UHD media as “flawless.”

Doran’s color grading suite got a similar overhaul. For him, engineers built a Linux-based workstation running Blackmagic’s DaVinci Resolve 11 and set up a dual monitoring system: a Panasonic 300 series display to view media in 1080p and a Samsung 9500 series curved LED to view UHD. Doran could then review color decisions in both formats (while maintaining a UHD signal throughout) and spot details or noise issues in UHD that might not be apparent at lower resolution.

While the extra firepower enabled Kelleher and Doran to work with UHD as efficiently as HD, they faced new challenges. “We do a lot of visual effects for this show,” notes Kelleher. “And now that we’re working in UHD, everything has to be much more precise. My mattes have to be tight because you can see so much more.”

Doran’s work in the color suite similarly required greater finesse. “You have to be very, very aware,” he says. “Cosmetically, it’s different. The lighting is different. You have to pay close attention to how the stars look.”

Doran is quick to add that, while grading UHD might require closer scrutiny, it’s justified by the results. “I like the increased range and greater detail,” he says. “I enjoy the extra control. Once you move up, you never want to go back.”

Both Doran and Kelleher credited the Technicolor PostWorks engineering team of Eric Horwitz, Corey Stewart and Randy Main for their ability to “move up” with a minimum of strain. “The engineers were amazing,” Kelleher insists. “They got the workflow to where all I had to think about was editing and compositing. The transition was so smooth, you almost forgot you were working in UHD, except for the image quality. That was amazing.”


Quick Chat: Dom Rom, new president/GM of Deluxe TV Post Production Services

Domenic Rom, a fixture in the New York post community for 30 years, has been promoted to president and GM of Deluxe TV Post Production Services. Rom was most recently managing director of Deluxe’s New York studio, which incorporates Encore/Company 3/Method. He will now be leading Deluxe’s global services for television, specifically, the Encore and Level 3 branded companies. He will be making the move to Los Angeles.

Rom’s resume is long. He joined DuArt Film Labs in 1984 as a colorist, working his way up to EVP of the company, running both its digital and film lab divisions. In 2000, he joined stock footage/production company Sekani (acquired by Corbis), helping to build the first fully digital content distribution network. In 2002, he founded The Lab at Moving Images, the first motion picture lab to open in New York in 25 years. It was acquired by PostWorks, which named Rom COO overseeing its Avid rentals, remote set-ups, audio mixing, color correction and editorial businesses. In 2010, Rom joined Technicolor NY as SVP of post production. When PostWorks NY acquired Technicolor NY, Rom again became COO of the now-larger company. He joined Deluxe in 2013 as GM of its New York operations.

“I love what I’m seeing today in the industry,” he says. “It has been said many times, but we’re truly in a golden age of television. The best entertainment in the world is coming from the networks and a whole new generation of original content creators. It’s exciting to be in a position to service that work. There are few, if any, companies that have invested in the research, technology and talent to the degree Deluxe has, to help clients take advantage of the latest advancements — whether it’s HDR, 4K, or whatever comes next, to create amazing new experiences for viewers.”

postPerspective reached out to Rom, as he was making his transition to the West Coast, to find out more about his new role and his move.

What does this position mean to you?
This position is the biggest honor and challenge of my career. I have always considered Encore and Level 3 to be the premier television facilities in the world, and to be responsible for them is amazing and daunting all at the same time. I am so looking forward to working with the facilities in Vancouver, Toronto, New York and London.

What do you expect/hope to accomplish in this new role?
To bring our worldwide teams even closer and grow the client relationships even stronger than they already are, because at the end of the day this is a total relationship business and probably my favorite part of the job.

How have you seen post and television change over the years?
I was talking about this with the staff out here the other day. I have seen the business go from film to 2-inch tape to 1-inch to D2 to D5 to HDCAM (more formats than I can remember) to nonlinear editing and digital acquisition — I could go on and on. Right now the quality and sheer amount of content coming from the studios, networks, cablenets and many, many new creators is both exciting and challenging. The fact that this business is constantly changing helps to keep me young.

How is today’s production and post technology helping make TV an even better experience for audiences?
In New York we just completed the first Dolby Vision project for an entire episodic television season (which I can’t name yet), and it looks beautiful. HDR opens up a whole new visual world to the artists and the audience.

Are you looking forward to living in Los Angeles?
I have always danced with the idea of living in LA throughout my career, and to do so this far in is interesting timing. My family and, most importantly, my brand new grandson are all on the east coast so I will maintain my roots there while spreading them out west as well.


BenQ offering 4K UHD monitor for editing pros

For video editors looking for a new monitor, BenQ America has made available the PV3200PT IPS, which is purpose-built for post workflows. The 32-inch 4K Ultra HD display offers color precision via 10-bit, 100 percent sRGB color, which follows the Rec. 709 standard. Available now, the unit sells for $1,499.

The PV3200PT reproduces color tones with a Delta-E value of less than or equal to two and features a 14-bit 3D LUT for more accurate color mixing and improved RGB color blending. By keeping brightness deviation and chromaticity variation under 10 percent, the monitor offers a more consistent viewing experience. The monitor also offers simple hardware and software calibration, allowing users to adjust the unit’s image processing chip without altering graphics card data.
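
BenQ doesn’t say which Delta-E formula that spec uses (display specs often quote CIE94 or CIEDE2000), but the simplest variant, CIE76, shows what a Delta-E of two or less is measuring: the distance between the intended color and the displayed color in L*a*b* space.

```python
# CIE76 Delta-E: straight-line distance between two colors in CIE L*a*b* space.
# Values at or below ~2 are generally considered barely perceptible.
import math

def delta_e_cie76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (50.0, 20.0, -10.0)   # target patch (L*, a*, b*)
displayed = (50.8, 20.5, -10.9)   # what the panel actually shows (made-up numbers)
print(f"Delta-E: {delta_e_cie76(reference, displayed):.2f}")  # ~1.3, within a <=2 spec
```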

An OSD controller provides preset custom modes so users can easily switch between Rec. 709, EBU and SMPTE-C modes. The PV3200PT is part of BenQ’s Eye-Care models, which are designed to increase visual comfort while performing common computer tasks. While conventional screens flicker at a rate of 200 times per second, BenQ’s ZeroFlicker technology eliminates flickering at all brightness levels, which reduces eye fatigue and provides a more comfortable viewing experience during prolonged sessions of computer use. Further capabilities include ergonomic customization such as height, tilt, pivot and swivel adjustments.

Watch this space in coming weeks for a review of the product via video editor Brady Betzel.


UHD Alliance’s Victor Matsuda: updates from NAB 2016

Victor Matsuda from the UHD Alliance was at NAB 2016. The Alliance was formed about 15 months ago as 4K UHD products began exploding into the market. The goal of the Alliance was to establish certifications for these new products and for content. All of this is to ensure a quality experience for consumers, who will ultimately drive 4K/UHD adoption throughout the market.

Watch our video with Matsuda to find out more.


NAB 2016 from an EP’s perspective

By Tara Holmes

Almost two weeks ago, I found myself at NAB for the first time. I am the executive producer of color and finishing at Nice Shoes, a post production studio in New York City. I am not an engineer and I am not an artist, so why would an EP go to NAB? I went because one of my main goals for 2016 is to make sure the studio remains at the forefront of technology. While I feel that our engineering team and artists represent us well in that respect, I wanted to make sure that I, along with our producers, were fully educated on these emerging technologies.

One of our first priorities for NAB was to meet with the top monitor manufacturers in the hope of landing on which UHD HDR monitors would meet our standards for professional client viewing. We came to the conclusion that the industry is not there yet, and that we have more research to do before we upgrade our studio viewing environments.

Everyone with me was in agreement: the monitors aren’t where they need to be. Most only output around 400-800 nits and have luminance and contrast issues. None of this should stop the process of coloring for HDR, though. As the colorist’s master monitor, the Sony BVM-X300 OLED, which we are currently using, seems to be the ideal choice, as you can still work in traditional Rec 709 as well as Rec 2020 for HDR.

After checking out some monitors, we headed to the FilmLight booth to go over the 5.0 upgrades to Baselight. Our colorist Ron Sudul, along with Nice Shoes Creative Studio VFX supervisor Adrian Winter, sat with me and the FilmLight reps to discuss the upgrades, which included incredible new isolation tracking capabilities. These upgrades will reinvent what can be achieved in the color suite, from realtime comps to retouch being done in color. The possibilities are exciting.

I also spent time learning about the upgrades to FilmLight’s Flip, which is their on-set color hardware. The Flip lets you develop your color look on set, apply it during your edit process (with the Baselight plug-in for Avid) and refine it in final color, all without affecting your RAW files. In addition to the Flip, they have developed software that supports on-set look development and grading called Prelight. I asked if these new technologies could enable us to do high-end work like sky replacements on set, and was told that the hardware within the Flip very well could.

We also visited our friends at DFT, the manufacturers of the Scanity film scanner, to catch up and discuss the business of archiving. With Scanity, Nice Shoes can scan at 4K when other scanners only go up to 2K resolution. It’s a vital tool not only for preserving past materials, but also for future-proofing film scans for emerging formats.

VR
On Sunday evening before the exhibits opened, we attended a panel on VR that was hosted by the Foundry. At this event we got to experience a few of the most talked about VR projects including Defrost, one of the first narrative VR films, from the director of Grease, Randal Kleiser, who was on the panel along with moderator Morris May (CEO/founder, Specular Theory), Bryn Mooser (co-founder, RYOT), Tim Dillon (executive producer, MPC) and Jake Black (head of VR, Create Advertising).

The Foundry’s VR panel.

The panel inspired me to delve deeper into the VR world, and on Wednesday I spent most of my last day exploring the Virtual & Augmented Reality Pavilion. In addition to seeing the newest VR camera rig offerings, experiencing a live VR feed and demoing the Samsung Gear, I explored viewing options for the color workflow. Some people I spoke to said that multiple Oculus setups attached to a single feed was the way to go for a color workflow, but another option we explored in a very preliminary way was a “dome,” which offers a focused 180-degree view so that everyone involved can comment on the same section of a VR scene. This would ensure everyone is experiencing and viewing the same thing at the same time.

HDR Workflow
Another panel we attended was about HDR workflows. Nice Shoes has already had the opportunity to work on HDR material and has begun to develop workflows for this emerging medium. Most HDR deliverables are for episodic and long-form content for companies such as Netflix, Hulu and the like. It may be some time before commercial clients request an HDR deliverable, but the workflows will be much the same, so the development being done now is extremely valuable.

My biggest takeaway was that there are still no set standards. There’s Dolby Vision vs. HDR10 vs. PQ vs. others. But it appears that everyone agrees standards are not needed right now: we need to get tools into the hands of the artists and figure out what works best, and standards will come out of that. The good news is that we appear to be future-proofed should the standard change. Meaning, for the most part, every camera we are shooting on is shooting for HDR, and should standards change — say from 1,000 nits to 10,000 nits — the footage and process are still there to go back in and color for the new request.
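
As a purely illustrative aside, the PQ curve mentioned above (standardized as SMPTE ST 2084) encodes absolute luminance up to 10,000 nits, which is why a 1,000-nit grade occupies only part of the signal range and the same footage can later be re-graded for brighter targets. A minimal sketch using the published constants:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> nonlinear signal (0-1).
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = max(nits, 0.0) / 10000.0          # normalize to the 10,000-nit PQ ceiling
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for level in (100, 1000, 10000):
    print(f"{level:>6} nits -> PQ code value {pq_encode(level):.3f}")
# 1,000 nits lands around 0.75, leaving headroom up to the 10,000-nit maximum
```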

Summing Up
I truly believe my time spent at NAB has prepared me for the myriad of questions that will be put forth throughout the year and will help us develop our workflows to evolve the creative process of post. I’ll be sure to be there again next year in order to prepare myself for the questions of 2017 and beyond.

Our Main Image: The view walking into the South Hall Lower at the LVCC.