
JVC GY-LS300CH camera offering 4K 4:2:2 recording, 60p output

JVC has announced version 4.0 of the firmware for its GY-LS300CH 4KCAM Super 35 handheld camcorder. The new firmware increases color resolution to 4:2:2 (8-bit) for 4K recording at 24/25/30p onboard to SDXC media cards. In addition, the IP remote function now allows remote control and image viewing in 4K. When using 4K 4:2:2 recording mode, the video output from the HDMI/SDI terminals is HD.

The GY-LS300CH also now has the ability to output Ultra HD (3840 x 2160) video at 60/50p via its HDMI 2.0b port. Through JVC’s partnership with Atomos, the GY-LS300CH integrates with the new Ninja Inferno and Shogun Inferno monitor recorders, triggering recording from the camera’s start/stop operation. Plus, when the camera is set to J-Log1 gamma recording mode, the Atomos units will record the HDR footage and display it on their integrated, 7-inch monitors.

“The upgrades included in our Version 4.0 firmware provide performance enhancements for high raster recording and IP remote capability in 4K, adding even more content creation flexibility to the GY-LS300CH,” says Craig Yanagi, product marketing manager at JVC. “Seamless integration with the new Ninja Inferno will help deliver 60p to our customers and allow them to produce outstanding footage for a variety of 4K and UHD productions.”

Designed for cinematographers, documentarians and broadcast production departments, the GY-LS300CH features JVC’s 4K Super 35 CMOS sensor and a Micro Four Thirds (MFT) lens mount. With its “Variable Scan Mapping” technology, the GY-LS300CH adjusts the sensor to provide native support for MFT, PL, EF and other lenses, which connect to the camera via third-party adapters. Other features include Prime Zoom, which allows shooters using fixed-focal (prime) lenses to zoom in and out without loss of resolution or depth, and a built-in HD streaming engine with Wi-Fi and 4G LTE connectivity for live HD transmission directly to hardware decoders as well as JVCVideocloud, Facebook Live and other CDNs.

The Version 4.0 firmware upgrade is free of charge for all current GY-LS300CH owners and will be available in late May.

Sony intros extended-life SSDs for 4K or higher-bitrate recording 

Sony is expanding its media lineup with the introduction of two new G Series professional solid-state drives in 960GB (SV-GS96) and 480GB (SV-GS48) capacities. Sony says that these SSDs were designed to meet the growing need for external video recording devices docked to camcorders or high-performance DSLRs.

The new SSDs are an option for these external video recorders, offering videographers stable high-speed performance, peace of mind and a lower cost of ownership thanks to their longer life. Using Sony's Error Correction Code technology, the 960GB G Series SSD achieves up to 2400TBW (Terabytes Written), while the 480GB drive can reach 1200TBW, resulting in less frequent replacement and increased ROI. 2400TBW translates to about 10 years of use for the SV-GS96, if data is fully written to the drive an average of five times per week.
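
That lifespan claim is easy to sanity-check. Here is the arithmetic in a few lines of Python, assuming each "full write" means filling the drive to capacity and using a 52-week year:

```python
# Back-of-the-envelope check of Sony's TBW-to-lifespan claim.
# Assumptions (not from Sony's spec sheet): each "full write" fills the
# drive to capacity, and a year is 52 weeks.

def years_of_life(capacity_tb, tbw_rating, full_writes_per_week=5):
    """Estimate drive lifespan in years from its TBW endurance rating."""
    tb_written_per_year = capacity_tb * full_writes_per_week * 52
    return tbw_rating / tb_written_per_year

print(f"SV-GS96: {years_of_life(0.96, 2400):.1f} years")  # ~9.6, i.e. roughly 10 years
print(f"SV-GS48: {years_of_life(0.48, 1200):.1f} years")  # same ratio, also ~9.6 years
```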

According to Sony, the drives are also designed for ultra-fast, stable data writing. Sony G Series SSDs feature built-in technology that prevents sudden speed drops, ensuring stable recording of high-bitrate 4K video without dropped frames. For example, when used with an Atomos Shogun Inferno, G Series SSDs can stably record 4K 60p (ProRes 422 HQ) video.

When paired with the necessary connection cables, the new G Series drives can be effortlessly removed from a recorder and connected to a computer for file downloading, making editing easier and faster with read speeds up to 550MB/s.

G Series SSDs also offer data protection technology that keeps content secure and intact, even if a sudden power failure occurs. Adding to the drive's stability, it features a durable connector that withstands repeated insertion and removal — up to 3,000 cycles, or six times the tolerance of standard SATA connectors — even in harsh conditions.

Sony’s SSD G Series is expected to be available May 2017 at the suggested retail prices of $539 for SV-GS96 and $287 for SV-GS48.


DP John Kelleran shoots Hotel Impossible

Director of photography John Kelleran shot season eight of the Travel Channel’s Hotel Impossible, a reality show in which struggling hotels receive an extensive makeover by veteran hotel operator and hospitality expert Anthony Melchiorri and team.

Kelleran, who has more than two decades of experience shooting reality/documentary projects, called on Panasonic VariCam LT 4K cinema camcorders for this series.

Working for New York production company Atlas Media, Kelleran shot a dozen hour-long Hotel Impossible episodes in locations that include Palm Springs, Fire Island, Cape May, Cape Hatteras, Sandusky, Ohio, and Albany, New York. The production, which began last April and wrapped in December 2016, spent five days in each location.

Kelleran liked the VariCam LT’s dual native ISOs of 800/5000. “I tested ISO5000 by shooting in my own basement at night, and had my son illuminated only by a lighter and whatever light was coming through the small basement window, one foot candle at best. The footage showed spectacular light on the boy.”

Kelleran regularly deployed ISO5000 on each episode. “The crux of the show is chasing out problems in dark corners and corridors, which we were able to do like never before. The LT’s extreme low light handling allowed us to work in dark rooms with only motivated light sources like lamps and windows, and absolutely keep the honesty of the narrative.”

Atlas Media is handling the edit, using Avid Media Composer. “We gave post such a solid image that they had to spend very little time or money on color correction, but could rather devote resources to graphics, sound design and more,” concludes Kelleran.


Review: GoPro’s Karma Grip and Quik Key

By Brady Betzel

There has been a flood of GoPro-compatible accessories introduced over the last several years, with few having as much impact as handheld stabilizers. Stabilizers have revolutionized videography (more specifically GoPro videography) and they are becoming extremely compact and very reasonably priced.

A while ago, I reviewed a GoPro Hero 3- and 4-compatible handheld stabilizer from Polaroid, which was good but had a few kinks to work out, like a somewhat clumsy way of mounting your camera.

Over the last year, GoPro has ventured into the drone market with the Karma Drone, which unfortunately fell from grace — it was recalled because of a battery latch issue — but has recently returned to the market.

When I first got my hands on the Karma Drone (the initial release), I immediately saw the benefit of buying GoPro's drone. Along with the GoPro Karma Drone came the Karma Grip, a handheld stabilizer for the newly released Hero 5 action camera. It is really mind-blowing to fly a drone one minute, then remove the Karma Grip from the Karma Drone seconds later and start creating beautifully smooth shots. Handheld stabilizers like the GoPro Karma Grip have really helped shooters create more cinematically styled footage at a relatively low cost.

When GoPro sent me the Karma Grip to borrow for a few weeks, I was really excited. I received the Karma Grip between the time they recalled the Karma Drone and when they subsequently re-released it. In addition to the Karma Grip they sent me the Quik Key, a mobile microSD card reader.

In this review I’m going to share my experience with the Karma Grip as well as touch on the Quik Key and why it’s a phenomenal accessory if you want to quickly upload photos from your GoPro action cam.

Jumping In
When testing the Karma Grip I used my GoPro Hero 5 Black Edition, which is important to note because the Hero 5 has a different case build than previous GoPro models. You’ll need to purchase a different harness if you have a Hero 4. Nonetheless, I love the Hero 5. While the Hero 4 and Hero 5 have similar camera sensors, they have some major differences. First, the Hero 5 has some really sweet voice control. I’m not a huge Siri user, so I was initially skeptical when GoPro tried to sell me on the voice control. To my surprise I love it, especially when paired with the Remo waterproof voice-activated remote. To not be a total GoPro fanboy, I will avoid reviewing the Remo for now but it’s something that I really love.

The Hero 5 has a built-in waterproof housing (unlike previous versions that needed a separate waterproof housing), voice activation, easy-to-use touch screen menu system and many other features. What I’m getting at is that the Karma Grip comes out of the box to fit the Hero 5, but you can purchase the Hero 4 Harness for an additional $29.99. The Session mount will be released later in the summer.

What makes the GoPro Karma Grip different from other handheld stabilizers, in my opinion, is its build quality, ease of use and GoPro-focused mounting options. Right out of the box you get four key components: the removable grip handle ($99.99), mounting ring ($29.99) and stabilizer ($249.99) with the Hero 5 harness attached ($29.99). It all comes in a form-fitted case. The case is sturdy but kind of reminds me of a trombone case; it does the job but is a little unwieldy. When you buy the Karma Grip as a set it retails for $299.99, which is a little pricey but, in my opinion, completely worth it — especially if you plan to buy the Karma Drone later, since the drone can be purchased separately.

If you know you are going to buy the Karma Drone, you should probably just go ahead and buy the whole drone package now ($1099.99 Karma Drone with the Grip and Hero 5, $799.99 Karma Drone with the Grip). If you decide you want the Karma Drone you can purchase the Flight Kit for the Karma Grip for $599. For those counting at home that comes to $899 if you purchase the Grip and the Karma Drone separately, so it’s definitely a better deal to buy it all at once if you can.
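
Here is that math laid out in a couple of lines of Python, using the prices quoted above (MSRPs at the time of this review, so check current pricing before buying):

```python
# Compare the two purchase paths using the prices quoted above.
grip_alone = 299.99
flight_kit = 599.00                 # Karma Drone "Flight Kit" added later
bundle_drone_plus_grip = 799.99     # Karma Drone with the Grip, bought together

piecemeal = grip_alone + flight_kit
print(f"Grip now, drone later: ${piecemeal:.2f}")                 # ~$899
print(f"Drone + Grip bundle:   ${bundle_drone_plus_grip:.2f}")
print(f"Savings for buying it all at once: ${piecemeal - bundle_drone_plus_grip:.2f}")  # ~$99
```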

Once I opened the form-fitted Karma Grip case, I plugged the USB-C charging cable into the base of the Karma Grip handle. I kind of wish the cable plugged in somewhere other than the base, since I like to rest stabilizers on their base, but it's not a big deal if you have your case around. I set the Karma Grip to charge overnight, but the manual says it will take six hours on a standard 1A charger, and one hour and 50 minutes if you use the "Supercharger" — immediately I was like, what the hell is this Supercharger and why don't I have one? They are $49.99 and can be found here.

So the next day I tried using the Karma Grip in conjunction with a suction cup mount inside of my car on my ride home from work. I wanted to see how the Karma Grip would work when mounted to a windshield (inside my car) to film a driving timelapse. To attach the Karma Grip you have to put a separate mounting ring between the handle and the stabilizer. Like a typical bonehead, I didn’t read the manual, so I tried mounting the ring with the GoPro mount. It took me a few tries to get it on right, but once it is on it actually feels very sturdy.

From there you have to do a typical GoPro mount connecting dance to get everything situated. You can check out my results here.

Admittedly, I probably should have locked the view of the Karma Grip to keep it pointed straight forward, but I didn't. It worked okay, but I definitely would need way more time to perfect this. However, if you can lock your Karma Grip onto something like the side of a train or airplane, your shots will become way smoother.

On the Move
Next I wanted to test running around with the Karma Grip. Once you lock your Hero 5 into the harness on the Karma Grip, it's as simple as powering on your Grip and hitting record. You can flip the Grip over to record a ground-level view very easily. Flipping the Karma Grip over to a ground-level view was the easiest transition on a handheld stabilizer I have ever experienced. Usually you either have to tell the stabilizer that you want to film ground level or perform a specific motion so the stabilizer doesn't flip out. The Karma Grip is incredibly easy to use; it lets you film smoothly with minimal effort.

To go a little further into testing, I made a makeshift mount using a pipe and a 2×4 I had lying around. I screwed some sticky GoPro mounts to the 2×4 for mounting. In the end, I wanted my Hero 4 on the 2×4 running alongside my Hero 5 in the Karma Grip to demonstrate just how stable the Karma Grip makes your footage. You can check it out here. After a few hours of using the Karma Grip, I really felt like I had many more options when filming. I saw a staircase and knew I could run up it without my steps being reflected in my video recording; it really opens your creative brain.

One thing I wish was more easily accessible was a mount for an external microphone. In my video, I separated the audio, putting the GoPro Hero 4 that wasn't mounted in the Karma Grip on the left and the Hero 5 that was mounted in the Karma Grip on the other side, so I could hear the difference. Once in the Karma Grip, the Hero 5's audio becomes pretty muted. I know that GoPros aren't necessarily meant to be used with external mics, but since the GoPro's onboard audio isn't always up to the task, I sometimes use an external mic mounted on something like the iOgrapher Go or even the Karma Grip. If the Karma Grip could somehow mount a microphone, and possibly integrate a ¼-inch jack instead of making you buy a $49.99 converter, I would be very happy.

Quik Key
The Quik Key is a great addition to the GoPro accessory line and is available with a Lightning connector for iOS ($29.99), Micro-USB ($19.99) and even USB-C ($19.99). It works directly with the GoPro Capture app on your mobile device to transfer photos and videos without having to hook your GoPro or microSD card up to your computer. Based on support documents, it seems Android phones support more formats and resolutions, but since I have an iPhone, the iOS version is what I am dealing with. You can get the specific iPhone resolution compatibility chart here. It's interesting to note that ProTune footage is specifically not compatible with iOS.

The Quik Key is great for my dad adventures (or dad-ventures!) to Disneyland, Knott’s Berry Farm, hikes, baseball games, etc. If for some odd reason one of my sons takes a nap, I can transfer some videos or images to my phone and upload them to the web while on the run. The Quik Key comes with a carabiner-style clip to hang on, but it’s definitely small enough to keep in your pocket with the Remo remote. I love the Remo for the same dad-ventures with the kids; you can use the button as a shutter release and also change shooting modes from video to photos by just saying “GoPro Photo Mode.”

Summing Up
In the end, while GoPro is digging their way out of the Karma Drone battery latch caper, I continue to love their gear. The GoPro Hero 5 is my favorite camera they’ve made to date and it’s easy to take along since you no longer need an external housing to keep it waterproof. All of the GoPro accessories like the Karma Grip, Hero 5, Hero 5 Session, mounts, three-way mount and practically anything else fit perfectly in my favorite GoPro bag, The Seeker. It’s an incredible bag that even comes with room enough for your CamelBak water bladder.

The Karma Grip is smooth and super easy to use, and it works flawlessly with the Hero 5. Coming in spring 2017 is the Karma Grip extension cable, which allows you to put your Grip handle out of sight and mount the stabilizer inconspicuously, something I bet a lot of television shows will want to use, opening the GoPro creativity door a little more.

I really love GoPro products. Even if there are other options out there, I always know that, for the most part, the GoPro product line is made up of high-quality accessories and cameras that everyone from moms and dads to professional broadcasters rely on. I can even give my GoPro to my sons to run around with and get muddy without a care in the world, letting them capture the world from their own point of view. The GoPro product line, including the Karma Grip, is full of awesome gear that I can't recommend enough.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Building a workflow for The Great Wall

Bling Digital, which is part of the SIM Group, was called on to help establish the workflow on Legendary/Universal’s The Great Wall, starring Matt Damon as a European mercenary imprisoned within the wall. While being held he sees exactly why the Chinese built this massive barrier in the first place — and it’s otherworldly. This VFX-heavy mystery/fantasy was directed by Yimou Zhang.

We reached out to Bling's director of workflow services, Jesse Korosi, to talk us through the process on the film, including working with data from the Arri 65, which at that point hadn't yet been used on a full-length feature film. Bling Digital is a post technology and services provider that specializes in on-set data management, digital dailies, editorial system rentals and data archiving.

Jesse Korosi

When did you first get involved on The Great Wall and in what capacity?
Bling received its first call from unit production manager Kwame Parker about providing on-set data management, dailies, VFX and stereo pulls, Avid rentals and a customized process for the digital workflow for The Great Wall in December of 2014.

At this time the information was pretty vague, but outlined some of the bigger challenges, like the film being shot in multiple locations within China, and that the Arri 65 camera may be used, which had not yet been used on a full-length feature. From this point on I worked with our internal team to figure out exactly how we would tackle such a challenge. This also required a lot of communication with the software developers to ensure that they would be ready to provide updated builds that could support this new camera.

A big part of my job, and that of anyone on my workflow team, is to get involved as early as possible, which meant early conversations with the DP Stuart Dryburgh, the studio and a few other members of production. Our role doesn't necessarily start on day one of principal photography. We want to get in and start testing and communicating with the rest of the crew well ahead of time so that by the first day, the process runs like a well-oiled machine and the client never has to be concerned with "week-one kinks."

Why did they opt for the Arri 65 camera and what were some of the challenges you encountered?
Many people who we work with love Arri. The cameras are known for recording beautiful images. Anyone who isn't a huge Arri fan might dislike the lower resolution of some of the cameras, but it is very uncommon for someone not to like the final look of the recorded files. Enter the Arri 65, a new camera that records 6.5K files (6560×3100) at a whopping 2.8TB per hour.

When dealing with this kind of data consumption, you really need to re-evaluate your pipeline. The cards are not able to be downloaded by traditional card readers — you need to use vaults. Let’s say someone records three hours of footage in a day — that equals 8.7TB of data. If you’re sending that info to another facility even using a 500Mb/s Internet line, that would take 38 hours to send! LTO-ing this kind of media is also dreadfully slow. For The Great Wall we ended up setting up a dedicated LTO area that had eight decks running at any given time.
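
To put those figures in perspective, here is the arithmetic in a short Python sketch, using the article's roughly 2.8TB/hour figure, decimal terabytes and an ideal 500Mb/s line with no protocol overhead:

```python
# Worked example of the Arri 65 data math described above.
# Assumptions: ~2.8TB recorded per hour (the figure quoted in the article),
# decimal units (1TB = 10^12 bytes), and a 500Mb/s line running flat out.

TB = 1e12          # bytes
HOURLY_RATE_TB = 2.8

def shoot_day_tb(hours_recorded):
    """Data generated for a given number of recorded hours."""
    return hours_recorded * HOURLY_RATE_TB

def transfer_hours(total_tb, line_mbps=500):
    """Time to push a given volume over an ideal link of line_mbps megabits/sec."""
    bits = total_tb * TB * 8
    return bits / (line_mbps * 1e6) / 3600

print(f"3 hours of footage: ~{shoot_day_tb(3):.1f} TB")                  # ~8.4 TB
print(f"Sending 8.7 TB at 500Mb/s: ~{transfer_hours(8.7):.1f} hours")    # ~38.7 hours
```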

Aside from data consumption, we faced the challenge of having no dailies software that could even read the files. We worked with Colorfront to get a new build that could, and luckily, having been through this same ordeal recording Arri Open Gate on Warcraft, we knew how to make this happen and set the client at ease.

Were you on set? Near set? Remote?
Our lab was located in the production office, which also housed editorial. Considering all of the traveling this job entailed, from Beijing and Qingdao to Gansu, we were mostly working remotely. We wanted to be as close to production as possible, but still within a controlled environment.

The dailies set-up was right beside editor Craig Wood’s suite, making for a close-knit workflow with editorial, which was great. Craig would often pull our dailies team into his suite to view how the edit was coming along, which really helped when assessing how the dailies color was working and referencing scenes in the cut when timing pickup shots.

How did you work with the director and DP?
At the start of the show we established some looks with the DP, Stuart Dryburgh, ASC. The idea was that we would handle all of the dailies color in the lab. The DIT/DMT would note as much valuable information as possible on set about the conditions that day, and we would use our best judgment to fulfill the intended look. During pre-production we used a theatre at the China Film Group studio to screen and review all the test materials and dial in this look.

With our team involved from the very beginning of these color talks, we were able to ensure that decisions made on color and data flow would track through each department, all the way to the end of the job. It's very common for color decisions made at the start of a job to get lost in the shuffle once production has wrapped. Plus, by the time you're in the post stage, there sometimes isn't anyone available who knows why certain decisions were made up front.

Can you talk us through the workflow? 
In terms of workflow, the Arri 65 was recording media onto Codex cards, which were backed up on set with a Vault S. After this media was backed up, the Codex card would be forwarded on to the lab. Within the lab we had a Vault XL that would then be used to back this card up to the internal drive. Unfortunately, you can't go directly from the card to your working drive; you need to do two separate passes on the card, a "Process" and a "Transfer."

The Transfer moves the media off the card and onto an internal drive on the Vault. The Process then converts all the native camera files into .ARI files. Once this media was processed and on the internal drive, we were able to move it onto our SAN. From there we were able to run the footage through OSD and make LTO back-ups. We also made additional back-ups to G-Tech G-Speed Studio drives that would be sent back to LA. However, for security purposes as well as efficiency, we encrypted and shipped the bare drives rather than the entire chassis. This meant that when the drives were received in LA, we were able to mount them into our dock and work directly off of them, i.e., no need to wait on any copies.

Another thing that required a lot of back and forth with the DI facility was ensuring that our color pipeline followed the same path they would take once they hit final color. We ended up having input LUTs for any camera that recorded a non-LogC color space. As for my involvement, during production in China I had a few members of my team on the ground and I was overseeing things remotely. Once things came back to LA and we were working out of Legendary, I became much more hands-on.

What kind of challenges did providing offline editorial services in China bring, and how did that transition back to LA?
We sent a tech to China to handle the set-up of the offline editorial suites and also had local contacts to assist during the run of the project. Our dailies technicians also helped with certain questions or concerns that came up.

Shipping gear for the Avids is one thing; shipping consoles (desks) for the editors, however, would have been far too heavy. So this was probably one of the bigger challenges — ensuring the editors were working with the same caliber of workspace they were used to in Los Angeles.

The transition of editorial from China to LA required Dave French, director of post engineering, and his team to mirror the China set-up in LA and have both up and running at the same time to streamline the process. Essentially, the editors needed to stop cutting in China and have the ability to jump on a plane and resume cutting in LA immediately.

Once back in LA, you continued to support VFX, stereo and editorial, correct?
Within the Legendary office we played a major role in building out the technology and workflow behind what was referred to as the Post Hub. This Post Hub was made up of a few different systems all KVM’d into one desk that acted as the control center for VFX and stereo reviews, VFX and stereo pulls and final stereo tweaks. All of this work was controlled by Rachel McIntire, our dailies, VFX and stereo management tech. She was a jack-of-all-trades who played a huge role in making the post workflow so successful.

For the VFX reviews, Rachel and I worked closely with ILM to develop a workflow to ensure that all of the original on-set/dailies color metadata would carry into the offline edit from the VFX vendors. It was imperative that during this editing session we could add or remove the color, make adjustments and match exactly what they saw on set, in dailies and in the offline edit. Automating this process through values from the VFX editor's EDLs was key.
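
Korosi doesn't detail the exact mechanism, but one common way to automate this kind of color hand-off is to carry ASC CDL values as comments in the EDL and reapply them in software. The sketch below illustrates that general idea in Python with NumPy. The EDL excerpt and its values are hypothetical, and this is not necessarily how Bling or ILM built their pipeline:

```python
import re
import numpy as np

# Hypothetical EDL excerpt; many EDLs carry per-event *ASC_SOP / *ASC_SAT comments.
EDL_TEXT = """\
001  A001C003 V     C        01:00:10:00 01:00:14:00 01:00:00:00 01:00:04:00
*ASC_SOP (1.0200 0.9800 1.0000)(-0.0100 0.0050 0.0000)(1.1000 1.0000 0.9500)
*ASC_SAT 0.9000
"""

SOP_RE = re.compile(r"\*ASC_SOP\s*\(([^)]+)\)\s*\(([^)]+)\)\s*\(([^)]+)\)")
SAT_RE = re.compile(r"\*ASC_SAT\s*([\d.]+)")

def parse_cdl(edl_text):
    """Pull slope/offset/power triples and saturation out of EDL comment lines."""
    sop = SOP_RE.search(edl_text)
    sat = SAT_RE.search(edl_text)
    slope, offset, power = (np.array([float(v) for v in g.split()]) for g in sop.groups())
    return slope, offset, power, float(sat.group(1)) if sat else 1.0

def apply_cdl(rgb, slope, offset, power, sat):
    """Standard ASC CDL: per-channel slope/offset/power, then saturation around Rec.709 luma."""
    out = np.clip(rgb * slope + offset, 0.0, 1.0) ** power
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    return luma[..., None] + sat * (out - luma[..., None])

slope, offset, power, sat = parse_cdl(EDL_TEXT)
frame = np.random.rand(4, 4, 3).astype(np.float32)   # stand-in for a decoded frame
graded = apply_cdl(frame, slope, offset, power, sat)
print(graded.shape)
```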

Looking back on the work provided, what would you have done differently knowing what you know now?
I think the area I would focus on next time around would be upgrading the jobs database. With any job we manage at Bling, we always ensure we keep a log of every file recorded and any metadata that we track. At the time, this was a little weak. Since then, I have been working on overhauling this database and allowing creatives to access all camera metadata, script metadata, location data, lens data, etc. in one centralized location. We have just used this on our first job in a client-facing capacity, and I think it would have done wonders for our VFX and stereo crews on The Great Wall. All too often, people are digging around for information already captured by someone else. I want to make sure there is a central repository for that data.


An image scientist weighs in about this year’s SciTech winners

While this year's Oscar broadcast was unforgettable due to the mix-up in naming the Best Picture, many in the industry also remember actors Leslie Mann and John Cho joking about how no one understands what the SciTech Awards are about. Well, Shed's SVP of imaging science, Matthew Tomlinson, was kind enough to answer some questions about the newest round of winners and what the technology means to the industry.

As an image scientist, what was the most exciting thing about this year’s Oscars’ Scientific and Technical Awards?
As an imaging scientist, I was excited about the five digital cameras — Viper, Genesis, Sony 65, Red Epic and Arri — that received accolades. I’ve been working with each of these cameras for years, and each of them has had a major impact in the industry. They’ve pioneered the digital revolution and have set a very high standard for future cameras that appear on the market.

The winners of the 2017 SciTech Awards. Credit: Todd Wawrychuk/A.M.P.A.S.

Another exciting aspect is that you actually have access to your “negative” with digital cameras and, if need be, you can make adjustments to that negative after you’ve exposed it. It’s an incredibly powerful option that we haven’t even realized the full potential of yet.

From an audience perspective, even though they’ll never know it, the facial performance capture solving system developed by ILM, as well as the facial performance-based software from Digital Domain and Sony Pictures Imageworks, is incredibly exciting. The industry is continuously pushing the boundaries of the scope of the visual image. As stories become more expansive, this technology helps the audience to engage with aliens or creatures that are created by a computer but based on the actions, movements and emotions of an actor. This is helping blur the lines between reality and fantasy. The best part is that these tools help tell stories without calling attention to themselves.

Which category or discipline saw the biggest advances from last year to this year? 
The advancements in each technology that received an award this year are based on years of work behind the scenes that led up to this moment. I will say that from an audience perspective, the facial animation advancements were significant this past year. We’re reaching a point where audiences are unaware major characters are synthetic or modified. It’s really mind blowing when you think about it.

Sony’s Toshihiko Ohnishi.

Which of the advancements will have the biggest impact on the work that you do, specifically?
The integration of digital cameras and intermixing various cameras into one project. It’s pretty common nowadays to see the Sony, Alexa and Red camera all used on the same project. Each one of these cameras comes with its own inherent colorspace and particular attributes, but part of my job is to make sure they can all work together — that we can interweave the various files they create — without the colorist having to do a lot of technical heavy lifting. Part of my job as an Imaging Scientist is handling the technicalities so that when creatives, such as the director, cinematographer and colorist, come together they can concentrate on the art and don’t have to worry about the technical aspects much at all.

Are you planning to use, or have you already begun using, any of these innovations in your work?

The digital cameras are very much part of my everyday life. Also, in working with a VFX house, I like to provide the knowledge and tools to help them view the imagery as it will be seen in the DI. The VFX artist spends an incredible amount of time and effort on every pixel they work on and it’s a real goal of mine to make sure that the work that they create is the best it can be throughout the DI.


Blackmagic intros lower-cost color panels for Resolve, new camera

By Brady Betzel

Yesterday, Blackmagic held a press conference on YouTube introducing a new pro camera — the Ursa Mini Pro 4.6K, which combines high-end digital film quality with the ergonomics and features of a traditional broadcast camera — and two new portable hardware control panels for DaVinci Resolve (yes, Resolve only), designed to let color correction workflows mix in with editing workflows.

For this article, I’m going to focus on the panels.

The color correction hardware market is a small one, usually dominated by the same companies that produce color correction software. Tangent is one of the few independent companies producing color correction panels. There is also the Avid/Euphonix Artist color correction panel and a few others, but the price jumps dramatically when you step up to panels like the Blackmagic DaVinci Resolve Advanced panel (just under $30,000).

I've previously reviewed the Tangent Ripple and Element color correction panels, and I love them. However, besides Tangent there really haven't been any mid-level to prosumer products… until now. Blackmagic is offering the new Micro and Mini color correction panels.

Blackmagic's Micro color correction panel (our main image) is well priced at $995, which can be loosely compared to the Tangent Wave (over $1,500 on B&H's site) or Tangent Element Tk (over $1,135), or more closely compared to the Avid Artist Color Control Surface ($1,299). You'll notice all of those are priced well above the new Micro panel. You could also throw the Tangent Ripple in for comparison, but that has much more limited functionality and is much lower in price at around $350. The Micro panel is essentially three trackballs, 12 knobs and 18 keys. It is a collection of the most heavily used parts of a color correction panel without any GUI screens. It connects via USB-C, although a USB 3 to USB-C converter will be included.

The Blackmagic Mini color correction panel (pictured right) is priced higher at $2,995 and can be compared to a combination of the Tangent Element Tk with one or two more panels from the Element set, which retail for $3,320 on www.bhphotovideo.com. The Mini adds two 5-inch displays, eight soft buttons and eight soft knobs, in addition to everything the Micro panel has. It also has pass-through Ethernet to power and connect the panel, USB-C, and a 4-pin XLR 12V DC power connection.

I am really excited to try these color correction panels out for myself — and I will, as the panels are on their way to me as I type. I need to emphasize that these panels only work with Resolve, no other software apps, so they were built with one workflow in mind.

I do wonder if in the future Blackmagic will sell additional panels that add more buttons and knobs or something crazy like a Smartscope through the Ethernet ports so I don’t have to buy additional SDI output hardware. Will everyone be ok with transport controls being placed on the right?

“We are always looking to design new products and features to help with the creative process,” says Blackmagic’s Bob Caniglia. “These new panels were designed to enable our growing number of Resolve users to be able access the power of DaVinci Resolve and Resolve Studio beyond a mouse and keyboard. The Micro and Mini control panels provide the perfect complement to our existing Advanced control panels.”

Blackmagic is really coming for everyone in the production and post world with recent moves like the acquisitions of audio company Fairlight and of Ultimatte, maker of realtime bluescreen and greenscreen keying hardware, providing Avid with its Media Composer DNx IOs, and even releasing an updated version of the Ursa camera, the Ursa Mini Pro. Oh, yeah, and don't forget that they provide one of the top color correction and editing apps on the market in DaVinci Resolve. The latest color correction hardware, like the Micro and Mini panels, is primed to bring the next set of colorists into the Resolve world.

Oh, and as not to forget about the camera, the Ursa Mini Pro 4.6K is now available for $5,995. Here are some specs:

• Digital film camera with 15 stops of dynamic range.
• Super 35mm 4.6K sensor with third-generation Blackmagic color science processing of raw sensor data.
• Interchangeable lens mount with EF mount included as standard. Optional PL and B4 lens mount available separately.
• High-quality 2, 4 and 6 stop ND filters with IR compensation designed to specifically match the colorimetry and color science of Ursa Mini Pro.
• Fully redundant controls including ergonomically designed tactile controls that allow direct access to the most important camera settings such as external power switch, ND filter wheel, ISO, shutter, white balance, record button, audio gain controls, lens and transport control, high frame rate button and more.
• Built-in dual CFast 2.0 recorders and dual SD/UHS-II card recorders allow unlimited duration recording in high quality.
• LCD status display for quickly checking timecode, shutter and lens settings, battery, recording status and audio levels.
• Support for CinemaDNG 4.6K RAW files and ProRes 4444 XQ, ProRes 4444, ProRes 422 HQ, ProRes 422, ProRes 422 LT and ProRes 422 Proxy recording at Ultra HD and HD resolutions.
• Supports up to 60 fps 4.6K resolution capture in RAW.
• Features all standard connections, including dual XLR mic/line audio inputs with phantom power, 12G-SDI output for monitoring with camera status graphic overlay and separate XLR 4-pin power output for viewfinder power, headphone jack, LANC remote control and standard 4-pin 12V DC power connection.
• Built-in stereo microphones for recording sound.
• Four-inch foldout touchscreen for on-set monitoring and menu settings.


The A-List: La La Land’s Oscar-winning director Damien Chazelle

By Iain Blair

Writer/director Damien Chazelle may only have three feature films on his short resume, but the 32-year-old is already viewed by Hollywood as an acclaimed auteur and major talent. His latest film, the retro-glamorous musical La La Land, is a follow-up to his 2014 release Whiplash. That film received five Oscar nominations — including Best Picture and Best Adapted Screenplay for Chazelle — and three wins, including Best Supporting Actor for J.K. Simmons.

Now officially crowned as this year’s Oscar frontrunner, Lionsgate’s La La Land just scored a stunning total of 14 nominations (including Best Director), matching the record held by All About Eve and Titanic. It also recently scooped up seven Golden Globes, a record for a single movie, as well as a ton of other awards and nominations.

Damien Chazelle

Set in the present, but paying homage to the great Hollywood musicals of the ’40s and ’50s, La La Land tells the story of jazz pianist Sebastian (Ryan Gosling), who meets aspiring actress, playwright and fan of old movies Mia (Emma Stone). They initially ignore each other, they talk, they fight — but mainly they break out of the conventions of everyday life as they break into song and dance at the drop of a hat and take us on an exuberant journey through their love affair in a movie that’s also an ode to the glamour and emotion of cinema classics. It’s also a love letter to the Los Angeles of Technicolor dreams.

To bring La La Land to life, Chazelle collaborated with a creative team that included director of photography Linus Sandgren (known for his work with David O. Russell on American Hustle and Joy), choreographer Mandy Moore, composer Justin Hurwitz, lyricists Benj Pasek and Justin Paul, and editor Tom Cross, who cut Whiplash for him.

I recently talked to Chazelle about making the film and his workflow.

To paraphrase Mark Twain, reports that the musical is dead have been greatly exaggerated. You obviously love them.
I do, and I also don’t think they’re just escapist fantasies. They usually tell you something about their era, and the idea was to match the tropes of those great old movies — the Fred and Ginger musicals — with modern life and all its demands. I’m a huge fan of all those old musicals, and I drew my inspiration from a wide mix of all the MGM musicals, the Technicolor and CinemaScope ones especially, and then all the films of Jacques Demy. He’s the French New Wave director who made The Umbrellas of Cherbourg, The Young Girls of Rochefort and A Room in Town. But I was also inspired by ‘90s films about LA that really captured the grandeur of the city, like Robert Altman’s Short Cuts or Pulp Fiction.

It’s interesting that all your films are so music-driven.
I used to be a jazz drummer — or a wannabe — so a lot of it comes from that. Probably frustrated ambition (laughs).

Is it true that you never used a hand double for Ryan Gosling when he was playing piano?
Completely true. He could play a little bit of basic piano stuff, and he’s definitely musical, but he was adamant right from the start that he would learn all the pieces and play them himself — and he did. He practiced intensely for four months before the shoot, and by the time we shot he could play. There’s no cheating. They’re his hands, even on the close-ups. That’s how committed he was.

The dancing must have been equally demanding for both Ryan and Emma?
It was. They both had a little dance experience — him more than her, I think, but fairly minimal and in different styles than this. So they had to do a lot of rehearsal and training, and Mandy Moore is a great dance instructor as well as a choreographer, so she did both at the same time — training them and building the choreography out of that and what suited each actor and each character. It was all very organic and tailored specifically for them.

The big opening dance sequence with all the cars is such a tour-de-force. Just how tough was that to pull off?
It was very tough. I had an amazing crew, and once we’d found this overpass ramp we had to figure out exactly how to shoot it for real with all these cars of different colors and eras, so there was a ton of insane logistics to deal with. That was going on while Mandy was working on all the choreography, either in the studio or in parking lots, since we couldn’t rehearse that much on location. The last thing to add was the crane. I’d storyboarded the whole sequence and shot a lot of the rehearsals on my iPhone so we could study them and see how we wanted to move the camera with the crane.

There’s been a lot of talk about it being one long uncut sequence. Is it?
No. We designed it to look like one shot but it’s actually three, stitched together invisibly, and we shot it over a weekend.

Talk about working with Linus Sandgren, who used anamorphic lenses and 35mm film to get that glamour look.
We had a great relationship, as every time I had an idea he’d one-up it, and vice-versa. So he really embraced all the challenges and set the tone with his enthusiasm. There was a lot of back and forth before and during the shoot. We wanted the camera to feel like a dancer, to become part of the choreography, to be very energetic, and we had this great Steadicam guy, Ari Robbins. He did amazing work executing these very difficult, fluid shots. I wanted the film to be very anamorphic, and today, scope films are usually shot in 2.40 to 1, but Linus thought it would be interesting to shoot it in 2.52 to 1 to give it the extra scope of those classic films. We talked to Panavision about it, and they actually custom-fit some lenses for us.

Do you like post?
I love it, especially the editing. It’s my favorite part of the whole process.

Tell us about working with editor Tom Cross. Was he on the set?
He visited a couple of times, but I think it’s better when editors are not there so they are more objective when they first see the coverage. He starts cutting while I shoot, and then we start. I like to be in the editing room every day, and the big challenge on this was finding the right tone.

While Whiplash was all about punctuated editing so it reflected the tempos and rhythms of the drumming, La La Land is the polar opposite. It’s all about lush curves, and Whiplash is a movie about hard right angles. So on this, it was all about calibrating a lot of details. We had a mass of footage — a lot ended up on the cutting room floor — and while some is heightened fantasy, some is like a realist drama. So we had to find a way for both to coexist, and that involved everything from minute tweaks to total overhauls. We actually cut the whole opening number at one point, then later put it back and dropped other scenes around it. There’s probably no number we didn’t cut at some point, so we tried all possibilities, and it took a while to get the tone and pacing right.

Where did you do the post?
At EPS-Cineworks in Burbank; then on the Fox lot. Justin, the composer, was also there working on score cues next door, and we had our sound team with us for a bit, way before the mix, doing sound design, so it was very collaborative. It was like a mini-factory. Crafty Apes did all the VFX, such as the planetarium sequence and flying through space sequence, as well as the more invisible stuff throughout the film.

Obviously, all the music and sound was crucial?
Yes, and it helped that we had a lot of the score done before we shot. Justin was with us for the edit, and we’d do temp stuff for screenings and then tweak things. I had a great sound team led by Andy Nelson, who were phenomenal. Just like with the VFX, it had to somehow be small and intimate while also being huge and epic. It couldn’t be too glossy, so all the music was recorded acoustically and the vocals are all dry with very little reverb or compression, and we mixed in Atmos at Fox.

Where did you do the DI?
On the Fox lot with colorist Natasha Leonnet from EFilm. She did Whiplash for me and she's very experienced. She and the DP set the template for the look and color palette even before the shoot, and then Linus and I would go in for the DI and alternate on sessions. Our final session was literally 48 hours long non-stop — no sleep, no trips outdoors — as we were so under the wire to finish. But it all turned out great, and I'm very pleased with the look and the final film. It's the film I wanted to make.


Review: Polaroid camera accessories

By Brady Betzel

When you hear the name Polaroid, your mind likely serves up an image of an instant camera, but that image might vary depending on your age. Not long ago, the once ubiquitous company sold off its last instant film factory — to a very interesting company named Impossible — and began to produce new cameras for the digital age as well as boatloads of camera accessories. (Check out their refurbished Polaroid format cameras and film here.)

You can pretty much find anything you want among Polaroid’s offerings, from sliders to their own HD action camera. For this article, I chose to focus on three camera accessories that I think every prosumer videographer could make use of. Because, let’s be honest, many editors and video pros shoot their own projects on the side, so why not be outfitted for the job? And let’s not forget those filmmakers who aren’t precious about getting just the right shot, whether it’s with a Canon DSLR or a GoPro.

Panorama Pan Head
Up first is the Polaroid remote-controlled 360-degree Panorama Pan Head. It's a small, mountable, remote-controllable, motorized pan mount for your GoPro, smart phone or any camera weighing up to around 1.2 pounds. Physically, it's a little smaller than a tennis ball. It has fold-out legs that can act as a surprisingly sturdy stand, or it can be mounted to a standard tripod via its ¼-inch screw mount.

Since I love to shoot using GoPro cameras, I wanted to test this out with my Hero 5 Black Edition. Along with the pan head, you get a detachable GoPro Mount, a smart phone holder to get those panoramic pictures and a remote.

The remote has a less-than-awesome design, but it works! Once you mount your camera of choice, you can go left and right, faster and slower, press one button to rotate the camera 75 degrees and return home, or set up a rotational timelapse by telling it to rotate five degrees every 10 seconds.
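
If you want to plan a shot around that timelapse setting, the math is simple. Here's a small Python sketch; the one-photo-per-step pairing and 30fps playback are my assumptions, not Polaroid's spec:

```python
# Planning math for the pan head's rotational timelapse setting
# (5 degrees every 10 seconds). One photo per step and 30fps playback
# are assumptions for illustration only.

STEP_DEG = 5
STEP_SECONDS = 10
PLAYBACK_FPS = 30

sweep_deg = 360
steps = sweep_deg / STEP_DEG                 # 72 moves for a full rotation
real_time_min = steps * STEP_SECONDS / 60    # 12 minutes of real time
clip_seconds = steps / PLAYBACK_FPS          # ~2.4 seconds of footage at one frame per step

print(f"{steps:.0f} steps, {real_time_min:.0f} min of shooting, ~{clip_seconds:.1f} s clip")
```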

If you are looking at this for panoramic shots on your iPhone or Samsung Android phone, you can connect via Bluetooth and use the remote as a wireless shutter trigger. I have to say that the Bluetooth connected easily and quickly, which doesn't always happen in life, so I was particularly happy about that. The pan head recharges via an included micro USB cable. It only took me a few hours to recharge, but the manual says a full charge can take up to eight hours.

In my experience, the pan head isn't a necessary piece of equipment. It's more a fun piece of niche hardware that you will use on only a few shoots. If you shoot a ton of panoramic shots and want them to be consistent, then the pan head is for you. The timelapse feature is interesting, but I've messed around with some "egg timers" that people have retrofitted with ¼-inch screw mounts that work just as well and cost a lot less. You can find this Polaroid 360 Pan Head on Amazon.com for $49.99.

Track Slider
Up next is the Polaroid 24-inch Rail Track Slider. There are a lot of sliders on the market and some for almost impossibly cheap prices. The Polaroid slider comes in two varieties: 24-inch and 48-inch. I do a lot of product shooting for these reviews, so the 24-inch was just the right size for me.

I tested it out with a ball head tripod mount holding a Blackmagic Pocket Cinema Camera and zoom lens. Sliders really add another level of professionalism to any video, so if you are looking to take your shots to the next level definitely buy a camera slider. A few pushes and slides really make a difference and add some great dimensionality to any shot. Polaroid’s slider has detachable and adjustable legs, tension adjustment on the carriage itself to adjust slide speeds and a ¼-inch screw mount on the bottom to mount onto a tripod.

In addition, it has ¼-inch mounts on either side of the slider so you can mount it vertically to a tripod. I used the slider quite a few times and it generally worked well, but the sliding of the carriage wasn't always consistent; even when I was super-cautious and careful I would get some bumps. The carriage screw wasn't as consistent as I would have liked, but for $99 you need to remember you aren't getting a $1,700 Syrp motion-controlled slider. The slider itself is relatively light, comes in a very sturdy bag with handles and transports easily.

65-Inch Varipod
Last on the list is the Polaroid 65-inch Varipod — a telescoping monopod with removable tripod balance stand. Polaroid has really stuck to keeping prices low with their accessories, and they continue with the Varipod, which is priced at $49.99. While cheap, the monopod is very sturdy. I didn’t have a problem mounting my Canon DSLR with lens.

I was feeling particularly brave and decided to let the monopod hold my DSLR by itself — the Varipod held up just fine. The Varipod extends up to 65 inches tall, but I never needed it to go that high. One trick hidden inside the screw mount: you can flip over the ⅜-inch screw, and on the bottom is a ¼-inch screw mount. The best part about the Varipod is the detachable legs. If you want to take the legs off and use them as a sort of table tripod, you can.

Summing Up
Polaroid has thrown every camera accessory against the wall to see what sticks. While the monopod and the pan head are interesting (and if you need them, you will know), the real gem is the Polaroid 24-inch slider. If you buy one accessory other than a sweet lens, lighting or a tripod, you need to buy a slider. For $99 you really get a great starting slider and may never need to buy another one. I warn you though: once you start using the slider, you might get sucked down the motorized-slider rabbit hole.

You can find these Polaroid accessories on Amazon.com for purchase.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

A one-man production band… on wheels

Capturing an event with pro know-how and flexible tools

By David Hurd

I recently had an opportunity to shoot a gala event at a mall for the Tampa Innovation Alliance. The CEOs of all the big local companies, as well as the mayor, were there, along with 600 guests. The event was held in the large space that used to be an Old Navy store, and there were booths out in the mall that needed coverage as well.

The plan was for Tracy, the interviewer, to get short interviews with the VIPs before the sit-down part of the event, and then I would record the speakers. The footage would then be edited down into a five-minute 720p YouTube video.

Because of the many set-ups, and the size of the venue, I needed a rig that was quick and portable. I started with a pair of American Grip Dana Dolly Baby Combo Stands on wheels. These things are awesome and built like tanks. I then attached a 48-inch SmartSystem slider to the top of the stands and a Manfrotto head and pan bar. The 48-inch SmartSystem slider can take a lot of weight and allows me to use any camera rig.

I assembled the rig in the parking lot, and just rolled it into the mall. During the shoot, I used the slider to re-position shots quickly when the crowd got in my way, and it came in handy for creating moving shots as well. Let’s talk about the camera.

My Gear
I have grown to love my Blackmagic 4K production camera for jobs like these. I use a 35mm Rokinon lens, which due to the crop factor ends up at around 50mm. Indoors, I set it to Film mode (ISO 800) and a color balance of 4000, which always seems to work best. I also turn on the 2.35:1 mask so that I have an idea of what the image will look like later.

The Rokinon lens is f1.3, so it does well in low light. Since I was going to be constantly on the move, I just used available light. Did I mention that the lighting inside the event looked like a dark bar? That's where Film mode (ISO 800) and the lens saved my butt.

For important jobs, I record a 220Mb/sec stream in ProRes 422 HQ; otherwise, ProRes 422 at 100Mb/sec works fine for the web. You will only see the difference when you zoom in a lot in post. For power, I used a V-mount Blueshape battery. Blueshape batteries are what many professionals are switching to, and the one I used that night lasted the whole shoot.

For audio, I use the amazing little JuicedLink BMC366 mixer for Blackmagic cameras. It's small, lightweight and has everything I need. For the interviews, I used a Shure VP64 mic plugged into a Sennheiser RF transmitter feeding one channel of the mixer. I also needed the house audio for the sit-down speeches. For this I used a Sennheiser lav transmitter plugged into a sub out on the house mixer via a 1/4-inch jack. Since the jack was mono and the mixer was stereo, I only pushed the jack in to the first click to avoid shorting it out. After adjusting the in and out levels, the Sennheiser transmitted the house audio to wherever I was in the room.

Interviews
The interview part of the shoot went something like this: Tracy walked around with his mic in hand, finding interview victims. I followed him, happily pushing my rig along. When one was found, I directed them into position to make use of available light, framed for a wide shot, focused and hit record. It was painless, and the process took about one to three minutes per interview.

When everyone went inside for the sit-down part of the evening, I found a place off to one side of the stage, about 30 feet from the podium. Using the same lens, I could get most of the stage in the shot. After a quick battery change in the house audio transmitter, I was ready to rock.

About an hour later, after the event, we stood by the exit and snagged people for interviews as they were leaving. Then I rolled the rig to the parking lot, took it apart, loaded it up, and headed home for the edit.

The Edit
The edit is where the magic happens. Thunderbolt is wonderful, and I have built up a system that is fairly state of the art, so that I don’t have to wait much while editing.

I called on a Mac Pro "Trash Can" with 64GB of memory and 12GB of GPU memory across its two video cards. The computer is connected to four G-Tech G-Speed esPro drive boxes via two HighPoint RocketStor 6328 RAID controllers, each on its own Thunderbolt channel. Each set of two boxes (eight drives) is a RAID-5, and all 16 drives are striped RAID-0 in OS X. The system reads data at 2000MB/sec and writes at over 1700MB/sec — perfect for 4K editing.
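
For a rough sense of what that nested layout (effectively RAID-50) yields, here's a quick capacity estimate in Python. The 6TB-per-drive figure is a placeholder, since the article doesn't say what size drives are in the boxes:

```python
# Rough capacity math for the storage described above: 16 drives split into
# two 8-drive RAID-5 groups, with the two groups striped together (RAID-0).
# The per-drive size is an assumption for illustration, not from the article.

DRIVE_TB = 6          # assumed per-drive capacity
DRIVES_PER_GROUP = 8
GROUPS = 2

usable_per_group = (DRIVES_PER_GROUP - 1) * DRIVE_TB   # RAID-5 gives up one drive to parity
total_usable = usable_per_group * GROUPS                # striping adds capacity, not redundancy
raw = DRIVE_TB * DRIVES_PER_GROUP * GROUPS

print(f"Raw: {raw} TB, usable: {total_usable} TB "
      f"({total_usable / raw:.0%} efficiency, survives one drive failure per group)")
```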

For viewing, there are two 32-inch monitors, one of which is a Boland broadcast monitor run through a Blackmagic UltraStudio 4K interface box via SDI.

The workflow is easy. I simply drop the SSDs from the Blackmagic camera into my RocketStor 5212, which transfers the data via Thunderbolt to my RAID really fast. I record on OWC 480GB Mercury Extreme Pro 6G SSD cards, so the transfer rate is over 550MB/sec.

In Apple FCPX I create a 720p timeline and when I import the 4K footage, I select “Leave Files in Place.” Basically, I am dropping roughly 2000×4000 pixel footage onto a 720×1280 pixel timeline.

For more of a "film" look, I place a 2.35:1 aspect ratio mask that I made in Photoshop over the footage. Now, I simply open up the scopes and color correct the footage, which is much easier to do before it's all cut up.

My intention was to have the original wide shot, and zoomed-in medium and close-up shots, so first I had to see where I wanted to cut them. To do this I had to go through the footage and make cuts with the Blade tool. For example, I may start close-up on Tracy and go to a two-shot when he introduces his guest. Then I go to the guest when he says something interesting and then back to a two-shot for the close.

With the cuts made, I clicked on the clips, re-sized them and moved them around into the medium and close-up shots. Because I had about 2000×4000 pixels to work with, I was able to zoom in up to 300 percent and still have pixel-to-pixel coverage. If the shot was in focus, but looked a little soft, I would call on a sharpen filter to fix it.
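
The reframing headroom is easy to calculate. Here's a quick Python sketch using the camera's UHD frame size (3840×2160, the "roughly 2000×4000" mentioned above) against a 1280×720 timeline:

```python
# How much punch-in headroom you get cutting UHD footage on a 720p timeline.
# Assumes the source is 3840x2160 and the timeline is 1280x720, as described above.

SRC_W, SRC_H = 3840, 2160
TL_W, TL_H = 1280, 720

max_scale = min(SRC_W / TL_W, SRC_H / TL_H)   # stay pixel-for-pixel, no upscaling
print(f"Maximum punch-in before upscaling: {max_scale:.0f}x ({max_scale:.0%})")  # 3x, i.e. 300%
```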

Since I shoot with a prime lens, there is no zoom. If the client wants a slow zoom, I just use keyframes. This is actually better than trying to zoom in and out at the event, where there are no re-takes.

This rig and workflow turned what would have been a lot of lifting and moving about in a crowded space into an efficient one-man shoot. I didn’t have to worry about zooming, or getting the exact framing, which removed a lot of stress. I got 90 minutes of footage, and I only needed five.

This story has a happy ending. The client was pleased with the video, and I got paid.


David Hurd is the owner of David Hurd Productions in Tampa, Florida. He has been in the business for over 40 years.