

Building a workflow for The Great Wall

Bling Digital, which is part of the SIM Group, was called on to help establish the workflow on Legendary/Universal’s The Great Wall, starring Matt Damon as a European mercenary imprisoned within the wall. While being held he sees exactly why the Chinese built this massive barrier in the first place — and it’s otherworldly. This VFX-heavy mystery/fantasy was directed by Yimou Zhang.

We reached out to Bling’s director of workflow services, Jesse Korosi, to talk us through the process on the film, including working with data from the Arri 65, which at that point hadn’t yet been used on a full-length feature film. Bling Digital is a post technology and services provider that specializes in on-set data management, digital dailies, editorial system rentals and data archiving.

Jesse Korosi

When did you first get involved on The Great Wall and in what capacity?
Bling received its first call on The Great Wall in December of 2014, when unit production manager Kwame Parker asked about on-set data management, dailies, VFX and stereo pulls, Avid rentals and a customized digital workflow for the film.

At this time the information was pretty vague, but outlined some of the bigger challenges, like the film being shot in multiple locations within China, and that the Arri 65 camera may be used, which had not yet been used on a full-length feature. From this point on I worked with our internal team to figure out exactly how we would tackle such a challenge. This also required a lot of communication with the software developers to ensure that they would be ready to provide updated builds that could support this new camera.

That kicked off talks with DP Stuart Dryburgh, the studio and a few other members of production. A big part of my job, and that of anyone on my workflow team, is to get involved as early as possible, so our role doesn’t necessarily start on day one of principal photography. We want to get in and start testing and communicating with the rest of the crew well ahead of time so that by the first day the process runs like a well-oiled machine and the client never has to be concerned with “week-one kinks.”

Why did they opt for the Arri 65 camera and what were some of the challenges you encountered?
Many people we work with love Arri. The cameras are known for recording beautiful images. Anyone who isn’t a huge Arri fan might dislike the lower resolution of some of the cameras, but it is very uncommon that someone doesn’t like the final look of the recorded files. Enter the Arri 65, a new camera that records 6.5K files (6560×3100) at a whopping 2.8TB per hour.

When dealing with this kind of data consumption, you really need to re-evaluate your pipeline. The cards can’t be downloaded with traditional card readers — you need to use Vaults. Let’s say someone records three hours of footage in a day — that’s roughly 8.5TB of data. If you’re sending that to another facility, even over a 500Mb/s Internet line, it would take about 38 hours! LTO-ing this kind of media is also dreadfully slow. For The Great Wall we ended up setting up a dedicated LTO area with eight decks running at any given time.
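The back-of-the-envelope math here is worth seeing. This is a quick sketch using the 2.8TB/hour and 500Mb/s figures from the interview (decimal terabytes, as storage vendors count them):

```python
# Rough data-volume and transfer-time math for a day of Arri 65 footage.
TB = 1e12                 # decimal terabyte (10^12 bytes)
hours_shot = 3
tb_per_hour = 2.8         # Arri 65 at 6.5K, per the interview
total_bytes = hours_shot * tb_per_hour * TB   # roughly 8.4 TB for the day

line_bps = 500e6          # a 500 Mb/s internet line
transfer_hours = (total_bytes * 8) / line_bps / 3600
print(round(transfer_hours, 1))   # roughly a day and a half of sending
```

At that rate, shipping encrypted drives (as the production did) beats the wire by a wide margin.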

Aside from data consumption, we faced the challenge of having no dailies software that could even read the files. We worked with Colorfront on a new build that could and, luckily, having been through the same ordeal recording Arri Open Gate on Warcraft, we knew how to make this happen and set the client at ease.

Were you on set? Near set? Remote?
Our lab was located in the production office, which also housed editorial. Considering all of the traveling this job entailed, from Beijing and Qingdao to Gansu, we were mostly working remotely. We wanted to be as close to production as possible, but still within a controlled environment.

The dailies set-up was right beside editor Craig Wood’s suite, making for a close-knit workflow with editorial, which was great. Craig would often pull our dailies team into his suite to view how the edit was coming along, which really helped when assessing how the dailies color was working and referencing scenes in the cut when timing pickup shots.

How did you work with the director and DP?
At the start of the show we established some looks with the DP Stuart Dryburgh, ASC. The idea was that we would handle all of the dailies color in the lab. The DIT/DMT would note as much valuable information on set about the conditions that day and we would use our best judgment to fulfill the intended look. During pre-production we used a theatre at the China Film Group studio to screen and review all the test materials and dial in this look.

With our team involved from the very beginning of these color talks, we were able to ensure that decisions made on color and data flow would track through each department, all the way to the end of the job. It’s very common for color decisions made at the start of a job to get lost in the shuffle once production has wrapped. Plus, sometimes there isn’t anyone available in the post stage who recognizes why certain decisions were made up front.

Can you talk us through the workflow? 
In terms of workflow, the Arri 65 recorded media onto Codex cards, which were backed up on set with a Vault S. After a card was backed up, it was forwarded on to the lab, where a Vault XL backed it up to its internal drive. Unfortunately, you can’t go directly from the card to your working drive; you need to do two separate passes on the card, a “Process” and a “Transfer.”

The Transfer moves the media off the card and onto an internal drive on the Vault. The Process then converts all the native camera files into .ARI files. Once this media was processed and on the internal drive, we were able to move it onto our SAN. From there we could run the footage through OSD and make LTO back-ups. We also made additional back-ups to G-Tech G-Speed Studio drives that were sent back to LA. However, for security as well as efficiency, we encrypted and shipped the bare drives rather than the entire chassis. This meant that when the drives were received in LA, we could mount them in our dock and work directly off of them, i.e., no need to wait on any copies.
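On-set data management of this kind lives or dies by verified copies. As a minimal illustration (not Bling’s actual tooling), a checksum-verified copy looks something like the sketch below; real data-management tools typically use xxHash or MD5 recorded into a manifest, while this uses stdlib MD5 for brevity:

```python
import hashlib
import shutil

def checksum(path, algo="md5", chunk=8 * 1024 * 1024):
    """Hash a file in chunks so multi-terabyte media doesn't exhaust RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verified_copy(src, dst):
    """Copy a file, then re-read both sides and compare digests."""
    shutil.copyfile(src, dst)
    src_sum, dst_sum = checksum(src), checksum(dst)
    if src_sum != dst_sum:
        raise IOError(f"checksum mismatch copying {src} -> {dst}")
    return src_sum
```

Re-reading both source and destination after the copy is what separates a verified backup from a hopeful one.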

Another thing that required a lot of back and forth with the DI facility was ensuring that our color pipeline followed the same path they would take once they hit final color. We ended up having input LUTs for any camera that recorded a non-LogC color space. As for my involvement, during production in China I had a few members of my team on the ground and I was overseeing things remotely. Once things came back to LA and we were working out of Legendary, I became much more hands-on.

What kind of challenges did providing offline editorial services in China bring, and how did that transition back to LA?
We sent a tech to China to handle the set-up of the offline editorial suites and also had local contacts to assist during the run of the project. Our dailies technicians also helped with certain questions or concerns that came up.

Shipping gear for the Avids is one thing; however, shipping consoles (desks) for the editors would have been far too heavy. This was probably one of the bigger challenges — ensuring the editors were working with the same caliber of workspace they were used to in Los Angeles.

The transition of editorial from China to LA required Dave French, director of post engineering, and his team to mirror the China set-up in LA and have both up and running at the same time to streamline the process. Essentially, the editors needed to stop cutting in China and have the ability to jump on a plane and resume cutting in LA immediately.

Once back in LA, you continued to support VFX, stereo and editorial, correct?
Within the Legendary office we played a major role in building out the technology and workflow behind what was referred to as the Post Hub. This Post Hub was made up of a few different systems all KVM’d into one desk that acted as the control center for VFX and stereo reviews, VFX and stereo pulls and final stereo tweaks. All of this work was controlled by Rachel McIntire, our dailies, VFX and stereo management tech. She was a jack-of-all-trades who played a huge role in making the post workflow so successful.

For the VFX reviews, Rachel and I worked closely with ILM to develop a workflow ensuring that all of the original on-set/dailies color metadata would carry into the offline edit from the VFX vendors. It was imperative that during an editing session we could add or remove the color, make adjustments and match exactly what they saw on set, in dailies and in the offline edit. Automating this process through values from the VFX editor’s EDL was key.
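Korosi doesn’t detail the mechanism, but the standard way dailies color rides along in an EDL is as ASC CDL comment lines (`*ASC_SOP` slope/offset/power triples and `*ASC_SAT` saturation). A hypothetical parser for pulling those values out, in event order, might look like this:

```python
import re

# ASC CDL comment lines as they conventionally appear in a CMX3600 EDL:
#   *ASC_SOP (slopeR slopeG slopeB)(offR offG offB)(powR powG powB)
#   *ASC_SAT 0.9000
SOP_RE = re.compile(r"\*\s*ASC_SOP\s*\(([^)]+)\)\s*\(([^)]+)\)\s*\(([^)]+)\)")
SAT_RE = re.compile(r"\*\s*ASC_SAT\s*([\d.]+)")

def parse_cdl_comments(edl_text):
    """Collect (slope, offset, power, saturation) tuples from an EDL's
    ASC CDL comment lines, one tuple per graded event."""
    grades = []
    sop = None
    for line in edl_text.splitlines():
        m = SOP_RE.search(line)
        if m:
            sop = [tuple(float(v) for v in group.split()) for group in m.groups()]
            continue
        m = SAT_RE.search(line)
        if m and sop is not None:
            slope, offset, power = sop
            grades.append((slope, offset, power, float(m.group(1))))
            sop = None
    return grades
```

Feeding values like these into the grading or review tool is what lets the on-set look be reapplied (or stripped off) per shot without anything being baked in.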

Looking back on the work provided, what would you have done differently knowing what you know now?
I think the area I would focus on next time around would be upgrading the jobs database. With any job we manage at Bling, we always ensure we keep a log of every file recorded and any metadata that we track. At the time, this was a little weak. Since then, I have been working on overhauling this database and allowing creatives to access all camera metadata, script metadata, location data, lens data, etc. in one centralized location. We have just used this on our first job in a client-facing capacity and I think it would have done wonders for our VFX and stereo crews on The Great Wall. All too often, people are digging around for information already captured by someone else. I want to make sure there is a central repository for that data.

Origins: The Creative Storage Conference

By Tom Coughlin

I was recently asked how the Creative Storage Conference came to be. So here I am to give you some background.

In 2006, the Storage Visions Conference that my colleagues and I had been organizing just before the CES show each January was in its fifth year. I had been doing more work on digital storage for professional media and entertainment, including a report on the topic. To increase my connections and interactions with both media and entertainment professionals and the digital storage and service companies that support them, it seemed that a conference focusing on digital storage for media and entertainment was in order.

That same year, my partner Ron Dennison and I participated in the MediaTech Conference in the LA area, working with Bryan Eckus, the director of the group at the time. In 2007, we held the first Creative Storage Conference in conjunction with the MediaTech Conference in Long Beach, California. It featured a dynamite line-up of storage companies and end users.

The conference has grown in size over the years, and we have had a stream of great companies showing their stuff, media and entertainment professional attendees and speakers, informative sessions and insightful keynote talks on numerous topics related to M&E digital storage.

The 2017 Creative Storage Conference
This year, the Creative Storage Conference is taking place on May 24 in Culver City. Attendees can learn more about the use of Flash memory in M&E as well as the growth in VR content in professional video, and how this will drive new digital storage demand and technologies to support the high data rates needed for captured content and cloud-based VR services. This is the 11th year of the conference and we look forward to having you join us.

We are planning for six sessions and four keynotes during the day and a possible reception in the evening on May 24.

Here is a list of the planned sessions:
• Impact of 4K/HDR/VR on Storage Requirements From Capture to Studio
• Collaboration in the Clouds: Storing and Delivering Content Where it is Needed
• Content on the Move: Delivering Storage Content When and Where it is Needed
• Preserving Digital Content — the Challenges, Needs and Options
• Accelerating Workflows: Solid State Storage in Media and Entertainment
• Independent Panel — Protecting the Life of Content

Don’t miss this opportunity to meet giants in the field of VR content capture and post production, and to meet the storage and service companies that can help make sure your next professional projects are a big success.

• Hear how major media equipment suppliers and entertainment industry customers use digital storage technology in all aspects of content creation and distribution.
• Find out the role that digital storage plays in new content distribution and marketing opportunities for a rapidly evolving market.
• See presentations on digital storage in digital acquisition and capture, nonlinear editing and special effects.
• Find out how to convert and preserve content digitally and protect it in long-term dependable archives.
• Learn about new ways to create and use content metadata, making it easier to find and use.
• Discover how to combine and leverage hard disk drives, flash memory, magnetic tape and optical storage technology with new opportunities in the digital media market.
• Be at the juncture of digital storage and the next generation of storage for the professional media market.

Online registration is open until May 23, 2017. As a media and entertainment professional you can register now with a $100 discount using this link:

—–
Thomas Coughlin, president of Coughlin Associates, is a storage analyst and consultant with over 30 years in the data storage industry. He is active with SNIA, SMPTE, IEEE and other professional organizations.


Creating the color of Hacksaw Ridge

Australian colorist Trish Cahill first got involved in the DI on Mel Gibson’s Hacksaw Ridge when cinematographer Simon Duggan enquired about her interest and availability for the film. She didn’t have to consider the idea long before saying yes.

Hacksaw Ridge, which earned Oscar nominations for Best Picture, Director, Lead Actor, Film Editing (won), Sound Editing and Sound Mixing (won), is about a real-life World War II conscientious objector, Desmond Doss, who refused to pick up a gun but instead used his bravery to save lives on the battlefield.

Trish Cahill

Let’s find out more about Cahill’s work and workflow on Hacksaw Ridge.

What was the collaboration like between you and director Mel Gibson and cinematographer Simon Duggan?
I first met Mel and the editor John Gilbert when I visited them in the cutting room halfway through the edit. We looked through the various scenes, the different battle sequences in particular, and discussed the different tone that was needed for each.

Simon had already talked through the Kodachrome idea with a gradual and subtle desaturation as the film progressed and it was very helpful to be spinning through the actual images and listening to Mel and John talk through their thoughts. We then chose a collection of shots that were representative of the different looks and turning points in the film to use in a look development session.

Simon was overseas at the time, but we had a few phone conversations and he sent through some reference stills prior to the session. The look development session not only gave us our look template for the film but it also gave us a better idea of how smoke continuity was shaping up and what could be done in the grade to help.

During the DI, Mel, John and producer Bill Mechanic came in to see my work every couple of days, spending a few hours reviewing spool-downs. Once the film was in good shape, Simon flew in with a nice fresh eye to help tighten it further.

What was the workflow for this project?
Being a war film, Hacksaw Ridge has quite a few bullet hits, blood splatters, smoke elements and various other VFX to be completed across a large number of shots. One of the main concerns was the consistency of smoke levels, so it was important that the VFX team had a balanced set of shots put into sequence, reflecting how they would appear in the film.

While the edit was still evolving, the film was conformed and assistant colorist Justin Tran started a balance grade of the war sequences on FilmLight Baselight at Definition Films. This provided VFX supervisor Chris Godfrey and the rest of the team with a better idea of how each shot should be treated in relation to the shots around it, and whether additional treatment was required for shots not earmarked for VFX. The balance grading work was carried across to the DI grade in the form of BLGs and was applied to the final edit with Baselight’s multi-paste, so I had full control and nothing was baked in.

Was there a particular inspiration or reference that you used for the look of this film?
Simon sent through a collection of vintage photograph references from the era to get me started. There were shots of old ox blood red barns, mechanics and machinery, train yards and soldiers in uniform — a visual board of everyday pictures of real scenes from the 1930s and 1940s, which was an excellent starting point to spring from. Key words were “desaturated” and “Kodachrome,” and the phrase “twist the primaries a touch” was used a bit!

The film starts when our hero, Desmond Doss, is a boy in the 1930s. These scenes have a slight chocolaty sepia tone, which lessens when Doss becomes a young man and enters the military training camp. Colors become more desaturated again when he arrives in Okinawa and then climbs the ridge. We wanted the ridge to be a world unto itself — the desolate battlefield. Each battle from there occurs at different times of day in different environmental conditions, so each has been given its own color variation.

What were the main challenges in grading such a film?
Hacksaw Ridge is a war film. A big percentage of screen time is action-packed and fast-paced, with a high cut ratio. So there are many more shots to grade, varied cameras to balance between and fluctuating smoke levels to figure out. It’s more challenging to keep consistency in this type of film than in the average drama.

The initial attack on top of the ridge happens just after an aerial bombing raid, and it was important to the story for the grade to help the smoke enhance a sense of vulnerability and danger. We needed to keep visibility as low as possible, but at the same time we wanted it still to be interesting and foreboding. It needed analysis at an individual shot level: what can be done on this particular image to keep it interesting and tonal but still have the audience feel a sense of “I can’t see anything.”

Then on a global level — after making each shot as tonal and interesting as possible — do we still have the murkiness we need to sell the vulnerability and danger? If not, where is the balance to still provide enough visual interest and definition to keep the audience in the moment?

What part of the grading process do you spend most of your time on?
I would say I spend more time on the balancing and initial grade. I like to keep my look in a layer at the end of the stack that stays constant for every shot in the scene. If you have done a good job matching up, you have the opportunity of being able to continue to craft the look as well as add secondaries and global improvements with confidence that you’re not upsetting the apple cart. It gives you better flexibility to change your mind or keep improving as the film evolves and as your instincts sharpen on where the color mood needs to sit. I believe tightening the match and improving each shot on the primary level is time very well spent.

What was the film shot on, and did this bring any challenges or opportunities to you during the grade?
The majority of Hacksaw Ridge was shot with an Arri Alexa. Red Dragon and Blackmagic pocket cameras were also used in the battle sequences. Whenever possible I worked with the original camera raw. I worked in LogC and used Baselight’s generalized color space to normalize the Red and Blackmagic cameras to match this.

Matching the flames between Blackmagic and Alexa footage was a little tricky. The color hues and dynamic range captured by each camera are quite different, so I used the hue shift controls often to twist the reds and yellows of each closer together. Also, on some shots I had several highlight keys in place to create as much dynamic range as possible.

Could you say more about how you dealt with delivering for multiple formats?
The main deliverables required for Hacksaw Ridge were an XYZ version and a Rec709 version. Baselight’s generalized color space was used for the conversions from P3 to XYZ and Rec709. I then made minimal tweaks for the Rec709 version.

Was there a specific scene or sequence you found particularly enjoyable or challenging?
I enjoyed working with the opening scene of the film, enhancing the golden warmth as the boys are walking through the forest in Virginia. The scenes within the Doss house were also a favorite. The art direction and lighting had a beautiful warmth, and I really enjoyed bringing out the chocolaty 1930s and 1940s tones.

On the flip side of that, I also loved working with the cooler, crisper dawn tones that we achieved in the second battle sequence. I find that when you minimize the color palette and let the contrast and light do the tonal work, it can take you to a unique and emotionally amplified place.

One of the greater challenges of grading the film was eliminating any hint of green plant life throughout the Okinawa scenes. With lush, green plants happily existing in the background, we were in danger of losing the audience’s belief that this was a bleak place. Unfortunately, the WW II US military uniforms were the same shade of green found in many parts of the surrounding landscape of the location, making it impossible to get a clean key. There is one scene in particular where a convoy of military trucks rolls through a column of soldiers adding clouds of dust to an already challenging situation.


Lime opens sound design division led by Michael Anastasi, Rohan Young

Santa Monica’s Lime Studios has launched a sound design division. LSD (Lime Sound Design), featuring newly signed sound designer Michael Anastasi and Lime sound designer/mixer Rohan Young, has already created sound design for national commercial campaigns.

“Having worked with Michael since his early days at Stimmung and then at Barking Owl, he was always putting out some of the best sound design work, a lot of which we were fortunate to be final mixing here at Lime,” says executive producer Susie Boyajan, who collaborates closely with Lime and LSD owner Bruce Horwitz and the other company partners — mixers Mark Meyuhas and Loren Silber. “Having Michael here provides us with an opportunity to be involved earlier in the creative process, and provides our clients with a more streamlined experience for their audio needs. Rohan and Michael were often competing for some of the same work, and share a huge client base between them, so it made sense for Lime to expand and create a new division centered around them.”

Boyajan points out that “all of the mixers at Lime have enjoyed the sound design aspect of their jobs, and are really talented at it, but having a new division with LSD that operates differently than our current, hourly sound design structure makes sense for the way the industry is continuing to change. We see it as a real advantage that we can offer clients both models.”

“I have always considered myself a sound designer that mixes,” notes Young. “It’s a different experience to be involved early on and try various things that bring the spot to life. I’ve worked closely with Michael for a long time. It became more and more apparent to both of us that we should be working together. Starting LSD became a no-brainer. Our now-shared resources, with the addition of a Foley stage and location audio recordists only make things better for both of us and even more so for our clients.”

Young explains that setting up LSD as its own sound design division, as opposed to bringing in Michael to sound design at Lime, allows clients to separate the mix from the sound design on their production if they choose.

Anastasi joins LSD from Barking Owl, where he spent the last seven years creating sound design for high-profile projects and building long-term creative collaborations with clients. Michael recalls his fortunate experiences recording sounds with John Fasal, and Foley sessions with John Roesch and Alyson Dee Moore as having taught him a great deal of his craft. “Foley is actually what got me to become a sound designer,” he explains.

Projects Anastasi has worked on include Hide and Seek, a PSA on human trafficking that won an AICP Award for Sound Design. He also provided sound design for the feature film Casa De Mi Padre, starring Will Ferrell, and served as its sound supervisor. For Nike’s Together project featuring LeBron James, a two-minute black-and-white piece, Anastasi traveled to Cleveland to record 500-plus extras.

Lime is currently building new studios for LSD, featuring a team of sound recordists and a stand-alone Foley room. The LSD team is currently in the midst of a series of projects launching this spring, including commercial campaigns for Nike, Samsung, StubHub and Adobe.

Main Image: Michael Anastasi and Rohan Young.


Review: LogicKeyboard’s Astra PC keyboard for Resolve 12/12.5

By Brady Betzel

I love a good keyboard. In fact, my favorite keyboards have always been mechanical, or pseudo-mechanical, like those old Windows keyboards you can find at thrift stores for under 10 bucks — in fact, I went back and bought one just the other day at a Goodwill. I love them because of the tactile response and click you get when depressing the keys.

Knowing this, you can understand my frustration (and maybe old-man bitterness) when all I see in the modern workplace are those slimline Apple keyboards, even on Windows PCs! I mean I can get by on those, but at home I love using this old Avid keyboard that is as close to mechanical as I can get.

LogicKeyboard’s latest Resolve-focused backlit keyboard, the Astra, solves many of these problems in one slick package. Logic’s scissor-switch keys give me the tactile feedback I love, while the backlit keyboard itself is sleek and modern.

After many years as a primarily Avid Media Composer-focused editor using keyboards emblazoned with Avid shortcuts, I started using other apps like Adobe After Effects and Blackmagic’s DaVinci Resolve and realized I really like to see shortcuts displayed on my keyboard. Yeah, I know, I should pretend I can blaze through an edit without looking at the keyboard, but guess what: I look down. So when learning new apps like Resolve, it is really helpful to have a keyboard with shortcuts on the keys, and better still, backlit ones. I don’t run into many Resolve-focused keyboards, so when I heard about Logic’s backlit version, I immediately wanted to try it out.

While this particular keyboard has Resolve-specific shortcuts labeled on the keys, it will work as a standard keyboard and will stay backlit regardless of what app you are in. If you are looking for a keyboard with shortcuts for a specific app, check out LogicKeyboard’s site, where you can find Windows and Mac OS keyboards for Adobe Premiere, Adobe After Effects, Avid Media Composer, Autodesk Smoke and even non-video apps like Pro Tools or Photoshop.

Taking it for a Drive
The Astra keyboard for Resolve 12/12.5 is awesome. First off, there are two USB 2.0 cables to plug into your PC: one for the keyboard itself and one for the two USB 2.0 ports on the back. I love that LogicKeyboard has created a self-powered USB hub on the back of the keyboard. I do wish it were USB 3.0, but being able to power external hard drives from the keyboard instead of fumbling around the back of the machine really helps my day-to-day productivity, a real key addition. While the keyboard I am reviewing is technically for a Windows-based machine, it will work on a Mac OS system, though you will have to keep in mind differences such as the Windows key. Really, though, you should just buy the Mac OS version.

The Astra keyboard is sleek and very well manufactured. The first thing I noticed after I plugged in the keyboard was that it didn’t walk along the desk as I was using it. Maybe I’m a little hard on my equipment, but a lot of keyboards I use start to move across my desk when typing; the Logic keyboard stays still and allows me to pound on that keyboard all day long.

As a testament to the LogicKeyboard’s durability, one day I came home after work and one of the shift keys had come off (it may or may not have been my two-year-old — I have no concrete evidence). My first thought was “great, there goes that keyboard,” but I quickly tried to snap the key back on and it went on the first try. Pretty amazing.

What sets the LogicKeyboard backlit keyboard apart from other application-specific keyboards, or any for that matter, is not only the solid construction but also the six levels of brightness for the backlit keys, controlled directly from the keyboard: 100%, 80%, 60%, 40%, 20% and 0%. As a professional editor or colorist, you might think that having backlit keys in a dark room would be distracting or embarrassing, but LogicKeyboard has made a beautiful keyboard that glows softly. Even at 100% brightness the Astra has a nice falloff, leaving the keyboard almost unnoticeable until you need to see it and use it. Furthermore, it kicks into what Logic calls “smoothing light” after three minutes of non-use — basically, it dims to a dull level.

In terms of shortcuts on the Resolve 12/12.5-specific Astra keyboard, you get four levels: normal, shift + key, control + key and alt + key. Normal is labeled in black, shift + key in red just like the shift key, control + key in blue just like the control key, and alt + key in green just like the alt key. While I love all of these shortcuts, I do think it can get a little overwhelming with so many visible at the same time. It’s kind of a catch-22: I want every shortcut labeled for easy, fast searches, but too many options lead me, at times, to search too long.

On the flip side, after about a week I noticed my Resolve keyboard shortcuts getting more committed to memory than before, so I was less worried about searching each individual key for the shortcut I needed. I am a big proponent for memorizing keyboard shortcuts and the Astra keyboard for Resolve helped cement those into my memory way faster than any normal non-backlit keyboard. Usually, my eyes have a hard time going back and forth between a bright screen and a super dark keyboard; it’s pretty much impossible to do efficiently. The backlit Astra solved my problem of hunting for keys in a dark room with a bright monitor.

The Windows version is compatible with pretty much any version of Windows from the last 10 years, and the Mac version is compatible with Mac OS 10.6 and higher. I tested mine on a workstation with Windows 10 installed.

Summing Up
In the end, I love Logic’s Astra backlit keyboard for DaVinci Resolve 12/12.5. The tactile feedback from each key is essential for speed when editing and color correcting, and it’s the best I’ve felt since giving up my trusty mechanical-style keyboards. I’ve gone from Apple-like low-profile keyboards for Media Composer, back to old-school PS/2-style mechanical-ish keyboards, and now to the Astra backlit keyboard, and I’m loving it.

LogicKeyboard’s backlit versions don’t come cheap: this one retails for $139.90, plus $11.95 for shipping. The Mac version costs the same.

While you may think that is high for a keyboard, the Astra is of the highest manufacturing quality, has two fully powered USB 2.0 ports (which come in handy for things like the Tangent Ripple or Element color correction panels), and, best of all, is backlit! My two-year-old son even ripped a key off of the keyboard (he wants me to add, allegedly!) and I fixed it easily without having to send it in for repairs. I doubt the warranty will cover kids pulling off keys, but you do get a free one-year warranty with the product.

I used this keyboard over a few months and really began to fall in love with the eight-degree angle it sits at. I use keyboards all day, every day, and not all keyboards are the same. Some have super-flat angles and some have super-high angles. In my opinion, the LogicKeyboard Astra has a great, strain-free angle.

I also can’t overstate how awesome the backlit element of this keyboard is, it’s not just the letters that are backlit, each key is smoothly backlit in its entirety. Even at 100% brightness the keys look soft with a nice fall off on the edges, they aren’t an eyesore and in fact are a nice talking point for many clients. If you are barely thinking about buying a keyboard or are in desperate need of a new keyboard and you use Resolve 12 or 12.5 you should immediately buy the Astra. I love it, and I know you will not regret it.

Check out my footage of the LogicKeyboard Astra backlit keyboard for Resolve on my YouTube page:



Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Review: Dell Precision 7910 tower workstation

By Mike McCarthy

While I started my career on Dell Precision workstations, I have spent the last 10 years with HP workstations under my desk. They have served me well, which is why I used them for five generations. At the beginning of 2016, I was given the opportunity to do a complete hardware refresh for director Scott Waugh’s post house, Vasquez Saloon, to gear up our capabilities to edit the first film shot for Barco Escape and edited fully in 6K. This time we ended up with Dell Precision 7910 workstations under our desks. After having a chance to use them for a year, I decided it was time to share some of my experiences with the top-end Precision workstation.

My 7910 has two Xeon E5-2687W V3 processors, each with 10 cores running at 3.1GHz. Regardless of which CPU speed you select, always fill both sockets of a high-end workstation, as that doubles your memory bandwidth and enables the last two PCIe slots. Therefore, choose dual 4-core CPUs instead of a single 8-core CPU, if that is the performance level you are after. It has 128GB of DDR4 memory, divided across eight sticks of 16GB each. Regardless of size, maximum memory performance is achieved with at least as many sticks of RAM as there are memory channels. This system has four memory channels per CPU, for a total of eight channels. I would recommend at least 64GB of RAM for most editing systems, with more for larger projects. Since we were cutting an entire feature with 6K source files, 128GB was a reasonable choice that served us well.
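The configuration described above can be sanity-checked with a little arithmetic; this is just a sketch of the numbers from the review, not a benchmark:

```python
# Why matching DIMM count to memory channels matters, using the
# figures from the review's configuration.
cpus = 2
channels_per_cpu = 4                        # quad-channel DDR4 per socket
total_channels = cpus * channels_per_cpu    # 8 channels total

dimms = 8
dimm_size_gb = 16
total_ram_gb = dimms * dimm_size_gb         # 128GB

# With one DIMM per channel, every channel is populated, so memory
# accesses can be striped across all eight channels at once.
assert dimms >= total_channels
print(total_channels, total_ram_gb)  # 8 128
```

Fewer sticks than channels would leave some channels idle, which is the point the review makes about filling both sockets and all channels.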

Both our systems are usually pretty quiet, which is impressive considering how powerful they are. They do generate heat, and I don’t recommend running them in a room without AC, but that was outside of our control. Air-cooled systems are only as effective as the environment they are in, and our situation wasn’t always optimal.

PCIe SSDs are a huge leap forward for storage throughput. This workstation came with a PCIe x16 Gen3 card that supports up to four M.2 NVMe SSDs at full speed. This allows up to 2500MB/s from each of the four ports, which is enough bandwidth to play back 6K DPXs at 24p in Premiere without dropping frames.

Now capacity is limited with this new expensive technology, topping out at 1TB per $700 card. My 512GB card can only store seven minutes of data at maximum throughput, but for smaller data sets, like VFX shots, this allows a system to cache meaningful quantities of data at very high speed without needing a large array of disks to sustain the required I/Os.
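The capacity-versus-speed trade-off above is easy to work out. A quick back-of-envelope check on the "seven minutes" figure; the ~1.25GB/s sustained rate is my assumption about the card's real-world write speed, not a number from the review:

```python
# How long a 512GB card lasts at a given sustained data rate.
capacity_gb = 512
rate_mb_s = 1250            # assumed sustained throughput in MB/s (not from the review)

seconds = capacity_gb * 1000 / rate_mb_s    # decimal GB -> MB
minutes = seconds / 60
print(round(minutes, 1))    # 6.8 -- in line with the quoted "seven minutes"
```

At the card's full 2500MB/s burst rate the same 512GB would fill in roughly half that time, which is why the author frames it as useful for small, hot data sets like VFX shots rather than bulk storage.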

Once we open the tool-less case, one of the obvious visual differences between the Dell and HP solutions is that the Precision 7910 splits the PCIe slots, with two above the CPUs and five below. I assume the benefits to this are shorter circuit paths to the CPUs, and better cooling for hot cards. It hasn’t made a big difference to me, but it is worth noting. Like other dual-socket systems, two of the slots are disabled if the second CPU is not installed.

In my case, I have the SSD card in the top slot, and a Red Rocket-X in the next one down. The Thunderbolt 2 card has to be installed in the slot directly below the CPUs. Then I installed my SAS RAID card and the Intel X540 10GbE NIC, leaving space at the bottom for my Quadro GPU.

Another unique feature of the case layout is that the power supply is located behind the motherboard, instead of at the top or bottom of the system. This places the motherboard at the center of the chassis, with components and cards on one side, and power and storage bays on the other. There are a variety of integrated ports, including dual Gigabit NICs, PS/2, audio, serial and six USB ports. The only aspect I found limiting was the total of four USB 3.0 ports, one in front and three in back. I have on occasion used all of them at once for external drive transfers, but the USB 3.0 hub built into most of Dell's monitors can help with this issue. Hopefully, we will see USB-C ports with double that bandwidth in the next generation, as well as integrated Thunderbolt 3 support to free up another PCIe slot.

Besides the slim DVD drive, there are four 3.5-inch hard drive bays with tool-less cages, and a 5.25-inch bay, which can be optionally reconfigured to hold four more 2.5-inch drives. The next model down, the Precision 7810, is similar, but without the top two PCIe slots and only two 3.5-inch drive bays. My drive bays are all empty because the PCIe SSD is my only internal storage, but that means that I could easily add four 8TB SAS drives for 32TB of internal storage with no other accessories required. And I may use the 5.25-inch bay for an LTO drive someday, if I don’t end up getting an external one.

If I do get an external SAS drive, it could be connected to one of the two SFF-8643 connectors on the motherboard. These new connectors each support four channels of 12Gb SAS, with one of them hooked to the 3.5-inch drive backplane by default. The integrated SAS controller supports up to eight channels of SAS or SATA data, capable of RAID-0 or -1. Using RAID-5 or -6 requires a separate dedicated card, in my case the Areca 1883x. At least one integrated M.2 slot would be great to see in the next refresh, as those SSDs become more affordable.

Dell also includes their system management software Dell Precision Optimizer to help you get the maximum performance from the system. It allows users to monitor and chart CPU and GPU use as well as memory and disk usage. It can configure system settings like Hyperthreading, Power Usage and V-Sync, using pre-built profiles for various industry applications. It won’t tune your system for video editing as well as an expert who knows what they are doing, but it is better than doing nothing right out of the box.

Real-World Use
Over the last year, we have run two of these workstations on a 6K feature film, taking them right to the limit on a regular basis. It was not uncommon to be encoding R3D dailies to H264 in AME, while rendering a VFX shot in AE, and playing back in Premiere, on both systems simultaneously, pulling data from each other’s local storage arrays over the network. And while I won’t say that they never crashed, stability was not an issue that seriously impacted our workflow or schedule. I have been quite impressed by what we were able to accomplish with them, with very little other infrastructure. The unique split chassis design makes room for a lot of internal storage, and they run reliably and quietly, even when chock full of powerful cards. I am looking forward to getting a couple more solid years of use out of them.


Mike McCarthy is an online editor and workflow consultant with 10 years of experience on feature films and commercials. He has been on the forefront of pioneering new solutions for tapeless workflows, DSLR filmmaking and now multi-screen and surround video experiences. If you want to see more specific technical details about these topics, check out techwithmikefirst.com.


Dog in the Night director/DP Fletcher Wolfe

By Cory Choy

Silver Sound Showdown Music + Video Festival is unique in two ways. First, it is both a music video festival and battle of the bands at the same time. Second, every year we pair up the Grand Prize-winners, director and band, and produce a music video with them. The budget is determined by the festival’s ticket sales.

I conceived of the festival, which is held each year at Brooklyn Bowl, as a way to both celebrate and promote artistic collaboration between the film and music communities — two crowds that just don’t seem to intersect often enough. One of the most exciting things for me is then working with extremely talented filmmakers and musicians who have more often than not met for the first time at our festival.

Dog in the Night (song written by winning band Side Saddle) was one of our most ambitious videos to date — using a combination of practical and post effects. It was meticulously planned and executed by director/cinematographer Fletcher Wolfe, who was not only a pleasure to work with, but was gracious enough to sit down with me for a discussion about her process and the experience of collaborating.

What was your favorite part of making Dog in the Night?
As a music video director I consider it my first responsibility to get to know the song and its meaning very intimately. This was a great opportunity to stretch that muscle, as it was the first time I was collaborating with musicians who weren’t already close friends. In fact, I hadn’t even met them before the Showdown. I found it to be a very rewarding experience.

What is Dog in the Night about?
The song Dog in the Night is, quite simply, about a time when the singer Ian (a.k.a. Angler Boy) is enamored with a good friend, but that friend doesn’t share his romantic feelings. Of course, anyone who has been in that position (all of us?) knows that it’s never that simple. You can hear him holding out hope, choosing to float between friendship and possibly dating, and torturing himself in the process.

I decided to use dusk in the city to convey that liminal space between relationship labels. I also wanted to play on the nervous and lonely tenor of the track with images of Angler Boy surrounded by darkness, isolated in the pool of light coming from the lure on his head. I had the notion of an anglerfish roaming aimlessly in an abyss, hoping that another angler would find his light and end his loneliness. The ghastly head also shows that he doesn’t feel like he has anything in common with anybody around him except the girl he’s pining after, who he envisions having the same unusual head.

What did you shoot on?
I am a DP by trade, and always shoot the music videos I direct. It’s all one visual storytelling job to me. I shot on my Alexa Mini with a set of Zeiss Standard Speed lenses. We used the 16mm lens on the Snorricam in order to see the darkness around him and to distort him to accentuate his frantic wanderings. Every lens in the set weighed in at just 1.25lbs, which is amazing.

The camera and lenses were an ideal pairing, as I love the look of both, and their light weight allowed me to get the rig down to 11lbs in order to get the Snorricam shots. We didn’t have time to build our own custom Snorricam vest, so I found one that was ready to rent at Du-All Camera. The only caveats were that it could only handle up to 11lbs, and the vest was quite large, meaning we needed to find a way to hide the shoulders of the vest under Ian’s wardrobe. So, I took a cue from Requiem for a Dream and used winter clothing to hide the bulky vest. We chose a green and brown puffy vest that held its own shape over the rig-vest, and also suited the character.

I chose a non-standard 1.5:1 aspect ratio because I felt it suited framing for the anglerfish head. To maximize resolution and minimize data, I shot 3.2K at a 1.78:1 aspect ratio and cropped the sides. It's easy to build custom framelines in the Alexa Mini for accurate framing on set. On the Mini, you can also dial in any frame rate between 0.75fps and 60fps (at 3.2K). Thanks to digital cinema cameras, it's standard these days to overcrank and have the ability to ramp to slow motion in post. We did do some of that; each time Angler Boy sees Angler Girl, his world turns into slow motion.

In contrast, I wanted his walking around alone to be more frantic, so I did something much less common and undercranked to get a jittery effect. The opening shot was filmed at 6fps with a 45-degree shutter, and Ian walked in slow motion to a recording of the track slowed down to quarter-time, so his steps are on the beat. There are some Snorricam shots that were shot at 6fps with a standard 180-degree shutter. I then had Ian spin around to get long motion-blur trails of lights around him. I knew exactly what frame rate I wanted for each shot, and we wound up shooting at 6fps, 12fps, 24fps, 48fps and 60fps, each for a different emotion that Angler Boy is having.
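The frame-rate and shutter-angle choices above follow directly from the standard exposure-time formula; a small sketch of the setups described, assuming that formula:

```python
# Exposure time per frame from frame rate and shutter angle:
#   exposure = (angle / 360) / fps
def exposure_time(fps, shutter_angle_deg):
    return (shutter_angle_deg / 360.0) / fps

# Opening shot: 6fps with a 45-degree shutter gives a 1/48s exposure,
# a normal-looking crispness per frame, but sampled only six times a
# second, hence the jittery, staccato feel.
print(1 / exposure_time(6, 45))    # 48.0

# Snorricam spins: 6fps with a 180-degree shutter gives a 1/12s
# exposure, long enough to smear lights into motion-blur trails.
print(1 / exposure_time(6, 180))   # 12.0
```

The quarter-time playback trick works the same way: 6fps is a quarter of 24fps, so an actor moving to music slowed to quarter speed lands back on the beat once the footage plays at normal rate.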

Why practical vs. CG for the head?
Even though the fish head is a metaphor for Angler Boy’s emotional state, and is not supposed to be real, I wanted it to absolutely feel real to both the actor and the audience. A practical, and slightly unwieldy, helmet/mask helped Ian find his character. His isolation needed to be tangible, and how much he is drawn to Angler Girl as a kindred spirit needed to be moving. It’s a very endearing and relatable song, and there’s something about homemade, practical effects that checks both those boxes. The lonely pool of light coming from the lure was also an important part of the visuals, and it needed to play naturally on their faces and the fish mask. I wired Lite Gear LEDs into the head, which was the easy part. Our incredibly talented fabricator, Lauren Genutis, had the tough job — fabricating the mask from scratch!

The remaining VFX hurdle then was duplicating the head. We only had the time and money to make one and fit it to both actors with foam inserts. I planned the shots so that you almost never see both actors in the same shot at the same time, which kept the number of composited shots to a minimum. It also served to maintain the emotional disconnect between his reality and hers. When you do see them in the same shot, it’s to punctuate when he almost tells her how he feels. To achieve this I did simple split screens, using the Pen Tool in Premiere to cut the mask around their actions, including when she touches his knee. To be safe, I shot takes where she doesn’t touch his knee, but none of them conveyed what she was trying to tell him. So, I did a little smooshing around of the two shots and some patching of the background to make it so the characters could connect.

Where did you do post?
We were on a very tight budget, so I edited at home, and I always use Adobe Premiere. I went to my usual colorist, Vladimir Kucherov, for the grade. He used Blackmagic Resolve, and I love working with him. He can always see how a frame could be strengthened by a little shaping with vignettes. I’ll finally figure out what nuance is missing, and when I tell him, he’s already started working on that exact thing. That kind of shaping was especially helpful on the day exteriors, since I had hoped for a strong sunset, but instead got two flat, overcast days.

The only place we didn’t see eye to eye on this project was saturation — I asked him to push saturation farther than he normally would advise. I wanted a cartoon-like heightening of Angler Boy’s world and emotions. He’s going through a period in which he’s feeling very deeply, but by the time of writing the song he is able to look back on it and see the humor in how dramatic he was being. I think we’ve all been there.

What did you use VFX for?
Besides having to composite shots of the two actors together, there were just a few other VFX shots, including dolly moves that I stabilized with the Warp Stabilizer plug-in within Premiere. We couldn’t afford a real dolly, so we put a two-foot riser on a Dana Dolly to achieve wide push-ins on Ian singing. We were rushing to catch dusk between rainstorms, and it was tough to level the track on grass.

The final shot is a cartoon night sky composited with a live shot. My very good friend, Julie Gratz of Kaleida Vision, made the sky and animated it. She worked in Adobe After Effects, which communicates seamlessly with Premiere. Julie and I share similar tastes for how unrealistic elements can coexist with a realistic world. She also helped me in prep, giving feedback on storyboards.

Do you like the post process?
I never used to like post. I’ve always loved being on set, in a new place every day, moving physical objects with my hands. But, with each video I direct and edit I get faster and improve my post working style. Now I can say that I really do enjoy spending time alone with my footage, finding all the ways it can convey my ideas. I have fun combining real people and practical effects with the powerful post tools we can access even at home these days. It’s wonderful when people connect with the story, and then ask where I got two anglerfish heads. That makes me feel like a wizard, and who doesn’t like that?! A love of movie magic is why we choose this medium to tell our tales.


Cory Choy, Silver Sound Showdown festival director and co-founder of Silver Sound Studios, produced the video.


Bringing the documentary Long Live Benjamin to life

By Dayna McCallum

The New York Times Op-Docs recently debuted Long Live Benjamin, a six-part episodic documentary directed by Jimm Lasser (Wieden & Kennedy) and Biff Butler (Rock Paper Scissors), and produced by Rock Paper Scissors Entertainment.

The film focuses on acclaimed portrait artist Allen Hirsch, who, while visiting his wife’s homeland of Venezuela, unexpectedly falls in love. The object of his affection — a deathly ill, orphaned newborn Capuchin monkey named Benjamin. After nursing Benjamin back to health and sneaking him into New York City, Hirsch finds his life, and his sense of self, forever changed by his adopted simian son.

We reached out to Lasser and Butler to learn more about this compelling project, the challenges they faced, and the unique story of how Long Live Benjamin came to life.

Long Live Benjamin

Benjamin sculpture, Long Live Benjamin

How did this project get started?
Lasser: I was living in Portland at the time. While in New York I went to visit Allen, who is my first cousin. I knew Benjamin when he was alive, and came by to pay my respects. When I entered Allen’s studio space, I saw his sculpture of Benjamin and the frozen corpse that was serving as his muse. Seeing this scene, I felt incredibly compelled to document what my cousin was going through. I had never made a film or thought of doing so, but I found myself renting a camera and staying the weekend to begin filming and asking Allen to share his story.

Butler: Jimm had shown up for a commercial edit bearing a bag of Mini DV tapes. We offered to transfer his material to a hard drive, and I guess the initial copy was never deleted from my own drive. Upon initial preview of the material, I have to say it all felt quirky and odd enough to be humorous; but when I took the liberty of watching the material at length, I witnessed an artist wrestling with his grief. I found this profound switch in takeaway so compelling that I wanted to see where a project like this might lead.

Can you describe your collaboration on the film?
Lasser: It began as a director/editor relationship, but it evolved. Because of my access to the Hirsch family, I shot the footage and led the questioning with Allen. Biff began organizing and editing the footage. But as we began to develop the tone and feel of the storytelling, it became clear that he was as much a "director" of the story as I was.

Butler: In terms of advertising, Jimm is one of the smartest and most discerning creatives I've had the pleasure of working with. I often found myself having rather different opinions from him, but I always learned something new and felt we came to stronger creative decisions because of that conflict. When the story of Allen and his monkey began unfolding in front of me, I was just as keen to foster this creative relationship as I was to build a movie.

Did the film change your working relationship?
Butler: As a commercial editor, it’s my job to carry a creative team’s hard work to the end of their laborious process — they conceive the idea, sell it through, get it made and trust me to glue the pieces together. I am of service to this, and it’s a privilege. When the footage I’d found on my hard drive started to take shape, and Jimm’s cousin began unloading his archive of paintings, photographs and home video on to us, it became a more involved endeavor. Years passed, as we’d get busy and leave things to gather dust for months here and there, and after a while it felt like this film was something that reflected both of our creative fingerprints.

Long Live Benjamin

Jimm Lasser, Long Live Benjamin

How did your professional experiences help or influence the project?
Lasser: Collaboration is central to the process of creating advertising. Being open to others is central to making great advertising. This process was a lot like film school. We both hadn’t ever done it, but we figured it out and found a way to work together.

Butler: Jimm and I enjoyed individual professional success during the years we spent on the project, and in hindsight I think this helped to reinforce the trust that was necessary in such a partnership.

What was the biggest technical challenge you faced?
Butler: The biggest challenge was just trying to get our schedules to line up. For a number of years we lived on opposite sides of the country, although there were three years where we both happened to live in New York at the same time. We found that the biggest creative strides happened when we had the luxury of sitting together. Most of the time, though, I would work on an edit, send it to Jimm, and wait for him to give feedback. Then I'd be busy on something else when he'd send long detailed notes (and often new interviews to supplement the notes), and I would need to wait a while until I had the time to dig back in.

Technically speaking, the biggest issue might just be my use of Final Cut Pro 7. The film is made as a scrapbook from multiple sources, and quite simply Final Cut Pro doesn’t care much for this! Because we never really “set out” to “make a movie,” I had let the project grow somewhat unwieldy before realizing it needed to be organized as such.

Long Live Benjamin

Biff Butler, Long Live Benjamin

Can you detail your editorial workflow? What challenges did the varying media sources pose?
Butler: As I noted before, we didn’t set out to make a movie. I had about 10 tapes from Jimm and cut a short video just because I figured it’s not every day you get to edit someone’s monkey funeral. Cat videos this ain’t. Once Allen saw this, he would sporadically mail us photographs, newspaper clippings, VHS home videos, iPhone clips, anything and everything. Jimm and I were really just patching on to our initial short piece, until one day we realized we should start from scratch and make a movie.

As my preferred editing software is Final Cut Pro 7 (I’m old school, I guess), we stuck with it and just had to make sure the media was managed in a way that had all sources compressed to a common setting. It wasn’t really an issue, but needed some unraveling once we went to online conform. Due to our schedules, the process occurred in spurts. We’d make strides for a couple weeks, then leave it be for a month or so at a time. There was never a time where the project wasn’t in my backpack, however, and it proved to be my companion for over five years. If there was a day off, I would keep my blades sharp by cracking open the monkey movie and chipping away.

You shot the project as a continuous feature, and it is being shown now in episodic form. How does it feel to watch it as an episodic series?
Lasser: It works both ways, which I am very proud of. The longer form piece really lets you sink into Allen’s world. By the end of it, you feel Allen’s POV more deeply. I think not interrupting Alison Ables’ music allows the narrative to have a greater emotional connective tissue. I would bet there are more tears at the end of the longer format.

The episode form sharpened the narrative and made Allen’s story more digestible. I think that form makes it more open to a greater audience. Coming from advertising, I am used to respecting people’s attention spans, and telling stories in accessible forms.

How would you compare the documentary process to your commercial work? What surprised you?
Lasser: The executions of both are "storytelling," but advertising has another layer of "marketing problem solving" that affects creative decisions. I was surprised how much Allen became a "client" in the process, since he was opening himself up so much. I had to keep his trust and assure him I was giving his story the dignity it deserved. It would have been easy to make his story into a joke.

Artist Allen Hirsch

Butler: It was my intention to never meet Allen until the movie was done, because I cherished that distance I had from him. In comparison to making a commercial, the key word here would be “truth.” The film is not selling anything. It’s not an advertisement for Allen, or monkeys, or art or New York. We certainly allowed our style to be influenced by Allen’s way of speaking, to sink deep into his mindset and point of view. Admittedly, I am very often bored by documentary features; there tends to be a good 20 minutes that is only there so it can be called “feature length” but totally disregards the attention span of the audience. On the flip side, there is an enjoyable challenge in commercial making where you are tasked to take the audience on a journey in only 60 seconds, and sometimes 30 or 15. I was surprised by how much I enjoyed being in control of what our audience felt and how they felt it.

What do you hope people will take away from the film?
Lasser: To me this is a portrait of an artist. His relationship with Benjamin is really an ingredient to his own artistic process. Too often we focus on the end product of an artist, but I was fascinated in the headspace that leads a creative person to create.

Butler: What I found most relatable in Allen’s journey was how much life seemed to happen “to” him. He did not set out to be the eccentric man with a monkey on his shoulders; it was through a deep connection with an animal that he found comfort and purpose. I hope people sympathize with Allen in this way.


To watch Long Live Benjamin, click here.


CAS and MPSE bestow craft honors to audio pros, filmmakers

By Mel Lambert

While the Academy Awards spotlight films released during the past year, members of the Cinema Audio Society (CAS) and Motion Picture Sound Editors (MPSE) focus on both film and TV productions.

The 53rd CAS Awards — held at the Omni Los Angeles Hotel on February 18, and hosted once again by comedian Elayne Boosler — celebrated the lifetime contributions of production mixer John Pritchett with the CAS Career Achievement Award for his multiple film credits. The award was presented by re-recording mixer Scott Millan, CAS, and actor/producer Jack Black, with a special video tribute from actor/director/producer Tom Hanks. Quoting seasoned sound designer Walter Murch, Millan shared, “Dialog is the backbone of a film.”

“Sound mixing is like plastic surgery,” Black advised. “You only notice it when it’s done badly.”

Actor/director Jon Favreau received the CAS Filmmaker Award from actor/writer Seth MacFarlane, film composer John Debney and CAS president Mark Ulano. Clips from the director's key offerings, including The Jungle Book, Chef, Cowboys & Aliens, Iron Man and Iron Man 2, were followed by pre-recorded congratulations from Stan Lee and Ed Asner. "Production and post production are invisible arts," said Favreau. "Because if you do it right, it's invisible. If you want to look good on the set you need to understand sound."

Presenters Robert Forster and Melissa Hoffman flanking winners of the CAS Award for Outstanding Sound Mixing Motion Picture for La La Land.

The CAS Award for Outstanding Sound Mixing Motion Picture — Live Action went to the team behind La La Land: production mixer Steven Morrow, CAS; re-recording mixers Andy Nelson, CAS, and Ai-Ling Lee, scoring mixer Nicholai Baxter, ADR mixer David Betancourt and Foley mixer James Ashwill. “It was a blast to work with Andy Nelson and the Fox Sound Department,” said Lee. The film’s director, Damien Chazelle, also was on hand to support his award-winning crew. Other nominees included Doctor Strange, Hacksaw Ridge, Rogue One: A Star Wars Story and Sully.

The CAS Award for Outstanding Sound Mixing Motion Picture — Animated went to Finding Dory and original dialogue mixer Doc Kane, CAS, re-recording mixers Nathan Nance and Michael Semanick, CAS, scoring mixer Thomas Vicari, CAS, and Foley mixer Scott Curtis. “I’ve got the best job in the world,” Kane offered, “recording all these talented people.”

 

Kevin O’Connell and Angela Sarafyan flanking Dennis Hamlin and Peter Horner, winners of the CAS Award for Outstanding Sound Mixing Motion Picture — Documentary.

During a humorous exchange with his co-presenter Angela Sarafyan, an actress who starred in HBO’s Westworld series, re-recording mixer Kevin O’Connell, CAS, was asked why the 21-time Oscar-nominee had not — as yet — received an Academy Award. Pausing briefly to collect his thoughts, O’Connell replied that he thought the reasons were three-fold. “First, because I do not work at Skywalker Sound,” he said, referring to Disney Studios’ post facility in Northern California, which has hosted a number of nominated sound projects. “Secondly, I do not work on musicals,” he continued, referring to the high number of Oscar and similar nominations this year for La La Land. “And third, because I do not sit next to Andy Nelson,” an affectionate reference to the popular re-recording engineer’s multiple Oscar wins and current nomination for La La Land. (For O’Connell it seems the 21st time is the charm. He walked away from this year’s Oscar with a statuette for his work on Hacksaw Ridge.)

O’Connell and Sarafyan then presented the first-ever CAS Award for Outstanding Sound Mixing Motion Picture — Documentary to the team that worked on The Music of Strangers: Yo-Yo Ma and The Silk Road Ensemble: production mixers Dimitri Tisseyre and Dennis Hamlin, plus re-recording mixer Peter Horner.

The CAS Award for Outstanding Sound Mixing Television Movie or Miniseries went to The People v. O.J. Simpson: American Crime Story and production mixer John Bauman, re-recording mixers Joe Earle, CAS, and Doug Andham, CAS, ADR mixer Judah Getz and Foley mixer John Guentner. The award for Television Series — 1-Hour went to Game of Thrones: Battle of the Bastards and production mixers Ronan Hill, CAS, and Richard Dyer, CAS, re-recording mixers Onnalee Blank, CAS, and Mathew Waters, CAS, and Foley mixer Brett Voss, CAS. “Game of Thrones was a great piece of art to work on,” said Blank.

L-R: Game of Thrones: Battle of the Bastards team — Onnalee Blank, Brett Voss and Mathew Waters, with Karol Urban and Clyde Kusatsu.

The award for Television Series — 1/2-Hour went to Modern Family: The Storm and production mixer Stephen A. Tibbo, CAS, and re-recording mixers Dean Okrand, CAS, and Brian R. Harman, CAS. The award for Television Non-Fiction, Variety or Music Series or Specials went to Grease Live! and production mixer J. Mark King, music mixer Biff Dawes, playback and SFX mixer Eric Johnston and Pro Tools playback music mixer Pablo Munguía.

The CAS Student Recognition Award went to Wenrui “Sam” Fan from Chapman University. Outstanding Product Awards went to Cedar Audio for its DNS2 Dynamic Noise Suppression Unit and McDSP for its SA-2 dialog processor.

Other presenters included Nancy Cartwright (The Simpsons), Robert Forster (Jackie Brown), Janina Gavankar (Sleepy Hollow), Clyde Kusatsu (SAG/AFTRA VP and Madame Secretary), Rhea Seehorn (Better Call Saul) and Nondumiso Tembe (Six).

MPSE
Held on February 19 at the Westin Bonaventure Hotel in downtown Los Angeles, opening remarks for the 64th MPSE Golden Reel Awards came from MPSE president Tom McCarthy. “Digital technology is creating new workflows for our sound artists. We need to take the initiative and drive technology, and not let technology drive us,” he said, citing recent and upcoming MPSE Sound Advice confabs. “The horizons for sound are expanding, particularly virtual reality. Immersive formats from Dolby, Auro, DTS and IMAX are enriching the cinematic experience.”

Scott Gershin, MPSE Filmmaker Award recipient Guillermo del Toro and Tom McCarthy.

The annual MPSE Filmmaker Award was presented to writer/director Guillermo del Toro by supervising sound editor/sound designer Scott Gershin, who has worked with him for the past 15 years on such films as Hellboy II: The Golden Army (2008) and Pacific Rim (2013). “Sound editing is an opportunity in storytelling,” the director offered. “There is always a balance we need to strike between sound effects and music. It’s a delicate tango. Sound design and editing is a curatorial position. I always take that partnership seriously in my films.”

Referring to recent presidential decisions to erect border walls and tighten immigration controls, del Toro was candid in his position. “I’m a Mexican,” he stated. “Giving me this award [means] that the barriers people are trying to erect between us are false,” he stressed, to substantial audience applause.

Supervising sound editor/sound designer Wylie Stateman and producer Shannon McIntosh presented the MPSE Career Achievement Award to supervising sound editor/sound designer Harry Cohen, who has worked on more than 150 films, including many directed by Quentin Tarantino, who made a surprise appearance to introduce the award recipient. “I aspired to be a performing musician,” Cohen acknowledged, “and was 31 when I became an editor. Sound design is a craft. You refine the director’s creativity through your own lens.” He also emphasized the mentoring process within the sound community, “which leads to a free flow of information.”

The remaining Golden Reel Awards comprised several dozen categories encompassing feature films, long- and short-form TV, animation, documentaries and other media.

The Best Sound Editing In Feature Film — Music Score award went to Warcraft: The Beginning and music editors Michael Bauer and Peter Myles. The Best Sound Editing In Feature Film — Music, Musical Feature award went to La La Land music editor Jason Ruder.

The Hacksaw Ridge team included (L-R) Michelle Perrone, Kimberly Harris, Justine Angus, Jed Dodge, Robert Mackenzie, Liam Price and Tara Webb.

The Best Sound Editing In Feature Film — Dialog/ADR award went to director Mel Gibson’s Hacksaw Ridge and supervising sound editor Andy Wright, supervising ADR editors Justine Angus and Kimberly Harris, dialog editor Jed Dodge and ADR editor Michele Perrone. The Best Sound Editing In Feature Film — FX/Foley award also went to Hacksaw Ridge and supervising sound editor Robert Mackenzie, Foley editors Steve Burgess and Alex Francis, plus sound effects editors Liam Price and Tara Webb.

The MPSE Best Sound & Music Editing: Television Animation award went to Albert and supervising sound editor Jeff Shiffman, MPSE, dialogue editors Michael Petak and Anna Adams, Foley editor Tess Fournier, music editor Brad Breeck, plus SFX editors Jessey Drake, MPSE, Tess Fournier and Jeff Shiffman, MPSE. The Best Sound & Music Editing: Television Documentary Short-Form award went to Sonic Sea and supervising sound editor Trevor Gates, dialog editor Ryan Briley and SFX editors Ron Aston and Christopher Bonis. The Best Sound & Music Editing: Television Documentary Long-Form award went to My Beautiful Broken Brain and supervising sound editor Nick Ryan, dialog editor Claire Ellis and SFX editor Tom Foster. The Best Sound & Music Editing: Animation — Feature Film award went to Moana and supervising sound editor Tim Nielsen, supervising dialog editor Jacob Riehle, Foley editors Thom Brennan and Matthew Harrison, music editors Earl Ghaffari and Dan Pinder, plus SFX editors Jonathan Borland, Pascal Garneau and Lee Gilmore. The Best Sound & Music Editing: Documentaries — Feature Film award went to The Music of Strangers: Yo-Yo Ma and The Silk Road Ensemble and supervising sound editor Pete Horner, sound designer Al Nelson and SFX editor Andre Zweers.

The Verna Fields Award in Sound Editing in Student Films was a tie, with $1,500 checks being awarded to Fishwitch, directed by Adrienne Dowling from the National Film and Television School, and Icarus by supervising sound editor/sound designer Zoltan Juhasz from Dodge College of Film and Media Arts, Chapman University.

The MPSE Best Sound & Music Editing: Special Venue award went to supervising sound editor/sound designer Jamey Scott for his work on director Patrick Osborne’s Pearl, a panoramic virtual reality presentation that has also been nominated in the Oscar category for Best Animated Short. The Best Sound Editing In Television: Short Form — Music Score award went to music editor David Klotz for his work on Stranger Things, Chapter Three: Holly Jolly. “The show’s composers — Kyle Dixon and Michael Stein — were an inspiration to work with,” said Klotz, “as was the sound team at Technicolor.” The Best Sound Editing In Television: Short Form — Music, Musical award was another tie, between music editors Jason Tregoe Newman and Bryant J. Fuhrmann for Mozart in the Jungle — Now I Will Sing and music editor Jamieson Shaw for The Get Down — Raise Your Words, Not Your Voice.

The winning Westworld team included Thomas E. de Gorter (center), Matthew Sawelson, Geordy Sincavage, Michael Head, Mark R. Allen and Marc Glassman.

The Best Sound Editing In Television: Short Form — Dialog/ADR award went to the team from Penny Dreadful III, including supervising sound editor Jane Tattersall, supervising dialogue editor David McCallum, dialog editor Elma Bello, and ADR editors Dale Sheldrake and Paul Conway. The Best Sound Editing In Television: Short Form — FX/Foley award went to Westworld — Trompe L’Oeil and supervising sound editors Thomas E. de Gorter, MPSE, and Matthew Sawelson, MPSE, Foley editors Geordy Sincavage and Michael Head, and sound designers Mark R. Allen, MPSE, and Marc Glassman, MPSE. The same post team won the Best Sound Editing In Television: Long Form — FX/Foley award for Westworld — The Bicameral Mind. The Best Sound Editing In Television: Long Form — Dialog/ADR award went to The Night Of — Part 1: The Beach and supervising sound editor Nicholas Renbeck and dialog editors Sara Stern, Luciano Vignola and Odin Benitez.

Presenters included actor Erich Riegelmann, actress Julie Parker, Avid director strategic solutions Rich Nevens, SFX editor Liam Price, producer/journalist Geoff Keighley, Formosa Interactive VP of creative services Paul Lipson, CAS president Mark Ulano, actress Andrene Ward-Hammond, supervising sound editors Mark Lanza and Bernard Weiser, picture editor Sabrina Plisco, and Technicolor VP/head of theatrical sound Jeff Eisner.

MPSE president McCarthy offered that the future for entertainment sound has no boundaries. “It is impossible to predict what new challenges will be presented to practitioners of our craft in the years to come,” he said. “It is up to all of us to meet those challenges with creativity, professionalism and skill. MPSE membership now extends around the world. We are building a global network of sound professionals in order to help artists collaborate and share ideas with their peers.”

A complete list of MPSE Golden Reel Awards can be found on its website.

Main Image (L-R): John Debney, CAS Filmmaker Award recipient Jon Favreau, Seth MacFarlane and Mark Ulano. 

CAS images – Alex J. Berliner/ABImages
MPSE Images – Chris Schmitt Photography


Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Michael Vinyard joins Xytech exec team

Xytech, which makes facility management software for the broadcast and media industries, has added industry vet Michael Vinyard in the new role of SVP Professional Services. Vinyard will be responsible for consulting, configuration and installation services for system implementations across the company.

Vinyard’s previous senior management roles include stints with Mattel, Warner Bros. and CBS. At Xytech, he will be based out of the company’s Chatsworth headquarters.

“Having Michael allows us to expand our goals while maintaining the focus required to properly serve our clients,” said Greg Dolan, Xytech COO. “The addition of Michael shows our dedication to working with the best professionals in the business.”

Second HPA Tech Retreat UK announced

The Hollywood Professional Association (HPA) has announced the dates and opened the call for proposals for the second annual HPA Tech Retreat UK. Presented in association with SMPTE, the event returns to Heythrop Park Resort in Oxfordshire July 11-13.

Programming for the HPA Tech Retreat UK is built from proposals, along with the participation of notable speakers. The proposal deadline for the main program is May 30.

“We’ll feature seminars, a Supersession and the curated Innovation Zone, where attendees can explore the latest developments in workflow, tools and technologies,” said Richard Welsh, co-chair of the HPA Tech Retreat UK. “Proposals for the main program, as well as the breakfast roundtables, may address topics related to moving images and associated sound, great projects or worthy technological development.”

The 2016 program included a special session on the production and post of Game of Thrones, Hybrid Log-Gamma (HLG) vs. Perceptual Quantiser (PQ), IMF, the European Broadcasting Union’s (EBU) view of the media landscape, and cloud workflows.

For more information on the HPA Tech Retreat UK, click here.

Review: Soundly — an essential tool for sound designers

By Ron DiCesare

The people behind the sound effects database Soundly and I think alike. We both imagine a world where all audio files are accessible from any computer at any time. Soundly is helping accomplish that with its cloud-based, searchable sound effects database and online sound effects library. Having access to thousands of sound effects from any computer with Internet access is long overdue, and I am pleased to see Soundly paving the way to what I see as the inevitable workflow of the future.

When I started out in audio post production years ago, sound effect libraries were all on CDs. Back then I had to look through a huge directory listing the tens of thousands of sounds available on all of the audio CDs, which I called “the big phone book of sounds.” I remember thinking to myself that there must be a better way. After years of struggling with these phone books, technology finally made a viable step forward with iTunes. That led to my “innovative” idea to rip all of my sound effect CDs to iTunes to use it as a makeshift searchable database. It was crude, but worked a hell of a lot better than the phone books and audio CDs!

Once digital audio files became the norm, technology got on board and finally offered us searchable database programs exclusively for sound effects. Now Soundly has made another leap forward with its cloud access.

Over the years, I have acquired well over 100,000 sound effects — 112,495 to be exact. My library includes a fair amount of custom sounds (particularly vocal reactions) that I have recorded myself. All of these sounds are stored on a 1TB external hard drive (along with an iLok dongle) that I take with me to every studio I work at, including my home studio.

The problem for me is that I am a freelance audio mixer and sound designer working at many different studios in New York City, in addition to my home studio on Long Island. That means I am forced to take my external sound effects drive and iLok to every studio I work at for every session. I am always at risk of losing the drive and/or iLok, or simply leaving them behind when going to and from studios. I have often asked myself, wouldn’t it be great to have all my sounds accessible from any computer with Internet access at all times? Enter Soundly.

Soundly can be broken down into two main parts. First, it includes a library of sounds for immediate use — 300-plus or 7,500-plus, depending on whether you choose the free option or the monthly subscription. Second, it lets users add all of their existing sound effects from a local drive or, better yet, upload them to the cloud. Uploading to the cloud makes your sounds available from any computer with Internet access, in addition to the over 7,500 sound effects included with Soundly.

A Wide Appeal
Soundly is available for Mac and PC, and is very easy to install — it took me just a few minutes. Once installed, the program immediately gives access to over 7,500 high-quality sound effects, many as 96kHz, 24-bit WAV files. This is ideal for anyone unable to spend the thousands of dollars needed to build up a large library by purchasing sound effects from a variety of companies. That could include video editors, who are often asked to do sound design without a proper or significant database of sounds to choose from. All too often these video editors are forced to look to the Internet for any kind of free sound effect, where the quality can be dubious. Audio mixers and sound designers who are just starting out and getting their libraries underway could benefit as well.

In addition to accessing 7,500-plus high-quality sounds, Soundly allows for the purchase of additional sound effect libraries in the store section of the program, such as “Cinematic Hits and Transitions” from SoundBits and “Summer Nature Ambiences” by Soundholder. The store also gives the user access to all free sound effects across the Internet via Freesound.org. This will no doubt help fill in any gaps in the large variety of sounds needed for any video editor or sound designer. But just as the Soundly disclaimer notes for the free sound effects, there is no way to enforce any kind of quality control or audio standard for the wide range of free sounds available throughout the Internet. Even so, Soundly manages to be a one-stop shop for all Internet sound searches rather than just randomly searching the Internet blindly.

Targeted Appeal
Any seasoned audio mixer or sound designer will tell you that it is best to stay away from free sounds found on the Internet in general. Audio mixers like me who have been working for over 30 years (though I do not look like I am over 50!) are more likely to have built up their own sound effect libraries over the years that they prefer to use. For example, my sound effect library contains both purchased sounds from many of the various commercial libraries and a fair amount of custom sounds I have recorded on the job. That is why uploading a user’s own entire sound effect library to the cloud for use with Soundly (which in my case is almost 1TB) is an absolute necessity.

Now I admit, I am the exception and not the rule. I need access to all of my audio files at all times because I am never in one place for long. That is why Soundly is ideal for me. I can dial up Soundly and access the cloud instantly from any computer that has Internet access. Now I can leave my sound effects drive at home, which is a huge relief.

I know that the vast majority of audio professionals on my level have a staff position. Most of them typically work at multi-room facilities and rarely, if ever, need to leave their facility for an audio mix or sound design. Soundly offers multi-room licenses for just that reason. But more importantly, it means that most of the major audio facilities have their sound effect libraries accessible to all their staff on some kind of network server such as a RAID or NAS. So why switch to Soundly’s cloud storage service when an audio or video facility has access to many TBs worth of network storage of their own? The answer in a nutshell is price.

To fully understand if Soundly could replace a network server in a large audio or video facility, let’s break down Soundly’s pricing options, starting with the free option. Soundly offers access to the free cloud library of over 300 sound effects, a maximum of 2,500 pre-existing local files and no upload space allotment. Next is Soundly’s Pro subscription for $14.99 a month, allowing for all the features of Soundly, access to the 7,500-plus cloud-based sound effects and unlimited access to pre-existing local files.

But for the real heavy lifting, Soundly offers storage space options needed to upload large amounts of sounds to the cloud at a very competitive rate. For example, to get access to my pre-existing sound effect library totaling nearly 1TB worth of sound effects, Soundly offers an annual fee of $500 for cloud storage that size. Compare that to the cost of installing and maintaining RAID or NAS storage systems that a large facility might use and it could very well be a better and more cost-effective option, not to mention it’s accessible everywhere. So freelancers like me, or staff audio engineers, can count on reliable, safe, large-scale storage of their data by switching to Soundly.

Operation
Installing Soundly is fast and easy, and I was instantly able to access all of the included sounds. Uploading my entire sound effect library took time, but it was well worth the effort for such a large number of files. Searching for sound effects worked exactly as I expected: all possible sounds came up with the search criteria I specified, based on file names and metadata. Simply click on any sound file to play it and see if it’s right for your project.
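To picture how that kind of search behaves, here is a rough illustrative sketch (hypothetical file names, and not Soundly's actual implementation) of an every-term keyword match over a library's file-name and metadata strings:

```python
def matches(query: str, entry: str) -> bool:
    """True if every whitespace-separated search term appears in the entry (case-insensitive)."""
    entry_lower = entry.lower()
    return all(term in entry_lower for term in query.lower().split())

def search_library(query: str, entries: list[str]) -> list[str]:
    """Return library entries whose name/metadata string contains every search term."""
    return [name for name in entries if matches(query, name)]

# Hypothetical library entries: file names with embedded descriptive metadata
library = [
    "door_slam_wood_interior_96k24b.wav",
    "door_creak_old_hinge.wav",
    "crowd_walla_restaurant_int.wav",
]

print(search_library("door wood", library))  # only the wood door slam matches both terms
```

Because every term must match, adding words to the query narrows the results, which mirrors how refining search criteria narrows the candidate sounds.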

Now here is where Soundly really impressed me. There are two ways of exporting your sound files: drag and drop, and what Soundly calls “spot-to.” Drag and drop works with Pro Tools, Nuendo, Avid Media Composer, Adobe Premiere Pro CC and FCP X and 7, to name a few. The “spot-to” function works with Pro Tools, specifically Pro Tools HD 12.7, and it is where the real power and speed come into play. The “spot-to” icon appears automatically whenever Pro Tools is active (it disappears when Pro Tools is not active, so just be aware of that). Click on the icon and your sound file is sent to Pro Tools in an instant.

There are two great options when using the “spot-to” icon: spot to bin or spot to timeline. Each has its advantages depending on how you like to work. Sending to your bin makes the sound accessible via the clip list in Pro Tools. Sending to the timeline adds it wherever your cursor is located on any track, which is a real time saver. To illustrate this, consider how few steps are needed to get a sound file into your timeline or bin. Step one: select the sound in Soundly. Step two: send it to Pro Tools using the “spot-to” icon. Step three: start working with the sound file in your session, which really is not a step. So it is actually just two steps. Yes, it’s that fast and easy.

For me, the most important aspect of Soundly’s “spot-to” function is that it copies the sound file to Pro Tools rather than referencing it. This is significant. Some people may have learned the hard way, as I have, that referencing a sound effect does not include that sound effect in the audio folder within your session. Copying it into your session’s audio folder allows you to move your session from drive to drive, room to room or studio to studio without the dreaded missing sound file error message in Pro Tools when the drive or network housing the sound effects cannot be located. As far as I know, only Soundminer’s higher-priced options perform this crucial copy-to-audio-folder step. In contrast, all of Soundly’s pricing options do.

Let’s not ignore the fact that Soundly works as a standalone program, with no DAW or video editing software needed. Should you want to work outside of your DAW or video software for whatever reason, simply drag and drop the sound file to a folder located anywhere, say your desktop.

Organization
With Soundly, there are a variety of ways you can organize your library, all customizable and up to the user. I kept it very simple, choosing a three-folder hierarchy: Soundly’s built-in cloud library, my entire personal sound effects library and my “greatest hits” of most useful sounds. All three folders sit under the master cloud folder, which means all my sounds and folders can be searched at once, or in any combination. You can choose one or more of your folders whenever you do a search, so you can really home in on a search if you set up multiple subfolders. For me, a typical search covers all my sounds at once, since I cannot take the time to think of subcategories that may or may not yield better results. My organization and setup are purely my own preference and are sure to vary from user to user; each person can set up their folders however they feel best organizes their library.

Hard to Pick a Favorite Feature
I think my absolute favorite feature of Soundly is the pitch shift function. That’s because whenever I am finding and auditioning sounds with the pitch shift engaged (up or down), the sound file will be sent to my DAW with the exact amount of pitch shift applied to the sound effect! That means I do not have to recreate or guess the amount of pitch shifting I used when auditioning the sound after it is imported into Pro Tools. The same goes for the reverse function. There is no doubt that pitch shift and reverse are the two most common alterations for sound effects done by sound designers. Soundly has these two crucial functions built-in to the search and export functions.

Another feature worth noting is marking favorite or popular sounds with a star, like flagging an important email. Marking your favorite sounds with the star icon means you do not have to make a separate folder for your favorites, as I have done in the past. Playlists are another noteworthy feature. A playlist is a great way to collect sounds as you search for a project; the whole list can then be downloaded or sent to your DAW in a more organized fashion after your search. This is much faster than downloading each sound effect one by one as you find the sounds needed for larger sound design projects, and making multiple playlists is another way to speed up the overall search process. Playlists can be shared with other Soundly users.

More to Come
In the future, we can expect to see more options for the output format. Currently you can choose bit rate and sample rate, but you will only be able to export .wav files. Future releases are slated to include AIFF, MP3 and even Ogg Vorbis for the gaming world.

As Soundly grows, there will be more sound effects added to the cloud for use. Not surprisingly, the folks behind Soundly are sound designers and the program clearly reflects that. Soundly’s developer Peder Jørgensen and sound designer Christian Schaanning really understand how today’s sound designers work. More importantly, they understand how tomorrow’s sound designers will work.


Ron DiCesare is an audio mixer and sound designer located in the New York City area. His work can be heard on promos and shows, including “Noisey” featuring Kendrick Lamar, “B. Deep,” “F**k That’s Delicious” and “Moltissimo” with Chef Mario Batali on Vice’s Munchies channel. He also works on spots and promos. He can be reached at rononizer@gmail.com.

Steve Porter

Behind the Title: MTI colorist Steve Porter

NAME: Steve Porter

COMPANY: MTI Film

CAN YOU DESCRIBE YOUR COMPANY?
We are a Hollywood-based post facility that specializes in TV finishing, film restoration and software development.

AS A COLORIST, WHAT WOULD SURPRISE PEOPLE ABOUT WHAT FALLS UNDER THAT TITLE?
That I am also a skin care specialist, VFX artist and therapist.

WHAT SYSTEM DO YOU WORK ON?
Digital Vision Nucoda.

Bates Motel

ARE YOU ASKED TO DO MORE THAN JUST COLOR ON PROJECTS?
This career has proven to be as much about color as it is about understanding cameras and technical workflows.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Being able to put my stamp as an artist on a project and working with great DPs and clients that like collaboration.

WHAT’S YOUR LEAST FAVORITE?
Working with someone that doesn’t respect the art of it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Trying to make the professional golf tour.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I’ve always loved photography and movies… it was a path that I never even knew existed when I was younger. It was something that bridged those two worlds, and I was lucky enough to have a knack for it.

CAN YOU NAME SOME RECENT PROJECTS?
Outlander, Outcast, Bates Motel, Good Behavior, The Magicians and Hell on Wheels.

Outlander

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
The projects that I just mentioned above… and I love Outlander — the clients and the show, they allow great freedom to create many different worlds. I enjoy that and take great pride in being a part of it.

WHERE DO YOU FIND INSPIRATION?
Watching movies. Getting a wonderfully shot show from a great DP and seeing where something takes me — that’s inspiration.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
A cell phone, my color corrector and a set of golf clubs.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I really only check Facebook.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Did I mention I like to golf?

Post vet Russ Robertson returns to Deluxe, joins Encore New York

After a year away, Russ Robertson has returned to Deluxe as SVP of sales at the company’s Encore New York. With scripted original series reaching 455 — a record — in 2016 and more shows delivering in HDR formats, Robertson’s 20 years of post experience will support content creators as they navigate this global, multi-format market. He re-joins Deluxe after a year at Panavision, where he was VP of marketing of camera systems and production services.

Robertson first joined Deluxe in 2002 in Toronto. He spent 14 years as VP of sales in Toronto, Vancouver and New York. He helped establish the New York outpost of Deluxe’s Encore in the process. He began his 20-year post career in sales and services roles at a number of facilities in Toronto.

“I had an amazingly educational year learning about cameras and lenses, but there’s so much happening in post right now — new models, a sea change in workflows with HDR, and so much opportunity to help clients create content for worldwide audiences, I couldn’t stay away,” said Robertson.

Cory Melious

Behind the Title: Heard City senior sound designer/mixer Cory Melious

NAME: Cory Melious

COMPANY: Heard City (@heardcity)

CAN YOU DESCRIBE YOUR COMPANY?
We are an audio post production company.

WHAT’S YOUR JOB TITLE?
Senior Sound Designer/Mixer

WHAT DOES THAT ENTAIL?
I provide final mastering of the audio soundtrack for commercials, TV shows and movies. I combine the production audio recorded on set (typically dialog), narration, music (whether an original composition or an artist track) and sound effects (often created by me) into one 5.1 surround soundtrack that plays on both TV and Internet.

Heard City

WHAT WOULD SURPRISE PEOPLE ABOUT WHAT FALLS UNDER THAT TITLE?
I think most people without a production background think the sound of a spot just “is.” They don’t really think about how or why it happens. Once I start explaining the sonic layers we combine to make up the final mix they are really surprised.

WHAT’S YOUR FAVORITE PART OF THE JOB?
The part that really excites me is the fact that each spot offers its own unique challenge. I take raw audio elements and tweak and mold them into a mix. Working with the agency creatives, we’re able to develop a mix that helps tell the story being presented in the spot. In that respect I feel like my job changes day in and day out and feels fresh every day.

WHAT’S YOUR LEAST FAVORITE?
Working late! There are a lot of late hours in creative jobs.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I really like finishing a job. It’s that feeling of accomplishment when, after a few hours, I’m able to take some pretty rough-sounding dialog and manipulate that into a smooth-sounding final mix. It’s also when the clients we work with are happy during the final stages of their project.

WHAT TOOLS DO YOU USE ON A DAY-TO-DAY BASIS?
Avid Pro Tools, Izotope RX, Waves Mercury, Altiverb and Revibe.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
One of my many hobbies is making furniture. My dad is a carpenter and taught me how to build at a very young age. If I never had the opportunity to come to New York and make a career here, I’d probably be building and making furniture near my hometown of Seneca Castle, New York.

WHY DID YOU CHOOSE THIS PROFESSION? HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I think this profession chose me. When I was a kid I was really into electronics and sound. I was both the drummer and the front of house sound mixer for my high school band. Mixing from behind the speakers definitely presents some challenges! I went on to college to pursue a career in music recording, but when I got an internship in New York at a premier post studio, I truly fell in love with creating sound for picture.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Recently, I’ve worked on Chobani, Google, Microsoft, and Budweiser. I also did a film called The Discovery for Netflix.

The Discovery for Netflix.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I’d probably have to say Chobani. That was a challenging campaign because the athletes featured in it were very busy. In order to capture the voiceover properly I was sent to Orlando and Los Angeles to supervise the narration recording and make sure it was suitable for broadcast. The spots ran during the Olympics, so they had to be top notch.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
iPhone, iPad and depth finder. I love boating and can’t imagine navigating these waters without knowing the depth!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m on the basics — Facebook, LinkedIn and Instagram. I dabble with SnapChat occasionally and will even open up Twitter once in a while to see what’s trending. I’m a fan of photography and nature, so I follow a bunch of outdoor Instagramers.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I joke with my friends that all of my hobbies are those of retired folks — sailing, golfing, fly fishing, masterful dog training, skiing, biking, etc. I joke that I’m practicing for retirement. I think hobbies that force me to relax and get out of NYC are really good for me.

ACE Eddie nominees include Arrival, Manchester by the Sea, Better Call Saul

The American Cinema Editors (ACE) have named the nominees for the 67th ACE Eddie Awards, which recognize editing in 10 categories of film, television and documentaries.

Winners will be announced during ACE’s annual awards ceremony on January 27 at the Beverly Hilton Hotel. In addition to the regular editing awards, J.J. Abrams will receive the ACE Golden Eddie Filmmaker of the Year award.

Check out the nominees:

BEST EDITED FEATURE FILM (DRAMATIC)
Arrival
Joe Walker, ACE

Hacksaw Ridge
John Gilbert, ACE

Hell or High Water
Jake Roberts

Manchester by the Sea
Jennifer Lame
 
Moonlight
Nat Sanders, Joi McMillon

BEST EDITED FEATURE FILM (COMEDY)
Deadpool
Julian Clarke, ACE

Hail, Caesar!
Roderick Jaynes

The Jungle Book
Mark Livolsi, ACE

La La Land
Tom Cross, ACE

The Lobster
Yorgos Mavropsaridis

BEST EDITED ANIMATED FEATURE FILM
Kubo and the Two Strings
Christopher Murrie, ACE

Moana
Jeff Draheim, ACE

Zootopia
Fabienne Rawley and Jeremy Milton

BEST EDITED DOCUMENTARY (FEATURE)
13th
Spencer Averick

Amanda Knox
Matthew Hamachek

The Beatles: Eight Days a Week — The Touring Years
Paul Crowder

OJ: Made in America
Bret Granato, Maya Mumma and Ben Sozanski

Weiner
Eli B. Despres

BEST EDITED DOCUMENTARY (TELEVISION)
The Choice 2016
Steve Audette, ACE

Everything Is Copy
Bob Eisenhardt, ACE

We Will Rise: Michelle Obama’s Mission to Educate Girls Around the World
Oliver Lief

BEST EDITED HALF-HOUR SERIES
Silicon Valley: “The Uptick”
Brian Merken, ACE

Veep: “Morning After”
Steven Rasch, ACE

Veep: “Mother”
Shawn Paper

BEST EDITED ONE-HOUR SERIES — COMMERCIAL
Better Call Saul: “Fifi”
Skip Macdonald, ACE

Better Call Saul: “Klick”
Skip Macdonald, ACE & Curtis Thurber

Better Call Saul: “Nailed”
Kelley Dixon, ACE and Chris McCaleb

Mr. Robot: “eps2.4m4ster-s1ave.aes”
Philip Harrison

This is Us: “Pilot”
David L. Bertman, ACE

BEST EDITED ONE-HOUR SERIES — NON-COMMERCIAL
The Crown: “Assassins”
Yan Miles, ACE

Game of Thrones: “Battle of the Bastards”
Tim Porter, ACE

Stranger Things: “Chapter One: The Vanishing of Will Byers”
Dean Zimmerman

Stranger Things: “Chapter Seven: The Bathtub”
Kevin D. Ross

Westworld: “The Original”
Stephen Semel, ACE and Marc Jozefowicz

BEST EDITED MINISERIES OR MOTION PICTURE (NON-THEATRICAL)
All the Way
Carol Littleton, ACE

The Night Of: “The Beach”
Jay Cassidy, ACE

The People V. OJ Simpson: American Crime Story: “Marcia, Marcia, Marcia”
Adam Penn, Stewart Schill, ACE and C. Chi-yoon Chung

BEST EDITED NON-SCRIPTED SERIES
Anthony Bourdain: Parts Unknown: “Manila”
Hunter Gross, ACE

Anthony Bourdain: Parts Unknown: “Senegal”
Mustafa Bhagat

Deadliest Catch: “Fire at Sea: Part 2”
Josh Earl, ACE and Alexander Rubinow, ACE

Final ballots will be mailed on January 6, and voting ends on January 17. The Blue Ribbon screenings, where judging for all television categories and the documentary categories takes place, will be on January 15. Projects in those categories are viewed and judged by committees composed of professional editors (all ACE members). All 850-plus ACE members vote during the final balloting of the ACE Eddies, including active members, life members, affiliate members and honorary members.

Main Image: Tilt Photo

Sight, Sound & Story takes on cinematography

By Daniel Rodriguez

Manhattan Edit Workshop’s recent Sight, Sound & Story: Art of Cinematography in New York City featured two one-hour panels: “Thinking In Pictures — Perspectives, Compositions, Lighting and Mood” and “Life Behind the Lens: DPs Talk Careers and Creativity in Film and Television.” The first focused on documentary work and the second on narrative-based storytelling. Both sparked questions and ideas in the head of this DP, including what roles and responsibilities cinematographers play in the storytelling process.

Docs
“Thinking In Pictures — Perspectives, Compositions, Lighting and Mood,” moderated by DP David Leitner, featured fellow cinematographers Wolfgang Held and Kirsten Johnson. Johnson’s documentary Cameraperson has made the Academy Awards Documentary shortlist.

The role of a cameraperson is essential to any film, narrative or documentary, but especially in the documentary world, where much of the action is unplanned or out of one’s control. Johnson remarked that “we all live in a new way of filming and being filmed.” So, while much of their talk reflected on their own careers, the panelists also looked toward the future. Her statement made me think about the current state of filming and how stories are becoming much easier to tell thanks to technology that ranges from high-end digital cinema cameras to the ever-improving video quality of cellphones.

It brought to mind the saying, “the best camera is the one you have with you,” as some of the most stunning documentation of the human condition in the past decade has been captured on phones and lower-end cameras. Today’s ability to capture images is a far cry from the days when Super 8 and 16mm were among the few feasible formats for documentary work — and even then, the technology limited the possibilities, whether through the technical skill required or the unfortunate reality of a film magazine running out and the precious few minutes lost while reloading.

Speaking of older terms like “reloading,” all three on the stage expressed their distaste for the term “shooter.” They emphasized that they weren’t shooting any firearms and that, if anything, the real shooters were the ones pointing guns at them. This had them reflecting on the death of Leonardo Henrichsen, a cameraperson who filmed his own death while staring down a rifle’s barrel as a soldier fired at him during Salvador Allende’s rule in Chile.

Oftentimes camerapersons have to live in the moment, whether in narrative or documentary, judging the conditions they’re in and making decisions that’ll maximize their coverage and approach. To paraphrase Johnson’s brilliant observation, “directors work by anticipating what happens next, while a cameraperson nourishes in the present.” Regardless of filming background, whether documentary or narrative, this statement rings true because time is usually the most pressing factor in the field or on set.

While I do believe that a cameraperson must be somewhat aware of what they are striving to tell or cover, this feeling of nourishing in the present permits one to be flexible with how the given moment affects mood and emotion. I’m going to paraphrase once more — Cartel Land director Matthew Heineman has said, “If the documentary you were looking to shoot is the same one you get at the end, then you weren’t paying attention.” Johnson’s statement only reinforces this idea, because you must be able to fully immerse yourself in a moment in order to truly understand how to capture it.

Possibly the simplest and most effective summary of the cameraperson’s role came from moderator Leitner: “Every shot matters.” While that is a very general statement, it raises many questions about the cameraperson’s role in today’s world. Since we are now living in a predominantly digital age, where truly cinematic images can be captured easily and on cheaper prosumer cameras, our artistic role as cinematographers and camerapersons comes down to the intuition we have as artists to make every shot matter.

With the advent of digital cinematography, excessive coverage and the ability to shoot longer have become the norm; oftentimes this sacrifices quality for the sake of having more to work with. Coming from analog film backgrounds, each person on the panel, especially Leitner, emphasized how the finite length of a film roll demanded the utmost care and attention on every shot.

Wolfgang Held showed this approach most effortlessly as he screened clips from the latest film he worked on as cinematographer, Sophie and the Rising Sun. The film was largely shot handheld, but far from feeling over-covered, each shot feels thought out and effective in adding to the story. The role of a cameraperson is an ever-changing one, especially in our current age, and as the technology becomes more accessible, the emphasis will always be on the artist and their approach.

Narrative
“Life Behind The Lens: DPs Talk Careers and Creativity in Films and Television” was moderated by cinematographer Marcin Kapron and featured Eric Lin, Eric Alan Edwards and Vanja Černjul, ASC. All four cinematographers come from a narrative-based background and they reflected on the moments that inspired their career choices and projects they’ve worked on.

I loved hearing how each panelist began in the industry. They all came from different walks of life and have built their careers in different fields, ranging from television to indie films to major blockbusters. As a young DP, it was very exciting to hear that they each shared a persistent and infinitely curious approach to creating images from early on, mostly originating through stills photography and related techniques.

Each pro screened clips from projects and discussed their approach on set and the technical challenges they faced. The talk eventually looked toward the future and newer storytelling formats, such as high frame rate, HDR and 4K projection. All agreed that no common standard has yet been set for displaying these new formats. Despite this, each panelist sees real potential in them, especially HDR, which Vanja has direct experience with, having shot episodes of Marco Polo for Netflix, which requested an HDR version for delivery.

Having spoken with Vanja directly after the event, and with Dado Valentic, the colorist who collaborated with him on the SDR and HDR versions, I learned that the biggest challenge with HDR is displaying and monitoring it on set in a cost-effective way. Ultimately, each panelist agreed that these are simply tools that provide new methods of storytelling and, as cinematographers, they’re excited for the future.

Summing Up
We currently live in an industry where the tools that were once exclusive to camerapersons and cinematographers are now affordable, compact and available to anyone. Listening to these panelists talk about their experiences and their take on the future was exhilarating and encouraging. Regardless of whether you work on narrative or documentary fare, it ultimately comes down to the artist bringing their unique approach and creative work ethic to make every shot matter.


Daniel Rodriguez is a cinematographer and photographer living in New York City. Check out his work here. He took many of the pictures featured in this article.

Warner/Chappell intros Color TV, Elbroar music catalogs from Germany

For those of you working in film and television with a need for production music, Warner/Chappell Production Music has added to its offerings with the Color TV and Elbroar catalogs. Color TV is German composer Curt Cress’ nearly 14,000-track collection from Curt Cress Publishing and its sister company F.A.M.E. Recordings Publishing. Color TV and the Elbroar catalog, which is also from Germany, are available for licensing now.

Color TV brings to life a wide range of TV production styles with an initial release that includes nine albums: Panoramic Landscapes; Simply Happy, Quirky & Eccentric; Piano Moods; Chase & Surveillance; Secret Service; Actionism; Drama Cuts; and Crime Scene.

Following the initial release, Warner/Chappell Production Music plans to offer two new compilations from the catalog every two weeks. Color TV is available for licensing worldwide, excluding Italy and France.

“Composers have that unique talent and ability to translate what they’re feeling,” explains Warner/Chappell Production Music president Randy Wachtler. “You can hear emotion in different compositions, and it’s always interesting to hear how creators from countries around the world capture it.  Adding to our mix only adds more perspective and more choice for our clients.”

Cress began his musical career in the 1960s, performing in acts such as Klaus Doldinger’s Passport and his own band Snowball, as well as in Falco and Udo Lindenberg’s band. His solo projects involved work with local and international artists including Freddie Mercury, Tina Turner, Rick Springfield, SAGA, Meat Loaf and Scorpions, as well as releasing his own solo material. He made a name for himself as a composer for popular German films and TV series such as SK Kölsch, HeliCops and The Red Mile.

Elbroar, out of Hamburg, Germany, is a collection ranging from epic to minimal, jazz to techno and drama to fun. The catalog serves creatives in the fields of television, film and advertising, with a strong focus on trailers and daytime TV.

The catalog’s first release, “Epic Fairy Tales,” is an album of orchestral arrangements that set the scene for fantastic stories and epic emotions. Elbroar is available for licensing immediately, worldwide.

Team Player: Rules Don’t Apply editor Brian Scofield

By Randi Altman

In the scheme of things, we work in a very small industry where relationships, work ethic and talent matter. Brian Scofield is living proof of that. He is one of a team of editors who worked on Warren Beatty’s recent Rules Don’t Apply.

That team included lead editor Billy Weber, Leslie Jones and Robin Gonsalves. It was veteran editor Weber (Beatty’s Bulworth, 1998) who brought Scofield on board as a second editor.

Weber was Scofield’s mentor while he was in the MFA program at USC. “Not long after I completed graduate school, Billy helped me reconnect with the Malick camp, who I met while working in the camera crew on Tree of Life,” he explains. “I then became an apprentice on To the Wonder, and then an editor on Knight of Cups. When Billy came in as an advisor at the end of Knight of Cups, we reconnected in LA. He had just begun working on Rules Don’t Apply with Warren, and when I finished my work on Knight of Cups, he brought me aboard.”

Scofield recognizes that relationships open doors, but says you have to walk through them and prove you belong in the room all by yourself. “I think people often make the mistake of thinking that networking trumps talent and work ethic, or the other way around, and that just isn’t true.  All three are required to have a career as a film editor — the ability to form lasting relationships, the diligence to work really hard, and having natural instincts that you’re always striving to improve upon.”

Scofield says he will always be grateful to Weber and the example he’s set. “I’m only one of over a dozen people whose careers Billy has helped launch over the years. It’s in large part his generosity and mentorship that inspires me to pay it forward any chance I get.”

Let’s find out more from Scofield about his editing process, what he’s learned over the years, and the importance of collaboration.

You have worked with two Hollywood icons in Terrence Malick and Warren Beatty. I’m assuming you’re not easily intimidated.
It’s been a transformative experience in every way. These two guys, who have been making films for over 40 years, are constantly challenging themselves to try new things… to experiment, to learn. They’re always re-evaluating pretty much everything from the story to the style, and yet these are two guys with such distinct voices that really shine through their work. You know a Malick or Beatty film when you see it. The inexhaustibility of the cinematic art form, I guess, is what I really took away from both of them.

Photo Credit: Francois Duhamel

They are both very different kinds of filmmakers.
You would never think that working on a Terrence Malick film would prepare you to work on a Warren Beatty film. Knight of Cups is a stream-of-consciousness, meditative tome about the meaning of life. Warren’s film is a romantic comedy with a historical drama slant. Aesthetically, they’re very different films, but the process of constantly finding ways to break open the movie all over again, and the mindset that requires, is very similar.

Both Terry and Warren are uncompromising and passionate about making movies the way they want and not bending to conventions, yet at the same time looking for ways to reach people on a very deep level. In this case, both films were also deeply personal for the director. When you work on something like that, it adds another layer of pressure because you want to honor how much of themselves they’re willing to put into their work. But that’s also where I believe the most exciting films come from. That pressure just becomes inspiration.

How early did you get involved on Rules Don’t Apply?
Right after production wrapped. I was finishing up with Terry on the mix stage for Knight of Cups when Billy called. They had an assembly of the film when I joined — everything was in there — and that version was probably about four hours long. Interestingly, some things have changed dramatically since that version and some are remarkably similar.

I was on Rules Don’t Apply for just over a year, but I’ve been back several times since officially finishing. I took a good amount of time off and went back, and since then I’ve popped in and out whenever Warren has needed me. Robin became a true caretaker of the film, staying with Warren through that additional time leading up to the release.

Is that typically how you’ve worked? Coming in after there’s an assembly?
I’ve come in as an additional set of eyes on some, and I’ve been on films during production, sending cuts to the director while they’re in the middle of shooting. This includes giving feedback on pickups they need to grab or things to be wary of performance-wise, those types of things.

Both are thrilling experiences. It’s fun to come in when there has been one specific approach and they’re open to new ideas. You kind of get to shake people out of the one way they’ve been going about the film. When I’m the editor that’s been working on the film since the beginning, that initial discovery period when you see the film take shape for the first time is always thrilling. The relationship you form with both the film and the director is hard to beat. But then, I’m always excited for someone to come in and shake things up, to help me think differently. That’s why you do feedback screenings. That’s why you bring other editors into the room to take a look and to make you think about things from a different angle.

How was it on Rules Don’t Apply?
When I came on, so much of it was working really well from the first assembly, but I did want to strengthen the love story between Frank and Marla and make their attraction more evident early in the film so that it paid off later. I started by going through all of the scenes and looking for little moments where we could build up glances between them or find little raindrops before the storm of that budding relationship.

There were a few storylines going on at the same time as well?
The story takes place over a long period of time — you’ve got Warren Beatty playing Howard Hughes, you’re dealing with a young love story, you’re dealing with an incredible supporting cast, all of whom could be bigger characters or smaller characters. When you come in a little bit later, it’s often your job to help figure out which storylines or themes are going to become the main thrust of the movie.

So there are different definitions of co-editor?
Well, it varies every day. Some days Warren would want to work on a couple of different scenes, so one editor would take one and I would take the other. Sometimes you would have worked on a scene for a long time and somebody else would say, “Let me have a stab at that. I’ve got a different idea.” Sometimes we were all together in one room with one of us driving the Avid and the others offering a different set of eyes — eyes that aren’t staring at the timeline — and they’re looking at it side-by-side with the director, almost as a viewer instead of within the nitty-gritty of making the cut. We would take turns doing that.

You’ve got to check your ego at the door, I suppose? Everybody’s on the same team these days.
There’s no pecking order, and I think Billy Weber is really the one who sets that tone because he’s such a generous and experienced editor and man. There are people out in the industry that might be protective of their work versus letting anybody else touch it, but there’s none of that in any of the editing rooms that I’ve been fortunate enough to work in. Everybody’s respectful of each other.

On this film we had Billy, myself, Leslie Jones and Robin all working at the same time. You’ve got almost three generations of editors in that room, and to be treated as an equal really opens up your mind and your creativity. You feel the freedom to really present big ideas.

How is it collaborating with Warren?
He is such a unique guy. His favorite thing to do is to have a fight — he doesn’t want people who are just going to accept what he says. He wants a fiery debate, which can make people uncomfortable, but I’m okay with it. I actually really enjoyed that, especially when you realize he’s not taking it personally and neither should I. This is about making a movie the best that it can be. He wants people that are going to challenge him and push back.

So it’s part of his creative process?
Yes, it’s all about the discourse. If he has a strong point of view, he wants to argue it to make sure that he really believes it. And if you have a strong point of view, he wants you to be able to tell him why. I would say the fiercest fights led to him being most happy afterwards. At the end of the screaming, he would always say, “That was such a productive conversation. I’m so glad we did that!” He surrounds himself with people he knows he trusts. He knows that’s what he needs to make him as productive and as creative as he can be.

It’s been a long time since Warren directed a film. How did he react to the new technology?
He was thrilled with all of the new capabilities of the technology. This movie was shot on the Alexa, for the most part, and we did a good amount of combining it with archival footage. This is a very modern movie in many ways, but it also has a distinctive throwback vibe. We had to try to marry those things without going overboard.

We resized frames, added a few push-ins, speed ramps, and so on. Ultimately, all of these tools just allowed him to explore the footage even more than he’s used to doing. He really loved taking advantage of new editorial opportunities that couldn’t have been done even 15 years ago, at least not as easily.

How do you organize things within the Avid Media Composer?
Any time I start a new job, I send a Google Doc to the assistant that specifies exactly how I want the project set up. It’s an evolution of things I’ve learned in different editing rooms over time.

For every scene, I have a bin with a frame view. If the bin is the size of my monitor, I should be able to see all clips in that one view without scrolling. Each setup is separated from the others, so I can see very quickly, “Oh, there are four takes of that shot, four takes of that one, three takes of that one.” I have the assistant prepare three sequences: one that’s a pure string-out of all of the clips, so I can scrub through everything that’s there in one sequence. I do a string-out “clean,” which means taking out all the slates and all the director’s talking, so I can be impartial and just look at the footage. Then I usually have one more sequence that’s just the circle takes the director chose on set. Then I go through and make a select reel based off of everything that I watch. That’s the basic bin setup.

For films that have multiple editors, organization is really important because somebody else has to be able to understand how your work is organized. You have to be able to find things that you did a year ago.

Any special tricks, like speed ramps, sound effects, transitions? I’m imagining that changes per project?
Yeah, it’s pretty unique to the project. There are a lot of editors who have specific effects that they go back to over and over again in their own bin. I’ve got a few of those, but I almost always end up tailoring them and sometimes just starting from scratch.  I go on the hunt for the right effect when I need it.

I’ve gotten pretty adept at tailoring the built-in effects to my needs as they come up, but people who use those effects all the time are working on more crazy action or stylized films because they’ve got a lot more demand for those than when you’re working on character-driven content.

Do you typically work with a template from a colorist, or do you do any temp color corrections yourself?
Most of the films have a look that the DP has already applied, and I do tweaking as needed. If we come up with a creative reason for color correction, I’ll do a sketch. I do a lot of work with sound, but with color, it just depends. If it needs to be changed in order to understand what the idea is or if we’re screening it for somebody that we don’t trust to be able to see what it is without color correction, then of course we’re going to go in and we’re going to tweak it. I’ve worked on a film where all the exteriors were really magenta, so we came up with our kind of default fix to be applied to all of those shots.

Can you elaborate on the sound part?
I cut as much for sound as I do for picture. I think people grossly underestimate the influence that sound has on how you watch a movie. I’m not a sound designer, but I try my best to provide a sketch for when we go into that next phase so the sound designer has a pretty clear idea of what we’re going for. Then, of course, they use their creativity to expand and do their own thing.

How do you work with your assistant editors? Do you encourage them to edit, or are they strictly technical?
It depends on the project and its timeframe. In the beginning, the priority is on getting everything set up. Then the priority is on helping me build a first sound pass after we’ve gotten an assembly. They help bring in effects and smooth over things I’ve sketched out. Sometimes they’re just gathering effects for me and sometimes they’re cutting them in themselves. Sometimes we toss them back and forth — I do a rough pass and ask them to mix it, clean up the levels, and add a couple of accents here and there. Once we’re through with that, we have at least a ground floor of sound to cut with.

When given the opportunity, I love to let my assistants get creative. I let them take a stab at scenes, or at least have them be present in the room to give feedback. When the director isn’t present, I rely a lot on my assistant just to check in and say, “Hey, is this crazy?” or try to engage them as much as I can in that creative process. It all just depends on the demands of the project and the experience level of the assistant.

Is there anything you would like to add?
Film is a collaborative art form, and in order to help a director do their best work, you need to be their friend, their antagonist, their therapist, their partner. Whatever it takes is what your job is. I was so fortunate to learn an enormous amount from Warren, but also from my fellow editors. I hope everybody has as much fun watching this crazy little movie as we did making it.

Finally, I’d just love to say that working with Warren will undoubtedly be one of the most cherished experiences of my life. Reputations be damned, he’s a kind, brilliant and uncompromising artist who it was endlessly inspiring to spend so much time with.  I’ll forever be grateful I had the opportunity to both work for him and to call him a friend.

Main Image: Robin Gonsalves, Warren Beatty and Brian Scofield.

Magix Movie Edit Pro Plus for when you want simple edits

By Brady Betzel

In the middle of 2016, Sony sold its nonlinear editing software Vegas to Magix, a 20-year-old German company. Since I am always interested in the latest gear and software, when Magix contacted me about reviewing its NLE Movie Edit Pro Plus, I said, “Sure.”

I was sent a copy of Movie Edit Pro Plus, and as I type this intro I am downloading around 5GB of additional content and templates to use inside the software. Immediately upon opening Movie Edit Pro Plus, I get the feeling that this isn’t a professional NLE, and to be honest, that isn’t a problem. Sometimes I just want to use something that is plug-and-play, and maybe that is how Movie Edit Pro Plus can work for you. Think of it as a Windows version of iMovie for when you don’t want to jump into FCPX to make a slideshow or family video. It even comes with prebuilt templates.

There are three versions of Movie Edit Pro: Standard, Plus and Pro. I am going over Plus, which retails for $99. The Pro version costs $129.99 while the Standard is around $49, but I was kind of confused by their site, which is running a deal where you get all the Pro benefits when you buy the $49 Standard edition. The Plus version adds features that include up to 99 tracks for video and audio, proxy video editing, shot matching, color correcting and some more templates and effects.

Immediately, I wanted to check out the keyboard shortcuts, since I like to edit with as little clicking as possible. I found them in the help manual, but I wouldn’t mind an onscreen keyboard layout/reassignment view (me being a picky editor, I know). Then I loaded up the demo project I was able to download and zoomed into the sequence using the scroll wheel on my mouse and the bar on the bottom of the timeline. It was a little strange, as it zoomed into an arbitrary part of the sequence rather than where my edit indicator was on the timeline. To accurately zoom in, you need to use the keyboard shortcuts Ctrl + Up Arrow or Down Arrow. Then I noticed you can place anything anywhere. Pretty crazy — you can put audio on video layers and put effects like timecode displays anywhere you want… kind of like FCPX. It’s liberating, I guess, but my OCD can’t really handle that yet. I’m really trying to get used to FCPX, I promise.

Sometimes I can’t help myself when I review products, and I forget to start at the basics, like where to import your footage, audio or stills. So to back up a little: in the default view of Movie Edit Pro you get a few tabs on the top right — Import, Fades, Title and Effects. Each tab has submenus; under the Fades tab you can see different types of transitions. It’s a little clunky, but they get the job done. There are some cool preset transitions, like card turns or slides, which are easy to apply by dragging them over the edit point between two clips. Once you apply one you can change its duration, although you can only enter seconds or fractions of seconds, not frames like normal NLEs use. By now you are probably getting the point that this is more of a beginner NLE, or one for someone who wants to simply edit something without much technical thought.

On the plus side, you do get a good amount of presets, including title animations, subtitles and even 3D title animations. One of my favorite plug-ins that comes with Magix Movie Edit Pro Plus is the Travel Route tool, located in the Magix Tools dropdown under the Import tab. The Travel Route tool lets you set different stops along a trip, assign a mode of travel like an airplane or car, and then animate your route. Animating a vacation along a map isn’t the easiest thing to accomplish, even for an After Effects wiz, so its inclusion is an awesome feature. You can preview your animation and, if you like it, create an export of it to use in your edit.

There are a lot of tools that come with Movie Edit Pro Plus that I haven’t mentioned yet: Beat Based editing, which lets you automate an edit along the beat of a song; recording your actual edit, so you can easily create tutorials or timelapses of your session; chroma key effects to key out that greenscreen you shot on; 360 video creation; and much more. In the end, you can share your content using the Export Wizard to sites like YouTube, or even to your phone.

In the end, Magix Movie Edit Pro Plus is not a professional nonlinear editor. If you are looking for a Premiere Pro or FCPX replacement this isn’t it. This isn’t bad though; sometimes I need an app that does home movie type editing fast and easy with templates — this is where Movie Edit Pro Plus might fit in for you.

Check out their website for more features, including image stabilization powered by proDAD’s Mercalli V2 technology.

Creating and tracking roaches for Hulu’s 11.22.63

By Randi Altman

Looking for something fun and compelling to watch while your broadcast shows are on winter break? You might want to try Hulu’s original eight-part miniseries 11.22.63, which the streaming channel released last February.

It comes with a pretty impressive pedigree — it’s based on a Stephen King novel, it’s executive produced by J.J. Abrams, it stars Oscar-nominee James Franco (127 Hours) and it’s about JFK’s assassination and includes time travel. C’mon!

The plot involves Franco’s character traveling back to 1960 in an effort to stop JFK’s assassination, but just as he makes headway, he feels the past pushing back in some dangerous, and sometimes gross, ways.

Bruce Branit

In the series pilot, Franco’s character, Jack Epping, is being chased by Kennedy’s security after he tries to sneak into a campaign rally. He ducks into a storage room to hide, but he’s already ticked off the past, which slowly serves him up a room filled with cockroaches that swarm him. The sequence is a slow build, with roaches crawling out, covering the floor and then crawling up him.

I’m not sure if Franco has a no-roach clause in his contract (I would), but in order to have control over these pests, it was best to create them digitally. This is where Bruce Branit, owner of BranitFX in Kansas City, Missouri came in. Yes, you read that right, Kansas City, and his resume is impressive. He is a frequent collaborator with Jay Worth, Bad Robot’s VFX supervisor.

So for this particular scene, BranitFX had one or two reference shots, which they used to create a roach brush in Photoshop. Once the exact look was determined, including the number of attacking roaches, they animated them in 3D and composited. They then used 2D and 3D tracking tools to track Franco while the cockroaches swarmed all over him.

Let’s find out more from Bruce Branit.

How early did you get involved in that episode? How much input did you have in how it would play out?
For this show, there wasn’t a lot of lead time. I came on after shooting was done and there was a rough edit. I don’t think the edit changed a lot after we started.

What did the client want from the scene, and how did you go about accomplishing that?
VFX supervisor Jay Worth and I have worked together on a lot of shows. We’d done some roaches for an episode of Almost Human, and also I think for Fringe, so we had some similar assets and background with talking “roach.” The general description was tons of roaches crawling on James Franco.

Did you do previs?
Not really. I rendered about 10 angles of the roach we had previously worked with and made Adobe Photoshop brushes out of each frame. I used that to paint up a still of each shot to establish a baseline for size, population and general direction of the roaches in each of the 25 or so shots in the sequence.

Did you have to play with the movements a lot, or did it all just come together?
We developed a couple base roach walks and behaviors and then populated each scene with instances of that. This changed depending on whether we needed them crossing the floor, hanging on a light fixture or climbing on Franco’s suit. The roach we had used in the past was similar to what the producers on 11.22.63 had in mind. We made a few minor modifications with texture and modeling. Some of this affected the rig we’d built so a lot of the animations had to be rebuilt.

Can you talk about your process/workflow?
This sequence was shot in anamorphic and featured a constantly flashing light on the set, going from dark emergency-red lighting to brighter fluorescent light. So I generated unsqueezed, lens-distortion-removed and light-mitigated interim plates to pull all of our 2D and 3D tracking off of. The tracking was broken into 2D tracking, 3D tracking and hand-done 3D tracking involving the roaches on Franco’s body as he turns and swats at them in a panic. The production had taped large “Xs” on his jacket to help with this roto-tracking, but those Xs had to be painted out for many shots prior to the roaches reaching Franco.

The shots were tracked in Fusion Studio for 2D and SynthEyes for 3D. A few shots were also tracked in PFTrack.

The 3D roach assets were animated and rendered in NewTek LightWave. Passes for the red-light and white-light conditions were rendered, as well as ambient shadow and specular passes. Although we were now using tracking plates with the 2:1 anamorphic stretch removed, a special camera was created in LightWave that was actually double the anamorphic squeeze to duplicate the vertical bokeh and DOF of an anamorphic lens. The final composite was completed in Blackmagic Fusion Studio using the original anamorphic plates.
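To illustrate the plate math a pipeline like that involves, here is a minimal sketch of a 2:1 desqueeze/resqueeze round trip. The resolutions are hypothetical examples, not the production’s actual numbers:

```python
# Hypothetical plate-size arithmetic for a 2:1 anamorphic workflow:
# tracking happens on desqueezed interim plates, while the final
# composite goes back over the original squeezed frames.
SQUEEZE = 2.0  # 2:1 anamorphic squeeze factor

def desqueeze(width, height, squeeze=SQUEEZE):
    """Stretch the horizontal axis to make an undistorted tracking plate."""
    return int(width * squeeze), height

def resqueeze(width, height, squeeze=SQUEEZE):
    """Map a desqueezed plate back to the original anamorphic raster."""
    return int(width / squeeze), height

plate = desqueeze(2048, 1716)             # e.g. a 2K anamorphic scan
assert resqueeze(*plate) == (2048, 1716)  # round trip lands back on the original
```

The point of tracking on the desqueezed plate is that features move through an undistorted image, which is what 2D and 3D trackers expect; the comp then maps everything back onto the squeezed original.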

What was the biggest challenge you faced working on this scene?
Understanding the anamorphic workflow was a new challenge. Luckily, I had just completed a short project of my own called Bully Mech that was shot with Lomo anamorphic lenses. So I had just recently developed some familiarity and techniques to deal with the unusual lens attributes of those lenses. Let’s just say they have a lot of character. I talked with a lot of cinematographer friends to try to understand how the lenses behaved and why they stretched the out-of-focus element vertically while the image was actually stretched the other way.

What are you working on now?
I’ve wrapped up a small amount of work on Westworld and a handful of shots on Legends of Tomorrow. I’ve been directing some television commercials the last few months and just signed a development deal on the Bully Mech project I mentioned earlier.

We are making a sizzle reel of the short that expands the scope of the larger world and working with concept designers and a writer to flesh out a feature film pitch. We should be going out with the project early next year.

TrumpLand

TrumpLand gets quick turnaround via Technicolor Postworks

Michael Moore in TrumpLand is a 73-minute film that documents a one-man show performed by Moore over two nights in October to a mostly Republican crowd at a theater in Ohio. It made its premiere just 11 days later at New York’s IFC Center.

The very short timeframe between live show and theatrical debut included a brisk five days at Technicolor PostWorks New York, where sound and picture were finalized. [Editor’s note: The following isn’t any sort of political statement. It’s just a story about a very quick post turnaround and the workflow involved. Enjoy!]

Michael Kurihara was supervising sound editor and re-recording mixer on the project. He was provided with the live feeds from more than a dozen microphones used to record the event. “Michael had a hand-held mic and a podium mic, and there were boom mics throughout the crowd,” Kurihara recalls. “They set it up like they were recording an orchestra with mics everywhere. I was able to use those boom mics and some on stage to push sound into the surrounds to really give you the feeling that you are sitting in the theater.”

Kurihara’s main objectives, naturally, were to ensure that the dialogue was clear and that the soundtrack, which included elements from both nights, was consistent, but he also worked to capture the flavor of the event. He notes, for example, that Moore wanted to preserve the way that he used his microphone to produce comic effects. “He did a funny bit about the Clinton Foundation, and used the mic the way stand-up comics do, holding it closer or farther away to underscore the joke,” Kurihara says. “By holding the mic at different angles, he makes the sound warmer or punchier.”

Kurihara adds that the mix sessions did not follow a conventional, linear path as creative editorial was still ongoing. “That made it a particularly exciting project,” he notes. “We were never just mixing. Editorial changes continued to arrive right up to the point of print.”

Focusing on Picture
Colorist Allie Ames handled the film’s picture finishing. Similar to Kurihara, her task was to cement visual consistency while maintaining the immediacy of the live event. She worked from a conformed version of the film, supplied by the editing team.

According to Ames, “It already had a beautiful look from the way it was staged and shot, therefore, my goal was to embrace and enhance the intimacy of the location and create a consistent look that would draw the film audience into the world of the theatrical audience without distracting from Michael’s stage performance.”

Moore and his producers attended most of the sound mixing and picture grading sessions. “It was an unusual and exciting process,” says Ames. “Usually, you have weeks to finish a film, but in this case we had to get it out quickly. It was an honor to contribute to this project.”

Technicolor PostWorks has provided post services for several of Moore’s documentaries, including Where to Invade Next, which debuted earlier this year. For TrumpLand the facility created deliverables for the premiere at IFC, and subsequent theatrical and Netflix releases.

Says Moore, “Simply put, there would have been no TrumpLand movie without Technicolor PostWorks. They have a dedicated team of artists who are passionate about filmmaking, and especially about documentaries. In this instance, they went above and beyond what was asked of them to ensure we were ready in record time for our premiere — and they did so without compromising quality or creativity. I did my previous film with them a year ago and in just 14 months they were already using technology so new it made our 2015 experience feel so… 2015.”

Qwire’s tool for managing scoring, music licensing upped to v.2.0

Qwire, a maker of cloud-based tools for managing scoring and licensing music to picture, has launched QwireMusic 2.0, which expands the collaboration, licensing and cue sheet capabilities of QwireMusic. The tool also features a new and intuitive user interface as well as support for the Windows OS. User feedback played a role in many of the new updates, including marker import of scenes from Avid for post, Excel export functions for all forms and reports and expanded file sharing options.

QwireMusic is a suite of integrated modules that consolidates and streamlines a wide range of tasks and interactions for pros involved with music and picture across all stages of post, as well as music clearance and administration. QwireMusic was created to help facilitate collaboration among picture editors and post producers, music supervisors and clearance, composers, music editors and production studios.

Here are some highlights of the new version:
Presentations — Presentations allow music cues and songs to be shared between music providers (supervisors and composers) and their clients (picture editors, studio music departments, directors and producers). With Presentations, selected music is synced to video, where viewers can independently adjust the balance between music and dialogue, adding comments on each track. This tool centralizes the music sharing and review process, saving time and eliminating the need for the confusing array of QuickTimes, Web links, emails and unsecured FTP sites that sometimes accompany post production.

Real-time licensing status — QwireMusic 2.0 allows music supervisors to easily audition music, generate request letters, and share potential songs with anyone who needs to review them. When the music supervisor receives a quote approval, the picture editor and music editor are notified, and the studio music budget is updated instantly and seamlessly. In addition, problem songs can be instantly flagged. As with the original version of QwireMusic, request letters can be generated and emailed in one step with project-specific letterhead and signatures.

Electronic Cue Sheets — QwireMusic’s “visual cue sheet” allows users to review all of the information in a cue sheet displayed alongside the final picture lock. The cue sheet is automatically populated from data already entered in QwireMusic by the composer, music supervisor and music editor. Any errors or missing information are flagged. When the review is complete, a single button submits the cue sheet electronically to ASCAP and BMI.

QwireMusic has been used by music supervisors, composers, picture editors and music editors on over 40 productions in 2016, including Animals (HBO); Casual (Hulu); Fargo (FX); Guilt (Freeform); Harley and the Davidsons (Discovery); How to Get Away With Murder (ABC); Pitch (Fox); Shameless (Showtime); Teen Wolf (MTV); This Is Us (NBC); and Z: The Beginning of Everything (Amazon).

“Having everyone in the know on every cue ever put in a show saves a huge amount of time,” says Patrick Ward, a post producer for the shows Parenthood, The West Wing and Pure Genius. “With QwireMusic I spend about a tenth of the time that I used to disseminating cue information to different places and entities.”

Behind the Title: Arnold Worldwide’s Jon Drawbaugh

NAME: Jon Drawbaugh

COMPANY: Arnold Worldwide

CAN YOU DESCRIBE YOUR COMPANY?
Arnold is a global creative agency that sits within Havas Creative Group and has offices in Boston (HQ), London, Madrid, Milan, New York, Prague, São Paulo, Shanghai, Sydney and Toronto.

WHAT’S YOUR JOB TITLE?
EVP, Director of Integrated Production

WHAT DOES THAT ENTAIL?
I like to think of the job as sort of production curator. I am the steward of all the wonderful things that we make as an agency — from sites to apps to video content to still imagery to live brand experiences. I produce by supporting creative solutions and executions. We’re in a period of disruption in the agency world, and I find the opportunities exciting. There’s always something new to learn and a “never been done before” to figure out.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I’m lucky that it’s a very roll-up-your-sleeves, dig-into-the-work kind of role. Unlike other leadership roles that are administrative or directorial in nature, I’m very hands-on while still being strategic and holistic. I’ll go from managing staffing allocations into content strategy meetings and then be in an edit bay reviewing creative decks and making ballpark estimates. I also spend a fair amount of time “producing” for the agency.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Collaborating with my team, creative teams, clients and partners.

WHAT’S YOUR LEAST FAVORITE?
Number crunching.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Late afternoon. If all my meetings are done for the day, it’s a great time to grab a coffee and reflect on the solutions of the day.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I wish I could be an amazing chef with popular, hip restaurants. In reality, I’d likely be working for a production company producing or directing content.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
To be honest, I stumbled into advertising. I didn’t know anything about it until I moved to New York City. I landed a temp job at Messner Vetere Berger McNamee Schmetter as a receptionist. Advertising seemed so glamorous, what with the producers jetting off to foreign countries and working with famous feature directors. It sounded much more fun than what I had been doing, which was making copies in the basement of a law firm.

From there I worked in the creative department and dabbled in copywriting. I wanted to get to making TV spots quickly, so I figured taking the producer track would get me there faster. Plus, I was producing theater projects on the side and discovered I could actually get paid for producing if I worked at an ad agency.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’m new to Arnold, so I don’t have my fingerprints on any projects just yet, but I’m a big fan of the recent work like Jack Daniel’s Global Barrel Hunt and their Our Town film (pictured). I also love the Hershey’s My Dad spot and Reese’s #AllTreesAreBeautiful social campaign.

Prior to Arnold, I’m really proud of the Qualcomm Invisible Museum app and Fabric Content projects I worked on out of DDB SF.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
That’s a tough one. I’m so proud of a lot of the work I’ve made over the years. For example, the massive Acura TLX integrated launch we did at Mullen LA, the documentary film I made with Lucy Walker, Make Haste Slowly: The Kikkoman Creed, or the viral hit Nanerpus, from before there were viral hits.

But I’d say the animated short Smutley for AIDES (the French association tackling HIV and AIDS) I produced at Goodby, Silverstein and Partners is one of my proudest. A chance to use our ad skills for good, and how many times in a career can you say you made a cartoon about a cat having sex with all different kinds of animals to Joan Jett’s “Bad Reputation”?

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My iPhone, my vintage HiFi, and my camera. Running water and heat are pretty cool, too.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram, Facebook, Twitter, SnapChat, Vine, Houseparty, Tumblr, Periscope, LinkedIn, Pinterest and others.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? 
I love music. All kinds. But generally I don’t have a lot of time at the office to plug in my headphones. When I do, I generally use Spotify or Apple Music to listen to the Indie genre.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I like to listen to LPs on my vintage HiFi with my family. It’s our important family together time. We like to go hunting for vinyl together on weekends. Record Store Day is like a second Christmas for us.

Splenda Naturals gets the stop-motion treatment in new spot

Production company 1stAveMachine worked with J. Walter Thompson Toronto and Splenda to create an integrated marketing campaign that promotes the Stevia-based, calorie-free sweetener Splenda Naturals.

Coffee, a fully stop-motion animated spot, features an office scene where a gruff boss (a coffee mug) is laying off a packet of sugar, telling him that “sweet ain’t enough anymore.” The sugar packet figures out it’s that new “Splenda Naturals gal” who is replacing him. The boss explains that not only is she sweet, she’s healthier than sugar. The piece ends with a box of the product and the tag “Hello New Splenda Naturals Sweetener.”

Production companies Tronco and 1stAveMachine worked together to provide production and post production on the piece. The directors of the spot were Becho Lo Bianco and Mariano Bergara, and the production director was Anuk Torre Obeid. 1stAveMachine has represented Tronco and Mab and Becho exclusively worldwide for the last seven years, each year doing more work in the North American and international market.

“Mab, Becho and Anuk are directors and storytellers first and foremost. They own a stop-motion studio in Buenos Aires and have been experimenting in and perfecting this craft for over a decade, but perfect craftsmanship is only a tool to tell the story. We worked in a very collaborative way with the agency and were lucky enough to be able to be a part of the films from the ground up. We designed and built every character as well as the set,” reports 1stAveMachine executive producer/partner Sam Penfield. “As we work with agencies from around the world, many times remotely, we have built many tools in order to collaborate from a distance. Many in our business have not worked in stop-motion, and it has some peculiarities in regard to process. The first step when we begin a job is to educate anyone on the team who is not familiar with stop motion — how to exploit its natural charm and what limitations one should be aware of. Once there is an overall understanding of process, we build and previs in CG in order to work on pacing, camera and, most importantly, acting. In stop-motion, even inanimate objects ‘act.’

“For each frame, we build a visual hierarchy so that the viewer follows the story easily and then we fill each frame with interesting details that make for a richer experience on each viewing thereafter,” he continues. “In the case of Splenda, the pre-production was done remotely and the agency/client attended the shoot. Having a great pre-production process meant the shoot went smoothly and we had plenty of time to enjoy being in Buenos Aires.”

The editor was Nicolas Rivas and Alejandro Armaleo provided the color grade. The sound mix was via Pirate Toronto.

Review: DJI’s Phantom 4 drone

By Brady Betzel

I’ve been trying to get my hands on a professional drone to review for a few years now. My wife even got me a drone from a local store that was a ton of fun to play with but was hard to master.  For years, I’ve been working on television shows that use drone footage and capture incredible imagery, but it always seemed out of reach for me as an editor. Finally, after much persistence (or pestering, depending on who you ask), DJI agreed to send me the Phantom 4 to test out, and boy is it awesome!

By now you’ve probably made your way through the ubiquitous reviews, including the endless supply of YouTube reviews, but on the off chance you are reading this without much prior drone knowledge and you work in production or post production, I have some ideas for you.

When reading this review, think about how you could take a drone, run outside and maybe grab some b-roll for something you are working on. If you create opening titles or sizzle reels, you could grab some great aerial shots or fast-paced shots to use as transitions. The possibilities are really endless, as long as you get your video picture settings dialed in.

Before I started as an online editor (which, for those who don’t know, focuses on the technical side of editing — color correction, grading, transcoding, outputting, exporting, anything that ends in “-porting” or “-linking,” basically), I worked my way through being a post coordinator and post production supervisor, all the way to offline editor. One thing I noticed in many of the non-union live-to-tape shows (like late-night comedy or talk shows) is that the editor has a lot of freedom to be creative and can push the envelope a little.

Maybe the editor needs some b-roll for an edit that isn’t in the system, so as the post supervisor you might run out and shoot it yourself. Why not with a drone? If you need a quick aerial of a house from directly above, you might be able to get away with footage from your own drone, saving the project money while showing some talent that may get you more jobs in the future!

I really love the idea of people acquiring as much knowledge in as many different job positions as possible. Whether you are in craft services or an executive producer, if you can do things like operate a camera, hold a boom mic or fly a drone, you will probably make a lasting impression and be known as someone who is hungry to work and to create a great end product, regardless of your position.

Fly Legally!
Not to be a total wet blanket and put a huge wrench in your drone flying, but there are some laws that recently have been passed (more like clarified) to standardize drone use between hobbyists and commercial fliers (basically someone who wants to make money from their footage). You should definitely check out the Federal Aviation Administration’s Getting Started page for more info.

If you are flying your drone for fun, and as long as it has the weight and footprint of the DJI Phantom 4 (it weighs about 3lbs and measures about 14 inches diagonally without propellers, which can add a couple of inches), there is minimal work you need to do. However, if you are planning on making money from your drone footage, there are many steps you must take, including taking an official test. There is a lot you need to know that is beyond the scope of this review, so definitely check out the FAA link above for more.

Easy to Use
Since I hadn’t flown a professional drone before I had nothing to compare it to, but I can tell you that I picked up the Phantom 4 and was flying it within five minutes. It really is that easy to get up and running.

Step 1, charge your remote and battery; Step 2, plug in your phone or tablet via USB to the remote; Step 3, attach propellers; Step 4, fly! You should probably boot up your Phantom before you go outside to check to make sure it is functional, and to update your firmware. As a side note, I’m not sure if I was up and running so quickly because the Phantom 4 I was loaned for review had been charged and used before, or if it was really that easy.

For this review, I really wanted to see how easy it was to get shots like wide sweeping pans and tilts or tracking shots, and it was relatively easy. Obviously, you will need to practice your camera work with the Phantom 4 to get nice shots that aren’t boring and have substance, but it’s pretty simple. I brought the Phantom 4 to an open field where I had tons and tons of space. I turned it on by pressing the power button once and then holding it down until it powered up. I had forgotten to download the DJI Go app to my iPhone 6, so after downloading it I connected a USB-to-Lightning cable from the phone to the controller. While the iPhone 6 worked great, you have minimal screen real estate with so many controls available, so I would suggest using an iPad if you can, or an iPhone 6 Plus or 7 Plus. I tried using an iPad mini but had trouble getting it and the Phantom 4 to connect, so I stuck with the iPhone.

Once my propellers were spinning, I flew it straight up into the sky. I felt like a little kid with my first remote-control car, except that the handling and precision the Phantom 4 offers are exceptional. You can even take your hands off the joysticks and the Phantom 4 will hover. I noticed that once I got the Phantom 4 high in the air, I could hear it battle the winds. It really stuck to its position in the air even with some decent-strength gusts.

Collision Avoidance
When I took the Phantom 4 out for a second time, I wanted to test out its upgraded collision avoidance system. I also wanted to test out my camera moves. The collision avoidance was awesome! Not only does it sense the ground beneath it, but objects in front of it. I started flying toward a basketball hoop and it caught it in its sights and maneuvered to the right. Then, with just one prior flight, I noticed I was really getting the hang of long shots while tilting and panning the camera — a real testament to how easy it is to control.

Keep in mind that the DJI Go app has a built-in flight simulator to help you get your moves and techniques down before you go outside. Unfortunately, you have to be connected to your drone while using the flight simulator, but still it’s pretty handy for practicing — something you should definitely use before you fly, even if your pride is telling you not to.

Tech Specs
Beyond my pure joy at flying the Phantom 4, there are some fancy tech specs you should know about. For my money, the DJI Phantom 4 really shows its worth in its camera: a 4K-capable 1/2.3-inch CMOS image sensor, an ISO range of 100-3,200 for video (100-1,600 for photos) and a shutter speed between eight seconds and 1/8000 of a second.

There are many different recording modes, including 4096×2160 (true 4K resolution) at 24/25 progressive frames per second, 3840×2160 (UHD) at 24/25/30p, 2704×1520 (2.7K) at 24/25/30p, 1920×1080 (HD) at 24/25/30/48/50/60/120p and 1280×720, for some reason, at 24/25/30/48/50/60p. All these resolutions are recorded at a max bit rate of 60Mbps, which is decent, but really should be higher in my opinion (probably more in the 100Mbps range).
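To put that 60Mbps ceiling in perspective, here is a rough back-of-the-envelope sketch of the bit budget per pixel at a few of those modes. This is purely illustrative; real codecs compress between frames, so it isn’t a quality metric:

```python
# Rough back-of-the-envelope math: how much data does a 60Mbps cap
# leave per pixel, per frame, at each recording mode? Illustrative
# only; inter-frame compression means this is not a quality metric.
BITRATE = 60_000_000  # Phantom 4 max video bit rate, bits per second

def bits_per_pixel(width, height, fps, bitrate=BITRATE):
    """Average bits available per pixel per frame at a given mode."""
    return bitrate / (width * height * fps)

modes = [
    ("4K DCI", 4096, 2160, 24),
    ("UHD",    3840, 2160, 30),
    ("2.7K",   2704, 1520, 30),
    ("HD",     1920, 1080, 60),
]
for name, w, h, fps in modes:
    print(f"{name}: {bits_per_pixel(w, h, fps):.3f} bits/pixel/frame")
```

Running the numbers this way shows the 4K modes get noticeably fewer bits per pixel than HD does at the same cap, which is one way to see why a roughly 100Mbps ceiling would make sense for 4K.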

In terms of image quality, the Phantom 4 is amazing for being a flying ship that captures video. However, it isn’t going to exactly match cameras like the Sony a7S II, Panasonic GH4 or Blackmagic Cinema Cameras. The Phantom 4 definitely rivals the GoPro Hero 5 Black in video quality, or at least gives it a good run for its money. The only problem is that the camera isn’t removable from the gimbal on the Phantom 4. I would really like a removable camera, much like the new GoPro Karma drone with its connection to the Karma Gimbal.

So after flying the Phantom 4 a few times, I began to realize how finicky and important the picture and video profile settings are. The first time I recorded video I simply hit record. I was in Vivid mode, presumably at the baseline Saturation, Sharpening and Contrast of 0,0,0. It looked great at first glance, and anyone who just wants to pick up the Phantom 4 and shoot should probably leave it there, or maybe knock the sharpness down to -1. If you plan on color correcting later or adding a creative LUT on top of your footage, then you are going to want a flatter, less saturated image.

I thought the D-Log setting would be the way to go, as that should give you the flattest image in terms of saturation and exposure to pull the most life out of your image. Unfortunately, I found out that is not the case. I tried many variations of Saturation, Sharpness and Contrast from 0,0,0 to -3,-3,-3 and wasn’t really happy with any of them. After running through the usable color profiles (I’m omitting black and white and any other filters like that because you should really just go ahead and apply those looks while color correcting or editing since all NLEs have an easy way to add them), I found that D-Cinelike and None were the profiles I should really stay in, and I started to like Sharpness: -1, Contrast -2, and Saturation -2.

Before I go on about D-Cinelike and None, I think anyone buying a drone should consider ND (neutral density) filters. When shooting outdoors you will get a lot of contrasting light values, such as dark shadows and blown-out highlights. Rather than picking one exposure extreme, you can knock the exposure down externally with an ND filter, which lets you keep your shutter speed and ISO at more appropriate levels.

Without ND filters, you are going to have to ramp up the shutter speed on your Phantom 4 when filming at a low ISO, such as 100, to properly expose your image, which makes your footage look a little choppier and less cinematic (I hate using the word cinematic to describe this, but essentially cinematic = motion blur in this instance).
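If you want to see the arithmetic behind that trade-off, here is a quick sketch of the common “180-degree shutter” rule of thumb and how many stops of ND it takes to hold it. The metered shutter value below is made up for illustration, not a DJI measurement:

```python
import math

# Sketch of the 180-degree shutter rule of thumb and the ND math
# behind it. The 1/2000s metered value is a hypothetical example.

def ideal_shutter(fps):
    """180-degree rule: shutter time = 1 / (2 x frame rate)."""
    return 1.0 / (2 * fps)

def nd_stops_needed(metered_shutter, target_shutter):
    """Stops of ND required to move from the fast shutter the meter
    wants (at fixed ISO and aperture) to the slower target shutter."""
    return math.log2(target_shutter / metered_shutter)

target = ideal_shutter(24)                 # 1/48s for 24fps footage
stops = nd_stops_needed(1 / 2000, target)  # bright-daylight meter reading
print(f"target shutter: 1/{round(1 / target)}s, ND needed: {stops:.1f} stops")
```

In other words, if bright daylight meters at 1/2000s at ISO 100, holding a 1/48s shutter for 24fps footage takes a bit over five stops of ND, which is why filter packs come in several strengths.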

If this sounds interesting to you, you should Google shutter speed techniques and rules, but be careful. It is a deep rabbit hole. From my simple research, I found ND filters ranging anywhere from $20 to $99 or more depending on quality and where you buy them. Polar Pro looks to make some sweet ones, including the Vivid Collection in their Cinema Series of polarized ND filters at $99 for a three-pack — another rabbit hole, be careful not to get G.A.S., Gear Acquisition Syndrome.

Moving on… D-Cinelike and None are flat color profile shooting modes that allow for decent color grading in post production but with less midtone muddiness like the D-Log seemed to produce for me. D-Cinelike seemed to warm up the shot a little with more orange and yellow tints and possibly less shadow detail. In None, I felt like I got the flattest color profile possible, which allowed for the best color correction and grading scenario with the Phantom 4 footage. Don’t forget to dial in your custom picture profile settings. Personally, I liked the picture best when I knocked Sharpness down to -1 or -2. Contrast and Saturation could also be knocked down a little, but this is something you should test when you buy a Phantom 4, since it is definitely a personal taste.

If you go on YouTube and search Phantom 4 color settings you will find a lot of videos. You should probably sort by upload date and watch the more recent videos that might take into account firmware updates. I really liked watching Bill Nichol’s YouTube Channel BillNicholsTV. He has a bunch of great and practical reviews.

You should still try out the Phantom 4’s D-Log mode. Hopefully, it works for you better than it did for me. If you use Blackmagic Resolve, you can check out DJI’s D-Log to sRGB LUT instructions and find the LUT under the software downloads here.

While I didn’t want to get too deep into the technical side of the Phantom 4, I did fall down the picture profile settings abyss, and I still want to highlight some automated flight modes the Phantom 4 excels at. Some of the new features that separate the Phantom 4 from previous Phantom models include Active Track, TapFly, the Obstacle Sensing System, Sport mode, easier-to-use push-and-release propellers, up to 28 minutes of battery life (although I only got 20 to 22 minutes, with the Phantom 4 automatically returning home when the battery ran low), an improved camera with less chromatic aberration, and much more.

New Features That Editors Will Like
I now want to touch on the upgraded features that would get me, as an editor, interested in the Phantom 4. Active Track is an amazing feature that can track objects specified through the DJI Go app. You simply click the object or person you want to track and bam! The Phantom 4 will follow them from what DJI calls a “safe distance,” and it really is.

TapFly is another great feature that will help pilots who aren’t as comfortable flying in tight spaces fly in a straight line. Simply tap the remote icon on your phone or tablet, tap TapFly, click on a visual point you want the Phantom 4 to fly to, and it will move into autopilot. You still have control over the camera and even the Phantom 4 itself, but it’s basically a coached flying system.

Again, there are a lot of technical specs I didn’t go into too much detail on, but if you want more info you can find it on DJI’s Phantom 4 page. For some simple and short videos check out: http://www.dji.com/edu/edu_videos or download the DJI Go app.

Summing Up
In the end, I really, really, really loved flying the Phantom 4! One of the easiest parts was installing the propellers — easy turn and lock. If you find yourself getting frustrated when filming or flying the Phantom 4, remember that it takes people many hours to get good at shooting with a camera, let alone with a drone, camera and gimbal to control all at once. I spent many nights watching YouTubers’ reviews wondering why I couldn’t get a great picture out of the D-Log setting until I found Casey Faris’ video on the Mavic Pro, which described the same problem I was having with the Phantom 4. With some more tests, I was able to fail and succeed in the different picture profiles.

When reviewing products, I try to break them, and I did that with the Phantom 4. Really. I accidentally crashed it while in Sport mode, and only one of the propeller caps flew off in that yard sale — a real testament to the sturdy construction of the Phantom 4.

Once back online, I tried to fly it into a tree but the Obstacle Sensing System and the Forward Vision System prevented the Phantom 4 from crashing. It’s like an extra layer of insurance.

I really like how the Phantom 4 has very advanced controls and features but is also “dummy” proof. If you’re editing a project that begs for a tracking shot of a car that just isn’t in the dailies, you can grab a Phantom 4 and run out and film something. Even if it doesn’t make it into the final edit, it will give the producers and director a greater sense of what you are trying to convey. You could really help sell your vision, and your future job prospects.

I haven’t been able to get my hands on the recently announced Mavic Pro foldable drone from DJI, but I was able to try the recently announced GoPro Karma (you can see some of my in-flight footage on my YouTube page).

I really don’t think these drones compare to one another, so I won’t be going into a “tit for tat” comparison, but with so much drone competition it is an exciting time in the UAV world.

One thing I did notice when I went out to test the Phantom 4 was how many people were ready to become FAA/police authorities and tell you that you can’t fly. It was almost laughable; in fact, every time I think about it I laugh. The moral of the story: keep in mind before making a purchase like this that if you live in a city, you probably live within five miles of an airport, helipad, etc., and technically you can’t fly your drone. It is a conversation starter whether or not you want it to be.

Definitely check out the FAA’s website to get the rules on where and when you can fly drones, otherwise you might have an awesome grey box in your room with nowhere to fly. On the flip side, I’ve been reading people’s comments on forums, and if you are a hobbyist flyer, have registered your drone and want to fly, you can contact your local airport and let them know you want to fly at a certain altitude or below, and they usually will say it’s fine. Those aren’t my words but a summation of what I have been reading — of course do your own research please!

The only criticism is that the Phantom 4’s 60Mbps data rate isn’t high enough to get the best quality footage from your drone. If you’ve been paying attention to the news lately, or my Twitter @allbetzroff, you may have seen DJI’s latest reveal of the Phantom 4 Pro, Pro+ and Inspire 2, which can film at a much better data rate of 100Mbps. Maybe this is a simple firmware update to the Phantom 4 (but it’s probably not). Nonetheless, 60Mbps is acceptable for 1920×1080 or maybe 2.7K video (2704×1524, 16×9 aspect ratio) or below, but once you get up into the higher frame sizes you can really see the video footage break down. If you zoom into the footage, the compression becomes noticeable and the color fidelity begins to fade.
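To see why a fixed 60Mbps budget holds up at 1080p but falls apart at larger frame sizes, divide the data rate by the number of pixels it has to describe each second. This is a rough sketch: 24fps and the UHD 3840×2160 frame size are assumptions for illustration, and real codecs spend bits unevenly across frames.

```python
# Average encoded bits available per pixel per frame at a fixed data rate.
def bits_per_pixel(mbps: float, width: int, height: int, fps: float) -> float:
    return (mbps * 1_000_000) / (width * height * fps)

print(bits_per_pixel(60, 1920, 1080, 24))  # ~1.21 bits/pixel at 1080p
print(bits_per_pixel(60, 2704, 1524, 24))  # ~0.61 bits/pixel at 2.7K
print(bits_per_pixel(60, 3840, 2160, 24))  # ~0.30 bits/pixel at UHD
```

At UHD the encoder has roughly a quarter of the per-pixel data it had at 1080p, which is where the visible compression and fading color fidelity come from.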

While writing this review, the DJI Phantom 4 retailed for $1,199 on the DJI online store without any accessories, and more like $1,399 with two extra batteries and an external battery charger. I even just found a refurbished Phantom 4 on DJI’s site for $899. The Phantom 4 Pro starts at $1,499 and the Phantom 4 Pro+ at $1,799. Oh yeah, don’t forget a few 64GB MicroSD cards at $20-$35 apiece. It’s a pretty expensive investment if you ask me, but if you find yourself being a major gear nerd like me, or an editor needing to shoot your own footage, the DJI Phantom 4 is a must-have. Once you fly the Phantom 4 you will be hooked.

Watch some of the video I shot with the Phantom 4 on my YouTube Channel:

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration. Follow him on Twitter @allbetzroff.

Utopic editor talks post for David Lynch tribute Psychogenic Fugue

Director Sandro Miller called on Utopic partner and editor Craig Lewandowski to collaborate on Psychogenic Fugue, a 20-minute film starring John Malkovich in which the actor plays seven characters in scenes recreated from some of filmmaker David Lynch’s films and TV shows. These characters include the Log Lady, Special Agent Dale Cooper and even Lynch himself, as narrator of the film.

It is part of a charity project called Playing Lynch that will benefit the David Lynch Foundation, which seeks to introduce at-risk populations affected by trauma to transcendental meditation.

Chicago-based Utopic handled all the post, including editing, graphics, VFX and sound design. The film is part of a multimedia fundraiser hosted by Squarespace and executed by Austin-based agency Preacher. The seven vignettes were released one at a time on PlayingLynch.com.

To find out more about Utopic’s work on the film, we reached out to Lewandowski with some questions.

How early were you brought in on the film?
We were brought in before the project was even finalized. There were a couple other ideas that were kicked around before this one rose to the top.

We cut together a timing board using all the pieces we would later be recreating. We also pulled some hallway scenes from an old PlayStation commercial that Lynch directed, and we then scratched in all the “Lynch” lines for timing.

You were on set. Can you talk about why and what the benefits were for the director and you as an editor?
My job on the set was to have our reference movie at the ready and make sure we were matching timing, framing, lighting, etc. Sandro would often check the reference to make sure we were on track.

For scenes like the particles in Eraserhead, I had the DP shoot it at various frame rates and at the highest possible resolution, so we could shoot it vertical and use the particles falling. I also worked with the Steadicam operator to get a variety of shots in the hallway since I knew we’d need to create some jarring cutaways.

How big of a challenge was it dealing with all those different iconic characters, especially in a 20-minute film?
Sandro was adamant that we not try to “improve” on anything that David Lynch originally shot. Having had a lot of experience with homages, Sandro knew that we couldn’t take liberties. So the sets and action were designed to be as close as possible to the original characters.

In shots where it was only one character originally (The Lady in the Radiator, Special Agent Dale Cooper, Elephant Man) it was easier, but in scenes where there were originally more characters and now it was just Malkovich, we had to be a little more creative (Frank Booth, Mystery Man). Ultimately, with the recreations, my job was to line up as closely as possible with what was originally done, and then with the audio do my best to stay true to the original.

Can you talk about your process and how you went about matching the original scenes? Did you feel much pressure?
Sandro and I have worked together before, so I didn’t feel a lot of pressure from him, but I think I probably put a fair amount on myself because I knew how important this project was for so many people. And, as is the case with anything I edit, I don’t take it lightly that all of that effort that went into preproduction and production now sits on my shoulders.

Again, with the recreations it was actually fairly straightforward. It was the corridor shots where Malkovich plays Lynch and recites lines taken from various interviews that offered the biggest opportunity, and challenge. Because there was no visual reference for this, I could have some more fun with it. Most of the recreations are fairly slow and ominous, so I really wanted these corridor shots to offset the vignettes, kind of jar you out of the trance you were just put in, make you uneasy and perhaps squirm a bit, before being thrust into the next recreation.

What about the VFX? Can you talk about how they fit in and how you worked with them?
Many of the VFX were either in-camera or achieved through editorial, but there were spots — like where he’s in the corridor and snaps from the front to the back — that I needed something more than I could accomplish on my own, so I used our team at Utopic. However, when cutting the trailer, I relied heavily on our motion graphics team for support.

Psychogenic Fugue is such an odd title, so the writer/creative director, Stephen Sayadin, came up with the idea of using the dictionary definition. We took it a step further, beginning the piece with the phonetic spelling and then seamlessly transitioning the whole thing. They then tried different options for titling the characters. I knew I wanted to use the hallway shot, close-ups of the characters and ending on Lynch/Malkovich in the chair. They gave me several great options.

What was the film shot on, and what editing system did you use?
The film was shot on Red at 6K. I worked in Adobe Premiere, using the native Red files. All of our edit machines at Utopic are custom-built, high-performance PCs assembled by the editors themselves.

What about tools for the visual effects?
Our compositor/creative finisher used an Autodesk Flame, and our motion graphics team used Adobe After Effects.

Can you talk about the sound design?
I absolutely love working on sound design and music, so this was a dream come true for me. With both the film and the trailer, our composer Eric Alexandrakis provided me with long, odd, disturbing tracks, complete with stems. So I spent a lot of time just taking his music and sound effects and manipulating them. I then had our sound designer, Brian Lietner, jump in and go crazy.

Is there a scene that you are most proud of, or that was most challenging, or both?
I really like the snap into the flame/cigarette at the very beginning. I spent a long time just playing with that shot, compositing a bunch of shots together, manipulating them, adjusting timing, coming back in the next morning and changing it all up again. I guess that and Eraserhead. We had so many passes of particles and layered so many throughout the piece. That shot was originally done with him speaking to camera, but we had this pass of him just looking around, and realized it was way more powerful to have the lines delivered as though they were internal monologue. It also allowed us to play with the timings in a way that we wouldn’t be able to with a one-take shot.

As far as what I’m most proud of, it’s the trailer. We worked really hard to get the recreations and full film done. Then I was able to take some time away from it all and come back fresh. I knew that there was a ton of great footage to work with and we had to do something that wasn’t just a cutdown. It was important to me that the trailer feel every bit as demented as the film itself, if not more. I think we accomplished that.

Check out the trailer here:

2016 HPA Award winners

The Hollywood Professional Association (HPA) held its annual awards this week at the Skirball Cultural Center in Los Angeles. The HPA Awards recognize individuals and companies for outstanding contributions made in the creation of feature films, television, commercials, and entertainment content enjoyed around the world.

Awards were bestowed in creative craft categories honoring behind-the-scenes artistry, and a host of special awards were also presented.

The winners of the 2016 HPA Awards are:

Outstanding Color Grading – Feature Film

Carol
John Dowdell // Goldcrest Post Productions Ltd

WINNER – The Revenant
Steven J. Scott // Technicolor Production Services

Brooklyn
Asa Shoul // Molinare

The Martian
Stephen Nakamura // Company 3

The Jungle Book
Steven J. Scott // Technicolor Production Services

Outstanding Color Grading – Television

Vinyl – E.A.B
Steven Bodner // Deluxe/Encore NY

Fargo – The Myth of Sisyphus
Mark Kueper // Technicolor

Outlander – Faith
Steven Porter // MTI Film

WINNER – Gotham – By Fire
Paul Westerbeck // Encore Hollywood

Show Me A Hero – Part 1
Sam Daley // Technicolor PostWorks NY

Outstanding Color Grading – Commercial

Fallout 4 – The Wanderer
Siggy Ferstl // Company 3

Toyota Prius – Poncho
Sofie Borup // Company 3

NASCAR – Team
Lez Rudge // Nice Shoes

Audi R8 – Commander
Stefan Sonnenfeld // Company 3

Apple Music – History of Sound
Gregory Reese // The Mill

Pennzoil – Joyride Circuit
Dave Hussey // Company 3

WINNER – Hennessy – Odyssey
Tom Poole // Company 3

Outstanding Editing – Feature Film

The Martian
Pietro Scalia, ACE

The Revenant
Stephen Mirrione, ACE

WINNER – The Big Short
Hank Corwin, ACE

Sicario
Joe Walker, ACE

Spotlight
Tom McArdle, ACE

Outstanding Editing – Television (TIE)

Body Team 12
David Darg // RYOT Films

Underground – The Macon 7
Zack Arnold, Ian Tan // Sony Pictures Television

Vinyl – Pilot
David Tedeschi


Roots winners for editing, Martin Nicholson, ACE, Greg Babor

WINNER – Roots – Night One
Martin Nicholson, ACE, Greg Babor

WINNER – Game of Thrones – Battle of the Bastards
Tim Porter, ACE

Outstanding Editing – Commercial

WINNER – Wilson – Nothing Without It
Doobie White // Therapy Studios

Nespresso – Training Day
Chris Franklin // Big Sky Edit

Saucony – Be A Seeker
Lenny Mesina // Therapy Studios

Samsung – Teresa
Kristin McCasey // Therapy Studios

Outstanding Sound – Feature Film

Room
Steve Fanagan, Niall Brady, Ken Galvin // Ardmore Sound

Eye In The Sky
Craig Mann, Adam Jenkins, Bill R. Dean, Chase Keehn // Technicolor Creative Services

Batman v Superman: Dawn of Justice
Scott Hecker // Formosa Group
Chris Jenkins, Michael Keller // Warner Bros. Post Production Services

Zootopia
David Fluhr, CAS, Gabriel Guy, CAS, Addison Teague // Walt Disney Company

WINNER – Sicario
Alan Murray, Tom Ozanich, John Reitz // Warner Bros. Post Production Services

Outstanding Sound – Television

WINNER – Outlander – Prestonpans
Nello Torri, Alan Decker, Brian Milliken, Vince Balunas  // NBCUniversal Post Sound

Game of Thrones – Battle of the Bastards
Tim Kimmel, MPSE, Paula Fairfield, Mathew Waters, CAS, Onnalee Blank, CAS, Bradley C. Katona, Paul Bercovitch // Formosa Group

Preacher – See
Richard Yawn, Mark Linden, Tara Paul // Sony Sound

Marco Polo – One Hundred Eyes
David Paterson, Roberto Fernandez, Alexa Zimmerman, Glenfield Payne, Rachel Chancey // Harbor Picture Company

House of Cards – Chapter 45
Jeremy Molod, Ren Klyce, Nathan Nance, Scott R. Lewis, Jonathan Stevens // Skywalker Sound

Outstanding Sound – Commercial

WINNER – Sainsbury’s – ­Mog’s Christmas Calamity
Anthony Moore, Neil Johnson // Factory

Save the Children UK – Still The Most Shocking Second A Day
Jon Clarke // Factory

Wilson – Nothing Without It
Doobie White // Therapy Studios

Honda – Paper
Phil Bolland // Factory

Honda – Ignition
Anthony Moore // Factory

Outstanding Visual Effects – Feature Film

Star Wars: The Force Awakens
Jay Cooper, Yanick Dusseault, Rick Hankins, Carlos Munoz, Polly Ing // Industrial Light & Magic

WINNER – The Jungle Book
Robert Legato, Andrew R. Jones
Adam Valdez, Charley Henley // MPC
Keith Miller // Weta Digital

Captain America: Civil War
Russell Earl, Steve Rawlins, Francois Lambert, Pat Conran, Rhys Claringbull // Industrial Light & Magic

The Martian
Chris Lawrence, Neil Weatherley, Bronwyn Edwards, Dale Newton // Framestore

Teenage Mutant Ninja Turtles: Out of the Shadows
Pablo Helman, Robert Weaver, Kevin Martel, Shawn Kelly, Nelson Sepulveda // Industrial Light & Magic

Outstanding Visual Effects – Television

Supergirl – Pilot
Armen V. Kevorkian, Andranik Taranyan, Gevork Babityan, Elaina Scott, Art Sayan // Encore VFX

Ripper Street – The Strangers’ Home
Ed Bruce, Nicholas Murphy, Denny Cahill, John O’Connell // Screen Scene

Black Sails – XXI
Erik Henry // Starz
Matt Dougan // Digital Domain
Martin Ogren, Jens Tenland, Nicklas Andersson // ILP

The Flash – Guerilla Warfare
Armen V. Kevorkian, Thomas J. Conners, Andranik Taranyan, Gevork Babityan, Jason Shulman // Encore VFX

Holly Shiffman and Mike Chapman with VFX winner for Game of Thrones, Matthew Rouleau.

WINNER – Game of Thrones – Battle of the Bastards
Joe Bauer, Eric Carney // Fire & Blood Productions
Derek Spears // Rhythm & Hues 
Glenn Melenhorst // Iloura
Matthew Rouleau // Rodeo FX

Outstanding Visual Effects – Commercial

Sainsbury’s – Mog’s Christmas Calamity
Ben Cronin, Grant Walker, Rafael Camacho // Framestore

WINNER – Microsoft Xbox – Halo 5: The Hunt Begins
Ben Walsh, Ian Holland, Brian Delmonico, Brian Burke // Method

AT&T – Power of &
James Dick, Corrina Wilson, Euna Kho, Callum McKeveny // Framestore

Kohler – Never Too Next
Andy Boyd, Jake Montgomery, Zachary DiMaria, David Hernandez // JAMM

Gatorade – Sports Fuel
JD Yepes, Richard Shallcross // Framestore

Emerging Leader Award

2016 Winners – Jesse Korosi, Jennifer Zeidan

The following special awards, which were previously announced, were also presented this evening:

HPA Engineering Excellence Award

Sponsored by NAB Show

The HPA Engineering Excellence Award is recognized as one of the most important technology honors in the industry, spotlighting companies and individuals who draw upon technical and creative ingenuity to develop breakthrough technologies.  Submissions for this peer-judged award may include products or processes, and must represent a step forward for its industry beneficiaries.

2016 Winners 

Aspera: FASPStream

Grass Valley: GV Node Real Time IP Processing and Edge Routing Platform

RealD: Ultimate Screen

SGO: Mistika

Honorable mentions:
Grass Valley: LDX 86N Native 4K Series Camera

Canon USA, Inc.: 4K / UHD / 2K / HD display

HPA Judges Award for Creativity and Innovation

The HPA Judges Award for Creativity and Innovation recognizes companies and individuals who have demonstrated excellence, whether in the development of workflow and process to support creative storytelling or in technical innovation. The Judges Award for Creativity and Innovation is conferred by a jury of industry experts.

2016 Winner – The Mill: Blackbird

HPA Lifetime Achievement Award

The HPA Lifetime Achievement Award is given to an individual who is recognized for his or her service and commitment to the professional media content industry. The mission of the award is to give recognition to individuals who have, with great service, dedicated their careers to the betterment of the industry. The Lifetime Achievement Award is given at the discretion of the HPA Board of Directors and the HPA Awards Committee. It is not bestowed every year.

2016 Honoree – Herb Dow, ACE

The Charles S. Swartz Award

The Charles S. Swartz Award is conferred on a person, group, or company that has made significant artistic, technological, business or educational impact across diverse aspects of the media industry. The award was named in honor of the late Charles S. Swartz, who led the Entertainment Technology Center at the University of Southern California from 2002 until 2006, building it into the industry’s premiere testing bed for new digital cinema technologies.

2016 Honoree – Michelle Munson, Founder and CEO of Aspera

Margarita Mix’s Pat Stoltz gives us the low-down on VR audio

By Randi Altman

Margarita Mix, one of Los Angeles’ long-standing audio and video post facilities, has taken on virtual reality with the addition of 360-degree sound rooms at their facilities in Santa Monica and Hollywood. This Fotokem company now offers sound design, mix and final print masters for VR video and remixing current spots for a full-surround environment.

Workflows for VR are new and developing every day — there is no real standard. So creatives are figuring it out as they go, but they can also learn from those who were early to the party, like Margarita Mix. They recently worked on a full-length VR concert film with the band Eagles of Death Metal and director/producer Art Haynie of Big Monkey Films. The band’s 2015 tour came to an abrupt end after playing the Bataclan concert hall during last year’s terrorist attacks in Paris. The film is expected to be available online and via apps shortly.

We reached out to Margarita Mix’s senior technical engineer, Pat Stoltz, to talk about his experience and see how the studio is tackling this growing segment of the industry.

Why was now the right time to open VR-dedicated suites?
VR/AR is an exciting emerging market and online streaming is a perfect delivery format, but VR pre-production, production and post are in their infancy. We are bringing sound design, editorial and mixing expertise to the next level based on our long history of industry-recognized work, and elevating audio for VR from a gaming platform to one suitable for the cinematic and advertising realms, where VR content production is exploding.

What is the biggest difference between traditional audio post and audio post for VR?
Traditional cinematic audio has always played a very important part in support of the visuals. Sound effects, Foley, background ambiance, dialog and music clarity to set the mood have aided in pulling the viewer into the story. With VR and AR you are not just pulled into the story, you are in the story! Having the ability to accurately recreate the audio of the filmed environment through higher order ambisonics, or object-based mixing, is crucial. Audio does not only play an important part in support of the visuals, but is now a director’s tool to help draw the viewer’s gaze to what he or she wants the audience to experience. Audio for VR is a critical component of storytelling that needs to be considered early in the production process.
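The "higher order ambisonics" Stoltz mentions extends first-order B-format, where a mono source is panned by spherical-harmonic weighting rather than to fixed speakers. A minimal first-order encode can be sketched as below; the 1/sqrt(2) W weighting follows the traditional B-format convention, and this is an illustrative sketch, not Margarita Mix's actual pipeline:

```python
import math

def encode_foa(sample: float, azimuth_deg: float, elevation_deg: float):
    """Encode a mono sample into first-order ambisonic B-format (W, X, Y, Z).

    W is the omnidirectional component (traditionally scaled by 1/sqrt(2));
    X, Y, Z carry the front/back, left/right and up/down direction.
    Higher-order ambisonics adds further spherical-harmonic channels.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample / math.sqrt(2)
    x = sample * math.cos(az) * math.cos(el)
    y = sample * math.sin(az) * math.cos(el)
    z = sample * math.sin(el)
    return w, x, y, z

# A source directly in front (azimuth 0, elevation 0) lands entirely in X.
print([round(c, 3) for c in encode_foa(1.0, 0, 0)])  # [0.707, 1.0, 0.0, 0.0]
```

Because direction lives in the channel weights rather than in a speaker layout, the mix can be rotated at playback to follow the viewer's head, which is what makes this representation useful for VR.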

What question do clients ask you the most about sound for VR?
Surprisingly, none! VR/AR is so new that directors and producers are just figuring things out as they go. On a traditional production set, you have audio mixers and boom operators capturing audio. On a VR/AR set, there is no hiding: no boom operators or audio mixers can be visible capturing high-quality audio of the performance.

Some productions have relied on the onboard camera microphones. Unfortunately, in most cases, this turns out to be completely unusable. When the client gets all the way to the audio post, there is a realization that hidden wireless mics on all the actors would have yielded a better result. In VR especially, we recommend starting the sound consultation in pre-production, so that we can offer advice and guide decisions for the best quality product.

What question should clients ask before embarking on VR?
They should ask what they want the viewer to get out of the experience. In VR, no two people are going to walk away with the same viewing experience. We recommend staying focused on the major points that they would like the viewer to walk away with. They should then expand that to answer: What do I have to do in VR to drive that point home, not only mentally, but drawing their gaze for visual support? Based on the genre of the project, considerations should be made to “physically” pull the audience in the direction to tell the story best. It could be through visual stepping stones, narration or audio pre-cues, etc.

What tools are you using on VR projects?
Because this is a nascent field, new tools are becoming available by the day, and we assess and use the best option for achieving the highest quality. To properly address this question, we ask: Where is your project going to be viewed? If the content is going to be distributed via a general Web streaming site, then it will need to be delivered in that audio file format.

There are numerous companies writing plug-ins that are quite good at delivering these formats. If you will be delivering to a site that supports Dolby VR (an object-based proprietary format), such as Jaunt, then you will need to generate the proper audio file for that platform. Facebook (higher order ambisonics) requires yet another format. We are currently working in all these formats, as well as working closely with leaders in VR sound to create and test new workflows and guide developments in this new frontier.

What’s the one thing you think everyone should know about working and viewing VR?
As we go through life, we each have our own experiences or what we choose to experience. Our frame of reference directs our focus on things that are most interesting to us. Putting on VR goggles, the individual becomes the director. The wonderful thing about VR is now you can take that individual anywhere they want to go… both in this world and out of it. Directors and producers should think about how much can be packed into a story to draw people into the endless ways they perceive their world.

Digging deeper with Jackie editor Sebastián Sepúlveda

By Mel Lambert

Cutting Jackie together was a major challenge, according to picture editor Sebastián Sepúlveda. “Cinematographer Stéphane Fontaine’s intricate handheld camera work — often secured in a single take — the use of the extreme close-ups and the unconventional narrative framework meant that my creative sensibilities were stretched to the maximum. I was won over by the personality of Jackie Kennedy, and saw the film and its component parts as a creative opportunity on several levels. I approached the edit as several small emotional moments that, as a whole, offered a peek into her inner life.”

Director Pablo Larraín’s new offering, which opens in the US on December 2, chronicles the tragic events following the assassination of President John F. Kennedy, as the late Jacqueline Bouvier Kennedy fights through grief and trauma to regain her faith, console her children and maintain her husband’s historic legacy, as well as the world of Camelot that they created. Jackie stars Natalie Portman, Peter Sarsgaard, Billy Crudup and Greta Gerwig.

The script, by Noah Oppenheim, is nonlinear. It opens with an interview between Jackie and an unnamed journalist from Life magazine just a few days after the assassination and transitions to earlier events as the narrative unfolds. The 100-minute film was shot in France and Washington, DC, on Kodak Vision3 Super 16mm film with an Arriflex 416 Plus camera. It had a 2K DI in an aspect ratio of 1.66:1, which more convincingly matches the 4×3 archive footage than a widescreen format.

The film is already getting award attention. Portman (Jackie) was nominated for a Gotham Independent Film Award for best actress, Larraín won the Platform Prize at the 2016 Toronto International Film Festival, and Oppenheim’s screenplay received the Golden Osella at the 2016 Venice Film Festival. The director was also nominated for the Golden Lion for best film at the latter festival.

The Edit
Sepúlveda (who has been nominated for a Spirit Award for his work on Jackie) previously edited Larraín’s Spanish-language film The Club and has collaborated with his friend on previous films. “I shaped Jackie’s unconventional narrative into a seamless story and dove into Larraín’s exploration of her internal reality — the emotional, enigmatic core of the most unknown woman in the world,” he explains. “I found emotional bridges to stitch the piece together in a format that’s bold, innovative and not taught in film school — it is organic to the movie and very much in sync with Larraín’s creative process.”

Sepúlveda identified four key layers to the narrative: ongoing interview sequences at Hyannis Port that provide an insight into the lead character’s frail emotional state; a reconstruction of the landmark White House tour that the First Lady hosted for CBS Television in 1961; sequences with an Irish catholic priest (John Hurt) that explore the lead character’s inevitable crisis of faith; and the assassination and harrowing high-speed exit from Dealey Plaza in Dallas on November 22, 1963.

“I navigated the edit by staying true to Jacqueline Kennedy’s emotional core, which was the primary through-line of the director’s approach to the movie. We had to bring to life a structure that went back and forth across many layers of time,” he says. “The film starts with a more classical interview of Mrs. Kennedy by a magazine journalist just after the tragedy. Then we have the White House tour in flashback, and then the day in Dallas where JFK was murdered. So that was tricky. We also had extreme close-ups of Natalie [Portman] in almost every scene, which we used not only to see the story from her point of view, but also to observe every detail of her expression following the nightmare the former First Lady had to go through.”

Sepúlveda, who works most often in Apple Final Cut Pro, was given an Avid Media Composer for this film. He says his biggest challenge in the editing suite was honoring the four identified layers throughout the complex cut. “It was very hard to balance all the facts, but also to give life to the film. I did a very quick edit in a week by keeping the structure very simple. I then went back and refined the edit while still honoring the basic shape. The use of extreme close-ups and medium shots let me keep [the First Lady] at the center of attention, and to make sure that the editing was not obtrusive to that vision of a sad, melancholic feel.

Photo by William Gray

“And we had the gorgeous, incredible Natalie Portman, who plays with her eyes in a way that you cannot read so easily. It puts the character into a more mysterious perspective. You think in one scene that you understand the character, then comes the next scene and… boom! Natalie shows you another part of this complex character. Finally, you cannot pick which one is Jacqueline Kennedy, since all those different aspects of the character are the First Lady. We had to build the structure — the bridges between the scenes — only guided by this emotional path.”

Eye Contact
Both Larraín and Sepúlveda subscribe to the Shakespearean adage that our eyes are the windows to our soul, and arranged their cut around that conviction. “When we started the edit, after studying the rushes, Pablo and I had a conversation — maybe the most important/interesting part of the process for me — about the eyes,” says Sepúlveda. “For us, they built the entire emotional path of the storytelling process, because the viewer is always trying to read what’s behind the eyes. You can try to bluff with a facial expression, but our eyes are there to show things that you don’t want to say.

“As an audience member you are trying to go deeper into the character,” the editor continues, “but always find the unexpected. You become emotionally involved with this figure while wanting to know more about her. Your imagination is engaged, playing with the film. For me, that’s pure cinema.”

Sepúlveda considers the process as harkening back to the French New Wave, or La Nouvelle Vague, of the late 1950s and 1960s. Although never a formally organized movement, New Wave filmmakers rejected the literary period pieces then being written by novelists and made in France, electing instead to shoot current social issues on location. They chronicled social and political upheaval through radical experiments in editing, visual style and narrative, favoring a more documentary style marked by fragmented, discontinuous editing and long takes.

Photo by Pablo Larraín

And not all scenes in Jackie involve complex cross editing. An example is the scene in the White House when Jacqueline Kennedy strips off her blood-stained clothing, to the ironic accompaniment of the title song from the Broadway musical Camelot sung by Richard Burton. “It was the first time she had been alone, and we had a number of long shots to emphasize that isolation; she was walking like a ghost, dropping clothes as she went from room to room — almost as if she was changing her skin — with several two- to three-minute takes,” describes Sepúlveda. “The music also recedes as if it was coming from an adjacent room, to add to the sense of separation, and the haunting loss of the sense of [JFK’s] Camelot — the dream was broken. This was not the same Jacqueline Kennedy known by the public.”

Because he has young girls, editing a film about this powerful, vulnerable, creative First Lady was important to this Chilean-born editor. “Given our current political situation here in the States — which obviously has ripple effects beyond our borders — I think we need a little Jackie love and magic right now,” he says.

“As the father of two little girls I know that they don’t have the same opportunities as the boys, and that scares me. To participate in a film in which the main character is a woman who had to make important decisions for her country in a moment of political and personal crisis, is ethically important to me. Because, obviously, it was an extremely traumatic time for Jacqueline Kennedy, the idea was to create a seamless edit that could evoke how human memory works under trauma. In this case, we approached it like small glimpses of that period of the First Lady’s life. For me, it was very important to keep the audience emotionally involved with the main character, to almost participate in her experience and, ultimately, to empathize with her. It’s a portrait of grief but we also appreciate, ultimately, how she persevered and overcame it.”

An Editor’s Background
An experienced cinematographer, writer and director, Sepúlveda has enjoyed an eclectic career, whose vocations inform each other and also reflect a sometimes-stressful home life. “My family was exiled from Chile because of Pinochet’s coup d’état in 1973. My mother was a university professor and supported the Allende government. We lived in five countries — France, Venezuela, Argentina, Switzerland and Spain — but it was a very beautiful childhood. To live in Venezuela, discovering the Amazon rainforest, living in Argentina when the democracy returned in the eighties, attending public school in France and getting my dose of republican values. I studied history in a Chilean university, editing in Cuba and scriptwriting in Paris. I really like to work on different aspects of a film.”

In 2007, he worked in France as a film editor, and returned to Chile for vacations. “Pablo was editing Tony Manero, and invited me to give them feedback. It was a first cut, but astonishing. I was shocked in a positive way. We had a pleasant conversation about possible ways to build the film. Then I moved back to Chile and Pablo’s brother Juan invited me to work with them. I started as a script doctor for films and TV series they produced, edited some feature films, and also wrote some script treatments for Pablo. His company, Fabula, produced my first feature film as a director, Las Niñas Quispe (2013), which premiered at Venice Critics Week,” he concludes. “It’s been an amazing journey.”

—————
Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

The A-List: James L. Brooks on his latest film The Edge of Seventeen

By Iain Blair

James L. Brooks, the legendary writer/director/producer, probably has a reinforced mantelpiece in his home. If not, he could use one. After all, he’s Hollywood royalty — a three-time Academy Award winner and 20-time Emmy Award winner whose films include Broadcast News, Terms of Endearment, As Good as It Gets and Jerry Maguire.

Brooks, who began his career as a writer, produced television hits such as Taxi, The Mary Tyler Moore Show, Rhoda, Lou Grant, The Tracey Ullman Show and The Simpsons. He produced his newest film, The Edge of Seventeen, for writer and first-time director Kelly Fremon Craig.

Writer Iain Blair (left) with James Brooks.

A coming-of-age comedy, it stars Hailee Steinfeld and Haley Lu Richardson as inseparable best friends attempting to navigate high school. Along with acting vets Kyra Sedgwick and Woody Harrelson, the behind-the-scenes team on The Edge of Seventeen includes DP Doug Emmett (The One I Love, HBO’s Togetherness) and editor Tracy Wadmore-Smith, ACE (About Last Night, How Do You Know).

I talked to Brooks about making the film and why post is everything.

You’ve made such a diverse slate of films. What do you look for in a project?
A writer with a specific voice. That’s always the main thing.

I heard that you worked on this script with Kelly for four years. Was that unusually long?
Unfortunately not (laughs)! This is up there, but I’ve never done less than four years on any of my own films when I direct, so that’s how I work. On this, it became more about what Kelly was about to do than what she did. I urged research on her, and she turned out to be gifted at it.

She got groups of young women of this age together and she was very empathetic and she asked great questions, and we’d look at the video, and it started to give us a sense of mission and responsibility. Then about two years in, she turned in this draft that was just extraordinary. Here was a writer popping and a new voice emerging, and I was dazzled. Then it took two more years to cast it and get financing.

She’d never directed before. How nervous were you?
I wasn’t. You’re always nervous about the movie, but I was the one who said to her, ‘You should direct this one day,’ and she told me she’d been trying to figure out how to sell herself for the job. I believe in writer/directors, as once you’ve done the script, you’ve seen a version of it.

You’ve mentored so many first-time directors over the years, including Cameron Crowe for Say Anything and Wes Anderson on Bottle Rocket. What have you learned from all that?
That it’s good to back writers of real ability. In Cameron’s case, he was a noteworthy screenwriter when he directed for the first time. From the start, we knew Wes was going to direct, and he felt he’d have died if he didn’t. It’s always the writing first, then that need to direct.

The Edge of Seventeen

Do you like the post process?
I not only love it — I think that post is what filmmaking really is. Editing is where you make the film. Everything else — all the prep and the shoot — is just the raw material you then shape into the actual film.

Where did you do the post?
We did it all in LA. We rented space for all the editorial, and used Wildfire for finishing.

You’ve worked with editor Tracy Wadmore-Smith before on the rom-com How Do You Know (Reese Witherspoon, Owen Wilson, Jack Nicholson, Paul Rudd), which you directed. Tell us about the relationship and how it worked.
She was absolutely brilliant, as we spent a long time editing, and it wasn’t always easy with two of us in the room. But you try to find “it.” You’re not trying to just get your way. You’re trying to find the movie. That’s what it is. You start off with a firm idea of the movie you want to make, and then in post, you’re forced to come to grips with the movie you’ve actually made. And they’re not supposed to be the same thing.

That’s the thing about actors and what they bring to the script. You can’t have that many people involved in the shoot and not have the whole movie redefined in some way. We shot in Vancouver, and Technicolor did the dailies. Then it was back to LA. I was there with Tracy pretty much every day, and I love editing. It’s exciting. It’s everything. It’s a roller coaster. Editing is hitting your head against a brick wall until it gives.

The Edge of Seventeen

Editing’s changed so much technically since you began.
Totally! I did my first films with people wearing white gloves and carefully handling the film and all the bins, and when you made a cut, you had to wait a couple of minutes until it was made. Then digital and instant gratification arrived, and that meant you can see every version of every scene, given the time — but you don’t have the time to do that.

I’m a huge digital fan. It’s like electric lights. Who wants to go back? It’s such a different process that the result has to be different. Look at the whole religion of lighting a set — it’s been changed forever as you can now do so much in post. There’s almost nothing you can’t do in post now. So I’ve lived through the revolution, and we always schedule more time for editing than we think we might need. This took a good six months to cut.

Don’t you like to preview?
I do. I’m a big believer, and they always result in more tweaking and refinement to the film. And that went great. We were very lucky as we were previewing very well, but Kelly and I both felt we needed a couple of extra scenes in order to really get the ending right, and STX, the financing company, gave us three extra days to shoot them and solve the problem. Kelly came up with this last shot that means everything to me. It’s the absolute honest true ending we needed.

Can you talk about the importance of music and sound in the film?
We did all the mixing at Wildfire, which has an Atmos stage with an Avid S6. Kelly was brilliant at finding and using the songs — there are over 30 — which form the great backdrop to the story. But the score was tricky. My friend Hans Zimmer agreed to produce it, and he brought in this wonderful composer from Iceland, Atli Orvarsson, who came up with the perfect theme, and that was the last piece of the puzzle. Then we spent a final week fine-tuning the mix with re-recording mixers Kevin O’Connell, Deb Adair and Chris Carpenter. It’s hard to overstate the importance of sound. It’s always huge, especially when you’re trying to be real.

Director Kelly Fremon Craig and James Brooks on set.

This is obviously not a VFX-driven piece, but there are a few.
They were all done by Stargate Studios, and we couldn’t get the damn phone right! That killed us for a while, as there was an emoji we just couldn’t get right. Sometimes it’s the simplest stuff that’s the hardest.

How important was the DI on this and where did you do it?
We did it at Wildfire with colorist Andrew Balis, and Kelly and the DP were more involved in that than I was. The DI is hugely important.

What are the biggest changes you’ve seen in the industry since you began?
Obviously, the digital revolution, but also things like women crew members and getting over the tendency to say, ‘Can I help with that?’ when the grip’s a woman (Laughs)! What hasn’t changed is that script is everything, passion counts, and post is the most creative part of filmmaking.

Why haven’t you directed more films recently, and what’s next?
I’ve just been so busy with these other projects, but I’ve been working on a script for several years — which is normal for me — and hope to do that. But the price you pay to direct is to go legally insane – meaning, you lose touch with the world and people you love. And that’s a high price to pay.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Behind the Title: Iloura lead animator Dean Elliott

NAME: Dean Elliott

COMPANY: Iloura (@iloura_vfx)

CAN YOU DESCRIBE YOUR COMPANY?
Based in Melbourne and Sydney, Iloura houses a collective of animation and VFX artists.

WHAT’S YOUR JOB TITLE?
Lead Animator

The SpongeBob Movie: Sponge Out of Water

WHAT DOES THAT ENTAIL?
My role can change depending on the project that I’m working on at the time. On a production with only a small scope for character animation like Mad Max: Fury Road, I will work purely as an animator producing shots for the film, whereas on a larger character-based film like SpongeBob SquarePants I would work as a more traditional lead — helping other animators to hit required notes, communicating direction and working as a sounding board for any performance ideas they may have.

Then on a production like Game of Thrones: Battle of the Bastards, I spent most of my time supervising the complex crowd system we developed to extend the scope of our hero keyframe animation.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Somehow I seem to have ended up spending a lot of time in the mocap suit over the past 12 months. This isn’t something I had intended, but it does make it a lot easier when I can plan and generate complex performances that would otherwise be very difficult to achieve directing other actors, or purely by keyframing.

HOW LONG HAVE YOU BEEN WORKING IN VFX?
I’ve been working as an animator for over 15 years now at various studios.

HOW HAS YOUR PART OF THE INDUSTRY CHANGED IN THE TIME YOU’VE BEEN WORKING? 
As an animator, I haven’t seen any great advances in the technology we use to do our job. At the end of the day, animators only really have to deal with timing and poses. The biggest change has been the career becoming more accessible as a profession, and it’s been a good one. The tools have leveled the playing field, and now when we look for animators we don’t need to look for traditional art skills like drawing. As long as they understand performance and movement they can produce amazing work.

DID A PARTICULAR FILM INSPIRE YOU ALONG THIS PATH IN ENTERTAINMENT?
Like most people in the industry I had a lot of influences that led me in this direction, but the main film that finally tipped me over was A Bug’s Life. I could see a very strong future for 3D animation watching that film; that was when I thought I could make a career out of a hobby.

DID YOU GO TO SCHOOL FOR ANIMATION?
Not for animation. There were no courses available for animation when I left school. So instead I studied illustration to build my creative skills, and in my spare time researched animation on the Internet and taught myself at home.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I really enjoy the start of each production. Doing motion tests to establish how a character will move and looking at the storyboards or previs for the first time.

WHAT’S YOUR LEAST FAVORITE?
When you’re getting close to the deadline and the schedule becomes more important than reworking the shot because you came up with a better idea for the character.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d love to say I would be a pilot. But then again, I spent so much time drawing in school that my grades weren’t very good, so I doubt anyone would have let me fly 50 tons of metal across the sky. (Which is probably for the best, now that I think of it.)

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently finished production on Underworld 5, and before that we completed the Battle of the Bastards sequence in Season 6, Episode 9 of Game of Thrones.

Game of Thrones: Battle of the Bastards

WHAT IS THE PROJECT YOU ARE MOST PROUD OF?
I think Game of Thrones: Battle of the Bastards has been the most rewarding. We set out to greatly improve our crowd animation for the sequence, and it’s probably the only project I’ve worked on where the final result looked as good as what I had imagined when I started.

WHAT TOOLS DO YOU USE DAY TO DAY?
Along with a number of in-house tools, we rely on Maya day to day for all of our keyframe animation. We have also recently started using Massive for crowds and iPi Motion Capture in a small in-house mocap space.

WHERE DO YOU FIND INSPIRATION?
Many places. It’s very easy to find your way to a lot of very impressive work on the Internet these days. I’m probably most inspired by work in other films, and I follow a lot of illustrators and artists as well.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Leave work and go home.

Review: The HP Z1G3 All-in-One workstation

By Brady Betzel

I’ll admit it. I’ve always been impressed with HP’s workstations — from the Z840 to the ZBook mobile workstation and now the HP Z1G3 All-in-One. Yes, I know, the HP line of workstations is not cheap. In fact, you can save quite a bit of money building your own system, but you will probably have tons of headaches unless you are very confident in your computer-building skills (and don’t mind standing in the return line at Fry’s Electronics).

HP spends tons of time and money on ISV (Independent Software Vendor) certifications for its workstations. In plain English, that means HP verifies that the hardware inside your workstation works with the software you use. For an industry pro, that means apps like Adobe’s Premiere Pro and After Effects, Avid Media Composer, Autodesk products like 3ds Max and many others.

For this review, I tested apps like Avid Media Composer, FilmLight’s Baselight for Media Composer color correction plug-in, Adobe Premiere Pro, Adobe Media Encoder and Adobe After Effects, as well as Blackmagic’s Resolve 12.5.2, which chewed through basic color correction. In terms of testing time, I typically keep a review computer system for a couple of months, but with this workstation I really wanted to test it as thoroughly as possible — I’ve had the workstation for three months and counting, and I’ve been running the system through all the appropriate paces.

I always love to review workstations like the HP Z1G3 because of the raw power they possess. While HP sent me one of the top-of-the-line Z1G3 configurations, which retails for a list price of $3,486, they have a pretty reasonable starting price of $1,349. From Intel i3, i5 and i7 configurations all the way up to the almighty Intel Xeon, the HP Z1G3 can be customized to fit into your workflow, whether you just need to check your email or color correct video from your GoPro.

Here are the specs that make up the HP Z1G3 All-in-One workstation I received:

● 23.6-inch UHD/4K non-glare and non-touch display (3840×2160)
● Intel Xeon E3-1270 v5 CPU, 3.6GHz (4 Cores / 8 Threads)
● 64GB DDR4 SODIMM 2133MHz (4 x 16GB)
● Nvidia Quadro M2000M graphics (4GB)
● Two Z Turbo drives (512GB, PCIe M.2)
● Wireless keyboard and mouse
● Two Thunderbolt 3/USB 3.1 ports
● USB charging port
● Media card reader
● DisplayPort out

As I mentioned earlier, I tested the Z1G3 with many different apps, but recently I’ve been diving deeper into color correction, and luckily for my testing this fits right in. Some of the most strenuous real-world tests for a computer system are running 3D modeling apps like Maxon Cinema 4D and color correction suites like Resolve. Of course, apps like After Effects are great tests as well, but adding nodes on nodes on nodes in Resolve will really tax your CPU as well as your GPU.

One thing that can really set high-end systems like the Z1G3 apart is the lack of delay when using a precision color correction panel like Tangent’s Elements or Ripple. On lesser systems, you will move one of the color wheel balls and half a second later the color wheel moves on screen. I tried adding a few clips and nodes on the timeline, and when using the panels I noticed no discernible delay (at least none beyond what I would expect). While this isn’t a scientific test, it is crucial for folks looking to plug in external devices.

For more scientific tests I stuck to apps like Cinebench from Maxon, AJA’s System Test and Blackmagic’s Disk Speed Test. In Cinebench, the Z1G3 ranked at the top of the list when compared to similar systems. In AJA’s System Test I tested the read/write speed of the non-OS drive (basically the editing or cache drive). It sustained around 1520MB/s read and 1490MB/s write. I say around because I couldn’t get the AJA app to display the full read/write numbers because of the high-resolution scaling in Windows; I tried scaling down to 1920×1080, but no luck. In Blackmagic’s Disk Speed Test, I was running at 1560MB/s read and 1497.3MB/s write. The drive I ran this test on is HP’s version of the M.2 PCIe SSD powered by Samsung, more affectionately known by HP as a Z Turbo drive. The only thing better at the moment would be a bunch of these drives arranged in a RAID-0 configuration. Luckily, you can do that through the Thunderbolt 3 port with some spare SSDs you have lying around.
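If you don’t have the AJA or Blackmagic apps handy, a few lines of Python can time a sequential write directly. This is only a rough sketch, not a calibrated benchmark; the file name, sizes and chunking below are arbitrary choices of mine:

```python
import os
import time

def write_speed_mb_s(path, total_mb=256, chunk_mb=8):
    """Time a sequential write of total_mb and return throughput in MB/s."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force the data to disk, not just the OS cache
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

if __name__ == "__main__":
    print(f"~{write_speed_mb_s('bench.tmp'):.0f} MB/s sequential write")
```

Numbers from a script like this will land below what the vendor tools report, since they include interpreter overhead, but they’re handy for spotting a drive that is badly underperforming.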

Almost daily I ran Premiere Pro CC, Media Encoder and Resolve Studio 12.5.2. I was really happy with the performance in Premiere. When working with QuickTime files in inter-frame codecs like H.264 and AVC-HD (non-edit-friendly codecs), I was able to work without too much stuttering in the timeline. When I used intra-frame codecs like ProRes HQ from a Blackmagic Pocket Cinema Camera, Premiere worked great. I even jumped into Adobe’s Lumetri color tools while using Tangent’s Ripple external color correction panel, and it worked with little discernible delay. I did notice that Premiere had a little more delay with the external panel than Media Composer and Resolve, but that seemed to be more of a software problem than a workstation problem.

One of my favorite parts about using a system with an Nvidia graphics card, especially a Quadro card like the M2000M, is the ability to encode multiple versions of a file at once. Once I was done editing some timelapses in Premiere, I exported using Media Encoder. I would apply three presets I made: one square 600×600 H.264 for Instagram, one 3840×2160 H.264 for YouTube and an Animated GIF at 480×360 for Twitter. Once I told Media Encoder to encode, it ran all three exports concurrently — a really awesome feature. With the Nvidia Quadro card installed, it really sped along the export.
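The same launch-everything-at-once idea can be sketched outside of Media Encoder with ffmpeg. The three commands below are hypothetical stand-ins for my presets (the input file, codecs and filter settings are my illustrative assumptions, not Adobe’s presets), fired off concurrently from Python:

```python
import subprocess

# Hypothetical ffmpeg equivalents of the three delivery presets;
# input name, filters and formats are illustrative assumptions.
EXPORTS = [
    'ffmpeg -y -i timelapse.mov -vf "crop=ih:ih,scale=600:600" -c:v libx264 instagram.mp4',
    'ffmpeg -y -i timelapse.mov -vf scale=3840:2160 -c:v libx264 youtube.mp4',
    'ffmpeg -y -i timelapse.mov -vf "fps=15,scale=480:360" twitter.gif',
]

def run_concurrently(commands):
    """Start every command at once, then wait for all; returns the exit codes."""
    procs = [subprocess.Popen(cmd, shell=True) for cmd in commands]
    return [p.wait() for p in procs]
```

Calling run_concurrently(EXPORTS) would kick off all three encodes in parallel, analogous to what Media Encoder did with my queue.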

Media Composer
Another app I wanted to test was Media Composer 8.6.3. Overall Media Composer ran great except for the high-resolution display. As I’ve said in previous reviews, this isn’t really the fault of HP, but more of the software manufacturers who haven’t updated their interfaces to adapt to the latest UHD displays. I had filmed a little hike I took with my five-year-old. I gave him a GoPro while I had my own. Once we got the footage back home, I imported it into Media Composer, grouped the footage and edited it using the multi-cam edit workflow.

Simply put, the multi-camera split was on the left and the clip I had in the sequence was playing simultaneously on the right. Before I grouped the footage into a multi-group, I transcoded the H.264s into DNxHD 175, an intra-frame, edit-friendly codec. The transcode was nearly realtime, so it took 60 minutes to transcode a 60-minute H.264 — which is not bad. In the end, I was able to edit the two-camera multi-group at 1920×1080 resolution with only minor hiccups. Occasionally, I would get caught in fast forward for a few extra seconds when J-K-L editing, but nothing that made me want to throw my keyboard or mouse against the wall.

Once done editing, I installed the FilmLight color correction plug-in for Media Composer. I had a really awesome experience coloring using Baselight in Media Composer on the Z1G3. I didn’t have any slowdowns, and the relationship between using the color correction panel and Baselight was smooth.

Resolve
The last app I tested with HP’s Z1G3 All-in-One Workstation was Blackmagic’s Resolve 12.5.2. Much like my other tests, I concentrated on color correction with the Tangent Ripple and Element-Vs iOS app. I had four or five nodes going in the color correction page before I started to see a slowdown. I was using the native H.264 and ProRes HQ files from the cameras, so I didn’t make it easy for Resolve, but it still worked. Once I added a little sharpening to my clips, the HP Z1G3 really started to kick into gear. I heard the faint hum of fans, which up until this point hadn’t kicked in. This is also where the system started to slow down and become sluggish.

Summing Up
The Z1G3 is one of my favorite workstations, period. A while ago, I reviewed the previous All-in-One workstation from HP, the Z1G2, and at the time it was my favorite. One of my few complaints was that, while it was easy to fix, it was very heavy and bulky. When I opened the Z1G3 box, I immediately noticed how much lighter and streamlined the design was. It almost felt like they took away 50 percent of the bulk, which is something I really appreciate. I can tell that one of the main focuses with the Z1G3 was minimizing its footprint and weight, while increasing the power. HP really knocked it out of the park.

One of the only things I wish were different on the Z1G3 I tested is the graphics card. While the Nvidia Quadro M2000M is a great graphics card, it is a “mobile” version of a Quadro, with 128 fewer CUDA cores and 26GB/s less memory bandwidth than its desktop equivalent, the M2000. I would love the option of a full-sized Quadro instead of the mobile version, but I also understand that power consumption would go up, as would the form factor, so maybe I give HP a pass here.

In the end, I know everyone reading this review is saying to themselves, “I love my iMac, so why would I want the HP Z1G3?” If you are a die-hard Apple user, or you just saw the new Microsoft Surface Studio announcement, then it might be a hard sell, but I love both Windows- and Mac OS-based systems, and the Z1G3 is awesome. What’s even more awesome is that it is easily upgradeable. I took off the back cover, and with a simple switch I could have added a 2.5-inch hard drive or two in under a minute. If you are looking for a powerful new workstation, one that stands up to Resolve and Premiere Pro CC, the HP Z1G3 is for you.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Justin Martin joins Hush as technology director

Design agency Hush has added Justin Martin as technology director. A technologist who has worked in a variety of jobs — from programming and research to engineering and visual effects — Martin joins Hush after spending the last two years at The Barbarian Group, where he last served as senior developer and worked with clients such as Google, Samsung, IBM and Intel.

“A lot of the work at Hush requires working with non-traditional mediums and integrating with technologies that require significant research and development,” the Brooklyn-based Martin says. “Having studied architecture and worked in diverse technological fields, I have a unique vantage point that allows me to talk concept, form and experience — but through the lens of practical and realistic execution.”

Along with his Barbarian Group background, which includes work on a 7K interactive retail experience for Samsung, Martin used his technical experience in research and engineering while at Look Effects for feature films such as Darren Aronofsky’s Noah and Wes Anderson’s Moonrise Kingdom, as well as his favorite project, decoding a satellite’s telemetry data using only ‘70s-era mission manuals when it passed by Earth in 2014.

Founding partner/creative leader David Schwarz adds that Martin’s broad technical skill set provides something wholly unique to the experience design agency. “Justin has that impossible-to-find balance of qualities: deep, detailed knowledge mixed with the broader skill of being able to articulate that knowledge to our clients, peers, and partners,” he notes. “Having him involved allows us to begin solving complex challenges quickly and iteratively, and with an elasticity I’ve never seen at our company to date.”

Review: Red Giant’s Universe 2

By Brady Betzel

Throughout 2016, we have seen some interesting acquisitions in the world of post production software and hardware — Razer bought THX, Blackmagic bought Ultimatte and Fairlight and Boris FX bought GenArts, to name a few. We’ve also seen a tremendous consolidation of jobs. Editors are now being tasked as final audio mixers, final motion graphics creators, final colorists and much more.

Personally, I love doing more than just editing, so knowing tools like Adobe After Effects and DaVinci Resolve, in addition to Avid Media Composer, has really helped me become not only an editor but someone who can jump into After Effects or Resolve and do good work.

Unfortunately, for some people it is the nature of the post beast to know everything. Plug-ins play a gigantic part in balancing my workload, available time and the quality of the final product. If I didn’t have plug-ins like Imagineer’s Mocha Pro, Boris’s Continuum Complete, GenArts’ Sapphire and Red Giant’s Universe 2, I would be forced to turn down work because the time it would take to create a finished piece would outweigh the fee I would be able to charge a client.

A while back, I reviewed Red Giant’s Universe when it was in version 1 (check it out here). In the beginning, Universe allowed for lifetime, annual and free memberships. It seems the belt has tightened a little for Red Giant, as Universe 2 is now $99 a year or $20 a month, with a 14-day free trial. No permanent free version or lifetime memberships are offered (if you downloaded the free Universe before June 28, you will still be able to access those free plug-ins in the Legacy group). Moreover, they have doubled the monthly fee from $10 to $20 — definitely trying to get everyone onto the annual subscription train.

Personally, I think this change resulted from too much focus on broadening Universe by jamming in as many plug-ins, transitions and effects as possible, rather than on improving specific plug-ins within it. I actually like Red Giant’s renewed focus on a richer toolset as opposed to merely a fuller one.

Digging In
Okay, enough of my anecdotal narrative and on to some technical awesomeness. Red Giant’s Universe 2 is a vast plug-in collection that is compatible with Adobe’s Premiere Pro and After Effects CS6-CC 2015.3; Apple Final Cut Pro X 10.0.9 and later; Apple Motion 5.0.7 and later; Vegas 12 and 13; DaVinci Resolve 11.1 and later; and HitFilm 3 and 4 Pro. You must have a compatible GPU installed, as Universe has no CPU fallback for unsupported machines. Basically, you need a GPU with at least 2GB of memory; and don’t forget about Intel, as their graphics support has improved a lot lately. For more info on OS compatibility and specific GPU requirements, check out Red Giant’s compatibility page.

Universe 2 is loaded with great plug-ins that, once you dig in, you will want to use all the time. For instance, I really like the ease of use of Universe’s RGB Separation and Chromatic Glow. If you want a full rundown of each and every effect, you should download the Universe 2 trial and check this out. In this review I am only going to go over some of the newly added plug-ins — HUD Components, Line, Logo Motion and Color Stripe — but remember there are a ton more.

I will be bouncing around different apps like Premiere Pro and After Effects. Initially, I wanted to see how well Universe 2 worked inside of Blackmagic’s DaVinci Resolve 12.5.2. Resolve gave me a little trouble at first; it began by crashing once I clicked on OpenFX in the Color page. I rebooted completely and got an error message that OpenFX had been disabled. I did a little research (and by research I mean I typed "Disabled OpenFX Resolve" into Google) and stumbled on a post on Blackmagic’s forum that suggested deleting “C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml” might fix it. Once I deleted that file and relaunched Resolve, I clicked on the OpenFX tab in the Color page, waited 10 minutes, and it started working. From that point on it loaded fast. So, barring the Resolve installation hiccup, there were no problems installing in Premiere and After Effects.
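If you hit the same disabled-OpenFX error, that forum fix boils down to deleting one cache file so Resolve rebuilds it on the next launch. Here is a minimal Python sketch of the idea; the cache path is the Windows location quoted above, and other platforms keep it elsewhere:

```python
import os

def clear_ofx_cache(path):
    """Delete Resolve's OpenFX plugin cache so it is rebuilt on next launch."""
    if os.path.exists(path):
        os.remove(path)
        return True   # cache removed; restart Resolve and re-open the OpenFX tab
    return False      # nothing to delete (already cleared, or a different install path)

# Windows cache location from the Blackmagic forum post referenced above:
CACHE = r"C:\ProgramData\Blackmagic Design\Davinci Resolve\Support\OFXPluginCache.xml"
```

As described above, the first OpenFX load after deleting the cache can still take several minutes while Resolve rescans its plug-ins; subsequent launches are fast.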

Once installed, you will notice that Universe has a few folders inside of your plug-in’s drop down: Universe Blur, Universe Distort, Universe Generators, Universe Glow, Universe Legacy, Universe Motion Graphics, Universe Stylize and Universe Utilities. You may recognize some of these if you have used an earlier version of Universe, but something you will not recognize is that each Universe plug-in now has a “uni.” prefix.

I am still not sure whether I like this or hate it. On one hand, it’s easy to search for if you know exactly what you want in apps like Premiere. On the other hand, it runs counter to what I am used to as a grouchy old editor. In the end, I decided to run my tests in After Effects and Premiere. Resolve is great, but for tracking a HUD in 3D space I was more comfortable in After Effects.

HUD Components
First up is HUD Components, located under the Universe Motion Graphics folder and labeled: “uni.HUD Components.” What used to take many Video CoPilot tutorials and many inspirational views of HUD/UI master Jayse Hansen’s (@jayse_) work, now takes me minutes thanks to the new HUD components. Obviously, to make anything on the level of a master like Jayse Hansen will take hundreds of hours and thousands of attempts, but still — with Red Giant HUD Components you can make those sci-fi in-helmet elements quickly.

When you apply HUD Components to a solid layer in After Effects you can immediately see the start of your HUD. To see what the composite over my footage would look like, I went to change the blend mode to Add, which is listed under “Composite Settings.” From there you can see some awesome pre-built looks under the Choose a Preset button. The pre-built elements are all good starting points, but I would definitely dive further into customizing, maybe layer multiple HUDs over each other with different Blend Modes, for example.

Diving further into HUD Components, there are four separate “Elements” that you can customize, each with different images, animations, colors, clone types and much more. One thing to remember is that the transformation settings and order of operations work from the top down. For instance, if you change the rotation on element one, it will affect each element under it, which is kind of handy if you ask me. Once you get the hang of how HUD Components works, it is really easy to make some unique UI components. I really like to use the uni.Point Zoom effect (listed under Universe Glow in the Effects & Presets); it gives you a sort of projector-like effect with your HUD component.

There are so many ways to use and apply HUD Components in everyday work, from building dynamic lower thirds with all of the animatable arcs, clones and rotations to building sci-fi elements, applying Holomatrix to it and even Glitch to create awesome motion graphics elements with multiple levels of detail and color. I did try using HUD Components in Resolve when tracking a 3D object but couldn’t quite get the look I wanted, so I ditched it and used After Effects.

Line
Second up is the Line plug-in. While drawing lines along a path in After Effects isn’t necessarily hard, it’s kind of annoying — think having to make custom map graphics to and from different places daily. Line takes the hard work out of making line effects to and from different points. This plug-in also contains the prefix uni. and is located under Universe Motion Graphics labeled uni.Line.

This plug-in is very simple to use and animate. I quickly found a map, applied uni.Line, placed my beginning and end points, animated the line using two keyframes under “Draw On” and bam! I had an instant travel-vlog-style graphic that showed me going from California to Australia in under three minutes (yes, I know three minutes seems a little fast to travel to Australia, but that’s really how long it took, render and all). Under the Effect Controls you can find preset looks, beginning and ending shape options like circles or arrows, line types, segmented lines and curve types. You can even move the peak of the curve with the bezier style option.

Logo Motion
Third is Logo Motion, located under Universe Motion Graphics titled uni.LogoMotion. In a nutshell you can take a pre-built logo (or anything for that matter), pre-compose it, throw the uni.LogoMotion effect on top, apply a preset reveal, tweak your logo animation, dynamically adjust the length of your pre-comp — which directly affects the logo’s wipe on and off — and, finally, render.

This is another plug-in that makes my life as an editor who dabbles in motion graphics really easy. Red Giant even included some lower third animation presets that help create dynamic lower third movements. You can select from some of the pre-built looks, add some motion while the logo is “idle,” adjust things like rotation, opacity and blur under the start and end properties, and even add motion blur. The new preset browser in Universe 2 really helps with plug-ins like Logo Motion where you can audition animations easily before applying them. You can quickly add some life to any logo or object with one or two clicks; if you want to get detailed you can dial in the idle animation and/or transition settings.

Color Stripe
Fourth is Color Stripe, a transition that uses color layers to wipe across and reveal another layer. This one is a pretty niche use case, but it is still worth mentioning. In After Effects, transitions are generally a little cumbersome; I found the Universe 2 transitions infinitely easier to use in NLEs like Adobe Premiere. From the always-popular swish pan to exposure blur, there are some transitions you might use once and some you might use a bunch. Color Stripe is a transition you probably won’t use too often, but when you do need it, it will be right at your fingertips. You can choose from different color schemes like analogous or tetradic, or even create a custom scheme to match your project.

In the end, Universe 2 has some effects that are essential once you begin using them, like uni.Unmult, uni.RGB Separation and the awesome uni.Chromatic Glow. The new ones are great too; I really like the ease of use of uni.HUD Components. Since these effects are GPU accelerated, you might be surprised at how fast and fluidly they work in your project without slowdowns. For anyone who likes apps like After Effects but can’t afford to spend hours dialing in the perfect UI interface and HUD, Universe 2 is perfect. Check out all of the latest Red Giant Universe 2 tools here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

ATTO ships ThunderLink for 40GigE connectivity to Thunderbolt 3  

ATTO Technology has introduced the ThunderLink 3401 ($1,595) and 3402 ($1,995) devices, which allow 40GigE connectivity from new Thunderbolt 3-enabled platforms to networks and storage, while also providing backwards compatibility with Thunderbolt 2 devices and 10GigE infrastructures.

With single- and dual-output options and speeds that double that of Thunderbolt 2, the new ThunderLink devices provide enough bandwidth for 4K video workflows via a single cable. ATTO’s Thunderbolt 3 devices allow for higher performance and large bandwidth-intensive transfers via all major Ethernet protocols.

ATTO’s ThunderLink features proprietary Advanced Data Streaming (ADS) technology for smooth data transfers, eliminating dropped frames and providing consistent time-to-data for high-performance applications or mobile workstation users.

Review: Tangent Ripple color correction panel

By Brady Betzel

Lately, it feels like a lot of the specializations in post production are becoming generalized and given to the “editor.” One of the hats that the editor now wears is that of color corrector — I’m not saying we are tasked with color grading an entire film, but we are asked to make things warmer or cooler or to add saturation.

With the standard Wacom tablet, keyboard and/or mouse combo, color correcting can get a little tedious — in Adobe Premiere, Blackmagic Resolve or Avid Media Composer/Symphony — without specialized color correction panels like the Baselight Blackboard, Resolve Advanced, Nucoda Precision, Avid Artist Color or even Tangent’s Element. In addition, those specialized panels run from $1,000 per piece to upwards of $30,000, leaving many people to fend for themselves with a mouse.

While color correcting with a mouse isn’t always horrible, once you use a proper color correction panel, you will always feel like you are missing a vital tool. But don’t worry! Tangent has released a new color correction panel that is not only affordable and compatible with many of today’s popular coloring and nonlinear editing apps, but is also extremely portable: the Tangent Ripple.

For this review I am covering how the Tangent Ripple works inside of Premiere Pro CC 2015.3, FilmLight’s Baselight Media Composer/Symphony plug-in and Resolve 12.5.

One thing I always found intimidating about color correction and grading apps like Resolve was the abundance of options to correct or grade an image. The Tangent Ripple represents the very basic first steps in the color correction pipeline: color balancing using lift, gamma, gain (or shadows, midtones and highlights) and exposure/contrast correction. I am way over-simplifying these first few steps but these are what the Ripple specializes in.
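Those first steps have a simple mathematical core. As a rough illustration, here is one common formulation of lift/gamma/gain on a normalized 0-1 channel value. The exact math varies between grading applications, so treat this as a sketch rather than Resolve's or Premiere's actual transfer function:

```python
def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    """Apply a basic lift/gamma/gain correction to a normalized (0-1) value.

    One common formulation (conventions differ between grading apps):
    lift raises the blacks, gain scales the whites, and gamma bends
    the midtones while leaving black and white largely anchored.
    """
    x = x + lift * (1.0 - x)        # lift: offsets shadows, fades toward white
    x = x * gain                    # gain: scales highlights
    x = max(0.0, min(1.0, x))       # clamp to legal range before the power
    return x ** (1.0 / gamma)       # gamma: midtone brightness
```

With neutral settings (`lift=0, gamma=1, gain=1`) a pixel passes through unchanged, which is exactly what the Ripple's revert buttons restore.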

You’ve probably heard of the Tangent Element Panels, which go way beyond the basics — if you start to love grading with the Tangent Ripple or the Element-VS app, the Element set should be your next step. It retails for around $3,500, or a little below as a set (you can purchase the Element panels individually for cheaper, but the set is worth it). The Tangent Ripple retails for only $350.

Basic Color Correction
If you are an offline editor who wants to add life to your footage quickly, basic color correction is where you will be concentrating, and the Ripple is a tool you need to purchase. Whether you color correct your footage for cuts that go to a network executive, or you are the editor and finisher on a project and want to give your footage the finishing touch, you should check out what a little contrast, saturation and exposure correction can do.

You can find some great basic color correcting tutorials on YouTube, Lynda.com and color correction-focused sites like MixingLight.com. On YouTube, Casey Faris has some quick and succinct color correction tutorials; check him out here. Ripple Training also has some quick Resolve-focused tips posted somewhat weekly by Alexis Van Hurkman.

When you open the Tangent Ripple box you get an instruction manual, the Ripple, three track balls and some carrying pouches to keep it all protected. The Ripple has a five-foot USB cable hardwired into it, but the track balls are separate and do not lock into place. If you were to ask a Ripple user to tell you the serial number on the bottom of the Ripple, most likely they will turn it over, dropping all the trackballs. Obviously, this could wreck the trackballs and/or injure someone, so don’t do it, but you get my point.

The Ripple itself is very simple in layout: three trackballs, three dials above the trackballs, “A” and “B” buttons and revert buttons next to the dials. That is it! If you are looking for more than that, you should take a look at the Element panels.

After you plug the Ripple into an open USB port, you should download the Tangent Hub software. This will also install the Tangent Mapper, which allows you to customize your buttons in apps like Premiere Pro. Unfortunately, Resolve and the Media Composer Baselight plug-in do not allow for customization, but when you install the software you get a nice HUD that shows what each Ripple button and knob does in the software you are using.

If you are like me and your first intro into the wonderful world of color correction in an NLE was Avid Symphony, you might have also encountered the Avid Artist Color panel, which is very similar in functionality: three balls and a couple of knobs. Unfortunately, I found that the Artist Color never really worked like it should within Symphony. Here is a bit of interesting news: while you can’t use the Ripple in the native Symphony color corrector, you can use external panels in the Baselight Avid plug-in! Finally a solution! It is really, really responsive to the Tangent Ripple too! The Ripple really does work great inside of a Media Composer plug-in.

The Ripple was very responsive, much more than what I’ve experienced with the Avid Artist Color panel. As I mentioned earlier, the Ripple will accomplish the basics of color correcting — you can fix color balance issues and adjust exposure. It does a few things well, and that is it. To my surprise, when I added a shape (a mask used in color correction) in Baselight, I was able to adjust the size, points and position of the shape using the Ripple. In the curves dialogue I was able to add, move and adjust points. Not only does Baselight change the game for powerful, in-Avid color correction, but it is a tool like the Ripple that puts color correction within any editor’s grasp. I was really shocked at how well it worked.

When using the Ripple in Resolve you get what Resolve wants to give you. The Ripple is great for basic corrections inside of Resolve, but if you want to dive further into the awesomeness of color correction, you are going to want to invest in the Tangent Element panels.

With the Ripple inside of Resolve, you get the basic lift, gamma and gain controls along with the color wheels, a bypass button and reset buttons for each control. The “A” button doesn’t do anything, which is kind of funny to me. Unlike the Baselight Avid plug-in, you cannot adjust shapes, or do much else with the Ripple panel other than the basics.

Element-Vs
Another option that took me by surprise was Tangent’s Element-Vs app for iOS and Android. I expected this app to really underwhelm me, but I was wrong. Element-Vs acts as an extension of your Ripple, modeled on the Tangent Element panels. But keep in mind, it’s still an app, and there is nothing comparable to the tactile feeling and response you get from a panel like the Ripple or Elements. Nonetheless, I did use the Element-Vs app on an iPad Mini and it was surprisingly great.

It is a bit high-priced for an app, coming in at around $100, but I got a really great response when cycling through the different Element “panels,” leading me to think that the Ripple and Element-Vs combo is a real contender for the prosumer colorist. At a total of $450 ($350 for the Ripple and $100 for the Element-Vs app), you get into the same ballpark as a colorist with a $3,000-plus set of panels.

As I said earlier, the Element panels have a great tactile feel and feedback that, at the moment, is hard to compare to an app, but this combo isn’t as shabby as I thought it would be. A welcome surprise was that the installation and connection were pretty simple too.

Premiere Pro
The last app I wanted to test was Premiere Pro CC. Recently, Adobe added external color panel support in version 2015.3 or above. In fact, Premiere has the most functionality and map-ability out of all the apps I tested — it was an eye-opening experience for me. When I first started using the Lumetri color correction tools inside of Premiere I was a little bewildered and lost as the set-up was different from what I was used to in other color correction apps.

I stuck to basic color corrections inside of Premiere, and would export an XML or flat QuickTime file to do more work inside of Resolve. Using the Ripple with Premiere changed how I felt about the Lumetri color correction features. When you open Premiere Pro CC 2015.3 along with the Tangent Mapper, the top row of tabs opens up. You can customize not only the standard functions of the Ripple within each Lumetri panel, like Basic, Creative, Curves, Color Wheels, HSL Secondaries and Vignette, but you can also create an alternate set of functions when you press the “A” button.

In my opinion, the best button press for the Ripple is the “B” button, which cycles you through the Lumetri panels. In the Vignette panel, the Ripple gives you options like Vignette Amount, Vignette Midpoint, Feather and Vignette Roundness.

As a side note, one complaint I have about the Ripple is that there isn’t a dedicated “bypass” button. I know that each app has different button designations and that Tangent wants to keep the Ripple as simple as possible, but many people constantly toggle the bypass function.

Not all hope is lost, however. Inside of Premiere, if you hold the “A” button for alternate mapping and hit the “B” button, you will toggle the bypass off and on. While editing in Premiere, I used the Ripple to make color adjustments even when the Lumetri panel wasn’t on screen. I could cycle through the different Lumetri tabs, make adjustments and quickly continue editing with the keyboard — an awesome feature both Tangent and Adobe should be promoting more, in my opinion.

It seems Tangent worked very closely with Adobe when creating the Ripple. Maybe it is just a coincidence, but it really feels like this is the Adobe Premiere Pro CC Tangent Ripple. Of course, you can also use the Element-Vs app in conjunction with the Ripple, but in Premiere I would say you don’t need it. The Ripple takes care of almost everything for you.

One drawback I noticed when using the Ripple and Element-Vs inside of Premiere Pro was a small delay when compared to using these inside of Resolve and Baselight’s Media Composer plug-in. Not a huge delay, but a slight hesitation — nothing that would make me not buy the Ripple, but something you should know.

Summing Up
Overall, I really love the Ripple color correction panel from Tangent. At $350, there is nothing better. The Ripple feels like it was created for editors looking to dive deep into Premiere’s Lumetri color controls and allows you to be more creative because of it.

Physically, the Ripple has a lighter, more plastic feel than its Element Tk panel brother, but it still works great. If you need something light and compact, the Ripple is a great addition to your Starbucks-based color correction setup.

I do wish there was a little more space between the trackballs and the rotary dials. When using the dials, I kept nudging the trackballs and sometimes I didn’t even realize what had happened. However, since the Ripple is made to be compact, lightweight, mobile and priced to beat every other panel on the market, I can forgive this.

It feels like Tangent worked really hard to make the Ripple feel like a natural extension of your keyboard. I know I sound like a broken record, but saving time makes me money, and the Tangent Ripple color correction panel saves me time. If you are an editor who has to color correct and grade dailies, an assistant editor looking to up their color correction game or just an all-around post production ninja who dabbles in different areas of expertise, the Tangent Ripple is the next tool you need to buy.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

The sound of two worlds for The Lost City of Z

By Jennifer Walden

If you are an explorer, your goal is to go where no one has gone before, or maybe it’s to unearth and re-discover a long-lost world. Director James Gray (The Immigrant) takes on David Grann’s non-fiction book The Lost City of Z, which follows the adventures of British explorer Colonel Percival Fawcett, who in 1925 disappeared with his son in the Amazon jungle while on a quest to locate an ancient lost city.

Gray’s biographical film, which premiered October 15 at the 54th New York Film Festival, takes an interpretive approach to the story by exploring Fawcett’s inner landscape, which is at odds with his physical location — whether he’s in England or the Amazon, his thoughts drift between the two incongruent worlds.

Once Gray returned from filming The Lost City of Z in the jungles of Colombia, he met up with supervising sound editor/sound designer Robert Hein at New York’s Harbor Picture Company. Having worked together on The Immigrant years ago, Hein says he and Gray have an understanding of each other’s aesthetics. “He has very high goals for himself, and I try to have that also. I enjoy our collaboration; we keep pushing the envelope. We have a mutual appreciation for making a film the greatest it can be. It’s an evolution, and we keep pushing the film to new places.”

The Sound of Two Worlds
Gray felt Hein and Harbor Picture Company would be the perfect partner to handle the challenging sound job for The Lost City of Z. “It involved the creation of two very different worlds: Victorian England, and the jungle. Both feature the backdrop of World War I. Therefore, we wanted someone who naturally thinks outside the box, someone who doesn’t only look at the images on the screen, but takes chances and does things outside the realm of what you originally had in mind, and Bob [Hein] and his crew are those people.”

Bob Hein

Gray tasked Hein with designing a soundscape that could merge Fawcett’s physical location with his inner world. Fawcett (Charlie Hunnam) is presented with physical attacks and struggles, but it’s his inner struggle that Gray wanted to focus on. Hein explains, “Fawcett is a conflicted character. A big part of the film is his longing for two worlds: the Amazon and England. When he’s in one place, his mind is in the other, so that was very challenging to pull off.”

To help convey Fawcett’s emotional and mental conflicts, Hein introduced the sounds of England into the Amazon, and vice-versa, subtly blending the two worlds. Through sound, the audience escapes the physical setting and goes into Fawcett’s mind. For example, the film opens with the sounds of the jungle, to which Hein added an indigenous Amazonian battle drum that transforms into the drumming of an English soldier, since Fawcett is physically with a group of soldiers preparing for a hunt. Hein explains that Fawcett’s belief that the Amazonians were just as civilized as Europeans (maybe even more so) was a controversial idea at the time. Merging their drumming wasn’t just a means of carrying the audience from the Amazon to England; it was also a comment on the two civilizations.

“In a way, it’s kind of emblematic of the whole sound design,” explains Hein. “It starts out as one thing but then it transforms into another. We did that throughout the film. I think it’s very beautiful and engaging. Through the sound you enter into his world, so we did a lot of those transitions.”

In another scene, Fawcett is traveling down a river in the jungle and he’s thinking about his family in England. Here, Hein adds an indigenous bird calling, and as the scene develops he blends the sound of that bird with an English church bell. “It’s very subtle,” he says. “The sounds just merge. It’s the merging of two worlds. It’s a feeling more than an obvious trick.”

During a WWI battle scene, Fawcett leads a charge of troops out of their trench. Here Hein adds sounds related to the Amazon in juxtaposition to Fawcett’s immediate situation. “Right before he goes into war, he’s back in the jungle even though he is physically in the trenches. What you hear in his head are memories of the jungle. You hear the indigenous Amazonians, but unless you’re told what it is you might not know.”

A War Cry
According to Hein, one of the big events in the film occurs when Fawcett is being attacked by Amazonians. They are shooting at him but he refuses to accept defeat. Fawcett holds up his bible and an arrow goes tearing into the book. At that moment, the film takes the audience inside Fawcett’s mind as his whole life flashes by. “The sound is a very big part of that because you hear memories of England and memories of his life and his family, but then you start to hear an indigenous war cry that I changed dramatically,” explains Hein. “It doesn’t sound like something that would come out of a human voice. It’s more of an ethereal, haunted reference to the war cry.”

As Fawcett comes back to reality that sound gets erased by the jungle ambience. “He’s left alone in the jungle, staring at a tribe of Indians that just tried to kill him. That was a very effective sound design moment in this film.”

To turn that war cry into an ethereal sound, Hein used a granular synthesizer plug-in called Paulstretch (or Paul’s Extreme Sound Stretch), created by software engineer Paul Nasca. “Paulstretch turns sounds almost into music,” he says. “It’s an old technology, but it does some very special things. You can set it for a variety of effects. I would play around with it until I found what I liked. There were a lot of versions of a lot of different ideas as we went along.”
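The idea behind Paulstretch is well documented: step through the input much more slowly than the output, take the FFT of each overlapping window, and randomize the phases so only the spectral envelope survives, which is what smears a short sound into a drone-like texture. A minimal Python/NumPy reduction of that idea, offered as an illustrative sketch rather than Nasca’s full implementation:

```python
import numpy as np

def paulstretch(samples, stretch=8.0, window_size=4096):
    """Minimal sketch of Paulstretch-style extreme time-stretching.

    Overlapping windows are read from the input 'stretch' times more
    slowly than they are written to the output; randomizing the FFT
    phases keeps the spectral color but destroys transients.
    """
    hop = window_size // 2                 # output advances half a window per frame
    in_step = hop / stretch                # input advances 'stretch' times slower
    win = np.hanning(window_size)

    n_frames = int((len(samples) - window_size) / in_step)
    out = np.zeros(int(n_frames * hop + window_size))

    pos = 0.0
    for i in range(n_frames):
        start = int(pos)
        frame = samples[start:start + window_size] * win
        spectrum = np.fft.rfft(frame)
        # Keep the magnitudes, throw away phase coherence:
        phases = np.random.uniform(0, 2 * np.pi, len(spectrum))
        frame = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases))
        out[i * hop:i * hop + window_size] += frame * win   # overlap-add
        pos += in_step
    peak = np.max(np.abs(out)) or 1.0
    return out / peak                      # normalize to avoid clipping
```

Fed a few seconds of a shout or chant with a large stretch factor, this produces the kind of haunted, unrecognizable wash Hein describes, which is why the plug-in is so well suited to turning a war cry into an atmosphere.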

It’s all part of the creative process, which Gray is happy to explore. “What’s great is that James [Gray] is excited about sound,” says Hein. “He would hang out and we would play things together and we would talk about the film, about the main character, and we would arrive at sounds together.”

Drones
Additionally, Hein sound designed drones to highlight the anxiety and trepidation that Fawcett feels. “The drones are low, sub-frequency sounds but they present a certain atmosphere that conveys dread. These elements are very subtle. You don’t get hit over the head with them,” he says.

The drones and all the sound design were created from natural sounds from the Amazon or England. For example, to create a low-end drone, they would start with jungle sounds — imagine a bee’s nest or an Amazonian instrument — and then manipulate those. “Everything was done to immerse the audience in the world of The Lost City of Z in its purest sense,” says Hein, who worked closely with Harbor’s sound editors Glenfield Payne, Damian Volpe and Dave Paterson. “They did great work and were crucial in the sound design.”

The Amazon
Gray also asked that Hein design the indigenous Amazon world exactly the way that it should be, as real as it could be. Hein says, “It’s very hard to find the correct sound to go along with the images. A lot of my endeavor was researching and finding people who did recordings in the Amazon.”

He scoured the Smithsonian Institution Archives, and did hours of research online, looking for audio preservationists who captured field recordings of indigenous Amazonians. “There was one amazing coincidence,” says Hein. “There’s a scene in the movie where the Indians are using an herbal potion to stun the fish in the river. That’s how they do it so as not to over-fish their environment. James [Gray] had found this chant that he wanted to have there, but that chant wasn’t actually a fishing chant. Fortunately, I found a recording of the actual fishing chant online. It’s beautifully done. I contacted the recordist and he gave us the rights to use it.”

Filming in the Amazon under very difficult conditions presented Hein with another post production challenge. “Location sound recording in the jungle is challenging because there were loud insects, rain and thunder. There were even far-afield trucks and airplanes that didn’t exist at the time.”

Gray was very concerned that sections of the location dialogue would be unusable. “The performances in the film are so great because they went deep into the Amazon jungle to shoot this film. Physically being in that environment I’m sure was very stressful, and that added a certain quality to the actors’ performances that would have been very difficult to replace with ADR,” says Hein, who carefully cleaned up the dialogue using several tools, including iZotope’s RX 5 Advanced audio restoration software. “With RX 5 Advanced, we could microscopically choose which sounds we wanted to keep and which sounds we wanted to remove, and that’s done visually. RX gives you a visual map of the audio and you can paint out sounds that are unnecessary. It’s almost like Photoshop for sound.”

Hein shared the cleaned dialogue tracks with Gray, who was thrilled. “He was so excited about them. He said, ‘I can use my location sound!’ That was a big part of the project.”

ADR and The Mix
While much of the dialogue was saved, there were still a few problematic scenes that required ADR, including a scene that was filmed during a tropical rainstorm, and another that was shot on a noisy train as it traveled over the mountains in Colombia. Harbor’s ADR supervisor Bobby Johanson, who has worked with Gray on previous films, recorded everything on Harbor’s ADR stage that is located just down the hall from Hein’s edit suite and the dub stage.

Gray says, “Harbor is not just great for New York; it’s great, period. It is this fantastic place where they’ve got soundstages that are 150 feet away from the editing rooms, which is incredibly convenient. I knew they could handle the job, and it was really a perfect scenario.”

The Lost City of Z was mixed in 5.1 surround on an Avid/Euphonix System 5 console by re-recording mixers Tom Johnson (dialogue/music) and Josh Berger (effects, Foley, backgrounds) in Studio A at Harbor Sound’s King Street location in Soho. It was also reviewed on the Harbor Grand stage, which is the largest theatrical mix stage in New York. The team used the 5.1 environment to create the feeling of being engulfed by the jungle. Fawcett’s trips, some of which lasted years, were grueling and filled with disease and death. “The jungle is a scary place to be! We really wanted to make sure that the audience understood the magnitude of Percy’s trips to the Amazon,” says Berger. “There are certain scenes where we used sound to heighten the audience’s perspective of how erratic and punishing the jungle can be, i.e. when the team gets caught in rapids or when they come under siege from various Indian tribes.”

Johnson, who typically mixes at Skywalker Sound, had an interesting approach to the final mix. Hein explains that Johnson would first play a reel with every available sound in it — all the dialogue and ADR, all the sound effects and Foley — and the music. “We played it all in the reel,” says Hein. “It would be overwhelming. It would be unmixed and at times chaotic. But it gave us a very good idea of how to approach the mix.”

As they worked through the film, the sound would evolve in unexpected ways. What they heard toward the end of the first pass influenced their approach on the beginning of the second pass. “The film became a living being. We became very flexible about how the sound design was coming in and out of different scenes. The sound became very integrated into the film as a whole. It was really great to experience that,” shares Hein.

As Johnson and Berger mixed, Hein was busy creating new sound design elements for the visual effects that were still coming in at the last minute. For example, the final version of the arrows that were shot in the film didn’t come in until the last minute. “The arrows had to have a real special quality about them. They were very specific in communicating just how dangerous the situation actually was and what they were up against,” says Hein.

Later in the film, Amazonians throw tomahawks at Fawcett and his son as they run through the jungle. “Those tomahawks were never in the footage,” he says. “We had just an idea of them until days before we finished the mix. There was also a jaguar that comes out of the jungle and threatens them. That also came in at the last minute.”

While Hein created new sound elements in his edit suite next to the dub stage, Gray was able to join him for critique and collaboration before those sounds were sent next door to the dub stage. “Working with James is a high-energy, creative blast and super fun. He’s constantly coming up with new ideas and challenges. He spends every minute in the mix encouraging us, challenging us and, best of all, making us laugh a lot. He’s a great storyteller, and his knowledge of film and film history is remarkable. Working with James Gray is a real highlight in my career,” concludes Hein.


Jennifer Walden is a New Jersey-based audio engineer and writer. 

Behind the Title: Volt Studios EP Amanda Tibbits

NAME: Amanda Tibbits

COMPANY: Minneapolis-based Volt Studios

CAN YOU DESCRIBE YOUR COMPANY?
We are a one-stop production shop for high-end creative content. We provide production, post production and design.

WHAT’S YOUR JOB TITLE?
Partner/Executive Producer

WHAT DOES THAT ENTAIL?
Basically, I control the time, money and communication for all projects that come through our doors. I am in charge of figuring out how to bring a piece to life within a client’s timeframe and budget.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I like to refer to myself as a mother hen or air traffic controller, depending on the day. I keep all the artists sane and all the projects moving in and out of the facility.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Getting to work with amazing talent in our studio and collaborating with some of the best creative brains on the client/agency side. And free beer.

WHAT’S YOUR LEAST FAVORITE?
Being attached to my desk all day (i.e. air traffic control).

WHAT IS YOUR FAVORITE TIME OF THE DAY?
This is going to sound crazy, but Monday mornings. We all kind of gather, catch up and talk about what is happening that week.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Tap dancer or tambourine player. Those are jobs, right?

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
This job chose me. I answered a classified ad in the paper, which totally makes me sound like a dinosaur. It was a job as a receptionist at a post house. I had no idea what that meant but as soon as I walked in I knew it was where I belonged. That was 20 years ago.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
Arby’s TV campaigns are always fun. Every time I get a rough cut and hear the scripts I crack up. The Subaru brand spots we recently finished made me pretty emotional. We just worked on Life cereal’s first TV spot in a decade. I remember “Mikey Likes It,” so it was cool to see how the brand has evolved.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
It was pretty fun to be involved in the Arby’s commercial that was a farewell to Jon Stewart. The combination of Ving Rhames singing the Golden Girls theme song and Jon Stewart’s one-liners… we couldn’t go wrong.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
iPhone, Bluetooth in my car and a record player.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m big into Instagram. All my friends want me to Snapchat but I can’t handle one more social media outlet.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I do… Otis Redding, Johnny Cash or The Beastie Boys, depending on my mood.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
See the comment about free beer. No really, I try to get out and enjoy the Minnesota lakes. I also take every minute of my allotted vacation time. No rolling over days for this girl.

The Colonie provides editing, VFX for Toyota Corolla spot

Chicago’s The Colonie has teamed with Burrell Communications to provide editorial, visual effects and design services for I Do, a broadcast spot introducing the 2017 Toyota Corolla.

Creative editor Bob Ackerman edited with the carmaker’s tagline, “Let’s Go Places,” in mind. Fast-paced cuts and editing effects helped create an upbeat message that celebrates the new Corolla, as well as its target audience — what Toyota refers to as today’s “on-the-go” generation of young adults.

Lewis Williams, Burrell’s EVP/CCO, brought The Colonie onboard early in the process to collaborate with the production company The Cavalry, ensuring the seamless integration of a variety of visual effects that were central to the style of this spot.

The commercial integrates three distinct vignettes. The spot opens with a young woman behind the wheel of her Toyota. She arrives at a city park and her friends help her yarn bomb the surroundings — from hand-knitted tree trunk covers to a slipcover for a love seat and a garbage pail cozy in the likeness of whimsical characters.

From the get-go, art director Winston Cheung was focused on keeping the tone of the spot fresh and young. When selecting footage during the edit session, Ackerman and Cheung made sure to use some of the more playful setups from the yarn vignette to provide the bold color palette for the final transfer.

The second scenario finds an enterprising man parking his Corolla and unloading his “Pop-Up Barbershop” in front of a tall wall featuring artful graffiti. A well-placed painting of a young man’s face extending over the top of the wall completes the picture. As soon as the barber sets up his chair, his first customer arrives.

The third vignette features a young filmmaker shooting footage of the 2017 Toyota as her crew adds some illuminating effects. Taking her cues from this scene, The Colonie senior designer Jen Moody crafted a series of shots that use a “light painting” technique to create a trail-of-light effect. One of the characters writes the spot’s title, I Do, with a light, which Moody layered to create a more tangible quality that really sells the effect. VFX supervisor Tom Dernulc took a classic Toyota Corolla from a previous segment and seamlessly integrated it into the background of the scene.

The Colonie’s team explored several methods for creating the various VFX in the spot before deciding upon a combination of Autodesk Flame Premium and Adobe After Effects. Then it was a matter of picking the right moments. Ackerman grabbed some of their top choices, roughed in the effect on the Avid Media Composer, and presented the client with a nearly finished look right from the very first rough cuts.

“Early on, creative director Lisa McConnell had expressed a desire to explore using a series of stills flashing (à la TV’s Scandal) to advance the spot’s story,” says Ackerman. “We loved the idea. Condensing short sequences of footage into rapid progressions of imagery provided us with an innovative way to convey the full scope of these three scenarios in a very limited 30-second time frame — while also adding an interesting visual element to the final spot.”

Fred Keller of Chicago’s Filmworkers provided the color grade, CRC’s Ian Scott performed the audio mix and sound design, and composers Mike Dragovic, Michael Yessian and Brian Yessian provided the score.

The color and sound of Netflix’s The Get Down

The Get Down, Baz Luhrmann’s new series for Netflix, tells the story of the birth of hip-hop in the late 1970s in New York’s South Bronx. The show depicts a world filled with colorful characters pulsating to the rhythms of an emerging musical form.

Shot on the Red Dragon and Weapon in 6K, sound and picture finishing for the full series was completed over several months at Technicolor PostWorks New York. Re-recording mixers Martin Czembor and Eric Hirsch, working under Luhrmann’s direction and alongside supervising sound designer Ruy Garcia, put the show’s dense soundtrack into its final form.

The Get Down

Colorist John Crowley, meanwhile, collaborated with Luhrmann, cinematographer William Rexer and executive producer Catherine Martin in polishing its look. “Every episode is like a movie,” says Czembor. “And the expectations, on all levels, were set accordingly. It was complex, challenging, unique… and super fascinating.”

The Get Down’s soundtrack features original music from composer Elliott Wheeler, along with classic hip-hop tracks and flashes of disco, new wave, salsa and even opera. And the music isn’t just ambiance; it is intricately woven into the story. To illustrate the creative process, a character’s attempt to work out a song lyric might seamlessly transform into a full-blown finished song.

According to Garcia, the show’s music team began working on the project from the writing stage. “Baz uses songs as plot devices — they become part of the story. The music works together with the sound effects, which are also very musical. We tuned the trains, the phones and other sounds and synced them to the music. When a door closes, it closes on the beat.”
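Syncing effects to the beat, as Garcia describes, comes down to snapping each event’s timestamp to the music’s beat grid. Here is a minimal illustrative sketch in Python (the BPM and event times are hypothetical, not taken from the show):

```python
# Snap sound-effect event times (seconds) to the nearest beat of a music cue.
# The 120 BPM tempo and 10.31s door-slam time are hypothetical examples.

def quantize_to_beat(event_time, bpm, cue_start=0.0):
    """Return the event time snapped to the nearest beat of the cue."""
    beat_len = 60.0 / bpm                            # seconds per beat
    beats = round((event_time - cue_start) / beat_len)
    return cue_start + beats * beat_len

# A door slam at 10.31s against a 120 BPM cue lands on the beat at 10.5s.
print(quantize_to_beat(10.31, 120))  # -> 10.5
```

In practice an editor would do this by ear or with a DAW’s grid, but the arithmetic behind “when a door closes, it closes on the beat” is exactly this snap.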

Ruy Garcia

The blending of story, music, dialogue and sound came together in the mix. Hirsch, who mixed Foley and effects, recalls an intensive trial-and-error process to arrive at a layering that felt right. “There was more music in this show than anything I’ve previously worked on,” he says. “It was a challenge to find enough sound effects to fill out the world without stepping on the music. We looked for places where they could breathe.”

In terms of tools, they used Avid Pro Tools 12 HD for sound and music, ADR Manager for ADR cueing and Soundminer for sound FX library management. For sound design they called on Altiverb, Speakerphone and SoundToys EchoBoy to create spaces, and iZotope Iris for sampling. “We mixed using two Avid Pro Tools HDX2 systems and a double operator Avid S6 control surface,” explains Garcia. “The mix sessions were identical to the editorial sessions, including plug-ins, to allow seamless exchange of material and elaborate conformations.”

Music plays a crucial role in the series’ numerous montage sequences, acting as a bridge as the action shifts between various interconnecting storylines. “In Episode 2, Cadillac interrogates two gang members about a nightclub shooting, as Shaolin and Zeke are trying to work out the ‘get down’ — finding the break for a hip-hop beat,” recalls Czembor. “The way those two scenes are cut together with the music is great! It has an amazing intensity.”

Czembor, who mixed dialogue and music, describes the mix as a collaborative process. During the early phases, he and Hirsch worked closely with Wheeler, Garcia and other members of the sound and picture editing teams. “We spent several days pre-mixing the dialogue, effects and music to get it into a basic shape that we all liked,” he explains. “Then Baz would come in and offer ideas on what to push and where to take it next. It was a fun process. With Baz, bigger and bolder is always better.”

The team mostly called on Garcia’s personal sound library, “plus a lot of vintage New York E train and subway recordings from some very generous fellow sound editors,” he says. “Shaolin Fantastic’s kung-fu effects come from an old British DJ’s effects record. We also recorded and edited extensive Foley, which was edited against the music reference guide.”

The Color of Hip-Hop
Bigger and bolder also applied to the picture finishing. Crowley notes that cinematographer William Rexer employed a palette of rich reddish brown, avocado and other colors popular during the ‘70s, all elevated to levels slightly above simple realism. During grading sessions with Rexer, Martin and Luhrmann, Crowley spent time enhancing the look in FilmLight’s Baselight, sharpening details and using color to complement the tone of the narrative. “Baz uses color to tell the story,” he observes. “Each scene has its own look and emotion. Sometimes, individual characters have their own presence.”

John Crowley

Crowley points to a scene where Mylene gives an electrifying performance in a church (photo above). “We made her look like a superstar,” he recalls. “We darkened the edges and did some vignetting to make her the focus of attention. We softened her image and added diffusion so that she’s poppy and glows.”

The series uses archival news clips, documentary material and stock footage as a means of framing the story in the context of contemporary events. Crowley helped blend this old material with the new through the use of digital effects. “In transitioning from stock to digital, we emulated the gritty 16mm look,” he explains. “We used grain, camera shake, diffusion and a color palette of warm tones. Then, once we got into a scene that was shot digitally, we would gradually ride the grain out, leaving just a hint.”

Crowley says it’s unusual for a television series to employ such complex, nuanced color treatments. “This was a unique project created by a passionate group of artists who had a strong vision and knew how to achieve it,” he says.

Nat Geo’s Bertie Gregory shares tips for managing video in the field

Bertie Gregory may be only 22 years old, but he’s already worked at National Geographic magazine, won the 2015 Young Wildlife Photographer of the Year award and is filming Nat Geo WILD’s first online natural history series.

The show, called wild_life, launched on August 3. Each episode finds Gregory (@BertieGPhoto) seeking out wildlife — salmon, black bears, wolves, etc. — to capture with his cameras. We asked this very busy young Englishman about how he manages his workflow during his 18- to 20-hour days in the field.

Here are Gregory’s Top 5 tips:
1) Have a Backup Plan
Before you set foot in the field, find a data backup system that works for you and stick to it. You’re not always going to be at your best when you’re transferring data from one location to another, and you don’t want to make a mistake. Take time before filming to run through your backup procedures so that there are no surprises.

When downloading from my camera, I always make three copies — one to be stored in a separate geographic location and the other two on me. With file sizes being as large as they are now, having a good workflow in place is absolutely essential. I can aspire to be the best tracker or camera operator, but if we don’t have everything dialed in on the back end, then none of that matters.
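That three-copy habit can be scripted so each copy is verified rather than assumed. Below is a rough Python sketch (the folder layout and destination paths are hypothetical): it copies every file from the card to each destination and checks SHA-256 checksums before you would ever clear the card.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path, chunk=1 << 20):
    """Hash a file in 1MB chunks so large camera files never load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def backup_card(card_dir, destinations):
    """Copy every file on the card to each destination, verifying checksums."""
    for src in Path(card_dir).rglob("*"):
        if not src.is_file():
            continue
        want = sha256(src)  # checksum of the original on the card
        for dest_root in destinations:
            dst = Path(dest_root) / src.relative_to(card_dir)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)  # preserves timestamps/metadata
            if sha256(dst) != want:
                raise IOError(f"Checksum mismatch: {dst}")
    print(f"Verified {len(destinations)} copies of {card_dir}")
```

Once every destination verifies, one drive can travel separately to serve as the offsite copy.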

2) Choose Reliable Equipment
There are many storage manufacturers competing in the market right now, which has been great for consumers, but be sure you’re choosing equipment based not only on price but also on reliability and durability. There’s plenty of bargain-basement hardware out there that costs a fraction of its higher-quality counterparts, but it’s likely to let you down at exactly the wrong time.

Between being stuffed in a backpack and overzealous airport baggage handlers, my equipment can really take a beating, so I tend to invest in equipment that might be a bit more expensive initially, but will easily save me significant amounts of time, money and effort over the long-term.

My equipment list:
Cameras:
– Red Dragon
– Canon C300
– Sony FS7
– Multiple GoPros
Computer:
– MacBook Pro

Storage:
– 20TB LaCie 5big Thunderbolt 2
– Multiple 4TB LaCie Rugged RAID drives

3) Choose Speed
I shoot a lot of footage — more than 500GB on some days — and there’s nothing more soul-crushing than wrapping up 15 hours of filming and realizing that you still have hours of work ahead of you just to back up your data. When I finish a day’s shooting, all I want to do is get horizontal as fast as possible. That means I need fast transfer speeds. Look for backup storage devices that use Thunderbolt or USB 3.0 interfaces and that also incorporate RAID technology to improve both speed and reliability.
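To put numbers on that, here’s a quick back-of-the-envelope calculation. The sustained throughput figures below are rough assumptions; real-world speeds vary with the drives, RAID level and cabling:

```python
# Rough transfer-time estimate for a day's footage at assumed sustained speeds.
def transfer_hours(gigabytes, mb_per_sec):
    """Hours to move the given number of GB at a sustained MB/s rate."""
    return gigabytes * 1000 / mb_per_sec / 3600

day = 500  # GB shot in a day, as in the article
for iface, speed in [("USB 2.0 (~35 MB/s)", 35),
                     ("USB 3.0 (~350 MB/s)", 350),
                     ("Thunderbolt 2 RAID (~700 MB/s)", 700)]:
    print(f"{iface}: {transfer_hours(day, speed):.1f} h per copy")
```

At the assumed rates, a 500GB day is roughly four hours per copy over USB 2.0 but well under half an hour on a fast Thunderbolt RAID — multiplied by three copies, that is the difference between sleeping and not.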

4) Get Rid of Distractions
Making one mistake can ruin an entire day’s worth of time, money and effort when you’re backing up your footage. When I’m downloading, I do it in a quiet location without distractions. Just like with everything else in life, you’re going to do a better, quicker job if you have your full attention on the task at hand. Admittedly, this is easier to do in the wilds of Canada than in an office somewhere, but quiet places do exist, even in the modern office.

5) Keep With Your Plan
When you have the right equipment, people and plan in place, you’re ready to go — as long as you keep to that plan. But with the long days, the thankless nature of backing up your data and the strains that being in the field can put on you, it can be very easy at some point down the road to just not keep with the plan.

“Oh, I’ll just do it tomorrow” becomes, “Eh, I can do it this weekend,” which becomes, “Wait, when was the last time I backed up my data?” And while you may get lucky and not suffer a mishap while your data is vulnerable, you’re playing with fire every time you put off backing up your data. Keep to your plan, follow your backup schedule and you won’t ever have to worry.


Check out more on wild_life on Nat Geo Wild.

Trippy Empire of the Sun music video mixes live-action and animation

NYC’s Roof Studio recently partnered with Australian music duo Empire of the Sun on their music video “High and Low,” a surreal visual journey that captures the song’s celebration of the innocence and boldness of youth. The lead single from the band’s upcoming Two Vines LP, “High and Low” follows a small group of people guided by a shaman into the forest to indulge in mind-altering substances. Using a mix of live-action and animation, the video depicts the group’s trip, with the Empire of the Sun members serving as “emperors.”

Roof and Empire of the Sun previously worked together on the Honda Civic The Dreamer spot via ad agency RPA, which combined Roof’s visual language and direction with the band’s “Walking on a Dream” track.

“We recognized that there was something special in our initial partnership,” says Vinicius Costa, Roof Studio co-founder/director. “[The band] wanted a psychedelic film with a strong connection to nature to visually, yet indirectly, represent the mind-bending journey. However, they were open to our ideas on execution.”

The only constraint Luke Steele and Nick Littlemore imposed was not to take too literal an approach to visualizing the lyrics. In contrast to the desert-landscape art direction of the duo’s previous album, this time around they wanted to explore a tropical environment. Initially, Roof sought to create the entire film in CG; however, due to the limited timeframe of three weeks, they felt it was best to combine live-action with animation in order to focus on delivering more than 40 realistic CG shots. This shift in direction spurred the studio to develop the natural and psychedelic narratives that tie the piece together.

“The band came to us with a clear point of view, even referencing aspects of some of our previous work,” says Guto Terni, Roof Studio co-founder/director. “From there, we created a loose narrative based on the right balance of live-action and post production visuals. As directors, we engage in every step of the process, from concept to storyboarding and pre-visualization to shooting, and finally, post production and finishing. This project really showcased our full range of capabilities.”

Roof’s directors, Costa, Terni and Sam Mason, were on set for both live-action shoots, including the band shoot in a Los Angeles studio, and the actors and extras shoot in Costa Rica, which provided the tropical aesthetics. Roof had one week to plan and facilitate both shoots and then two weeks to execute the ambitious CG animation.

Roof created a 3D scan of Steele and Littlemore through a technique called photogrammetry in order to create the telescope shot featured in the video. The process captured multiple images of the band, from which the studio was able to generate 3D versions of its members. From there, Roof added cloth simulation in CG to mimic wind blowing their clothes for more believability. The result is a fantastic shot in which only the band members’ faces are real and the rest is CG.

Roof used a mix of tools, including Nuke, After Effects, Maya, Modo, 3ds Max and Corona, to blend live-action and animation.

Hands of Stone DP and colorist weigh in on film’s look and feel

By Randi Altman

“No mas! No mas!” Those famous words were uttered in desperation by legendary fighter Roberto Durán, putting an end to his rematch with Sugar Ray Leonard. But before that, Durán had impressively defeated the charismatic Sugar Ray, capturing the WBC welterweight title. Durán’s story — along with that of his trainer Ray Arcel — was recently told in The Weinstein Company’s feature Hands of Stone.

Written and directed by Jonathan Jakubowicz, the film’s DP was Miguel Ioan Littin Menz. He worked very closely with director Jakubowicz and FotoKem colorist Kostas Theodosiou to develop several different looks for the film, including for the different decades in which the story takes place, boxing versus training scenes in different locations (New York, Panama, Las Vegas) and flashback scenes.

The film stars Édgar Ramírez as Durán, Usher Raymond as Sugar Ray and Robert De Niro as Ray Arcel.

We were lucky enough to get some time with both Littin Menz and Theodosiou, albeit separately, for questions. First we caught up with Theodosiou. Enjoy.

How early did you get involved with the film?
Theodosiou: Prior to my involvement in the project, FotoKem’s nextLAB was on location and involved in dailies acquisition and management. However, I started working with the filmmakers at the editorial stage, after the shoot was finished.

What kind of overall look/looks did the director and DP have in mind for the film, and how did they share that vision with you?
Theodosiou: Both the director Jonathan Jakubowicz and the director of photography Miguel Ioan Littin Menz were very hands-on. They supervised each session to make sure we created looks that best suited all the different time periods, as well as the variety of locations used in the production. The story involved multiple locations, including Panama, New York and Las Vegas.

Nearly every scene was shot on location to maintain authenticity, and it was important that we were true to the look and feel of each location. Jonathan and Miguel explained in detail what they wanted to achieve visually, so we created a unique look for each location.

kostas

Kostas Theodosiou

In addition, the story took us through many different time periods that spanned Roberto Duran’s life — from childhood through his entire career. Each time period also required a different treatment to establish its place in time. Every look we created had a purpose and is in the film for a reason. As a result, there are many different looks in this movie, but they all worked together to help tell the story.

You called on Resolve for this film. Can you talk about the tool and how it helps you in your work?
Theodosiou: Resolve is a great platform and allowed me to mix footage that was shot using a variety of different cameras, lenses and aspect ratios. The tools in Resolve helped me blend the footage seamlessly to enhance the filmmakers’ vision, and the results surpassed their expectations.

You mentioned that both the director and DP were in the room with you?
Theodosiou: Yes, Miguel and Jonathan were supervising the color correction from beginning to end. We all had great chemistry and worked together as a team. This was Jonathan’s passion project and he was very invested in the film, so he was deeply involved in the finishing process. And Miguel flew in from Chile to make sure he was here with us.

In the final stages of making the film, additional scenes were added and both filmmakers returned to FotoKem to work with me to make sure the new extended scenes fit in with the mood they were trying to portray. It was a very hands-on experience.

Now let’s hear from DP Miguel Ioan Littin Menz:

What were your first meetings like with Kostas?
Littin Menz: I was very pleased to hear that the color correction was to be done at FotoKem in Los Angeles. We chose Kostas because of his background — he’s worked for Paul Thomas Anderson; Robert Elswit, ASC; Christopher Nolan; and Hoyte van Hoytema, ASC. From the first meeting, the connection was immediate and we understood each other on aesthetics. Our ideas and feelings about how to adjust the palette of colors for the final look of the film were in sync. He did marvelous work.

director-and-dp

Jonathan Jakubowicz and Miguel Ioan Littin Menz.

What was the general overall look the director had in mind for the film and how did he communicate that to you?
Littin Menz: In general, Jonathan talked about creating different looks between Panama and New York, while at the same time creating a look that feels both epic and intimate. We wanted the audience to feel the wild, powerful and sensual colors around Roberto Durán’s life in Panama, and more plain, elegant and sober colors around Ray Arcel’s life in New York. In our research, we looked at thousands of photographs from sports magazines of that period, and also many documentaries.

And for my personal research, I again read Norman Mailer’s book “The Fight” and Jack London’s “The Mexican.”

How would you describe the different looks and feel of the film — decade by decade, location by location?
Littin Menz: I worked very closely with Tomás Voth, the production designer, who did amazing work. We depicted two very different worlds — Durán’s life in Panama and Ray Arcel’s in New York — so as a general concept we tried to create eclectic and powerful palettes of color for Durán’s life, to mimic his real personality.

For Ray Arcel, we used colors that were more serene and elegant, like he was throughout his entire life. Sometimes I used warm colors to evoke nostalgic times for Ray Arcel, and sometimes cool colors appeared in the sad times for both Duran and Arcel. Decade by decade, from the ‘60s to the ‘80s, we created different looks for timeline reasons but also as part of the intimate space for each character.

What cameras did you use, and why did you opt for three different ones? How did that affect the look and the grade?
Littin Menz: We relied on two Alexa XTs, one Alexa M and three Blackmagic cameras for VFX purposes. One of the Alexas, the B camera, was always rigged for Steadicam. The C camera and the Alexa M were used for the fights. We also used Hawk V-Lite anamorphic lenses. Kostas was thorough in making sure everything from the different shoots matched.

Can you talk about the shoot? Was there a DIT? If so, what role did they play? And what kind of on-set monitors were you using?
Littin Menz: The DIT was there mostly for making the back-ups and dailies. It was a lot of material every day. We also created LUTs for some scenes. The monitors were Asus VS197D-P 18.5-inch for video assist and a Flanders Scientific for the DIT station.

Was there anything unique or challenging about it that you are particularly proud of?
Littin Menz: On the technical side, it was very challenging to reproduce the big spaces and fights in places like Madison Square Garden in New York across three decades, the Olympic Stadium in Montreal and the Superdome in New Orleans, but I think we did it successfully.

Some of my favorite scenes were those of Durán when he was a kid in “El Chorrillo,” the poor neighborhood where he lived. We never forgot that the principal idea for the film was to tell the story through the clear and transparent eyes of that child — the story of a child who came from one of the poorest neighborhoods of Latin America and became a world champion. I’m very proud to have been a part of this project.

Review: Microsoft Surface Pro 4 running Resolve 12.5

By Brady Betzel

Not long ago, I was asked if I wanted to check out Blackmagic’s DaVinci Resolve 12.5 on a Microsoft Surface Pro 4. I was dubious, and wondered, “Do they really think I can edit, color correct, and deliver footage on a tablet?”

I was incredulous. I really thought this seemed like a pipe dream for Microsoft and Blackmagic. Everyone who works in post knows that you need a pretty monstrous workstation to play, let alone edit, media. Especially media with resolutions over 1920×1080 and 10-bit color! Well, let’s see how all of that played out.

Thankfully, I received the higher-end version of the Microsoft Surface Pro 4 tablet. Under the hood it was packing a dual-core 2.2GHz Intel Core i7-6650U CPU, 8GB of RAM, an NVMe Samsung MZFLV256 256GB SSD and Intel Iris Graphics 540. The display sports a beautiful 3:2 aspect ratio at 2736×1824 resolution; not quite the UHD 16:9/1.78:1 or true 4K 1.9:1 aspect ratio that would be comfortable when working in video, but it’s not bad. Keep in mind that when working with high-resolution displays like an Apple Retina 5K or this Surface Pro 4, some apps will be hard to read even with the scaling bumped up.

Resolve looks great, but the words and icons might be a bit smaller than what you are used to seeing. The Surface Pro 4 weighs an incredibly light 1.73 pounds, measures 11.5 x 7.93 x 0.33 inches and has the best stand I’ve ever used on a tablet. Terrible tablet stands are a big pet peeve of mine, but the Surface sports a great one. I am on the go a lot, so I need a sturdy stand that, preferably, is attached. The Surface has the stand every other tablet manufacturer should copy.

I use Wacom products, so I am used to working with a great stylus; therefore, I didn’t have high expectations for the pen included with the Surface Pro. Boy, was I wrong! I was happily surprised at how nice it was. While it doesn’t have the 2,048 levels of pressure sensitivity present in the Wacom products you might be used to, it does have 1,024, with great palm rejection. The weight of the pen is great (like, really great), and it mounts on the side of the Surface with a strong magnet.

Aside from the mouse and stylus, the Surface Pro 4 has a 10-point touchscreen, but I didn’t use it very much. I found myself defaulting to the stylus when I wanted to interact directly with the screen, like in Photoshop or when adjusting curves inside Resolve. Last, but not least, is the tremendous battery life. I was constantly running Resolve while playing music from Spotify and Pandora, and the battery would last me most of the day. Once I got into heavy grading and pumped up the brightness, battery life dropped to between two and four hours, which I think is still great.

Resolve
OK, enough gushing about the Surface hardware. On to the real test: Blackmagic’s DaVinci Resolve 12.5 running on a tablet!

Right off the bat — and as you’ve probably already surmised — I’m going to tell you that the Surface Pro 4 is not going to stand up to a powerhouse like the HP Z840 with 64GB of RAM and an Nvidia Quadro M6000. But what I found the Surface Pro 4 excelling at was proxy-based workflows and simple color matching.

You won’t be able to play 4K clips that cleanly, but the Surface Pro 4 and Resolve will let you color correct, grade, add a few nodes for things such as a vignette or a qualifier, and even export your grade. If you use the Surface Pro appropriately, a nice, simple color balance will run great.

Essentially, the Surface Pro is a great way to travel and grade your footage, thanks to Intel’s pretty amazing Iris graphics technology. You should really check out Intel’s backstory on how one of their engineers went to NAB 2015, talked with the Blackmagic crew and figured out what he needed to do to get Intel GPUs working with Resolve. Check this out. Regardless of whether or not there is hyperbole in that video, it is very true that almost anybody can run Resolve, whether you are on a Surface or an Intel-powered desktop.

Oh, don’t forget that for many people, the free version of Resolve will be all they need. Resolve is an amazing nonlinear editor and professional-level color correction software available at anyone’s fingertips for free. This is a fact that cannot be overstated.

Testing
To test the Surface Pro 4, I found some Red 5K footage that I scaled down to 1920×1080 in a 1920×1080 23.976 project, did a simple edit, colored it and exported a final QuickTime. When I had the debayer set all the way to full resolution, my Surface started to crawl (crawl would be the polite term; in fact, it was more like melt). This is why I suggest the proxy workflow. However, when I played back at 1/4 debayer, and even more so at 1/8, I was actually able to work, running around 10 to 12 frames per second. While I know 12fps isn’t the best playback for a 23.976 5K clip at 1920×1080 resolution, it let me do my job while on the go. I like to call it the “Starbucks Test.” If I need more than that, I definitely should be at home using an HP Z840 or the DIY custom-built 4K workhorse I am looking to build.

If you really want to get the Surface to sing in Resolve 12.5, you should stick to footage at 1920×1080 resolution or smaller. With a couple of serial nodes I was able to consistently get 15fps playback. Yeah, I know this isn’t ideal, but if I’m on the run and can’t use a workstation with dual Nvidia Titan or GTX 1080 GPUs and 64GB of DDR4 RAM, running footage off a Thunderbolt 3 external SSD RAID (a set-up that would cost north of $5K), the Microsoft Surface Pro 4 is a great alternate solution.

Something that is tough to deal with on the Surface is the small text and icon size in Windows 10. While there might be a way to fix it using registry key hacks, I don’t want to do that. I want to set it and forget it. For all I know, there is a way to make the text the right size, but I couldn’t find it easily.

There has to be a way this can be fixed, right? If you know of a true fix, let me know on Twitter @allbetzroff. I would really love to know. I tried bumping up the icon/text zoom within Resolve and messing around with the zoom in the Windows Control Panel, with no luck.

Another issue with using a tablet to color correct and grade is the lack of elegance and fluidity that professional color correction panels provide. If you do color at any sort of professional level, you should have at the very least something like the Tangent Ripple or Element panels. Using a touchscreen, mouse and/or stylus to edit and color correct gets old fast on a tablet.

Using the Tangent Ripple, which is surprisingly portable, I felt the elegance I know and love when using Resolve with a panel. (I will be doing a Tangent Ripple review later with some more in-depth analysis.) I did love the ability to use the stylus to get in and fine-tune Power Windows and curves in Resolve, but you will definitely need some extra equipment if you find yourself doing more than a couple of adjustments — much like on any computer, not just the Surface.

Summing Up
In the end, the Microsoft Surface Pro 4 (my version goes for around $1,600) is an exceptional tablet. I love it. In addition to running Resolve 12.5, I also installed the Adobe suite of tools and did some editing in Premiere, effects in After Effects, transcoding in Media Encoder, and even round-tripped my sequence between Resolve and Premiere.

The Surface Pro 4 is a great “away-from-home” computer to run very high-end apps like Resolve 12.5, Premiere Pro CC, and even apps like After Effects with hardcore plug-ins like Imagineer Systems’ Mocha Pro 5.

While the touchscreen and stylus are great for occasional use, you should plan on investing in something like the Tangent Ripple color panel if you will be coloring a ton in Resolve or any other app — it’s even priced well at $350.

From the amazing battery life to the surprisingly snappy response of the Intel Iris 540 GPU inside of pro video editing and color correcting apps like Resolve, the Microsoft Surface Pro 4 is the Windows tablet you need in your mobile multimedia creator life.


Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Creating new worlds for Amazon’s The Man in the High Castle

Zoic Studios used visual effects to recreate occupied New York and San Francisco.

By Randi Altman

What if Germany and Japan had won World War II? What would the world look like? That is the premise of Philip K. Dick’s 1963 novel and Amazon’s series, The Man in the High Castle, which is currently gearing up for its second season premiere in December.

The Man in the High Castle features familiar landmarks with unfamiliar touches. For example, New York’s Times Square has its typical billboards, but sprinkled in are giant swastika banners, images of Hitler and a bizarro American flag whose blue star field has been replaced with yet another swastika. San Francisco, and the entire West Coast, is now under Japanese rule, complete with Japanese architecture and cultural influences. It’s actually quite chilling.

Jeff Baksinski

Helping to create these “new” worlds was Zoic Studios, whose team received one of the show’s four Emmy nods for its visual effects work. That team was led by visual effects supervisor Jeff Baksinski.

Zoic’s path to getting the VFX gig was a bit different than most. Instead of waiting to be called for a bid, they got aggressive… in the best possible way. “Both myself and another supervisor here, Todd Shifflett, had read Dick’s book, and we really wanted this project.”

They began with some concept stills, bouncing ideas off each other about what a German-occupied New York would look like. One of Zoic’s producers saw what they were up to and helped secure some money for a real test. “Todd found a bunch of late-‘50s/early-‘60s airline commercials about traveling to New York and strung them together as one piece. Then we added various Nazi banners, swastikas and statues. Our commercial features a pullback from a 1960s-era TV to reveal a New York penthouse with a Nazi soldier standing at the window. The commercial is very static-y and beat up, but as we pull back out the window, we have a very high-resolution version of Nazi New York.”

And that, my friends, is how they got the show. Let’s find out more from Baksinski…

The Man in the High Castle is an Amazon show. Does the workflow differ from traditional broadcast shows?
Yes. For example, on our network TV shows, typically you’ll get a script each week, you’ll break it down and maybe have 10 days worth of post to execute the visual effects. Amazon and Netflix shows are different. They have a good idea of where their season is going, so you can start building assets well in advance.

High Castle’s version of the Brooklyn Bridge features a Nazi/American flag.

When we did the pilot, we were already building assets while I was going out to set. We were building San Francisco’s Hirohito Airport, the airplane that featured heavily in a few episodes and the cities of New York and San Francisco — a lot of that started before we ever shot a single frame.

It’s a whole new world with the streaming channels.
Everybody does it a little bit differently. Right now when we work on Netflix shows, we are working in shooting order, episode 105, 106, 107, etc., but we have the flexibility to say, “Okay, that one’s going to push longer because it’s more effects-heavy. We’re going to need four weeks on that episode and only two on this other one.” It’s very different than normal episodic TV.

Do you have a preference?
At the moment, my preference is for the online shows. I come from a features background where we had much longer schedules. Even worse, I worked in the days where movies had a year-and-a-half worth of schedule to do their visual effects. That was a different era. When I came into television, I had never seen anything this fast in my life. TV has a super quick turnaround, and obviously audiences have gotten smarter and smarter and want better and better work; television is definitely pushing closer to a features-type look.

Assuming you get more time with the pilots?
I love pilots. You get a big chunk of the story going, and a longer post schedule — six to eight weeks. We had about six weeks on Man in the High Castle, which is a good amount of time to ask, “What does this world look like, and what do they expect?” In the case of High Castle, it was really about building a world. We were never going to create a giant robot. It was about how do we make the world interesting and support our actors and story? You need time to do that.

You were creating a world that doesn’t exist, but also a period piece that takes place in the early ‘60s. Can you talk about that?
We started with what the normal versions of New York and San Francisco looked like in the ‘60s. We did a lot of sketch work, some simple modeling and planning. The next step was what would New York look like if Germany had taken over, and how would San Francisco be different under the influence of Japan?

Zoic added a Japanese feel to San Francisco streets and buildings.

In the case of San Francisco, we found areas in other countries that have heavy Japanese populations and looked at how they influence the architecture — buildings that were initially built by somebody else and then altered for a Japanese feel. We used a lot of that for what you see in the San Francisco shots.

What about New York?
That was a little bit tougher, because if you’re going to reference back to Germany during the war, you have propaganda signs, but our story takes place in 1962, so you’ve got some 17 years there where the world has gotten used to this German and Nazi influence. So while those signs do exist, we scaled back and added normal signs with German names.

In terms of the architecture, we took some buildings down and put new ones in place. You’ll notice that in our Times Square, traffic doesn’t move as it does in real life. We altered the roads to show how traffic would move if somebody eliminated some buildings and put cross-traffic in.

You also added a little bit of German efficiency to some scenes?
Absolutely. It’s funny… in the show’s New York there are long lines of people waiting to get into various restaurants and stores, and it’s all very regimented and controlled. Compare that to San Francisco where we have people milling about everywhere and it’s overcrowded with a lot of randomness.

How much of what you guys did were matte paintings, and could those be reused?
We use a few different types of matte paintings. We have the Rocky Mountains, for example, in the Neutral Zone. Those are a series of matte paintings we did from different angles that show mountains, trees and rivers. That is reusable for the most part.

Other matte paintings are very specific. For example, in the alley outside of Frank’s apartment, you see clothes hanging out to dry, and buildings all the way down the alleyway that lead to this very crowded-looking city. Those matte paintings are shot-specific.

Then we use matte paintings to show things far off in the distance to cut off the CG. Our New York is maybe four square city blocks around in every direction. When we get down to that fourth block, we started using old film tricks — what they used to do on studio lots, where you start curving the roads, dead-ending, or pinching the roads together. There is no way we could build 30 blocks of CG in every direction. I just can’t get there, so we started curving the CG and doing little tricks so the viewer can’t tell the difference.

What was the most challenging type of effects you created for the show? Which shots are you most proud of?
We are most proud of the Hirohito Airport and the V9 rocket plane. What most people don’t realize is that there’s actually nothing there — we weren’t at a real airport and there’s no plane for the actors to interact with. The actors are literally standing on a giant set of grip pipe and crates and walking down a set of stairs. That plane looks very realistic, even super close-up. You see every bolt and hinge and everything as the actors walk out. The monorail and embassy are also cool.

What do you call on in terms of tools?
We use Maya for modeling and lighting environments and for any animation work, such as a plane flying or cars driving. There is a plug-in for Maya called Miarmy that we used to create CG people walking around in backgrounds. Some of those shots have hundreds of extras, but it still felt a little bit thin, so we used CG people to fill in the gaps.

What about compositing?
It’s all Nuke. A lot of our environments are combinations of Photoshop and Nuke or projections onto geometry. Nuke will actually let you use geometry and projections in 3D inside of the compositing package, so some of our compositors are doing environment work as well.

Did you do a lot of greenscreen work?
We didn’t do any on the pilot, but did on the following episodes. We decided to go all roto on the pilot because the show has such a unique lighting set-up — the way the DP wanted to light that show — that green would have completely ruined it. This is abnormal for visual effects, where everyone’s always greenscreening.

Roto is such a painstaking process.
Absolutely. Our DP Jim Hawkinson was coming off of Hannibal at the time. DPs are always super wary of visual effects supervisors because when you come on the set you’re immediately the enemy; you’re about to tell them how to screw up all their lighting (smiles).

He said very clearly, “This is how I like to use light, and these are the paintings and the artwork.” This is the stuff I really enjoy. Between talking to him and director David Semel, and knowing that it was an RSA project, your brain immediately starts going to things like Blade Runner. You’re just listening to the conversations. It’s like, “Oh, this is not straightforward. They’re going to have a very contrast-y, smoky look to this show.”

We did use greenscreen on the rest of the episodes because we had less time. So out of necessity we used green.

What about rendering?
We use V-Ray, which is a global illumination renderer. We’d go out and take HDR images of the entire area for lighting and capture all of the DP’s lights — that’s what’s most important to me. The DP set up his lights for a reason. I want to capture as much of his lighting as humanly possible so when I need to add a building or car into the shot, I’m using his lighting to light it.

It’s a starting point because you usually build a little bit on top of that, but that’s typically what we do. We get our HDRs, we bring them into Maya, we light the scene inside of Maya, then we render through V-Ray, and it all gets composited together.

IBC 2016: VR and 8K will drive M&E storage demand

By Tom Coughlin

While attending the 2016 IBC show, I noticed some interesting trends, cool demos and new offerings. For example, while flying drones were missing, VR goggles were everywhere; IBM was showing 8K video editing using flash memory and magnetic tape; the IBC itself featured a fully IP-based video studio showing the path to future media production using lower-cost commodity hardware with software management; and, it became clear that digital technology is driving new entertainment experiences and will dictate the next generation of content distribution, including the growing trend to OTT channels.

In general, IBC 2016 featured the move to higher resolution and more immersive content. On display throughout the show was 360-degree video for virtual reality, as well as 4K and 8K workflows. Virtual reality and 8K are driving new levels of performance and storage demand, and these are just some of the ways that media and entertainment pros are increasing the size of video files. Nokia’s Ozo was just one of several multi-camera content capture devices on display for 360-degree video.

Besides multi-camera capture technology and VR editing, the Future Tech Zone at IBC included even larger 360-degree video display spheres than at the 2015 event. These were from Puffer Fish (pictured right). The smaller-sized spherical display was touch-sensitive so you could move your hand across the surface and cause the display to move (sadly, I didn’t get to try the big sphere).

IBM had a demonstration of a 4K/8K video editing workflow using the IBM FlashSystem and IBM Enterprise tape storage technology, which was a collaboration between the IBM Tokyo Laboratory and IBM’s Storage Systems division. This work was done to support the move to 4K/8K broadcasts in Japan by 2018, with a broadcast satellite and delivery of 8K video streams of the 2020 Tokyo Olympic Games. The combination of flash memory storage for working content and tape for inactive content is referred to as FLAPE (flash and tAPE).

The graphic below shows a schematic of the 8K video workflow demonstration.

The argument for FLAPE appears to be that flash performance is needed for editing 8K content, while magnetic tape provides low-cost storage for the 8K content, which may require greater than 18TB for an hour of raw content (depending upon the sampling and frame rate). Note that magnetic tape is often used for archiving video content, so this is a rather unusual application. The IBM demonstration, plus discussions with media and entertainment professionals at IBC, indicates that with the declining cost of flash memory and the performance demands of 8K, these workflows may finally drive increased demand for flash memory in post production.
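That 18TB-per-hour figure holds up to back-of-the-envelope math. A minimal sketch, assuming 8K UHD (7680×4320) at 60fps with 10-bit 4:2:2 sampling — these parameters are my assumption for illustration, not the exact format IBM demonstrated:

```python
# Estimate uncompressed storage needed per hour of 8K video.
# Assumed parameters (hypothetical): 8K UHD, 60fps, 10-bit 4:2:2.
width, height = 7680, 4320
bits_per_sample = 10
samples_per_pixel = 2  # 4:2:2: one luma sample plus alternating chroma per pixel
fps = 60

bytes_per_frame = width * height * samples_per_pixel * bits_per_sample / 8
bytes_per_hour = bytes_per_frame * fps * 3600
print(f"{bytes_per_hour / 1e12:.1f} TB per hour")  # → 17.9 TB per hour
```

Bump the frame rate to 120fps or move to full-bandwidth 4:4:4 sampling and the figure climbs well past 18TB, consistent with the “greater than 18TB” estimate above.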

Avid was promoting their Nexis file system, the successor to ISIS. The company uses SSDs for metadata, but generally flash isn’t used for actual editing yet. They agreed that as flash costs drop, flash could find a role for higher resolution and richer media. Avid has embraced open source for their code and provides free APIs for their storage. The company sees a hybrid of on-site and cloud storage for many media and entertainment applications.

EditShare announced a significant update to its XStream EFS Shared Storage Platform (our main image). The update provides non-disruptive scaling to over 5PB with millions of assets in a single namespace. The system provides a distributed file system with multiple levels of hardware redundancy and reduced downtime. An EFS cluster can be configured with a mix of capacity and performance, using SSDs for high data rate content and SATA HDDs for cost-efficient, higher-capacity storage — 8TB HDDs have been qualified for the system. The latest release expands optimization support for file-per-frame media.

The IBC IP Interoperability Zone showed a complete IP-based studio (pictured right), created with the cooperation of AIMS and the IABM. The zone brings to life the work of the JT-NM (the Joint Task Force on Networked Media, a combined initiative of AMWA, EBU, SMPTE and VSF) and the AES on a common roadmap for IP interoperability. Central to the IBC Feature Area was a live production studio, based on the technologies of the JT-NM roadmap, that Belgian broadcaster VRT has been using daily on-air all summer as part of the LiveIP Project, a collaboration between VRT, the European Broadcasting Union (EBU) and LiveIP’s 12 technology partners.

Summing Up
IBC 2016 showed some clear trends to more immersive, richer content with the numerous displays of 360-degree and VR content and many demonstrations of 4K and even 8K workflows. Clearly, the trend is for higher-capacity, higher-performance workflows and storage systems that support this workflow. This is leading to a gradual move to use flash memory to support these workflows as the costs for flash go down. At the same time, the move to IP-based equipment will lead to lower-cost commodity hardware with software control.

Storage analyst Tom Coughlin is president of Coughlin Associates. He has over 30 years in the data storage industry and is the author of Digital Storage in Consumer Electronics: The Essential Guide. He also publishes the Digital Storage Technology Newsletter and the Digital Storage in Media and Entertainment Report.

My first trip to IBC

By Sophia Kyriacou

When I was asked by the team at Maxon to present my work at their IBC stand this year, I jumped at the chance. I’m a London-based working professional with 20 years of experience as a designer and 3D artist, but I had never been to an IBC. My first impression of the RAI convention center in Amsterdam was that it’s super huge and easy to get lost in for days. But once I found the halls relevant to my interests, the creative and technical buzz hit me like the heat that hits your face when you step off a plane in a hot, humid summer. It was immediate, and it felt so good!

The sounds and lights were intense. I was surrounded by booths with basslines of audio vibrating against the floor, changing as you walked along. It was a great atmosphere; so warm and friendly.

My first Maxon presentation was on day two of IBC — it was a show-and-tell of three award-winning and nominated sequences I created for the BBC in London and one for Noon Visual Creatives. As a Cinema 4D user, it was great to see the audience at the stand captivated by my work, and knowing it was streamed live to a large global audience made it even more exciting.

The great thing about IBC is that it’s not only about companies shouting about their new toys. I also saw how it brings passionate pros from all over the world together — people you would never meet in your usual day-to-day work life. I met people from all over the globe and made new friends. Everyone appeared to share the same or similar experience, which was wonderful.

Having the first presentation of the day at the Maxon stand meant I could take a breather and look around the show. I also sat in on a Dell Precision/Radeon Technologies roundtable event one afternoon. That was a really interesting meeting. We were a group of pros from varied disciplines within the industry. It was great to talk about what hardware works, what doesn’t work, and how it could all get better. I don’t work in a realtime area, but I do know what I would like to see as someone who works in 3D. It was incredibly interesting, and everyone was so welcoming. I thoroughly enjoyed it.

Sunday evening, I went over to the SuperMeet — such an energetic and friendly vibe. The stage demos were very interesting. I was particularly taken with the fayIN tracker plug-in for Adobe After Effects. It appears to be a very effective tool, and I will certainly look into purchasing it. The new Adobe Premiere features look fantastic as well.

Everything about my time at IBC was so enjoyable. I went back to London buzzing, and am already looking forward to next year’s IBC show.

Sophia Kyriacou is a London-based broadcast designer and 3D artist who splits her time working as a freelancer and for the BBC.

IBC: Blackmagic buys Fairlight and Ultimatte

Before every major trade show, we at postPerspective play a little game. Who is Blackmagic going to buy this time? Well, we didn’t see this coming, but it’s cool. Ultimatte and Fairlight are now owned by Blackmagic.

Ultimatte makes broadcast-quality, realtime blue- and greenscreen removal hardware that is used in studios to seamlessly composite reporters and talk show hosts into virtual sets.

Ultimatte was founded in 1976 and has won an Emmy for their realtime compositing technology and a Lifetime Achievement Award from the Academy of Motion Picture Arts and Sciences, as well as an Oscar.

“Ultimatte’s realtime blue- and greenscreen compositing solutions have been the standard for 40 years,” says Blackmagic CEO Grant Petty. “Ultimatte has been used by virtually every major broadcast network in the world. We are thrilled to bring Ultimatte and Blackmagic Design together, and are excited about continuing to build innovative products for our customers.”

Fairlight creates professional digital audio products for live broadcast event production, film and television post, as well as immersive 3D audio mixing and finishing. “The exciting part about this acquisition is that it will add incredibly high-end professional audio technology to Blackmagic Design’s video products,” says Petty.

New Products
Teranex AV: A new broadcast-quality standards converter designed specifically for AV professionals. Teranex AV features 12G-SDI and HDMI 2.0 inputs, outputs and loop-through, along with AV-specific features such as low latency, a still store, freeze frame and HiFi audio inputs for professionals working on live, staged presentations and conferences. Teranex AV will be available in September for $1,695 from Blackmagic resellers.

New Video Assist 4K update: A major new update for Blackmagic Video Assist 4K customers that improves DNxHD and DNxHR support, adds false color monitoring, expanded focus options and new screen rotation features. It is available for download from the Blackmagic website next week, free of charge, for all Blackmagic Video Assist 4K customers.

DeckLink Mini Monitor 4K and Mini Recorder 4K: New DeckLink Mini Monitor 4K and DeckLink Mini Recorder 4K PCIe capture cards that include all the features of the HD DeckLink models but now have Ultra HD and HDR (high dynamic range) features. Both models support all SD, HD and Ultra HD formats up to 2160p30. DeckLink Mini 4K models are available now from Blackmagic resellers for $195 each.

DaVinci Resolve 12.5.2: The latest version of Resolve is available for free download from Blackmagic’s site. It adds support for additional URSA Mini camera metadata, color space tags on QuickTime export, Fusion Connect for Linux, advanced filtering options and more.

DP Vittorio Storaro on color and Woody Allen’s ‘Café Society’

Legendary Italian cinematographer Vittorio Storaro has had a storied career that includes three Oscar wins for his work on Apocalypse Now (1979), Reds (1981) and The Last Emperor (1987). To call his career prodigious would be an understatement.

One of his most recent projects was for writer/director Woody Allen’s Café Society, which follows a young man from Brooklyn to Hollywood and back to New York City in the 1930s. Two filmmaking legends teaming up on one film? How could we not check in with Storaro to talk about his work on Café Society, which represented Allen’s first taste of digital shooting?

You’ve done 58 movies on film. What was your first experience with DI?
A long time ago, someone at Kodak asked me what I thought about digital intermediate versus film. Because I had already started doing transfer from film to telecine, I had some experience with the process. But the quality was not there yet — digital cameras and color correctors were still in their infancy back then.

My first experience in digital finishing was on a movie called Muhammad: The Messenger of God. In 2011 and 2012, we were doing the pre-production and production of the film, which we shot in Iran. I shot on film because, in my opinion, no digital camera could handle such drastic changing weather conditions. One segment, though, was transferred digitally, mostly for VFX purposes.

For the post of the film in 2013, we sent all the negative material to Arri as both Kodak and Technicolor Italy had closed. Arri scanned the negatives in 4K 16-bit. After that we decided to do the entire DI at ScreenCraft where I could review the film in a 4K 16-bit color screening, which is very important. It was an almost 100 percent switch from film to digital. They also had a FilmLight Baselight system in their screening room that we moved into their beautiful 4K theatre so we could work in the optimal environment.

The colorist at ScreenCraft was not used to doing films, as he had mainly worked on video and TV, so I had to influence him step-by-step, feeling the story. My advice to him was to work on color in realtime, listen to the dialogue, understand the dynamic and not just concentrate on the technical aspect of the fixed images.

In cinematography, the first image doesn’t have to be perfect, it just has to be the starting point, and it is moving in time until you reach the end. So when you see an image through Baselight, you have to think about what you really want to achieve. This is somehow a visual journey, which follows the path of the world where the characters interact, or the music plays.

It is fantastic to have color correction in realtime. Baselight through the 4K 16-bit video projector gave me my first taste of this great opportunity.

How did you come to shoot and finish Café Society digitally?
When Woody Allen asked me to do Café Society, he had never done a digital capture before. At that time, I knew it was a chance to step up to this new digital world. I chose the Sony F65 camera so that the image we had on set was as close to the final image as possible. I had experienced the first CineAlta digital video cameras from Sony in the past and valued the quality of the Sony equipment. I know that what I see on set is 90 percent of exactly what I will see in finishing. Plus, I wanted to work with a camera that gave me a ratio close to the 2:1 aspect ratio suggested to me by Leonardo da Vinci’s painting, along with 4K resolution.

We also had a 4K 16-bit video projector because that was my previous experience and my preference. And for the post production of the movie at Technicolor PostWorks NY, I asked specifically for the color grading to be done on Baselight. It was good news, as they already had the system!

This is when colorist Anthony Raffaele joined your color journey?
Anthony Raffaele was originally only supposed to be the colorist for the DI, but with Technicolor we decided to have him on board from start to finish. In Italy, we are used to having a technician next to us from the beginning to the end of a project. To me, if the color process moves from one person to another from dailies to post to DI, you risk wasting all the history, knowledge and experience that has been built up. This was, in my opinion, the best experience that I’ve had.

What is the look of the Café Society and its journey?
In my mind, the movie is in four different parts: it starts in the Bronx in 1935, then moves to Hollywood, then the main character comes back to New York and then to LA. In essence, it is four different looks, while keeping an overall style. I wanted to see the subtle differences in the dailies. I’d get the dailies on Blu-ray copies for me to watch on a calibrated Sony monitor, so it was very, very close to what I had on set. That was the process with Woody Allen too.

Anthony often came to Los Angeles during shooting, and when I was in New York we’d watch the dailies together. Looks were saved to SD cards as LUTs with notes. Every day Anthony was going through all the shots and applying the LUT that he already had, then he would make adjustments according to my notes. We practically grew up together through the entire film. And when we arrived to do the DI we had the right experience to continue.
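
The dailies loop described here, a saved look applied shot by shot, is at its core a 3D LUT lookup. Below is a minimal illustrative sketch in Python of trilinear interpolation through a small LUT lattice; the 2-point identity LUT and the function name are stand-ins for illustration, not Bling's or Technicolor's actual tools (a real dailies LUT is typically a 17-, 33- or 65-point .cube file).

```python
# Illustrative only: applying a 3D LUT to one RGB pixel (values in 0..1)
# with trilinear interpolation. The identity LUT here is a placeholder.

N = 2  # lattice points per axis; real dailies LUTs use 17, 33 or 65
# Identity LUT: lut[r][g][b] = (r/(N-1), g/(N-1), b/(N-1))
lut = [[[(r / (N - 1), g / (N - 1), b / (N - 1)) for b in range(N)]
        for g in range(N)] for r in range(N)]

def apply_lut(rgb, lut, n):
    """Trilinearly interpolate an RGB triple through an n^3 LUT lattice."""
    idx, frac = [], []
    for c in rgb:
        # Scale to lattice coordinates, split into cell index + fraction.
        x = min(max(c, 0.0), 1.0) * (n - 1)
        i = min(int(x), n - 2)
        idx.append(i)
        frac.append(x - i)
    ri, gi, bi = idx
    rf, gf, bf = frac
    out = [0.0, 0.0, 0.0]
    # Blend the 8 lattice points surrounding the pixel's cell.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((rf if dr else 1 - rf) *
                     (gf if dg else 1 - gf) *
                     (bf if db else 1 - bf))
                node = lut[ri + dr][gi + dg][bi + db]
                for k in range(3):
                    out[k] += w * node[k]
    return tuple(out)
```

With the identity LUT, any input pixel comes back unchanged; a graded look would instead store the colorist's adjusted values at each lattice point.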

For finishing, we graded in ACES, with Baselight converting to XYZ. We got the EDL from editorial, pulled all the raw media files from the LTO and conformed in Baselight. I told Anthony to always compare the source material with the edited version, to check meticulously for any difference and keep the feeling of our original intent. It is very easy to get lost in the DI.
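
The ACES-to-XYZ step mentioned above is, at its simplest, a single matrix multiply. A minimal sketch, assuming linear ACES2065-1 (AP0) input and using the standard AP0-to-XYZ matrix from SMPTE ST 2065-1; the function name is our own for illustration, not a Baselight API:

```python
# Illustrative only: the core math of an ACES-to-XYZ conversion like the
# one Baselight performs during the DI.

# Standard ACES2065-1 (AP0) to CIE XYZ matrix (SMPTE ST 2065-1).
AP0_TO_XYZ = (
    (0.9525523959, 0.0000000000, 0.0000936786),
    (0.3439664498, 0.7281660966, -0.0721325464),
    (0.0000000000, 0.0000000000, 1.0088251844),
)

def aces_to_xyz(rgb):
    """Convert a linear ACES2065-1 RGB triple to CIE XYZ."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in AP0_TO_XYZ)

# The ACES white point (1, 1, 1) maps to luminance Y = 1.0.
x, y, z = aces_to_xyz((1.0, 1.0, 1.0))
```

In a real pipeline this matrix is one link in a chain (input transforms in, output transforms out to the projector), but comparing the conform against the source, as described above, is what guards the intent through those conversions.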

It is also very important to me to watch the film with sound, even if it’s temporary sound. The dialogue between two characters can give you some kind of feeling, which impacts the light, for instance. Or the time they have spent talking, everything is always moving. Or the music. If you don’t take notice of the words and sound you cannot adjust the color accordingly. Having said that, Woody also asked to watch the corrected copy without sound.

How much time did you spend on the DI overall?
It depends on the movie, of course, but I usually get personally involved in the DI of a movie over the course of a week. Some movies require more time. It also depends on the relationship you have with the colorist. I don't know how much time Anthony spent in the dark room polishing the movie without me. He is a perfectionist, and because I was always pushing our creative intent, he probably spent time exploring what more the features within Baselight could do. I've always encouraged him to perfect his art and technical knowledge. I'd say, "Can we try this? Can I look at that? What if we try it? Tell me, show me."

You talked about the evolution from film to digital to DI. How would you say the role of a cinematographer has changed in this time?
The main change is that before digital, nobody was able to tell how the film would ultimately look. Only the cinematographer — through perception, knowledge, culture, intelligence, technology and experience — could predict how the image would end up looking. Today, with digital capture and high-end technology, the standards are higher and reachable, and pretty much everyone can tell if it's good or ugly, too contrasty, too bright and so on. Digital video cameras have made almost everything automatic; you don't even have to think anymore. But knowing the technology is not enough.

You need to know the meaning of the visual elements as well. Know ALL the arts that are part of cinematography. Cinema is a common art, not a single one. A good cinematographer will bring feeling and composition from the storyline, adding emotion and his own perception to the film, knowing how one color connects to another and the kind of emotional reaction you can have in relation to them.

What about the colorist’s role nowadays?
Firstly, I would say that a colorist has to know everything about production on set so that he or she can cover the journey of the project. Anthony told me, “I learned so much working with you, Vittorio, because I’m not used to being asked the things you ask me, and no one explained the why to me.” I was always referencing paintings, always showing him pictures and explaining why the artist had chosen this particular content or softness for instance.

Secondly, to reach that level where you can transfer a completely abstract idea into images and materialize concepts, the colorist has to know and control the grading system he is using as well as the tools sitting in his color suite.

Finally, the more you go to museums, read books and look at photography, the more you know about art and its evolution. I had such an experience when I was at Technicolor in Rome. A color supervisor I was working with, Ernesto Novelli, had an incredible sensitivity to images. If I asked him to do something, he might suggest adding four points of red, which I thought was crazy, but he would do so and the image was there, it was superb. He was able to use the technology to achieve the look of the image I wanted. Without such talent the technology doesn't mean much.

On Café Society we worked effectively because Anthony knew Baselight very well. If I could give any advice to colorists, I would say they have to really know their console to reach the true potential capabilities of the machine. Learn, keep learning and never stop.

———————–
Vittorio Storaro is currently in pre-production on the following films: 33 días, Story of Jesus, The Hunchback and Bach.