Author Archives: Randi Altman

Hobo’s Howard Bowler and Jon Mackey on embracing full-service VR

By Randi Altman

New York-based audio post house Hobo, which offers sound design, original music composition and audio mixing, recently embraced virtual reality by launching a 360 VR division. Wanting to offer clients a full-service solution, they partnered with New York production/post production studios East Coast Digital and Hidden Content, allowing them to provide concepting through production, post, music and final audio mix in an immersive 360 format.

The studio is already working on some VR projects, using their “object-oriented audio mix” skills to enhance the 360 viewing experience.

We touched base with Hobo’s founder/president, Howard Bowler, and post production producer Jon Mackey to get more info on their foray into VR.

Why was now the right time to embrace 360 VR?
Bowler: We saw the opportunity stemming from the advancement of the technology not only in the headsets but also in the tools necessary to mix and sound design in a 360-degree environment. The great thing about VR is that we have many innovative companies trying to establish what the workflow norm will be in the years to come. We want to be on the cusp of those discoveries to test and deploy these tools as the ecosystem of VR expands.

As an audio shop you could have just offered audio-for-VR services only, but instead aligned with two other companies to provide a full-service experience. Why was that important?
Bowler: This partnership provides our clients with added security when venturing out into VR production. Since the medium is relatively new in the advertising and film world, partnering with experienced production companies gives us the opportunity to better understand the nuances of filming in VR.

How does that relationship work? Will you be collaborating remotely? Same location?
Bowler: Thankfully, we are all based in West Midtown, so the collaboration will be seamless.

Can you talk a bit about object-based audio mixing and its challenges?
Mackey: The challenge of object-based mixing is not only mixing in a 360-degree environment, or converting traditional audio into something that moves with the viewer, but also determining which objects, through their sound cues, will lead the viewer into another part of the environment.

Bowler: It’s the creative challenge that inspires us in our sound design. With traditional 2D film, the editor controls what you see with their cuts. With VR, the partnership between sight and sound becomes much more important.
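For readers new to the concept, the mechanics Mackey and Bowler describe can be sketched in a few lines: each sound source is an object with a fixed position in the scene, and its apparent direction is recomputed from the viewer's head rotation. This is a deliberately simplified mono-to-stereo illustration, not Hobo's actual toolchain; the function names are invented for this sketch:

```python
import math

def relative_azimuth(object_azimuth_deg, head_yaw_deg):
    """Angle of a fixed sound object relative to where the viewer
    is currently facing, wrapped to (-180, 180]."""
    rel = (object_azimuth_deg - head_yaw_deg) % 360.0
    return rel - 360.0 if rel > 180.0 else rel

def stereo_pan_gains(rel_azimuth_deg):
    """Constant-power pan: map a relative azimuth to left/right gains."""
    # Clamp to the frontal arc for this simple stereo example.
    az = max(-90.0, min(90.0, rel_azimuth_deg))
    theta = math.radians((az + 90.0) / 2.0)   # 0..90 degrees
    return math.cos(theta), math.sin(theta)   # (left, right)

# A sound object 30 degrees to the viewer's left; the viewer turns to face it.
print(relative_azimuth(-30.0, 0.0))    # -30.0
print(relative_azimuth(-30.0, -30.0))  # 0.0 (now dead ahead)
```

Real object-based mixes render to ambisonic or binaural output and track elevation and distance as well, but the core idea is the same: the object stays put, and the listener's rotation drives the pan.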

Howard Bowler pictured embracing VR.

How different is your workflow — traditional broadcast or spot work versus VR/360?
Mackey: The VR/360 workflow isn’t much different than traditional spot work. It’s the testing and review that is a game changer. Things generally can’t be reviewed live unless you have a custom rig that runs its own headset. It’s a lot of trial and error in checking the mixes, sound design and spatial mixes. You also have to take into account the extra time and instruction for your clients to review a project.

What has surprised you the most about working in this new realm?
Bowler: The great thing about the VR/360 space is the amount of opportunity there is. What surprised us the most is the passion of all the companies that are venturing into this area. It’s different than talking about conventional film or advertising; there’s a new spark, and it’s fueling the rise of the industry and allowing larger companies to connect with smaller ones to create an atmosphere where passion is the only thing that counts.

What tools are you using for this type of work?
Mackey: The audio tools we use are the ones that best fit into our Avid Pro Tools workflow. This includes plug-ins from G-Audio and others that we are experimenting with.

Can you talk about some recent projects?
Bowler: We’ve completed projects for Samsung with East Coast Digital, and there are more on the way.

Main Image: Howard Bowler and Jon Mackey

Ed Koenig returns to MPC to help lead remote color, VFX services  

Ed Koenig has rejoined visual effects and post production studio MPC as an executive producer. Formerly EP of color at MPC’s Los Angeles office, he was one of the first hires the company made when it opened there in 2008. Koenig brings a broad range of post experience to his new role, where he’s tasked with continuing to grow the studio’s network of official partner facilities and expand its remote services beyond color grading. He will be based in New York, but will work out of LA as well.

MPC has also announced new additions to its official partner facility roster: 11 Dollar Bill in Chicago and Hero Post in Atlanta. They join Charlieuniformtango in Austin and Dallas, The Work in Detroit and Ditch in Minneapolis on the studio’s list of partner facilities.

“We’re not merely looking to connect with top performers in major markets around the country,” Koenig explains, “but to redefine how these independent companies work with a studio as multifaceted as ours. I’ll also be functioning as a kind of roving advance scout for MPC, finding ways clients anywhere can take full advantage of what we have to offer in a way that works best for them.”

He cites as an example the work they’ve done with Ditch since adding them to the partner roster last year. MPC has not only provided several of the boutique’s clients with high-end color grading, performed by colorists in both its LA and New York offices, but also compositing, finishing, Flame work and a range of other VFX services, all performed by artists based many miles from the Twin Cities. “In these instances, we just point our signal toward Minneapolis and we’re collaborating with Ditch owner and editor Brody Howard and his entire team.”

“A big part of what I’m doing is bringing wide-ranging projects into MPC through our remote partners,” he continues. “While remote color has become an accepted part of the post production mix, our push to expand into a broader roster of visual effects capabilities puts us out in front.”

Creating a sonic world for The Zookeeper’s Wife

By Jennifer Walden

Warsaw, Poland, 1939. The end of summer brings the beginning of war as 140 German planes, Junkers Ju-87 Stukas, dive-bomb the city. At the Warsaw Zoo, Dr. Jan Żabiński (Johan Heldenbergh) and his wife Antonina Żabiński (Jessica Chastain) watch as their peaceful sanctuary crumbles: their zoo, their home and their lives are invaded by the Nazis. Powerless to fight back openly, the zookeeper and his wife join the Polish resistance. They transform the zoo from an animal sanctuary into a place of sanctuary for the people they rescue from the Warsaw Ghetto.

L-R: Anna Behlmer, Terry Porter and Becky Sullivan.

Director Niki Caro’s film The Zookeeper’s Wife — based on Antonina Żabińska’s true account written by Diane Ackerman — presents a tale of horror and humanity. It’s a study of contrasts, and the soundtrack matches that, never losing the thread of emotion among the jarring sounds of bombs and planes.

Supervising sound editor Becky Sullivan, at the Technicolor at Paramount sound facility in Los Angeles, worked closely with re-recording mixers Anna Behlmer and Terry Porter to create immersive soundscapes of war and love. “You have this contrast between a love story of the zookeeper and his wife and their love for their own people and this horrific war that is happening outside,” explains Porter. “It was a real challenge in the mix to keep the war alive and frightening and then settle down into this love story of a couple who want to save the people in the ghettos. You have to play the contrast between the fear of war and the love of the people.”

According to Behlmer, the film’s aerial assault on Warsaw was entirely fabricated in post sound. “We never see those planes, but we hear those planes. We created the environment of this war sonically. There are no battle sequence visual effects in the movie.”

“You are listening to the German army overtake the city even though you don’t really see it happening,” adds Sullivan. “The feeling of fear for the zookeeper and his wife, and those they’re trying to protect, is heightened just by the sound that we are adding.”

Sullivan, who earned an Oscar nom for sound editing director Angelina Jolie’s WWII film Unbroken, had captured recordings of actual German Stukas and B24 bomber planes, as well as 70mm and 50mm guns. She found library recordings of the Stuka’s signature Jericho siren. “It’s a siren that Germans put on these planes so that when they dive-bombed, the siren would go off and add to the terror of those below,” explains Sullivan. Pulling from her own collection of WWII plane recordings, and using library effects, she was able to design a convincing off-screen war.

One example of how Caro used sound and clever camera work to effectively create an unseen war was during the bombing of the train station. Behlmer explains that the train station is packed with people crying and sobbing. There’s an abundance of activity as they hustle to get on the arriving trains. The silhouette of a plane darkens the station. Everyone there is looking up. Then there’s a massive explosion. “These actors are amazing because there is fear on their faces and they lurch or fall over as if some huge concussive bomb has gone off just outside the building. The people’s reactions are how we spotted explosions and how we knew where the sound should be coming from because this is all happening offstage. Those were our cues, what we were mixing to.”

“Kudos to Niki for the way she shot it, and the way she coordinated these crowd reactions,” adds Porter. “Once we got the soundscape in there, you really believe what is happening on-screen.”

The film was mixed in 5.1 surround on Stage 2 at the Technicolor Paramount lot. Behlmer (who mixed effects/Foley/backgrounds) used the Lexicon 960 reverb during the train station scene to put the plane sounds into that space. Using the LFE channel, she gave the explosions an appropriate impact — punchy, but not overly rumbly. “We have a lot of music as well, so I tried really hard to keep the sound tight, to be as accurate as possible with that,” she says.

ADR
Another feature of the train station’s soundscape is the amassed crowd. Since the scene wasn’t filmed in Poland, the crowd’s verbalizations weren’t in Polish. Caro wanted the sound to feel authentic to the time and place, so Sullivan recorded group ADR in both Polish and German to use throughout the film. For the train station scene, Sullivan built a base of ambient crowd sounds and layered in the Polish loop group recordings for specificity. She was also able to use non-verbal elements from the production tracks, such as gasps and groans.

Additionally, the group ADR played a big part in the scenes at the zookeeper’s house. The Nazis have taken over the zoo and are using it for their own purposes. Each day their trucks arrive early in the morning. German soldiers shout to one another. Sullivan had the German ADR group perform with a lot of authority in their voices, to add to the feeling of fear. During the mix, Porter (who handled the dialogue and music) fit the clean ADR into the scenes. “When we’re outside, the German group ADR plays upfront, as though it’s really their recorded voices,” he explains. “Then it cuts to the house, and there is a secondary perspective where we use a bit of processing to create a sense of distance and delay. Then when it cuts to downstairs in the basement, it’s a totally different perspective on the voices, which sounds more muffled and delayed and slightly reverberant.”

One challenge of the mix and design was to make sure the audience knew the location of a sound by the texture of it. For example, the off-stage German group ADR used to create a commotion outside each morning had a distinct sonic treatment. Porter used EQ on the Euphonix System 5 console, and reverb and delay processing via Avid’s ReVibe and Digidesign’s TL Space plug-ins to give the sounds an appropriate quality. He used panning to articulate a sound’s position off-screen. “If we are in the basement, and the music and dialogue is happening above, I gave the sounds a certain texture. I could sweep sounds around in the theater so that the audience was positive of the sound’s location. They knew where the sound was coming from. Everything we did helped the picture show location.”
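As a rough illustration of the kind of perspective treatment described above (not the actual ReVibe/TL Space chains used on the film), distance and "through the floor" muffling are commonly approximated with three ingredients: a level drop, a short delay, and a low-pass filter. A minimal mono sketch, with invented parameter values:

```python
import math

def distance_perspective(samples, sample_rate=48000, gain=0.5,
                         delay_ms=15.0, cutoff_hz=1200.0):
    """Crude 'heard from the next room' perspective: attenuate,
    delay, and low-pass (one-pole) a mono signal. Illustrative only."""
    delay = int(sample_rate * delay_ms / 1000.0)
    # Level drop plus a short pre-delay of silence.
    delayed = [0.0] * delay + [s * gain for s in samples]
    # One-pole low-pass: y[n] = y[n-1] + a * (x[n] - y[n-1])
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    out, y = [], 0.0
    for x in delayed:
        y += a * (x - y)
        out.append(y)
    return out

dry = [1.0] * 100                 # a short unit step as a stand-in signal
wet = distance_perspective(dry)   # quieter, later, and duller than the dry feed
```

A real dub-stage chain adds reverberation on top of this, and rides all three parameters by perspective cut, but the texture logic is the same: farther away means quieter, later and duller.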

Porter’s treatment also applied to diegetic music. In the film, the zookeeper’s wife Antonina would play the piano as a cue to those below that it was safe to come upstairs, or as a warning to make no sound at all. “When we’re below, the piano sounds like it’s coming through the floor, but when we cut to the piano it had to be live.”

Sound Design
On the design side, Sullivan helped to establish the basement location by adding specific floor creaks, footsteps on wood, door slams and other sounds to tell the story of what’s happening overhead. She layered her effects with Foley provided by artist Geordy Sincavage at Sinc Productions in Los Angeles. “We gave the lead German commander Lutz Heck (Daniel Brühl) a specific heavy boot on wood floor sound. His authority is present in his heavy footsteps. During one scene he bursts in, and he’s angry. You can feel it in every footstep he takes. He’s throwing doors open and we have a little sound of a glass falling off of the shelf. These little tiny touches put you in the scene,” says Sullivan.

While the film often feels realistic, there were stylized, emotional moments. Picture editor David Coulson and director Caro juxtapose images of horror and humanity in a sequence that shows the Warsaw Ghetto burning while those lodged at the zookeeper’s house hold a Seder. Edits between the two locations are laced together with sounds of the Seder chanting and singing. “The editing sounds silky smooth. When we transition out of the chanting on-camera, then that goes across the cut with reverb and dissolves into the effects of the ghetto burning. It sounds continuous and flowing,” says Porter. The result is hypnotic, agree Behlmer and Sullivan.

The film isn’t always full of tension and destruction. There is beauty too. In the film’s opening, the audience meets the animals in the Warsaw Zoo, and has time to form an attachment. Caro filmed real animals, and there’s a bond between them and actress Chastain. Sullivan reveals that while they did capture a few animal sounds in production, she pulled many of the animal sounds from her own vast collection of recordings. She chose sounds that had personality, but weren’t cartoony. She also recorded a baby camel, sea lions and several elephants at an elephant sanctuary in northern California.

In the film, a female elephant is having trouble giving birth. The male elephant is close by, trumpeting with emotion. Sullivan says, “The birth of the baby elephant was very tricky to get correct sonically. It was challenging for sound effects. I recorded a baby sea lion in San Francisco that had a cough and wasn’t feeling well the day we recorded. That sick sea lion sound worked out well for the baby elephant, who is struggling to breathe after it’s born.”

From the effects and Foley to the music and dialogue, Porter feels that nothing in the film sounds heavy-handed. The sounds aren’t competing for space. There are moments of near silence. “You don’t feel the hand of the filmmaker. Everything is extremely specific. Anna and I worked very closely together to define a scene as a music moment — featuring the beautiful storytelling of Harry Gregson-Williams’ score, or a sound effects moment, or a blend between the two. There is no clutter in the soundtrack and I’m very proud of that.”


Jennifer Walden is a New Jersey-based audio engineer and writer.

Comprimato plug-in manages Ultra HD, VR files within Premiere

Comprimato, makers of GPU-accelerated storage compression and video transcoding solutions, has launched Comprimato UltraPix. This video plug-in offers proxy-free, auto-setup workflows for Ultra HD, VR and more on hardware running Adobe Premiere Pro CC.

The challenge for post facilities finishing in 4K or 8K Ultra HD, or working on immersive 360 VR projects, is managing the massive amount of data. The files are large, requiring a lot of expensive storage, which can be slow and cumbersome to load, and achieving realtime editing performance is difficult.

Comprimato UltraPix addresses this, building on JPEG2000, a compression format that offers high image quality (including mathematically lossless mode) to generate smaller versions of each frame as an inherent part of the compression process. Comprimato UltraPix delivers the file at a size that the user’s hardware can accommodate.
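To see why JPEG2000 proxies are "inherent," consider its wavelet decomposition: each transform level stores a half-resolution approximation plus detail coefficients, so a decoder can stop after any level and get a ready-made smaller image. A toy one-dimensional Haar version of the idea (real JPEG2000 uses 2D 5/3 or 9/7 wavelets and quantized code-streams; this sketch only shows the multi-resolution structure):

```python
def haar_level(signal):
    """One Haar analysis step: half-resolution approximation + details."""
    approx = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return approx, detail

def multires(signal, levels):
    """Decompose repeatedly; decoding only the approximations
    yields ready-made lower-resolution proxies."""
    pyramid = []
    cur = signal
    for _ in range(levels):
        cur, det = haar_level(cur)
        pyramid.append((cur, det))
    return pyramid

line = [10.0, 12.0, 14.0, 16.0, 20.0, 24.0, 24.0, 24.0]  # one "scanline"
pyr = multires(line, 3)
print([len(a) for a, _ in pyr])   # [4, 2, 1]: half, quarter, eighth resolution
```

Because every level's approximation is already sitting in the compressed data, serving a half- or quarter-resolution frame is a matter of decoding less, not transcoding more.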

Once Comprimato UltraPix is loaded on any hardware, it configures itself with auto-setup, requiring no specialist knowledge from the editor who continues to work in Premiere Pro CC exactly as normal. Any workflow can be boosted by Comprimato UltraPix, and the larger the files the greater the benefit.

Comprimato UltraPix is a multi-platform video processing software for instant video resolution in realtime. It is a lightweight, downloadable video plug-in for OS X, Windows and Linux systems. Editors can switch between 4K, 8K, full HD, HD or lower resolutions without proxy-file rendering or transcoding.

“JPEG2000 is an open standard, recognized universally, and post production professionals will already be familiar with it as it is the image standard in DCP digital cinema files,” says Comprimato founder/CEO Jiří Matela. “What we have achieved is a unique implementation of JPEG2000 encoding and decoding in software, using the power of the CPU or GPU, which means we can embed it in realtime editing tools like Adobe Premiere Pro CC. It solves a real issue, simply and effectively.”

“Editors and post professionals need tools that integrate ‘under the hood’ so they can focus on content creation and not technology,” says Sue Skidmore, partner relations for Adobe. “Comprimato adds a great option for Adobe Premiere Pro users who need to work with high-resolution video files, including 360 VR material.”

Comprimato UltraPix plug-ins are currently available for Adobe Premiere Pro CC and Foundry Nuke and will be available on other post and VFX tools soon. You can download a free 30-day trial or buy Comprimato UltraPix for $99 a year.

Film and sound editor Dody Dorn to headline NAB SuperMeet

The 16th Annual Las Vegas SuperMeet is taking place on April 25 at the Rio Hotel in Las Vegas during the NAB Show. Oscar-nominated film and sound editor Dody Dorn will be the featured presenter.

SuperMeets are networking gatherings of Final Cut Pro, Adobe, Avid and Resolve editors, gurus and digital filmmakers. Tickets are on sale now on the SuperMeet website. Doors open at 4:30pm with the SuperMeet Digital Showcase featuring 20 software and hardware developers. SuperMeet presentations will begin at 7:00pm and continue until 11:00pm.

Dorn received an Oscar nomination for Christopher Nolan’s debut feature, Memento (along with nominations for an AFI Film Award and the ACE Eddie Award for her editing). That same year, Dorn earned Emmy and Eddie Award noms for her work on the ABC miniseries, Life With Judy Garland: Me and My Shadows.

Throughout the 1980s, she worked mostly in the sound arena, with additional supervising and sound editing credits that include Silverado, The Big Chill, Mrs. Soffel, Racing With the Moon, The Big Easy and Children of a Lesser God.

Dorn started the sound company Sonic Kitchen in 1989 with sound designer/composer Blake Leyh, and, in 1990, won a Golden Reel Award for Best Sound from the Motion Picture Sound Editors Society for James Cameron’s The Abyss.

Following her work on Memento, Dorn reunited with filmmaker Nolan on his next feature project, Insomnia. She then began a collaboration with Ridley Scott, editing his next three films — Matchstick Men, Kingdom of Heaven and A Good Year.

Dorn most recently completed work on 2017’s Power Rangers.

Review: Blackmagic’s DaVinci Resolve Mini Panel

By Brady Betzel

If you’ve never used a color correction panel like the Tangent Element, Tangent Ripple, Avid Artist Color, or been fortunate enough to touch the super high-end FilmLight Blackboard 2, Blackmagic Advanced Panel or the Nucoda Precision Control Panel, then you don’t know what you are missing.

If you can, reach out to someone at a post house and sit at a real color correction console; it might change your career path. I’ve talked about it before, but the first time I sat in a “real” (a.k.a. expensive) color correction/editing bay I knew that I was on the right career path.

Color correction can be done without using color correction panels, but think of it like typing with one hand (maybe even one finger) — sure it can be done, but you are definitely missing out on the creative benefit of fluidity and efficiency.

In terms of affordable external color correction panels, Tangent makes the Ripple, Wave and Element panel sets that range from $350 to over $3,300, but work with pretty much every color correction app I can think of (even Avid if you use the Baselight plug-in). Avid offers the Artist Color panel, which also works with many apps, including Avid Media Composer, and costs about $1,300. Beyond those two, you have the super high-end panels that I mentioned earlier; they range from $12,000 to $29,999.

Blackmagic recently added two new offerings to their pool of color correction panel hardware: the DaVinci Resolve Micro Panel and DaVinci Resolve Mini Panel. The Micro is similar in size and functionality to the Avid Artist panel, and the Mini is similar to the center part of most high-end color correction panels.

One important caveat to keep in mind is that these panels only work with Blackmagic’s Resolve, and Resolve must be updated to at least version 12.5.5. They connect to your computer via USB-C (USB 3.0) or Ethernet.

I received the Resolve Mini Panel to try out for a couple of weeks, and immediately loved it. If you’ve been lucky enough to use a high-end color correction panel like Blackmagic’s Advanced Panel, then you will understand just how great it feels to control Resolve with hardware. In my opinion, using hardware panels eliminates almost 90 percent of the stumbling that comes with driving color correction software from a keyboard and mouse. The Resolve Mini Panel is as close as you are going to get to a professional-level color correction hardware panel without spending $30,000.

Digging In
Out of the box, the panel feels hefty but not too heavy. It’s solid enough to sit on a desk and not have to worry about it walking around while you are using it. Of course, because I am basically a kid, I had to press all the buttons and turn all the dials before I plugged it in. They feel great: the best-feeling wheels and trackballs I’ve used on a panel in this price range. The knobs and buttons feel fine. I’m not hating on them, but I think I like the way the Tangent buttons depress better. Either way, that is definitely subjective. The metal rings and hefty trackballs are definitely on the level of the high-end color correction panels you can see in pro color bays.

Without regurgitating Blackmagic’s press release in full, I want to go over what I think really shines on this panel. I love the two five-inch LCD panels just above the main rings and trackballs. Below the LCDs and above the row of 12 knobs are eight more knobs that interact with the LCDs. Above the LCDs are eight soft buttons and a bunch of buttons that help you navigate around the node tree and jump into different modes, like qualifiers and tracking.

Something I really loved when working with the Mini Panel was adding points on a curve and adjusting those individual points. This is one of the best features of the Mini Panel, in my opinion. Little shortcuts like adding a node + circle window in one key press are great features. Directly above the trackballs and rings are RGB, All and Level buttons that can reset their respective parameters for each of the Lift, Gamma and Gain changes you’ve made. Above those are buttons like Log, Offset and Viewer — a quick way to jump into Log mode, Offset mode and full-screen Viewer mode.

The Resolve manual states that the user buttons and FX buttons will be enabled in future releases, which gets me excited about what else could be coming down the pike. NAB, maybe?

Of course, there can be improvements. I mean, it is a Version 1 product, but everything considered, Blackmagic really hit it out of the park. To see what some pros think needs to be changed and/or altered, head over to the holy grail of color correction forums: Lift Gamma Gain. You’ll even notice some Blackmagic folks sniffing around, answering questions and hinting at what is coming in some updates. In addition, Blackmagic has their own forum, where an interesting post popped up titled DaVinci Mini Panel Suggestion Box. That thread is also worth hanging around.

Wishlist/Suggestions
When I would exit Resolve, the LCDs didn’t dim or go into screensaver mode like some other panels I’ve used. Furthermore, there isn’t a dimmer for the brightness of the LCD screens and backlit buttons. In the future, I would love the ability to dim or completely shut off the panels when I am in other apps or presenting to a client and don’t want the panel glowing. The backlit keys aren’t terribly bright though, so it’s not a huge deal.

While in the forums, I did notice posts about the panel’s inability to do NLE-style transport control: double tapping fast forward to go faster. A jog wheel might also be a nice transport addition for scrubbing. Among the node shortcut buttons, I couldn’t find an easy way to delete a node or add an outside node directly from the panel. On other panels, I love moving shapes/windows around using the trackballs, but unfortunately you can only move/adjust the windows with the knobs, which isn’t terrible but is definitely less natural than using the trackballs. Lastly, I kind of miss the ability to set and load memories from a panel; with the Mini Panel we don’t have that option… yet. Maybe it will come in an update, since there are buttons with numbers on them, but who knows.

Mini and Micro Panel
Technically, the Mini Panel is the Micro Panel with the addition of the top LCDs and buttons. It can also connect not just via USB-C but also via Ethernet. If connecting via Ethernet, there has been some talk of power over Ethernet (PoE) compatibility, which powers the panel without the need for a power cable. Some folks have had less success with standard PoE, but have had success using PoE+ appliances — something to keep in mind.

Both the Micro and Mini Panels have the standard three trackballs and rings, 12 control knobs and 18 keys hard coded for specific tasks and transport controls. In addition, the Mini Panel has two 5-inch screens, eight additional soft buttons, eight additional soft knobs and 30 additional hard-coded buttons that focus on node navigation and general mode navigation.

Both the Micro and Mini Panels are powered via USB-C, but the Mini Panel also adds the PoE connection mentioned earlier, as well as a 4-pin XLR DC power connection. Something to note: when I received the Mini Panel, I thought my test unit might be missing a power cable from the box, but upon more forum reading I found that the Mini Panel does not ship with one. While Blackmagic does ship a USB 3.0 to USB-C adapter cable with the Mini and Micro Panels, they do not ship a power cable, which is an odd oversight, but since the panels are affordable I guess it’s not that big of a deal. Plus, if you are a post nerd like me, you probably have a few 5-15 to C13 power cables lying around the house.

I can’t shake the feeling that Blackmagic is going to be adding some additional external panels to piece together something like the Advanced Panel set-up (much like how the Tangent Element panel set can be purchased). Things like an external memory bank or an X-Keys type set-up seem not too far off for Blackmagic. I would even love to be able to turn the LCD screens into scopes if possible, and even hook up an Ultrascope via the panel so I don’t have to purchase additional hardware. Either way, the Mini Panel gets me real excited about the path Blackmagic is carving for their Resolve users.

Summing Up
In the end, if you are a professional colorist looking for a semi-portable panel and haven’t committed to the Tangent Element ecosystem yet, the Resolve Mini Panel is for you… and your credit card. The Mini Panel is as close to a high-end color correction panel as I have seen, and it has a wallet-friendly retail price of $2,995. It is very solid and doesn’t feel like a substitute for a full-sized panel — it can hold its own.

One thing I was worried about when I began writing this review was questioning whether or not tying myself down to one piece of software was a good idea. When you invest in the Mini Panel, you are wholeheartedly dedicating yourself to DaVinci Resolve, and I think that is a safe bet.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Rick Anthony named GM of Light Iron New York

Post company Light Iron has named Rick Anthony to the newly created role of general manager in its New York facility. The addition comes after Light Iron added a second floor in 2016, tripling its inventory of editorial suites.

Anthony previously held GM roles at Pac Lab and New York Lab/Postworks/Moving Images, overseeing teams from lab through digital workflows. He began his career at New York film lab DuArt, where he was a technical supervisor for many years.

Anthony notes several reasons why he joined Light Iron, a Panavision company. “From being at the forefront of color science and workflow to providing bi-coastal client support, this is a unique opportunity. Working together with Panavision, I look forward to serving the dailies, editorial, and finishing needs of any production, be it feature, episodic or commercial.”

Light Iron’s New York facility offers 20 premium editorial suites from its Soho location, as well as in-house and mobile dailies services, HDR-ready episodic timing bays and a 4K DI theater. The facility recently serviced Panavision’s first US-based feature shot on the new Millennium DXL camera.

Behind the Title: Sounding Sweet audio producer/MD Ed Walker

NAME: Ed Walker

COMPANY: Sounding Sweet (@sounding_sweet)

CAN YOU DESCRIBE YOUR STUDIO?
We are a UK-based independent recording and audio production company with a recording studio in Stratford-upon-Avon, Warwickshire, and separate post production facilities in Leamington Spa. Our recording studio is equipped with the latest technology, including a 7.1 surround sound dubbing suite and two purpose-built voiceover booths, which double as Foley studios and music recording spaces when necessary. We are also fully equipped to record ADR, via Source Connect and ISDN.

WHAT’S YOUR JOB TITLE?
Audio producer, sound engineer and managing director — take your pick.

WHAT DOES THAT ENTAIL?
As we are a small business, I am very hands-on, and my responsibilities change on a daily basis. They may include pitching to new clients, liaising with existing clients, overseeing projects from start to finish and ensuring our audio deliveries as a team are over and above what the client is expecting.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Creating and implementing interactive sound into video games is a technical challenge. While I don’t write code myself, as part of working in this industry, I have had to develop a technical understanding of game development and software programming in order to communicate effectively and achieve my audio vision.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I often get the opportunity to go out and record supercars and motorbikes, as well as occasionally recording celebrity voiceovers in the studio. We work with clients both locally and globally, often working across different time zones. We are definitely not a 9-to-5 business.

WHAT’S YOUR LEAST FAVORITE?
Working through the night during crunch periods is hard. However, we understand that the main audio effort is usually applied toward the end of a project, so we are kind of used to it.

WHAT’S YOUR FAVORITE TIME OF THE DAY?
I would have to say first thing in the morning. My studio is so close to home that I get to see my family before I go to work.

IF YOU DID NOT HAVE THIS JOB WHAT WOULD YOU BE DOING INSTEAD?
If I wasn’t producing audio I would have to be doing something equally creative. I need an outlet for my thoughts and emotions, perhaps video editing or creating visual effects.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I have always loved music, as both my parents are classically trained musicians. After trying to learn lots of different instruments, I realized that I had more of an affinity with sound recording. I studied “Popular Music and Recording” at university. Later on, I realized that a lot of the music recording skills I had learned were transferable to creating sound effects for computer games.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
– BMW 7 Series launch in Bahrain — sound design
– Jaguar F-Pace launch in Bahrain — sound design
– Forza Horizon 3 for Microsoft/Playground Games — audio design
– Guitar Hero Live for Activision — audio design

Forza Horizon 3 Lamborghini

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I worked as a sound designer at Codemasters for several years, and I have very fond memories of working on Dirt 2. It sounded awesome back in 2009 in surround sound on the Xbox 360! More recently, Sounding Sweet’s work for Playground Games on Forza Horizon 3 was a lot of fun, and I am very proud of what we achieved.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT?
A portable sound recorder, an iPhone and a kettle.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Facebook, LinkedIn and Twitter

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
All kinds of music — classics, reggae, rock, electronic, the Stones, Led Zeppelin… the list is truly endless.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
My wife is half Italian, so we often visit her “homeland” to see the family. This really is the time when I get to switch off.

Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developers of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. This new version features new client Review Pages, which expand content review and sharing. In addition, the new release offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which are designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, those thoughts flow directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team or as a .csv or .xml file containing tons of additional data for further processing.
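To illustrate what such a comment export enables downstream, here is a minimal Python sketch that turns a comment .csv into frame-accurate marker data. The column names and the non-drop timecode math are assumptions for illustration, not Frame.io's documented export format:

```python
import csv
import io

def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Convert a non-drop HH:MM:SS:FF timecode string to a frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Hypothetical export: the real .csv columns may differ.
exported = io.StringIO(
    "timecode,comment,completed\n"
    "00:00:10:12,Fix color here,false\n"
    "00:01:02:00,Logo too small,true\n"
)

# Build marker records an NLE-side script could consume.
markers = [
    {"frame": timecode_to_frames(row["timecode"]), "note": row["comment"]}
    for row in csv.DictReader(exported)
]
print(markers)
```

A drop-frame variant would need the SMPTE frame-skipping rules, which is one reason a dedicated companion app is preferable to ad-hoc scripts.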

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” allow artists to visually see who left a comment and where so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when getting lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.

The importance of audio in VR

By Anne Jimkes

While some might not be aware, sound is 50 percent of the experience in VR, as well as in film, television and games. Because we can’t physically see the audio, it might not get as much attention as the visual side of the medium. But the balance and collaboration between visual and aural is what creates the most effective, immersive and successful experience.

More specifically, sound in VR can be used to ease people into the experience, what we also call “onboarding.” It can be used subtly and subconsciously to guide viewers by motivating them to look in a specific direction of the virtual world, which completely surrounds them.

In every production process, it is important to discuss how sound can be used to benefit the storytelling and the overall experience of the final project. In VR, especially the many low-budget independent projects, it is crucial to keep the importance and use of audio in mind from the start to save time and money in the end. Oftentimes, there are no real opportunities or means to record ADR after a live-action VR shoot, so it is important to give the production mixer ample opportunity to capture the best production sound possible.

Anne Jimkes at work.

This involves capturing wild lines, making sure there is time to plant and check the mics, and recording room tone. These steps are already required, albeit not always granted, on regular shoots, but they are even more important on a set where a boom operator cannot be used because of the camera’s 360-degree view. The post process is also very similar to that for TV or film, up to the point of actual spatialization. We come across similar issues of having to clean up dialogue and fill in the world through sound. What producers must be aware of, however, is that after all the necessary elements of the soundtrack have been prepared, we have to manually and meticulously place and move all the “audio objects” and various audio sources throughout the space. Whenever people decide to re-orient the video — meaning they change what is considered the initial point of facing forward, or “north” — we have to rewrite all of the information that established the location and movement of the sound, which takes time.
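The re-orientation work described above can be pictured as a transform on per-object metadata. This is a minimal sketch, assuming a hypothetical scene format where each audio object stores only an azimuth in degrees (real object-audio tools track full 3D positions and movement over time, which is why redoing it is so labor-intensive):

```python
def reorient(objects, yaw_degrees):
    """Shift every audio object's azimuth when the video's 'north' moves.

    Azimuths are in degrees, 0 = forward, wrapped to [0, 360).
    """
    return [
        {"name": o["name"], "azimuth": (o["azimuth"] - yaw_degrees) % 360}
        for o in objects
    ]

# A toy scene: dialogue straight ahead, a waterfall off to one side.
scene = [
    {"name": "dialogue", "azimuth": 0},
    {"name": "waterfall", "azimuth": 90},
]

# Re-orient "north" by 90 degrees: every object shifts accordingly.
print(reorient(scene, 90))
```

In practice each object also has elevation, distance and keyframed movement, so a late re-orientation touches far more data than this one-axis example suggests.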

Capturing Audio for VR
To capture audio for virtual reality, we have learned a lot about planting and hiding mics as efficiently as possible. Unlike on regular productions, it is not possible to use a boom mic, which tends to be the primary and most natural-sounding microphone. Aside from the more common lavalier mics, we also use ambisonic mics, which capture a full sphere of audio that matches the 360 picture — provided the mic is placed correctly on axis with the camera. Most of the time we work with Sennheiser, using their Ambeo microphone to capture 360 audio on set, after which we add the rest of the spatialized audio during post production. Playing back spatialized audio has become easier lately, because more and more platforms and VR apps accept some form of 360 audio playback. There are still differences between the file formats to which we can encode our audio outputs, meaning that some are more precise and others a little blurrier in their spatialization. Unlike the film/television workflow, VR does not yet have a standard for deliverables and specs.
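For the ambisonic bed itself, re-orienting "north" amounts to rotating the recorded channels rather than moving individual objects. A hedged sketch of a first-order B-format yaw rotation follows; the channel ordering and sign conventions differ between FuMa and AmbiX tools, so treat this as illustrative rather than a drop-in for any specific chain:

```python
import math

def rotate_bformat(w, x, y, z, yaw_radians):
    """Rotate one first-order B-format sample about the vertical axis.

    W (omnidirectional) and Z (height) are unaffected by yaw;
    the X/Y pair rotates like a 2D vector. Sign conventions vary
    between ambisonic formats -- verify against your toolchain.
    """
    c, s = math.cos(yaw_radians), math.sin(yaw_radians)
    return w, x * c - y * s, x * s + y * c, z

# A source dead ahead (directional energy entirely in X), rotated
# 90 degrees, ends up entirely in Y, i.e. off to the side.
print(rotate_bformat(0.707, 1.0, 0.0, 0.0, math.pi / 2))
```

Real tools apply this per sample across whole files (and handle pitch and roll with full rotation matrices), but the principle is the same: the sound field turns as a single rigid sphere.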

What matters most in the end is that people are aware of how the creative use of sound can enhance their experience, and how important it is to spend time on capturing good dialogue on set.


Anne Jimkes is a composer, sound designer, scholar and visual artist from the Netherlands. Her work includes VR sound design at EccoVR and work with the IMAX VR Centre. With a Master’s Degree from Chapman University, Jimkes previously served as a sound intern for the Academy of Television Arts & Sciences.