
Netflix's Stranger Things

AES LA Section & SMPTE Hollywood: Stranger Things sound

By Mel Lambert

The most recent joint AES/SMPTE meeting at the Sportsmen’s Lodge in Studio City showcased the talents of the post production crew that worked on the recent Netflix series Stranger Things at Technicolor’s facilities in Hollywood.

Over 160 attendees came to hear how supervising sound editor Brad North, sound designer Craig Henighan, sound effects editor Jordan Wilby, music editor David Klotz and dialog/music re-recording mixer Joe Barnett worked their magic on last year’s eight-episode Season One. (Sadly, effects re-recording mixer Adam Jenkins was unable to attend the gathering.) Stranger Things, from co-creators Matt Duffer and Ross Duffer, is scheduled to return in mid-year for Season 2.

L-R: Jordan Wilby, Brad North, Craig Henighan, Joe Barnett, David Klotz and Mel Lambert. Photo Credit: Steve Harvey.

Attendees heard how the crew developed each show’s unique 5.1-channel soundtrack, from editorial through re-recording — including an ‘80s-style, synth-based music score, from Austin-based composers Kyle Dixon and Michael Stein, that is key to the show’s look and feel — courtesy of a full-range surround sound playback system supplied by Dolby Labs.

“We drew our inspiration — subconsciously, at least — from sci-fi films like Alien, The Thing and Predator,” Henighan explained. The designer also revealed how he developed a characteristic sound for the monster that appears in key scenes. “The basic sound is that of a seal,” he said. “But it wasn’t as simple as just using a seal vocal, although it did provide a hook — an identifiable sound around which I could center the rest of the monster sounds. It’s fantastic to take what is normally known as a nice, light, fun-loving sound and use it in a terrifying way!” Tim Prebble, a New Zealand-based sound designer, and owner of sound effects company Hiss and A Roar, offers a range of libraries, including SD003 Seal Vocals|Hiss and A Roar.

Gear used includes Avid Pro Tools DAWs — everybody works in the box — and an Avid 64-fader, dual-operator S6 console at the Technicolor Seward Stage. The composers use Apple Logic Pro to record and edit their AAF-format music files.


Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

 

Editor Joe Walker on establishing a rhythm for Denis Villeneuve’s Arrival

By Mel Lambert

For seasoned picture editor Joe Walker, ACE, his work with directors Denis Villeneuve and Steve McQueen might best be described as “three times a charm.” His trio of successes with Villeneuve include the drug enforcement drama Sicario, the alien visitor film Arrival and the much-anticipated, upcoming sci-fi drama Blade Runner 2049, which is currently in post. His three films with McQueen include Hunger, Shame and the 2014 Oscar-winner for Best Picture 12 Years a Slave, which earned Walker a nomination for his editing work.

In addition, he has worked on a broad array of films, ranging from director Michael Mann’s cyber thriller Blackhat to writer/director Rupert Wyatt’s The Escapist to director Daniel Barber’s Harry Brown to writer/director Rowan Joffe’s Brighton Rock, which is a reworking of the Graham Greene classic.

Arrival - Paramount

We are currently in the midst of awards season, and Paramount’s Arrival recently received eight Oscar nominations, including Best Director and a Best Editing nod for Walker. The film also earned nine BAFTA nominations, including Best Picture Editing, Best Director and Best Film, and it has been nominated for an American Cinema Editors Eddie in the Best Edited Feature Film — Dramatic category. (Read our interview with director Denis Villeneuve here.)

“My approach to all the films I have edited is to find the basic ‘rhythm’ of a scene,” Walker says. His background as a sound designer and composer enhances those sensibilities, in terms of internal pacing, beat and dramatic pulse.

The editor’s path toward Villeneuve began at a 2010 screening of Incendies in his native London. “I was blown away and set my heart on working with this director. That same heart was beating out of my chest a few years later watching 2014’s Prisoners. While finishing Michael Mann’s Blackhat in 2015, my agent got me into the room with Denis for Sicario, which had a very solid script. That evolution felt like it was going in the right direction for me. Cinematographer Roger Deakins produced stunning work — he’s also cinematographer on Blade Runner 2049.” (Deakins was nominated for both Oscar and BAFTA Awards for Sicario.)

The Edit
For Arrival, Walker’s biggest challenge was reconciling the two parallel worlds that existed within the evolving dramatic arcs. While several alien spacecraft land around the world, a linguistics expert (Amy Adams) is recruited by the military to determine whether they come in peace. “On the one hand we have the natural setting of the mother/daughter relationship, with beautiful, intimate material shot by a lakeside near Montreal, and the narrative content on a far lower gas,” explains Walker. “That’s pitted against the high-tech world of space ships as we learn more about the alien visitors and the psychological task faced as the lead character tries to decode their complex written language. Without CGI visuals of the Heptapods — the multi-limb visitors — I had to make early decisions about what space to leave in a scene for their eventual movements. From what was shot on set, all we had were puppeteers holding tennis balls on a stick.”

Walker saw every Arrival daily and started his cut early. “We had to turn over the Heptapod sequences to Montreal VFX house Hybride almost as soon as the director’s cut began,” he says. “And because, for me, sound always drives a lot of what I do, I brought on creature sound designer Dave Whitehead ahead of the game. I’d been impressed by Dave’s work on [Neill Blomkamp’s] District 9. I needed to know what type of sounds would be used for the aliens, and cut accordingly. He developed a coherent language with an inbuilt syntax and really nailed the ‘character’ of the Heptapods. I laid up his sounds onto tracks in my Avid Media Composer and they stayed pretty much unchanged all the way through post.”

In terms of pace and narrative arcs, Walker states that director Villeneuve “chose to starve the audience of information and just offer intriguing nuggets, teasing out the suspense and keeping them waiting for the pay off. For example, in one scene we hold on Amy Adams’ face watching the breaking news on the TV rather than the TV show itself,” which was reporting the mysterious spacecraft touching down in 12 cities. “Forest Whitaker [US Army Colonel Weber] plays our first audio of the Heptapods on a Dictaphone and it stimulates such curiosity about how they may look or behave. We avoided any pressure of cutting for the sake of cutting. Instead, we stayed on a shot, let it play and did not do all the thinking for the audience. While editing 12 Years a Slave, we stayed on the hanging scene and didn’t cut away. There’s no relief; it allows the audience to be truly troubled by the horrible inertia of the scene.”

Again, the word “rhythm” figures prominently within Walker’s creative vocabulary. “I always try to find the rhythm of a scene — one that works with the sounds and music elements. For Sicario, I developed peaks and troughs in the dramatic flow that supported different points of view” as the audience slowly begins to understand the complexity of the drug enforcement campaign. “Bad sound disturbs me, including distorted or widely variable dialogue levels. I always work hard to get the best out of the production tracks, perhaps more than I really have time for.

“With both Steve McQueen and Denis Villeneuve, I’ve always tried to avoid using music temp tracks, so that we do not become too influenced during the editing process,” he continues. “By holding off until we’re late into a final cut, we can stay critical in our judgments about the story and characters. When brought in later, music becomes a huge bonus since you’ve already been ruthless with the story. You use music only where it’s absolutely necessary, allowing silence or sound effects to have their day. I think composers want the freedom of a blank canvas. Otherwise, as the English composer Matthew Herbert once said, ‘Music is in an abusive relationship with film.’”

Changing Direction During Edit
While cutting Arrival, Walker recalls that one key scene took a dramatic left turn. “As scripted and shot,” he explains, “the nightmare sequence started out as a normal scene in which Amy Adams’ character, Louise, is visited in her quarters by colleague Ian [Jeremy Renner] and her boss, Colonel Webber, who decides to bench her. This was the beginning of a long piece of story tubing, which felt redundant. We’d tried to discard it, but the scene had an essential piece of information that we couldn’t live without: the notion that exposure to a language can rewire your mind.

“We thought about conveying that information elsewhere as voiceover or ADR, but instead, as an experiment, we strung together very crudely only the pieces we needed, thereby creating at one point a jarring join between one line of Ian’s dialogue and another. I always try to be ballsy with material, to stay on it with confidence or maul it, to tell the story a better way.”

In that pivotal scene in Arrival, during a close-up, Adams’ character is looking off-camera toward Whitaker. “But we never cut to him because it would take us down the path we wanted to avoid,” explains Walker. “As it happened, that same day in the cutting room, we saw the first test shots from Hybride’s VFX team of an alien crawling forward, looking like an elephant shrouded in mist. That first look inspired our decision to hold onto Adams’ off-camera look for as long as we could, and then — instead of going to a matching reverse revealing Forest Whitaker — we cut to this huge alien crouching in the corner of her bedroom.

“The scene was rounded off by a shot of Amy’s character waking up and looking utterly thrown. We kept the jarring cut [from Ian and then back to him], and added the incongruous sound of a canary, since it signaled early on that all is not as it seems. A nightmare was a great way to get inside Louise’s head. Ian’s presence in her dream also platforms their romance, which enters so late in the story. Normally, returning material to a cut can feel like putting wet swimming trunks back on, but here it set our minds alight.”

Adams’ performance throughout Arrival was thrilling to cut, says Walker. “She is very real in every take and always true to character, keeping her performance at just the right temperature for each scene. Every nuance counts, particularly in a film that has to hold up to scrutiny on a second or third viewing when more is understood about the true nature of things. To hold the audience’s attention in a scene, an editor’s craft involves a balance between time and tension.”

Walker says, “Time is our superpower since we can slow a moment down, speed it up or jump from one shard of a timeline to another. In Arrival we had two parallel worlds: the real-life world of the army camp with all the news on TVs and heavy technology. In opposition is the child’s world of caterpillars and nature. I could cut those together at will and flip quickly from one to the other.”

Walker says that after the 10-week shoot for Arrival, he spent a week finalizing his editor’s cut and then 10 to 14 weeks on the director’s cut with basic CGI. “We then went through test screenings as the final photorealistic CGI elements slowly took shape,” he recalls. “We refined the film’s overall pace and rhythm and made sure that each tiny fragment of this fantastic puzzle was told as well as we could. I consider the result to be really one of the most successful edits I have been involved with.”


LA-based Mel Lambert is principal of Content Creators. He can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.


Main Image: Joe Walker and Denis Villeneuve. Photo Credit Javier Marcheselli. 

 

AES Conference focuses on immersive audio for VR/AR

By Mel Lambert

The AES Convention, which was held at the Los Angeles Convention Center in early October, attracted a broad cross section of production and post professionals looking to discuss the latest technologies and creative offerings. The convention had approximately 13,000 registered attendees and more than 250 brands showing wares in the exhibit halls and demo rooms.

Convention Committee co-chairs Valerie Tyler and Michael MacDonald, along with their team, created the comprehensive schedule of workshops, panels and special events for this year’s show. “The Los Angeles Convention Center’s West Hall was a great new location for the AES show,” said MacDonald. “We also co-located the AVAR conference, and that brought 3D audio for gaming and virtual reality into the mainstream of the AES.”

“VR seems to be the next big thing,” added AES executive director Bob Moses, “[with] the top developers at our event, mapping out the future.”

The two-day, co-located Audio for Virtual and Augmented Reality Conference was expected to attract about 290 attendees, but with aggressive marketing and outreach to the VR and AR communities, pre-registration closed at just over 400.

Aimed squarely at the fast-growing field of virtual/augmented reality audio, this conference focused on the creative process, applications workflow and product development. “Film director George Lucas once stated that sound represents 50 percent of the motion picture experience,” said conference co-chair Andres Mayo. “This conference demonstrates that convincing VR and AR productions require audio that follows the motions of the subject and produces a realistic immersive experience.”

Spatial sound that follows head orientation for headsets powered either by dedicated DSP, game engines or smartphones opens up exciting opportunities for VR and AR producers. Oculus Rift, HTC Vive, PlayStation VR and other systems are attracting added consumer interest for the coming holiday season. Many immersive-audio innovators, including DTS and Dolby, are offering variants of their cinema systems targeted at this booming consumer marketplace via binaural headphone playback.

Sennheiser’s remarkable new Ambeo VR microphone (pictured left) can be used to capture 3D sound and then post produced to prepare different spatial perspectives — a perfect adjunct for AR/VR offerings. At the high end, Nokia unveiled its Ozo VR camera, equipped with eight camera sensors and eight microphones, as an alternative to a DIY assembly of GoPro cameras, for example.
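Tetrahedral microphones like the Ambeo capture four capsule signals (“A-format”) that are matrixed into the W/X/Y/Z components of first-order ambisonics (“B-format”), which can then be rotated to follow the listener’s head orientation. As a rough sketch of the underlying idea — the classic Gerzon conversion matrix, not Sennheiser’s own A-to-B converter, which also applies per-capsule equalization:

```python
def a_to_b_format(flu, frd, bld, bru):
    """Convert one sample of tetrahedral A-format capsule signals to
    first-order ambisonic B-format (W, X, Y, Z) via the classic
    Gerzon matrix. Capsule naming: flu = front-left-up,
    frd = front-right-down, bld = back-left-down, bru = back-right-up."""
    w = flu + frd + bld + bru   # omnidirectional pressure component
    x = flu + frd - bld - bru   # front/back figure-of-eight
    y = flu - frd + bld - bru   # left/right figure-of-eight
    z = flu - frd - bld + bru   # up/down figure-of-eight
    return w, x, y, z
```

Because the B-format components are simple sums and differences, a head rotation becomes a small matrix applied to X, Y and Z before binaural rendering — which is what makes this capture format attractive for post-produced VR perspectives.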

Two fascinating keynotes bookended the AVAR Conference. The opening keynote, presented by Philip Lelyveld, VR/AR initiative program manager at the USC Entertainment Technology Center, Los Angeles, and called “The Journey into Virtual and Augmented Reality,” defined how virtual, augmented and mixed reality will impact entertainment, learning and social interaction. “Virtual, Augmented and Mixed Reality have the potential of delivering interactive experiences that take us to places of emotional resonance, give us agency to form our own experiential memories, and become part of the everyday lives we will live in the future,” he explained.

“Just as TV programming progressed from live broadcasts of staged performances to today’s very complex language of multithread long-form content,” Lelyveld stressed, “so such media will progress from the current early days of projecting existing media language with a few tweaks to a headset experience into a new VR/AR/MR-specific language that both the creatives and the audience understand.”

In his closing keynote, “Future Nostalgia, Here and Now: Let’s Look Back on Today from 20 Years Hence,” George Sanger, director of sonic arts at Magic Leap, attempted to predict where VR/AR/MR will be in two decades. “Two decades of progress can change how we live and think in ways that boggle the mind,” he acknowledged. “Twenty years ago, the PC had rudimentary sound cards; now the entire ‘multitrack recording studio’ lives on our computers. By 2036, we will be wearing lightweight portable devices all day. Our media experience will seamlessly merge the digital and physical worlds; how we listen to music will change dramatically. We live in the Revolution of Possibilities.”

According to conference co-chair Linda Gedemer, “It has been speculated by Wall Street [pundits] that VR/AR will be as game changing as the advent of the PC, so we’re in for an incredible journey!”

Mel Lambert, who also shot the photos from the show, is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Industry pros gather to discuss sound design for film and TV

By Mel Lambert

The third annual Mix Presents Sound for Film and Television conference attracted some 500 production and post pros to Sony Pictures Studios in Culver City, California, last week to hear about the art of sound design.

Subtitled “The Merging of Art, Technique and Tools,” the one-day conference kicked off with a keynote address by re-recording mixer Gary Bourgeois, followed by several panel discussions and presentations from Avid, Auro-3D, Steinberg, JBL Professional and Dolby.

L-R: Brett G. Crockett, Tom McCarthy, Gary Bourgeois and Mark Ulano.

During his keynote, Bourgeois advised, “Sound editors and re-recording mixers should be aware of the talent they bring to the project as storytellers. We need to explore the best ways of using technology to be creative and support the production.” He concluded with some more sage advice: “Do not let the geek take over! Instead,” he stressed, “show the passion we have for the final product.”

Other highlights included a “Sound Inspiration Within the Storytelling Process” panel organized by MPSE and moderated by Carolyn Giardina from The Hollywood Reporter. Panelists included Will Files, Mark P. Stoeckinger, Paula Fairfield, Ben L. Cook, Paul Menichini and Harry Cohen. The discussion focused on where sound designers find their inspiration and the paths they take to create unique soundtracks.

CAS hosted a sound-mixing panel titled “Workflow for Musicals in Film and Television Production” that focused on live recording and other techniques to give musical productions a more “organic” sound. Moderated by Glen Trew, the panel included music editor David Klotz, production mixer Phil Palmer, playback specialist Gary Raymond, production mixer Peter Kurland, re-recording mixer Gary Bourgeois and music editor Tim Boot.

Sound Inspiration Within the Storytelling Process panel (L-R): Will Files, Ben L. Cook, Mark P. Stoeckinger, Carolyn Giardina, Harry Cohen, Paula Fairfield and Paul Menichini.

Sponsored by Westlake Pro, a panel called “Building an Immersive Room: Small, Medium and Large” covered basic requirements of system design and setup — including console/DAW integration and monitor placement — to ensure that soundtracks translate to the outside world. Moderated by Westlake Pro’s CTO, Jonathan Deans, the panel was made up of Bill Johnston from Formosa Group, Nathan Oishi from Sony Pictures Studios, Jerry Steckling of JSX, Brett G. Crockett from Dolby Labs, Peter Chaikin from JBL and re-recording mixers Mark Binder and Tom Brewer.

Avid hosted a fascinating panel discussion called “The Sound of Stranger Things,” which focused on the soundtrack for the Netflix original series, with its signature sound design and ‘80s-style, synthesizer-based music score. Moderated by Avid’s Ozzie Sutherland, the panel included sound designer Craig Henighan, SSE Brad North, music editor David Klotz and sound effects editor Jordan Wilby. “We drew our inspiration from such sci-fi films as Alien, The Thing and Predator,” Henighan said. Re-recording mixers Adam Jenkins and Joe Barnett joined the discussion via Skype from the Technicolor Seward stage.

The Barbra Streisand Scoring Stage.

A stand-out event was the Production Sound Pavilion held on the Barbra Streisand Scoring Stage, where leading production sound mixers showed off their sound carts, with manufacturers also demonstrating wireless, microphone and recorder technologies. “It all starts on location, with a voice in a microphone and a clean recording,” offered CAS president Mark Ulano. “But over the past decade production sound has become much more complex, as technologies and workflows evolved both on-set and in post production.”

Sound carts on display included Tom Curley’s Sound Devices 788t recorder and Sound Devices CL9 mixer combination; Michael Martin’s Zaxcom Nomad 12 recorder and Zaxcom Mix-8 mixer; Danny Maurer’s Sound Devices 664 recorder and Sound Devices 633 mixer; Devendra Cleary’s Sound Devices 970, Pix 260i and 664 recorders with Yamaha 01V and Sound Devices CL-12 mixers; Charles Mead’s Sound Devices 688 recorder with CL-12 mixer; James DeVotre’s Sound Devices 688 recorder with CL-12 Alaia mixer; Blas Kisic’s Boom Recorder and Sound Devices 788 with Mackie Onyx 1620 mixer; Fernando Muga’s Sound Devices 788 and 633 recorders with CL-9 mixer; Thomas Cassetta’s Zaxcom Nomad 12 recorder with Zaxcom Oasis mixer; Chris Howland’s Boom Recorder, Sound Devices and 633 recorders, with Mackie Onyx 1620 and Sound Devices CL-12 mixers; Brian Patrick Curley’s Sound Devices 688 and 664 recorders with Sound Devices CL-12 Alaia mixer; Daniel Powell’s Zoom F8 recorder/mixer; and Landon Orsillo’s Sound Devices 688 recorder.

Lon Neumann

CAS also organized an interesting pair of Production Sound Workshops. During the first one, consultant Lon Neumann addressed loudness control with an overview of loudness levels and surround sound management of cinema content for distribution via broadcast television.

The second presentation, hosted by Bob Bronow (production mixer on Deadliest Catch) and Joe Foglia (Marley & Me, Scrubs and From the Earth to the Moon), covered EQ and noise reduction in the field. While it was conceded that, traditionally, any type of signal processing on location is strongly discouraged — such decisions normally being handled in post — the advent of multitrack recording and isolated channels means that it is becoming more common for mixers to use processing on the dailies mix track.

New for this year was a Sound Reel Showcase that featured short samples from award-contending and to-be-released films. The audience in the Dolby Atmos- and Auro 3D-equipped William Holden Theatre was treated to a high-action sequence from Mel Gibson’s new film, Hacksaw Ridge, which is scheduled for release on November 4. It follows the true story of a WWII army medic who served during the harrowing Battle of Okinawa and became the first conscientious objector to be awarded the Medal of Honor. The highly detailed Dolby Atmos soundtrack was created by SSE/sound designer/recording mixer Robert Mackenzie working at Sony Pictures Studios with dialogue editor Jed M. Dodge and ADR supervisor Kimberly Harris, with re-recording mixers Andy Wright and Kevin O’Connell.

Mel Lambert is principal of Content Creators, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

All photos by Mel Lambert.

 

AES Paris: A look into immersive audio, cinematic sound design

By Mel Lambert

The Audio Engineering Society (AES) came to the City of Light in early June with a technical program and companion exhibition that attracted close to 2,600 pre-registrants, including some 700 full-pass attendees. “The Paris International Convention surpassed all of our expectations,” AES executive director Bob Moses told postPerspective. “The research community continues to thrive — there was great interest in spatial sound and networked audio — while the business community once again embraced the show, with a 30 percent increase in exhibitors over last year’s show in Warsaw.” Moses confirmed that next year’s European convention will be held in Berlin, “probably in May.”

Tom Downes

Getting Immersed
There were plenty of new techniques and technologies targeting the post community. One presentation, in particular, caught my eye, since it posed some relevant questions about how we perceive immersive sound. In the session, “Immersive Audio Techniques in Cinematic Sound Design: Context and Spatialization,” co-authors Tom Downes and Malachy Ronan — both of whom are AES student members currently studying at the University of Limerick’s Digital Media and Arts Research Center, Ireland — questioned the role of increased spatial resolution in cinematic sound design. “Our paper considered the context that prompted the use of elevated loudspeakers, and examined the relevance of electro-acoustic spatialization techniques to 3D cinematic formats,” offered Downes. The duo brought with them a scene from writer/director Wolfgang Petersen’s submarine classic, Das Boot, to illustrate their thesis.

Using the university’s Spatialization and Auditory Display Environment (SpADE) linked to an Apple Logic Pro 9 digital audio workstation and a 7.1.4 playback configuration — with four overhead speakers — the researchers correlated visual stimuli with audio playback. (A 7.1-channel horizontal playback format was determined by the DAW’s I/O capabilities.) Different dynamic and static timbre spatializations were achieved by using separate EQ plug-ins assigned to horizontal and elevated loudspeaker channels.

“Sources were band-passed and a 3dB boost applied at 7kHz to enhance the perception of elevation,” Downes continued. “A static approach was used on atmospheric sounds to layer the soundscape using their dominant frequencies, whereas bubble sounds were also subjected to static timbre spatialization; the dynamic approach was applied when attempting to bridge the gap between elevated and horizontal loudspeakers. Sound sources were split, with high frequencies applied to the elevated layer, and low frequencies to the horizontal layer. By automating the parameters within both sets of equalization, a top-to-bottom trajectory was perceived. However, although the movement was evident, it was not perceived as immersive.”
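The core equalization move described here — a modest peaking boost around 7kHz on the elevated loudspeaker channels to strengthen the perception of height — can be sketched with a standard biquad peaking filter. This is a minimal illustration of the technique, assuming the widely used RBJ audio-EQ-cookbook coefficients rather than the specific plug-ins used on SpADE:

```python
import math

def peaking_biquad(fs, f0, gain_db, q=1.0):
    """RBJ-cookbook peaking-EQ biquad coefficients (b0, b1, b2, a1, a2),
    normalized so that a0 = 1."""
    a = 10 ** (gain_db / 40.0)          # amplitude parameter
    w0 = 2 * math.pi * f0 / fs          # center frequency in radians
    alpha = math.sin(w0) / (2 * q)
    cosw = math.cos(w0)
    b0, b1, b2 = 1 + alpha * a, -2 * cosw, 1 - alpha * a
    a0, a1, a2 = 1 + alpha / a, -2 * cosw, 1 - alpha / a
    return b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0

def filter_samples(samples, coeffs):
    """Apply the biquad (direct form I) to a list of samples."""
    b0, b1, b2, a1, a2 = coeffs
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out

# Elevated-layer channel: +3dB peak at 7kHz, as described in the paper
coeffs = peaking_biquad(fs=48000, f0=7000, gain_db=3.0)
```

Automating `gain_db` (and a complementary low/high band split between the horizontal and elevated layers) over time is what produces the dynamic top-to-bottom trajectory the researchers describe.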

The paper concluded that although multi-channel electro-acoustic spatialization techniques are seen as a rich source of ideas for sound designers, without sufficient visual context they are limited in the types of techniques that can be applied. “Screenwriters and movie directors must begin to conceptualize new ways of utilizing this enhanced spatial resolution,” said Downes.

Rich Nevens


Tools
Merging Technologies demonstrated immersive-sound applications for the v.10 release of Pyramix DAW software, with up to 30.2-channel routing and panning, including compatibility with Barco Auro, Dolby Atmos and other surround formats, without the need for additional plug-ins or apps. Avid, meanwhile, showcased additions for the modular S6 Assignable Digital Console, including a Joystick Panning Module and a new Master Film Module with PEC/DIR switching.

“The S6 offers improved ergonomics,” explained Avid’s Rich Nevens, director of worldwide pro audio solutions, “including enhanced visibility across the control surface, and full Ethernet connectivity between eight-fader channel modules and the Pro Tools DSP engines.” Reportedly, more than 1,000 S6 systems have been sold worldwide since its introduction in December 2013, including two recent installations at Sony Pictures Studios in Culver City, California.

Finally, Eventide came to the Paris AES Convention with a remarkable new multichannel/multi-element processing system that was demonstrated by invitation only to selected customers and distributors; it will be formally introduced during the upcoming AES Convention in Los Angeles in October. Targeted at film/TV post production, the rackmount device features 32 inputs and 32 discrete outputs per DSP module, thereby allowing four multichannel effects paths to be implemented simultaneously. A quartet of high-speed ARM processors mounted on plug-in boards can be swapped out when more powerful DSP chips become available.

Joe Bamberg and Ray Maxwell


“Initially, effects will be drawn from our current H8000 and H9 processors — with other EQ, dynamics plus reverb effects in development — and can be run in parallel or in series, to effectively create a fully-programmable, four-element channel strip per processing engine,” explained Eventide software engineer Joe Bamberg.

“Remote control plug-ins for Avid Pro Tools and other DAWs are in development,” said Eventide’s VP of sales and marketing, Ray Maxwell. The device can also be used via a stand-alone application for Apple iPad tablets or Windows/Macintosh PCs.

Multi-channel I/O and processing options will enable object-based EQ, dynamics and ambience processing for immersive-sound production. End-user pricing for the codenamed product, which will also feature Audinate Dante, Thunderbolt, Ravenna/AES67 and AVB networking, has yet to be announced.

Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Media Toaster creates efficient QC, post workflow for film, TV

By Mel Lambert

“We have developed a two-tier solution that expedites QC on one end, while enhancing and simplifying asset management and delivery on the other,” says Michael Meis, chief technology officer at Media Toaster, a recently opened multi-room post facility in Burbank. The studio has developed an innovative business model using proprietary technologies — including QiFile and MonsterFile — to speed up media review, approval, archival and delivery processes.

Meis was joined earlier this year by long-time collaborator Michael DeFusco, who is director of post production. The two met while working at Crest Digital for several years; later DeFusco moved on to Post Logic and then Sony Pictures, where he again worked with Meis.

L-R: Michael Meis and Mike DeFusco

The standard QC (quality control) model is to send a printed report to the client, who then has to either search through a DVD or individual clips in order to make a decision about their material. “The QiFile is a way of enhancing the critical QC process by embedding an entire quality-control report — complete with suggested changes — into a relatively small HD file,” DeFusco explains. “This, in turn, enables the client to make informed decisions in a timely manner. Turnaround times are further improved by setting up a secured, easy-to-use virtual desktop where clients can play and download the QiFile [or other media] directly from our production server to [the client’s] computer or mobile device; the client can then work as efficiently as if they were accessing the file from within our facility.”

“We QC content from a variety of sources that is destined for delivery via a number of outlets and formats,” adds Meis. “Since we only have between 24 and 48 hours to perform our critical quality-control services, this proprietary process has noticeably increased our efficiency and throughput.”

By way of an example, Meis and DeFusco cite ongoing projects with Starz Entertainment. “Our embedded QC reports accompany the media files throughout the process and can be accessed by our operators and remote clients,” says DeFusco. The facility currently handles QC and media delivery for Starz’ Black Sails, Ash vs Evil Dead and other offerings.

What makes the process unique, the collaborators say, is that most archival and delivery workflows are limited by the number of available tracks. “MonsterFile has the capacity to hold an unlimited number of audio and video tracks,” says DeFusco. “Also, repurposing is typically done through various departments and by different operators.” With their processes, many tasks “— including transcodes, conversions, captions, pitch-correction, audio compliance and final delivery to anywhere in the world — can be quickly completed by one operator who never needs to leave his workstation,” reports Meis. “All of which saves our clients time and money.”

Media Toaster’s control room and Mike DeFusco and Michael Meis at work.

Media Toaster uses industry-standard Aspera file-transfer technology and MPAA-sanctioned firewalls to ensure high-speed access across multiple data networks. “We use high-speed Fibre Channel interconnects and a 10GbE intranet,” explains Meis. “Aspera’s software solution lets us move data at maximum speed, regardless of file size, transfer distance or network conditions.”

The facility operates a total of six post/QC suites, staffed by close to a dozen operators and support personnel. Apple Final Cut Pro X is the primary picture-editing tool, with additional support for Adobe Premiere Pro and Avid Media Composer. Staff call on industry-standard Avid Pro Tools for audio deliveries. The in-house server infrastructure uses about 0.5PB of raw Promise Technology storage, with dual-band fiber-optic wiring and 10GbE speeds to move data around the facility.

Media Toaster offers a range of services “from new content creation, minor picture and sound tweaks, all the way up to complete overhaul or digital distribution, including 4K/UHD and DCP creation,” says Meis. Other services include picture and sound editorial, color grading, voiceover, music scoring, ADR and Foley.

Marc Vanocur

Video and broadcast material are only a part of Media Toaster’s offerings. For independent film productions, the company provides all the post services for modest-budget motion pictures. For Aristar Entertainment and Incendiary Features’ Dead Awake, directed by Phillip Guzman and written by Jeffrey Reddick (Final Destination), co-executive producer/producer Galen Walker opted to use Media Toaster for a variety of key post functions. Peter Devaney served as picture editor, with Jussi Tegelman as sound supervisor.

“Marc Vanocur of Shout|Softly has relocated to our facility, providing additional services. Marc brings a production services component with camera, grip and lighting, and a large suite with color and finishing and full music-scoring capabilities. It is a highly collaborative effort that’s saving us time and money; we have shaved maybe six weeks off our post schedule. The QiFile has been the key to our tracking processes through the film’s completion.”

Mel Lambert is principal of Content Creators, an LA-based editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Mark Mangini keynotes The Art of Sound Design at Sony Studios

Panels focus on specifics of music, effects and dialog sound design, and immersive soundtracks

By Mel Lambert

Defining a sound designer as somebody “who uses sound to tell stories,” Mark Mangini, MPSE, was adamant that “sound editors and re-recording mixers should be authors of a film’s content, and take creative risks. Art doesn’t get made without risk.”

A sound designer/re-recording mixer at Hollywood’s Formosa Group Features, Mangini outlined his sound design philosophy during a keynote speech at the recent The Art of Sound Design: Music, Effects and Dialog in an Immersive World conference, which took place at Sony Pictures Studios in Culver City.

Mangini has received three Academy Award nominations, for The Fifth Element (1997), Aladdin (1992) and Star Trek IV: The Voyage Home (1986).

Acknowledging that an immersive soundtrack should fully engage the audience, Mangini outlined two ways to achieve that goal. “Physically, we can place sound around an audience, but we also need to engage them emotionally with the narrative, using sound to tell the story,” he explained to the 500-member audience. “We all need to better understand the role that sound plays in the filmmaking process. For me, sound design is storytelling — that may sound obvious, but it’s worth reminding ourselves on a regular basis.”

While an understanding of the tools available to a sound designer is important, Mangini readily concedes, “Too much emphasis on technology keeps us out of the conversation; we are just seen as technicians. Sadly, we are all too often referred to as ‘The Sound Guy.’ How much better would it be for us if the director asked to speak with the ‘Audiographer,’ for example. Or the ‘Director of Sound’ or the ‘Sound Artist?’ — terms that better describe what we actually do? After all, we don’t refer to a cinematographer as ‘The Image Guy.’”

Mangini explained that he always tries to emphasize the why and not the how, and is not tempted to imitate somebody else’s work. “After all, when you imitate you ensure that you will only be ‘almost’ as good as the person or thing you imitate. To understand the ‘why,’ I break down the script into story arcs and develop a sound script so I can reference the dramatic beats rather than the visual cues, and articulate the language of storytelling using sound.”

Past Work
Offering up examples of his favorite work as a soundtrack designer, Mangini provided two clips during his keynote. “While working on Star Trek [in 2009] with supervising sound editor Mark Stoeckinger, director J. J. Abrams gave me two days to prepare — with co-designer Mark Binder — a new soundtrack for the two-minute mind meld sequence. J. J. wanted something totally different from what he already had. We scrapped the design work we did on the first day, because it was only different, not better. On day two we rethought how sound could tell the story that J. J. wanted to tell. Having worked on three previous Star Trek projects [different directors], I was familiar with the narrative. We used a complex combination of orchestral music and sound effects that turned the sequence on its head; I’m glad to say that J. J. liked what we did for his film.”

The two collaborators received the following credit: “Mind Meld Soundscape by Mark Mangini and Mark Binder.”

Turning to his second soundtrack example, Mangini recalled receiving a call from Australia about the in-progress soundtrack for George Miller’s Mad Max: Fury Road, the director’s fourth outing with the franchise. “The mix they had prepared in Sydney just wasn’t working for George. I was asked to come down and help re-invigorate the track. One of the obstacles to getting this mix off the ground was the sheer abundance of material to choose from. When you have so many choices on a soundtrack, the mix can be an agonizing process of ‘Sound Design by Elimination.’ We needed to tell him, ‘Abandon what you have and start over.’ It was up to me, as an artist, to tell George that his V8 needed an overhaul and not just a tune-up!”

“We had 12 weeks, working at Formosa with co-supervising sound editor Scott Hecker — and at Warner Bros Studios with re-recording mixers Chris Jenkins and Greg Rudloff — to come up with what George Miller was looking for. We gave each vehicle [during the extended car-chase sequence that opens the film] a unique character with sound, and carefully defined [the lead protagonist Max Rockatansky’s] changing mental state during the film. The desert chase became ‘Moby Dick,’ with the war rig as the white whale. We focused on narrative decisions as we reconstructed the soundtrack, always referencing ‘the why’ for our design choices in order to provide a meaningful sonic immersion. Miller has been quoted as saying, ‘Mad Max is a film where we see with our ears.’ This from a director who has been making films for 40 years!”

His advice to fledgling sound designers? Mangini kept it succinct: “Ask yourself why, not how. Be the author of content, take risks, tell stories.”

Creating a Sonic Immersive Experience
Subsequent panels during the all-day conference addressed how to design immersive music, sound effects and dialog elements used on film and TV soundtracks. For many audiences, a 5.1-channel format is sufficient for carrying music, effects and dialog in an immersive, surround experience, but 7.1-channel — with added side speakers, in addition to the new Dolby Atmos, Barco/Auro 3D and DTS:X/MDA formats — can extend that immersive experience.

“During editorial for Guardians of the Galaxy we had so many picture changes that the re-recording mixers needed all of the music stems and breakouts we could give them,” said music editor Will Kaplan, MPSE, from Warner Bros. Studio Facilities, during the “Music: Composing, Editing and Mixing Beyond 5.1” panel. It was presented by Formosa Group and moderated by scoring mixer Dennis Sands, CAS. “In a quieter movie we can deliver an entire orchestral track that carries the emotion of a scene.”

‘Music:Composing, Editing and Mixing Beyond 5.1’ panel (L-R): Andy Koyama, Bill Abbott, Joseph Magee, moderator Dennis Sands, Steven Saltzman and Will Kaplan.

Describing his collaboration with Tim Burton, music editor Bill Abbott, MPSE, of Formosa reported that the director “liked to hear an entire orchestral track for its energy, and then we recorded it section by section with the players remaining on the stage, which can get expensive!”

Joseph Magee, CAS, (supervising music mixer on such films as Pitch Perfect 2, The Wedding Ringer, Saving Mr. Banks and The Muppets) likes to collaborate closely with the effects editor to decide who handles which elements from each song. “Who gets the snaps and dance shoes? How do we divide up the synchronous ambience and the design ambience? The synchronous ambience from the set might carry tails from the sing-offs, and needs careful matching. What if they pitch shift the recorded music in post? We then need to change the pitch of the music captured in the audience mics using DAW plug-ins.”

“I like to invite the sound designer to the music spotting session,” advised Abbott, “and discuss who handles what — is it a music cue or a sound effect?”

“We need to immerse audiences with sound and use the surrounds for musical elements,” explained Formosa’s re-recording mixer, Andy Koyama, CAS. “That way we have more real estate in the front channels for sound effects.”

“We should get the sound right on the set because it can save a lot of processing time on the dub stage,” advised production mixer Lee Orloff, CAS, during the “A Dialog on Dialog: From Set to Screen” panel moderated by Jeff Wexler, CAS.

‘A Dialog on Dialog: From Set to Screen’ panel (L-R): Lee Orloff, Teri Dorman, CAS president Mark Ulano, moderator Jeff Wexler, Gary Bourgeois, Marla McGuire and Steve Tibbo.

“I recall working on The Patriot, where the director [Roland Emmerich] chose to create ground mist using smoke machines known as Smoker Boats,” recalled Orloff, who received Oscar and BAFTA Awards for Terminator 2: Judgment Day (1991). “The trouble was that they contained noisy lawnmower engines, whose sound could be heard under all of the dialog tracks. We couldn’t do anything about it! But, as it turned out, that low-level noise added to the sense of being there.”

“I do all of my best work in pre-production,” added Wexler, “by working out the noise problems we will face on location. It is more than just the words that we capture; a properly recorded performance tells you so much about the character.”

“I love it when the production track is full of dynamics,” added dialog/music re-recording mixer Gary Bourgeois, CAS. “The voice is an instrument; if I mask out everything that is not needed I lose the ‘essence’ of the character’s performance. The clarity of dialog is crucial.”

“We have tools that can clean up dialog,” conceded supervising sound editor Marla McGuire, MPSE, “but if we apply them too often and too deeply it takes the life out of the track.”

“Sound design can make an important scene more impactful, but you need to remember that you’re working in the service of the film,” advised sound designer/supervising sound editor Richard King, MPSE, during the “Sound Effects: How Far Can You Go?” panel moderated by David Bondelevitch, MPSE, CAS.

‘Sound Effects: How Far Can You Go?’ panel L-R: Mandell Winter, Scott Gershin, moderator David Bondelevitch, Greg Hedgpath, Richard King and Will Files.

In terms of music co-existing with sound effects, Formosa’s Scott Gershin, MPSE, advised, “During a plane crash sequence, I pitch shifted the sound effect to match the music.”

“I like to go to the music spotting session and ask if the director wants the music to serve as a rhythmic or thematic/tonal part of the soundtrack,” added sound effects re-recording mixer Will Files from Fox Post Production Services. “I just take the other one. Or if it’s all rhythm — a train ride, for example — we’ll agree to split [the elements].”

“On the stage, I’m constantly shifting sync and pitch shifting the sound effects to match the music track,” stated Gershin. “For Pacific Rim we had many visual effects arriving late with picture changes. Director Guillermo del Toro received so many new eight-frame VFX cues he wanted to use that the music track ended up looking like bar code” in the final Pro Tools sessions.

In terms of working with new directors, “I like to let them see some good movies with good sound design to start the conversation,” offered Files. “I front load the process by giving the director and picture editors a great-sounding temp track, using dialog predubs that they can load into the Avid Media Composer to get them used to our sound ideas. It also helps the producers dazzle the studio!”

“Successful soundtrack design is a collaborative effort from production sound onwards,” advised re-recording mixer Mike Minkler, CAS, during “The Mix: Immersive Sound, Film and Television” panel, presented by DTS and moderated by Mix editor Tom Kenny. “It’s about storytelling. Somebody has to be the story’s guardian during the mix,” stated Minkler, who received Academy Awards for Dreamgirls (2006), Chicago (2002) and Black Hawk Down (2001). “Filmmaking is the ultimate collaboration. We need to be aware of what the director wants and what the picture needs. To establish your authority you need to gain their confidence.”

“For immersive mixes, you should start in Dolby Atmos as your head mix,” advised Jeremy Pearson, CAS, who is currently re-recording The Hunger Games: Mockingjay – Part 2 at Warner Bros. Studio. He also worked in that format on Mockingjay – Part 1 and Catching Fire. “Atmos is definitely the way to go; it’s what everyone can sign off on. In terms of creative decisions during an Atmos mix, I always ask myself, ‘Am I helping the story by moving a sound, or distracting the audience?’ After all, the story is up on the screen. We can enhance sound depth to put people into the scene, or during calmer, gentler scenes you can pinpoint sounds that engage the audience with the narrative.”

Kim Novak Theater at Sony Pictures Studios.

Minkler reported that he is currently working on director Quentin Tarantino’s The Hateful Eight, “which will be released initially for two weeks in a three-hour version on 70mm film to 100 screens, with an immersive 5.1-channel soundtrack mastered to 35 mm analog mag.”

Subsequently, the film will be released next year in a slightly different version via a conventional digital DCP.

“Our biggest challenge,” reported Matt Waters, CAS, sound effects re-recording mixer for HBO’s award-winning Game of Thrones, “is getting everything completed in time. Changes are critical and we might spend half a day on a sequence and then have only 10 minutes to update the mix when we receive picture changes.”

“When we receive new visuals,” added Onnalee Blank, CAS, who handles music and dialog re-recording on the show, “[the showrunners] tell us, ‘it will not change the sound.’ But if the boats become dragons…”

Photos by Mel Lambert.

Mel Lambert is principal of Content Creators, an LA-based editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Cine Gear Expo showcases production, post solutions

By Mel Lambert

With its focus on sound and image acquisition, the annual Cine Gear Expo — now in its 20th year — offers attendees the opportunity to examine a wide cross section of systems targeted at the production and post communities, including capture, storage and delivery configurations that accommodate 4K and HDR workflows. Held last Friday and Saturday at the Paramount Studios complex in central Hollywood, the show saw a large number of companies exhibiting innovations within Stages 31 and 32, in addition to outdoor booths located throughout the New York Street area. This year’s event reportedly attracted in excess of 12,000 attendees.

One highlight was a rare 70mm screening by Band Pro Film & Digital of Baraka, followed by a Q&A with producer Mark Magidson. Shot in 25 countries on six continents, the film includes a number of scenes that director Ron Fricke defines as “a guided meditation on humanity.” Originally released in 1992, Baraka was the first film in over 20 years to be photographed in the 70mm Todd-AO format, and reportedly the first film ever to be restored and scanned at 8K resolution. “Last year we screened Samsara in 4K at Cine Gear,” reports Band Pro president/CEO Amnon Band. “The response was huge, and we wanted this year to be just as amazing. Baraka is a film that deserves to be projected and appreciated on the big screen.” (As critic Roger Ebert once commented: “If man sends another Voyager to the distant stars and it can carry only one film on board, that film might be Baraka.”)

Panavision Primo 70 lenses and the Red Weapon 8K.

Panavision/Light Iron showed test footage from director Quentin Tarantino’s The Hateful Eight, which was shot by Robert Richardson, ASC, in Ultra Panavision 70 and projected from 70mm anamorphic film at the Paramount Theater. The Hateful Eight is the first production since Khartoum (1966) to be shot in Ultra Panavision 70; the anamorphic format is captured on 65mm negative stock to deliver an approximately 2.7:1 image that is described as “sharp but not clinical, with painterly bokeh and immersive depth.” Also shown in the Panavision/Light Iron booth were a demo of 8K footage shot on the RED Weapon 8K with Panavision Primo 70 lenses; PanaNet, a high-speed fiber network between Panavision locations for transferring media at up to 10GB per second; LightScan, a low-cost telecine solution that delivers ProRes UHD-quality transfers targeted at independent films, commercials and TV shows that prefer film optics; and Live Play 3, an iPad dailies app for Mac OS X.

Canon’s EOS C300 Mark II digital cinema camera.

Canon took the opportunity to showcase the new EOS C300 Mark II digital cinema camera. According to Joseph Bogacz, a Canon advisor on professional engineering and solutions, the Mk II is a completely new camera, and not derived from the original C300. “The Mk II offers more than 15 stops of dynamic range, with ISO from 160 to 102,400. We have also included 10-bit recording for 4K shoots, in addition to 10- or 12-bit HD/2K resolutions. The Mk II also offers internal 4K recording, for less complexity on a film or TV set.” The camera’s power system has also been beefed up to 14.4 volts, with Lemo connectors.

Also shown was the new portable DP-V2410 24-inch 4K reference monitor, which is designed for on-set use during 4K cinema and 4K/UHD TV/commercial productions. The monitor “delivers a consistent look throughout the entire workflow,” according to Jon Sagud, a professional marketing manager with Canon Imaging Technologies and Communications Group. “It connects via a single cable to the C300 MkII and also accepts HDMI sources.”

Gale Tattersall

The RGB LED backlight panel is rated at 400 nits, offers several built-in waveform displays, and can be powered from 24V supplies. It can also de-Bayer live 4K RAW video from EOS C500 and C300 Mark II cameras, and supports 4K ACES proxy (ACES 1.0) to maintain a desired “look” throughout a production-to-post workflow.

The company also hosted a panel discussion, “A First Look at the EOS C300 Mark II with Gale Tattersall,” during which the acclaimed director of photography presented his first impressions of the new camera, together with reactions from first AC Tony Gutierrez, second AC Zoe Van Brunt and Steadicam operator Ari Robbins, while shooting Trick Shot, the first short to be shot entirely with the new system. “I was immediately impressed by the C300 Mk II’s wide dynamic range and output quality,” Tattersall confided. “I could avoid white-level clipping and hold shadow detail; you can go beyond the 15-stop range if you want to. We were working with a 50-1,000 T5 Canon lens, which is a perfect all-round zoom. With Netflix and other studios specifying 4K resolution, the MkII’s on-board recording will definitely streamline our workflows.”

Probably best known for his work as DP on Fox’s House television series, Tattersall is currently working on Netflix’s Grace and Frankie series, using a competitive 4K camera. “When you have [series principals] Jane Fonda and Lily Tomlin – ‘ladies in their seventies’ – wearing black against black backgrounds, dynamic range becomes a key factor! The Mk II offers outstanding performance down in the critical 15 IRE low-level range.”

During a panel discussion organized by Sony, cinematographer Rob Hardy, BSC, shared details of his work on director Alex Garland’s Ex Machina, using an F65 CineAlta digital camera. Because of his prior experience shooting 35mm film for commercials, “I wanted to retain the same operator workflow,” Hardy concedes. “During pre-production 4K tests [in the UK at Pinewood Studios] we compared the look of Red Dragon, Arri Alexa and Sony F65 cameras, with new and old glass [lenses]. I needed to capture in the camera what I was seeing on the set; skin tones became a key parameter across a range of interior and exterior lighting levels.

DP Rob Hardy during a panel discussion on using Sony F65 CineAlta camera to shoot Ex Machina.

“We opted for Xtal Express anamorphic glass on the F65, a combination that offered everything I was looking for. The resultant images had the depth that I needed for the film; the F65 ‘read’ the glass perfectly for me at T2.8 or T2.3 apertures.” UK-based Joe Dunton Cameras supplied the Cooke Xtal (Crystal) Express anamorphic lenses, which are derived from vintage Cooke spherical lenses that, in the eighties, were rehoused and modified with anamorphic elements.

Turning to other booth displays at Cine Gear, Amimon demonstrated the Connex series of 5GHz wireless transmission units, which are said to deliver full HD video quality with zero-latency transmission over distances up to 3,300 feet. The units are targeted at feature films, documentaries, music videos and other production applications that need realtime control of a camera and drone. A built-in OSD provides telemetry information; commands can also be sent via the Futaba S-Bus protocol to a drone’s gimbal; and the unit supports simultaneous multicasting to four screens.

Audio Intervisual Design (AID) showed examples of recent post-production design and installation projects, including a multi-function dub stage and DI/color grading suite for Blumhouse Productions, which has enjoyed recent success with the Paranormal Activity, The Purge, Insidious and Sinister franchises, in addition to Oscar success with Whiplash and Emmy success with HBO’s The Normal Heart. Also shown at the AID booth was an Avid S6 Console surface for Pro Tools and examples of IHSEusa’s extensive range of KVM switches and extenders, plus DVI splitters and converters.

GoPro demonstrated applications of its free-of-charge GoPro Studio software, which imports, trims and plays back videos and timelapse photo sequences; edit templates offer music, edit points and slow-motion effects. Video playback speeds can also be changed for ultra-slow and fast motion using the built-in Flux app.

G-Tech’s Aimee Davos with G-Drive ev ATC drives.

G-Tech showed the new G-Drive ev ATC with either Thunderbolt or USB3 interfaces, designed to withstand life on hostile locations. The ruggedized, watertight housing with tethered cable holds a removable hard drive and is available in various capacities. The ATC’s all-terrain case is compatible with the firm’s Evolution Series, with a durable 7,200 RPM drive that is said to leverage the speed of Thunderbolt while providing the flexibility of USB. A 1TB USB drive sells for $179, while the 1TB Thunderbolt model costs $229. Also shown was the RAID 8-Bay Thunderbolt 2 storage solution, designed to support multi-stream compressed 4K workflows at transfer rates up to 1,350MB/s.

London-based iDailies offers 35mm/16mm processing and 35mm printing, together with telecine transfer and color grading; only two such film-processing facilities currently exist in the UK. “We are handling all of the processing for [Walt Disney Pictures’] new Star Wars: Episode 7–The Force Awakens, which is being shot entirely on film by director J. J. Abrams,” explains the firm’s senior colorist Dan Russell. Reportedly, the facility has processed every studio film shot in the UK since March 2013, including Spectre, Mission Impossible 5, Cinderella and Fury, together with The Imitation Game and Far From The Madding Crowd. It also supports the majority of film schools, to help “encourage and enable the next generation of filmmakers to discover the unique attributes of film origination.”

L-R: SNS’s Steve McKenna with John Diel.

Sound Devices showcased the PIX-E Series on-camera video monitors, which includes a five-inch 1,920-by-1,080 model and a seven-inch 1,920-by-1,200 model, with integral monitoring tools, SDI and HDMI I/O, plus the ability to record 4K and Apple ProRes 4444 to mSATA-based SpeedDrives. PIX-E monitors feature compact, die-cast metal housings and Gorilla Glass 2. Also shown was the 12-input Model 688 production mixer with 16-track recorder, offering eight outputs plus digital mixing and routing, along with a MixAssist function that automatically drops the volume of inactive inputs and maintains consistent background levels.

Studio Network Solutions (SNS) showcased practical applications for ShareBrowser, a file/project/asset management interface for OS X and Windows, which is included with every EVO shared-storage system. “ShareBrowser lets post users search, index, share, preview and verify all assets,” explained sales manager Steve McKenna. “More than a file manager, the app enables automatic project locking for Apple Final Cut Pro, Adobe Premiere, Avid Pro Tools and other editors, as well as Avid project and bin sharing, and allows search across all EVO storage as well as local, offline and other network disks.”

Cine Gear photos by Mel Lambert

 

NAB 2015: Love and hate, plus blogs and videos

By Randi Altman

I have been to more NABs than I would like to admit, and I loved them all… I’ve also hated them all, but that is my love/hate relationship with the show. I love seeing the new technology, trends and friends I’ve made from my many years in the business.

I hate the way my feet feel at the end of the day. I hate the way that there is not enough lotion on the planet to keep my skin from falling off.  I extra-hate the cab lines, but mostly I hate not being able to see everything that needs to be seen.


Sound developments at the NAB Show

Spotlighting Pro Sound Effects library, Genelec 7.1.4 Array, Avid Master Joystick Module and Sennheiser AVX wireless mic

By Mel Lambert

With a core theme of “Crave More,” which is intended to reflect the passion of our media and entertainment communities, and with products from 1,700 exhibitors this year – including over 200 first-time companies – there were plenty of new developments to see and hear at the NAB Show, which continues in Las Vegas until Thursday afternoon.

In addition to unveiling Master Library 2.0, which adds more than 30,000 new sound effects, online access, annual updates and new subscription pricing, Pro Sound Effects demonstrated…