Category Archives: A Closer Look

A closer look at some London-based audio post studios

By Mel Lambert

While in the UK recently for a holiday/business trip, I had the opportunity to visit several of London’s leading audio post facilities and catch up with developments among the Soho community.

‘Baby Driver’

I also met up with Julian Slater, a highly experienced supervising sound editor, sound designer and re-recording mixer who relocated to the US a couple of years ago, working first at Formosa Group and then at the Technicolor at Paramount facility in Hollywood. Slater was in London working on writer/director Edgar Wright’s action-drama Baby Driver, starring Lily James, Jon Hamm, Jon Bernthal and Jamie Foxx. The film follows the progress of a young getaway driver who, after being coerced into working for a crime boss, finds himself taking part in a heist that’s doomed to fail.

Goldcrest Films
Slater handled sound effects pre-dubs at Goldcrest Films on Dean Street in the heart of Soho’s film district, while co-mixer Tim Cavagin worked on dialog and Foley pre-mixes at Twickenham TWI Studios in Richmond, a London suburb west of the capital. Finals started just before Christmas at Goldcrest, with Slater handling music and SFX, while Cavagin oversaw dialog and Foley. “We are using Goldcrest’s new Dolby Atmos-capable Theater 1, which opened last May,” explains Slater. “The post crew includes sound effects editors Arthur Graley, Jeremy Price and Martin Cantwell, plus dialog/ADR supervisor Dan Morgan and Foley editor Peter Hanson.”

“I cannot reveal too much about my sound design for Baby Driver,” admits Slater, “but because the lead character [actor Ansel Elgort] has a hearing anomaly, I am working with pitch changes to interweave various elements of the film’s soundtrack.”

Baby Driver is scheduled for UK and US release in August, and will be previewed in mid-March at the SXSW Film Festival in Austin. Composer Steven Price’s score for the film was recorded at Abbey Road Studios in North London. Price wrote the music for writer/director Alfonso Cuarón’s Gravity (2013), which won him the Academy Award for Best Original Score.

British-born Wright is probably best known for comedies, such as Shaun of the Dead (2004), Hot Fuzz (2007) and The World’s End (2013), several of which featured Slater’s talents as supervising sound editor, sound designer and/or re-recording mixer.

Slater is a multiple BAFTA and Emmy Award nominee. After graduating from the School of Audio Engineering (now the SAE Institute) in London, at the age of 22 he co-founded the Hackenbacker post company and designed sound for his first feature film, director Mike Figgis’ Leaving Las Vegas (1995). Subsequent films include In Bruges (2008), Dark Shadows (2012), Scott Pilgrim Vs. the World (2010) and Attack the Block (2011).

Goldcrest Films, which has a NYC-based studio as well, provides post services for film and broadcast projects, including Carol (2015), The Danish Girl (2015) and Les Misérables (2012). The facility features three Dolby dubbing theaters with DCI-compliant projection, plus ADR and Foley recording stages, sound design and editing suites, and offline editorial and grading suites. “Last May we opened Theater 1,” reports studio manager Rob Weatherall, “a fully sound-isolated mixing theater that is Dolby Atmos Premier-certified.”

Goldcrest Films Theater 1 (L-R): Alex Green, Rowan Watson, Julian Slater, Rob Weatherall and Robbie Scott.

First used to re-record writer/director Paul Greengrass’ Jason Bourne (2016), the new room houses a hybrid Avid 32-fader S6 M40 Pro Tools control surface section within a 72-fader dual-engine AMS Neve DFC3D Gemini frame. By building interchangeable AMS and S6 “buckets” in a single console frame, the facility can mix and match formats according to the re-recording engineers’ requirements — either “in the box” using the S6 surface, or a conventional workflow using the DFC sections.

“I like working in the box,” says Slater, “since it lets me retain all my sound ideas right through print mastering. For Baby Driver we premixed to a 9.1-channel bed with Atmos objects and brought this submix here to Goldcrest where we could open everything seamlessly on the S6 console and refine all my dialog, music and effects submixes for the final Atmos immersive mix. Because I have so much sound design for the music being heard by our lead character, including sound cues for the earbuds and car radios, it’s the only way to work! We also had a lot of music playback on the set.”

The supervising sound editor needed to carefully prepare myriad sound cues. “Having worked on all of his films, I have come to recognize that Edgar [Wright] is an extremely sound-conscious director,” Slater reports. “The soundtrack for Baby Driver needed to work seamlessly and sound holistic — not forced in any way. In other words, while sound is important in this film — for obvious reasons — it is critical that we don’t distract the audience from the dramatic storyline.”

Theater 1’s 55-loudspeaker Atmos array includes a mixture of Crown-powered JBL 5732 ScreenArray cabinets in the front with Meyer cabinets for the surrounds. Accommodated formats include 5.1, 7.1 and DTS:X. Five Pro Tools playback systems are available with Waves Platinum plug-in packages, plus a 192-channel Pro Tools HDX 3 recorder. Each Pro Tools rig features a DAD DX32 audio interface with both Audinate Dante- and MADI-format digital outputs. The latter can be routed to the DFC console for conventional mixing, or to a sixth rig with a DAD AX32 converter system for in-the-box mixing on the S6 control surface. Video projection is via a Barco DP2K-10SX with an integrated media server for DCP playback; video playback within Pro Tools Native is via an AJA video card. Outboard gear includes a pair of Lexicon 960 reverbs, two TC 6000 reverbs and four dbx subharmonic synthesizers.

Hackenbacker Audio Post
Around the corner from Goldcrest is Slater’s former company, Hackenbacker Audio Post, a multi-room facility that was purchased in July 2015 by Molinare from e-Post Media, owners of Halo Post. Hackenbacker handled sound for the TV series Downton Abbey, Cold Feet and Thunderbirds Are Go, plus director Richard Ayoade’s film The Double (2013). Owner/founder Nigel Heath remains a director on the group management team for the facility’s three dubbing studios, five edit suites and a large Foley stage located a short distance away.

Hackenbacker’s Studio 2

Hackenbacker Studio 1 has been Heath’s home base for more than a decade. It houses a large-format 48-fader AMS Neve MMC console with three Avid HD3 Pro Tools systems, two iZ Technologies RADAR 24-track recorder/players and a Dynaudio M3F 5.1 monitoring system that was used to re-record Hot Fuzz, In Bruges, Shaun of the Dead and many other projects.

Studio 2 features Dynaudio monitoring along with a 16-fader Avid ICON D-Control surface linked to a Pro Tools HDX system. It is used for 5.1 TV mixing and ADR, and includes a large booth suitable for both ADR and voiceover. Also designed for TV mixing and ADR, Studio 3 features Quested monitoring and a 32-fader Avid ICON D-Control surface linked to a Pro Tools HDX system. Edits 1 and 2 handle a wide cross-section of sound effects editorial assignments, with access to a large sound library and other creative tools. Edits 3 and 4 are equipped for dialog and ADR editing. Edit 5 features a transfer bay and QC facility in which all sound material is verified and checked.

Twickenham TWI Studios
According to technology development manager/re-recording mixer Craig Irving, Twickenham TWI Studios recently completed mixing of the soundtrack for writer/director Stanley Tucci’s Final Portrait, the story of Swiss painter and sculptor Alberto Giacometti, starring Armie Hammer and Geoffrey Rush. The film was re-recorded by Tim Cavagin and Irving, with sound editorial by Tim Hands on dialog and Jack Gillies on effects.

The lounge at Twickenham-TWI.

“Dialog tracks for Baby Driver were pre-mixed by Tim in our Atmos-capable Theatre 1,” explains Irving. “Paul Massey will be returning soon to complete the mix in Theatre 1 for director Ridley Scott’s Alien: Covenant, which reunites the same sound team that worked on The Martian — with Oliver Tarney supervising, Rachel Tate on dialog, and Mark Taylor and our very own Dafydd Archard on effects.” Massey also mixed Scott’s Exodus: Gods and Kings (2014) at Twickenham TWI, worked on director Rufus Norris’ London Road (2015) and director Tim Miller’s Deadpool (2016), and recently completed the upcoming Pirates of the Caribbean: Dead Men Tell No Tales. While normally based at Fox Post Production Services in West Los Angeles, Massey also spends time in his native England overseeing a number of film projects.

The facility’s stages have also been busy with production of Netflix’s Black Mirror series, which consists of six original films looking at the darker side of modern life; one episode was directed by Jodie Foster. “To service an increase in production, we are investing in new infrastructure that will feature a TV mixing stage,” explains Irving. “The new room will be based around an Avid S6 control surface and used as a bespoke area to mix original TV programming, as well as creating TV mixes of our theatrical titles. Our Picture Post area is also being expanded with a second FilmLight Baselight Two color grading system with full 4K projection for both theatrical and broadcast projects.”

Twickenham TWI’s rooftop bar and restaurant opened its doors to clients and staff last year. “It has proved extremely popular and is open to membership from within the industry,” Irving says. The facility’s remodeled front office and reception area was designed by Barbarella Design. “We have chosen a ’60s retro, Mad Men theme in greys and red,” says the studio’s COO Maria Walker. In addition to its two main re-recording theaters, TWI offers 40 cutting rooms, an ADR/Foley stage and three shooting stages.

Warner Bros. De Lane Lea
Just up the street from Goldcrest Films is Warner Bros. De Lane Lea, which started as a multi-room studio. It also has a rather unusual ancestry. In the 1940s, Major De Lane Lea was looking to improve the way dialog for film and later TV could be recorded and replaced in order to streamline dubbing between French and English. This resulted in his setting up a company called De Lane Lea Processes and a laboratory in Soho. The company also developed a number of other products aimed at post, and over the next 30 years opened a variety of studios in London for voice recording, film, TV and jingle mixing, music recording and orchestral-score recording.

De Lane Lea’s Stage 1.

Around 1970, the operation moved into its current building on Dean Street and shifted its focus toward film and TV sound. The facility, which was purchased by Warner Bros. in 2012, currently includes four re-recording stages, two ADR stages for recording dialog, voiceovers and commentaries, plus 50 cutting rooms, a preview theater, transfer bay and a café/bar. Three of the Dolby-certified mixing stages are equipped with AMS Neve DFC Gemini consoles or Avid S6 control surfaces and Meyer monitoring. A TV mixing stage boasts an Avid Pro Tools control surface and JBL monitoring.

Stage 1 features an AMS Neve 80-fader DFC Gemini digital two-mixer console with an Avid control surface, linked to a Meyer Sound EXP system providing Dolby Atmos monitoring. Six Pro Tools playback systems are available — three 64-channel HDX and three 128-channel HDX2 rigs — together with a 128-channel HDX2 Pro Tools recorder. Film projection is from a Kinoton FP38ECII 35mm unit, with a Barco DP2K-23B digital cinema projector offering resolutions up to 2K. Video playback within Pro Tools is via a VCubeHD nonlinear player or a Blackmagic card. Outboards include a Lexicon 960 and a TC 6000 reverb, plus two dbx Subharmonic Synthesizers. Stage 2 is centered around an Avid S6 M40 24-fader console linked to three Pro Tools playback systems — a pair of 64-channel HDX2 and a single 128-channel HDX2 rig — plus a 64-channel HDX recorder. Monitoring is via a 7.1-channel Meyer Sound EXP system.

Warner Bros. Studios Leavesden
Located 20 miles northwest of central London and serving as Warner Bros.’ UK shooting lot, Warner Bros. Studios Leavesden offers a number of well-equipped stages for large-scale productions, in addition to a large tank for aquatic scenes. The facility’s history dates back almost 70 years, to when the site was acquired by the UK Ministry of Defence in 1939 as a WWII base for building aircraft, including the iconic Mosquito fighter and Halifax bomber. When hostilities ceased, the site was purchased by Rolls-Royce and continued as a base for aircraft manufacture, progressing to large engines. It eventually closed in 1992.

Warner Bros. Leavesden’s studio layout.

In 1994, Leavesden began a new life as a film studio and over the following decades was home to a number of high-profile productions, including the James Bond film GoldenEye (1995), Mortal Kombat: Annihilation (1997), Star Wars: Episode I – The Phantom Menace (1999), An Ideal Husband (1999) and director Tim Burton’s Sleepy Hollow (1999).

By 2000, Heyday Films had acquired use of the site on behalf of Warner Bros. for what would be the first in a series of Harry Potter films — Harry Potter and the Philosopher’s Stone (2001) — with each subsequent film in the franchise during the following decade being shot at Leavesden. While other productions, almost exclusively Warner Bros. productions, made partial use of the complex, the site was mostly occupied by permanent standing sets for the Harry Potter films.

In 2010, as the eighth and final Harry Potter film was nearing completion, Warner Bros. announced its intention to purchase the studio as a permanent European base, making it the first Hollywood studio to establish a permanent UK base since MGM in the 1940s. By November of that year, the studio had completed the purchase of Leavesden Studios and announced plans to invest more than £100 million (close to $200 million at the time) in the site, converting Stages A through H into sound stages. As part of the redevelopment, Warner Bros. created two entirely new soundstages to house a permanent public exhibition called Warner Bros. Studio Tour London — The Making of Harry Potter, creating 300 new jobs. It opened to the public in early 2012.

With over 100 acres, WBSL features one of the most extensive backlots in Europe, with level, graded areas, including a former aircraft runway, a variety of open fields, woodlands, hills and clear horizons. It also offers bespoke art departments, dry-hire edit suites and VFX rooms, as well as a pair of the largest water tanks in Europe: a 60-by-60-foot filtered and heated indoor tank and a 250-by-250-foot exterior tank.

Main Image: Goldcrest London’s Theater 1.


Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA.

Utopic editor talks post for David Lynch tribute Psychogenic Fugue

Director Sandro Miller called on Utopic partner and editor Craig Lewandowski to collaborate on Psychogenic Fugue, a 20-minute film starring John Malkovich in which the actor plays seven characters in scenes recreated from some of filmmaker David Lynch’s films and TV shows. These characters include The Log Lady, Special Agent Dale Cooper, and even Lynch himself as narrator of the film.

It is part of a charity project called Playing Lynch that will benefit the David Lynch Foundation, which seeks to introduce at-risk populations affected by trauma to transcendental meditation.

Chicago-based Utopic handled all the post, including editing, graphics, VFX and sound design. The film is part of a multimedia fundraiser hosted by Squarespace and executed by Austin-based agency Preacher. The seven vignettes were released one at a time on PlayingLynch.com.

To find out more about Utopic’s work on the film, we reached out to Lewandowski with some questions.

How early were you brought in on the film?
We were brought in before the project was even finalized. There were a couple other ideas that were kicked around before this one rose to the top.

We cut together a timing board using all the pieces we would later be recreating. We also pulled some hallway scenes from an old PlayStation commercial that [Lynch] directed, and we then scratched in all the “Lynch” lines for timing.

You were on set. Can you talk about why and what the benefits were for the director and you as an editor?
My job on the set was to have our reference movie at the ready and make sure we were matching timing, framing, lighting, etc. Sandro would often check the reference to make sure we were on track.

For scenes like the particles in Eraserhead, I had the DP shoot it at various frame rates and at the highest possible resolution, so we could shoot it vertically and use the falling particles. I also worked with the Steadicam operator to get a variety of shots in the hallway since I knew we’d need to create some jarring cutaways.

How big of a challenge was it dealing with all those different iconic characters, especially in a 20-minute film?
Sandro was adamant that we not try to “improve” on anything that David Lynch originally shot. Having had a lot of experience with homages, Sandro knew that we couldn’t take liberties. So the sets and action were designed to be as close as possible to the original characters.

In shots where it was only one character originally (The Lady in the Radiator, Special Agent Dale Cooper, Elephant Man) it was easier, but in scenes where there were originally more characters and now it was just Malkovich, we had to be a little more creative (Frank Booth, Mystery Man). Ultimately, with the recreations, my job was to line up as closely as possible with what was originally done, and then with the audio do my best to stay true to the original.

Can you talk about your process and how you went about matching the original scenes? Did you feel much pressure?
Sandro and I have worked together before, so I didn’t feel a lot of pressure from him, but I think I probably put a fair amount on myself because I knew how important this project was for so many people. And, as is the case with anything I edit, I don’t take it lightly that all of that effort that went into preproduction and production now sits on my shoulders.

Again, with the recreations it was actually fairly straightforward. It was the corridor shots where Malkovich plays Lynch and recites lines taken from various interviews that offered the biggest opportunity, and challenge. Because there was no visual reference for this, I could have some more fun with it. Most of the recreations are fairly slow and ominous, so I really wanted these corridor shots to offset the vignettes, kind of jar you out of the trance you were just put in, make you uneasy and perhaps squirm a bit, before being thrust into the next recreation.

What about the VFX? Can you talk about how they fit in and how you worked with them?
Many of the VFX were either in-camera or achieved through editorial, but there were spots — like where he’s in the corridor and snaps from the front to the back — that I needed something more than I could accomplish on my own, so I used our team at Utopic. However, when cutting the trailer, I relied heavily on our motion graphics team for support.

Psychogenic Fugue is such an odd title, so the writer/creative director, Stephen Sayadin, came up with the idea of using the dictionary definition. We took it a step further, beginning the piece with the phonetic spelling and then seamlessly transitioning the whole thing. They then tried different options for titling the characters. I knew I wanted to use the hallway shot, close-ups of the characters and ending on Lynch/Malkovich in the chair. They gave me several great options.

What was the film shot on, and what editing system did you use?
The film was shot on Red at 6K. I worked in Adobe Premiere, using the native Red files. All of our edit machines at Utopic are custom-built, high-performance PCs assembled by the editors themselves.

What about tools for the visual effects?
Our compositor/creative finisher used an Autodesk Flame, and our motion graphics team used Adobe After Effects.

Can you talk about the sound design?
I absolutely love working on sound design and music, so this was a dream come true for me. With both the film and the trailer, our composer Eric Alexandrakis provided me with long, odd, disturbing tracks, complete with stems. So I spent a lot of time just taking his music and sound effects and manipulating them. I then had our sound designer Brian Lietner jump in and go crazy.

Is there a scene that you are most proud of, or that was most challenging, or both?
I really like the snap into the flame/cigarette at the very beginning. I spent a long time just playing with that shot, compositing a bunch of shots together, manipulating them, adjusting timing, coming back in the next morning and changing it all up again. I guess that and Eraserhead. We had so many passes of particles and layered so many throughout the piece. That shot was originally done with him speaking to camera, but we had this pass of him just looking around, and realized it was way more powerful to have the lines delivered as though they were internal monologue. It also allowed us to play with the timings in a way that we wouldn’t be able to with a one-take shot.

As far as what I’m most proud of, it’s the trailer. We worked really hard to get the recreations and full film done. Then I was able to take some time away from it all and come back fresh. I knew that there was a ton of great footage to work with and we had to do something that wasn’t just a cutdown. It was important to me that the trailer feel every bit as demented as the film itself, if not more. I think we accomplished that.

Check out the trailer here:


The creative process behind The Human Rights Zoetrope

By Sophia Kyriacou

As an artist working in the broadcast industry for almost 20 years, I’ve designed everything from opening title sequences to program brands to content graphics. About three years into my career, I was asked to redesign a program entirely in 3D. The rest, as they say, is history.

Over two years ago I was working full-time at the BBC doing the same work I do now, broadcast design and 3D art, but decided it was time to cut my hours in half and allow myself to focus on my own creative ventures. I wanted to work with external and varied clients, both here in the UK and internationally. I also wanted to use my spare time for development work. In an industry where technology is constantly evolving, it’s essential to keep ahead of the game.

One of those creative ventures was commissioned by Noon Visual Creatives — a London-based production and post company that serves several Arabic broadcasters in the UK and worldwide — to create a television branding package for a program called Human Rights.

I had previously worked with Noon on a documentary about the ill-fated 1999 EgyptAir plane crash (which is still awaiting broadcast), so when I was approached again I was more than happy to create their Human Rights brand.

My Inspiration
I was very lucky in that my client essentially gave me free rein, which I find is a rarity these days. I have always been excited and inspired by the works of the creative illusionist M.C. Escher. His work has always made me think and explore how you can hook your viewer by giving them something to unravel and interact with. His 1960 lithograph, called Ascending and Descending, was my initial starting point. There was something about the figures going round and round but getting nowhere.

The Human Rights Zoetrope titles

While Escher’s work kickstarted my creative process, I also wanted to create something that was illusion-based, so I revisited Mark Gertler’s Merry-Go-Round. As a young art student I had his poster on my wall. Sometimes I would find myself staring at it for hours, looking at the people’s expressions and the movement Gertler had expressed in the figures with his onion-skin-style strokes. There was so much movement within the painting that it jumped out at me. I loved the contrasting colors of orange and blue; the composition was incredibly strong and animated.

I have always been fascinated by the mechanics of old hand-cranked metal toys, including zoetropes, and I have always loved how inanimate objects could come alive to tell you a story. It is very powerful. You have the control to be given the narrative or you can walk away from it — it’s about making a choice and being in control.

Once I had established I was going to build a 3D zoetrope, I explored the mechanics of building one. It was the perfect object to address the issue of human rights because without the trigger it would remain lifeless. I then started digging into the Declaration of Human Rights to put forward a proposal of what I thought would work within their program. I shortlisted 10 rights and culled that down to the final eight. Everything had to be considered; the final eight each had their own hierarchy and place within the structure.

At the base of the zoetrope are water pumps, signifying the right to clean water and sanitation. This is the most important element of the entire zoetrope, grounding the whole structure: without water, there simply is no life, no existence. Above, a prisoner gestures for attention to the outside world, his grim environment given hope by an energetic burst of comforting orange. The gavel references the right to justice and is subliminally inspired by the hammers marching defiantly in the Pink Floyd video Another Brick in the Wall. The gavel within the zoetrope becomes that monumental object of power, helped along by the dynamic camera, with repetitions of itself staggered over time like echoes on a loop. Surrounding the gavel of justice is a dove flying free from a metal birdcage in the shape of the world. This was my reference to the wonderful book I Know Why the Caged Bird Sings by Maya Angelou.

My client wanted to highlight the crisis of the Syrian refugees, so I decided to depict an exhausted child wearing a life jacket, suggesting he had travelled across the Mediterranean Sea, while a young girl at his side, oblivious, happily plays with a spinning top. I wanted to show the negativity being cancelled out by optimism.

To hammer home the feeling of isolation and emptiness that the lack of human rights brings, I placed the zoetrope in a cold, almost brutal environment: an empty warehouse. My theme of positivity canceling out negativity is echoed once again as sunlight penetrates through and hits the cold floor, signifying hope and a reconnection with the outside world.

Every level of detail was broken up into sections. I created very simple one-second loops of animation that were subtle, but enough to tell the story. Once I had animated each section, it was a case of painstakingly pulling each object apart into a stop-frame animated existence, so that once the copies were placed in position and spun, they would animate back into life again.
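To make the mechanics of that staggering concrete, here is a minimal sketch of the underlying arithmetic (my own illustration, not Kyriacou’s actual setup): each frozen copy sits one slot further around the drum and one frame deeper into the loop, so a full revolution of the drum replays the one-second cycle.

    import math

    def zoetrope_layout(num_copies, loop_frames, radius):
        """Place frozen copies of a looping animation around a zoetrope drum.

        Copy i is rotated to slot i and frozen at the loop frame belonging
        to that slot, so spinning the drum one full turn replays the loop.
        """
        slots = []
        for i in range(num_copies):
            angle = 360.0 * i / num_copies               # slot position on the drum
            frame = round(loop_frames * i / num_copies)  # loop frame to freeze at
            x = radius * math.cos(math.radians(angle))
            z = radius * math.sin(math.radians(angle))
            slots.append({"copy": i, "angle": angle, "frame": frame, "pos": (x, z)})
        return slots

    # Eight copies of a 25-frame (one-second) loop spread around the drum.
    for slot in zoetrope_layout(num_copies=8, loop_frames=25, radius=200.0):
        print(slot["copy"], slot["angle"], slot["frame"])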

My Workflow
For ease and budget, I used Poser Pro, a character-animation package, to animate all the figures in isolation first. Using both the PoserFusion plug-in and the Alembic export, I was able to import each looping character into Maxon Cinema 4D, where I froze and separated each 3D object one by one. Any looping objects that were not figure-based were modelled and animated within Cinema 4D. Once the individual components were animated and positioned, I imported everything into a master 3D scene where I was able to focus on the lighting and camera shots.
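Cinema 4D’s Python scripting can take some of the pain out of that repetitive merge step. Here is a minimal sketch along those lines; the clip file names are hypothetical, and this is my illustration rather than Kyriacou’s actual script:

    import c4d

    # Hypothetical Alembic loops exported from Poser via PoserFusion/Alembic.
    CLIPS = ["prisoner_loop.abc", "dove_loop.abc", "spinning_top_loop.abc"]

    def main():
        doc = c4d.documents.GetActiveDocument()
        for path in CLIPS:
            # Merge each looping export into the active master scene.
            merged = c4d.documents.MergeDocument(
                doc, path, c4d.SCENEFILTER_OBJECTS | c4d.SCENEFILTER_MATERIALS)
            if not merged:
                print("Could not merge", path)
        c4d.EventAdd()  # refresh the viewport after the merges

    if __name__ == '__main__':
        main()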

For the zoetrope centrepiece, I built a simple lighting rig made up of the GSG Light Kit Pro (two softboxes that I had adapted and placed within a null) plus an area omni light above. This allowed me to rotate the rig according to my camera shot. Having a default position and brightness setup was great; it helped get me out of trouble if I got a little too carried away with the settings, and it meant the lighting didn’t change too dramatically from shot to shot. I also added a couple of visible area spotlights outside the warehouse pointing inwards to give the environment a foggy, distant feel.

I deliberately chose not to render using volumetric lighting because I didn’t want that specific look and did not want any light bursts hitting my zoetrope. The zoetrope was the star of the show and nothing else. Another lighting feature I tend to use within my work is the combination of the Physical Sky and the Sun. Both give a natural warm feel and I wanted sunlight to burst through the window; it was conceptually important and it added balance to the composition.

The most challenging part of the entire project was getting the lighting to work seamlessly throughout, as well as the composition within some of the camera shots. Some shots were very tight in frame, so I could not rely on the default rig and needed additional lighting to catch objects where the three-point lights didn’t work so well. Rather than work from a single master file, I decided very early on, as with the lighting, to keep a default “get me out of trouble” master, saving each shot with its own independent settings as I went along to keep my workflow clean. Each scene file was around a gigabyte in size, as none of the objects within the zoetrope were parametric anymore once they had been split, separated out and converted to polygons.

My working machine was a 3.2GHz 8-core Mac Pro with 24GB of RAM; rendering was done on a custom-built 3X3 PC with a water-cooled Intel Core i7-5960X, 32GB of RAM, clockable to 4.5GHz.

Since completion, The Human Rights Zoetrope titles have won several awards, including a Gold at the Muse Creative Awards in the Best Motion Graphics category, a Platinum Best of Show in the Art Direction category, and a Gold in the Best Graphic Design category at the Aurora Awards.

The Human Rights Zoetrope is also a Finalist at the New York Festivals 2017 in the Animation: Promotion/Open & IDs category. The winners will be announced at the NAB Show.

 

Sophia Kyriacou is a London-based broadcast designer and 3D artist.

GoPro intros Karma foldable drone, Hero5 with voice-controlled recording

By Brady Betzel

“Hey, GoPro, start recording!” That’s right, voice-controlled recording is here. Does this mean pros can finally start all their GoPros at the same time? More on this in a bit…

I’m one of the lucky few journalists/reviewers who have been brought out to Squaw Valley, California, to hear about GoPro’s latest products first hand — oh, and I got to play with them as well.

So, the long-awaited GoPro Karma drone is finally here, but it’s not your ordinary drone. It is small and foldable so it can fit in a backpack, and the three-axis camera stabilizer can be detached and attached to the included Karma Grip, so you can grab it off the drone as it lands and carry it or mount it. This is huge! If worked out correctly, you can now fake a gigantic jib swing with a GoPro, or even create some ultra-long shots. One of the best parts is that the controller is a videogame-style remote that doesn’t require you to use your phone or tablet! Thank you, GoPro! No, really, thank you.

The Karma is priced at $799, the Karma plus Session is $999, and the Karma plus Hero5 Black is $1,099. And it’s available one day before my birthday next month — hint, hint, nudge, nudge — October 23.

To the Cloud! GoPro Plus and Quik Apps
So you might have been wondering how GoPro intends to build a constant revenue stream. Well, it seems like they are banking on the new GoPro Plus cloud-based subscription service. While your new Hero5 is charging, it can auto-upload photos and videos via a computer or phone. In addition, you will be able to access, edit and share all of it from GoPro Plus. For us editing nerds, this is the hot topic because we want to edit everything from anywhere.

My question is this: If everyone gets on the GoPro Plus train, are they prepared for the storage and bandwidth requirements? Time will tell. In addition to being able to upload to the cloud with your GoPro Plus subscription, you will have a large music library at your disposal, 20 percent off accessories from GoPro.com, exclusive GoPro Apparel and Premium Support.

The GoPro Plus subscription breaks down to $4.99 per month and is available in the US on October 2 — it will be in more markets in January 2017.

Quik App is GoPro’s ambitious attempt at creating an autonomous editing platform. I am really excited about this (even though it basically eliminates the need for an editor — more on this later). While many of you may be hearing about Quik for the first time, it actually has been around for a bit. If you haven’t tried it yet, now is the time. One of the most difficult parts of a GoPro’s end-to-end workflow is the importing, editing and exporting. Now, with GoPro Plus and Quik you will be automatically uploading your Hero5 footage while charging so you can be editing quickly (or Quik-ly. Ha! Sorry, I had to.)

Hero5 Black and Hero5 Session
It’s funny that the Hero5 Black and Session are last on my list. I guess I am kind of putting what got GoPro to the dance last, but last doesn’t in any way mean least!

Hero5 Black

Available on October 2, the Hero5 Black is $399, and includes the following:
● Two-inch touch display with simplified controls
● Up to 4K video at 30fps
● Auto-upload to GoPro Plus while charging
● Voice Control with support for seven languages, with more to come
● Simplified one-button control
● Waterproof, without housing, to 33 feet
● Compatible with existing mounts, including Karma
● Stereo audio recording
● Video Stabilization built-in
● Fish-eye-free wide-angle video
● RAW and WDR (wide dynamic range) photo modes
● GPS built-in!

Hero5 Session is $299 and offers these features:
● Same small design
● Up to 4K at 30fps
● 10 Megapixel photos
● Auto upload to GoPro Plus while charging
● Voice Control support for seven languages with more to come
● Simplified one-button control
● Waterproof, without housing, to 33 feet
● Compatible with existing mounts, including Karma
● Video Stabilization built in
● Fish-eye-free wide-angle video

Summing Up
GoPro has made power moves. They not only took the original action camera — the Hero — to the next level with upgrades like image stabilization, waterproofing without housing and simplified controls in the Hero5 Black and Hero5 Session, they added 4K recording at 30fps and stereo audio recording with Advanced Wind Noise Reduction.

Not only did they upgrade their cameras, GoPro is attempting to revolutionize the drone market with the Karma. The Karma has the potential to bring the limelight back to GoPro and steal some thunder from competitors, like DJI, with this foldable and compact drone, whose three-axis gimbal can be held by the included Karma Grip.

Hero5 Session

Remember that drone teaser video that everyone thought was fake!? Here it is just in case. Looks like that was real, and with some pre-planning you can recreate these awesome shots. What’s even more awesome is that later this year GoPro will be launching the “Quik Key,” a micro-USB card reader that plugs into your phone to transfer your videos and photos, as well as REMO — a voice-activated remote control for the Hero5 (think Apple TV, but for your camera: “GoPro, record video”).

Besides the incredible multimedia products GoPro creates, I really love the family feeling and camaraderie within the GoPro company and athletes they bring in to show off their tools. Coming from the airport to Squaw Valley, I was in the airport shuttle with some mega-pro athletes/content creators like Colin, and they were just as excited as I was.

It was kind of funny because the people who are usually in the projects I edit were next to me geeking out. GoPro has created this amazing, self-contained ecosphere of content creators and content manipulators who are fan-boys and fan-girls. The energy around the GoPro Karma and Hero5 announcement is incredible, and they’ve created their own ultra-positive culture. I wish I could bottle it up and give it out to everyone reading this news.

Check out some video I shot here.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.


‘Suicide Squad’: Imageworks VFX supervisor Mark Breakspear 

By Randi Altman

In Warner Bros.’ Suicide Squad, a band of captured super-villains is released from prison by the government and tasked with working together to fight a common enemy, the Enchantress. This film, which held top box office honors for weeks, has a bit of everything: comic book antiheroes, super powers, epic battles and redemption. It also features a ton of visual effects work that was supervised by Sony Imageworks’ Mark Breakspear, who worked closely with production supervisor Jerome Chen and director David Ayer (see our interview with Ayer).

Mark Breakspear

Breakspear is an industry veteran with more than 20 years of experience as a visual effects supervisor and artist, working on feature films, television and commercials. His credits include American Sniper, The Giver, Ender’s Game, Thor: The Dark World, The Great Gatsby… and that’s just to name a few.

Suicide Squad features approximately 1,200 shots, with Imageworks doing about 300, including the key fight at the end of the film between Enchantress, the Squad, Incubus and Mega Diablo. Imageworks also provided shots for several other sequences throughout the movie.

MPC worked on the majority of the other visual effects, with Third Floor creating postviz after the shoot to help with the cutting of the film.

I recently threw some questions at Breakspear about his process and work on Suicide Squad.

How early did you get involved in the project?
Jerome Chen, the production supervisor, involved us from the very beginning in the spring of 2015. We read the script and started designing one of the most challenging characters — Incubus. We spent a couple of months working with designer Tim Borgmann to finesse the details of his overall look, shape and, specifically, his skin and sub-surface qualities.

How did Imageworks prepare for taking on the film?
We spent time gathering as much information as we could about the work we were looking to do. That involved lengthy calls with Jerome to pick over every aspect of the designs that David Ayer wanted. As it was still pretty early, there was a lot more “something like” rather than “exactly like” when it came to the ideas. But this is what the prepro was for, and we were able to really focus on narrowing down the many ideas into key selections and give the crew something to work with during the shoot in Toronto.

Can you talk about being on set?
The main shoot was at Pinewood in Toronto. We had several soundstages that were used for the creation of the various sets. Shoot days are usually long and arduous, and this was no exception. For VFX crews, the days are typically hectic, quiet, hectic, very hectic, quiet and then suddenly very hectic again. After wrap, you still have to download all the data, organize it and prep everything for the next day.

I had fantastic help on set from Chris Hebert who was our on-set photographer. His job was to make sure we had accurate records (photographic and data sets) of anything that could be used in our work later on. That meant actors, props, witness cameras, texture photography and any specific one-off moments that occur 300 times a day. Every movie set needs a Chris Hebert or it’s going to be a huge struggle later on in post!

OK, let’s dig into the workflow. Can you walk us through it?
Workflow is a huge subject, so I’ll keep the answer somewhat concise! The general day would begin with a team meeting between all the various VFX departments here at Imageworks. The work was split across teams in both Culver City and Vancouver, so we did regular video Hangouts to discuss the daily plan, the weekly targets and generally where we were at, plus specific needs that anyone had. We would usually follow this with department meetings prior to AM dailies, where I would review the latest work from the department leads, give notes, select things to show Jerome and David, and pass along feedback that I may have received from production.

We tried our best to keep our afternoons meeting-free so actual work could get done! Toward the end of the day we would have more dailies, and the final day’s selection of notes and pulls for the client would take place. Most days ended fairly late, as we had to round off the hundreds of emails with meaningful replies, prep for the next day and catch any late submissions from the artists that might benefit from notes before the morning.

What tool, or tools, did you use for remote collaboration?
We used Google Hangouts for video conferencing, and Itview for shot discussion and notes with Jerome and David. Itview is our own software that replaces the need to use [off-the-shelf tools], and allows a much faster, more secure and accurate way to discuss and share shots. Jerome had a system in post and we would place data on it remotely for him to view and comment on in realtime with us during client calls. The notes and drawings he made would go straight into our note tracker and then on to artists as required.

What was the most challenging shot or shots, and why?
Our most challenging work was in understanding and implementing fractals into the design of the characters and their weapons. We had to get up to speed on three-dimensional mandelbulbs and how we could render them into our body of work. We also had to create vortical flow simulations that came off the fractal weapons, which created their own set of challenges due to the nature of how particles uniquely behave near high-velocity emissions.

So there wasn’t a specific shot that was more challenging than another, but the work that went into most of them required a very challenging pre-design and concept solve involving fractal physics to make them work.
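For readers curious about the math, the mandelbulb Breakspear mentions is usually ray-marched with a distance estimator. Here is a minimal sketch of the standard power-8 formulation (my own illustration of the technique, not Imageworks’ pipeline code):

    import numpy as np

    def mandelbulb_de(pos, power=8, max_iter=32, bailout=2.0):
        """Distance estimate to the power-8 mandelbulb at point `pos`.

        Iterates z -> z^power + c in spherical coordinates while tracking
        the running derivative dr, giving the classic ray-marching
        estimate 0.5 * log(r) * r / dr.
        """
        c = np.asarray(pos, dtype=float)
        z = c.copy()
        dr, r = 1.0, 0.0
        for _ in range(max_iter):
            r = np.linalg.norm(z)
            if r > bailout:
                break
            theta = np.arccos(z[2] / r)   # convert to spherical coordinates
            phi = np.arctan2(z[1], z[0])
            dr = power * r ** (power - 1) * dr + 1.0
            zr, theta, phi = r ** power, theta * power, phi * power
            z = zr * np.array([np.sin(theta) * np.cos(phi),
                               np.sin(theta) * np.sin(phi),
                               np.cos(theta)]) + c
        return 0.5 * np.log(r) * r / dr

    # A ray marcher steps each ray forward by this distance until it falls
    # below a pixel-sized threshold (a hit) or leaves the scene.
    print(mandelbulb_de([1.2, 0.0, 0.0]))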

Can you talk about tools — off-the-shelf or proprietary — you used for the VFX? Any rendering in the cloud?
We used Side Effects Houdini and Autodesk Maya for the majority of shots and The Foundry’s Nuke to comp everything. When it came to rendering we used Arnold, and in regards to cloud rendering, we did render remotely to our own cloud, which is about 1,000 miles away — does that count (smiles)?


VFX Supervisor Volker Engel: ‘Independence Day,’ technology and more

Uncharted Territory’s Volker Engel is one of Hollywood’s leading VFX supervisors, working on movies as diverse as White House Down, Hugo and Roland Emmerich’s Shakespeare movie Anonymous. Most recently he was in charge of the huge number of effects for Emmerich’s Independence Day: Resurgence.

Engel was kind enough to make time in his schedule to discuss his 28-year history with Emmerich, his favorite scenes from Independence Day, his experience with augmented reality on set and more.

When did you get involved with Independence Day?
I was probably the earliest person involved after Roland Emmerich himself! He kept me posted over the years while we were working on other projects because we were always going to do this movie.

I think it was 2009 when the first negotiations with 20th Century Fox started, but the important part was early 2014. Roland had to convince the studio regarding the visuals of the project. Everyone was happy with the screenplay, but they said it would be great to get some key images. I hired a company called Trixter — they are based in Germany, but also have an office in LA. They have a very strong art department. In about six weeks we finished 16 images that are what you can call “concept art,” but they are extremely detailed. Most of these concepts can be seen as finished shots in the movie. This artwork was presented to 20th Century Fox and the movie was greenlit.

Concept art via Trixter.

You have worked with Emmerich many times. You must have developed a sort of shorthand?
This is now a 28-year working relationship. Obviously, we haven’t done every movie as a team but I think this is our eighth movie together. There is a shorthand and that helps a lot. I don’t think we really know what the actual shorthand is other than things that we don’t need to talk about because we know what needs to happen.

Technology continues to advance. Does that make life easier, or because you have more options does it make it even more complex?
It’s less the fact that there’s more options, it’s that the audience is so much more sophisticated. We now have better tools available to make better pictures. We can do things now that we were not able to do before. So, for example, now we can imagine a mothership that’s 3,000 miles in diameter and actually lands on Earth. There is a reason we had a smaller mothership in the first movie and that it didn’t touch down anywhere on the planet.

The mothership touching down in DC.

So it changes the way you tell stories in a really fundamental way?
Absolutely. If you look at a movie like Ex Machina, for example, you can show a half-human/half-robot and make it incredibly visually convincing. So all of a sudden you can tell a story that you wouldn’t have been able to tell before.

If you look at the original Independence Day movie, you really only see glimpses of the aliens because we had to do it with practical effects and men in suits. For Independence Day: Resurgence we had the chance to go much further. What I like actually is that Roland decided not to make it too gratuitous, but at least we were able to fully show the aliens.

Reports vary, but they suggest about 1,700 effects shots in Independence Day: Resurgence. Is that correct?
It was 1,748. Close to two-thirds of the movie!

What was your previs process like?
We had two different teams: one joined us from Method Studios and the other was our own Uncharted Territory team, and we split the task in half. The Method artists were working in our facility, so we were all under one roof.

Method focused on the whole lunar sequence, for example, while our in-house team started with the queen/bus chase toward the end of the movie. Roland loves to work with two specific storyboard artists and has several sessions during the week with them, and we used this as a foundation for the previs.

Trixter concept art.

So Roland was involved at the previs stage looking at how it was all going to fit together?
He had an office where the previs team was working, so we could get him over and go literally from artist to artist. We usually did these sessions twice a day.

What tools were you using?
Our in-house artists are Autodesk 3ds Max specialists, and the good folks from Method worked with Autodesk Maya.

The live shoot used camera-tracking technology from Ncam to marry the previs graphics and the live action in realtime, giving a precise impression of how the final composited shot would work.

How were you using the Ncam exactly?
The advantage is that we took the assets we had already built for previs and then re-used them inside the Ncam setup, doing this with Autodesk MotionBuilder. But some of the animation had to be done right there on set.

After: Area 51

I’ll give you an example. When we’re inside the hangar at Area 51, Roland wanted to pan off an actor’s face to reveal 20 jet fighters lifting off and flying into the distance. The Ncam team and Marion [Spates, the on-set digital effects supervisor] had to do the animation for the fighters right there, on the spot. In about five minutes they came up with something, and what’s more, it worked. That’s also why Roland loves to work with Ncam: it gives him the flexibility to make some decisions right there in the moment.

So you’re actually updating or even creating shots on set?
Yes, exactly. We have the toolbox there — the assets like the interior of the hangar — but then we do it right there to the picture. Sometimes for both the A-camera and the B-camera.

We did a lot of extensions and augmentations on this movie and what really helped was our experience of working with Ncam on White House Down. For Roland, as the director, it helps him compose his images instead of just looking at a gigantic bluescreen. That’s really what it is, and he’s really good at that.

Ncam in use on set.

I explain it this way: imagine you already have your first composite right there, which goes straight to editorial. They immediately have something to work with. We just deliver two video files: the clean one with the bluescreen and another from Ncam that has the composite.

Did using Ncam add to the shooting time?
Working with AR on set always adds some shooting time, and it’s really important that the director is briefed and wants to use this tool. The Ncam prep often runs parallel to the rehearsals with the actors, but sometimes it adds two or three additional minutes. When you have someone who’s not prepared for it, two or three minutes can feel like a lifetime. It does, however, save a lot of time in post.

On White House Down, when we used Ncam for the first time, it actually took a little over a week until everything grooved and everyone was aware of it — especially the camera department. After a little while they just knew this is exactly what needed to be done. It all became instant teamwork. It is something that supports the picture and it’s not a hindrance. It’s something that the director really wants.

Do you have a favorite scene from Resurgence?
There is a sequence inside the mothership where our actors are climbing up one of these gigantic columns. We had a small set piece being built for the actors to climb, and it was really important for Roland to compose the whole image. He could ask for a landing platform to be removed and more columns to be added to create a sense of depth, then move the view around another 50 or 60 degrees.

He was creating his images right there, and that’s why the guys have to be really quick on their feet and build these things in and make it work. At the same time, the assistant director is there saying the cameras are ready, the actors are ready and we’re ready to shoot, and of course no one wants them to wait around, so they better have their stuff ready!

The destruction of Singapore.

Some of my other favorite sequences from the film are the destruction of Singapore while the mothership enters the atmosphere and the alien queen chasing the school bus!

What is next for you?
In 1999, when I started Uncharted Territory with my business partner Marc Weigert, we set it up as a production company and started developing our own projects. We joke that Roland keeps interrupting our development work because he comes to us with projects of his own that we just cannot say no to! But we have just come back from a trip to Ireland where we scouted two studios and met with several potential production partners for a new project of our own. Stay tuned!


Talking with new Shade VFX NY executive producer John Parenteau

By Randi Altman

John Parenteau, who has a long history working in visual effects, has been named executive producer of Shade VFX’s New York studio. Shade VFX, which opened in Los Angeles in 2009, provides feature and television visual effects, as well as design, stereoscopic, VR and previs services. In 2014, they opened their New York office to take advantage of the state’s fairly aggressive tax incentives and all the work those incentives bring to the city.

“As a native New Yorker, with over a decade of working as an artist there, the decision to open an office back home was an easy one,” explains owner Bryan Godwin. “With John coming on board as our New York executive producer, I feel our team is complete and poised to grow — continuing to provide feature-film-level visuals. John’s deep experience running large facilities, working with top tier tent-pole clients and access to even more potential talent convinced me that he is the right choice to helm the production efforts out east.”

Shade’s New York office is already flush with work, including Rock that Body for Sony, The OA and The Get Down for Netflix, Mosaic for HBO and Civil for TNT. Not long ago, the shop finished work on Daredevil and Jessica Jones, two of Marvel’s Netflix collaborations. As John helps grow the client list in NYC, he will be supporting NY visual effects supervisor Karl Coyner, and working directly with Shade’s LA-based EP/VP of production Lisa Maher.

John has a long history in visual effects, starting at Amblin Entertainment in the early ‘90s all the way through to his recent work with supercomputer company Silverdraft, which provides solutions for VFX, VR and more. I’ve known him for many years. In fact, I first started spelling John Parenteau’s name wrong when he was co-owner and VFX supervisor at Digital Muse back in the mid to late ‘90s — kidding, I totally know how to spell it… now.

We kept in touch over the years. His passion and love for filmmaking and visual effects has always been at the forefront of our conversations, along with his interest in writing. John even wrote some NAB blogs for me when he was managing director of Pixomondo (they won the VFX Oscar for Hugo during that time) and I was editor-in-chief of Post Magazine. We worked together again when he was managing director of Silverdraft.

“I’ve always been the kind of guy who likes a challenge, and who likes to push into new areas of entertainment,” says John. “But leaving visual effects was less an issue of needing a change and more of a chance to stretch my experience into new fields. After Pixomondo, Silverdraft was a great opportunity to delve into the technology behind VFX and to help develop some unique computer systems for visual effects artists.”

Making the decision to leave the industry a couple years ago to take care of his mother was difficult, but John knew it was the right thing to do. “While moving to Oregon led me away from Hollywood, I never really left the industry; it gets under your skin, and I think it’s impossible to truly get out, even if you wanted to.”

Parenteau realized quickly that the Portland scene wasn’t a hotbed of film and television VFX, so he took the opportunity to apply his experience in entertainment to a new market, founding marketing boutique Bigfoot Robot. “I discovered a strong need for marketing for small- to mid-sized companies, including shooting and editing content for commercials and marketing videos. But I did keep my hand in media and entertainment thanks to one of my first clients, the industry website postPerspective. Randi and I had known each other for so many years, and our new relationship helped her out technically while allowing me to stay in touch with the industry.”

John’s mom passed away over a year ago, and while he was enjoying his work at Bigfoot Robot, he realized how much he missed working in visual effects. “Shade VFX had always been a company I was aware of, and one that I knew did great work,” he says. “In returning to the industry, I was trying to avoid landing in too safe of a spot and doing something I’d already done before. That’s when Bryan Godwin and Dave Van Dyke (owner and president of Shade, respectively) contacted me about their New York office. I saw a great opportunity to help build an already successful company into something even more powerful. Bryan, Lisa and Dave have become known for producing solid work in both feature and television, and they were looking for a missing component in New York to help them grow. I felt like I could fill that role and work with a company that was fun and exciting. There’s also something romantic about living in Manhattan, I have to admit.”

And it’s not just about building Shade for John. “I’m the kind of guy who likes to become part of a community. I hope I can contribute in various ways to the success of visual effects for not only Shade but for the New York visual effects community as a whole.”

While I’ll personally miss working with John on a day-to-day basis, I’m happy for him and for Shade. They are getting a very talented artist, who also happens to be a really nice guy.

Blending Ursa Mini and Red footage for Aston Martin spec spot

By Daniel Restuccio

When producer/director Jacob Steagall set out to make a spec commercial for Aston Martin, he chose to lens it on the Blackmagic Ursa Mini 4.6K and the Red Scarlet. He says the camera combo worked so seamlessly that he dares anyone to tell which shots are Blackmagic and which are Red.

(L-R) Blackmagic’s Moritz Fortmann and Shawn Carlson with Jacob Steagall and Scott Stevens.

“I had the idea of filming a spec commercial to generate new business,” says Steagall. He convinced the high-end carmaker to lend him a 2016 Aston Martin V12 Vanquish for a weekend. “The intent was to make a nice product that could be on their website and also be a good-looking piece on the demo reel for my production company.”

Steagall immediately pulled together his production team, which consisted of co-director Jonathan Swecker and cinematographers Scott Stevens and Adam Pacheco. “The team and I collaborated on the vision for the spot, which was to be quick, clean and to the point, while also accentuating the luxury and sexiness of the car.”

“We had access to the new Blackmagic Ursa Mini 4.6K and an older Red Scarlet with the MX chip,” says Stevens. “I was really interested in seeing how both cameras performed.”

He set up the Ursa Mini to shoot ProRes HQ at Ultra HD (3840×2160) and the Scarlet at 8:1 compression at 4K (4096×2160). He used both Canon still camera primes and a 24-105mm zoom, switching them from camera to camera depending on the shot. “For some wide shots we set them up side by side,” explains Stevens. “We also would have one camera shooting the back of the car and the other camera shooting a close-up on the side.”
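
One practical wrinkle of this two-camera setup is that the formats don't match: the Scarlet's 4K DCI frame is 4096×2160, while the Ursa Mini's UHD frame is 3840×2160. The article doesn't detail how the footage was conformed, but since both formats share the same 2160-line height, a center crop of the wider frame is the usual no-scaling route. A minimal sketch of the arithmetic:

```python
# Center-cropping a 4K DCI frame (4096x2160) to UHD (3840x2160).
# Both formats are 2160 lines tall, so only the width needs trimming.

DCI_4K = (4096, 2160)
UHD = (3840, 2160)

def center_crop_offsets(src, dst):
    """Return the (left, top) offsets that center dst inside src."""
    src_w, src_h = src
    dst_w, dst_h = dst
    if dst_w > src_w or dst_h > src_h:
        raise ValueError("destination must fit inside source")
    return ((src_w - dst_w) // 2, (src_h - dst_h) // 2)

left, top = center_crop_offsets(DCI_4K, UHD)
print(left, top)  # 128 0 -> trim 128 columns from each side, no vertical scaling
```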

In addition to his shooting duties, Stevens also edited the spot, using Adobe Premiere, and exported the XML into Blackmagic DaVinci Resolve Studio 12. Stevens notes that, in addition to loving cinematography, he’s also “really into” color correction. “Jacob [Steagall] and I liked the way the Red footage looked straight out of the camera in the RedGamma4 color space. I matched the Blackmagic footage to the Red footage to get a basic look.”
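
Resolve 12 handled that handoff through its XML import dialog; more recent versions of Resolve also expose a Python scripting API that can automate the same Premiere-to-Resolve round trip. A rough sketch, assuming Resolve is running with external scripting enabled (the file path is hypothetical):

```python
# Sketch: bringing a Premiere Pro XML into DaVinci Resolve via its scripting API.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# ImportTimelineFromFile builds a timeline from the XML, relinking to clips
# already in the media pool where it can.
timeline = media_pool.ImportTimelineFromFile("/path/to/aston_martin_cut.xml")
if timeline:
    print("Imported timeline:", timeline.GetName())
```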

Blackmagic colorist Moritz Fortmann took Stevens’ basic color correction and finessed the grade even more. “The first step was to talk to Jacob and Scott and find out what they were envisioning, what feel and look they were going for. They had already established a look, so we saved a few stills as reference images to work from. The spot was shot on two different types of cameras, and in different formats. Step two was to analyze the characteristics of each camera and establish a color correction to match the two. Step three was to tweak and refine the look. We did what I would describe as a simple color grade, relying only on primaries, without using any Power Windows or keys.”
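
Fortmann's description maps to a classic camera-match exercise: rather than windows or keys, derive a primaries-only adjustment from the same neutral reference (a gray card, say) sampled from each camera. The sketch below illustrates the idea with hypothetical sample values; it is not his actual grade:

```python
import numpy as np

# Illustrative camera match: compute per-channel gain so camera B's neutrals
# land on camera A's, then apply that gain to every frame from camera B.
patch_a = np.array([0.42, 0.41, 0.40])  # hypothetical gray-card RGB from the Red
patch_b = np.array([0.46, 0.43, 0.38])  # hypothetical gray-card RGB from the Ursa Mini

gain = patch_a / patch_b  # per-channel multipliers

def match_primaries(frame_b: np.ndarray) -> np.ndarray:
    """Primaries-only correction: a simple per-channel gain, no windows or keys."""
    return np.clip(frame_b * gain, 0.0, 1.0)

print(gain.round(3))  # [0.913 0.953 1.053]
```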

If you’re planning to shoot mixed footage, Fortmann suggests you use cameras with similar characteristics, matching resolution, dynamic range and format. “Shooting RAW and/or Log provides for the highest dynamic range,” he says. “The more ‘room’ a colorist has to make adjustments, the easier it will be to match mixed footage. When color correcting, the key is to make mixed footage look consistent. One camera may perform well in low light while another one does not. You’ll need to find that sweet spot that works for all of your footage, not just one camera.”
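
Fortmann's point about “room” can be made concrete: a linear encoding spends most of its code values on the brightest stop, while a log curve spreads them roughly evenly, which is what gives a colorist latitude in the shadows. A toy comparison for a generic 10-bit signal over 12 stops (an idealized curve, not any specific camera's):

```python
# Toy comparison: 10-bit code values per stop, linear vs. an idealized log curve.
CODES, STOPS = 1024, 12

for stop in range(STOPS):  # stop 0 is the brightest
    # Linear: stop N occupies the range [2^-(N+1), 2^-N] of full scale.
    linear = CODES * (2 ** -stop - 2 ** -(stop + 1))
    # Idealized log: stops are spaced evenly, so each gets an equal share.
    log = CODES / STOPS
    print(f"stop {stop:2d}: linear {linear:6.2f} codes, log {log:5.1f} codes")
# The brightest stop gets 512 linear codes; the 12th gets a quarter of one.
```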

Daniel Restuccio is a writer and chair of the multimedia department at California Lutheran University.


Sony at NAB with new 4K OLED monitor, 4K 8X Ultra HFR camera

At last year’s NAB, Sony introduced its first 4K OLED reference monitor for critical viewing — the BVM-X300. This year, Sony added a new monitor, the PVM-X550, a 55-inch OLED panel with 12-bit signal processing that is well suited to client viewing. The Trimaster EL PVM-X550 supports HDR through various Electro-Optical Transfer Functions (EOTFs), such as S-Log3, SMPTE ST.2084 and Hybrid Log-Gamma, covering applications for both cinematography and broadcast. The PVM-X550 is a quad-view OLED monitor, allowing customized individual display settings across four distinct views in HD. It is equipped with the same signal-processing engine as the BVM-X300, providing a 12-bit output signal for picture accuracy and consistency. It also supports industry-standard color spaces, including the wider ITU-R BT.2020 for Ultra High Definition.
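
Of the EOTFs listed, SMPTE ST.2084 (the PQ curve used for HDR mastering) is fully specified in the standard, so it is easy to show what such a monitor is actually computing. The function below maps a normalized PQ signal to absolute luminance, using the constants published in ST.2084:

```python
# SMPTE ST.2084 (PQ) EOTF: normalized signal V in [0, 1] -> luminance in cd/m2.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(v: float) -> float:
    """Decode a PQ-encoded signal value to display luminance (cd/m2)."""
    p = v ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(0.5))  # ~92 nits at mid-signal
print(pq_eotf(1.0))  # 10000.0 nits, PQ's absolute peak
```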

HFR Camera
At NAB 2016, Sony displayed its newest camera system: the HDC-4800 combines 4K resolution with enhanced high-frame-rate capabilities, capturing up to 8X at 4K and 16X in full HD. “This camera system can do a lot of everything — very high frame rate, very high resolution,” said Rob Willox, marketing manager for content creation at Sony Electronics.

The HDC-4800 uses a new Super 35mm 4K CMOS sensor, supporting a wide color space (both BT.2020 and BT.709), and provides an industry-standard PL lens mount, giving the system the capability of using the highest-quality cinematic lenses for clear and crisp high-resolution images.

The new sensor brings the system into the cinematic family of RED and Alexa, making it well suited as a competitor to today’s high-end cinematic digital solutions.

An added feature of the HDC-4800 is that it’s specifically designed to integrate with Sony’s companion system, the HDC-4300, a 2/3-inch image sensor 4K/HD camera. Using matching colorimetry and deep-toolset camera adjustments, and with the ability to take advantage of existing build-up kits, remote control panels and master setup units, the two cameras can blend seamlessly.

Archive
Sony also showed the second generation of its Optical Disc Archive System, which adopts new high-capacity optical media rated for a 100-year shelf life, with double the transfer rate and double the capacity of a single cartridge, now 3.3TB. The Generation 2 Optical Disc Archive System also adds an 8-channel optical drive unit, doubling the read/write speeds of the previous generation and helping to meet the data needs of real-time 4K production.

Making our dialogue-free indie feature ‘Driftwood’

By Paul Taylor and Alex Megaro

Driftwood is a dialogue-free feature film that focuses on a woman and her captor in an isolated cabin. We chose to shoot entirely MOS… because we are insane. Or perhaps we were insane to shoot a dialogue-free feature in the first place, but our choice to remove sound recording from the set was both freeing and nerve-racking, given the potential post-production nightmare that lay ahead.

Our decision was based on the fact that, without speech to carry the narrative, every sound would need to be enhanced to fill in the isolated world of our characters. We wanted draconian control over the soundscape, from every footstep to every door creak, but we also knew the sheer volume of work involved would put off all but the bravest post studios.

The film was shot in a week with a cast of three and a crew of three in a small cabin in Upstate New York. Our camera of choice was a Canon 5D Mark II with an array of Canon L-series lenses. We chose the 5D because we already owned it — so more bang for our buck — and also because it gave us a high-quality image, even with such a small body. Its ease of use allowed us to set up quickly, which was important considering our extremely truncated shooting schedule. Having no sound team on set allowed us to move around freely, without concerns about planes passing overhead or cars rumbling in the distance delaying a shot.

The Audio Post
The editing was a wonderfully liberating experience in which we cut purely to image, never once needing to worry about speech continuity or a host of other factors that often come into play with dialogue-driven films. Driftwood was edited on Apple’s Final Cut Pro X, a program that can sometimes be a bit difficult for audio editing, but for this film that was a non-issue. The Magnetic Timeline was actually perfect for the way we constructed this film and made the entire process smooth and simple.

Once picture was locked, we brought the project to New York City’s Silver Sound Studios, who jumped at the chance to design the atmosphere for an entire feature from the ground up. We sat with the engineers at Silver Sound and went through Driftwood shot by shot, creating a master list of all the sounds we thought necessary to include. Some were obvious, such as footsteps, breathing and ticking clocks; others less so, such as the humming of an old refrigerator or the creaking of a wooden chair.

Once the initial list was set, we discussed whether to use stock audio or re-record everything at the original location. Again, because we wanted complete control to create something wholly unique, we concluded it was important to return to the cabin and capture its particular character. Over the course of a few days, the Silver Sound gang re-recorded nearly every sound in the film, leaving only some basic Foley work to complete in their studio.

Once their library was complete, one of the last steps before mixing was to ADR all of the breathing. We had the actors come into the studio over a one-week period, during which they breathed, moaned and sighed inside Silver Sound’s recording booth. These subtle sounds are taken for granted in most films, but for Driftwood they were of the utmost importance. The way the actors sighed or breathed could change the meaning behind that sound and the subtext of the scene. If the characters cannot talk, then their expressions must be conveyed in other ways, and in this case we chose a more physiological track.

By the time we completed the film we had spent over a year recording and mixing the audio. The finished product is a world unto itself, a testament to the laborious yet incredibly exciting work performed by Silver Sound.

Driftwood was written, directed and photographed by Paul Taylor. It was produced and edited by Alex Megaro.