Chatting up IBC’s Michael Crimp about this year’s show

Every year, many from our industry head to Amsterdam for the International Broadcasting Convention. With IBC’s start date coming fast, what better time for the organization’s CEO, Michael Crimp, to answer questions about the show, which runs from September 15-19.

IBC is celebrating its 50th anniversary this year. How will you celebrate?
In addition to producing a commemorative book and throwing our annual party, IBC is starting a new charitable venture, supporting an Amsterdam group that provides support through sport for disadvantaged and disabled children. If you want to play against former Ajax players in our Saturday night match, bid now to join the IBC All-Stars.

It’s also about keeping the conversation going. We are 50 years on and have a huge amount to talk about — from Ultra HD to 5G connectivity, from IP to cyber security.

How has IBC evolved over the past 10 years?
The simple answer is that IBC has evolved along with the industry, or rather, IBC has striven to identify the key trends that will transform the industry and to ensure that we stay ahead of the curve.

Looking back 10 years, digital cinema was still a work in progress: the total transition we have now seen was just beginning. We had dedicated areas focused on mobile video and digital signage, things that we take for granted today. You can see the equivalents in IBC2017, like the IP Showcase and all the work done on interoperability.

Five years ago we started our Leaders’ Summit, the behind-closed-doors conference for CEOs from the top broadcasters and media organizations, and it has proved hugely successful. This year we are adding two more similar, invitation-only events, this time aimed at CTOs. We have a day focusing on cyber security and another looking at the potential for 5G.

We are also trying a new business matchmaking venue this year, the IBC Startup Forum. Working with Media Honeypot, we are aiming to bring startups and scale-ups together with the media companies that might want to use their talents and the investors who might back the deals.

Will IBC and annual trade shows still be relevant in another 50 years?
Yes, I firmly believe they will. Of course, you will be able to research basic information online — and you can do that now. We have added to the online resources available with our IBC365 year-round online presence. But it is much harder to exchange opinions and experiences that way. Human nature dictates that we learn best from direct contact, from friendly discussions, from chance conversations. You cannot do that online. It is why we regard the opportunity to meet old friends and new peers as one of the key parts of the IBC experience.

What are some of the most important decisions you face in your job on a daily basis?
IBC is an interesting business to head. In some ways, of course, my job as CEO is the same as the head of any other company: making sure the staff are all pulling in the same direction, the customers are happy and the finances are secure. But IBC is unlike any other business because our focus is on spreading and sharing knowledge, and because our shareholders are our customers. IBC is organized by the industry for the industry, and at the top of our organization is the Partnership Board, which contains representatives of the six leading professional and trade bodies in the industry: IABM, IEEE, IET, RTS, SCTE and SMPTE.

Can you talk a bit about the conference?
One significant development from that first IBC 50 years ago is the nature of the conference. The founders were insistent that an exhibition needed a technical conference, and in 1967 it was based solely on papers outlining the latest research.

Today, the technical papers program still forms the centerpiece of the conference. But our conference is now much broader, speaking to the creative and commercial people in our community as well as the engineering and operational ones.

This year’s conference is subtitled “Truth, Trust and Transformation,” and has five tracks running over five days. Session topics range from the deeply technical, like new codec design, to fake news and alternative facts. Speakers range from Alberto Duenas, the principal video architect at chipmaker ARM, to Dan Danker, the product director at Facebook.

How are the attendees and companies participating in IBC changing?
The industry is so much broader than it once was. Consumers used to watch television, because that was all that the technology could achieve. Today, they expect to choose what they want to watch, when and where they want to watch it, and on the device and platform which happen to be convenient at the time.

As the industry expands, so does the IBC community. This year, for example, we have the biggest temporary structure we have ever built for an IBC, to house Hall 14, dedicated to content everywhere.

Given that international travel can be painful, what should those outside the EU consider?
Amsterdam is, in truth, a very easy place to reach for visitors from any part of the world. Its airport is a global hub. The EU maintains an open attitude and a practical approach to visas when required, so there should be no barriers to anyone wanting to visit IBC.

The IBC Innovation Awards are always a draw. Can you comment on the calibre of entries this year?
When we decided to add the IBC Innovation Awards to our program, our aim was to reflect the real nature of the industry. We wanted to reward the real-world projects, where users and technology partners got together to tackle a real challenge and come up with a solution that was much more than the sum of its parts.

Our finalists range from a small French-language service based in Canada to Google Earth; from a new approach to transmitters in the USA to an online service in India; and from Asia’s biggest broadcaster to the Spanish national railway company.

The Awards Ceremony on Sunday night is always one of my highlights. This year there is a special guest presenter: the academic and broadcaster Dr. Helen Czerski. The show lasts about an hour and is free to all IBC visitors.

What are the latest developments in adding capacity at IBC?
There is always talk of the need to move to another venue, and of course as a responsible business we keep this continually under review. But where would we move to? There is nowhere that offers the same combination of exhibition space, conference facilities and catering and networking under one roof. There is nowhere that can provide the range of hotels at all prices that Amsterdam offers, nor its friendly and welcoming atmosphere.

Talking of hotels, visitors this year may notice a large building site between hall 12 and the station. This will be a large on-site hotel, scheduled to be open in time for IBC in 2019.

And regulars who have resigned themselves to walking around the hoardings covering up the now not-so-new underground station will be pleased to hear that the North-South metro line is due to open in July 2018. Test trains are already running, and visitors to IBC next year will be able to speed from the centre of the city in under 10 minutes.

As you mentioned earlier, the theme for IBC2017 is “Truth, Trust and Transformation.” What is the rationale behind this?
Everyone has noticed that the terms “fake news” and “alternative facts” are ubiquitous these days. Broadcasters have traditionally been the trusted brand for news: is the era of social media and universal Internet access changing that?

It is a critical topic to debate at IBC, because the industry’s response to it is central to its future, commercially, as well as technically. Providing true, accurate and honest access to news (and related genres like sport) is expensive and demanding. How do we address this key issue? Also, one of the challenges of the transition to IP connectivity is the risk that the media industry will become a major target for malware and hackers. As the transport platform becomes more open, the more we need to focus on cyber security and the intrinsic design of safe, secure systems.

OTT and social media delivery is sometimes seen as “disruptive,” but I think that “transformative” is the better word. It brings new challenges for creativity and business, and it is right that IBC looks at them.

Will VR and AR be addressed at this year’s conference?
Yes, in the Future Zone, and no doubt on the show floor. Technologies in this area are tumbling out, but the business and creative case seems to be lagging behind. We know what VR can do, but how can we tell stories with it? How can we monetize it? IBC can bring all the sides of the industry together to dig into all the issues. And not just in debate, but by seeing and experiencing the state of the art.

Cyber security and security breaches are becoming more frequent. How will IBC address these challenges?
Cyber security is such a critical issue that we have devoted a day to it in our new C-Tech Forum. Beyond that, we have an important session on cyber security on Friday in the main conference with experts from around the world and around the industry debating what can and should be done to protect content and operations.

Incidentally, we are also looking at artificial intelligence and machine learning, with conference sessions in both the technology and business transformation strands.

What is the Platform Futures — Sport conference aiming to address?
Platform Futures is one of the strands running through the conference. It looks at how the latest delivery and engagement technologies are opening new opportunities for the presentation of content.

Sport has always been a major driver – perhaps the major driver – of innovation in television and media. For many years now we have had a sport day as part of the conference. This year, we are dedicating the Platform Futures strand to sport on Sunday.

The stream looks at how new technology is pushing boundaries for live sports coverage; the increasing importance of fan engagement; and the phenomenon of “alternative sports formats” like Twenty20 cricket and Rugby 7s, which provide lucrative alternatives to traditional competitions. It will also examine the unprecedented growth of eSports, and the exponential opportunities for broadcasters in a market that is now pushing towards the half-billion-dollar size.

 

Michael Kammes’ 5 Things – Video editing software

By Randi Altman

Technologist Michael Kammes is back with a new episode of 5 Things, which focuses on simplifying film, TV and media technology. The web series answers, according to Kammes, the “five burning tech questions” people might have about technologies and workflows in the media creation space. This episode tackles professional video editing software being used (or not used) in Hollywood.

Why is now the time to address this segment of the industry? “The market for NLEs is now more crowded than it has been in over 20 years,” explains Kammes. “Not since the dawn of modern NLEs have there been this many questions over what tools should be used. In addition, the massive price drop of NLEs, coupled with the pricing shift (monthly/yearly, as opposed to outright) has created more confusion in the market.”

In his video, Kammes focuses on Avid Media Composer, Adobe Premiere, Apple Final Cut Pro, Lightworks, Blackmagic Resolve and others.

Considering its history and use on some major motion pictures (such as The Wolf of Wall Street), why hasn’t Lightworks made more strides in the Hollywood community? “I think Lightworks has had massive product development and marketing issues,” shares Kammes. “I rarely see the product pushed online, at user groups or in forums. EditShare, the parent company of Lightworks, also deals heavily in storage, so one can only assume the marketing dollars are being spent on larger-ticket items like professional and enterprise storage over a desktop application.”

What about Resolve, considering its updated NLE tools and the acquisition of audio company Fairlight? Should we expect to see more Resolve being used as a traditional NLE? “I think in Hollywood, adoption will be very, very slow for creative editorial, and unless something drastic happens to Avid and Adobe, Resolve will remain in the minority. For dailies, transcodes or grading, I can see it only getting bigger, but I don’t see larger facilities adopting Resolve for creative editorial. Outside of Hollywood, I see it gaining more traction. Those outlets have more flexibility to pivot and try different tools without being locked into the TV and feature film machine in Hollywood.”

Check it out:


Industry vets open NYC post boutique Twelve

Colorist Lez Rudge and veteran production and post executives Marcelo Gandola, Axel Ericson and Ed Rilli have joined forces to launch New York City-based Twelve, a high-end post boutique for the advertising, film and television industries. Twelve has already been working on campaigns for Jagermeister, Comcast, Maybelline and the NY Rangers.

Twelve’s 4,500-square-foot space in Manhattan’s NoMad neighborhood features three Blackmagic Resolve color rooms, two Autodesk Flame suites and a 4K DI theater with a 7.1 Dolby surround sound system and 25-person seating capacity. Here, clients also have access to a suite of film and production services — editorial, mastering, finishing and audio mixing — as part of a strategic alliance with Ericson and his team at Digital Arts. Ericson, who brings 25 years of experience in film and television, also serves as managing partner of Twelve.

From Twelve’s recent Avion tequila campaign.

Managing director Rilli will handle client relations, strategy, budgets and deadlines, among other responsibilities. He was previously head of production at Nice Shoes for 17 years. His long list of agency clients includes Hill Holliday, Publicis, Grey and Saatchi & Saatchi, and projects for Dunkin’ Donuts, the NFL, Maybelline and Ford.

Gandola was most recently chief operations officer at Harbor Picture Company. Other positions include EVP at Hogarth, SVP of creative services at Deluxe, VP of operations at Company 3 and principal of Burst @ Creative Bubble, a digital audio and sound design company.

On the creative side, Rudge was formerly a colorist and partner at Nice Shoes. Since 2015, Rudge has also been focusing on his directorial career. His most recent campaign for the NY Rangers and Madison Square Garden — a concept-to-completion project via Twelve — garnered more than 300,000 Facebook hits on its first day.

While Twelve is currently working on short-form content, such as commercials and marketing campaigns, the company is making a concerted effort to extend its reach into film and television. Meanwhile, the partners also have a significant roster expansion in the works.

“After all of these years on both the vendor and client side, we’ve learned how best to get things done,” concludes Gandola. “In a way, technology has become secondary, and artistry is where we keep the emphasis. That’s the essence of what we want to provide clients, and that’s ultimately what pushed us to open our own place.”

Main Image (L-R): Ed Rilli, Axel Ericson, Lez Rudge & Marcelo Gandola


Millennium DXL camera: development to delivery

By Lance Holte and Daniel Restuccio

Panavision’s Millennium DXL 8K may be one of today’s best digital cinema cameras, but it might also be one of the most misunderstood. Conceived and crafted to the exacting tradition of the company whose cameras captured such films as Lawrence of Arabia and Inception, the Millennium DXL challenges expectations. We recently sat down with Panavision to examine the history, workflow, some new features and how that all fits into a 2017 moviemaking ecosystem.

Announced at Cine Gear 2016, and released for rent through Panavision in January 2017, the Millennium DXL stepped into the digital large format field as, at first impression, a competitor to the Arri Alexa 65. The DXL was the collaborative result of a partnership of three companies: Panavision developed the optics, accessories and some of the electronics; Red Digital Cinema designed the 8K VV (VistaVision) sensor; and Light Iron provided the features, color science and general workflow for the camera system.

The collaboration for the camera first began when Light Iron was acquired by Panavision in 2015. According to Michael Cioni, Light Iron president/Millennium DXL product manager, the increase in 4K and HDR television and theatrical formats like Dolby Vision and Barco Escape created the perfect environment for the three-company partnership. “When Panavision bought Light Iron, our idea was to create a way for Panavision to integrate a production ecosystem into the post world. The DXL rests atop Red’s best tenets, Panavision’s best tenets and Light Iron’s best tenets. We’re partners in this — information can flow freely between post, workflow, color, electronics and data management into cameras, color science, ergonomics, accessories and lenses.”

HDR OLED viewfinder

Now, one year after the first announcement, with projects like the Lionsgate feature adventure Robin Hood, the Fox Searchlight drama Can You Ever Forgive Me?, the CBS crime drama S.W.A.T. and a Samsung campaign shot by Oscar-winner Linus Sandgren under the DXL’s belt, the camera sports an array of new upgrades, features and advanced tools. They include an HDR OLED viewfinder (which they say is the first), wireless control software for iOS, and a new series of lenses. According to Panavision, the new DXL offers “unprecedented development in full production-to-post workflow.”

Preproduction Considerations
With so many high-resolution cameras on the market, why pick the DXL? According to Cioni, cinematographers and their camera crews are no longer the only people who directly interact with cameras. Panavision examined the impact a camera has on each production department — camera assistants, operators, data managers, DITs, editors and visual effects supervisors — and, in response to this feedback, designed the DXL to offer custom toolsets for every department. In addition, Panavision wanted to leverage its heritage lenses and make the same glass that photographed Lawrence of Arabia available to a wider range of today’s filmmakers on the DXL.

When Arri first debuted the Alexa 65 in 2014, there were questions about whether such a high-resolution, data-heavy image was necessary or beneficial. But cinematographers jumped on it, leaning on large format sensors and glass — on pictures ranging from Doctor Strange to Rogue One — to deliver greater immersiveness, detail and range. The large format trend is only accelerating, particularly among filmmakers interested in the optical magnification, depth of field and field-of-view characteristics that only large format photography offers.

Kramer Morgenthau

“I think large format is the future of cinematography for the big screen,” says cinematographer Kramer Morgenthau, who shot with the DXL in 2016. “[Large format cinematography] gives more of a feeling of the way human vision is. And so, it’s more cinematic. Same thing with anamorphic glass — anamorphic does a similar thing, and that’s one of the reasons why people love it. The most important thing is the glass, and then the support, and then the user-friendliness of the camera to move quickly. But these are all important.”

The DXL comes to market offering a myriad of creative choices for filmmakers. Among large format cameras, the Millennium DXL aims to be the crème de la crème: it’s built around a 46mm 8192×4320 Red VV sensor, pairs with custom Panavision large format spherical and anamorphic lenses, is wrapped in camera-department-friendly electronics and uses proprietary color science — all of which complements a mixed camera environment.

“The beauty of digital, and this camera in particular, is that DXL actually stands for ‘digital extra light.’ With a core body weight of only 10 pounds, and with its small form factor, I’ve seen DXL used in the back seat of a car as well as to capture the most incredible helicopter scenes,” Cioni notes.

With the help of Light Iron, Panavision developed a tool to match DXL footage to Panavised Red Weapon cameras. Guardians of the Galaxy Vol. 2 used Red Weapon 8K VV cameras with Panavision Primo 70 lenses. “There are shows like Netflix’s 13 Reasons Why [Season Two] that combined this special matching of the DXL and the Red Helium sensor based on the workflow of the show,” Cioni notes. “They’re shooting [the second season] with two DXLs as their primary camera, and they have two 8K Red cameras with Helium sensors, and they match each other.”

If you are thinking the Millennium DXL will bust your budget, think again. Like many Panavision cameras, the DXL is exclusively leasable through Panavision, but Cioni says they’re happy to help filmmakers build the right package and workflow. “A lot of budgetary expense can be avoided with a more efficient workflow. Once customers learn how DXL streamlines the entire imaging chain, a DXL package might not be out of reach. We always work with customers to build the right package at a competitive price,” he says.

Using the DXL in Production
The DXL could be perceived as a classic dolly Panavision camera, especially with the large format moniker. “Not true,” says Morgenthau, who shot test footage with the camera slung over his shoulder in the back seat of a car.

He continues, “I sat in the back of a car and handheld it — in the back of a convertible. It’s very ergonomic and user-friendly. I think what’s exciting about the Millennium: its size and integration with technology, and the choice of lenses that you get with the Panavision lens family.”

Panavision’s fleet of large format lenses, many of which date back to the 1950s, made the company uniquely equipped to develop the new series of large format optics. Available by the end of 2017, the Primo Artiste lenses are a full series of T1.8 primes — the fastest optics available for large format cinematography — with a completely internalized motor and built-in metadata capture. Additionally, the Primo Artiste lenses can be outfitted with an anamorphic glass attachment that retains the spherical nature of the base lens yet induces anamorphic artifacts like directional flares and distorted bokeh.

Another new addition to the DXL is the previously mentioned HDR OLED Primo viewfinder from Panavision. Offering 600-nit brightness, image smoothing and optics designed to limit eye fatigue, the viewfinder also boasts a theoretical contrast ratio of 1,000,000:1. Like other elements on the camera, the Primo viewfinder was the result of extensive polling and camera operator feedback. “Spearheaded by Panavision’s Haluki Sadahiro and Dominick Aiello, we went to operators and asked them everything we could about what makes a good viewfinder,” notes Cioni. “Guiding an industry game-changing product meant we went through multiple iterations. We showed the first Primo HDR prototype version in November 2016, and after six months of field testing, the final version is both better and simpler, and it’s all thanks to user feedback.”

Michael Cioni

In response to the growing popularity of HDR delivery, Light Iron also provides a powerful on-set HDR viewing solution. The HDR Village cart is built with a 4K HDR Sony monitor with numerous video inputs. The system can simultaneously display A and B camera feeds in high dynamic range and standard dynamic range on four different split quadrants. This enables cinematographers to evaluate their images and better prepare for multi-format color grading in post, given that most HDR projects are also required to deliver in SDR.

Post Production
The camera captures R3D files, just like any other Red camera, but the files carry metadata unique to the DXL, ranging from color science to lens information. It also uses Light Iron’s set of color matrices designed specifically for the DXL: Light Iron Color.

Designed by Light Iron supervising colorist Ian Vertovec, Light Iron Color deviates from traditional digital color matrices by following in the footsteps of film stock philosophy instead of direct replication of how colors look in nature. Cioni likens Light Iron Color to Kodak’s approach to film. “Kodak tried to make different film stocks for different intentions. Since one film stock cannot satisfy every creative intention, DXL is designed to allow look transforms that users can choose, export and integrate into the post process. They come in the form of cube lookup tables and are all non-destructive.”
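The “non-destructive” part simply means the look is applied at display or render time while the recorded code values stay untouched. As a rough illustration, here is a toy one-dimensional version of a lookup-table transform (Python; real .cube files are typically 3D LUTs, and this three-point look is invented for the example, not Light Iron Color):

```python
def apply_1d_lut(value, lut):
    """Map a normalized code value (0.0-1.0) through a 1D LUT using
    linear interpolation. Non-destructive: the source value is read,
    never overwritten."""
    pos = value * (len(lut) - 1)   # position in LUT index space
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

look = [0.0, 0.6, 1.0]            # toy 3-point "look" (invented values)
print(apply_1d_lut(0.25, look))   # 0.3 -- midtones lifted by the look
```

Because the transform is a separate file applied downstream, swapping or tweaking the look later requires no re-transcode of the camera originals, which is the workflow benefit Cioni describes.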

Light Iron Color can be adjusted and tweaked by the user or by Light Iron, which Cioni says has been done on many shows. The ability to adjust Light Iron Color to fit a particular project is also useful on shows that shoot with multiple camera types. Though Light Iron Color was designed specifically for the Millennium DXL, Light Iron has used it on other cameras — including the Sony A7, and Reds with Helium and Dragon sensors — to ensure that all the footage matches as closely as possible.

While it’s possible to edit online with high-resolution media on a blazing-fast workstation and storage solution, it’s a lot trickier to edit online with 8K media in a post production environment that often requires multiple editors, assistants, VFX editors, post PAs and more. The good news is that the DXL records onboard low-bitrate proxy media (ProRes or DNx) for offline editorial while simultaneously recording R3Ds, without requiring an external recorder.

Cioni’s optimal camera recording setup for editorial is 5:1 compression for the R3Ds alongside 2K ProRes LT files. He explains, “My rule of thumb is to record super high and super low. And if I have high-res and low-res and I need to make something else, I can generate that somewhere in the middle from the R3Ds. But as long as I have the bottom and the top, I’m good.”

Storage is also a major post consideration. An hour of 8192×4320 R3Ds at 23.976fps runs in the 1TB/hour range — that number may vary, depending on the R3D compression, but when compared to an hour of 6560×3100 Arriraw footage, which lands at 2.6TB an hour, the Millennium DXL’s lighter R3D workflow can be very attractive.
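A back-of-the-envelope calculation reproduces both figures. The sketch below (Python) estimates per-hour storage from resolution, frame rate and bit depth; the bit depths and the ~6:1 compression ratio are assumptions for illustration, since real R3D data rates vary with scene content and REDCODE setting:

```python
def raw_storage_tb(width, height, fps, bit_depth, compression=1.0, hours=1.0):
    """Rough per-hour storage for raw camera footage, in terabytes.
    Assumes one raw sample per photosite (Bayer sensor) stored at the
    given container bit depth."""
    bytes_per_frame = width * height * bit_depth / 8
    total_bytes = bytes_per_frame * fps * 3600 * hours / compression
    return total_bytes / 1e12

# 8K VV R3D at 23.976fps, assumed 16-bit container and ~6:1 REDCODE
print(round(raw_storage_tb(8192, 4320, 23.976, 16, compression=6), 2))  # ~1.02

# 6560x3100 Arriraw at 24fps, assumed uncompressed 12-bit
print(round(raw_storage_tb(6560, 3100, 24, 12), 2))                     # ~2.64
```

Under those assumptions the estimates land at roughly 1TB/hour for 8K R3D and 2.6TB/hour for Arriraw, matching the figures above.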

Conform and Delivery
One significant aspect of the Millennium DXL workflow is that even though the camera’s sensor, body, glass and other pipeline tools are all recently developed, R3D conform and delivery workflows remain tried and true. The onboard proxy media exactly matches the R3Ds by name and timecode, and since Light Iron Color is non-destructive, the conform and color-prep process is simple and adjustable, whether the conform is done with Adobe, Blackmagic, Avid or other software.
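That name-and-timecode relink can be pictured with a minimal sketch (Python; `Clip`, `relink` and the sample clip names are hypothetical for illustration — real conform tools read these fields from the media’s metadata):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Clip:
    name: str      # clip/reel name, e.g. "A001_C002" (hypothetical)
    start_tc: str  # SMPTE start timecode, e.g. "01:00:00:00"

def relink(proxies, masters):
    """Match each offline proxy back to its camera original by the
    (clip name, start timecode) pair; unmatched proxies map to None."""
    index = {(m.name, m.start_tc): m for m in masters}
    return {p: index.get((p.name, p.start_tc)) for p in proxies}

proxies = [Clip("A001_C002", "01:00:00:00")]
masters = [Clip("A001_C002", "01:00:00:00"),
           Clip("A001_C003", "01:05:00:00")]
assert relink(proxies, masters)[proxies[0]] is masters[0]
```

Because the onboard proxies and R3Ds share both identifiers, this matching is unambiguous, which is what keeps the conform simple regardless of which finishing application performs it.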

Additionally, since Red media can be imported into almost all major visual effects applications, it’s possible to work with the raw R3Ds as VFX plates. This retains the lens and camera metadata for better camera tracking and optical effects, provides the flexibility of working with Light Iron Color turned on or off, and the 8K R3Ds are still lighter than the 4K DPX or EXR plates that are the current VFX trend. The resolution also affords enormous space for opticals and stabilization in a 4K master.

4K is the increasingly common delivery resolution among studios, networks and over-the-top content distributors, but in a world of constant remastering and an exponential increase in television and display resolutions, the benefit of future-proofing a picture is easily apparent. Baselight, Resolve, Rio and other grading and finishing applications can handle 8K resolutions, and even if the final project is only rendered at 4K now, conforming and grading in 8K ensures the picture will be future-proofed for some time. It’s a simple task to re-export a 6K or 8K master when those resolutions become the standard years down the line.

After playing with DXL footage provided by Light Iron, we were surprised by how straightforward the workflow is. For a very small production, the trickiest part is the requirement of a powerful workstation — or sets of workstations — to conform and play 8K Red media, with a mix of (likely) 4K VFX shots, graphics and overlays. Michael Cioni notes, “[Everyone] already knows a RedCode workflow. They don’t have to learn it. I could show the DXL to anyone who has a Red Raven and in 30 seconds they’ll confidently say, ‘I got this.’”


Choosing the right workstation set-up for the job

By Lance Holte

Like virtually everything in the world of filmmaking, the number of available options for a perfect editorial workstation is almost infinite. The vast majority of systems can be greatly customized and expanded, whether by custom order, upgraded internal hardware or with expansion chassis and I/O boxes. In a time when many workstations are purchased, leased or upgraded for a specific project, the workstation buying process is largely determined by the project’s workflow and budget.

One of Harbor Picture Company’s online rooms.

In my experience, no two projects have identical workflows. Even if two projects are very similar, there are usually some slight differences — a different editor, a new camera, a shorter schedule, bigger storage requirements… the list goes on and on. The first step for choosing the optimal workstation(s) for a project is to ask a handful of broad questions that are good starters for workflow design. I generally start by requesting the delivery requirements, since they are a good indicator of the size and scope of the project.

Then I move on to questions like:

What are the camera/footage formats?
How long is the post production schedule?
Who is the editorial staff?

Often there aren’t concrete answers to these questions at the beginning of a project, but even rough answers point the way to follow-up questions. For instance, Q: What are the video delivery requirements? A: It’s a commercial campaign — HD and SD ProRes 4444 QTs.

Simple enough. Next question.

Christopher Lam from SF’s Double Fine Productions/ Courtesy of Wacom.

Q: What is the camera format? A: Red Weapon 6K, because the director wants to be able to do optical effects and stabilize most of the shots. This answer makes it very clear that we’re going to be editing offline, since the commercial budget doesn’t allow for the purchase of a blazing system with a huge, fast storage array.

Q: What is the post schedule? A: Eight weeks. Great. This should allow enough time to transcode ProRes proxies for all the media, followed by offline and online editorial.

At this point, it’s looking like there’s no need for an insanely powerful workstation, and the schedule suggests we’ll only need one editor and an assistant. Q: Who is the editorial staff? A: The editor is an Adobe Premiere guy, and the ad agency wants to spend a ton of time in the bay with him. Now, we know that agency folks really hate the technical slowdowns that can sometimes occur with equipment that is pushing the envelope, so this workstation just needs to be simple and reliable.

Macs make agency guys comfortable, so let’s go with a Mac Pro for the editor. If possible, I prefer to connect the client monitor directly via HDMI, since that avoids the delay issues sometimes caused by HDMI-to-SDI converters. Of course, that occupies the Mac Pro’s single HDMI port, so the desktop monitors and the audio I/O box will use two or three Thunderbolt ports. If the assistant editor doesn’t need such a powerful system, a high-end iMac could suffice.

(And for those who don’t mind waiting until the new iMac Pro ships in December, Apple’s latest all-in-one workstation seems to signal a committed return to the professional creative world – and is an encouraging sign for the Mac Pro overhaul expected in 2018. The iMac Pro addresses its non-upgradability by future-proofing itself as the most powerful all-in-one machine Apple has released. The base model starts at a hefty $4,999 with a 5K display, and offers options for up to an 18-core Xeon processor, 128GB of RAM and an AMD Radeon Pro Vega GPU. As more and more applications add OpenCL acceleration for AMD GPUs, the iMac Pro should stay relevant for a number of years.)

Now, our workflow would be very different if the answer to the first question had instead been A: It’s a feature film. Technicolor will handle the final delivery, but we still want to be able to make in-house 4K DCPs for screenings, EXR and DPX sequences for the VFX vendors, Blu-ray screeners, as well as review files and create all the high-res deliverables for mastering.

Since this project is a feature film, likely with a much larger editorial staff, the workflow might be better suited to editorial in Avid (to use project sharing/bin locking/collaborative editing). And since it turns out that Technicolor is grading the film in Blackmagic Resolve, it makes sense to online the film in Resolve and then pass the project over to Technicolor. Resolve will also cover any in-house temp grading and DCP creation and can handle virtually any video file.

PCs
For the sake of comparison, let’s build out some workstations on the PC side to cover our editors, assistants, online editors, VFX editors and artists, and temp colorist. PC vs. Mac will likely remain a hotly debated topic in this industry for some time, but there is no denying that a PC returns more cost-effective power than a similarly specced Mac, at the expense of increased complexity (and the potential for more technical issues). I also appreciate the longer lifespan of machines that can be upgraded and expanded easily, without requiring expansion chassis or external GPU enclosures.

I’ve had excellent success with the HP Z line — using Z840s for serious finishing machines and Z440s and Z640s for offline editorial workstations. There are almost unlimited options for desktop PCs, but only certain workstations and components are certified for various post applications, so it pays to do certification research when building a workstation from the ground up.

The Molecule‘s artist row in NYC.

It’s also important to keep the workstation components balanced. A system is only as strong as its weakest link, so a workstation with an insanely powerful GPU, but only a handful of CPU cores will be outperformed by a workstation with 16-20 cores and a moderately high-end GPU. Make sure the CPU, GPU, and RAM are similarly matched to get the best bang for your buck and a more stable workstation.

Relationships!
Finally, in terms of getting the best bang for your buck, there’s one trick that reigns supreme: build great relationships with hardware companies and vendors. Hardware companies are always looking for quality input, advice and real-world testing, and they are often willing to lend (or give) new equipment in exchange for case studies, reviews, workflow demonstrations and press. Building these relationships is not only a great way to stay up to date with cutting-edge equipment; it also expands your support options and technical network, and it’s the best opportunity to be directly involved with development. So go to trade shows, be active on forums, teach, write and generally be as involved as possible, and your equipment will thank you.

Our main image is courtesy of editor/compositor Fred Ruckel.

 


Lance Holte is an LA-based post production supervisor and producer. He has spoken and taught at such events as NAB, SMPTE, SIGGRAPH and Createasphere. You can email him at lance@lanceholte.com.


Doing more with Thunderbolt 3

Streamlined speed on set or in the studio

By Beth Marchant

It was only six years ago that Thunderbolt, the high-speed data transfer and display standard co-developed by Apple and Intel, first appeared in Apple’s MacBook Pros and iMacs. Since then, the cable that blends PCI Express, DisplayPort and power delivery has jolted its way toward ubiquity, giving computers and peripherals increased speed and functionality with every iteration.

Content creators were the first to discover its potential, and gamers quickly followed. Intel, which now owns the sole rights to the spec, announced in late May it would put Thunderbolt 3 into all of its future CPUs and release the spec to the industry in 2018. In a related blog post, Intel VP Chris Walker called Thunderbolt 3 “one of the most significant cable I/O updates since the advent of USB.” The company envisions not just a faster port, but “a simpler and more versatile port, available for everyone, coming to approximately 150 different PCs, Macs and peripherals by the end of this year,” said Walker.

So what can it do for you on set or in the studio? First, some thumbnail facts about what it does: Thunderbolt 3 clocks 40Gbps transfer speeds, twice the bandwidth of Thunderbolt 2 and eight times that of USB 3.0. T3 also adopts the USB-C connector, which finally makes it usable with Windows-based workstations as well as with Macs. On top of those gains, a T3 port now lets you daisy-chain up to six devices and two 4K monitors — or one 5K monitor — to a laptop through a single connection. According to Intel’s Walker, “We envision a future where high-performance single-cable docks, stunning photos and 4K video, lifelike VR, and faster-than-ever storage are commonplace.” That’s an important piece of the puzzle for filmmakers who want their VR projects and 4K+ content to reach the widest possible audience.

The specification for Thunderbolt 3, first released in 2015, gave rise to a smattering of products in 2016, most importantly the MacBook Pro with Thunderbolt 3. At NAB this year, many more flexible RAID storage and improved T3 devices that connect directly to Mac and Windows computers joined their ranks. In June, Apple released iMacs with TB3.

For directors Jason and Josh Diamond, a.k.a. The Diamond Brothers, upgrading to new TB3-enabled laptops is their first priority. “When we look at the data we’re pushing around, be it 24 cameras from a VR shoot, or many TBs of 8K R3Ds from a Red Helium multicam shoot, one of the most important things in the end is data transfer speed. As we move into new computers, drives and peripherals, USB-C and TB3 finally have ubiquity across our Mac and PC systems that we either own or are looking to upgrade to. This makes for much easier integrations and less headaches as we design workflows and pathways for our projects,” says Jason Diamond, The Diamond Bros./Supersphere.

If you are also ready to upgrade, here is a sampling of recently released products that can add Thunderbolt 3 performance to your workflow.

CalDigit docking station

Clean Up the Clutter
CalDigit was one of the first to adopt the Thunderbolt interface when it debuted in 2011, so it’s no surprise that the first shipment of the CalDigit Thunderbolt Station 3 (TS3) docking station introduced at NAB 2017 sold out quickly. The preorders taken at the show are expected to ship soon. TS3 is designed to be a streamlined, central charging hub for the MacBook Pro, delivering 85W of laptop charging over Thunderbolt 3, along with audio in and out, two eSATA ports, two USB 3.1 Type-A ports, Gigabit Ethernet and a DisplayPort. DisplayPort lets users connect to a range of monitors with a DisplayPort to HDMI, DVI or VGA cable.

CalDigit also introduced the TS3 Lite, shipping now, which will work with any Thunderbolt 3 computer from PCs to iMacs or MacBook Pros and features two Thunderbolt 3 ports, Gigabit Ethernet, audio in and out, an AC power adapter and DisplayPort. It includes two USB 3.1 Type-A ports — one on the back and one on its face — that let you charge your iPhone even when the dock isn’t connected to your computer.

The Need for Speed
Like the other new T3 products on the market, LaCie‘s 6big and 12big Thunderbolt 3 RAID arrays feature both Thunderbolt 3 and USB 3.1 interfaces for Mac- or Windows-based connections.

LaCie 12Big

But as their names imply, the relatively compact “big” line ramps up to 120TB in the 12big desktop tower. The hardware RAID controller and 7200RPM drives inside the 12big deliver speeds of up to 2600MB/s, and up to 2400MB/s even in RAID 5. That will significantly speed up how quickly you ingest footage or move through an edit or grade in the course of your day (or late night!). Thanks to Thunderbolt 3, multiple streams of ProRes 422 (HQ), ProRes 4444 XQ and uncompressed 10-bit and 12-bit HD video are now much easier to handle at once. Preview render rates also get a welcome boost.
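Figures like these translate directly into how many streams an edit can pull at once. A back-of-the-envelope check (the per-stream bitrates below are approximate published UHD 29.97 figures, used here as assumptions):

```python
# Back-of-the-envelope stream counts for a 2600 MB/s array.
# Per-stream bitrates are approximate ProRes figures at UHD 29.97
# (assumptions drawn from published codec data rates).
BITRATES_MBPS = {
    "ProRes 422 HQ": 707,    # megabits per second, approximate
    "ProRes 4444 XQ": 1591,  # megabits per second, approximate
}

ARRAY_MBS = 2600  # quoted sustained throughput, megabytes per second

for codec, mbps in BITRATES_MBPS.items():
    per_stream = mbps / 8                   # convert Mb/s to MB/s
    streams = int(ARRAY_MBS // per_stream)  # whole streams that fit
    print(f"{codec}: ~{per_stream:.0f} MB/s each -> {streams} streams")
```

In practice you would leave generous headroom below those ceilings for seeks, render caching and background tasks.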

The new Pegasus3 R4, R6 and R8 RAIDs from Promise debuted at Apple’s WWDC 2017 in early June and were designed to integrate seamlessly with Apple’s latest Thunderbolt 3-enabled offerings, including the iMac Pro coming in December. They deliver 16TB to 80TB of desktop storage and can also sync with the company’s Apollo Cloud personal storage device, which lets you share small clips or low-res review files with a group via mobile devices while in transit. When used with Promise’s SANLink Series, the new Pegasus3 models can also be shared over a LAN.

Lighten the Load on Set
If you regularly work with large media files on set, more than one G-Technology G-Drive ev series drive is likely on your cart. The latest version of the series so popular with DITs has a Thunderbolt 3-enabled drive for improved transfer speeds and an HDMI port, so you can daisy-chain the drive and a monitor through a single connection on a laptop. Users of G-Tech ev series drives who need even more robust Thunderbolt 3 RAID on location — say, to support multistream 8K and VR — now have another option: the eight-bay G|Speed Shuttle XL with ev Series Bay Adapters that G-Tech introduced at NAB. Shipping this month, it comes in RAID-0, -1, -5, -6 and -10 configurations, includes two T3 ports and ranges in price from $3,999.95 (24TB) to $6,599.95 (60TB).

Sonnet Cfast 2.0 Pro card reader

Transfer Faster on Location
One of the first card readers with a Thunderbolt 3 interface is the SF3 Series — CFast 2.0 Pro, launched in May by Sonnet Technologies. Dual card slots let the reader ingest files simultaneously from Canon, Arri and Blackmagic cameras at concurrent data transfer speeds of up to 1,000MB/s, twice as fast as a USB 3.0 reader. The lightweight, extruded-aluminum shell is made to handle as much abuse as you can throw at it.

Stereoscopic-Ready
The Thunderbolt 3 version of Blackmagic’s UltraStudio 4K Extreme overcame two critical obstacles when it began shipping last year: it was finally fast enough to support RGB and stereoscopic footage while working in 4K and it could

Blackmagic UltraStudio 4K Extreme

be connected directly to color correction systems like DaVinci Resolve via its new Thunderbolt 3 port. The 40 Gbps transfer speeds are “fast enough for the most extreme, high bit-depth uncompressed RGB 4K and stereoscopic formats,” says Blackmagic’s Grant Petty.

Blackmagic introduced the UltraStudio HD Mini with Thunderbolt 3 at NAB this year. It adds 3G-SDI and HDMI along with analog connections for 10-bit recording up to 1080p60 and 2K DCI, likely making it the first of its kind. It’s aimed at live broadcast graphics, editing and archiving.

Connect Back to PCI-E and Be Eco-Friendly
OWC makes little black boxes that do two very important things: give you back your PCI-Express card options while also helping the planet. The zero-emissions Mac and PC technology company began shipping the updated OWC Mercury Helios with Thunderbolt 3 expansion chassis in May. The box includes two Thunderbolt 3 ports, a PCIe slot and a Mini DisplayPort, which let you connect high-bandwidth NIC cards, HBAs and RAID controllers, and add video capture, processing and audio production PCIe cards. An energy-saver mode also powers it on and off with your computer.


Boxx Apexx 4 features i9 X-Series procs, targets post apps

Boxx’s new Apexx 4 6201 workstation features the new 10-core Intel Core i9 X-Series processor. Intel’s most scalable desktop platform ever, X-Series processors offer significant performance increases over previous Intel technology.

“The Intel Core X-Series is the ultimate workstation platform,” reports Boxx VP of engineering Tim Lawrence. “The advantages of the new Intel Core i9, combined with Boxx innovation, will provide architects, engineers and motion media creators with an unprecedented level of performance.”

One of those key Intel X-Series advantages is Intel Turbo Boost 3.0. This technology identifies the two best cores to boost, making the new CPUs ideal for multitasking and virtual reality, as well as editing and rendering high-res 4K/VR video and effects with fast video transcode, image stabilization, 3D effects rendering and animation.

When comparing previous-generation Intel processors to X-Series processors (10-core vs. 10-core), the X-Series is up to 14% faster in multi-threaded performance and up to 15% faster in single-threaded performance.

The first in a series of Boxx workstations featuring the new Intel X-Series processors, Apexx 4 6201 also includes up to three professional-grade Nvidia or AMD Radeon Pro graphics cards, and up to 128GB of system memory. The highly configurable Apexx 4 series workstations provide support for single-threaded applications, as well as multi-threaded tasks in applications like 3ds Max, Maya and Adobe CC.

“Professionals choose Boxx because they want to spend more time creating and less time waiting on their compute-intensive workloads,” says Lawrence. “Boxx Apexx workstations featuring new Intel X-Series processors will enable them to create without compromise, to megatask, support a bank of 4K monitors and immerse themselves in VR — all faster than before.”

 


BCPC names Kylee Peña president, expands leadership

The Blue Collar Post Collective (BCPC) has revised and expanded its leadership. Kylee Peña has been upped to president, having served as Los Angeles vice president. Ryan Penny will now serve as New York VP and Chris Visser will take over as LA VP.

“In the time since I’ve been directly involved with the leadership of BCPC, we’ve seen continued exponential growth, both in our membership and our scope,” says Peña (our main image). “Inclusiveness and accessibility are incredibly important to people, and they want to be involved with our mission.”

Chris Visser

The shift in leadership was prompted by the upcoming departure of co-president Janis Vogel, who will resign after nearly three years at the helm of the organization that consists entirely of full-time working professionals who volunteer their time to run its operations. Vogel will remain in the organization as an active member and sit on the Board. She will be spending the remainder of 2017 in London, where she will co-host a BCPC meet-up, marking the first extension of official on-the-ground activity for the organization outside of the US.

Co-president Felix Cabrera, who has served BCPC for the last year focusing on an “Intro to Post” training course in collaboration with the New York City Mayor’s Office of Media and Entertainment and Brooklyn Workforce Industries, will be stepping down from his role as well.

Peña has been at the helm of BCPC West for the last year, recruiting a committee and building the BCPC community from the ground up in Los Angeles. She is also an associate editor for Creative COW, active with SMPTE, and an outspoken advocate for gender equality and mental health in post production. By day, she is a workflow supervisor for Bling Digital, working on feature films and television.

Ryan Penny

Penny is an editor and currently serves as director for the newly launched “Made in NY Post Production Training Program,” partnering with the NYC Mayor’s Office for Media and Entertainment to train and provide job placement in the post production industry for low-income and unemployed New Yorkers.

Visser is an assistant editor in Los Angeles, currently working on season two of Shooter for USA Network. Eager to expand his role on the original West planning committee, he took the lead on #TipJar, a weekly discussion he leads on BCPC’s Facebook page about important topics in the industry.

Peña says, “I’m excited about what’s on the horizon for BCPC. Funneling all this momentum into our mission to make all of post production more inclusive will have an explosive impact on the industry for years to come. People in our industry are ready to open their doors and help us change the face of what an expert looks like in post. They want to look outside their bubble, learn from people who don’t look like them, and mentor or hire emerging talent. We’re rising to meet that demand with action.”


Sound — Wonder Woman’s superpower

By Jennifer Walden

When director Patty Jenkins first met with supervising sound editor James Mather to discuss Warner Bros. Wonder Woman, they had a conversation about the physical effects of low-frequency sound energy on the human body, and how it could be used to manipulate an audience.

“The military spent a long time investigating sound cannons that could fire frequencies at groups of people and debilitate them,” explains Mather. “They found that the lower frequencies were far more effective than the very high frequencies. With the high frequencies, you can simply plug your ears and block the sound. The low-end frequencies, however, impact the fluid content of the human body. Frequencies around 5Hz-9Hz can’t be heard, but can have physiological, almost emotional effects on the human body. Patty was fascinated by all of that. So, we had a very good sound-nerd talk at our first meeting — before we even talked about the story of the film.”
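For context, a tone in that band completes only a handful of cycles per second, far below the roughly 20Hz floor of human hearing. A small stdlib-Python sketch of such a signal (the sample rate and duration are arbitrary choices for illustration):

```python
import math

SR = 48000   # sample rate in Hz, typical for film audio
DUR = 2.0    # seconds of signal
FREQ = 7.0   # within the 5Hz-9Hz band Mather describes

# Generate a 7Hz sine: inaudible as pitch at playback, but a system
# reproducing it is felt as pressure rather than heard as a tone.
tone = [math.sin(2 * math.pi * FREQ * n / SR) for n in range(int(SR * DUR))]

cycles = FREQ * DUR  # only 14 full cycles across the whole 2 seconds
print(f"{len(tone)} samples, {cycles:.0f} cycles")
```

Ninety-six thousand samples describe just 14 slow pressure swings, which is why such content taxes subwoofers (and mixers) rather than ears.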

Jenkins was fascinated by the idea of sound playing a physical role as well as a narrative one, and that direction informed all of Mather’s sound editorial choices for Wonder Woman. “I was amazed by Patty’s intent, from the very beginning, to veer away from very high-end sounds. She did not want to have those featured heavily in the film. She didn’t want too much top-end sonically,” says Mather, who handled sound editorial at his Soundbyte Studios in West London.

James Mather (far right) and crew take to the streets.

Soundbyte Studios offers creative supervision, sound design, Foley and dialog editing. The facility is equipped with Pro Tools 12 systems and Avid S6 and S3 consoles. Their client list includes top studios like Warner Bros., Disney, Fox, Paramount, DreamWorks, Aardman and Pathe. Mather’s team includes dialog supervisor Simon Chase, and sound effects editors Jed Loughran and Samir Fočo. When Mather begins a project, he likes to introduce his team to the director as soon as possible “so that they are recognized as contributors to the soundtrack,” he says. “It gives the team a better understanding of who they are working with and the kind of collaboration that is expected. I always find that if you can get everyone to work as a collaborative team and everyone has an emotional investment or personal investment in the project, then you get better work.”

Following Jenkins’s direction, Mather and his team designed a tranquil sound for the Amazonian paradise of Themyscira. They started with ambience tracks that the film’s sound recordist, Chris Munro, captured on location in Italy. Then Mather added Mediterranean ambiences that he and his team had personally collected over the years. He embellished the ambience with songbirds from Asia, Australasia and the Amazon. Since there are white peacocks roaming the island, he added in modified peacock sounds. Howler monkeys and domestic livestock, like sheep and goats, round out the track. Regarding the sheep and goats, Mather says, “We pitched them and manipulated them slightly so that they didn’t sound quite so ordinary, like a natural history film. It was very much a case of keeping the soundtrack relatively sparse. We did not use crickets or cicadas (although there were lots there while they were filming) because we wanted to stay away from the high-frequency sounds.”

Waterfalls are another prominent feature of Themyscira, according to Mather, but thankfully they weren’t actually on the island during filming, so the location recordings were relatively clean. The post sound team had complete control over the volume, distance and frequency range of the waterfall sounds. “We very much wanted the low-end roar and rumble of the waterfalls rather than high-end hiss and white noise.”

The sound of paradise is serene in contrast to London and the front lines of World War I. Mather wanted to exaggerate that difference by overplaying the sound of boats, cars and crowds as Steve [Chris Pine] and Diana [Gal Gadot] arrived in London. “This was London at its busiest and most industrial time. There were structures being built on a major scale, so the environment was incredibly active. There were buses still being drawn by horses, but there were also cars. So, you have this whole mishmash of old and new. We wanted to see Diana’s reaction to being somewhere that she has never experienced before, with sounds that she has never heard and things she has never seen. The world is a complete barrage of sensory information.”

They recorded every vehicle in the film that they could, from planes and boats to the motorcycle that Steve uses to chase after Diana later in the film. “This motorcycle was like nothing we had ever seen before,” explains Mather. “We knew that we would have to go and record it because we didn’t have anything in our sound libraries for it.”

The studio spent days preparing the century-old motorcycle for the recording session. “We got about four minutes of recording with it before it fell apart,” admits Mather. “The chain fell off, the sprockets broke and then it went up in smoke. It was an antique and probably shouldn’t have been used! The funny thing is that it sounded like a lawnmower. We could have just recorded a lawnmower and it would’ve sounded the same!”

(Mather notes that the motorcycle Steve rides on-screen was a modern version of the century-old one they got to record.)

Goosing Sounds
Mather and his sound team have had numerous opportunities to record authentic weapons, cars, tanks, planes and other specific war-era machines and gear for projects they’ve worked on. While they always start with those recordings as their sound design base, Mather says the audience’s expectation of a sound is typically different from the real thing. “The real sound is very often disappointing. We start with the real gun or real car that we recorded, but then we start to work on them, changing the texture to give them a little bit more punch or bite. We might find that we need to add some gun mechanisms to make a gun sound a bit snappier or a bit brighter and not so dull. It’s the same with the cars. You want the car to have character, but you also want it to be slightly faster or more detailed than it actually sounds. By the nature of filmmaking, you will always end up slightly embellishing the real sound.”

Take the gun battles in Wonder Woman, for instance. They follow an obvious sequence: the gun fires, the bullet travels toward its target and then there is a noticeable impact. “This film has a lot of slow-motion bullets firing, so we had to amp up the sense of what was propelling that very slow-motion bullet. Recording the sound of a moving bullet is very hard. All of that had to be designed for the film,” says Mather.
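Designing a slow-motion pass often starts with some form of varispeed: playing material back slower so its pitch drops with it. This is a generic sketch of that idea, not Mather’s actual process; the linear-interpolation resampler below is the crudest possible version:

```python
import math

SR = 48000  # sample rate in Hz

def pitch_down(samples, factor):
    """Naive varispeed: stretch the signal by `factor` with linear
    interpolation, so pitch drops the way slowed-down tape does."""
    n_out = int(len(samples) * factor)
    out = []
    for i in range(n_out):
        pos = i / factor                     # read position in the original
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)   # clamp at the last sample
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# A 440Hz test tone slowed 4x plays back around 110Hz.
tone = [math.sin(2 * math.pi * 440 * n / SR) for n in range(SR // 10)]
slow = pitch_down(tone, 4.0)
print(len(tone), "->", len(slow))
```

Production tools layer far more on top, including granular time-stretching that slows duration without dropping pitch, but the tape-style slowdown is the intuition behind most designed slow-motion elements.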

In addition to the real era-appropriate vehicles, Wonder Woman has imaginary, souped-up creations too, like a massive bomber. For the bomber’s sound, Mather sought out artist Joe Rush who builds custom Mad Max-style vehicles. They recorded all of Rush’s vehicles, which had a variety of different V8, V12 and V6 engines. “They all sound very different because the engines are on solid metal with no suspension,” explains Mather. “The sound was really big and beefy, loud and clunky and it gave you a sense of a giant war monster. They had this growl and weight and threat that worked well for the German machines, which were supposed to feel threatening. In London, you had these quaint buses being drawn by horses, and the counterpoint to that were these military machines that the Germans had, which had to be daunting and a bit terrifying.

“One of the limitations of the WWI-era soundscapes is the lack of some very useful atmospheric sounds. We used tannoy (loudspeaker) effects on the German bomb factory to hint at the background activity, but had to be very sparing as these were only just invented in that era. (Same thing with the machine guns — a far more mechanical version than the ‘retatatat’ of the familiar WWII versions).”

One of Mather’s favorite scenes to design starts on the frontlines as Diana makes her big reveal as Wonder Woman. She crosses No Man’s Land and deflects the enemies’ fire with her bulletproof bracelets and shield. “We played with that in so many different ways because the music was such an important part of Patty’s vision for the film. She very much wanted the music to carry the narrative. Sound effects were there to be literal in many ways. We were not trying to overemphasize the machismo of it. The story is about the people and not necessarily the action they were in. So that became a very musical-based moment, which was not the way I would have normally done it. I learned a lot from Patty about the different ways of telling the story.”

The Powers
Following that scene, Wonder Woman recaptures the Belgian village they were fighting for by running ahead and storming into the German barracks. Mather describes it as a Guy Ritchie-style fight, with Wonder Woman taking on 25 German soldiers. “This is the first time that we really get to see her use all of her powers: the lasso, her bracelets, her shield, and even her shin guards. As she dances her way around the room, it goes from realtime into slow motion and back into realtime. She is repelling bullets, smashing guns with her back, using her shield as a sliding mat and doing slow-motion kicks. It is a wonderfully choreographed scene and it is her first real action scene.”

The scene required a fluid combination of realistic sounds and subdued, slow-motion sounds. “It was like pushing and pulling the soundtrack as things slowed down and then sped back up. That was a lot of fun.”

The Lasso
Where would Wonder Woman be without her signature lasso of truth? In the film, she often uses the lasso as a physical weapon, but there was an important scene where the lasso was called upon for its truth-finding power. Early in the film, Steve’s plane crashes and he’s washed onto Themyscira’s shore. The Amazonians bind Steve with the lasso and interrogate him. Eventually the lasso of truth overpowers him and he divulges his secrets. “There is quite a lot of acting on Chris Pine’s part to signify that he’s uncomfortable and is struggling,” says Mather. “We initially went by his performance, which gave the impression that he was being burned. He says, ‘This is really hot,’ so we started with sizzling and hissing sounds as if the rope was burning him. Again, Patty felt strongly about not going into the high-frequency realm because it distracts from the dialogue, so we wanted to keep the sound in a lower, more menacing register.”

Mather and his team experimented with adding a multitude of different elements, including low whispering voices, to see if they added a sense of personality to the lasso. “We kept the sizzling, but we pitched it down to make it more watery and less high-end. Then we tried a dozen or so variations of themes. Eventually we stayed with this blood-flow sound, which is like an arterial blood flow. It has a slight rhythm to it and if you roll off the top end and keep it fairly muted then it’s quite an intriguing sound. It feels very visceral.”

The last elements Mather added to the lasso were recordings he captured of two stone slabs grinding against each other in a circular motion, like a mill. “It created this rotating, undulating sound that almost has a voice. So that created this identity, this personality. It was very challenging. We also struggled with this when we did the Harry Potter films, to make an inert object have a character without making it sound a bit goofy and a bit sci-fi. All of those last elements we put together, we kept that very low. We literally raised the volume as you see Steve’s discomfort and then let it peel away every time he revealed the truth. As he was fighting it, the sound would rise and build up. It became a very subtle, but very meaningful, vehicle to show that the rope was actually doing something. It wasn’t burning him but it was doing something that was making him uncomfortable.”

The Mix
Wonder Woman was mixed at De Lane Lea (Warner Bros. London) by re-recording mixers Chris Burdon and Gilbert Lake. Mather reveals that the mixing process was exhausting, but not because of the people involved. “Patty is a joy to work with,” he explains. “What I mean is that working with frequencies that are so low and so loud is exhausting. It wasn’t even the volume; it was being exposed to those low frequencies all day, every day for nine weeks or so. It was exhausting, and it really took its toll on everybody.”

In the mix, Jenkins chose to have Rupert Gregson-Williams’s score lead nearly all of the action sequences. “Patty’s sensitivity and vision for the soundtrack was very much about the music and the emotion of the characters,” says Mather. “She was very aware of the emotional narrative that the music would bring. She did not want to lean too heavily on the sound effects. She knew there would be scenes where there would be action and there would be opportunities to have sound design, but I found that we were not pushing those moments as hard as you would expect. The sound design highs weren’t so high that you felt bereft of momentum and pace when those sound design heavy scenes were finished. We ended up maintaining a far more interesting soundtrack that way.”

With superhero films like Batman v Superman: Dawn of Justice and Spider-Man, the audience expects a sound design-heavy track, but Jenkins’s music-led approach to Wonder Woman provides a refreshing spin on superhero film soundtracks. “The soundtrack is less supernatural and more down to earth,” says Mather. “I don’t think it could’ve been any other way. It’s not a predictable soundtrack and I really enjoyed that.”

Mather really enjoys collaborating with people who have different ideas and different approaches. “What was exciting about doing this film was that I was able to work with someone who had an incredibly strong idea about the soundtrack and yet was very happy to let us try different routes and options. Patty was very open to listening to different ideas, and willing to take the best from those ideas while still retaining a very strong vision of how the soundtrack was going to play for the audience. This is Patty’s DC story, her opportunity to open up the DC universe and give the audience a new look at a character. She was an extraordinary person to work with and for me that was the best part of the process. In the time of remakes, it’s nice to have a film that is fresh and takes a different approach.”


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter at @AudioJeney.

Arcade grows with creative editor Graham Chisholm

Edit house Arcade, with offices in New York and Santa Monica, has hired creative editor Graham Chisholm. He will be based in the LA studio, but is available to work on either coast.

Chisholm’s career began in Montreal, where he worked for three years before moving to Toronto. For over a decade, he worked with a variety of advertising agencies and brands, including Gatorade, Land Rover, Budweiser, Ford, Chevrolet and the Toronto Raptors, to name a few. He has earned several awards for his work, including multiple Cannes Lions and Best in Show at the AICE Awards. According to Arcade, Chisholm has become best known for his ability to tell compelling and persuasive stories, regardless of the brand or medium he’s working with.

“Graham’s influence and dedication on a project extend beyond the edit and into the finishing of the film,” notes Michael Lawrence, director of a Powerade spot that Chisholm edited. “In our case, he is involved in everything, a true collaborator on an intellectual level, as well as a gifted craftsman. Graham has earned my trust and heartfelt praise through our time working together and becoming friends along the way. He is a gifted storyteller and a great man.”

Chisholm is in the midst of working on a new project at Arcade for Adidas via ad agency 72andSunny. He recently completed his first Arcade project, a short film called LA2024, also via 72andSunny, promoting Los Angeles’ bid for the 2024 Olympic Games.