

The pipeline experts behind Shotgun’s ‘Two Guys and a Toolkit’ blog

Jeff Beeland and Josh Tomlinson know pipelines, and we are not exaggerating. Beeland was a pipeline TD, lead pipeline TD and pipeline supervisor at Rhythm and Hues Studios for over nine years. After that, he was pipeline supervisor at Blur Studio for over two years. Tomlinson followed a similar path, working in the pipeline department at R&H starting in 2003. In 2010 he moved over to the software group at the studio and helped develop its proprietary toolset. In 2014 he took a job as senior pipeline engineer in the Digital Production Arts MFA program at Clemson University where he worked with students to develop an open source production pipeline framework.

This fall the pair joined Shotgun Software’s Pipeline Toolkit team, working on creating even more efficient — wait for it — pipelines! In the spirit of diving in headfirst, they decided to take on the complex challenge of deploying a working pipeline in 10 weeks — and blogging the good, the bad and the ugly of the process along the way. This was the genesis of their Two Guys and a Toolkit series of blogs, which ended last week.

Josh Tomlinson and Jeff Beeland.

Before we dig in to find out more, this is what you should know about the Pipeline Toolkit: The Shotgun Pipeline Toolkit (sgtk) is a suite of tools and building blocks designed to help users set up, customize and evolve their pipelines. Sgtk integrates with apps such as Maya, Photoshop and Nuke and makes it easy to access Shotgun data inside those environments. Ok, let’s talk to the guys…

What made you want to start the Two Guys and a Toolkit series?
Josh: Since we were both relatively new to Shotgun, this was originally just a four-week exercise for us to get up and running with Toolkit; there wasn’t really any discussion of a blog series. The goal of this exercise was to learn the ins and outs of Toolkit, identify what worked well, and point out things we thought could be improved.

After we got started, the word spread internally about what we were up to and the idea for the blog posts came up. It seemed like a really good way for us to meet and interact directly with the Shotgun community and try to get a discussion going about Toolkit and pipeline in general.

Did you guys feel exposed throughout this process? What if you couldn’t get it done in 10 weeks?
Jeff: The scope of the original exercise was fairly small in terms of the requirements for the pipeline. Coupled with the fact that Toolkit comes with a great set of tools out of the box, the 10-week window was plenty of time to get things up and running.

We had most of the functional bits working within a couple of weeks, and we were able to dive deep into that experience over the first five weeks of the blog series. Since then we’ve been able to riff a little bit in the posts and talk about some more sophisticated pipeline topics that we’re passionate about and that we thought might be interesting to the readers.


What would you consider the most important things you did to ensure success?
Josh: One of the most important ideas behind the blog series was that we couldn’t just talk about what worked well for us. The team really stressed the importance of being honest with the readers and letting them in on the good, the bad and the ugly bits of Toolkit. We’ve tried our best to be honest about our experience.

Jeff: Another important component of the series was the goal of starting up a dialogue with the readers. If we just talked about what we did each week, the readers would get bored quickly. In each post we made it a point to ask the readers how they’ve solved a particular problem or what they think of our ideas. After all, we’re new to Toolkit, so the readers are probably much more experienced than us. Getting their feedback and input has been critical to the success of the blog posts.

Josh: Now that the series is over, we’ll be putting together a tutorial that walks through the process of setting up a simple Toolkit pipeline from scratch. Hopefully users new to Toolkit will be able to take that and customize it to fit their needs. If we can use what we’ve learned over the 10 weeks and put together a tutorial that is helpful and gives people a good foundation with Toolkit, then the blog series will have been successful.

Do you feel like you actually produced a pipeline path that will be practical and realistic for applying in real-world production studios?
Jeff: The workflow designs that we modeled our simple pipeline on are definitely applicable to a studio pipeline. While our implementations are often at a proof-of-concept level, the ideas behind how the system is designed are sound. Our hope has always been to present how certain workflows or features could be implemented using Toolkit, even if the code we’ve produced as part of that exercise might be too simplistic for a full-scale studio pipeline.

During the second half of the blog series we started covering some larger system designs that are outside of the scope of our simple pipeline. Those posts present some very interesting ideas that studios of any size — including the largest VFX and animation studios — could introduce into their pipelines. The purpose of the later posts was to spark discussion and spread some possible solutions to very common challenges found in the industry. Because of that, we focused heavily on real-world scenarios that pipeline teams everywhere will have experienced.

What is the biggest mistake you made, what did you do to solve it and how much time did it set you back?
Josh: To be honest, we’ve probably made mistakes that we’ve not even caught yet. The fact that this started as an exercise to help us learn Toolkit means we didn’t know what we were doing when we dove in.

In addition, neither of us has a wealth of modern Maya experience, as R&H used mostly proprietary software and Blur’s pipeline revolved primarily around 3ds Max. As a result, we made a complete mess out of Maya’s namespaces on our first pass through getting the pipeline up and running. It took hours of time and frustration to unravel that mess and get a clean, manageable namespacing structure into place. In fact, we nearly eliminated Maya namespaces from the pipeline simply so we could move on to other things. In that regard, there would still be work to do if we wanted to make proper use of them in our workflow.
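For readers who haven’t fought this battle: Maya nests each referenced asset under a namespace, and without a convention you quickly end up with auto-generated collisions. One common way to keep namespaces manageable is to assign every reference a predictable, zero-padded instance namespace. The sketch below is plain Python with an illustrative naming format — it is not code from the Two Guys and a Toolkit pipeline. Inside Maya, the set of existing namespaces would typically come from `cmds.namespaceInfo(listOnlyNamespaces=True)` rather than a plain set.

```python
def build_namespace(asset_name, existing_namespaces):
    """Return the next free namespace for an asset, e.g. 'hero_001'.

    The zero-padded pattern keeps references sortable and avoids Maya's
    auto-numbered collisions like 'hero1' vs. 'hero11'. (Hypothetical
    convention, shown for illustration only.)
    """
    instance = 1
    while True:
        candidate = "%s_%03d" % (asset_name, instance)
        if candidate not in existing_namespaces:
            return candidate
        instance += 1
```

The point is simply that the pipeline, not the artist or the DCC, should own the naming decision, so that publishes and loaders can round-trip assets without guessing.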

You spent 10 weeks building a pipeline essentially in a vacuum… how much time realistically would this take in an operational facility where you would need to integrate pipeline into existing tech infrastructure?
Jeff: That all depends on the scope of the pipeline being developed. It’s conceivable that a small team could get a Toolkit-driven pipeline up and running in weeks if it relies mostly on the out-of-the-box functionality provided.

This would require making use of well-supported DCC applications, like Maya and Nuke, as custom integrations with others would require some development time. This sort of timeframe would also limit the pipeline to supporting a single physical studio location, as multi-location or cloud-based workflows would require substantial development resources and time.

It’s worth noting that R&H’s pipeline was initially implemented in a very short period of time by a small team of TDs and engineers, and was then continually evolved by a larger group of developers over the course of 10-plus years. Blur’s pipeline evolved similarly. This goes to show that developing a pipeline involves hitting a constantly moving target, and shouldn’t be viewed as a one-time development project. The job of maintaining and evolving the pipeline will vary in scope and complexity depending on a number of factors, but is something that studios should keep in mind. The requirements laid out by production and artists often change with time, so continued development is not uncommon.

Any lessons learned, parting words of wisdom for others out there taking on pipeline build-out?
Jeff: This really goes for software engineering in general — iterate quickly and set yourself up to fail as fast as possible. Not all of your ideas are going to pan out, and even when they do, your implementation of the good ones will often let you down. You need to know whether the direction you’re going in will work as early as possible so that you can start over quickly if things go wrong.

Josh: A second piece of advice is to listen to the users. Too often, developers think they know how artists should work and fail to vet their ideas with the people that are actually going to use the tools they’re writing. In our experience, many of the artists know more about the software they use than we do. Use that to your advantage and get them involved as early in the process as possible. That way you can get a better idea of whether the direction you’re going in aligns with the expectations of the people that are going to have to live with your decisions.

Quick Chat: ‘Mermaids on Mars’ director Jon V. Peters

Athena Studios, a Bay Area production and animation company, has completed work on a short called Mermaids on Mars, which is based on a children’s book and original music by the film’s producer Nancy Guettier. It was directed by Jon V. Peters and features the work of artists whose credits include the stop-motion offerings Coraline, James and the Giant Peach and The Nightmare Before Christmas, as well as many other feature length films.

The film is about a young boy who is magically transported to Mars, where he tries to stop an evil Martian from destroying the last of the planet’s mermaids. The entire story was told with stop-motion animation, which was shot on Athena Studios’ (@AthenaStudios) soundstage.

The 24-minute film comprised 300 shots. Many involved complex compositing, putting heavy demands on Athena’s small team of visual effects artists, who were working within a post schedule of just over three months.

Mermaids on Mars

Kat Alioshin (Coraline, The Nightmare Before Christmas, Monkeybone, Corpse Bride) was co-producer of the film, running stages and animation production. Vince De Quattro (Hellboy, Pirates of the Caribbean, Star Wars, Mighty Joe Young) was the film’s digital post production supervisor.

Let’s find out more from Peters, who, in addition to directing and producing Mermaids on Mars, is also the founder of Athena Studios.

Why did you decide to create Mermaids on Mars as an animated short?
The decision was budget-driven, primarily. We were originally approached by Nancy Guettier, who is the author of the book the film is based on, and one of the film’s producers. She had originally presented us with a feature length script with 12 songs. Given budgetary restrictions, however, we worked with Nancy and her screenwriter, Jarrett Galante, to cut the film down to a 24-minute short that retained five of her original songs.

What are some of the challenges you faced turning a book into an animated short?
The original book is a charming short story that centers more on mermaids conserving water. The first feature-length script had added many other elements, which brought in Martian armies and a much more detailed storyline. The biggest problem we had was trying to simplify the story as much as possible without losing the heart of the material. Because of our budget, we were also limited in the number of puppets and the design of our sets.


Are there wrong ways to go about this?
There are hundreds, perhaps thousands, of ways to approach production on a film like this. The only “wrong” way would have been to ignore the budget. As many other films have shown, limitations (financial or otherwise) can breed creativity. If you walk the budget backward it can help you define your approach. The film’s co-producer, Kat Alioshin, had worked on numerous stop-motion features previously, so she had a good handle on what the cost for each element would be.

Describe your thought process for setting the stage for Mermaids on Mars.
Originally, we looked at doing the entire production as more of a 2D stop-motion down shooter design, but the producer really wanted 3D characters. We did not have the budget for full sets, however. As we looked at combining a 2D set design with 3D practical stop-motion puppets, it took us all the way back to Georges Méliès, the father of visual effects. He was a stage magician and his films made use of flats in combination with his actors. We drew inspiration from his work in the design of our production.


While we wanted to shoot as much in-camera as possible we knew that because of the budget we would need to rely almost as much on post production as the production itself. We shot many of the elements individually and then combined them in post. That part of the production was headed up by veteran visual effects artist Vince De Quattro.

What cameras did you use? 
Animation was shot on Canon DSLR cameras, usually 60D, using DragonFrame. The puppeted live-action wave rank shots were done on a Blackmagic Studio Camera in RAW and then graded in DaVinci Resolve to fit with the Canon shots. Live action shots (for the bookends of the film) were shot on Red Epic cameras.

What was used for compositing and animation?
All compositing was done in Adobe After Effects. There was no 3D animation in the film since it was all practical stop-motion, but the 3D models for the puppet bodies (used for 3D printing and casting) were done in Autodesk Maya.

Was the 2D all hand drawn?
Yes, all 2D was hand drawn and hand painted. We wanted to keep a handmade feel to as many aspects of the film as possible.

How much time did you devote to the set-up and which pieces took the longest to perfect?
It was a fairly quick production for a stop-motion piece. Given the number of stages, shop needs, size of the project and other shoots we had scheduled, we knew we could not shoot it in our main building, so we needed to find another space. We spent a lot of our time looking for the right building, one that met the criteria for the production. Once we found it we had stages set up and running within a week of signing the lease agreement.

Our production designer Tom Proost (Galaxy Quest, Star Wars — The Phantom Menace, Lemony Snicket’s, Coraline) focused on set and prop building of the hero elements, always taking a very “stage-like” approach to each. We had a limited crew so his team worked on those pieces that were used in the most shots first. The biggest pieces were the waves of the ocean, used on both Earth and Mars, a dock set, the young boy’s bedroom, the mermaid palace, the Martian fortress and a machine called the “siphonator.”


Initial builds and animation took approximately six months, and post production took an equal amount of time.

What was your favorite set to work with, and why?
There were many great sets, but I think the wave set that Tom Proost and his team built was my favorite. It was very much a practical set that had been designed as a raked stage with slots for each of the wave ranks. It was manually puppeted by the crew as they pulled the waves back and forth to create the proper movement. That was filmed and then the post production team composited in the mermaid characters, since they could not be animated within the many wave ranks.

You did the post at Athena?
Twenty-four minutes of film with an average of five composited iterations per shot equates to approximately 300,000 frames processed to final, all completed by Athena’s small team under tight festival deadlines.


Quick Chat: Rampant’s Sean Mullen on new mograph release

Rampant Design Tools, which has been prolific about introducing new offerings and updates to its motion graphics products, is at it again. This time, two new Style Effects volumes for motion graphics artists offer 1,500 new 2K, 4K and 5K effects.

Rampant Motion Graphics for Editors v1 and v2 are QuickTime elements that users can drag and drop into the software of their choice; Rampant effects are not plug-ins and, therefore, not platform dependent.

The newly launched Rampant Textured Overlays library features 230 effects for editors, also in ultra-high resolution 2K, 4K and 5K elements.

This volume provides a large number of overlay effects for editors or anyone else looking to add a unique and modern look to video projects. Rampant Textured Overlays are suited for editing, motion graphics, photography and graphic design.

We reached out to Rampant’s Sean Mullen, who runs the company with his wife Stefanie. They create all of the effects themselves. Ok, let’s find out more.

What’s important about this new release?
The Motion Graphics for Editors series is a completely new direction for us. We’ve designed thousands of animated elements so that busy editors, or anyone who doesn’t have time to design their own, can easily create great-looking motion graphics without having to start from scratch. These designs are the same ones you see in current television and commercial trends.

Volume 1 is more of a base, something that you can use in just about any kind of situation. Volume 2 is more edgy and is similar to the kinds of designs that I’ve previously created for the X Games, MTV, Fuel and the National Guard. Motion Graphics for Editors is the beginning of a new trend at Rampant. You can expect to see a variety of different projects coming out of our shop in the near future, all vastly different from what people are used to seeing from Rampant. I’m super stoked about the next six to eight months.

Were the new offerings based on user feedback?
In part, yes. I have hundreds of project ideas on my whiteboards that I’d like to build out. We’re only limited by time and resources. We don’t have a studio full of artists cranking out our designs. Rampant is just Stefanie and myself. I’m always roughing out ideas and letting them percolate. The great thing about being a small company is that we get to travel and talk to editors and artists directly. We often visit with amazing groups like the Blue Collar Post Collective in NYC and talk with assistant editors, editors and colorists. This allows us to hear first hand about what people want and need in their respective workflows.

What do you expect to be the most used of the bunch?
Motion Graphics for Editors v1 was designed as a base. It’s got more of a universal appeal. It’s perfect for everything from corporate work to infographics and commercials. Volume 2 is a lot more edgy and has a specific feel.


What’s your process when developing a new release?
There are dozens of projects in various stages of development at any given time. When an idea pops into my head, I’ll start camera, compositing and animation tests right away. Everything starts at 5K resolution or higher. Typically, I’ll let a project sit for a while after the initial R&D. This allows the idea to mature and gives us time to attack the project from multiple angles. Once we decide that something is worth pursuing, I’ll shoot or animate every possible thing I can think of. This can take days or weeks, depending on the amount of post work and transcoding that is involved. From there we’ll have a vat of hundreds, or in some cases thousands, of elements.

We toss out the ones that don’t work or aren’t deemed as useful. Then we organize the elements and give them a proper naming structure. After the elements are named, we output 4K and 2K versions of our 5K master elements and begin the long process of zipping and uploading them to our servers for download delivery.  The final elements, camera masters and project files are then archived. Lastly, we cut a promo video showing our new products in use, build a new product page on our site and develop a newsletter to let our customers know about the latest release. Once that cycle is complete, it’s back to the whiteboard.
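The derivative step described above — generating 4K and 2K versions of each 5K master under a consistent naming structure — can be sketched as a simple filename-planning function. To be clear, the `_5K_` naming pattern below is hypothetical and not Rampant’s actual scheme; in a real batch, each planned name would then drive a transcode job.

```python
def plan_derivatives(master_filename):
    """Map a 5K master's filename to its 4K and 2K derivative names.

    Assumes an illustrative pattern like 'Element_5K_001.mov'; the actual
    Rampant naming convention is not documented here.
    """
    return [master_filename.replace("_5K_", "_%s_" % res) for res in ("4K", "2K")]
```

Planning names up front, before any rendering, is what makes the later zipping, uploading and archiving steps mechanical rather than error-prone.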

Anything you want to add that’s important?
It’s our mission to save editors time and money. If something normally takes hours or days to complete and our effects can help reduce that time, we’ve achieved our goal. There are many editors out there who use our effects in a pre-visual manner. They use our effects to quickly design something in order to get a green light from their producer or client and this saves a ton of time and money.

Others look at our effects as a starting point. They start with our elements and combine them to make something new. We receive emails every day from editors who just don’t have time to make anything from scratch. Their budgets are too tight, turnaround time is insane or they simply aren’t mograph designers but still want good-looking motion graphics. These are our people; they are why we work every single day. We read each and every email and take every phone call, even at 3am.


A chat with post production veteran Leon Silverman

This industry mainstay will be receiving the HPA Lifetime Achievement Award this month.

By Randi Altman

Leon Silverman is an industry icon. He’s GM of the Digital Studio at Walt Disney Studios and president of the Hollywood Post Alliance (HPA), the organization that is presenting him with its Lifetime Achievement Award on November 12.

I’ve known Leon for a very long time, meeting him for the first time when he was a senior executive of LaserPacific. He was then, and still remains, a genuinely good guy who has done a lot to help create a feeling of community within an industry filled with competitive businesses. He has also been a big supporter of me personally, for which I will always be grateful.

Anyway, not long before the late October SMPTE Conference in Hollywood, Leon was kind enough to talk to me about the Lifetime Achievement Award, the SMPTE and HPA growing partnership and the industry in general.

Leon will be accepting the HPA Lifetime Achievement Award on November 12.

When I asked him how it felt to be chosen for this honor, he was humble, saying, “It’s flattering, embarrassing, and it means a lot to me. It reminds me of those who have helped, mentored me and taught me in my life and career — those who helped me along the path. What it’s really about is how the industry has made transitions, and how the recognition of me is really the recognition of a lot of people who contributed to transformative change in our industry.”

Leon’s passion for the media and entertainment business started early — he says ever since he could remember. In school he was the projectionist and worked at the high school radio station. He was also known to hop the train into downtown Chicago and stare into the window at the WLS radio station and watch them broadcast. He thought it was “the most interesting thing in the world.” So imagine how he felt after taking a tour of CBS Television with his friend’s dad. He was hooked.

Next was a degree in telecommunications and a ticket to Hollywood. “I moved from Chicago to LA with $1,000 in my pocket, thinking I was rich, to work at Compact Video, although they didn’t know it.”

Yup, he moved across country for the possibility of getting a job at Compact Video. Leon had seen the company’s brochure, with pictures of “cool” production vehicles and a helicopter and a Lear jet, all adorned with the Compact logo. “I really wanted to work there, so I wrote a letter to founder Robert Seidenglanz. That led to an interview with Newt Bellis, but there was not a job available at that time, so I ended up getting a number of temporary and weird jobs while I was waiting for Compact Video.”

After four months of knocking on Compact’s door, he was allowed in… working in their shipping and receiving department, and doing whatever was asked of him. It was there he stayed until, one day, an ad for a sales trainee at Compact Video appeared in the trades.

“That was my job,” he says, explaining that he knocked on the office door of the VP of marketing & sales until he gave him a chance. Within four and a half years, Leon was head of marketing and sales at Compact. His dream was coming true. It was during that time he met mentors who played a big role in his life, such as Emory Cohen and Steve Schifrin.

Listen up kids: determination and a dream really can play a role in your dream job.

The award for Best in Show at the recent SMPTE HPA Film Festival. Leon Silverman congratulates the winners along with Bob Seidel, president of SMPTE.

Ok, now let’s turn to the Hollywood Post Alliance (HPA), which Leon helped found and is now a part of SMPTE (Society of Motion Picture and Television Engineers).

It seems you have always been interested in encouraging a spirit of community within the industry. Can you talk about the genesis of the HPA, and how you’ve seen it grow?
HPA wasn’t a beginning, it was a continuation of a long effort of many people in this community who were in trade associations that focused on post — this goes back to the late ‘70s/early ‘80s. The HPA has its roots in the Southern California chapter of the ITS (International Teleproduction Society), which had a very active board of directors.

One of the great accomplishments was the ITS technology retreat, which HPA inherited (the yearly HPA Tech Retreat takes place in Palm Springs, California). They also got involved in California politics, which resulted in a change to the tax laws that eliminated the state sales tax on post equipment, a change still in effect today, as well as a reassessment of how property tax was assessed on post equipment.

When the headquarters of ITS disbanded in the early 2000s, the Southern California chapter felt there was more to do. That was the birth of HPA. There are differences — the ITS was only for those working in the video facility business and the HPA has a broader tent. As a tool manufacturer, you could sponsor the ITS, but you couldn’t sit at the table. The HPA was about a larger community: not just competitors in the facility business, but collaborators across the entire content creation ecosystem.

Leon Silverman and Barbara Lange.

What about your relationship with SMPTE?
As SMPTE and HPA get together, HPA becomes the forum for the discussion of issues that impact the industry, and SMPTE becomes the forum for taking some of those ideas and making them into standards… more importantly broadening our perspective from the local Hollywood base to one that is internationally focused as well.

Barbara Lange (executive director of SMPTE) likes to talk about SMPTE as the 100-year-old startup. So as we start this really new chapter, in which SMPTE celebrates 100 years and HPA finds its way within SMPTE, we have a huge opportunity to create a broader view of the content creation ecosystem and how to impact both the discussions and the issues that are really challenging in this digital age — and the real task of how we begin the integration of digital technology in more standardized approaches and, in fact, within industry standards.

There is currently much talk of HDR within the industry. What does the future hold?
The industry has always gravitated towards ways to present our stories in a higher quality and in more compelling ways. Developments like immersive audio and higher brightness projectors that can display contrast greater than film — or colors that we haven’t seen before — allow for a creative palette that filmmakers are excited about using. Those who create and distribute content are excited about bringing a superior consumer experience to people who enjoy our content.

HDR and immersive audio are new content creation tools that we will really learn the impact of as the creative community puts their own hands and creative minds around how this technology could be deployed in service of a story. Personally, I’m very excited about the things that I’ve seen in films like Tomorrowland and Inside Out. Seeing Inside Out in Dolby Cinema’s EDR was a revelation. I think the industry will work together, as it always has, to understand the impact of how we increasingly create higher-quality products and how best to ensure that this quality flows into our archives.

The HPA Awards, happening in early November in LA, turn 10 years old this year. Congratulations.
The HPA, and the generation that I represent, helped to create this transition from a world of filmed cinema to this digital world, hopefully creating a future that’s worthy of its past. That’s what we’re really celebrating. Over the last 20 or 30 years, we have moved our industry to a place where it’s not really just about the tools or new distribution entities, it’s about a community that now routinely understands how to work together to take advantage of transformative change and how to take this industry into the future.

Part of what our generation did, part of what we did at Pacific Video and LaserPacific, and the idea behind Emory Cohen’s vision of the Electronic Laboratory, was to show how to take the film model and bring that into the electronic and digital age. We’re now done using the past as a future model, because I don’t think there’s a blueprint for the industry today, and I think that’s very exciting, but over the process of the last 30 years we’ve created an industry that understands how to talk to each other and work together. It’s competitors and colleagues — from all aspects of the industry — working together to create this industry’s future.

—-
Leon Silverman will accept his HPA Lifetime Achievement Award on November 12 during the HPA Awards at the Skirball Cultural Center.


Quick Chat: Northern Lights editor Chris Carson on Globetops campaign

Northern Lights editor Chris Carson has teamed up with the global non-profit Globetops — which collects used laptops and donates them to people in need of computers worldwide — in order to tell the stories of entrepreneurs in need of access to technology in Guinea.

The almost four-minute Laptop Stories: Guinea takes viewers to Guinea, where the non-profit began, to deliver laptops to the leader of an agricultural group in need of access to paperwork to receive government subsidies, a teacher who needs to log her students’ grades, an artisan who creates technological courses and a single mother who dreams of starting her own Internet café. Watch the short here.

Chris Carson

Let’s dig in a bit with Carson regarding the edit…

How long was the shoot, and what was it shot on?
It was shot over a week while Globetops founder and the film’s director, Becky Morrison, was traveling in Guinea distributing laptops. She used a Nikon D800.

How did you work with Morrison? What direction were you given and did you have some say in the edit?
She gave me a lot of control over the direction of the edit. She helped me at the beginning, finding (and translating) the best interview pieces. There were so many recipients we wanted to profile, but we eventually narrowed it down to five.

What did you edit on?
I usually work in Avid Media Composer, but for this I used Adobe Premiere.

Can you talk about the challenge of editing this project, which features interviews in a language different than your own?
Language was a big challenge. The hardest part was finding a way to quickly tell everyone’s backstory so we could get on to them receiving their laptops. Another thing Becky wanted to convey was a sense of the recipients’ strength, community spirit and entrepreneurial savvy, and not just frame them as needy or desperate people.

I suppose happiness and emotion bridge all language, because the looks on their faces when they got the laptops were amazing.
I was tempted to just edit a string of 15 people receiving laptops, because their joy is so palpable. But we wanted to give a sense of why they needed computers, what kind of work they were engaged in, and how much they could improve their own lives and even their communities.

Did you only do editing, or were you asked to do anything else?
I only did the editing (and the graphics), but Ted Gannon from SuperExploder did the audio mix. The DP was Jordan Engle.


Quick Chat: Cut+Run’s Georgia Dodson on ‘Call of Duty’ film

Georgia Dodson has traveled a long way, literally and figuratively, to where she is today — a full-time editor at Cut+Run in New York City. This Bland, Virginia-native left home at 17 and hasn’t looked back. Now she spends her days in an edit suite helping tell stories, and one of those most recent stories is the short documentary film Call of Duty from director Matt Lenski.

The two have worked together before. Back in 2012 Dodson edited Lenski’s Meaning of Robots, which debuted at the Sundance Film Festival, won Best Short Doc at the Nashville Film Festival and screened at SXSW and MoMA’s New Directors/New Films. This year she reunited with the director once more, this time on his new short film Call of Duty, which also made the festival rounds. In Call of Duty, Manhattan jury duty clerk Walter Schretzman wants you to remember that you are the only thing standing between civilization and anarchy.

What was the original concept presented by director Matt Lenski?
Matt had filmed and interviewed three jury clerks working in Manhattan. They were each engaging, but Walter brought something a little more existential to the table. While the others tried to sell us on the merits of doing jury duty, Walter was self-aware. He spoke about what it was like to be in the same room, every day, with people who are constantly trying to get out of that room… and he likes it.  So I think the idea of Walter’s identity in relationship to the perceived monotony of his job was what Matt was going for with Call of Duty.

How did that evolve in the edit?
It took a long time. The ending and beginning came together quickly, but once we got into how to convey these feelings of waiting, boredom and peppering in Walter’s zingers at the right places… it was really tough. For the most part, we had all the best pieces picked out early on but had to figure out the right arc. Somehow, things fell into place magically. For me, the piece that finally pulled things together was Walter talking about being at the same job for 20 years, doing the same thing every day, while he’s counting hundreds of juror slips. He says, “It is what it is.”

You’ve collaborated with Matt before — give us a little background on your work together.
I met Matt when I was an assistant, and by chance helped him with a director’s cut when my editor was out of town. We became friends and have worked on projects together since. The first big one was Meaning of Robots, which evolved from a chance encounter Matt had with Mike Sullivan, a hoarder who makes Metropolis-inspired robot pornography. Our little portrait of him ended up in Sundance, which was a pleasant surprise for us. That project definitely has parallels to Call of Duty, in both subject and style.


What were some interesting moments with Walter that ended up on the cutting room floor (or the digital trash bin)?
He talked about his love of avant-garde jazz that’s difficult to listen to but will “wake you up.” I tried for the longest time to work that moment into our edit, paired with an appropriate jazz track over sleeping jurors… but it didn’t work in context of the whole piece. Too bad. We could make a feature length film of Walter saying amazing things.

What piece of this exploration surprised you the most?
It’s really funny, but I also think it’s darker than I expected it to turn out. Early on, I cut together the part where the prospective jurors watch the jury duty film. (I saw the whole thing when I did jury duty. It’s ridiculous.) I quickly connected the man drowning with the ticking clock, Walter checking his watch and then the infinity loop of the screensaver behind him. It makes me laugh, but it also kind of helped set a dark tone for the whole thing. Also, sound. Sound is always important, but weirdly, it’s especially important in a film about nothing happening, where, theoretically, little sound is being made.

What are you hoping people take from the film?
I like Walter’s sentiment, toward the end of the film, that “people are more than what they do.” Walter is definitely more than what he does.

Have you been to any of the festival screenings?
I was able to go to Rooftop Films, and I met Walter there, finally. He retired a couple of weeks later, so the timing of the film is pretty perfect. It was amazing to hear people laughing so much throughout the entire piece, because after working on something for so long, it’s hard to see it.

What is it about editing longform/short films, as opposed to commercials, that resonates with you?
I come from a writing background. I was an English major in college. I love documentary editing, because I become the writer. My favorite thing is getting an interview and cutting it up to create some emotion or humor.

What are some other recent projects you’ve edited?
This is my latest short film. I’ve been doing a lot of commercials. I just finished a documentary style commercial for Hershey, directed by Jonty Toosey, that will be out soon.


Automatic Duck’s Wes Plate talks about building bridges

By Randi Altman

If you’ve worked in post production during the past 14 years, there is a very good chance you know Automatic Duck and its president, Wes Plate. Over their time in business, Wes and his father, Harry, have created a number of software tools designed to make different programs and formats work together… the ultimate facilitators.

In 2011, Automatic Duck licensed its technology to Adobe, and Wes joined the company as head of its Prelude team. While Adobe had acquired the technological assets of Automatic Duck, it did not acquire Automatic Duck, the company.

Fast forward a few years and the Plates and Automatic Duck are back with new products. As you might expect, Automatic Duck Ximport AE and Automatic Duck Media Copy 4.0 are designed to make post pros’ lives easier. Ximport AE transfers entire timelines, including cuts, third-party effects and transitions from Final Cut Pro X to Adobe After Effects. Media Copy 4.0 uses AAF and XML to simplify copying and moving media files from any Final Cut Pro 7, Final Cut Pro X or Avid Media Composer/Symphony project. Both products are being sold via Red Giant.

XimportAE — After Effects

On the heels of this news, we reached out to Wes Plate, who, after working for Adobe for two years, is back at the family business.

When you joined Adobe, they bought your technology. How did that work with this AE plug-in?
We do have the ability to use some of what Adobe acquired from us, but we are also limited in some ways. We told them we had an idea for a product — translating Final Cut Pro X into After Effects — and they said, “Okay.” We used some of the After Effects code from the past, but we also had to add a whole bunch of new code for Final Cut Pro X. We are still really good working partners with Adobe; without their help and permission, we could not have made this product short of rewriting everything from scratch.

Why now, and why this?
After leaving Adobe at the start of 2014, I was trying to figure out what was going to be next. At that same time, I had been hearing a lot of people on social media talking about how Final Cut Pro X had improved and become a great NLE, so I gave it a try. I really enjoy using FCP X as an editing tool, but while I am editing I want to take clips or a section of timeline and bring it into After Effects… it’s how I work.

Harry and I were looking for a project, Final Cut Pro X is growing in the marketplace, and I need to get from Final Cut Pro X to After Effects if I am going to use it as an NLE. All of that together meant Automatic Duck should build a bridge from Final Cut Pro X to After Effects.

XimportAE — Final Cut Pro X

Before your plug-in, how were people getting from FCP X to After Effects?
When I started down this path, there was a free utility on the market that would translate Final Cut Pro X XML into a JavaScript file; After Effects would then run that JavaScript to create a comp. I tried it but I couldn’t make it work, which gave us even more reason to jump into this. That particular tool is now in version two and available for purchase through the App Store, but we still feel like what we are creating makes much more sense. Another option is to use Intelligent Assistance’s XtoCC app to convert FCPX XML to FCP7 XML and then import that into After Effects. But that workflow is also not as complete as what Ximport AE can do.

Makes more sense how?
Our solution makes it super easy to get from Final Cut Pro X to After Effects. The first step is to export the XML from Final Cut Pro X; then you switch to After Effects, where our new product, Automatic Duck Ximport AE, handles the import. You can adjust some settings and options, but essentially all you have to do is open the XML file and our plug-in brings it directly into After Effects.

Red Giant is selling your new products. Can you explain the relationship?
There is an enormous amount of work that goes into selling a product. What we enjoy the most is making the product, interacting with users and making sure their problems are solved, but dealing with credit cards and that type of thing is less interesting to us and takes our attention away from what we want to do.

Our friend Stu Maschwitz, who designed Magic Bullet, and Peder Norrby from Trapcode have both been very happy with their relationship with Red Giant, which is essentially publishing their products and doing the sales, marketing and support. Another benefit of using Red Giant’s infrastructure for products and distribution is that we are now able to offer trial versions of Ximport AE and also Media Copy, our media copying utility.

Media Copy

As we look toward future product development, we’ll be evaluating FCPX as an editing platform to invest in and spend considerable time developing solutions for. The great thing about our partnership with Red Giant is that it gives us access to their expertise and resources. I can foresee opportunities to partner up on products that, all by ourselves, we might not be able to execute or have what we need to make some workflows and solutions possible. I’m excited about what we’ll be able to do both with Red Giant and opportunities that we see coming forward from the FCPX landscape.

Can you talk about Automatic Duck Classic and how that came to be?
After joining Adobe, Automatic Duck retained some products that we were allowed to sell, but we just didn’t have the time to properly support them. So instead we made the products available for free on our website. When we started to prepare for the relaunch and updated our website, the download links for the free stuff went away. We didn’t realize people still needed those tools, and I kept seeing posts on social media asking where the links went. We realized that there was still a need for people to get projects between FCP7 and Avid. The old products that we used to give away for free will be coming back on the website at no cost.

—-
For more details on the products click here.


Quick Chat: CO3’s Stephen Nakamura on grading ‘The Martian’

Ridley Scott’s The Martian tells the story of an astronaut left behind on Mars. The director, who created that world, called on Company 3’s Stephen Nakamura for the color grade, which he completed in London to be closer to Scott and the production.

We checked in with Nakamura to find out more about his process on The Martian.

You and Ridley have collaborated in the past. We assume you have developed a shorthand of sorts?
There are definitely things I know he likes and doesn’t like, but each project is also a little bit different. Obviously, he is very interested in the visuals of every shot. The Martian was relatively straightforward. Something like Exodus: Gods and Kings was much more complex because of the kinds of things we were looking at, like the sea parting. On Prometheus, it was about helping to bring shape and definition to scenes that were really dark. Of course, he’s worked with [Dariusz Wolski, ASC], so a lot of the shaping has already happened between the two of them.

How early does he bring you on a film?
We speak very early on. I know before I see any images what kind of look he’s interested in.

Can you talk about the look of Mars? He referenced the terra cotta/orange look in our recent interview with him.
It was something that we all had a sense of conceptually, but it took a lot of work with Ridley, the visual effects supervisor Richard Stammers and me in the DI theater to get it to all look the way it does in the final film. Quite a few shots involved a lot of sky replacements and the addition of mountains in the background. Richard’s team created these additional elements with a combination of CGI and practical plates shot in Jordan, and combined them with the first-unit photography of Matt Damon.

So then when I added the heavy color correction Ridley wanted for that kind of orange look he talks about, it would have an effect on every element in the shot. It’s impossible to know in advance exactly how that correction for the planet’s surface is going to look in context and in a theater until you actually see it. I could get some elements of some shots where we needed them using Power Windows [in the DaVinci Resolve] but sometimes that heavy correction was too much and the effects elements would have to be altered. Maybe the sky needed to be darkened or we needed more separation in the mountains. We might make a change to the foreground, and the background would “break,” or vice versa.

So we had quite a few sessions where Richard would sit with Ridley and me and we would figure that out shot by shot.

You work with Resolve. What is it about that system that helps your creative process?
I’ve worked in it as long as it’s been around. I like the way it’s laid out. I like the way I can work… the node-based corrections. I can get to the tools that most colorists use on a normal basis very quickly, and with very few keystrokes or buttons to push. That kind of time saving adds up to a really big deal when you’re coloring complicated movies.

I know there are other great color correctors out there too, but so far Resolve is just the most comfortable for me.

(from left) Matt Damon, Jessica Chastain, Sebastian Stan, Kate Mara, and Aksel Hennie portray the crewmembers of the fateful mission to Mars.

Was there one particular scene that was more challenging than others, or a scene that you are most proud of?
There are a number of shots set outside the ship Jessica Chastain’s character commands where we see the ship and some characters in the foreground and the surface of Mars further away and then blackness and stars in the far background.

Here again, we all have a strong conceptual sense of the look, but ultimately it’s something you can’t get to without seeing it in a theater and in context. How saturated should the color of Mars be? How sharp should the focus be on the planet’s surface, on the distant stars? It’s not simply a question of having it look “real.” Ridley’s the kind of filmmaker who wants it to feel right for the story. And so I might use Resolve’s aperture correction function to make the stars appear more vibrant, the way Ridley wants it, and that could “break” another part of the shot. And then it’s a question of whether I can use power windows to address that issue or if the VFX team needs to re-render and composite the element.

That kind of massaging of every shot takes a lot of time, but when it’s done you really see the results on the screen.

Can you talk about grading for the brighter Dolby Vision 3D?
It definitely gets rid of one of the major issues in 3D: you can effectively put a stereoscopic image onscreen at the traditional 2D spec of 14 foot-lamberts. Previously, doing a stereoscopic pass always involved putting a darker image on screen, and when you have that much less light to work with it affects the whole image. That’s particularly true with highlights that might have plenty of detail at 14 but will blow out when you’re working at 3.5.

Of course, we still did a pass for traditional 3D, since there are very few theaters currently able to show Dolby Vision 3D.

Does that involve a whole different pass or a trim pass, or is it just a LUT that translates everything from the 14-foot-lambert world to 3.5?
Company 3’s technology team is always building and updating LUTs that get us a lot of the way there. But there’s never 100 percent “translation” from one set of display parameters to the other, so image characteristics change. The relative brightness of that practical in the background to the character in the shadows may not feel the same at 14 as it does at 3.5.

So which pass would you do first?
The way I work when we’re doing multiple theatrical deliverables like this is to start with the most “constricted” version [the 3.5 fl 3D] and get that where we want it. Then we go and “open it up” for the wider space. It’s important to be consistent. Very often, it’s a question of building Power Windows around bright parts of the frame and bringing them down for the regular 3D version and then either taking them off or lessening the corrections for the brighter projection spec.


For more on The Martian, read our interview with director Ridley Scott.


Quick Chat: Dictionary Films director/DP Michael Ognisanti

Dictionary Films, the production arm of Cutters Studios, has expanded its roster with the addition of director/DP Michael Ognisanti. He joins from Chicago-based production company MK Films where he worked from 2004 to 2015.

While there he trained as a motion control operator and began assistant directing and shooting under director Mark Klein. Over the years, he built a reputation as a tabletop director and DP for commercials and documentaries. His credits include spots for Bobble, Bud Light, Coors, Giant Eagle and Golden Corral.

Let’s find out more.

You recently joined the roster of Dictionary Films. Can you talk about that and why you made the move?
I’ve always loved the idea of merging the production and post worlds closer together.  One of my first jobs out of college was as a videographer for a local news station. Between the reporter and myself, we would write, shoot and edit pieces daily. Being that close to the whole process of putting a project together was important to me. That is what I’ve found at Dictionary and Cutters Studios. Editing, effects, graphics, it’s all under one roof so those channels of communication are more available. I can feel more connected to the project and give my input along the way. I’m sure the editors will love that (smile).

You have a rich background in production, but your expertise seems to be tabletop. Can you describe the differences, if any, between directing tabletop and traditional shooting?
In general, there is not a huge difference. In the commercial world at least, the goals are still the same: we are trying to find the best ways to communicate a certain feeling through our images.  Composition, lighting, blocking, environment, they all work together to achieve that. That is the same whether it’s a live-action scene or a product-only scene.

That being said, the biggest difference is that a traditional live-action shoot revolves around what the talent is doing. We rely heavily on dialogue, action or facial expressions to get our message across.  When we shoot products, we obviously don’t have that, so we have to pay close attention to how we can make our subject visually pop off the screen and draw in the viewer. This is where the details become so important.

Anyone who has ever been on a tabletop set knows what I’m talking about. The backgrounds and surfaces and propping become much more essential to our work. Also, many product shots accompany the live action, so there is a constant battle for screen time. You may only have a couple of seconds to grab someone’s attention, so you have to make it count.

What about being the DP on a tabletop vs. traditional shoot?
I’ve found I use a lot of the same techniques on a traditional shoot that I would on tabletop, only on a bigger scale. It’s still crafting light to make the subject look interesting. Some of the lights might change but you still have to make the same decisions. Should the lighting be hard and contrasty, or soft and airy? It all depends on the message we are trying to convey.

Different types of shoots, different type of vibe?
I think tabletop does move at a slower pace. In some ways it can be more like a still shoot. The sets are usually smaller and more low-key since we are rarely dealing with talent and extras and intense location changes. However, I do think it’s a more detail-oriented way of shooting. We work on a micro level. We spend a lot of time making intricate adjustments to the lighting and framing; something you don’t normally see on a live-action set.

Any tips for those working in tabletop? What do they need to know?
Give away my secrets? Are you crazy?! I’m kidding, of course!  Nowadays you can pretty much learn anything on the Internet anyway. One thing I would say is that it’s easy to get too attached to one shot and not realize that what you are doing is a part of a bigger piece. Everything we shoot should be able to be put into one coherent piece. So you might have a really cool idea for a particular shot, but you have to ask if it actually works with everything else you are shooting.

Also, many times we are shooting product to accompany a live-action piece. Do the elements all work together cohesively? That is really important and sometimes it can get lost when we get sucked into our own world. I’ve also learned that it’s important to be very nice to the food stylists — they can be life-savers!  You can have the coolest, most dynamic shot ever, but if the product doesn’t look good, it’s all for naught.

Michael Ognisanti’s resume includes spots for Bud Light Lime and Kitchen Aid.

Does the experience of working in tabletop help when you shoot more traditional pieces? What about the other way around?
I think they can complement each other nicely. For me, I like working on as many different types of projects as possible. That way, you pick up new tricks along the way and apply them in different situations. I found things I thought we would only do in tabletop work on a live-action set. The reverse is true too.

Can you talk about the tools you use?
In terms of camera gear and lighting, we use a lot of the same gear as we would for traditional shoots, although our lights might be smaller since our sets are smaller and we don’t need as much power. I’ve found macro lenses are important since sometimes we want to get right in on the subject but still want the feel of a wider lens, so the close minimum focus is key. Motion control is also a big asset for us. I was trained as a motion control operator (using Kuper Controls), so designing dynamic camera moves can really bring uniqueness and intrigue to the shots. Especially since much of the time the products are simply sitting on a surface, adding some dimension can be a nice eye-catcher.

In addition to the gear we use, any good food stylist will have their own bag of tricks that can help food stay fresh on set under the hot lights.

What project are you most proud of?
I don’t know if I can name one project specifically, but the most rewarding jobs for me are the ones that involve a lot of collaboration and problem solving. It’s inevitable that you will get stuck on a shot and either the specific action you are trying to achieve isn’t working or maybe the shot just doesn’t look good. But when we all come together — agency, client, crew — and find the answers in a collaborative way, that’s the best part. That’s when I feel most proud.

—-

Cutters Studios is a full-service company with offices in Chicago, Detroit, LA, New York and Tokyo. The Cutters Studios group also includes Dictionary Films, Chicago-based sound company Another Country, design/animation/VFX company Flavor (which has offices in Chicago, LA and Detroit) and Detroit-based Picnic Media. 

Checking in with ArsenalCreative since transition from ArsenalFX

A couple of months ago ArsenalFX owner Mark Leiss, along with executive creative director Kaan Atilla and executive producer Cortney Haile, launched their content creation studio ArsenalCreative in Santa Monica. So ArsenalFX is now ArsenalCreative, focusing on design, branding, animation and visual effects for commercials and other entertainment entities.

We decided to check in with Leiss to find out how things were going and what led him and his partners to start ArsenalCreative.

Is this an outgrowth from your already established ArsenalFX?
ArsenalFX has changed its direction to be more creatively centric, and we’ve changed our name to reflect this. ArsenalFX was primarily a high-end finishing company focused on commercial-based projects with limited design/animation and visual effects. ArsenalCreative is a complete expansion of that. With top talent focusing on commercials and anything that requires a creative solution — experiential, digital, titles, VR, etc. — ArsenalCreative will push creative boundaries and take a more central role in the creative process. We also will continue to maintain our high-end finishing business, as it goes hand-in-hand with our new direction.

You and Cortney Haile were with ArsenalFX, but Kaan Atilla is an outside hire. Can you talk about what he brings to the studio?
Kaan brings years of creative experience to ArsenalCreative. It was something I wanted to add to the roster, as we are now pitching many creative projects. His knowledge of the industry, ability to envision and execute exceptional creative as well as his experience working with top brands in the world is a game changer for the direction of this new company and a huge advantage.

How do Kaan’s and Cortney’s strengths differ?
Cortney is very focused on the technical delivery and day-to-day running of the jobs, while Kaan is focused on the creative aspect of the project.

What tools — software, hardware, set-ups, etc. — are used at ArsenalCreative?
Autodesk’s Maya, Maxon Cinema 4D, The Foundry’s Nuke, After Effects and the rest of the Adobe Creative Suite. For storage we use Isilon and Dot Hill.


Are there any jobs just finished or on the horizon that you can talk about?
We recently finished an exciting Amazon Kindle job for the UK — it’s a digital display at Waterloo Station in London that helped launch the #haveKindlewilltravel summer campaign. The station houses the largest indoor digital display in the UK — it measures 131 feet wide and 10 feet high.

Quick Chat: Walter Biscardi on his new Creative Hub co-op

Post production veteran and studio owner Walter Biscardi has opened The Creative Hub within his Atlanta-area facility Biscardi Creative Media (BCM). This co-op workspace for creatives is designed to offer indie filmmakers and home-based video producers a place to work, screen their work, meet with clients and collaborate.

Biscardi has had this idea in the back of his head for the past few years, but it was how he started his post company that inspired The Creative Hub. After spending years at CNN and in the corporate world, Biscardi launched his post business in 2001, working out of a spare bedroom in his house. In 2003 he added 1,200 square feet to the back of his house, where he ran the company until 2010. In January 2011 he moved into his current facility. So he knows a thing or two about starting small and growing a business naturally.

Color grading

Let’s find out more.

Why was this the right time to launch this co-op?
The tools keep getting smaller and more powerful, so it’s easier than ever to work at home. But from time to time there is still a need for “bigger iron” to help get the job done. There’s also a need for peripherals you might want to use, such as the Tangent Element panels and FSI monitors for color grading, but making that investment for just one project isn’t feasible. Or maybe you’re planning a large project and would like to lay out your storyboards and plans where everyone can see them. Our conference room has 30 feet of corkboard and a 10-foot dry erase wall that is killer for production planning.

How will it work?
We have a beautiful space here, and oftentimes we have rooms available for use. In the “traditional post production world” you would charge $50-$175/hour just for the suite, but many indie filmmakers — and even many long-form projects like reality shows and episodics — just don’t have that kind of budget. So I looked at the co-op office space model for inspiration on how to set up a pricing structure that would allow the maximum benefit for indie creatives and yet allow us to pay the bills. We came up with a basic hourly/daily/weekly/monthly pricing structure that’s easy to follow with no commitments.

I think the time has been right for the co-op creative space for at least two years now; it just took this much time for me to finally get my act together and get everything down on paper.

What’s great about the co-op space, too, is that we hope it’ll foster collaboration by getting folks out of their houses for the day and into a common space where you can bounce ideas off each other and create those “Hey, can you come look at this” moments. You see a lot of that online, but being able to actually talk to the person in the same room always leads to much better collaboration than a thread of responses to your online video.

One of the edit rooms

Can you talk more about the pricing and room availability?
Depending on the room, we have availability by the hour, day, week and month. Prices are very straightforward such as $100/day for a fully furnished edit suite. (See pricing here.) That includes the workstation, dual monitors, Flanders Scientific reference monitor and two KRK Rokit 5 audio monitors. Those rates are definitely below “market value” but we have the space, the gear and we’re happy to open our doors and let filmmakers and creatives come on in and have some fun in our sandbox.

The caveat to all the low pricing is that it is restricted to standard business hours only. Right now that’s 8am-6pm. This follows with most of the co-ops I researched and if folks wanted to have 24-hour access or longer access to the space, that would be priced according to their needs. But the rates would revert to more market standard rates with overnight being more. We’ll see how this goes and if it takes off, we could always run a second shift at night to help maintain a lower rate in those hours.

What about gear?
For editorial, graphics, animation, sound and design, we have the full Adobe Creative Cloud in every suite. Four of the suites run Mac and one room runs Windows. Every suite has a Flanders Scientific reference monitor connected via AJA or BMD hardware.

Color grading is offered via Blackmagic’s DaVinci Resolve and Adobe’s SpeedGrade on a Mac Pro with a Tangent Elements control surface and an FSI OLED Reference Monitor.

The sound mixing theater features a Pro Tools|HD 5.1 mixing system with Genelec audio monitoring. The main system is a Mac Pro. That theater has an eight-foot projection screen and can serve as a screening room for up to 12 people or a classroom for workshops with seating for up to 18 people. It’s a great workshop space.

None of our pricing includes high-speed storage as we assume people will bring their own. We do have 96TB of high-speed networked storage on site, which is available for $15/TB per day should it be needed.

So you are mostly Adobe CC based?
Adobe is provided because that’s what we use here so it’s already on all of the systems. By not having to invest in additional software, we can keep the rates low. We do have Avid Media Composer and Final Cut Pro on site, but they are older versions. If we get enough requests for Avid and FCP, we can update our software packages at a later date.

———
Walter Biscardi is a staple on social media. Follow him at @walterbiscardi.

Quick Chat: Ron Pomerantz brings his experience at Disney to Pongo

Not long ago 25-year-old creative agency Pongo hired Disney veteran Ron Pomerantz. In his new role, Pomerantz is heading up Pongo’s new creative content division, where he is responsible for developing strategic content for entertainment and retail brands. The new unit will develop and produce interstitial content, PSAs, commercials and B-to-B integration.

We reached out to LA-based Pomerantz to pick his brain about the move from Disney to Pongo (@GoPongoGo) and what he sees as the future for this veteran agency, whose credits include work for CBS’ CSI: Cyber, ABC’s Shark Tank and Disney Channel’s Teen Beach 2.

How long had you been at Disney, and why was now the time to make a change?
I started at Disney as a freelancer in 2000 and was hired in 2001 to help with the logo redesign and re-launch of SoapNet. In 2006, I was asked to creative direct Playhouse Disney (now Disney Junior) and Disney Channel.

I really had a great 13-year run at Disney in five different positions. I had the privilege to rebrand several of the networks multiple times, launch Disney Junior and manage and promote some really spectacularly relevant IP. But, of late, I have been eager to stretch my wings beyond the kid space. It was time for a change and time for me to return to my creative roots — making entertaining content and great brand marketing.

Pomerantz will be working directly with (L-R) chief marketing officer Cary Sachs and CEO/president Tom McGough.

Why Pongo? What excites you about this company and this segment of the industry?
I have worked with Pongo for almost 20 years. I have a great relationship with them and they are perfectly in line with what I want to do and explore next. Both Pongo and I want to make more pro-social, storytelling pieces in addition to great promotion and marketing… the time was right for both parties.

What changes have you seen in this industry over your career?
Well, the industry has changed quite a bit as we all have seen. Viewing habits have changed drastically as social media has risen. I think we are all aware of what is happening, but one of the strongest effects I have seen is the lack of channel branding in breaks.

Networks are increasingly promoting other entities rather than their own IP in their breaks. There is an increased franticness in the pace of promotion, and the need for ratings is driving many decisions, so brand building has taken a back seat. All this is not necessarily bad as content has become multi-platform and the content needs to be branded as much, if not more so, than the brand that made it.

How will you use your skills, honed and perfected, at Disney to help Pongo?
Disney gave me global experience with beloved brands that had to be interpreted locally. Finding emotional connections to brands and their content is at the core of what I did at Disney and that is exactly what Pongo does as well.

What do you see for Pongo’s new creative content division?
Pongo has hired me to oversee the development of strategic branded content for entertainment and retail brands. Together we are going to develop and produce interstitial content, PSAs, commercials and philanthropic presentations, but we will also delve into branding and global brand identity strategy. Our goal is to help clients uncover their brand essence and translate that into multi-platform content and different brand expressions.

How much will you be involved in the other parts of Pongo’s business?
As much as they will let me!

Quick Chat: Hobo’s Howard Bowler talks about his pot reform campaign

By Randi Altman

The president of audio post house Hobo in New York City has a passion project, and it involves legalizing marijuana — you know, weed, pot, grass, Mary Jane, kush, bud. That stuff. The “End Prohibition Now” PSA campaign, which was funded, produced and posted by Hobo, supports reform in marijuana enforcement policies.

The campaign is made up of TV, radio and Internet ads targeting states that will be voting to legalize marijuana. Hobo (@hoboaudio) has made these spots available for free to broadcast outlets and organizations interested in spreading the word about this issue. All of the spots can be customized for different regions. Check out the video spot here, and the radio spot here.

Let’s find out why this is such an important topic to Hobo and Bowler, and how they went about conceiving, producing and posting the campaign.

You funded this campaign yourself?
Yes. The more I learned about the origin of prohibition, the more I realized these laws have a complex political history that is not based on science or health, and yet their social impact is huge. Last year alone, 700,000 people were arrested on marijuana-related charges. Think about that. That's more than for all violent crimes combined. It makes no sense.

So I could see that there was a lot to this issue and that the current efforts of organizations like MPP (Marijuana Policy Project), NORML, LEAP, and DPA (Drug Policy Alliance) could benefit from professional creative marketing support.

Why is this such an important message for you to spread?
Two members of my family were arrested, and although the charges were eventually dropped it was costly to get them out of the system. The whole experience made me wonder why marijuana was illegal in the first place. What I found out about the history of prohibition got me angry, and then it got me thinking.

You acted as creative director on these. How did you come up with the concept, etc.?
I'm very interested in history, politics, science, culture and the arts, and all of these subjects intersect with marijuana. Take, for example, this line from one of the radio spots, where a voiceover actor portraying President Richard Nixon says, "Every one of those bastards out for legalizing marijuana is Jewish." That's a direct quote from a secret White House recording made during a discussion on whether marijuana should be legalized. I didn't make it up. It's the most fertile creative soil I've seen in a long time. There is so much material.

How did you work with the animator/editor on this? The animation was new, but given a vintage look?
We worked with editor Matt Hartman on the visuals. He had to get very creative since we had very little money to work with, so the public domain footage was actually vintage. A Google search turned up the usable footage we needed. He edited with an Avid Media Composer and did the motion graphics in Adobe After Effects. Everyone who has seen the spot tells me it was eye opening. I thank Matt for the excellent job highlighting those pesky facts.

Are there more spots to come? 
There are a lot more in the works. Many at Hobo have been contributing to the creative effort. Chris Stangroom (VP at Hobo) has written several of the spots. Julian Angel and LoudPack Zack have contributed music. All the guys have helped with the mixes.  It’s a team effort.

Can you talk about the mix and what gear was used?
Pro Tools|HD, various plug-ins, VO recorded with a Neumann U87 mic. The music was recorded with Pro Tools.


Anything else we should know?
We learned throughout this process that when one is passionate about a subject that passion can turn into power. We even wrote a song about it that we plan to release at a later date with the line, “Guided by the light of justice, with a gospel ever strong, we welcome you to freedom with a liberation song.” When it comes to ending prohibition that sums up everything we’re doing.

A Closer Look: Interstate’s work on Master & Dynamic headphones short

By Randi Altman

To tell the story of how their high-end MH40 headphones are made, Master & Dynamic called on New York-based production company Interstate to create a film educating potential buyers. Interstate is the US branch of the Montreal-based integrated production company BLVD. It’s run by managing partner Danny Rosenbloom (formerly with Psyop, Brand New School) and creative director Yann Mabille (formerly with The Mill, Mill+).

This almost 1.5-minute piece talks about the materials that go into creating the headphones, describes the manufacturing process and explains why it's all meant to maximize sound quality. There are images of molten metal, flowing liquids and magnetized metal shavings that transform into headphone components. To create the finished look, Interstate captured as much as they could in-camera, shooting with a bird's-eye view and a mix of stop motion and visual effects.

For the melting aluminum effect in the liquid aluminum sequence, Interstate used gallium, a material also used in the original Terminator movies, casting an aluminum ingot from it and melting it on camera.

According to Interstate EP Rosenbloom, “The material melts at roughly 80 degrees Fahrenheit. It’s the same stuff some magicians use to bend spoons with their minds — not all of them, of course, because the good ones really can bend spoons with their minds!”

Interstate’s Mabille, who co-directed the piece with Adam Levite, answered our questions and helped us dig a bit deeper.


How early did Interstate get involved with the project? 
We started to get involved during the final stages of setting up Interstate, which makes this project our very first. We thought it was a great way to start.


Were you involved in the creative, or did the client come with that already spelled out?
Miles Skinner, who is a freelance creative director for Master & Dynamic, wanted to create a sequence that suggested a building process that had a specific elegance and artistic value while showcasing the beautiful qualities of the raw materials used to build the headphones.

At the same time, the goal was to stray away from the traditional pipeline representations, which are usually hands or machines interacting with objects etc. We were tasked with finding creative solutions to implement Miles’ ideas. We conceived a semi-abstract representation of each of the main steps of the building process, starting by glorifying the raw materials, processing these materials in an interesting manner, and eventually ending it in an elegant way to showcase the finished product.

How much is live-action versus VFX?
The product is very well designed and has a great finish, so we knew that it would look great on camera. Adam and I love macro-photography and were keen to feature the natural beauty of raw and noble materials on a small scale. This naturally led to trying to shoot as much as we could in-camera, therefore limiting the role of VFX in the sequence.

That said, CGI was used to animate certain elements that would have been challenging to puppeteer on such a small scale. We used 2D to add light interactions across the lens, and to clean up and retime shots. We wanted to retain a physical approach from the very beginning to keep all the wonderful qualities of the raw materials as genuine as possible.

Did you do any prepro?
Indeed. Most of the prepro was spent getting to know the materials we were going to work with and how to best represent the headphones, as well as all the components used for the construction process. For example, we ended up using gallium to simulate melting aluminum, and a specific metallic powder was brought to life to shape components, such as steel screws, which were also made out of wax that we then melted. Overall, it was obviously much easier to film deconstruction and reverse the footage to give the illusion of construction.

Can you walk us through the production and post workflow? What did Interstate shoot on?
Alexander Hankoff was the DP. I had worked with him when I was at The Mill, and I always wanted to work with him again as I knew he had a great eye for macro-photography. He can find beauty where you expect it the least. He did a great job over the two-day shoot.

We shot the whole spot on a Red Epic camera, most of it at about 120fps. Also, production designer Jaime Moore and her prop master, Gino Fortebuono, were indispensable to the process and did a great job bringing this to life. We shot the whole sequence in a fairly big studio to make sure we could use different set-ups at the same time.


Interstate produced and provided some post, but you also worked with BLVD. What exactly did they provide in terms of post?
Most of the time we will do all the post internally, but in this case we could not do all of it as we were just starting the company. BLVD was the right choice to help with the 3D and some of the 2D components, but their audio experience was key, and they also did a great job with the sound design.

How did you work with them on approvals?
We had daily reviews, which were all remote but hassle free. Everyone was really responsive and engaged throughout the process.

What tools were used for the edit, VFX and color?
Apple Final Cut, Autodesk Maya and Blackmagic DaVinci Resolve for color.

How did you describe to your colorist, Tristan Kneschke, the look and feel you wanted for the piece?
One of my favorite parts of the process is establishing a color look, but I also think it's crucial to sleep on it. It's important to step back when you're coloring, since your first pass will often be either too extreme or off tone. Keeping a fresh eye is the hardest thing to do while coloring. Luckily, we were able to do that with Tristan. We established a look, which we then refined over the course of a week.

What was the most challenging part of this project?
Besides figuring out how to get the most out of the materials we had — the components that make the headsets or the materials used to shape specific objects — the conceptual phase was crucial and the most challenging. It was key to find the right balance between an overly abstract and removed representation of the actual building process, and an elegant and somewhat explicit representation of that same process. It was important not to get too far away from a clear and palpable depiction of what happens to the materials in order to constantly keep the audience hooked and able to relate to the product.

What haven’t we asked that’s important?
The client was amazing — they really gave us total freedom. Miles is a rock star, every idea he had was great and everything we proposed he quickly came on board with. As a company, we really wanted to make sure that our first piece out of the gate was memorable. I think we got there.

Quick Chat: ‘Ted 2’ previs/postvis supervisor Webster Colcord

By Randi Altman

Ted, the foul-mouthed but warm-hearted teddy bear, is back on the big screen, this time fighting for the right to be recognized as a person — he really wants to get married — in Ted 2 from director Seth MacFarlane. Once again, this digital character is seen out and about in Boston, in all sorts of environments, so previs, mocap and postvis played a huge role.

Cue Webster Colcord. He was previsualization and postvisualization supervisor on Ted 2, reporting to Culver City, California’s The Creative-Cartel. Colcord held similar titles on the first Ted, serving as the production’s previs/postvis artist and mocap integration artist. He also worked on Ted’s appearances between the two movies — The Jimmy Kimmel Show and the Oscars. He worked out of the production unit set up by Universal Pictures and studio MRC.

For Ted 2, Colcord and team used motion capture, via the Xsens MVN system, to record MacFarlane, who also voices Ted, as he acted out scenes. Because it’s an inertial system, MVN allowed the character (and director) to step out of the mocap volume and onto the streets, something that couldn’t be done with an optical offering.

Colcord has been working in CG since 1997. Prior to that he was a stop-motion animation artist. "I do all kinds of things," he says. "Previs, animation, postvis and supervision. Mocap is not my usual gig, actually! Right now, I'm animation supervising at Atomic Fiction (Flight, Star Trek Into Darkness, Game of Thrones) in the Bay Area."

We reached out to Colcord to find out more about his process and the workflow on Ted 2.

You worked with The Creative-Cartel and Jenny Fulle. What was that relationship like?
Creative-Cartel has been the VFX management unit on the Ted movies, so they oversee the dissemination of assets between the different parties involved and planning. They are involved every step of the way, from pre-production through to final delivery.

The on-set duties for all of us tend to be all-engrossing but after principal photography, when I am in-house with the editorial department doing postvis, I’m supporting just the editorial department and the VFX teams. At a couple of points in the schedule, however, we were prepping for re-shoots on stage with the main unit, mocap at the editorial office, postvis for upcoming screenings and delivery of synced mocap to the vendors. It could be overwhelming!

Whose decision was it to use Xsens? Do you know if that’s what they used on the first Ted?
During development on the first movie, producer Jason Clark and VFX producer Jenny Fulle researched and tested various mocap options and arrived at Xsens’ inertial mocap system, which was very new at the time. It was decided to go with the Xsens MVN system because of the ease of set-up on location. You don’t need to set up a volume, and it’s very portable — the set-up is minimal. Also, there are no marker occlusion issues.  It has a few limitations that the optical systems do not have, but with every update those differences become less and less.

There is a big dance sequence in the film. It must have been particularly challenging to capture the movements of a completely CG character?
It was a complex sequence, and it blends in from a previous scene with Ted dancing in a different environment, adding to the complexity.  The credit for working out the choreography goes to Rob Ashford, Sara O’Gleby and Chris Bailey.  Also, of course, VFX supervisor Blair Clark.  It’s important to understand, though, that the mocap system just provides a core performance and the final Ted is a blending of keyframe animation (Iloura did the dance sequence) and mocap. My role was to facilitate the performance and get it over to the VFX team with a high degree of fidelity and in a pipeline-ready form.


We ended up capturing it in about four sessions, with five different dancers, each of whom acted out Ted's motions for various parts of the choreography. During production on Ted 2, Xsens released an updated version of their system, which they call MVN Link. The sensors are smaller, the data has been improved and the wireless signal uses Wi-Fi rather than Bluetooth. So we used that version of the system for the dancers. For Seth's performances we used a fully wired system with an umbilical cable attached to the computer, as Ted is usually not very acrobatic in his motions.

What’s the workflow like?
We recorded a live feed of the mocap on the low-res Ted model from Autodesk Motion Builder, while we captured the data.  In some cases editorial was able to comp this into shots to use as postvis, pretty much right out-of-the-box.


Postvis in progress on the "Ted hooker" scene: Colcord and the postvis team were called on the day of a screening to help make a joke "play," as per MacFarlane's direction.

So you captured the data and sent it to Iloura and Tippett Studio?
Yes, I would retarget the data in Motion Builder, then sync the data in Autodesk Maya with a minimal amount of clean-up and send it off to both houses. The data would also be used as the core performance for many of our postvis shots.

If Ted makes any more appearances on talk shows/awards shows will you be using Xsens for that too?
I assume that we will be using the MVN system since we have an established pipeline. It’s pretty much the same as what we do for the feature, but it depends on who is doing the editorial duties, since the first pass at deciding which part of a mocap performance is used is made by the editor.

Quick Chat: Utopic editor Kat Pryor on Porsche film

By Randi Altman

Utopic editor Katherine Pryor didn’t grow up a racing fan, but a recent short film for an iconic car company turned her head. New York City-based production company ADDigital and Chicago edit house Utopic teamed up on a documentary-style film for Porsche. Director Sam Ciaramitaro and Pryor worked side by side on the web offering, via agency Cramer-Krasselt Chicago.

The five-minute-plus film, called The Enduring Bond, is the first of two long-form projects Ciaramitaro and Pryor are slated to collaborate on. The second one will shoot at Road Atlanta this fall. The Enduring Bond, which shot over four days this past March, offers a fly-on-the-wall style and features two personal stories: one showing how much the 12 Hours of Sebring endurance race means to the crews and drivers from Porsche, and another following a family that attends the race year after year.

We reached out to Pryor to find out more about editing the project, and working through 25 hours of footage, as well as her collaboration with the director.

How early did you get involved in the project? 
I had several conversations with director Sam Ciaramitaro prior to production. I had worked with him before, so when we knew we were teaming up again for this one, he would send me ideas and we would discuss things like music, style and pacing.


Can you talk about how you worked with them before the edit process began?
I was given some "homework" before production. There were a few docs I watched as examples of great techniques for fly-on-the-wall-style documentaries. In addition to that, I had a learning curve for what endurance racing actually is. I had never seen a car race or understood the culture of fans and drivers, so I watched the feature film Le Mans (Steve McQueen) and Senna, a documentary about Formula 1 driver Ayrton Senna composed entirely of found footage. This all helped get me into the driver's seat POV, so to speak.

You mentioned that you and the director had worked together in the past. That must have made things a bit easier?
Sam and I have a great shorthand. There was a lot of collaboration back and forth during post. Also, it was great to be on set and see him come back from a location excited to tell me what they had just captured and what to keep my eyes open for as I looked through footage.

So you were on set, not near set?
The set was the entire track — 3.74 miles — and the surrounding areas where fans were camped out all weekend. We were set up in a room where media and TV people were stationed, behind the grandstand. I was editing on set with my assistant Christen Nehmer. On several occasions we were able to hop in a golf cart and head out to a location for parts of the shoot. I really couldn’t have done this without her there – she was syncing interviews and logging wild sounds right alongside me so I could focus on pulling selects.

It was essential to be there to have a grasp on where and how everything was shot, and how sound was captured. We were also able to go into the pit area to observe. Having been at a race track for four days, I can now distinguish the sound of a Porsche engine from any other race car!

L-R: DP Steven Huber and director Sam Ciaramitaro.

What was the piece shot on, and how did they come to that specific format/camera?
We had two very agile and talented DPs — Anthony Arendt and Steve Huber — who shot on Sony A7S cameras paired with Atomos Shogun 4K recorders. These cameras are great for their small size and ease of getting around quickly. The footage looks fantastic. And with everything at 4K, I had a lot of opportunity to blow shots up and move around the frame.

When did you start getting footage, and what was the workflow like?
We started getting footage on the second day of the four-day shoot. We were set up near the DIT, and as he transcoded the footage, we would then copy to our drives. We worked off of 15-inch Apple MacBook Pros, running Adobe Premiere CC, with 4TB G-Tech G-RAID Thunderbolt drives.

It was a 12-hour race on Saturday, day four of the shoot, so we had lots of footage trickling in as the day went on. We spent most of Saturday organizing the three days of footage we had already gotten. By Sunday morning, we had everything in hand, which was around 25 hours of footage. Then we got on a plane and flew back to Chicago. Sunday night Christen and Jarrad Quadir, another rockstar Utopic assistant, transferred all the footage over so I could continue editing on Monday without missing a beat. Sam came to town Tuesday and by Thursday we had about a nine-minute working cut.

What kind of direction did you get about the edit?
The race itself was never the focus of the piece. Nor was the goal to sell Porsches. The focus was to tell the human side of motorsports. To let the personal stories unfold, side by side. I knew that the meat of this was going to be the family’s story, followed by the driver, Jörg Bergmeister. With this kind of documentary style, there are always surprises. There was no traditional board or script to work from, so I started with the director’s treatment.

How would you describe your creative process on this one?
I watched every frame before I started laying anything into a timeline. I was extremely disciplined. With 25 hours of footage, it would be tempting to skip through a lot of it, but I made sure to screen everything, pull selects and really just digest all of it. With that amount of footage, I wanted to be sure of everything we had to work with.

Next, I started with Sam’s treatment as my roadmap. Everything in his treatment was captured. It came down to finding the most essential and best moments to tell the family’s story, and to balance that with the driver’s perspective leading up to the race. It was also important to fill it in with cinematic moments — like Jörg shaving and then driving to the track with a fellow race car driver. The goal was to create tension and build up the characters, then end with the beginning of the race.

Was there a part that was most challenging?
I think the challenge was what to do once we got to the actual race! Since it was not a focal point, I wasn’t quite sure how to wrap it all up. I felt like we could just keep building and building. Once I saw what an emotional story we had from the Diaz family, I knew we had to end with that. In fact, Sam discussed it with me on set immediately after he shot it — that was the ending…  Javier’s tears. So I definitely had that in mind from the beginning when I started cutting.

The actual race result was a bit of a surprise. Porsche ended up having a rough final hour of the 12 and lost despite holding leads throughout the day. Later in the post process it became important to get their message across, which ultimately ties back into the theme of The Enduring Bond. Creating the drama of this losing moment while still maintaining their will to win was a bit of a challenge.


What are you most proud of?
I absolutely love this piece. I’m so thrilled to have been brought into this project by Sam, ADD and CK. I would say I am most proud of the sound design that I built with the variety of elements I had — original music, ambience, pit-to-car radio communication, track announcer voices, wild track audio of the race and sync sound from interviews. It was definitely outside of how I normally work on projects. I really pushed myself to build and layer the audio especially during the rough-cut stage. Then, of course, I worked closely with Brian Leitner, Utopic’s sound designer and music composer.

Can you talk more about the music?
Before production we created an original music track, based on direction that Sam had given us. I wanted a track to cut with that could eventually be post scored. Instead of doing a traditional music search or getting hooked on a song from a band or soundtrack, I asked Brian Leitner to create something. It turned out to be an amazing piece of music, and perfect for this film.

Chatting with new Aardman 2D director Åsa Lucander

By Randi Altman

Not long ago 2D director, illustrator, animator and graphic designer Åsa Lucander joined the iconic Aardman Animations team in Bristol, England. This Finnish-born artist first came to England to study illustration. After being introduced to animation, her world opened up a bit further, and she has never looked back.

After the studio she worked at shut down, she found herself at Aardman (@aardman) and on a new path. We reached out to Lucander to find out more about her past, her current job and her work ethic.

You recently joined Aardman — can you talk about why this studio is a good fit for you and your artistic talent?
I’ve always been a big fan of Creature Comforts, Wallace & Gromit and all the other endlessly funny pieces that come out of Aardman. But like so many others, it is the wonderful stop-motion animations that I associate with Aardman.

What I’ve recently learned is that Aardman is about so much more, and with other great styles. I hope I can bring a different layer to Aardman’s talent, coming from a 2D background and from a small studio where you are hands on in all the stages of the production. I’m sure I will learn a lot from Aardman, but also hope that I can bring a new perspective to jobs I will work on.

Aardman has quite a reputation in the industry and a long and storied history. Was it at all intimidating to be joining them or just exciting?
Absolutely! Aardman is like a British institution, and of course you feel slightly intimidated. But I was also very excited about surrounding myself with such great talent. I can only see that as inspiring. I do thrive on challenges, which force you to push yourself and develop.

Can you tell us a bit about your background and how you ended up at Aardman?
I'm originally from Finland, where I studied graphic design before going on to work in an advertising agency. After two years I decided I wanted to return to my studies, as drawing has always been my biggest passion. I moved to London in 2001 to study illustration, and it was here that I first came in contact with animation. It was like a whole new world opening up for me. Seeing your drawings come to life is like creating pure magic. I'm sure many people have the same epiphany, and after that moment I was hooked.

Stills from Lucander's short film Lost Property.

We had a guest teacher, Tom Mortimer, teaching us animation in the third year of my studies. He asked if I wanted to come and work for his company, 12foot6, after graduating. I did a work placement first and then was hired full time. I stayed there for 11 years! Sadly, last year the company shut down, and that wave brought me to Aardman. It was a whirlwind of a year… I finished my short film, Lost Property (shown above), the company shut down, I had a baby, I got taken on by Aardman and relocated to Bristol. But I'm very much enjoying the ride.

How does having expertise in illustration, graphic design and animation inform your work?
I think the more tools you have in your bag the better. I’ve always loved drawing, and I feel the visual look is as important as the rest in animation. A bit of knowledge in graphic design helps with the overall layout and balancing elements and texts as well. It all works hand in hand really. I do like to be hands on and be part of each stage of the process.

What tools do you typically call on?
I always start with drawing in my sketch book, then tracing characters in Adobe Flash, where I animate them. I use the same process for my backgrounds, and then mix up all the textures, collage and digital painting and put them together in Photoshop.

Can you describe your directing style?
I do quite like an old-school style of animation, re-drawing quite a lot frame by frame rather than using too many symbols and tweens. I also like to pay attention to small details, or small sub-plots, that help to make the outcome quirky and interesting… drawing in the viewer and creating a multilayered animation that is much more than what first meets the eye.

When starting a new project, what are the first steps you take?
It depends on the project but if I’m working on a script/brief I start by getting into a certain mindset and then read the script over and over again, scribbling notes on the side of the page — little ideas, details, doodles and sub plots. Then I begin my research by collecting influences, reference pictures, looking for color palettes and other inspirations for the visual look of the project. Then I sit down to draw the characters and nail the look of the background art. What follows is a storyboard, animatic, animation and so on.

You mentioned your short film, Lost Property, which is playing at some festivals at the moment. Can you talk about it?
The idea for the film came to me when listening to the radio. Someone was talking about a lost property office. I thought the subject matter sounded very interesting. All the very strange things that people leave behind or lose in one way or another.

Stills from Lost Property

I found this world could be an intriguing starting point to base a short film on. Developing the idea further, I imagined that a lost property office could represent something else, be a metaphor for something different — our mind.

Lost Property (above) is a love story, but above all it is about the fragility of the mind — how we take it for granted, and how lost we are without it. It is about hope, persistence and devotion. It portrays an illness — Alzheimer’s — that robs us of who we are.

After the initial idea, the script came fairly quickly to me. I had one of those rare moments of having a dream, and when I woke up all the loose ends tied together and I pretty much wrote down the script there and then. It was five o’clock in the morning.

Visually I wanted to do something different to what I had done before. I explored digital painting and spent a lot of time getting the visual look and feel with an interplay between a rich color palette and lights and shadows.

I was very lucky that when I told my producer the idea for the film, he jumped on it, and 12foot6 decided to privately fund it. So I had a great team behind me that worked many nights in order to bring it to life.

Since joining Aardman you worked on a project for British Gas. Can you describe the job?
British Gas was a fun little film about British Gas Apprenticeship. We were given audio of an interview with the apprentices, and from there we worked out the script. It was a very quick turnaround and we had to animate 20 seconds in only one week.

Stills from the British Gas film

I felt a bit faint to start with when I heard the schedule, but it’s surprising how much you can achieve when you have to be time-efficient and organized to the second. It was also a pleasure to work with the creative team from Ogilvy.

Do you have any advice for young people just starting out in the business? Maybe something you wish someone told you early on?
Work really hard, and always follow your strengths and dreams. Nothing is impossible.

Quick Chat: SuperMeet founders Michael Horton and Daniel Bérubé

On June 26, the first Bay Area SuperMeetUp is taking place at the McEnery Convention Center in San Jose, California. Held in association with Future Media Concepts and the FCPX Creative Summit, the Bay Area SuperMeetUp will feature, among other things, a keynote by Randy Ubillos, former chief architect of Final Cut Pro.

This all seems pretty cool, so we decided to reach out to the event’s architects to find out more about this SuperMeetUp and the other events they hold all over the world. The following is our Quick Chat with Daniel Bérubé and Michael Horton (pictured left to right above). Bérubé, of the Boston Creative Pro User Group (BOSCPUG), is co-producer of these SuperMeets with Michael Horton, the founder of the Los Angeles Creative Pro User Group (LACPUG).

What is a SuperMeetUp versus a SuperMeet?
SuperMeets are gatherings of Adobe, Avid, Final Cut Pro and DaVinci Resolve editors, gurus and digital filmmakers from the US, Europe and the world over. A SuperMeetUp is the same thing, only smaller. There is room for only 300 people at this event in San Jose, thus we are calling it a SuperMeetUp. This will be the first-ever Bay Area SuperMeetUp. We’ve also previously held SuperMeetUps during SXSW and in Boston.

How did this upcoming SuperMeetUp come to be?
While attending the 2015 Editors Retreat in Daytona earlier this year, Dan had an opportunity to discuss with Future Media Concepts his idea of holding one of our SuperMeet events alongside their three-day FCPX Creative Summit in San Jose. We thought that might be fun, and FMC loved the idea! So after a few weeks of figuring out how to do this, we decided that a SuperMeetUp would be perfect to hold during the Summit.

Do you have to register or attend the FCPX Creative Summit in order to attend the SuperMeetUp?
No. The SuperMeetUp is held in the same location as the Creative Summit, but it is an entirely separate event. Everyone is welcome. All you need is a SuperMeetUp ticket.

Does the main interest for attendees need to be in FCPX in order to make the SuperMeetUp worth their time?
Not at all. This event is not just about Final Cut Pro X. We will have presentations from Blackmagic Design, Adobe, Other World Computing and Kanen Flowers of That Studio.

Randy Ubillos

Having Randy Ubillos as your Keynote speaker is exciting. How did that come about?
Some of your readers might not recognize the name Randy Ubillos, but he is the former chief architect of video apps at Apple. He recently announced his retirement. While at Apple he created and worked on a range of products, including Final Cut Pro, Final Cut Pro X, Aperture, iMovie ’08 through ’11 and iPhoto for iOS. He also created versions 1 through 4.2 of Adobe Premiere. How’s that for a career?

Randy Ubillos fundamentally changed the way we all tell visual stories. He changed the world’s post industry. Entire industries have grown up around what he has created. So, yes indeed, this is exciting.

What is the Digital SuperMeetUp Showcase?
This is where all attendees can enjoy a few cocktails, network and party with industry peers, talk one on one with leading manufacturers, get hands-on demos and simply learn about what is going on in the world of hardware and software.

There will be over 15 software and hardware developers to hang out with, including Adobe, Atomos, Blackmagic, Boris FX, CoreMelt, FCPWorks, Flanders Scientific, Imagineer, LumaForge, Lumberjack Systems, OWC, Pond5, Ripple Training, Sonnet, Telestream, That Studio and others. Doors open at 5pm, so it is best to get to the SuperMeetUp at that time. Stage presentations begin at 7pm.

What is the value of attending a SuperMeetUp?
Good question. Like everything, you get out of it what you put into it. A SuperMeetUp, like a SuperMeet, or like any networking event you attend, is all about you. We provide you with the venue, some food and drink and a bunch of like-minded people who are interested in the same things you are. But it is YOU who must have the courage to go up to strangers and say, “My name is,” and take it from there. Do that and you just might meet that one person who can change your life. This event will be jam-packed with some of the brightest minds in the industry.

In just one night you will not only get a chance to meet and greet these people, but learn something as well — all for the price of a ticket and the will to get out of the house.

Will there be one of your “world famous” raffles?
Oh sure. How can there be a SuperMeetUp without a raffle? It’s a tradition and a heck of a lot of fun. Currently, there is more than $25,000 worth of valuable prizes to give away to dozens of lucky filmmakers, including a Blackmagic Pocket Camera, an Atomos Shogun, a one-year subscription to Adobe Creative Cloud and copies of iZotope RX Advanced. Plus, we’ve so much more to give away. Raffle tickets are only $2 each or three for $5.

How much does it cost to attend the SuperMeetUp?
It’s $15 per person; however, readers of postPerspective can save $5 off general admission by using code PPVIP during registration — making the cost only $10, plus ticket fee. Now that’s a heck of a deal!

Just go to http://supermeet.com and click on the Buy Tickets button, then enter the PPVIP promo code on the Eventbrite RSVP page. If someone needs a hotel room, FMC has made a deal with the Fairmont Hotel in San Jose, which is a five-minute walk from the SuperMeetUp. Just go here and click on the Register Now button. Fill out all required information, click Continue to Part 2 and select the SuperMeetUp single-night package.

Any other SuperMeets coming up this year?
Yes! The Annual Amsterdam SuperMeet will once again be held at the Hotel Krasnapolsky in the heart of Amsterdam on Sunday, September 13. It’s going to be huge. Plus, we’re exploring the idea of holding another SuperMeet in November, and we look forward to sharing an update with you soon.

Post Factory’s Alex Halpern on SIM Group merger

Post Factory NY, a long-time New York City post production staple, has merged with SIM Group. The move gives SIM Group an even larger East Coast presence — not long ago, the company’s Bling Digital opened an outpost in Brooklyn, offering dailies, offline editorial and finishing services.

Post Factory NY has two Manhattan facilities totaling 38,000 square feet — an impressive amount of space for a New York post house — with more than 60 editing suites, two color grading suites, a DI theater and post sound. Future plans call for further development of finishing services, both for feature films and dramatic TV series.

Says SIM Group CTO Chris Parker, “Our companies share a common culture that is customer-centric and service-focused. It’s the people at Post Factory NY who set it apart and that was what really attracted us to them.”

“This addition will allow us to more effectively address the needs of clients on the East Coast and throughout North America,” adds SIM Group CEO Rob Sim. “It provides a great path for both companies to grow and to offer a more comprehensive mix of services.”

Post Factory, which will continue to operate under its current name and management, gains expanded resources and geographic reach through the ability to align with Bling and other SIM Group companies, including SIM Digital, PS Production Services, Chainsaw, Pixel Underground and Tattersall Sound & Picture.

Post Factory’s clientele includes HBO, Fox, ITV and Paramount, as well as many independent producers. It has a strong record for supporting filmmakers in New York and beyond.

In the wake of this news, we reached out to Post Factory founding partner/CEO Alex Halpern, who says this move made perfect sense for the studio. “I’m excited by the opportunities and the resources available to us and our clients: the intellectual capital at SIM, the culture at SIM, just like Post Factory NY, they’re filmmakers first. They understand what it takes to tell a great story, and all of us want to partner with our clients toward that end.”

Let’s find out more…

Why was this the right time for Post Factory in terms of merging? 
This is the perfect time for Post Factory NY to join forces with SIM because it’s going to allow us to provide a host of additional services for clients spread across the globe. We’re entering the next phase as a company in a market that requires agility and multiple verticals, and SIM is exactly that kind of partner, given how they operate their existing business.

What does this merger allow you to do that you couldn’t have done previously?
It allows us to change the paradigm. We can now start working with clients from production all the way through to delivery. We can offer any part of the chain from cameras to final delivery and everything in-between or selective parts. We want to provide our clients with the best choices for their shows and partner with them to deliver the best content they can to audiences around the world.

What does this merger mean for existing clients of Post Factory?
It means the Post Factory they’ve come to love and cherish will be able to provide them with a stable, comfortable work environment with the services they’ve become accustomed to for years to come.

New York has seen increased production, especially with TV series, thanks to the tax incentives. Do you expect to see more of those productions staying in NY to post at Post Factory?
I think TV is staying in NY, and certainly Post Factory, like everyone in town, has benefited from the tax incentives and the additional work. I believe that aligning with SIM will allow us to grow our relationships with TV producers.

Quick Chat: Nick Mattingly from Switcher Studio

This young company brings multi-camera production to iOS devices

By Randi Altman

At NAB 2015, I got to see firsthand what the Switcher Studio app was capable of. During the show, my mornings were spent at the iOgrapher booth interviewing product makers and post pros about news, technology and trends. In years past, I had just one iPad in an iOgrapher rig recording a two-shot for my interviews. This year, thanks to Switcher Studio and team, I was able to record multiple camera set-ups (here’s an example). I was so impressed with the product and the young team behind it that I wanted to know more, so I reached out to Nick Mattingly, the CEO and co-founder of Switcher Studio (@switcherstudio).

Mattingly is an eight-year industry veteran who, prior to co-founding Switcher Studio, was partner/owner at a digital media agency called Arcas Digital. His primary role was web/application development. “My business partner, Dan Petrik (who also co-founded Switcher Studio), ran the production side of things and did video consulting. Together we helped a number of media companies and stations get set up to do live video. We had clients that were spending $10,000-plus to get set up to do their own live video broadcasts. Sometimes they would even hire someone just to manage the equipment and productions because the set-ups were so complex and costly.” It was that cost and complexity that led them to create Switcher Studio.

postPerspective’s set-up during NAB.

What does Switcher Studio do, and when did you create the product?
Switcher Studio is a mobile video app that makes multi-camera video production easier and more affordable. With Switcher Studio you can sync up to four iPhones and/or iPads to record and stream live video. You can also insert photos and graphic overlays and manage multi-view effects.

Typically broadcasting a multi-camera production requires a significant upfront investment. The set-up alone is a time-consuming process and a complicated mess with cameras, cables, capture cards, computers and specialized video mixing and encoding software or hardware. When you add it all up, a traditional multi-camera setup is going to run $5,000 to $10,000 and can easily be more depending on your needs.

After years of running these types of multi-camera productions and providing consulting services for others wanting to create their own online content, we started looking for a better solution. When we couldn’t find one, we brought together a team of programmers, engineers and broadcast gurus and started working on a more convenient and affordable solution for video production. Switcher Studio is the result of this work and was launched in the fall of 2014.

With Switcher Studio, almost all of the extra hardware requirements for creating dynamic multi-camera video have been eliminated. This allows content creators to focus on capturing their story rather than on time-consuming setup. It also allows users to capture professional video for events they wouldn’t otherwise cover.

All you need to get started is a single iPhone or iPad. This primary device acts as a full-featured video switcher. With it, you can use the built-in camera on your device, add full-screen images and graphics, or insert pre-built image overlays, lower-thirds and corner bugs. In addition, you can use our desktop screen-sharing app, Switcher Cast, to bring your computer in as a source and trigger multi-view effects to show multiple sources simultaneously. Every cut and edit that you make gets saved in realtime and can be broadcast online to any service that uses standard video streaming protocols like YouTube, Ustream, Twitch and others. You can even remotely manage your streaming channels with our online dashboard.

If you want to add another camera you just connect the device to the same Wi-Fi network and launch the app (Switcher Studio currently allows up to three additional inputs). From there you just start switching between sources and when the event is done you have a finished product.

Is it only for streaming?
Switcher Studio was originally built to make live video streaming easier and more affordable. Since launch, there has been great interest in using Switcher Studio to create multi-camera productions for video-on-demand playback. By default the final mixed live output is saved to the primary switching device with every cut, transition and graphic recorded, so you always have a local recording of the production. We have also introduced a new feature to allow broadcasters the option of recording on all connected cameras so you can edit full quality video in post production if needed.

Can you talk about the technology?
All devices communicate over a local Wi-Fi network. This gives us the flexibility to set up camera angles wirelessly and capture perspectives that have been difficult to manage or sometimes even impossible to get in the past. Because of this, you can seamlessly transition between shots no matter where they are positioned.

If you are producing on-demand content, you don’t have to be connected to the Internet, just use your router or even a mobile hotspot as a hub for communication between the devices. When you’re ready to go live, just make sure the router is connected to the Internet and you can start streaming. The great thing about this approach is that you can always create a bigger network or even use additional antennas and access points to increase the distance between cameras.

How often are you updating the software?
Constantly. By subscribing to Switcher Studio you can access the main mixing interface on one iOS device and connect up to three additional iOS devices as cameras. You also get every update and new feature as they become available, as well as access to our desktop screen sharing app and cloud services. If you are out in the field for a production and need some help you could literally grab anyone with an iPhone, get them to install the free seven-day trial version of the app and bring it in as a camera.

What’s the learning curve like?
When it comes to multi-camera productions, Switcher Studio is intuitive and easy to use. We also have an expansive knowledge base and dedicated support team to help if you have any questions getting set up. Most of our users start their first production the same day they get the app. It’s just crazy to think that we used to spend hours setting up for a three or four camera production and now we can be up and running in just a few minutes.

If you are going to stream live video there is a little more of a learning curve. Streaming live video is a bit of a monster by its very nature. There are so many elements outside of the production that can impact the quality and delivery of your video, whether it’s related to bandwidth, networking conditions or your streaming destination or endpoint. In addition you have to have an understanding of how to adjust your resolution and bit-rate depending on available upload speed.

Switcher Studio is open and works with most RTMP compatible platforms, including YouTube, Ustream, Bambuser, Concert Window and Wowza servers to name a few. We also have tools that help streamline the process of linking Switcher with these services through our online dashboard and an integrated speed test that can check your upload speed and auto assign values for many of these settings.
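To make that resolution-and-bitrate judgment concrete, here is a minimal sketch of the kind of rule of thumb broadcasters often use: budget only about half of your measured upload speed for video, leaving headroom for audio, protocol overhead and network fluctuation. The thresholds and the function name below are illustrative assumptions, not Switcher Studio’s actual auto-assign logic.

```python
# Hypothetical rule-of-thumb mapping from measured upload speed to
# streaming settings. The cutoffs are illustrative examples, not
# Switcher Studio's real values.

def suggest_stream_settings(upload_kbps):
    """Return a (resolution, video_bitrate_kbps) suggestion."""
    # Use roughly half the measured upload for video, reserving the
    # rest for audio, protocol overhead and bandwidth jitter.
    budget = upload_kbps // 2
    if budget >= 4500:
        return ("1080p", 4500)
    if budget >= 2500:
        return ("720p", 2500)
    if budget >= 1000:
        return ("480p", 1000)
    # Very constrained connection: drop resolution and clamp bitrate.
    return ("360p", max(400, budget))
```

For example, a 6 Mbps upload yields a 3,000 Kbps video budget, which lands in the 720p tier; the halving is the safety margin that keeps a live stream from stalling when real-world throughput dips below the speed-test number.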

How is the product being used?
We have had journalists broadcast live video using the 4G LTE connection from their phone. We even did a live video stream of a concert shortly after we first launched where the artist grabbed my phone and started shooting video of himself and the crowd from the stage. He handed my phone back without skipping a beat. We were able to stream that moment live — I don’t know if that’s ever been done before.


Director Mark Kudsi on ‘The Art of Patrón VR Experience’

Director Mark Kudsi’s work on The Art of Patrón Virtual Reality Experience, via production company White Label S/R Product and agency Firstborn, involved a blend of virtual reality technology, drone filming and photoreal CG designed to transport participants to Patrón’s hacienda so they can experience its handcrafted process as guided by the brand’s “Bee.”

The virtual reality experience is a centerpiece of Art of Patrón events being held around the country. An interactive, 360-degree web version can be viewed here.

When did you get involved with the project?
Firstborn approached me to collaborate on the project early in the process of production. We traveled to the Hacienda Patrón distillery in Jalisco, Mexico, and investigated elements we could bring to the screen that would dramatically and authentically translate the Patrón bespoke process. There’s no roadmap for this kind of work — not yet — so we conducted four weeks of R&D, testing and previs to develop the methodology to bring this world to life. Nothing had been done exactly like this before so the investigative phase was vital.

Mark Kudsi

What was particularly exciting for me was pushing into this new technology because it serves this story. It transports, entertains and provides a deeper understanding of the art and craft of Patrón. It’s a window to a world people wouldn’t otherwise experience.

What were some of the steps involved in the production?
One of the first things we needed to do was to further develop the story and plan how we would connect all the different processes of the production of tequila. Once we traveled to the hacienda, took a tour and tasted for ourselves, I was able to propose some ideas of how we would seamlessly connect the parts together.

We then boarded out all the shots, put together a comprehensive previs in a 3D 360-degree VR space, so we could meticulously plan out the shots and transitions. Simultaneously the team went to work on the camera system, developing rigs for the shots, which specifically included outfitting an aerial drone as well as a land drone.

There were also many technical and logistical hurdles to plan for, since the camera rig comprised seven cameras that filmed 360 degrees. To name a few, we needed to figure out how to light scenes, direct talent and view what the camera was recording, all without being seen in the frame.

Some of the tour was impossible to film — how was CGI used here?
Photoreal CGI was crafted and integrated by the teams at Firstborn and Legend 3D to create smooth transitions, which can otherwise be jarring in virtual space, and to take the viewers through areas the camera could not travel, like through a keyhole.

So Firstborn is an agency and they have in-house digital artists?
Yes, it is a unique situation — and a new experience for me — but one that worked seamlessly. The communication between the creative and technical teams was in realtime, which made it possible to problem-solve any and all of the issues that came up on the shoot and through post. Since we were pushing the existing technology, and going into uncharted areas of production, it was imperative that we all worked together.

How did Legend 3D become involved on the project? Have you worked with them before?
I had not worked with them in the past as a company, but I had worked with Jared Sandrew, the chief creative officer, during my post days. I have always wanted to reconnect with Jared and his team at Legend 3D, and for this one the stars lined up.

Coincidentally, before I was approached by Firstborn, Jared had talked to me about the work he was doing in the VR space and was excited about how his team at Legend was ushering in a new approach on how to tackle some of the challenges of shooting and posting for the 3D 360-degree virtual space. I made him and his team part of my pitch to the team at Firstborn, and it made sense for us to collaborate since there was no roadmap for what we were trying to achieve. Legend’s Matt Ackey and Justin Denton worked on the project’s stereo conversion.

Any advice you’d give to people who want to create an immersive experience like this one?
This area is so new… it is like the Wild West out there, so I would say plan as much as possible. There will always be challenges that pop up on any project, but for this type of experience, the problems are exponentially more difficult to solve for. The best thing to do is to R&D and, probably more important, surround yourself with as many talented, problem-solving, positive people — like I was fortunate enough to do — to help you achieve something special.

Your work often blends live-action directing and design — what are some other projects you’ve directed that people will recognize?
I began as a designer and was one of the original creatives at Motion Theory. I have had the good fortune to work with a wide variety of executional styles for different types of brands and artists. My goal is to find ways to communicate human stories so the approach is a driver of story rather than just spectacle — though that can be fun too. Some of my past projects include the Apple iMac with Retina display, Fiat’s North American launch and the launch of Nintendo’s Wii U gaming console. On the music video side, I designed and directed The Black Eyed Peas’ Boom Boom Pow and Katy Perry’s Roar.

How important is it for today’s creatives to be able to work in various mediums?
The traditional form of the way people consume media is changing, and so are the mediums. For this reason, it is important to stay on top of technology and how to create content that can reach your audience in an effective way. I think it is important to remember these are all tools, so the great thing is that the same principles apply to telling great stories and taking people on memorable experiences. The key is to understand how to use these tools to your advantage and help them elevate the story you are trying to tell.


Quick Chat: Scarab’s Robyn Haddow talks ‘The Flash’ and ‘Arrow’

Robyn Haddow is a motion graphics and playback artist with Vancouver-based Scarab Digital as well as an active freelancer. At Scarab, she is presently providing Fantasy User Interface (FUI) and animation designs for the series The Flash and Arrow for the CW as well as for Proof, a new medical drama debuting soon on TNT.

The Flash is known for his distinctive powers. What kinds of animations and graphics are you providing to create his unique look?
Yes, the Flash has very distinctive powers! The Flash’s tech comes from a team of scientists based out of the “Cortex,” which is his lair. One of the scientists is the character of Cisco Ramon, who builds a lot of the tech for Flash. Our job is to create all the tech that comes out of the Cortex. We developed the look of the Cortex early on for the pilot: a cyan and blue tone palette with hints of yellow and white for a nice complementary color accent. The screens are generally composed of many different schematics of buildings, machine diagnostics, calculations, wireframes of gadgets and all kinds of maps of various locations in Central City.

The Flash

Once you are given story points from the production team how much creative freedom do you have?
We have almost total creative freedom. We worked hard to establish the look of the Cortex (where most of our screens take place) for the pilot and in turn earned a lot of trust from the production designer.

Can you tell us about your design/animation workflow?
My workflow begins by reading the scenes in the script I am building screens for. After I have correctly understood all the story points, I jump into Illustrator to begin to block out my shots. I take screen grabs of 3D assets I have that I would like to incorporate and begin to rough out my composition with them. I start by blocking out the main elements on my screen and then concentrate on building out the different sections piece by piece.

Once I have a layout I am happy with, I bring it into Adobe After Effects and get my elements working there. I then jump into Maxon Cinema 4D to work on shading my 3D assets and create the look I am going for in the model. Once happy with the look, I block out my 3D animation, generally as a single asset that I will import into After Effects. I jump back into After Effects to block out my main animation and also focus on smaller detailed elements and animations.

I have a very tight production deadline — typically I am given only a day to work on a screen, but sometimes I am required to generate two or three. Using efficient tools is critical in helping me to maximize my time as best I can.

Arrow

What are the challenges of creating sophisticated VFX in the fast turnaround time of television?
Great question. Time is the biggest one! Typically I will have a day to complete a build for a scene, and as I mentioned this can include two or three builds. This is a personal challenge because I like to include a lot of detail in my work. I hit the design and animation in broad strokes, and with whatever time remains I go back and add in extra little bits. As I continue to build up my workflow I am constantly working to improve my time management, giving me more time for all the extra little bits I like to add.

Another challenge is handling the 3D assets that we receive from post. Oftentimes the models are very dense meshes, and it can be tricky to optimize them to work best for my purposes.

More from The Flash

Is there a scene — or show — that are you particularly proud of, and why?
Hmm… The season went by so fast for all the shows, so this is a tricky question! I suppose I can narrow it down to sets. I am really proud of the look of the Cortex in The Flash as I led the design on that as well as the look of the Palmer Tech set in Arrow. I am a big nerd for super hero suits so any screen that includes ATOM’s Exosuit from Arrow or the Flash’s suit is definitely up there with some of my fav builds.

Who are some of your personal sci-fi inspirations?
I am a huge fan of work done by Jayse Hansen. His work is so well thought out, detailed and polished! I also look up to his abilities in design and animation as well as the speed in which he can create. I was first introduced to the world of screen graphics by stumbling upon some of Mark Coleran’s work. And, of course, I am in love with everything put out there by Territory Studio.

Quick Chat: Stargate Studios president Darren Frankel

Stargate Studios, a visual effects and high-end production company, has 10 offices in seven countries where they work on television, features, commercials and special venue projects. Their credits are impressive and include work on Gracepoint, The Walking Dead, Grey’s Anatomy, Ray Donovan, House of Lies and 12 Monkeys.

Stargate has been around for 25 years, which in this business is a rarity. We checked in with president Darren Frankel to find out how they have survived, thrived and more.

You’ve hit the quarter of a century mark, which is impressive. How have you not only survived in a very difficult market but also thrived and expanded around the world? Any wisdom to share?
You have to constantly reinvent yourself and your process. The industry is constantly changing and producers, directors, studio executives, etc. are looking for partners to help them stay ahead of the curve. We are always looking at new ways of doing things and which tools exist to allow us to do the things that we couldn’t achieve before. To stay current, you always have to be ready to break the model and improve it. You also need to look at your client’s problems as your own so that you are thinking along with them rather than just being reactive.

Can you talk about your different locations and is different work done at each or does the work at all locations mirror the others?
We have 10 offices — Los Angeles, Atlanta, Vancouver, Toronto, Mexico City, London, Berlin, Cologne, Malta and Dubai. At the crux of each facility is local work. All the facilities are interconnected using a proprietary system known as VOS (Visual Operating System). This inter-connectivity allows artists to share work and for VFX supervisors, producers and coordinators to communicate with greater ease. The business has grown internationally, so Stargate’s network of facilities also allows us to put talent in place regardless of location and bring that talent to the projects that need it.


Before and After: An example of work Stargate did for the show Gracepoint.

Can you talk about some of the work you have done and are working on currently?
Currently we’re working on approximately 25 different projects around the world, a sampling of which includes The Walking Dead, Dig, Grey’s Anatomy, Ray Donovan, Damien, El Principe and a host of other projects that I wish I was at liberty to talk about!

Is there any one project that you are particularly proud of? Can you describe it?
Every project brings its own set of challenges and some of the work that I’m most proud of is work that nobody would ever know we even did because it’s invisible. At the end of the day, a company is really about its people, and I’m extremely proud of all of them.

Before and After: More work for Gracepoint.

What are your main tools?
Our main software tools, in addition to the aforementioned VOS, are After Effects, Maya, LightWave, Premiere, Mocha Pro, RealFlow, Photoshop, Golaem and a host of other plugins and tools. In addition we use all kinds of cameras and production tools because real is always best when feasible, so we often shoot our own elements as well.

You are also using Signiant’s Media Shuttle to work with all your locations seamlessly. Can you walk us through that workflow and describe how you were doing this before Media Shuttle?
The business has become global. Shows often shoot in one city, do editorial in another city, and desire to gain tax incentives from yet another city. The ability to move and share data across Stargate’s network has become of paramount importance. Using our internal VOS system, data can be moved through the network automatically using preference settings rather than manual human interaction. It will even place files in the same directory on the network in a different city, without relying on moving files to a shared folder, and then having to manually disperse those files to their proper locations once the transfer is complete. We have moved from using FTP along our private VPN network and externally to clients, to Signiant’s Media Shuttle.

The two major reasons we did this are:
1. File Transfer Speed: Media Shuttle optimizes the bandwidth of users on upload and download to make files move faster between locations, and helps to mitigate the need for shuttling drives from client editorial and post facilities.

2. Security: Password-protected FTP sites are only so secure, and the way Signiant packages files makes them less susceptible to being compromised in any way.

Internally it didn’t create more work on our end but provided a significant net gain.

Check out the Stargate website for reels and more credits.

Quick Chat: Humble’s ECD Sam Stephens

At the end of March, integrated content studio Humble acquired production house Paranoid US, and at the same time launched sister company Postal to officially handle visual effects and post production.

We thought now was the right time to check in with Humble’s executive creative director, Sam Stephens, to tell us how things are progressing.

Humble has always provided full production and post services. What led to the decision to launch Postal as its own banner?
Historically, we had posted about 75 percent of everything we shot at Humble in-house. As Humble began to work on bigger campaigns, the post asks got bigger as well. Full CG spots, high-end VFX and long-form editorial projects all meant that post was going beyond just servicing our directors.

The post team was creating its own content that was design, concept and/or character-based. We were proud of that work and felt that it needed its own stage and its own brand behind it. We are always going to provide post service to Humble’s productions; that hasn’t changed. What’s changed is that we are also doing more than that. With Postal, we are offering a full-on creative partner and a content creator, not just a provider of post services.

How will clients benefit from Postal being its own brand?
We are all still under the same roof, so the best parts about being a one-stop-shop still apply. Postal is heavily invested and involved from the initial pitch to the ship date. What’s nice for both clients and our directors is that we now have a creative studio model and this roster of double-threat talent who serve as directors, creative directors, designers, animators and VFX artists. We can build these custom-designed teams across both Humble and Postal to collaborate with agencies on whatever they need from concept through post. And our directors have all the support they need in-house regardless of the type or length of the project.

What tools are in use at Postal?
We do a bit of everything here. We like pen and paper. We like our still cameras, our X-Acto knives, our whiteboards and our brushes. When it’s time to plop down in front of a box we use the typical suite of stuff. Offline and online edits are in FCP and Avid Media Composer. We use Dragon for stop-motion. VFX is either via Flame or Nuke. 3D animation is Maya with rendering out of V-Ray. For 2D goodness we employ the usual collection of Adobe CC programs. We’re also testing out Arnold for 3D rendering these days, and really like what we are seeing so far.

Humble also recently announced an expansion into original feature length fare. Can you elaborate on this expansion, and on how Postal will play into upcoming original content projects?
Moving from festival shorts to features just felt like a natural progression for us. Directors from both Humble and Postal will be pursuing original projects, and for any project Postal will of course provide any design or post production work that’s needed. Postal just did the edit on our first documentary feature Son of the Congo, which had a really successful premiere at SXSW, and we are currently working on poster design and pitch decks for the upcoming narrative feature Humble is producing for roster director Marc Raymond Wilkins.

Son of the Congo

Between Humble and Postal we have 18 directors, each one of them overflowing with ideas for shows, features and one-offs. A big part of my job, along with our VP Persis Koch, is to identify which ones should be developed further and then make sure they happen. Over the past year this has become part of who we are as a company, not just something we do in our spare time.

Where do you see Humble and Postal a year from now?
I see us in a hot tub at Sundance, straight from our first premiere. I see Postal’s name getting some real traction, attracting even more A-list talent and A-list work. I see all of us showing up at work every day stoked to be here and proud of the work we’re churning out. And finally I see that attitude catching on to the point where we’ll need to buy a bigger hot tub.

Ingenuity Engine effects ‘The Last Man on Earth’

Los Angeles-based Ingenuity Engine was the primary visual effects vendor on the first season of the Fox hit The Last Man on Earth — the story of a man who believes he is the only survivor of a deadly virus that has wiped out the world’s population. In an act of desperation, Phil (series creator and star Will Forte) puts up billboards telling anyone who might care that he is “Alive in Tucson.” Amazingly he finds other survivors.

The series, scored by Mark Mothersbaugh (see our story here), has already been renewed for a second season.

Aside from the show’s virtual cul-de-sac exterior location (via CBS Digital) and a few small shots done by Mango LA, Ingenuity Engine was responsible for all of this season’s VFX on the show. The number of shots they provided per episode varied quite a bit, anywhere from 10 to more than 50. “Last Man takes place a few years after a global disaster kills most of the world’s population,” explains Ingenuity Engine’s VFX producer Oliver Taylor. “So any time the show’s characters went out into that world we wound up doing a large amount of work to assist making the world look uninhabited.”

Oliver Taylor

The work ranged from basic paint-outs, sky replacements and general clean-up all the way up to creating a CG space station orbiting Earth for the season finale. “The Last Man crew is very experienced and pretty VFX savvy, so they had a very good grasp of what was possible in post, what they could do on their own and what we needed to be deeply involved in,” says Taylor.

Two VFX sequences from the first season stick out in Taylor’s mind in terms of how involved the studio got with the shots. The first is a scene where Will Forte’s character climbs up onto the catwalk of a billboard, only to have the ladder fall. “They couldn’t put their actor on top of a real billboard because it would be unsafe, and putting a camera up that high would be slow and error-prone. So a billboard set was constructed with greenscreen around it, and a plate shoot was done to capture the backgrounds,” he describes. “Production brought us on a week or so before the shoot and we went over the storyboards in detail with their team. We also participated in the location scouts and helped them figure out the dimensions of the set they were building.”

The second VFX-heavy work is the reveal of the International Space Station (ISS) for the finale. Just as with the billboard sequence, Ingenuity Engine got involved from the very start. “We were looking at storyboards, recommending adjustments that were appropriate, walking through the set with the director and DP, and helping everyone come to a consensus as to how it was going to be photographed.”


The first shot of the sequence is a crane shot that rises from the surface of the Earth, through the clouds and up to the International Space Station. “We knew right away we couldn’t get away with a matte painting of the Earth, and that to some extent we had to do it for real,” explains Taylor. “Because the camera moves through the clouds we needed them to be volumetric. We also needed them to cast correct shadows on the Earth, interrupt reflections on the surface of the Earth’s terrain and water, etc. All this necessitated a detailed build of the Earth, its textures and displacement, and proper interaction with the clouds.”

Once the viewer gets inside the ISS they see a character that has been stranded for some time. Making the actor (Jason Sudeikis) look like he was in zero gravity involved placing the actor on a crane arm, which would be moved around as he pushed off the walls. Taylor explains that the scene and the removal of the zero-gravity rig from the actor on the ISS was a fun challenge. “The two complicating factors were that it was shot on a moving Steadicam and the stunts team used a crane rig, which entered the shot from just behind camera. To remove the crane, from a shot which drifted around, we had to build geo for the interior of the ISS to match the set, get an accurate match-move for the camera and re-project clean-plate textures back onto the geo. It’s a tricky process, one that requires a lot of tweaking, but it allows production to be very flexible with how they shoot the scene.”


Remember that all of this was done on a TV production schedule, meaning fast turnarounds. Taylor says timing is a big challenge when it comes to creating CG-heavy shots like this. “The typical episode only gets a week or so for VFX, but with the ISS sequence it was necessary to lock the edit five weeks in advance. Editorial was able to do that because we did animated previsualizations specifically for their edit. They were able to cut in different versions of our animation and retime them to get the pacing and shots they wanted. In parallel we developed the model, texture and rendering work. The Last Man post team was great to work with. Doing a sequence like this in such a short time requires very tight collaboration, which keeps everyone on the same page and makes the creatives on their end feel more involved in the process and more at ease with where we’re going.”

Tools
Ingenuity Engine is in the process of testing and switching to The Foundry’s Modo for a lot of its 3D work, so they used this project as an opportunity to “jump in the deep end,” says Taylor. “The interactivity of the render preview is a great advantage for us and for this particular situation. To stay true to reality we knew there was only one way we could light the exterior of the ISS — with the light from the sun and a soft bounce from the Earth. Being able to work on the lighting and shading and quickly see results makes the process much easier to navigate and, ultimately, allows us to do much better work.”

The studio calls on The Foundry’s Nuke for compositing, as well as The Foundry’s Hiero for all I/O, which Taylor refers to as “a great integrated pipeline that makes everything a little faster and easier.” Side Effects Houdini was used to create the clouds around the Earth in the ISS sequence. “The primary benefit for us was that we could place rough geo in the scene, so that the placement of the clouds made sense artistically, and use that geo to generate VDB volumes of the clouds. It’s a process we nailed down working on commercials and have used again and again doing cloud work.”

Ingenuity Engine called on another Foundry product, Mari, for texturing the exterior of the ISS. “Mari was very useful in this situation because it saved us a lot of time in model prep and skipping a lot of the grunt work in laying out UVs,” concludes Taylor.

If you haven’t seen The Last Man on Earth yet, check it out on Hulu and get your binge on.

DP John Seale on capturing ‘Mad Max: Fury Road’

This film vet goes digital for the first time with Alexa cameras and Codex recorders

Mad Max: Fury Road is the fourth movie in writer/director George Miller’s post-apocalyptic action franchise and a prequel to the first three. It is also the first digital film for Australian cinematographer John Seale ASC, ACS, whose career spans more than 30 years and includes such titles as The English Patient (for which he won an Oscar), The Mosquito Coast, Witness, Dead Poets Society and Rain Man.

Facing difficult conditions, intense action scenes and the need to accommodate a massive number of visual effects, Seale and his crew chose to shoot principal photography with Arri Alexa cameras and capture ArriRaw on Codex onboard recorders, a workflow that has become standard among filmmakers due to its ruggedness and easy integration with post.

Warner Bros.’ Fury Road, which takes place in a post-apocalyptic wasteland, was shot in Namibia. The coastal deserts of that African country are home to sand dunes measuring 1,000 feet high and 20 miles long. Frequent sandstorms and intense heat required special precautions by the camera crew.


“I’d shot plenty of film-negative films in deserts and jungles under severe conditions, but never digital,” notes Seale. “So I was a bit worried, but I had a fantastic crew of people who had done that… had worked with digital cameras in jungles, deserts, dry, heat, wet, moist, whatever. They were ready and put together full precaution kits of rain covers, dust covers and even heat covers to take the heat off the cameras in the middle of the day.

“We were using a lot of new gear,” Seale adds. “Everything that our crew did in pre-production in Sydney and took to Namibia worked very, very well for the entire time. Our time loss through equipment was minimal.”

Seale’s crew was outfitted with six Arri Alexas and a number of Canon 5Ds, with the latter used in part as crash-cams in action sequences. The Alexas were supported by 11 Codex on-board recorders. The relatively large number of cameras and recorders helped the camera crew to remain nimble. While one scene was being shot, the next was being prepped.

“We kept two kick cameras built the whole time and two ultra-high vehicles rigged the whole time,” explains camera coordinator Michelle Pizanis. “When we drove up (to a location) we could start shooting, rather than break down the camera at one site and rebuild it at the next.”

John Seale on location shooting Mad Max: Fury Road.

The original Mad Max is remembered for its gritty look. Fury Road took a different route due to the film’s heavy use of visual effects. “The DI and the post work is so explicit; almost every shot is going to be manipulated in some way,” Seale explains. “Our edict was ‘just shoot it.’ Continuity of light wasn’t really a question. We knew that the film would be cut very quickly, so there wouldn’t be time to analyze every shot. Intercutting between overcast and full sun wasn’t going to be a problem. On this film, the end result controlled the execution.”

In order to provide maximum image quality and flexibility for the post team, Seale and his crew chose to record ArriRaw with the Alexa cameras. That, the cinematographer notes, made Codex an obvious choice as only Codex recorders were capable of reliably capturing ArriRaw.

“The choice to go with Codex was definitely for the quality of the recording and post-production considerations,” says Seale. “Once again, we were a little worried about desert heat and desert cold. It changes so much from night to day. And during the day, we had dust storms, dust flying everywhere. We sometimes had moisture in the air. But the Codex systems didn’t fail us.”

Shooting digitally with Codex offered an advantage over shooting on film as it avoided the need to reload cameras with film negative in the blowing winds of the desert. “There is a certain amount of paraphernalia needed to shoot digitally,” Seale says. “But our crew was used to that. They built special boxes to put everything in. They had little fans. They had inlet and outlet areas to keep air circulation going. Those boxes were complete. Cables came out and went to the camera. If we were on the move, the boxes were bolted down so that they were out of the way and didn’t fall off. Sometimes we sat on them to get our shot.”


RF interfaces were used with the Alexa cameras to transmit images to a command vehicle for monitoring by director George Miller, who was not only able to review shots but could also edit material to determine what further coverage was needed. “For George, it was a godsend,” says Seale. “That refined the film shooting and made it a lot quicker than the normal procedures.”

It was that sort of flexibility that made shooting with Alexa and Codex so appealing, adds Seale. “I was a great advocate of digital 10 or 15 years ago when it started to come in. Film negative is a beautiful image recording process, but it’s 120 years old and you get scratches and dead flies caught in the reels. It’s pretty archaic.

“I think the way digital has caught on is extraordinary. Its R&D is vertical, where film development has stopped. The ability of digital to record images coupled with the DI, where you can change it, manipulate it, allows you to do anything you like. I know with Mad Max, it won’t look anything like a ‘good film image’ and it won’t look anything like a ‘good digital image’ — it will look like its own image. I think that’s the wonder of it.”

Director George Miller recently appeared at Comic-Con and seems to agree with Seale, “It was very familiar,” he said about returning to the Mad Max world. “A lot of time has passed. Technology has changed. It was an interesting thing to do. Crazy, but interesting.”

Quick Chat: Paul London from K Street Post in DC

Washington DC-based post boutique K Street Post, which opened its doors in April 2004, provides editing, audio post and finishing for spots, PSAs, station promos, corporate presentations and docs. Oh, and as their location might suggest, when the political season heats up, so do their post suites.

Recently we reached out to owner Paul London to find out more about K Street and how they work.

What was your goal when you opened K Street?
At the time we opened there were many small, nonlinear edit shops in Washington mainly using Avid Media Composer and Final Cut Pro. From the beginning, I wanted to set myself apart by taking advantage of my graphic design skills. I wanted to focus on high-end, graphically-intensive video and TV commercial projects. That’s one of the reasons I chose Quantel gear early on; their built-in editor, effects, text and paint tools were perfect for this type of work.

You are in the center of DC. How much of your work is political-based advertising?
K Street Post has always concentrated on television spot work. This includes local/regional commercials (Next Day Blinds, Silver Diner, Washington Times, Jiffy Lube), and some national TV ads for associations like the American Petroleum Institute and national PSAs for the USO (pictured below).


We also do a large amount of political advertising for governors, congressmen, senators and some presidential races and issue advertising for political action committees (PACs and Super PACs).

Can you name a recent political job?
We have been working on a number of political campaigns recently, including a four-minute video for Carson America that played at the Chicago rally during Ben Carson’s presidential announcement this Tuesday. We were making adjustments and shot changes right up to the last minute.

You are a boutique. What are the benefits of staying small?
K Street Post is small and that is by design. Right now we have a Quantel Pablo Rio edit room, a Final Cut Pro 7/Adobe Premiere edit room and an Avid Pro Tools audio room. During the very busy months of the political season — August and September of every even numbered year — we typically add an additional edit room, but the way things are shaping up for the 2016 race, we might end up adding more.

I love being small. I get to focus on what I like best, which is being creative and working with clients on TV projects. I have thought about growing the company and adding additional rooms, but then I would become more of a manager and have less time to be creative. Another great thing about being small is clients have direct access to us. The schedule book is easy to manage and clients can discuss projects and adjust booking directly with me.

K Street Post’s new Quantel Pablo Rio suite.

Can you talk about your workflow?
We handle all aspects of the post process, so offline editing with FCP or Premiere and online editing and color correction with the Pablo Rio. But to be honest, we have not done an offline for some time. The political spots we work on simply don’t have time for that type of workflow. You usually have 8 to 12 hours to create a finished ad, so you load, edit, grade and get a very polished version off to the client within the day. This is where the Pablo Rio comes in for us. We have found no other system that can do this with the quality level required for a statewide or national TV commercial.

Also, most of our commercial work is graphic heavy. It’s not uncommon to have 30 to 40 layers being used to create the finished ad. The Pablo Rio is fast enough for this type of client-attended work. It’s also great at accommodating changes, and there are lots of changes! It’s quite normal for a political ad to have several script changes even during editing.

K Street Post’s Pablo Rio compositing with up to 50 layers.

If you could share one tip with clients about getting the most out of the post experience, what would it be?
Good question. I ask my clients to get me involved as early as possible. The more I know about the project the more I become immersed in it and the better the final result. Clients should never ambush their editors with their projects.

Quick Chat: Lucky Post’s Scottie Richardson on ‘Reclaim the Kitchen’

Wolf Appliances and agency The Richards Group recently launched the “Reclaim the Kitchen” campaign meant to inspire families to prepare and eat meals together. At the center of this initiative is a film that shows audiences the joys of home cooking. Lucky Post’s Scottie Richardson, based in Dallas, created the sound design, music edit and the final mix for the three-minute, stop-motion piece directed by Brikk.

In Reclaim the Kitchen, the viewer’s perspective is that of sitting at a dinner table or making tasty dishes. There are statistics, meal suggestions and recipes. You can see the film at http://www.reclaimthekitchen.com, a site created to offer “tools to cook with confidence.”

We checked in with Richardson in order to dig a bit deeper into the sound and music.

When did you first meet with the agency?
I was put on hold by The Richards Group producer David Rucker, but didn’t truly know what the job would be. I only knew that I was working on a video for Wolf and I had two days to work on my own before the creative team would come in.  David and I had just worked on a huge Chrysler campaign so there was a strong trust factor going in.

Scottie Richardson

What direction did they give or were they open to ideas?
The concept behind the Wolf project is “reclaim the kitchen,” getting people together to share home-cooked meals. It’s not meant to badger viewers; it’s more like, “Wow, what have I been missing? I can do this!” Inspiring and whimsical. In terms of the sound, I was just told to “do what I do.” That’s a dream on any project — to have the time to immerse yourself in the narrative. I wanted the sound design to match the integrity of this initiative.

There are many layers of sounds — the home, cooking, technology. What were you trying to convey with the audio?
The creatives did an unbelievable job creating a sweeping yet simple message. Preparing a meal isn’t just about the food. Time, money and family are all important. These factors are influenced by myriad circumstances, but rather than ignore them, they’re addressed head-on. The sounds outside the kitchen are designed to resonate with viewers, to put them in these moments that influence their meal decisions. Phones are often seen as tools that distract us from our family time, but they can be used to help with family participation — Googling recipes, meal-planning apps, converting measurements — there are ways to use these tools mindfully and together. Fast food may be a modern solution to creating more time with your family, but if that time is allocated to a mission in the kitchen, your time is invested in camaraderie.

In some cases the sound was meant to add atmosphere, while in other cases it was to specifically key off of what was happening on screen. We used it to interplay with the voiceover script. There is a scene near the end with a tight close-up accompanied by the ticking of a stopwatch; as the camera pulls out, it’s revealed to be a food scale. That sound was meant to accent the voiceover talking about “time” rather than the image, but it provides a nice juxtaposition. Overall, the goal was to key off of the verbal cues and visuals with both sound design and music edits so they were additional characters in the narrative. In some sections, we chose an absence of sound to allow moments to breathe and stand out. This piece was designed to inspire people to recalibrate, be somewhat introspective and learn, but not feel intimidated, so creating moments to process was crucial.

Can you talk about creating the sounds?
I have a large sound effects library that I’ve built over the last 20 years. I start with the logical pre-recorded, time-tested sounds as a baseline. On this particular job I pulled from my stock library but also Foley’d lots of sounds. I like to be musical with sound design, so I am constantly making sure the sounds work with the music track in tone or pitch. Sometimes that’s using reverb and delay to match the music and its space, or pitching things up or down. Being a musician, I like to use musical effects like cymbals or shakers to accent things as well. These elements integrated well on this project because Breed’s music track was so lively and elegant.

What about the mix?
At the end of the day, what the voiceover is saying is important, like the vocals to a great song. I made sure all of that was clear, then I experimented with the music and sound design. I built a nice bed for the voice to lie in that would hopefully let the poignancy of the message resonate. Sometimes it’s best to just feel the sound and not actually be able to articulate what it is. You miss it if it is gone but you can’t actually say, “That was a scooter running over an umbrella.”

You wore a few different hats on this one. Can you talk about that?
Well, at first it was as a sound designer. I created a soundscape from the beginning to the end of the cut. Then I brought in the editor’s sound design and went through to see if anything clashed or to see what needed to be replaced or enhanced. When agency creative Dave Longfield came into the session, he had very specific things he wanted to try with the music, so we spent a half-day cutting up the music stems and trying out things to hit with picture. Breed’s music was amazing. It balanced the narrative with energy and the intelligence of the message. After that, we edited dialog, trying out various takes and pacing that felt right. This was followed by the mixing stage to bring it all together.

What tools do you call on?
This was all built and mixed in Avid Pro Tools. One tool I use often for sound design is Omnisphere by Spectrasonics. It allows me to make more musical sound effects and really transform them into something new.


Where do you find inspiration?
Honestly, all over. Art, movies, music. One of my favorite groups as a young kid was The Art of Noise. I just loved how they made music out of door slams and breaking glass. I love how layering many sounds together makes one solid sound. I enjoy seeing a good movie and hearing how they use a sound you wouldn’t think of for what you are seeing, or how the absence of sound speaks better than having one.

Finally, how did this project influence you personally?
I am truly the healthiest I have been in a long, long time. I have not had fast food in over three months and we have been cooking as a family every night for dinner. I’m avidly researching recipes and trying to one-up the next meal. This is a project that changed my lifestyle for the better. I didn’t see that coming.

Quick Chat: Encore Colorist Paul Westerbeck Talks ‘Gotham’

Fox’s Gotham gives viewers a look at the early days of some of your favorite, and not-so-favorite, Batman characters, mostly focusing on a young detective named Jim Gordon… way before he became the Commissioner Gordon most of us are familiar with.

Gotham tells some dark stories, and the look of the show matches that dark narrative. To talk more about the color of the show, which will shortly come back from hiatus, Encore colorist Paul Westerbeck was kind enough to take time out and answer some of our questions.

What unique challenges does Gotham present?
Starting with the pilot, we were required to do rolling conforms. We colored each version as if it… Continue reading

Quick Chat: Assimilate’s Lucas Wilson talks about Scratch Web

Recently, Assimilate launched Scratch Web, a cloud-based multi-user collaboration tool that offers individual clip, timeline or timeline plus version sharing (known as Scratch Construct) as well as native support for industry-standard RAW camera formats.

It’s already in use at Santa Monica’s Local Hero Post, where founder and supervising colorist Leandro Marini has made it a part of his everyday workflow. In fact, keep an eye on this space in the future for a Scratch Web review from Marini.

To find out more about the product itself, we picked the brain of Assimilate’s VP of business Continue reading

Quick Chat: The Hit House’s Sally House on new Lexus spots

LA-based The Hit House created and produced original music and sound design for the new Lexus NX campaign via Team One Advertising. The Corner Shop produced and Wilfrid Brimo directed. Jump Editorial’s Richard Cooperman provided the cut.

The What You Get Out of It spot features a man in a parking garage, opening a large shipping container. Suddenly people start appearing and entering the container with random items, such as a bike, luggage and a dog. The man then closes the doors and they fall away, revealing a white Lexus filled with all the people and their stuff. They drive away together.

The other commercial in this campaign, which promotes the Lexus NX Hybrid, F Sport and Turbo models, is called Moving. The Hit House (@HitHouseMusic) describes the music they created as industrial and contemporary.

Continue reading

Quick Chat: Cutters producer Heather Richardson

Cutters in Chicago recently hired producer Heather Richardson, who comes to the editing house after nine years at New York City’s Cosmo Street Editorial. She brings with her a ton of experience working with ad agencies and on commercials. The move also allowed Richardson to return to her hometown.

While at Cosmo, she produced commercials for the company’s editors and many leading directors. Some of her most recent projects were for AT&T, DirecTV, ESPN, Geico, GoDaddy, Land Rover, MasterCard, Mercedes, Samsung and Verizon.

Richardson began with Cosmo Street in Los Angeles before moving to New York in 2008, and prior to that, she was a producer at LA-based visual effects studio A52.

Continue reading

Mixtape Club discusses Android campaign for Google Creative Lab

Toward the end of 2014, New York-based animation/production house Mixtape Club partnered with Google Creative Lab to create a multi-platform campaign for the Android mobile OS. Mixtape Club created five 30-second spots and one 15-second spot for TV. Also included in the campaign was a multi-screen animation for 10 digital out-of-home installations on newsstands throughout Manhattan. To get a taste of the campaign, click here.

Mixtape Club got involved early in terms of concept development, as well as providing character animation. Their sister company Huma-Huma provided the sound and music, including the music that plays during the title card sequence at the end of the spots.

postPerspective reached out to Mixtape Club partners and creative directors Chris Lenox Smith Continue reading

Quick Chat: Rampant Design Tools’ Stefanie Mullen

Many of you are likely already familiar with Rampant Design Tools, well-known in the industry for its high-quality drag-and-drop visual effects offerings. You might even have some on your system right now.

What you might not know is that Rampant is a small shop owned by the husband-and-wife team of Sean and Stefanie Mullen, who are artists themselves. We thought it would be fun to get a look behind the curtain at this small company that creates tools a lot of pros rely on every day.

How did Rampant begin? What was the genesis of the company?
Rampant began five years ago, but the true beginning of the company goes back 20 years when my husband Sean began working in Hollywood. He was working with top producers and visual Continue reading

Quick Chat: DP John Pingry

For this Quick Chat, we reached out to DP John “Ping” Pingry, who recently signed with Santa Monica’s Colleen for exclusive commercial representation. Pingry, who has experience in films as well as spots, was attracted to the agency’s intimate feel and hit it off pretty quickly with founder Colleen Dolan Vinetz. “I wanted a manager who not only represents quality people, but who also really loves what they do,” he says. “Colleen hit all the marks. She was the ideal choice to help me further expand my blossoming DP career.”

Over the past three years, this Chicago native worked with commercial directors such as Tom Dey (Centrum), Chace Strickland (Dutch Boy, California Tourism), Morgan Lawley (American Girl Dolls), Sam Jones (McDonald’s), Lance Anderson (Salt Sunglasses) and Jim Krantz (Ralph Lauren Continue reading

Checking in with Modern VideoFilm’s new CEO Scott Avila

By Randi Altman

Recently there have been some management changes at Modern VideoFilm, the 30-year-old independently owned LA-based post house. CEO Moshe Barkat and CFO Hugh Miller have left the company and Scott Avila is now CEO. Avila most recently held a similar position with The Culver Studios, where he oversaw a major expansion and renovation.

Others joining the Modern executive team include president Cooper Crouse and CFO Roxanna Sassanian. Longtime Modern VideoFilm execs, such as president of digital media services Bill Watt, president of creative services Mark Smirnoff and EVP/sales Jon Johnson, have taken on added managerial and operational responsibilities. Modern employs nearly 500 people in its Burbank and Santa Monica locations.

Continue reading

Quick Chat: FilmLight co-founder Wolfgang Lempp

In what has become a semi-regular column here at postPerspective, we have taken to doing short Q&As with the people behind the products you use. The questions, submitted by pros, are meant to elicit responses that allow users to understand how a company goes about creating, updating and servicing gear for our industry.

This time we spoke to Wolfgang Lempp, who co-founded UK-based FilmLight with Peter Stothart and Steve Chapman in 2002. He and the other co-founders oversee the management of the company, including business and product development, product management and strategic development.

Lempp’s tech credentials are pretty impressive. He has a degree in theoretical physics from Munich University and has been working in the motion picture industry since 1983. Continue reading

Quick Chat: Butcher editor Jason Painter

By Randi Altman

Remember those “Terry Tate, Office Linebacker” spots for Reebok? One premiered during the 2003 Super Bowl and quickly became the talk of water coolers everywhere. Editor Jason Painter won a Cannes Lion, an AICE Award and a Clio for his work on that campaign. To borrow from another sport, some would call that a hat trick.

Painter also has two Emmy noms for his work on the MTV PSA Train. Not a bad resume for a guy who also counts spots for Apple, Chevy, Pepsi, Twix, Visa, Burger King and many more top-tier brands as part of his reel.

Continue reading