New York City-based East Coast Digital believes in VR and has set up its studio and staff to be able to handle virtual reality projects. In fact, they recently provided editorial, 3D animation, color correction and audio post on the 60-second VR short Cardboard City, co-winner of the Samsung Gear Indie VR Filmmaker Contest. The short premiered at the 2016 Sundance Film Festival.
Cardboard City, directed by Double Eye Productions’ Kiira Benzing, takes viewers inside the studio of Brooklyn-based stop-motion animator Danielle Ash, who has built a cardboard world inside her studio. There is a pickle vendor, a bakery and a neighborhood bar, all of which can be seen while riding a cardboard roller coaster.
East Coast Digital's Stina Hamlin was post producer on the project. We reached out to her to find out more about this project and how the VR workflow differs from the traditional production and post workflow.
How did this project come about?
The project came about organically after being introduced to director Kiira Benzing by narrative designer Eulani Labay. We were all looking to get our first VR project under our belts. To understand the post process, I thought it was vital to be part of a project from inception, through production and throughout post. I was seeking projects and people to team up with, and after I met Kiira this amazing team came together.
What direction did you get?
We were given a clear sense of the viewer experience the film should evoke and were asked to be responsible for the technical side of things on set and in editorial.
So you were on set?
Yes, we were definitely on set; that was an important piece of the puzzle. Being there let us consult on what we could do in color, and determine the file management and labeling of takes so the material would be easier to deal with back in the edit room. We were also able to do a couple of stitches at the beginning of the day to determine the best camera positioning.
How does your workflow differ from a traditional project to a VR project?
A VR project is different because we are syncing and managing seven-plus cameras at a time. The file management has to be very detailed, and the stitching process is tedious and uses new software that all editors are still getting up to speed with.
Monitoring the cameras on set is tricky, so being able to stitch on set to make sure the look is true to the vision was huge. That is something that doesn’t happen in the traditional workflow… the post team is definitely not on set.
Can you elaborate on some of the challenges of VR in general and those you encountered on this project?
The challenges are dealing with multiple cameras and cards, battery or power, and media for every shot from every camera. Syncing the cameras properly in the field and in post can be problematic, and the file management has to be uber-detailed. Then there's the stitching… there are different software options, and no one has mastered them yet. It is tedious work, and all of this has to get done before you can even edit the clips together in a sequence.
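To make the file-management point concrete, here is a minimal sketch of the kind of labeling pass a multi-camera rig demands. The scene/take/camera naming scheme and the helper function are hypothetical illustrations, not the scheme actually used on Cardboard City.

```python
# Sketch of a multi-camera media-labeling pass. The naming scheme
# (scene/take/camera) and label_clip() are hypothetical examples of
# the "uber-detailed" file management described above.
from pathlib import Path

def label_clip(scene: int, take: int, camera: int, src: Path) -> str:
    """Build a sortable clip name like 's01_t03_cam2.mp4' from a card dump."""
    return f"s{scene:02d}_t{take:02d}_cam{camera}{src.suffix.lower()}"

# Example: one take captured across a seven-camera GoPro rig.
card_dump = [Path(f"GOPR{n:04d}.MP4") for n in range(1, 8)]
labeled = [label_clip(scene=1, take=3, camera=i + 1, src=f)
           for i, f in enumerate(card_dump)]
print(labeled[0])  # s01_t03_cam1.mp4
```

With every camera's clip carrying the same scene and take in its name, the seven files that belong to one stitch sort next to each other, which is what makes the sync and stitch passes manageable back in the edit room.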
Our project also used stop-motion animation, so we had the artist featured in our film experimenting with us on how to pull that off. That was really fun and it turned out great! I heard someone say recently at the Real Screen conference that you have to unlearn everything that you have learned about making a film. It is a completely different way to tell a story in production and post.
What was your workflow like?
As I mentioned before, I thought that it was vital to be on set to help with media management and “shot looks” using only natural light and organically placed light in preparation for color. We were also able to stitch on set to get a sense of each set-up, which really helped the director and artist see their story and creatively do their job. We then had a better sense of managing the media and understanding how the takes were marked.
Once back in the edit room we used Adobe Premiere to clean up each take and sync each clip for each camera. We then brought only those clips into the stitching software — Kolor's Autopano — to stitch and clean up each scene. We rendered each scene out as a self-contained QuickTime for color. We colored in DaVinci Resolve and edited the scenes together using Premiere.
What about the audio?
We recorded nothing on location. All of the sound was designed in post using the mix from the animated short film Pickles for Nickels that was playing on the wall, in addition to the subway and roller coaster sound effects.
What tools were used on set?
We used GoPro Hero 4s with firmware 3.0, shooting log at 2.7K/30fps. iPads and iPhones were used to wirelessly monitor the rig, which was challenging. We used a laptop with Kolor's Autopano software to stitch on set — the same software we used in the edit bay.
We are collaborating once more with Kiira Benzing on the follow-up to Cardboard City. It's a full-fledged 360 VR short film. The sequel will be even more technically advanced and open up additional possibilities for user interaction.