By Barry Goch
Steve Cronan has been on the cutting edge of digital production since his work on The Matrix sequels in 2001. Based on that experience, he saw an opportunity to build a solutions platform for film production. His company, 5th Kind, has become the backbone of Marvel Studios' asset management system, and its product, 5th Kind Core, has expanded beyond the media and entertainment space into a wide variety of industries.
PostPerspective recently spoke to Steve about how artificial intelligence is used in his product and how it will more widely impact post production.
How did 5th Kind come about?
The idea for 5th Kind started in 2001 when I was the IT manager on the Matrix sequels in Sydney. I was able to analyze what files and data each department managed and how that information was used and flowed around the productions, video games and The Animatrix.
This was really the beginning of the creation of our PAM (production asset management), which was first used on Superman Returns in 2005. In 2012, we signed our first big studio deal with Marvel. They essentially built the studio around the product and, in collaboration, helped us extend the platform to manage a huge number of workflows going out to marketing, licensees, vendors, etc. It was the framework for what we now call a SAM (studio asset management).
The focus is to be the backbone for all digital files and metadata as they propagate across all layers — from a creator to a department to a production to a studio and beyond. In 2015, we began rewriting the product, and we completed it at the beginning of 2018. It’s now called 5th Kind Core.
The primary objectives of Core were to build a framework with the ultimate in security, unlimited scalability, high performance and extensibility, with an easy-to-use interface — all while supporting a huge list of features. There was a big focus on creating the best dailies experience possible, but since it’s agnostic to file type or size, it can be used across an array of workflows, such as the management of scripts, storyboards, concept art, set drawings, location photos, production documents and 3D models. It also supports workflows like bidding, review, approval, distribution and archiving.
We achieved that this year, and as our first big win, we signed a multi-year deal with Universal to provide dailies for all of its feature films.
How are you using AI now?
The two main areas are facial and object detection and speech-to-text. Metadata is a huge part of our overall framework; it allows you to control file access, user access, search, edit capabilities, notification triggers, processing triggers, tiered storage, etc.
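To make the idea concrete, here is a minimal sketch of how AI-derived metadata — face/object tags and a speech-to-text transcript attached to each asset record — can drive search. All names, fields and values here are hypothetical illustrations, not 5th Kind Core's actual data model or API.

```python
# Hypothetical asset records: each carries AI-generated tags (from face/object
# detection) and a transcript (from speech-to-text) alongside the file metadata.
assets = [
    {"id": "A001", "type": "dailies_clip",
     "ai_tags": ["actor:jane_doe", "object:car"],
     "transcript": "we pick up the chase on the bridge"},
    {"id": "A002", "type": "production_still",
     "ai_tags": ["actor:john_roe"],
     "transcript": ""},
]

def search(assets, term):
    """Return IDs of assets whose AI tags or transcript contain the term."""
    term = term.lower()
    return [a["id"] for a in assets
            if any(term in tag for tag in a["ai_tags"])
            or term in a["transcript"]]

print(search(assets, "bridge"))  # -> ['A001']
```

The point of the sketch is that once machine-generated metadata is attached to an asset, the same index can serve search, access rules and processing triggers without anyone doing the data entry by hand.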
What benefits are your clients getting from AI on your platform?
The key workflows it currently helps with are things like reducing data entry, increasing search accuracy and capabilities, accelerating production-still approvals, subtitling and localization, and legal and compliance.
How do you see AI and machine learning changing production and post?
The creative side of AI is growing much faster than I anticipated. Everything from color correction to mob simulations seems to be exploring ways AI can help. From the perspective of our application, it will continue to allow us to save people time and money by leveraging machine learning for some of the menial data management tasks.
Barry Goch is a finishing artist at The Foundation, a boutique post facility in the heart of Burbank’s Media District. He is also an instructor in post production at UCLA Extension. You can follow him on Twitter @gochya.