Category Archives: Adobe Premiere

Jimmy Helm upped to editor at The Colonie

The Colonie, the Chicago-based editorial, visual effects and motion graphics shop, has promoted Jimmy Helm to editor. Helm has honed his craft over the past seven years, working with The Colonie’s senior editors on a wide range of projects. Most recently, he has been managing ongoing social media work with Facebook and conceptualizing and editing short format ads. Some clients he has collaborated with include Lyft, Dos Equis, Capital One, Heineken and Microsoft. He works on both Avid Media Composer and Adobe Premiere.

A filmmaking major at Columbia College Chicago, Helm applied for an internship at The Colonie in 2010. Six months later he was offered a full-time position as an assistant editor, working alongside veteran cutter Tom Pastorelle on commercials for McDonald’s, Kellogg’s, Quaker and Wrangler. During this time, Helm edited numerous projects on his own, including broadcast commercials for Centrum and Kay Jewelers.

“Tom is incredible to work with,” says Helm. “Not only is he a great editor but a great person. He shared his editorial methods and taught me the importance of bringing your instinctual creativity to the process. I feel fortunate to have had him as a mentor.”

In 2014, Helm was promoted to senior assistant editor and continued to hone his editing skills while taking on a leadership role.

“My passion for visual storytelling began when I was young,” says Helm. “Growing up in Memphis, I spent a great deal of time watching classic films by great directors. I realize now that I was doing more than watching — I was studying their techniques and, particularly, their editing styles. When you’re editing a scene, there’s something addictive about the rhythm you create and the drama you build. I love that I get to do it every day.”

Helm joins The Colonie’s editorial team, which includes Joe Clear, Keith Kristinat, Pastorelle and Brian Salazar, along with editors and partners Bob Ackerman and Brian Sepanik.


Adobe acquires Mettle’s SkyBox tools for 360/VR editing, VFX

Adobe has acquired all SkyBox technology from Mettle, a developer of 360-degree and virtual reality software. As more media and entertainment companies embrace 360/VR, there is a need for seamless, end-to-end workflows for this new and immersive medium.

The SkyBox toolset is designed exclusively for post production in Adobe Premiere Pro CC and Adobe After Effects CC and complements Adobe Creative Cloud’s existing 360/VR cinematic production technology. Adobe will integrate SkyBox plugin functionality natively into future releases of Premiere Pro and After Effects.

To further strengthen Adobe’s leadership in 360-degree and virtual reality, Mettle co-founder Chris Bobotis will join Adobe, bringing more than 25 years of production experience to his new role.

“We believe making virtual reality content should be as easy as possible for creators. The acquisition of Mettle SkyBox technology allows us to deliver a more highly integrated VR editing and effects experience to the film and video community,” says Steven Warner, VP of digital video and audio at Adobe. “Editing in 360/VR requires specialized technology, and as such, this is a critical area of investment for Adobe, and we’re thrilled Chris Bobotis has joined us to help lead the charge forward.”

“Our relationship started with Adobe in 2010 when we created FreeForm for After Effects, and has been evolving ever since. This is the next big step in our partnership,” says Bobotis, now director, professional video at Adobe. “I’ve always believed in developing software for artists, by artists, and I’m looking forward to bringing new technology and integration that will empower creators with the digital tools they need to bring their creative vision to life.”

Introduced in April 2015, SkyBox was the first plugin to leverage Mettle’s proprietary 3DNAE technology, and its success quickly led to additional development of 360/VR plugins for Premiere Pro and After Effects.

Today, Mettle’s plugins have been adopted by companies such as The New York Times, CNN, HBO, Google, YouTube, Discovery VR, DreamWorks TV, National Geographic, Washington Post, Apple and Facebook, as well as independent filmmakers and YouTubers.


Bluefish444 supports Adobe CC and 4K HDR with Epoch card

Bluefish444 Epoch video, audio and data I/O cards now support the advanced 4K high dynamic range (HDR) workflows offered in the latest versions of Adobe Creative Cloud.

Epoch SDI and HDMI solutions are suited for Adobe’s Premiere Pro CC, After Effects CC, Audition CC and other tools that are part of the Creative Cloud. With GPU-accelerated performance for emerging post workflows, including 4K HDR and video over IP, Adobe and Bluefish444 are providing a strong option for pros.

Bluefish444’s Adobe Mercury Transmit support for Adobe Creative Cloud brings improved performance in demanding workflows requiring realtime video I/O from UHD and 4K HDR sequences.

Bluefish444 Epoch video card support adds:
• HD/SD SDI input and output
• 4K/2K SDI input and output
• 12/10/8-bit SDI input and output
• 4K/2K/HD/SD HDMI preview
• Quad split 4K UHD SDI
• Two sample interleaved 4K UHD SDI
• 23, 24, 25, 29, 30fps video input and output
• 48, 50, 59, 60fps video input and output
• Dual-link 1.5Gbps SDI
• 3Gbps level A & B SDI
• Quad link 1.5Gbps and 3Gbps SDI
• AES digital audio
• Analog audio monitoring
• RS-422 machine control
• 12-bit video color space conversions

“Recent updates have enabled performance which was previously unachievable,” reports Tom Lithgow, product manager at Bluefish444. “Thanks to GPU acceleration, and [the] Adobe Mercury Transmit plug-in, Bluefish444 and Adobe users can be confident of smooth realtime video performance for UHD 4K 60fps and HDR content.”


WWE adds iPads, iPhones to production workflow

By Nick Mattingly

Creating TV-style productions is a big operation: lots of equipment, lots of people and lots of time. World Wrestling Entertainment (WWE) is an entertainment company and the largest professional wrestling organization in the world. Since its inception, it has amassed a global audience of over 36 million.

Each year, WWE televises over 100 events via its SmackDown, WWE Raw and Pay-Per-View events. That doesn’t include the hundreds of arena shows that the organization books in venues around the world.

“Putting this show on in one day is no small feat. Our shows typically begin load-in around 4:00am and everything must be up and ready for production by 2:00pm,” explained Nick Smith, WWE’s director of remote IT and broadcast engineering. “We travel everything from the lighting, PA, screens, backstage sets, television production facilities, generators and satellite transmission facilities, down to catering. Everyone [on our team] knows precisely what to do and how to get it done.”

Now the WWE is experimenting with a new format for the roughly 300 events it hosts that are currently not captured on video. The goal? To see if using Switcher Studio with a few iPhones and iPads can achieve TV-style results. A key part of testing has been defining a workflow using mobile devices while meeting WWE’s high standard of quality. One of the first requirements was moving beyond the four-camera setup. As a result, the Switcher Studio team produced a special version of Switcher that allows unlimited sources. The only limitation is network bandwidth.

Adding more cameras was an untested challenge. To help prevent bottlenecks over the local network, we lowered the resolution and bitrate on the preview video feeds. We also hardwired the primary iPad used for switching using Apple dongles. Using the “Director Mode” function in Switcher Studio, WWE then triggered a recording on all devices.
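For a sense of why preview bitrate matters, here is a rough back-of-the-envelope sketch. All of the numbers are hypothetical, chosen only to illustrate the math, and are not WWE’s or Switcher Studio’s actual settings.

```python
# Back-of-the-envelope check of how live preview feeds add up on a shared
# WiFi network. Every figure here is hypothetical and purely illustrative.

def total_preview_mbps(num_cameras: int, mbps_per_feed: float) -> float:
    """Aggregate bandwidth the switching iPad must receive for live previews."""
    return num_cameras * mbps_per_feed

wifi_budget_mbps = 100.0  # assumed usable throughput on a congested arena network
for mbps_per_feed in (8.0, 4.0, 2.0):
    total = total_preview_mbps(num_cameras=6, mbps_per_feed=mbps_per_feed)
    print(f"{mbps_per_feed:.0f} Mb/s per feed x 6 cameras = {total:.0f} Mb/s "
          f"({wifi_budget_mbps - total:.0f} Mb/s of headroom)")
```

Halving the preview bitrate directly doubles the headroom, which is why lowering preview resolution and bitrate (and hardwiring the main switching iPad) keeps the wireless network from becoming the bottleneck as cameras are added.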

For the first test using Switcher Studio, the WWE had a director and operator at the main iPad. The video from the iPad was output to an external TV monitor using Apple’s AirPlay. This workflow allowed the director to see a live video feed from all sources. They were also able to talk with the camera crew and “direct” the operator when to cut to each camera.

The WWE crew had three camera operators from their TV productions to run iPhones in and around the ring. To ensure the devices had enough power to make it through the four-hour-long event, iPhones were attached to batteries. Meanwhile, two camera operators captured wide shots of the ring. Another camera operator captured performer entrances and crowd reaction shots.

WWE set up a local WiFi network for the event to wirelessly sync the cameras. The operator made edits in realtime to generate a line cut. After the event, the line cut and an ISO from each angle were sent to the WWE post team in the United Kingdom.

Moving forward, we plan to make further improvements to the post workflow. This will be especially helpful for editors using tools like Adobe Premiere or Avid Media Composer.

If future tests prove successful, WWE could use this new mobile setup to provide more content to their fans, building new revenue streams along the way.


Nick Mattingly is the CEO/co-founder of Switcher Studio. He has over 10 years of experience in video streaming, online monetization and new technologies. 


A glimpse at what was new at NAB

By Lance Holte

I made the trek out to Las Vegas last week for the annual NAB show to take in the latest in post production technology, discuss new trends and products and get lost in a sea of exhibits. With over 1,700 exhibitors, it’s impossible to see everything (especially in the two days I was there), but here are a handful of notable things that caught my eye.

Blackmagic DaVinci Resolve Studio 14: While the “non-studio” version is still free, it’s hard to beat the $299 license for the full version of Resolve. As 4K and 3D media becomes increasingly prevalent, and with the release of their micro and mini panels, Resolve can be a very affordable solution for editors, mobile colorists and DITs.

The new editorial and audio tools are particularly appealing to someone like me, who is often more hands-on on the editorial side than the grading side of post. In that regard, the new tracking features look to provide extra ease of use for quick and simple grades. I also love that Blackmagic has gotten rid of the dongles, which removes the hassle of tracking them in a post environment where systems and rooms are swapped regularly. Oh, and there’s bin, clip and timeline locking for collaborative workflows, which easily pushes Resolve into the competition for an end-to-end post solution.

Adobe Premiere CC 2017 with After Effects and Audition: Adobe Premiere is typically my editorial application of choice, and the increased integration of AE and Audition promises to make an end-to-end Creative Cloud workflow even smoother. I’ve been hoping for a revamp of Premiere’s title tool for a while, and the Essential Graphics panel/new Title Tool appears to greatly expand and streamline Premiere’s motion graphics capabilities — especially valuable for someone like me, who does almost all of his graphics work in After Effects and Photoshop. The more integrated the various applications can be, the better; and Adobe has been pushing that aspect for some time now.

On the audio side, Premiere’s Essential Sound panel offers tools for volume matching, organization, cleanup and other effects without going directly into Audition (or exporting for Pro Tools, etc.), which will be really helpful, especially for smaller projects and offline mixes. And as a last note, the new Camera Shake Deblur effect in After Effects is fantastic.

Dell UltraSharp 4K HDR Monitor — There were a lot of great looking HDR monitors at the show, but I liked that this one fell in the middle of the pack in terms of price point ($2K), with solid specs (1000 nits, 97.7% of P3, and 76.9% of Rec. 2020) and a reasonable size (27 inches). Seems like a good editorial or VFX display solution, though the price might be pushing budgetary constraints for smaller post houses. I wish it was DCI 4K instead of UHD and a little more affordable, but that will hopefully come with time.

On that note, I really like HP’s DreamColor Z31x Studio Display. It’s not HDR, but it’s 99% of the P3 colorspace, and it’s DCI 4K — as well as 2K, by multiplying every pixel at 2K resolution into exactly 4 pixels — so there’s no odd-numbered scaling and sharpening required. Also, I like working with large monitors, especially at high resolutions. It offers automated (and schedulable) color calibration, though I’d love to see a non-automated display in the future if it could bring the price down. I could see the HP monitor as a great alternative to using more expensive HDR displays for the majority of workstations at many post houses.

As another side note, Flanders Scientific’s OLED 55-inch HDR display was among the most beautiful I’ve ever seen, but with numerous built-in interfaces and scaling capabilities, it’s likely to come at a higher price.

Canon 4K600STZ 4K HDR laser projector — This looks to be a great projection solution for small screening rooms or large editorial bays. It offers huge 4096×2400 resolution, is fairly small and compact, and apparently has very few restraints when it comes to projection angle, which would be nice for a theatrical edit bay (or a really nice home theater). The laser light source is also attractive because it will be low maintenance. At $63K, it’s at the more affordable end of 4K projector pricing.

Mettle 360 Degree/VR Depth plug-ins: I haven’t worked with a ton of 360-degree media, but I have dealt with the challenges of doing depth-related effects in a traditional single-camera space, so the fact that Mettle is doing depth-of-field effects, dolly effects and depth volumetric effects with 360-degree/VR content is pretty incredible. Plus, their plug-ins are designed to integrate with Premiere and After Effects, which is good news for an Adobe power user. I believe they’re still going to be in beta for a while, but I’m very curious to see how their plug-ins play out.

Finally, in terms of purely interesting tech, Sony’s Bravia 4K acoustic surface TVs are pretty wild. Their displays are OLED, so they look great, and the fact that the screen vibrates to create sound instead of having separate speakers or an attached speaker bar is awfully cool. Even at very close viewing, the screen doesn’t appear to move, though it can clearly be felt vibrating when touched. A vibrating acoustic surface raises some questions about mounting, so it may not be perfect for every environment, but interesting nonetheless.


Lance Holte is an LA-based post production supervisor and producer. He has spoken and taught at such events as NAB, SMPTE, SIGGRAPH and Createasphere. You can email him at lance@lanceholte.com.


Exceptional Minds: Autistic students learn VFX, work on major feature films

After graduation, these artists have been working on projects for Marvel, Disney, Fox and HBO.

By Randi Altman

With an estimated 1 in 68 children in the US being born with some sort of autism spectrum disorder, according to the Centers for Disease Control’s Autism and Developmental Disabilities Monitoring, I think it’s fair to say that most people have been touched in some way by a child on the spectrum.

As a parent of a teenager with autism, I can attest to the fact that one of our biggest worries, the thing that keeps us up at night, is the question of independence. Will he be able to make a living? Will there be an employer who can see beyond his deficits to his gifts and exploit those gifts in the best possible way?

Enter Exceptional Minds, a school in Los Angeles that teaches young adults with autism how to create visual effects and animation while working as part of a team. This program recognizes how bright these young people are and how focused they can be, surrounds them with the right teachers and behavioral therapists, puts the right tools in their hands and lets them fly.

The school, which also has a VFX and animation studio that employs its graduates, was started in 2011 by a group of parents who have children on the spectrum. “They were looking for work opportunities for their kids, and quickly discovered they couldn’t find any. So they decided to start Exceptional Minds and prepare them for careers in animation and visual effects,” explains Susan Zwerman, the studio executive producer at Exceptional Minds and a long-time VFX producer whose credits include Broken Arrow, Alien Resurrection, Men of Honor, Around the World in 80 Days and The Guardian.

Since the program began, these young people have had the opportunity to work on some very high-profile films and TV programs. Recent credits include Game of Thrones, The Fate of the Furious and Doctor Strange, which was nominated for an Oscar for visual effects this year.

We reached out to Zwerman to find out more about this school, its studio and how they help young people with autism find a path to independence.

The school came first and then the studio?
Yes. We started training them for visual effects and animation and then the conversation turned to, “What do they do when they graduate?” That led to the idea to start a visual effects studio. I came on board two years ago to organize and set it up. It’s located downstairs from the school.

How do you pick who is suitable for the program?
We can only take 10 students each year, and unfortunately, there is a waiting list because we are the only program of its kind anywhere. Our educators and teachers have a review process for assessing each student’s ability to work in this area. You know, not everybody can function working on a computer for six or eight hours. There are different levels of the spectrum, so the higher-functioning and medium-functioning students are more suited for this work, which takes a lot of focus.

Students are vetted by our teachers and behavioral specialists, who take into account the student’s ability, as well as their enthusiasm for visual effects and animation — it’s very intense, and they have to be motivated.

Susie Zwerman (in back row, red hair) with artists in the Exceptional Minds studio.

I know that kids on the spectrum aren’t necessarily social butterflies, so how do you teach them to work as a team?
Oh, that’s a really good question. We have what’s called our Work Readiness program. They practice interviewing, they practice working as a team, and they learn about appearance, attitude, organization and how to problem-solve in the workplace.

A lot of it is about working as a team and developing their social skills. That’s something we really stress in our behavioral curriculum.

Can you describe how the school works?
It’s a three-year program. In the first year, they learn about the principles of design and using programs like Adobe’s Flash and Photoshop. In Flash, they study 2D animation and in Photoshop they learn how to do backgrounds for their animation work.

During year two, they learn how to work in a production pipeline. They are given a project that the class works on together, and then they learn how to edit using Adobe Premiere Pro and composite in Adobe After Effects.

In the third year, they develop their skills in 3D via Autodesk Maya and compositing with The Foundry’s Nuke. So they learn the way we work in the studio and our pipeline, as well as prepare their portfolios for the workplace. At the end of three years, each student completes their training with a demo reel and a resume of their work.

Who helps with the reels and resumes?
Their teachers supervise that process and help them with editing and picking the best pieces for their reel. Having a reel is important for many reasons. While many students will work in our studio for a year after graduation, I was able to place some directly into the work environment because their talent was so good… and their reel was so good.

What is the transition like from school to studio?
They graduate in June and we transition many of them to the studio, where they learn about deadlines and get paid for their work. Here, many experience independence for the first time. We do a lot of 2D-type visual effects clean-up work. We give them shots to work on and test them for the first month to see how they are doing. That’s when we decide if they need more training.

The visual effects side of the studio deals with paint work, wire and rod removal and tracker or marker removals — simple composites — plus a lot of rotoscoping and some greenscreen keying. We also do end title credits for the major movies.

We just opened the animation side of the studio in 2016, so it’s still in the beginning stages, but we’re doing 2D animation. We are not a 3D studio… yet! The 2D work we’ve done includes music videos, websites, PowerPoint presentations and some stuff for the LA Zoo. We are gearing up for major projects.

How many work in the studio?
Right now, we have about 15 artists at workstations in our current studio. Some of them will be placed at outside studios, but figuring out how much we want to expand over the next five years is part of our strategic planning.

Thanks to your VFX background, you have many existing relationships with the major studios. Can you talk about how that has benefitted Exceptional Minds?
We have had so much support from the studios; they really want to help us get work for the artists. We started out with Fox, then Disney and then HBO for television. Marvel Studios is one of our biggest fans. Marvel’s Victoria Alonso is a big supporter, so much so that we gave her our Ed Asner Award last June.

Once we started to do tracker marker removal and end title credits for Marvel, it opened doors. People say, “Well, if you work for Marvel, you could work for us.” She has been instrumental in our success.

What were the Fox and Marvel projects?
Our very first client was Fox and we did tracker removals for Dawn of the Planet of the Apes — that was about three years ago. Marvel happened about two years ago and our first job for them was on Avengers: Age of Ultron.

What are some of the other projects Exceptional Minds has worked on?
We worked on Doctor Strange, providing tracker marker removals and end credits. We worked on Ant-Man, Captain America: Civil War, Pete’s Dragon, Alvin & the Chipmunks: The Road Chip and X-Men: Apocalypse.

Thanks to HBO’s Holly Schiffer we did a lot of Game of Thrones work. She has also been a huge supporter of ours.

It’s remarkable how far you guys have come in a short amount of time. Can you talk about how you ended up at Exceptional Minds?
I used to be a DGA production manager/location manager and then segued into visual effects as a freelance VFX producer for all the major studios. About three years ago, my best friend Yudi Bennett, who is one of the founders of Exceptional Minds, convinced me to leave my career and come here to help set up the studio. I was also tasked with producing, scheduling and budgeting the work that comes into the studio. For me, personally, this has been a spiritual journey. I have had such a good career in the industry, and this is my way of giving back.

So some of these kids move on to other places?
After they have worked in the studio for about a year, or sometimes longer, I look to have them placed at an outside studio. Some of them will stay here at our studio because they may not have the social skills to work on the outside.

Five graduates have been placed so far and they are working full time at various production studios and visual effects facilities in Los Angeles. We have also had graduates in internships at Cartoon Network and Nickelodeon.

One student is at Marvel, and others are at Stargate Studios, Mr. Wolf and New Edit. To be able to place our artists on the outside is our ultimate goal. We love to place them because it’s sort of life changing. For example, one of the first students we placed, Kevin, is at Stargate. He moved out of his parents’ apartment, he is traveling by himself to and from the studio, he is getting raises and he is moving up as a rotoscope artist.

What is the tuition like?
Students pay about 50 percent and we fundraise the other 50 percent. We also have scholarships for those that can’t afford it. We have to raise a lot of money to support the efforts of the school and studio.

Do companies donate gear?
When we first started, Adobe donated software. That’s how we were able to fund the school before the studio was up and running. Now we’re on an educational plan with them where we pay the minimum. Autodesk and The Foundry also give us discounts or try to donate licenses to us. In terms of hardware, we have been working with Melrose Mac, who is giving us discounts on computers for the school and studio.


Check out the Exceptional Minds website for more info.


Comprimato plug-in manages Ultra HD, VR files within Premiere

Comprimato, makers of GPU-accelerated storage compression and video transcoding solutions, has launched Comprimato UltraPix. This video plug-in offers proxy-free, auto-setup workflows for Ultra HD, VR and more on hardware running Adobe Premiere Pro CC.

The challenge for post facilities finishing in 4K or 8K Ultra HD, or working on immersive 360 VR projects, is managing the massive amount of data. The files are large, requiring a lot of expensive storage; they can be slow and cumbersome to load, and achieving realtime editing performance is difficult.

Comprimato UltraPix addresses this by building on JPEG2000, a compression format that offers high image quality (including a mathematically lossless mode) and generates smaller versions of each frame as an inherent part of the compression process. Comprimato UltraPix delivers each frame at a size that the user’s hardware can accommodate.
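The resolution scalability comes from JPEG2000’s wavelet transform: each decomposition level’s approximation band is already a half-resolution version of the frame, so a decoder can stop early rather than reconstruct full resolution. Below is a minimal sketch of that principle using NumPy and PyWavelets; it illustrates the wavelet pyramid only and is not Comprimato’s actual JPEG2000 implementation.

```python
# Illustration of the wavelet-pyramid property that proxy-free JPEG2000
# workflows rely on: every decomposition level yields a half-resolution
# approximation of the frame. PyWavelets is used here for demonstration
# only; this is not Comprimato's codec.
import numpy as np
import pywt

frame = np.random.rand(2160, 4096)  # stand-in for one 4K luma plane

approx = frame
for level in range(1, 4):
    approx, _details = pywt.dwt2(approx, "haar")
    rows, cols = approx.shape
    print(f"level {level}: approximation is {cols}x{rows}")
# level 1: 2048x1080, level 2: 1024x540, level 3: 512x270
```

A decoder that stops after N levels reconstructs the frame at 1/2^N resolution without touching the remaining detail subbands, which is why no separate proxy files have to be rendered.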

Once Comprimato UltraPix is loaded on any hardware, it configures itself with auto-setup, requiring no specialist knowledge from the editor who continues to work in Premiere Pro CC exactly as normal. Any workflow can be boosted by Comprimato UltraPix, and the larger the files the greater the benefit.

Comprimato UltraPix is multi-platform video processing software for instant resolution switching in realtime. It is a lightweight, downloadable video plug-in for OS X, Windows and Linux systems. Editors can switch between 4K, 8K, full HD, HD or lower resolutions without proxy-file rendering or transcoding.

“JPEG2000 is an open standard, recognized universally, and post production professionals will already be familiar with it as it is the image standard in DCP digital cinema files,” says Comprimato founder/CEO Jiří Matela. “What we have achieved is a unique implementation of JPEG2000 encoding and decoding in software, using the power of the CPU or GPU, which means we can embed it in realtime editing tools like Adobe Premiere Pro CC. It solves a real issue, simply and effectively.”

“Editors and post professionals need tools that integrate ‘under the hood’ so they can focus on content creation and not technology,” says Sue Skidmore, partner relations for Adobe. “Comprimato adds a great option for Adobe Premiere Pro users who need to work with high-resolution video files, including 360 VR material.”

Comprimato UltraPix plug-ins are currently available for Adobe Premiere Pro CC and Foundry Nuke and will be available on other post and VFX tools soon. You can download a free 30-day trial or buy Comprimato UltraPix for $99 a year.


Frame.io 2.0 offers 100 new features, improvements for collaboration

Frame.io, developers of the video review and collaboration platform for content creators, has unveiled Frame.io 2.0, an upgrade offering over 100 new features and improvements. This new version features new client Review Pages, which expands content review and sharing. In addition, the new release offers deeper workflow integration with Final Cut Pro X and Avid Media Composer, plus a completely re-engineered player.

“Frame.io 2 is based on everything we’ve learned from our customers over the past two years and includes our most-requested features,” says Emery Wells, CEO of Frame.io.

Just as internal teams can collaborate using Frame.io’s comprehensive annotation and feedback tools, clients can now provide detailed feedback on projects with Review Pages, which is designed to make the sharing experience simple, with no log-in required.

Review Pages give clients the same commenting ability as collaborators, without exposing them to the full Frame.io interface. Settings are highly configurable to meet specific customer needs, including workflow controls (approvals), security (password protection, setting expiration date) and communication (including a personalized message for the client).

The Review Pages workflow simplifies the exchange of ideas, consolidating feedback in a succinct manner. For those using Adobe Premiere or After Effects, that feedback flows directly into the timeline, where you can immediately take action and upload a new version. Client Review Pages are also now available in the Frame.io iOS app, allowing collaboration via iPhones and iPads.

Exporting and importing comments and annotations into Final Cut Pro X and Media Composer has gotten easier with the upgraded, free desktop companion app, which allows users to open downloaded comment files and bring them into the editor as markers. There is now no need to toggle between Frame.io and the NLE.

Users can also now copy and paste comments from one version to another. The information is exportable in a variety of formats, whether that’s a PDF containing a thumbnail, timecode, comment, annotation and completion status that can be shared and reviewed with the team, or a .csv or .xml file containing tons of additional data for further processing.

Also new to Frame.io 2.0 is a SMPTE-compliant source timecode display that works with both non-drop and drop-frame timecode. Users can now download proxies straight from Frame.io.
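Frame.io has not published how its timecode display is implemented, but as a reference for what drop-frame support involves, here is a minimal sketch of the standard conversion from a frame count to 29.97fps drop-frame timecode (a hypothetical helper, not Frame.io code).

```python
def frames_to_dropframe_tc(frame: int, nominal_fps: int = 30) -> str:
    """Convert a 29.97fps frame count to SMPTE drop-frame timecode (HH:MM:SS;FF).

    Drop-frame numbering skips the labels ;00 and ;01 at the start of every
    minute except every tenth minute, keeping the displayed clock in step
    with real time at 29.97fps.
    """
    drop = 2                                       # labels skipped per drop minute
    frames_per_min = nominal_fps * 60 - drop       # 1798
    frames_per_10min = frames_per_min * 10 + drop  # 17982 (tenth minute keeps all labels)

    tens, rem = divmod(frame, frames_per_10min)
    if rem > drop:
        frame += drop * 9 * tens + drop * ((rem - drop) // frames_per_min)
    else:
        frame += drop * 9 * tens

    ff = frame % nominal_fps
    ss = (frame // nominal_fps) % 60
    mm = (frame // (nominal_fps * 60)) % 60
    hh = frame // (nominal_fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"


print(frames_to_dropframe_tc(1800))   # 00:01:00;02 -- labels ;00 and ;01 are skipped
print(frames_to_dropframe_tc(17982))  # 00:10:00;00 -- the tenth minute is not dropped
```

Non-drop timecode is a straight division by the frame rate; the bookkeeping above is only needed to keep 29.97fps displays from drifting by roughly 3.6 seconds per hour.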

The Frame.io 2.0 player page now offers better navigation, efficiency and accountability. New “comment heads” let artists see who left a comment and where, so they can quickly find and prioritize feedback on any given project. Users can also preview the next comment, saving them time when one comment affects another.

The new looping feature, targeting motion and VFX artists, lets users watch the same short clip on loop. You can even select a range within a clip to really dive in deep. Frame.io 2.0’s asset slider makes it easy to navigate between assets from the player page.

The new Frame.io 2.0 dashboard has been redesigned for speed and simplicity. Users can manage collaborators for any given project from the new collaborator panel, where adding an entire team to a project takes one click. A simple search in the project search bar makes it easy to bring up a project. The breadcrumb navigation bar tracks every move deeper into a sub-sub-subfolder, helping artists stay oriented when they get lost in their work. The new list view option with mini-scrub gives users a bird’s-eye view of everything happening in Frame.io 2.0.

Copying and moving assets between projects takes up no additional storage, even when users make thousands of copies of a clip or project. Frame.io 2.0 also now offers the ability to publish direct to Vimeo, with full control over publishing options, so pros can create the description and set privacy permissions, right then and there.
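Frame.io has not described how this works internally; a reasonable assumption is that copies are references to a single stored asset rather than duplicated bytes. The sketch below illustrates that general idea only and is not Frame.io’s design.

```python
# Sketch of the general idea behind zero-cost copies: projects hold references
# to one underlying asset instead of duplicating its bytes. This is an assumed
# pattern for such a feature, not Frame.io's actual architecture.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    size_bytes: int

@dataclass
class Project:
    name: str
    assets: list = field(default_factory=list)  # references, not byte copies

master = Asset("hero_cut_v3.mov", 12_000_000_000)
projects = [Project(f"review_copy_{i}") for i in range(1000)]
for p in projects:
    p.assets.append(master)  # a thousand "copies", still one stored asset

unique_assets = {id(a) for p in projects for a in p.assets}
print(f"{len(unique_assets)} asset actually stored for {len(projects)} projects")
```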


Aardman creates short film, struts its stuff

By Randi Altman

All creative studios strive for creative ways to show off their talent and offerings, and London-based Aardman is no exception. Famous for its stop-motion animation work (remember the Wallace and Gromit films?), this studio now provides so much more, including live-action, CG, 2D animation and character creation.

Danny Capozzi

To help hammer home all of its offerings, and in hopes of breaking that stop-motion stereotype, Aardman has created a satirical short film called Visualize This, which depicts a conference call between a production company and an advertising agency. The format gives the studio the ability to show off the range of solutions it can provide for clients: each time the fictional client suggests something, that visual pops up on screen, whether it’s adding graffiti to a snail’s shell, creating textured type or making a giant monster out of CG cardboard boxes.

We reached out to Aardman’s Danny Capozzi, who directed the short, to find out more about this project and the studio in general.

How did the idea for this short come about?
I felt that the idea of making a film based on a conference call was something that would resonate with a lot of people in any creative industry. The continuous spitballing of ideas and suggestions would make a great platform to demonstrate a lot of different styles that Aardman and I can produce. Aardman is well known for its high level of stop-motion/Claymation work, but we do CGI, live action and 2D just as well. We also create brand-new ways of animating by combining styles and techniques.

Why was now the right time to do this?
I think we are living in a time of uncertainty, and this film really expresses that. We do a lot of procrastinating. We have the luxury to change our minds, our tastes and our styles every two minutes. With so much choice of everything at our fingertips we can no longer make quick decisions and stick to them. There’s always that sense of “I love this… it’s perfect, but what if there’s something better?” I think Visualize This sums it up.

You guys work with agencies and directly with brands — how would you break that up percentage wise?
The large majority of our advertising work still comes through agencies, although we are increasingly doing one-off projects for clients who seek us out for our storytelling and characters. It’s hard to give a percentage on it because the one-offs vary so much in size that they can skew the numbers and give the wrong impression. More often than not, they aren’t advertising projects either and tend to fall into the realm of short films for organizations, which can be either charities, museums or visitor attractions, or even mass participation arts projects and events.

Can you talk about making the short? Your workflow?
When I first pitched the idea to our executive producer Heather Wright, she immediately loved the idea. After a bit of tweaking on the script and the pace of the dialogue we soon went into production. The film was achieved during some down time from commercial productions and took about 14 weeks on and off over several months.

What tools did you call on?
We used a large variety of techniques: CGI, stop-motion, 2D, live action, timelapse photography and greenscreen. Compositing and CG were done in Maya, Houdini and Nuke. We used HDRI (high dynamic range images). We also used Adobe’s After Effects, Premiere, Photoshop and Illustrator, along with clay sculpting, model making and blood, sweat and, of course, some tears.

What was the most complicated shot?
The glossy black oil shot. This could have been done in CGI with a very good team of modelers and lighters and compositors, but I wanted to achieve this in-camera.

Firstly, I secretly stole some of my son Vinny’s toys away to Aardman’s model-making workshop and spray-painted them black. Sorry Vinny! I hot-glued the black toys onto a black board (huge mistake!); you’ll see why later. Then I cleared Asda out of cheap cooking oil — 72 litres of the greasy stuff. I mixed it with black oil paint and poured it into a casket.

We then rigged the board of toys to a motion control rig. This would act as the winch to raise the toys out of the black oily soup. Another motion control was rigged to do the panning shot with the camera attached to it. This way we get a nice up and across motion in-camera.

We lowered the board of toys into the black soup and the cables that held it up sagged and released the board of toys. Noooooo! I watched them sink. Then to add insult to injury, the hot glue gave way and the toys floated up. How do you glue something to an oily surface?? You don’t! You use screws. After much tinkering it was ready to be submerged again. After a couple of passes, it worked. I just love the way the natural glossy highlights move over the objects. All well worth doing in-camera for real, and so much more rewarding.

What sort of response has it received?
I’m delighted. It has really travelled since we launched a couple of weeks ago, and it’s fantastic to keep seeing it pop up in my news feed on various social media sites! I think we are on over 20,000 YouTube views and 40,000 odd views on Facebook.

Editor Eddie Ringer joins NYC’s Wax

Wax, an editorial house based in NYC, has added film and commercial editor Eddie Ringer. Ringer comes to Wax from Wildchild + Bonch in New York. Prior to that, he spent over eight years at Sausalito-based agency Butler Shine Stern + Partners (BSSP), where he edited and directed advertising projects spanning broadcast commercials, viral campaigns and branded content.

Ringer says he calls on his agency background for his editing work. “Working on the agency side I saw firsthand the tremendous amount of thought and hard work that goes into creating a campaign. I take this into consideration on every project. It focuses me. The baton has been passed, and it’s my responsibility to make sure the collective vision is carried through to the end.”

In addition to his agency experience, Ringer enjoys the way sound design can dictate the flow of the edit and stresses the importance of balancing the creative part with the commerce side of things and understanding why it works. “At the end of the day,” he notes, “we’re trying to connect with an audience to sell a product and a brand.”

Ringer’s first job with Wax was a new spot for ITV London promoting the horse-racing channel. It features momentum edits, hard cuts, energy and, of course, lots of sound design.

His tool of choice is Adobe Premiere Pro. “I made the switch to Premiere about four years ago and never looked back. I find the functionality more intuitive than other NLEs I’ve used in the past,” he says.