
DevinSuperTramp: The making of a YouTube filmmaker

Devin Graham, aka DevinSuperTramp, made the unlikely journey from BYU dropout to viral YouTube sensation with more than five million followers. After leaving school, Graham went to Hawaii to work on a documentary. The project soon ran out of money and he was stuck on the island… feeling very much a dropout and a failure. He started making fun videos with his friends to pass the time, and DevinSuperTramp was born. Now he travels, filming his view of the world and taking on daring adventures, risking life and limb to get his next shot.

Shooting while snowboarding behind a trackhoe with a bunch of friends for a new video.

We recently had the chance to sit down with Graham to hear firsthand what lessons he’s learned along his journey, and how he’s developed into the filmmaker he is today.

Why extreme adventure content?
I grew up in the outdoors — always hiking and camping with my dad, and snowboarding. I’ve always been intrigued by pushing human limits. One thing I love about the extreme thing is that everyone we work with is the best at what they do. Like, we had the world’s best scooter riders. I love working with people who devote their entire lives to this one skillset. You get to see that passion come through. To me, it’s super inspiring to show off their talents to the world.

How did you get DevinSuperTramp off the ground? Pun intended.
I’ve made movies ever since I can remember. I was a little kid shooting Legos and stop-motion with my siblings. In high school, I took photography classes, and after I saw the movie Jurassic Park, I was like, “I want to make movies for a living. I want to do the next Jurassic Park.” So, I went to film school. Actually, I got rejected from the film program the first time I applied, which made me volunteer for every film thing going on at the college — craft service, carrying lights, whatever I could do. One day, my roommate was like, “YouTube is going to be the next big thing for videos. You should get on that.”

And you did.
Well, I started making videos just kind of for fun, not expecting anything to happen. But it blew up. Eight years later, it’s become the YouTube channel we have now, with five million subscribers. And we get to travel around the world creating content that we love creating.

Working on a promo video for Recoil – all the effects were done practically.

And you got to bring it full circle when you worked with Universal on promoting Fallen Kingdom.
I did! That was so fun and exciting. But yeah, I was always making content. I didn’t wait ‘til after I graduated. I was constantly looking for opportunities and networking with people from the film program. I think that was a big part of succeeding at that time: just looking for every opportunity and milking it for everything I could.

In the early days, how did you promote your work?
I was creating all my stuff on YouTube, which, at that time, had hardly any solid, quality content. There was a lot of content, but it was mostly shot on whatever smartphone people had, or it was just people blogging. There wasn’t really anything cinematic, so right away our stuff stood out. One of the first videos I ever posted ended up getting like a million views right away, and people all around the world started contacting me, saying, “Hey, Devin, I’d love for you to shoot a commercial for us.” I had these big opportunities right from the start, just by creating content with my friends and putting it out on YouTube.

Where did you get the money for equipment?
In the beginning, I didn’t even own a camera. I just borrowed some from friends. We didn’t have any fancy stuff. I was using a Canon 5D Mark II and the Canon T2i, which are fairly cheap cameras compared to what we’re using now. But I was just creating the best content I could with the resources I had, and I was able to build a company from that.

If you had to start from scratch today, do you think you could do it again?
I definitely think it’s 100 percent doable, but I would have to play the game differently. Even now we are having to play the game differently than we did six months ago. Social media is hard because it’s constantly evolving. The algorithms keep changing.

Filming in Iceland for an upcoming documentary.

What are you doing today that’s different from before?
One thing is just using trends and popular things that are going on. For example, a year and a half ago, Pokémon Go was very popular, so we did a video on Pokémon and it got 20 million views within a couple weeks. We have to be very smart about what content we put out — not just putting out content to put out content.

One thing that’s always stayed true since the beginning is consistent content. When we don’t put out a video weekly, it actually hurts how much our content gets seen. The famous people on YouTube now are the ones putting out daily content. For what we’re doing, that’s impossible, so we’ve sort of shifted platforms away from YouTube, which was our bread and butter. Facebook is where we push our main content now, because Facebook doesn’t favor daily content. It just favors good-quality content.

Teens will be the first to say that grown-ups struggle with knowing what’s cool. How do you chase after topics likely to blow up?
A big one is going on YouTube and seeing what videos are trending. Also, if you go to Google Trends, it shows you the top things that were searched that day, that week, that month. So, it’s being on top of that. Or, maybe, Taylor Swift is coming out with a new album; we know that’s going to be really popular. Just staying current with all that stuff. You can also use Facebook, Twitter and Instagram to get an idea of what people are really excited about.

Can you tell us about some of the equipment you use, and the demands that your workflow puts on your storage needs?
We shoot so much content. We own two Red 8K cameras that we film everything with, and we’re shooting daily for the most part. On an average week, we’re shooting about eight terabytes, and then backing that up — so 16 terabytes a week. Obviously, we need a lot of storage, and we need storage that we can access quickly. We’re not putting it on tape. We need to pull stuff up right there and start editing on it right away.

So, we need the biggest drives that are as fast as possible. That’s why we use G-Tech’s 96TB G-Speed Shuttle XL towers. We have around 10 of those, and we’ve been shooting with those for the last three to four years. We needed something super reliable. Some of these shoots involve forking out a lot of money. I can’t take a hard drive and just hope it doesn’t fail. I need something that never fails on me — like ever. It’s just not worth taking that risk. I need a drive I can completely trust that’s also super-fast.
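For a rough sense of how those numbers add up, here is a minimal back-of-the-envelope sketch based on the figures Graham quotes: the 8TB-per-week shoot rate, the single backup copy and the 96TB chassis size come from the interview, while the 52-week year is an illustrative assumption.

```python
# Back-of-the-envelope storage planning from the figures in the interview.
WEEKLY_SHOOT_TB = 8       # camera originals per week (quoted above)
BACKUP_COPIES = 1         # one full backup copy (quoted above)
CHASSIS_TB = 96           # G-Speed Shuttle XL capacity (quoted above)
WEEKS_PER_YEAR = 52       # illustrative assumption

weekly_total_tb = WEEKLY_SHOOT_TB * (1 + BACKUP_COPIES)   # 16 TB/week
yearly_total_tb = weekly_total_tb * WEEKS_PER_YEAR        # 832 TB/year

print(f"Weekly footprint: {weekly_total_tb} TB")
print(f"Yearly footprint: {yearly_total_tb} TB "
      f"(~{yearly_total_tb / CHASSIS_TB:.1f} 96TB chassis, before RAID overhead)")
```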

What’s the one piece of advice that you wish somebody had given you when you were starting out?
In my early days, I didn’t have much of a budget, so I would never back up any of my footage. I was working on two really important projects and had them all on one drive. My roommate knocked that drive off the table, and I lost all that footage. It wasn’t backed up. I only had little bits and pieces still saved on the card — enough to release it, but a lot of people wanted to buy the stock footage and I didn’t have most of the original content. I lost out on a huge opportunity.

Today, we back up every single thing we do, no matter how big or how small it is. So, if I could do my early days over again, even if I didn’t have all the money to fund it, I’d figure out a way to have backup drives. That was something I had to learn the hard way.

Indie film Hoax calls on ACES

By Debra Kaufman

Shot in the remote mountains of southwestern Colorado, Hoax follows a brilliant primate specialist and a ruthless TV producer as they investigate the site of a camping trip gone terribly wrong. They soon find themselves fighting to survive — and coming to grips with the fact that Bigfoot may not be a legend after all. The movie, which will complete post production in mid-June, is the brainchild of Matt Allen, who wrote and directed it. His friend, freelance editor and colorist Peder Morgenthaler, wore many hats, starting with multiple readings of the script. “As Matt was pulling the production together, he asked if I’d like to edit and color grade,” says Morgenthaler. “I ended up post supervising too.”

Peder Morgenthaler

The Academy Color Encoding System (ACES) was already on Morgenthaler’s radar. He’d followed its early development and began experimenting with it as soon as he could. As a result, he immediately thought ACES would be ideal for Hoax, which would be shot with Red Helium and Monstro cameras. “We wanted to work with the Red RAW data instead of baking in a look,” he says. “The challenge would be how to maintain the full dynamic range present in the camera originals all the way through every step of post production — from dailies to visual effects — so that when the footage came to the color grade none of the information would have been flattened out or lost.”

He also was aware that High Dynamic Range was being discussed as a new display format. “In my research about ACES it became clear that it was a really good way to be able to make an HDR master down the road,” says Morgenthaler. “That would be a big advantage for an indie film. We don’t have access to image scientists or an advanced image processing pipeline. ACES offers us opportunities we wouldn’t otherwise have because of our limited resources.”

Because the production was going to take place in a remote area, with no opportunity for a DIT station, Allen and Morgenthaler made the decision that ACES would only be implemented in post. “We weren’t doing all the on-set color grading and look previewing that you would do in a full ACES pipeline,” notes Morgenthaler. Cinematographer Scott Park shot at 6K and 8K resolution, framing for a 2.39:1 widescreen aspect ratio while shooting 1.78:1 to give enough room for reframing in post. “In addition, we shot segments on ENG and Flir infrared and night vision cameras, so we had several codecs and color spaces in the workflow,” Morgenthaler says. “We also had 120 VFX shots — mostly invisible ones — planned, so we needed a color pipeline that could handle that component as well.”

In post, Morgenthaler used Adobe Premiere Pro for editing, Adobe After Effects with the OpenColorIO plug-in for visual effects and Blackmagic Resolve for dailies, color grading and finishing. “All this software can work in an ACES environment,” he says. “Our goals were to enable a responsive and efficient editorial workflow; maintain image quality, resolution and dynamic range; simplify color management between VFX artists and the colorist; and generate deliverables for multiple display standards with minimal additional grading.”

For dailies, assistant editor Ricardo Cozzolino worked in Resolve, syncing dual-system sound to picture and performing a quick color balance using the Red RAW controls, then exporting the shots as Rec. 709 HD ProRes 422 proxies. “During the edit, I’d apply temp color correction, compositing and stabilization in the Premiere timeline for preview purposes, knowing I would likely have to rebuild those effects in Resolve during the finishing process,” Morgenthaler says. “There were no ACES operations required during editorial; we just worked with the color corrected dailies in Rec. 709 space.”

One unexpected challenge was getting the After Effects compositors up to speed working in ACES scene-linear space. “It wasn’t familiar to them,” he says. “They’re used to working in Rec. 709, and since After Effects doesn’t support an ACES pipeline natively, you have to use a third-party plug-in like OpenColorIO, which has been developed for a number of platforms, including AE, to enable the ACES color transformations.” The plug-in allows the user to disable AE’s internal color management functions and replace them with the proper ACES color transforms.
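For readers curious what an explicit ACES transform looks like outside of a host application, here is a minimal sketch using OpenColorIO’s Python bindings. It is not part of the production’s pipeline, and the config path and color space names are assumptions that depend on which ACES OCIO config you load.

```python
# Minimal sketch: applying an ACES transform with OpenColorIO's Python
# bindings, analogous to what the OCIO plug-in does inside After Effects.
# Assumes an ACES OCIO config on disk; space names vary between configs.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("aces_1.2/config.ocio")  # hypothetical path

# Build a processor that converts scene-linear ACEScg values to a
# Rec.709 display-referred result for preview.
processor = config.getProcessor("ACES - ACEScg", "Output - Rec.709")
cpu = processor.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]          # scene-linear mid-gray, RGB
print(cpu.applyRGB(pixel))          # display-referred result
```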

Using Resolve, the editorial team exported each plate at 4K as an ACES 16-bit OpenEXR image sequence in linear space. The visual effects artists then executed their shots in After Effects, using the OpenColorIO (OCIO) plug-in. In the end, the ACES process worked for VFX exactly as Morgenthaler had hoped. “All the dynamic range is there,” he says. “I just conformed the completed VFX shots into my color timeline and it worked perfectly, seamlessly replacing the original footage. The shots look wonderful, and they grade exactly like the camera-original R3D files.”

In color grading, Morgenthaler conformed back to the original 6K and 8K Red files inside Resolve, targeting a DCI Scope 4K finish. “After the master grade is completed, we’ll create additional versions targeting various display technologies simply by switching the ACES Output Transform,” he explains. “We can easily create versions for digital cinema, HDR and streaming, which is one of the huge benefits of the ACES process.”

Having the right storage is important to the ACES workflow, since 16-bit OpenEXR at 4K is around 45MB per frame, or just over 1,000MB/s at 24fps to play back in realtime, says Morgenthaler. “Not all storage can do that.” Morgenthaler, who consults with Seagate on their storage systems for post, relied on a Seagate RealStor shared storage system with 144TB of Fibre Channel storage. “The file sharing is based on Tiger Technology’s Tiger Store, which enables simultaneous access for all users on the network at full quality and resolution,” he says. “That greatly increased the efficiency of our workflow. It meant instant collaboration between team members, with no syncing of separate drives required to maintain collaboration.” In total, the production generated 44 hours of footage and ended up with 19.5TB of total data, not including visual effects.
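The data rate Morgenthaler cites is easy to sanity-check. The sketch below reproduces the arithmetic; the ~45MB-per-frame figure is from the article, while the 4096x2160 half-float RGB frame used for the uncompressed estimate is an illustrative assumption.

```python
# Sanity check of the playback bandwidth quoted above.
width, height, channels, bytes_per_sample = 4096, 2160, 3, 2  # assumed 4K half-float RGB

uncompressed_mb = width * height * channels * bytes_per_sample / 1e6
print(f"Uncompressed 4K half-float frame: ~{uncompressed_mb:.0f} MB")   # ~53 MB

frame_mb = 45   # per-frame size quoted in the article (EXR with light compression)
fps = 24
print(f"Realtime playback: ~{frame_mb * fps} MB/s")                     # 1080 MB/s
```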

“We may be on the front edge of using ACES in indie films, but it’ll be more important for indie filmmakers going forward,” he predicts. “There are real benefits to doing so. It’s a powerful tool for maintaining dynamic range and quality, and the pre-built color management pipeline simplifies complex VFX processes. It also increases the film’s desirability to distributors by enabling generation of additional versions such as HDR.

“I don’t know that we could have achieved what we did on this film without ACES,” concludes Morgenthaler. “Large films have access to color scientists and secret sauce, and ACES gives you that in a turnkey package, which is really powerful for a small film.”


Debra Kaufman has covered media and entertainment for 30 years for publications including Variety, The Hollywood Reporter, American Cinematographer, International Cinematographer, Wired and others. She currently also writes for USC’s Entertainment Technology Center’s daily newsletter, ETCentric.

Creating the look for Netflix’s The End of the F***ing World

By Adrian Pennington

Content in 8K UHD won’t be transmitting or streaming its way to a screen anytime soon, but the ultra-high-resolution format is already making its mark in production and post. Remarkably, it is high-end TV drama, rather than feature films, that is leading the way. The End of the F***ing World is the latest series to pioneer a workflow that gives its filmmakers a creative edge.

Adapted from the award-winning graphic novels of Charles Forsman, the dark comedy is an eight-part co-production between Netflix and UK broadcaster Channel 4. The series invites viewers into the confused lives of teen outsiders James (Alex Lawther) and Alyssa (Jessica Barden), as they decide to escape from their families and embark on a road trip to find Alyssa’s estranged father.

Executive producer and director Jonathan Entwistle and cinematographer Justin Brown were looking for something special stylistically to bring the chilling yet humorous tale to life. With Netflix specifying a 4K deliverable, the first critical choice was to use 8K as the dominant format. Brown selected the Red Weapon 8K S35 with the Helium sensor.

In parallel, the filmmakers turned to colorist Toby Tomkins, co-founder of East London grading and finishing boutique studio Cheat, to devise a look and a workflow that would maximize the rich, detailed color, as well as the light information from the Red rushes.

“I’ve worked with Justin for about 10 years, since film school,” explains Tomkins. “Four years ago he shot the pilot for The End of the F***ing World with Jon, which is how I first became involved with the show. Because we’d worked together for so long, I kind of already knew what type of thing they were looking for. Justin shot tests on the Red Weapon, and our first job was to create a 3D LUT for the on-set team to refer to throughout shooting.”

An expert at grading commercials, with the feature-length narrative Sixteen (also shot by Justin Brown) under his belt, Tomkins was taking on his first episodic TV drama, and he relished the challenge. “From the beginning, we knew we wanted to work completely RAW at 7K/8K the whole way through and final output at 4K,” he explains. “We conformed to the R3D rushes, which were stored on our SSD NAS. This delivered 10Gbps bandwidth to the suite.”

With just 10 days to grade all the episodes, Tomkins needed to develop a rich “Americana” look that would not only complement the dark narrative but would also work across a range of locations and timescales.

“We wanted the show to have a richness and density to it, with skin tones almost a leathery red, adding some warmth to the characters,” he says. “Despite being shot at British locations — with British weather — we wanted to emulate something filmic and American in style. To do this we wanted quite a dense film print look, using skin tones you would find on celluloid film and a shadow and highlight roll-off that you would find in films, as opposed to British TV.”

Cheat used its proprietary film emulation to create the look. With virtually the whole series shot in 8K, the Cheat team invested in a Quad GPU Linux Resolve workstation, with dual Xeon processors, to handle the additional processing requirements once in the DaVinci Resolve finishing suite.

“The creative benefits of working in 8K from the Red RAW images are huge,” says Tomkins. “The workstation gave us the ability to use post-shoot exposure and color temperature settings to photorealistically adjust and match shots and, consequently, more freedom to focus on the finer details of the grade.

“At 8K the noise was so fine in size that we could push the image further. It also let us get cleaner keys due to the over-sample, better tracking, and access to high-frequency detail that we could choose to change or adapt as necessary for texture.”
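Tomkins’ point about finer noise follows directly from oversampling: averaging each 2x2 block of 8K pixels down to a single 4K pixel roughly halves uncorrelated per-pixel noise. A small sketch with synthetic data (not real R3D frames) illustrates the effect.

```python
# Why an oversampled 8K source downscaled to 4K shows finer noise:
# averaging 2x2 blocks of uncorrelated noise cuts its standard
# deviation roughly in half. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
frame_hi = 0.18 + rng.normal(0.0, 0.05, size=(512, 512))   # small stand-in patch

# 2x downsample by averaging each 2x2 block.
frame_lo = frame_hi.reshape(256, 2, 256, 2).mean(axis=(1, 3))

print(f"Noise std at source resolution: {frame_hi.std():.4f}")   # ~0.050
print(f"Noise std after 2x downscale:   {frame_lo.std():.4f}")   # ~0.025
```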

Cheat had to conform more than 50 days of rushes and 100TB of 7K and 8K RAW material spread across 40 drives, a process that was completed by Cheat junior colorist Caroline Morin in Resolve.

“After the first episode, the series becomes a road movie, so almost each new scene is a new location and lighting setup,” Tomkins explains. “I tried to approach each episode as though it was its own short film and to establish a range of material and emotion for each scene and character, while also trying to maintain a consistent look that flowed throughout the series.”

Tomkins primarily adjusted the RAW settings of the material in Resolve and used lift, gamma and gain to adjust the look depending on the lighting ratios and mood of the scenes. “It’s very easy to talk about workflow, tools and approach, but the real magic comes from creative discussions and experimentation with the director and cinematographer. This process was especially wonderful on this show because we had all worked together several times before and had developed a shorthand for our creative discussions.”
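For readers unfamiliar with the lift, gamma and gain controls Tomkins mentions, the sketch below shows one common formulation; the exact math differs between grading tools, so treat this as an illustration rather than Resolve’s internals.

```python
# One common lift/gamma/gain formulation (illustrative; grading tools differ).
# Input and output are normalized 0-1 values, applied per channel.
def lift_gamma_gain(x, lift=0.0, gamma=1.0, gain=1.0):
    raised = x + lift * (1.0 - x)        # lift pulls the blacks up
    scaled = raised * gain               # gain scales toward the whites
    clamped = max(0.0, min(1.0, scaled))
    return clamped ** (1.0 / gamma)      # gamma bends the midtones

# Example: slightly lifted shadows, gently pulled-back highlights.
for v in (0.0, 0.18, 0.5, 1.0):
    print(v, round(lift_gamma_gain(v, lift=0.02, gamma=1.1, gain=0.97), 4))
```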

“The boundaries are changing,” he adds. “The creative looks that you get to work and play with are so much stronger on television now than they ever used to be.”

Behind the Title: Park Road Post’s Anthony Pratt

NAME: Anthony Pratt

COMPANY: Park Road Post Production

CAN YOU DESCRIBE YOUR COMPANY?
Park Road is a bespoke post production facility, and is part of the Weta Group of Companies based on the Miramar Peninsula in Wellington, New Zealand.

We are internationally recognized for our award-winning sound and picture finishing for TV and film. We walk alongside all kinds of storytellers, supporting them from shoot through to final delivery.

WHAT’S YOUR JOB TITLE?
Workflow Architect — Picture

WHAT DOES THAT ENTAIL?
I get to think about how we can work with a production to wrap process and people around a project, all with a view of achieving the best result at the end of that process. It’s about taking a step back and challenging our current view while thinking about what’s next.

We spend a lot of time working with the camera department and dailies team, and integrating their work with editorial and VFX. I work alongside our brilliant director of engineering for the picture department, and our equally skilled systems technology team — they make me look good!

From a business development perspective, I try to integrate the platforms and technologies we advance into new opportunities for Park Road as a whole.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I quite like outreach around the company and the group, so presenting and sharing is fun — and it’s certainly not always directly related to the work in the picture department. Our relationships with film festivals, symposia, the local industry guilds and WIFT always excite me.

WHAT’S YOUR FAVORITE PART OF THE JOB?
My favorite time of all is when we get to see our clients’ work in a cinema with an audience for the first time — then the story is really real.

It’s great when our team is actively engaged as a creative partner, especially during the production phase. I enjoy learning from our technical team alongside our creative folk, and there’s always something to learn.

We have fantastic coffee and baristas; I get to help QC that throughout my day!

WHAT’S YOUR LEAST FAVORITE?
It’s always hard when a really fantastic story we’ve helped plan for isn’t greenlit. That’s the industry, of course, but there are some stories we really want to see told! Like everyone, there are a few Monday mornings that really need to start later.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
I get a huge kick on the days we get to sign off the final DCP for a theatrical release. It’s always inspiring seeing all that hard work come together in our cinema.

I am also particularly fond of summer days where we can get away from the facility for a half hour and eat lunch on a beach somewhere with the crew — in Miramar a beach is only 10 minutes away.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d be building my own business and making my own work — if it wasn’t strictly film related it would still be narrative — and I’m always playing with technology, so no doubt I’d be asking questions about what that meant from a lived perspective, regardless of the medium. I’d quite probably be distilling a bit more gin as well!

WHY DID YOU CHOOSE THIS PROFESSION? HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I think it kind of chose me in the end… I’ve always loved the movies and experimented with work in various media types from print and theatre through animation and interactivity — there was always a technology overtone — before landing where I needed to be: in cinema.

I came to high-end film post somewhat obliquely, having built an early tapeless TV pipeline; I was able to bring that comfort with digital acquisition to an industry transitioning from 35mm in the mid 2000s.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
I’m profoundly privileged to work for a company owned by Peter Jackson, and I have worked on every project of his since The Lovely Bones. We are working on Christian Rivers’ Mortal Engines at present. We recently supported the wonderful Jess Hall shooting on the Alexa 65 for Ghost in the Shell. He’s a really smart DOP.

I really enjoy our offshore clients. As well as the work we do with our friends in the USA, we’ve done some really great work recently with clients in China and the Middle East. Cultural fusion is exhilarating.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
We worked with director Geoff Murphy to restore and revisit his seminal 1983 New Zealand feature as Utu Redux, which was the opening night feature for the 2013 NZ International Film Festival. It was incredibly good fun, it was honorable work, and it is a true taonga in our national narrative.

A Park Road Mistika grading suite.

The Hobbit films were a big chunk of the last decade for us, and our team was recognized with multiple awards. The partnerships we built with SGO, Quantum, Red and Factorial are strong to this day. I was very fortunate to collect some of those awards on our team’s behalf, and was delighted to have that honor.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I rely on clean water and modern medicine to help keep myself, and our wider community, safe from harm. And I am really conscious that to keep that progress moving forward we’re going to have to shepherd our natural world one hell of a lot better.

Powerful computing and fast Internet transformed not only our work, but time and distance for me. I’ve learned more about film, music and art because of the platforms running without friction on the Internet than I would have dared to dream in the ‘90s.

I hold in my hand a mobile access point that can not only access a mind-bogglingly large world of knowledge and media, but can also dynamically represent that information for my benefit and allow me to acknowledge the value of trust in that connection — there’s hope for us in the very large being accessible by way of the very small.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I kind of abandoned Facebook a few years ago, but Film Twitter has some amazing writers and cinematographers represented. I tend to be a bit of a lurker most other places — sometimes the most instructive exercise is to observe! Our private company Slack channels supplement the rest of my social media time.

To be honest, most of our world is respectfully private, but I do follow @ParkRoadPost on Instagram and Twitter.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Our team has a very broad range of musical tastes, and we tend to try and share that with each other… and there is always Radiohead. I have a not-so-secret love of romantic classical music and lush film scores. My boss and I agree very much on what rock (and a little alt-country) should sound like, so there’s a fair bit of that!

When my headphones are on there is sometimes old-school liquid or downbeat electronica, but mostly I am listening to the best deep house that Berlin and Hamburg have to offer while I work.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
A quick walk around the peninsula is a pretty good way of chilling out, especially if it’s dusk or dawn — then I can watch some penguins on the rocks while ships come in and out of the harbor!

My family (including the furry ones!) are incredible, and they help provide perspective in all things.

Wellington is the craft beer capital of New Zealand, so there’s always an opportunity for some food and an interesting drop from Garage Project (or Liberty brewing out of Auckland) with mates in town. I try and hang out with a bunch of my non-industry friends every month or so — those nights are definitely my favorite for music, and are good for my soul!

IBC: G-Tech adds four new products to Evolution Series

G-Technology has added four new products to its Evolution (ev) Series, an ecosystem of docking stations and interchangeable and expandable external hard drives and accessories. The new products include the G-Speed Studio XL with two ev Series bay adapters, the ev Series Reader Red Mini-Mag edition, the G-Dock ev Solo and the ev Series FireWire adapter.


The G-Speed Studio XL (pictured right) with two ev Series bay adapters is an eight-bay Thunderbolt 2 storage solution that comes with six enterprise-class hard drives and two integrated ev Series bay adapters for greater capacity and performance. The integrated ev Series bay adapters accommodate all ev Series drive modules, enabling cross-functionality with other products in the Evolution Series. Configurable in RAID-0, -1, -5, -6 and -10, it supports multistream compressed 4K workflows with extremely large volumes of data at transfer rates of up to 1,200 MB/sec and the ability to daisy chain via dual Thunderbolt 2 ports.
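As a rough guide to what those RAID levels mean for usable space, here is a small sketch; the eight bays and supported RAID levels come from the product description above, while the per-drive capacity is a placeholder you would replace with your actual drives.

```python
# Rough usable-capacity comparison for the RAID levels the chassis supports.
# Bay count and RAID levels are from the product description; the per-drive
# capacity is an illustrative placeholder.
DRIVES = 8
DRIVE_TB = 6   # hypothetical per-drive capacity

usable_tb = {
    "RAID-0":  DRIVES * DRIVE_TB,         # striping, no redundancy
    "RAID-1":  DRIVES // 2 * DRIVE_TB,    # mirroring (half the raw capacity)
    "RAID-5":  (DRIVES - 1) * DRIVE_TB,   # one drive's worth of parity
    "RAID-6":  (DRIVES - 2) * DRIVE_TB,   # two drives' worth of parity
    "RAID-10": DRIVES // 2 * DRIVE_TB,    # mirrored and striped
}

for level, tb in usable_tb.items():
    print(f"{level:>7}: {tb} TB usable of {DRIVES * DRIVE_TB} TB raw")
```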

Designed to optimize a Red camera workflow, the ev Series Reader Red Mini-Mag edition uses high-performance connectivity for fast Red footage transfers and backup. Users can transfer content quickly from a Red Mini-Mag media card onto any G-Dock ev docking station or G-Speed Studio XL with ev Series bay adapters. The ev Series all-terrain case, which is watertight, adds protection when shooting on the go.

For those who already have several G-Drive ev modules, the G-Dock ev Solo (pictured below) is a USB 3.0 docking solution for shared environments, including studios, labs and classrooms. Users can transfer, edit and back up an existing Evolution Series hard drive module by inserting it into the G-Dock ev Solo. When paired with the G-Drive ev, G-Drive ev Raw, G-Drive ev 220 or G-Drive ev SSD, the G-Dock ev Solo can store up to 2TB of data and transfer content at rates up to 400MB/sec.


Finally, the new ev Series FireWire adapter attaches to an ev Series drive, allowing connection to an existing FireWire 800 port. Users can connect a G-Drive ev Raw, G-Drive ev, G-Drive ev 220 or G-Drive ev SSD to a computer via one of two FireWire 800 ports or daisy chain them.

The G-Speed Studio XL with two ev Series bay adapters, the ev Series Reader Red Mini-Mag edition and the G-Dock ev Solo will be available in October. The ev Series FireWire adapter is available now.

A Closer Look: Interstate’s work on Master & Dynamic headphones short

By Randi Altman

To tell the story of how their high-end MH40 headphones are made, Master & Dynamic called on New York-based production company Interstate to create a film educating potential buyers. Interstate is the US branch of the Montreal-based integrated production company BLVD. It’s run by managing partner Danny Rosenbloom (formerly with Psyop, Brand New School) and creative director Yann Mabille (formerly with The Mill, Mill+).

This almost 1.5-minute piece talks about the materials that go into creating the headphones, describes the manufacturing process and explains why it’s all meant to maximize sound quality. There are images of molten metal, flowing liquids and magnetized metal shavings that transform into headphone components. To create the finished look, Interstate captured as much as it could in-camera, shooting from a bird’s-eye view with a mix of stop motion and visual effects.

For the liquid aluminum sequence, Interstate used gallium — a material also used in the original Terminator movies — casting an ingot from it and melting it on camera to create the effect of molten aluminum.

According to Interstate EP Rosenbloom, “The material melts at roughly 80 degrees Fahrenheit. It’s the same stuff some magicians use to bend spoons with their minds — not all of them, of course, because the good ones really can bend spoons with their minds!”

Interstate’s Mabille, who co-directed the piece with Adam Levite, answered our questions and helped us dig a bit deeper.


How early did Interstate get involved with the project? 
We started to get involved during the final stages of setting up Interstate, which makes this project our very first. We thought it was a great way to start.

Were you involved in the creative, or did the client come with that already spelled out?
Miles Skinner, who is a freelance creative director for Master & Dynamic, wanted to create a sequence that suggested a building process that had a specific elegance and artistic value while showcasing the beautiful qualities of the raw materials used to build the headphones.

At the same time, the goal was to steer away from the traditional pipeline representations, which are usually hands or machines interacting with objects. We were tasked with finding creative solutions to implement Miles’ ideas. We conceived a semi-abstract representation of each of the main steps of the building process, starting by glorifying the raw materials, then processing those materials in an interesting manner, and ending in an elegant way that showcases the finished product.

How much is live-action versus VFX?
The product is very well designed and has a great finish, so we knew that it would look great on camera. Adam and I love macro-photography and were keen to feature the natural beauty of raw and noble materials on a small scale. This naturally led to trying to shoot as much as we could in-camera, therefore limiting the role of VFX in the sequence.

That said, CGI was used to animate certain elements that would have been challenging to puppeteer on such a small scale. We used 2D work to add light interactions across the lens, clean up shots and retime them. We wanted to retain a physical approach from the very beginning to keep all the wonderful qualities of the raw materials as genuine as possible.

Did you do any prepro?
Indeed. Most of the prepro was spent getting to know the materials we were going to work with and how to best represent the headphones, as well as all the components used for the construction process. For example, we ended up using gallium to simulate melting aluminum, and a specific metallic powder was brought to life to shape components, such as steel screws, which were also made out of wax that we then melted. Overall, it was obviously much easier to film deconstruction and reverse the footage to give the illusion of construction.

Can you walk us through the production and post workflow? What did Interstate shoot on?
Alexander Hankoff was the DP. I had worked with him when I was at The Mill, and I always wanted to work with him again as I knew he had a great eye for macro-photography. He can find beauty where you expect it the least. He did a great job over the two-day shoot.

We shot the whole spot on a Red Epic camera, most of it at about 120fps. Also, production designer Jaime Moore and her prop master, Gino Fortebuono, were indispensable to the process and did a great job bringing this to life. We shot the whole sequence in a fairly big studio to make sure we could use different set-ups at the same time.


Interstate produced and provided some post, but you also worked with BLVD. What exactly did they provide in terms of post?
Most of the time we will do all the post internally, but in this case we could not do all of it as we were just starting the company. BLVD was the right choice to help with the 3D and some of the 2D components, and their audio experience was key; they did a great job with the sound design.

How did you work with them on approvals?
We had daily reviews, which were all remote, but hassle free. Everyone was really responsive and engaged throughout the process.

What tools were used for the edit, VFX and color?
Apple Final Cut for the edit, Autodesk Maya for the VFX and Blackmagic DaVinci Resolve for color.

How did you describe to your colorist, Tristan Kneschke, the look and feel you wanted for the piece?
One of my favorite parts of the process is establishing a color look, but I also think it’s crucial to sleep on it. It’s important to step back when you do coloring, since your first pass will often be either too extreme or off tone. Keeping a fresh eye is the hardest thing to do while coloring. Luckily we were able to do that with Tristan. We established a look, which we then refined over the course of a week.

What was the most challenging part of this project?
Besides figuring out how to get the most out of the materials we had — the components that make the headsets or the materials used to shape specific objects — the conceptual phase was crucial and the most challenging. It was key to find the right balance between an overly abstract and removed representation of the actual building process, and an elegant and somewhat explicit representation of that same process. It was important not to get too far away from a clear and palpable depiction of what happens to the materials in order to constantly keep the audience hooked and able to relate to the product.

What haven’t we asked that’s important?
The client was amazing — they really gave us total freedom. Miles is a rock star, every idea he had was great and everything we proposed he quickly came on board with. As a company, we really wanted to make sure that our first piece out of the gate was memorable. I think we got there.

Offhollywood launches cinema camera accessory products

New York’s Offhollywood, the former post house turned camera equipment rental and production services boutique that focuses on emerging technologies, has entered the world of product development with three initial camera accessory products. The HotLink, HotBox R/S and HotTap will be shown at the Red Digital Cinema booth during the IBC show in Amsterdam.


“Since we started Offhollywood in 2003, we have been alpha/beta testers and early adopters, providing feedback and ideas to leading technology companies in the content creation space,” reports CEO Mark L. Pederson. “We believe that technology will continue to evolve and radically empower content creators, and we are excited to stay on the edge of that change and develop and produce new tools and accessories for digital cinema, television and interactive.”

The HotLink is a third-party hardware tool that facilitates wifi control of Red’s DSMC Digital Cinema camera systems, leveraging the RedLink Command Protocol, an open development platform for camera control and metadata integration, announced by Red at NAB earlier this year.

The HotBox R/S was designed for rental houses to solve power distribution and run/stop camera triggering issues when working with Red DSMC camera systems, Arri Alexa and Amira cameras, and Sony F5 and F55 cameras.


The HotTap is a reinvented P-Tap power distribution splitter for powering camera accessories with common, industry-standard P-Tap power cables. The HotTap adds an internal resettable fuse and directional current protection on its 2-pin LEMO power input to protect the camera and attached accessories.

“First the lab moved to the set with the advent of on-set and near-set dailies — and now the lab is moving into the camera and into your pocket,” explains Pederson. “Having full wifi control of the camera and metadata on a RAW digital cinema camera with an iOS device is a powerful proposition. New applications such as FoolControl for iOS are now setting the bar. Once you experience iris and lens control, exposure monitoring, slating and access to any and all settings on the camera from 50-plus feet away — no wires, just the touch of your fingers on an iOS device you carry in your pocket — it’s pretty hard not to start working and thinking differently.”