
VFX Roundtable: Trends and inspiration

By Randi Altman

The world of visual effects is ever-changing, and the speed at which artists are being asked to create new worlds, or to make things invisible is moving full-speed ahead. How do visual effects artists (and studios) prepare for these challenges, and what inspired them to get into this business? We reached out to a small group of visual effects pros working in television, commercials and feature films to find out how they work and what gets their creative juices flowing.

Let’s find out what they had to say…

KEVIN BAILLIE, CEO, ATOMIC FICTION
What do you wish clients would know before jumping into a VFX-heavy project?
The core thing for every filmmaking team to recognize is that VFX isn’t a “post process.” Careful advance planning and a tight relationship between the director, production designer, stunt team and cinematographer will yield a far superior result much more cost effectively.

In the best-looking and best-managed productions I’ve ever been a part of, the VFX team is the first department to be brought onto the show and the last one off. It truly acts as a partner in the filmmaking process. After all, once the VFX post phase starts, it’s effectively a continuation of production — with there being a digital corollary to every single department on set, from painters to construction to costume!

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
The move to cloud computing is one of the most exciting trends in VFX. The cloud is letting smaller teams do much bigger work, allowing bigger teams to do things that have never been seen before, and will ultimately result in compute resources no longer being a constraint on the creative process.

Cloud computing allowed Atomic Fiction to play alongside the most prestigious companies in the world, even when we were just 20 people. That capability has allowed us to grow to over 200 people, and now we’re able to take the lead vendor position on A-list shows. It’s remarkable what dynamic and large-scale infrastructure in the cloud has enabled Atomic to accomplish.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I grew up in Seattle and started dabbling in 3D as a hobby when I was 14 years old, having been immensely inspired by Jurassic Park. Soon thereafter, I started working at Microsoft in the afternoons, developing visual content to demonstrate their upcoming technologies. I was fortunate enough to land a job with Lucasfilm right after graduating high school, which was 20 years ago at this point! I’ve been lucky enough to work with many of the directors that inspired me as a child, such as George Lucas and Robert Zemeckis, and modern pioneers like JJ Abrams.

Looking back on my career so far, I truly feel like I’ve been living the dream. I can’t wait for what’s next in this exciting, ever-changing business.

ROB LEGATO, OSCAR-WINNING VFX SUPERVISOR, SECOND UNIT DIRECTOR, SECOND UNIT DIRECTOR OF PHOTOGRAPHY
What do you wish clients would know before jumping into a VFX-heavy project?
It takes a good bit of time to come up with a plan that will ensure a sustainable attack when making the film. They need to ask someone in authority, “What does it take to do it?” and then make a reasonable plan. Everyone wants to do a great job all the time, and if they could maneuver the schedule — even within the same timeframe — it could be a much less frustrating job.

It happens time and time again: someone comes up with a budget and a schedule that doesn’t really fit the task and forces you to live with it. That makes for a very difficult assignment that gets done only because of the hard work of the people in the trenches.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
For me, it’s how realistic you can make something. The rendering capabilities — like what we did on Jungle Book with the animals — are so sophisticated that it fools your eye into believing it’s real. Once you do that you’ve opened the magic door that allows you to do anything with a tremendous amount of fidelity. You can make good movies without it being a special-venue movie or a VFX movie. The computer power and rendering abilities — along with the incredible artistic talent pool that we have created over the years — is very impressive, especially for me, coming from a more traditional camera background. I tended to shy away from computer-generated things because they never had the authenticity you would have wanted.

Then there is the happy accident of shooting something, where an angle you wouldn’t have considered appears as you look through the camera; now you can do that in the computer, which I find infinitely fascinating. This is where all the virtual cinematography things I’ve done in the past come in to help create that happy accident.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been working in VFX since about 1984. Visual effects wasn’t my dream. I wanted to make movies: direct, shoot and be a cameraman and editor. I fell into it and then used it as an avenue to allow me to create sequences in films and commercials.

The reason you go to movies is to see something you have never seen before, and for me that was Close Encounters. The first time I saw the mothership in Close Encounters, it wasn’t just an effect, it became an art form. It was beautifully realized and it made the story. Blade Runner was another where it’s no longer a visual effect, it’s filmmaking as an art form.

There was also my deep appreciation for Doug Trumbull, whose quality of work was so high it transcended being a visual effect or a photographic effect.

LISA MAHER, VP OF PRODUCTION, SHADE VFX 
What do you wish clients would know before jumping into a VFX-heavy project?
That it’s less expensive in the end to have a VFX representative involved on the project from the get-go, just like all the other filmmaking craftspeople who are represented. It’s getting better all the time, though, and we are definitely being brought on board earlier these days.

At Shade we specialize in invisible or supporting VFX. So-called invisible effects are often much harder to pull off. It’s all about integrating digital elements that support the story but don’t pull the audience out of a scene. Being able to assist in the planning stages of a difficult VFX sequence often results in the filmmakers achieving what they envisioned more readily. It also helps tremendously to keep the costs in line with what was originally budgeted. It also goes without saying that it makes for happier VFX artists as they receive photography captured with their best interests in mind.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
I would say the most exciting development affecting visual effects is the explosion of opportunities offered by the OTT content providers such as Netflix, Amazon, HBO and Hulu. Shade primarily served the feature film market up to three years ago, but with the expanding needs of television, our offices in Los Angeles and New York are now evenly split between film and TV work.

We often find that the film work is still being done at the good old reliable 2K resolution, while our TV shows are always 4K-plus. The quality and diversity of projects being produced for TV now make visual effects a much more buoyant enterprise for a mid-sized company, and also a real source of employment for VFX professionals who were previously so dependent on big-studio features.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been working in visual effects close to 20 years now. I grew up in Ireland; as a child, the world of film, and especially images of sunny California, was always a huge draw for me. It helped me survive the many grey and rainy days of the Irish climate. I can’t point to one project that inspired me to get into filmmaking — there have been so many — just a general love for storytelling, I guess. Films like Westworld (the 1973 version), Silent Running, Cinema Paradiso, Close Encounters of the Third Kind, Blade Runner and, of course, the original Star Wars were truly inspirational.

DAVID SHELDON-HICKS, CO-FOUNDER/EXECUTIVE CREATIVE DIRECTOR, TERRITORY STUDIO
What do you wish clients would know before jumping into a VFX-heavy project?
The craft and care and love that goes into VFX is often forgotten in the “business” of it all. As a design-led studio that straddles art and VFX departments in our screen graphics and VFX work, we prefer to work with the director from the preproduction phase. This ensures that all aspects of our work are integrated into story and world building.

The talent and gut instinct, eye for composition and lighting, appreciation of form, choreography of movement and, most notably, the appreciation of the classics are so pertinent to the art of VFX, and are undersold in conversations about shot counts, pipelines, bidding and numbers of artists. Bringing the filmmakers into the creative process has to be the way forward for an art form still finding its own voice.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
The level of concept art and postviz coming through from VFX studios is quite staggering. It gets back to my point above about opening up the dialogue between filmmakers and VFX artists, with the VFX side concentrating on world building and narrative expansion. It’s so exciting to see concept art and postviz reaching a new level of sophistication and influence in the filmmaking process.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I have been working professionally in VFX for over 15 years. My love of VFX and creativity in general came from the moment I picked up a pencil and imagined new possibilities. But once I cut my film teeth designing screen graphics on Casino Royale, followed by The Dark Knight, I left my freelance days behind and co-founded Territory Studio. Our first film as a studio was Prometheus, and working with Ridley Scott was a formative experience that has shaped our design-led approach to motion graphics and VFX, established us in the industry and seen the studio grow and expand.

MARK BREAKSPEAR, VFX SUPERVISOR, SONY PICTURES IMAGEWORKS
What do you wish clients would know before jumping into a VFX-heavy project?

Firstly, I think the clients I have worked with have always been extremely cognizant of the key areas affecting VFX-heavy projects and consequently have built frameworks that help plan and execute these mammoth shows successfully.

Ironically, it’s the smaller shows that sometimes have the surprising “gotchas” in them. The big shows come with built-in checks and balances in the form of experienced people who are looking out for the best interests of the project and how to navigate the many pitfalls that can make the VFX costs increase.

Smaller shows sometimes don’t allow enough discussion and planning time for the VFX components in pre-production, which could result in the photography not being captured as well as it could have been. Everything goes wrong from there.

So, when I approach any show, I always look for the shots that are going to be underestimated and try to give them the attention they need to succeed. You can get taken out of a movie by a bad driving comp as much as you can a monster space goat biting a planet in half.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
I think there are several red herrings out there right now… the big one being VR. To me, VR is like someone has invented teleportation, but it only works on feet.

So, right now, it’s essentially useless and won’t make creating VFX any easier or make the end result any more spectacular. I would like to see VR used to aid artists working on shots. If you could comp in VR, I could see that being a good way to help create more complex and visually thrilling shots. The user interface world is really the key area where VR can benefit.

Suicide Squad

I do think however, that AR is very interesting. The real world, with added layers of information is a hugely powerful prospect. Imagine looking at a building in any city of the world, and the apartments for sale in it are highlighted in realtime, with facts like cost, square footage etc. all right there in front of you.

How does AR benefit VFX? An artist could use AR to get valuable info about shots just by looking at them. How often do we look at a shot and ask, “What lens was this?” AR could have all that metadata ready to display at any point on any shot.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been in VFX for 25 years. When I started, VFX was not really a common term. I came to this industry through the commercial world… as a compositor on TV shows and music videos. Lots of (as we would call it now) visual effects, but done in a world bereft of pipelines and huge cloud-based renderfarms.

I was never inspired by a specific project to get into the visual effects world. I was a creative kid who also liked the sciences. I liked to work out why things ticked, and also draw them, and sometimes try to draw them with improvements or updates as I could imagine. It’s a common set of passions that I find in my colleagues.

I watched Star Wars and came out wondering why there were black lines around some of the space ships. Maybe there’s your answer… I was inspired by the broken parts of movies, rather than being swept up in the worlds they portrayed. After all that effort, time and energy… why did it still look wrong? How can I fix it for next time?

CHRIS HEALER, CEO/CTO/VFX SUPERVISOR, THE MOLECULE
What do you wish clients would know before jumping into a VFX-heavy project?

Plan, plan, plan… previs, storyboarding and initial design are crucial to VFX-heavy projects. The mindset should ideally be that most (or all) decisions have been made before the shoot starts, as opposed to a “we’ll figure it out in post” approach.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
Photogrammetry, image modeling and data capture are so much more available than ever before. Instead of an expensive Lidar rig that only produces geometry without color, there are many, many new ways to capture the color and geometry of the physical world, even using a simple smartphone or DSLR.
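At its core, photogrammetry recovers 3D structure by triangulating the same feature as seen from two or more known camera positions. A minimal sketch of that step, using toy cameras and a made-up point (purely illustrative, not any production pipeline):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover one 3D point from its
    2D observations in two views with known 3x4 projection matrices."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector associated
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy cameras: one at the origin, one shifted 1 unit along X.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])

X_true = np.array([0.5, 0.2, 4.0])
h = np.append(X_true, 1.0)
x1 = (P1 @ h)[:2] / (P1 @ h)[2]
x2 = (P2 @ h)[:2] / (P2 @ h)[2]

X_est = triangulate(P1, P2, x1, x2)  # recovers [0.5, 0.2, 4.0]
```

Real photogrammetry tools repeat this over thousands of matched features and also solve for the cameras themselves, but the triangulation above is the geometric heart of it.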

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I’ve been doing VFX now for over 16 years. I would have to say that The Matrix (part 1) was really inspiring when I saw it the first time, and it made clear that VFX as an art form was coming and available to artists of all kinds all over the world. Previous to that, VFX was very difficult to approach for the average student with limited resources.

PAUL MARANGOS, SENIOR VFX FLAME ARTIST, HOOLIGAN
What do you wish clients would know before jumping into a VFX-heavy project?

The more involved I can be in the early stages, the more I can educate clients on all of the various effects they could use, as well as technical hurdles to watch out for. In general, I wish more clients involved the VFX guys earlier in the process — even at the concepting and storyboarding stages — because we can consult on a range of critical matters related to budgets, timelines, workflow and, of course, bringing the creative to life with the best possible quality.

Fortunately, more and more agencies realize the value of this. For instance, on a recent campaign Hooligan finished for Harvoni, we were able to plan shots for a big scene featuring hundreds of lanterns in the sky, which required lanterns of various sizes for every angle that Elma Garcia’s production team shot. With everything well storyboarded, and under the direction of Elma, who let no detail go unnoticed, we managed to create a spectacular display of lantern composites for the commercial.

We were also involved early on in a campaign for MyHeritage DNA (above) via creative agency Berlin Cameron, featuring spoken word artist Prince Ea and directed by Jonathan Augustavo of Skunk. Devised as if projected on a wall, the motion graphics were mapped into the 3D environments.

What trends in VFX have impressed you the most over the last year or two, and how are they affecting your work?
Of course VR and 360 live TV shows are exciting, but augmented reality is what I find particularly interesting — mixing the real world with graphics and video all around you. The interactivity of both of these emerging platforms presents an endless area of growth, as our industry is on the cusp of a sea change that hasn’t quite yet begun to directly affect my day-to-day.

Meanwhile, at Hooligan, we’re always educating ourselves on the latest software, tools and technological trends in order to prepare for the future of media and entertainment — which is wise if you want to be relevant 10 years from now. For instance, I recently attended the TED conference, where Chris Milk spoke on the birth of virtual reality as an art form. I’m also seeing advances in Google Cardboard, which is making the platform affordable, too. Seeing companies open up VR departments is an exciting step for us all, and it shows the vision for the future of advertising.

How many years have you been working in VFX, and what project inspired you to get into this line of work?
I have worked in VFX for 25 years. After initially studying fine art and graphic design, the craft aspect of visual effects really appealed to me. Seeing special effects genius Ray Harryhausen’s four-minute skeleton fight was a big inspiration. He rear-projected footage of the actual actors and then combined the shots to make a realistic skeleton-Argonaut battle. It took him over four and a half months to shoot the stop-motion animation.

Main Image: Deadpool/Atomic Fiction.

VR Post: Hybrid workflows are key

By Beth Marchant

Shooting immersive content is one thing, but posting it for an ever-changing set of players and headsets is a whole other multidimensional can of beans.

With early help from software companies that have developed off-the-shelf ways to tackle VR post — and global improvements to their storage and networking infrastructures — some facilities are diving into immersive content by adapting their existing post suites with a hybrid set of new tools. As with everything else in this business, it’s an ongoing challenge to stay one step ahead.

Chris Healer

The Molecule
New York- and Los Angeles-based motion graphics and VFX post house The Molecule leapt into the VR space more than a year and a half ago when it fused The Foundry’s Nuke with the open-source panoramic photo stitching software Hugin. Then CEO Chris Healer took the workflow one step further: he developed an algorithm that rendered stereoscopic motion graphics spherically in Nuke.
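Spherical renders like the ones Healer describes are commonly stored in an equirectangular (lat-long) projection. As an illustrative sketch (not The Molecule’s actual code), mapping a 3D view direction to pixel coordinates in such a frame looks like this:

```python
import math

def dir_to_equirect(x, y, z, width, height):
    """Map a 3D view direction to (u, v) pixel coordinates in an
    equirectangular (lat-long) frame -- the projection that spherical
    VR renders are commonly stored in."""
    lon = math.atan2(x, z)                                 # -pi..pi, 0 = +Z (straight ahead)
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))  # -pi/2..pi/2
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead lands at the center of a 4096x2048 frame.
center = dir_to_equirect(0.0, 0.0, 1.0, 4096, 2048)
```

A stereoscopic renderer evaluates a mapping like this per pixel, but offsets the ray origin left or right of center to produce the two eyes’ views.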

Today, those developments have evolved into a robust pipeline that fuels The Molecule’s work for Conan O’Brien’s eponymous TBS talk show, The New York Times’s VR division and commercial work. “It’s basically eight or ten individual nodes inside Nuke that complete one step or another of the process,” says Healer. “Some of them overlap with Cara VR,” The Foundry’s recently launched VR plug-in for Nuke, “but all of it works really well for our artists. I talk to The Foundry from time to time and show them the tools, so there’s definitely an open conversation there about what we all need to move VR post forward.”

Collaborating with VR production companies like SuperSphere, Jaunt and Pixvana in Seattle, The Molecule is heading first where mass VR adoption seems likeliest. “The New York Times, for example, wants to have a presence at film festivals and new technology venues, and is trying to get out of the news-only business and into the entertainment-provider business. And the job for Conan was pretty wild — we had to create a one-off gag for Comic-Con that people would watch once and go away laughing to the next thing. It’s kind of a cool format.”

Healer’s team spent six weeks on the three-minute spot. “We had to shoot plates, model characters, animate them, composite it, build a game engine around it, compile it, get approval and iterate through that until we finished. We delivered 20 or so precise clips that fit into a game engine design, and I think it looks great.”

Healer says the VR content The Molecule is posting now is, like the Conan job, a slight variation on more typical recent VR productions. “I think that’s also what makes VR so exciting and challenging right now,” he says. “Everyone’s got a different idea about how to take it to the next level. And a lot of that is in anticipation of AR (augmented reality) and next-generation players/apps and headsets.

‘Conan’

“The Steam store,” the premier place online to find virtual content, “has content that supports multiple headsets, but not all of them.” He believes that will soon gel into a more unified device driver structure, “so that it’s just VR, not Oculus VR or Vive VR. Once you get basic head tracking together, then there’s the whole next thing: Do you have a controller of some kind, are you tracking in positional space, do you need to do room set up? Do we want wands or joysticks or hand gestures, or will keyboards do fine? What is the thing that wins? Those hurdles should solidify in the next year or two. The key factor in any of that is killer content.”

The biggest challenge facing his facility, and anyone doing VR post right now, he says, is keeping pace with changing resolutions and standards. “It used to be that 4K or 4K stereo was a good deliverable and that would work,” says Healer. “Now everything is 8K or 10K, because there’s this idea that we also have to future-proof content and prepare for next-gen headsets. You end up with a lot of new variables, like frame rate and resolution. We’re working on a stereo commercial right now, and just getting the footage of one shot converted from only six cameras takes almost 3TB of disk space, and that’s just the raw footage.”
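Healer’s 3TB figure is easy to sanity-check with back-of-the-envelope arithmetic. The capture settings below are assumptions chosen for illustration, not numbers quoted from the production:

```python
# Back-of-the-envelope raw footage size for a multi-camera stereo VR
# shoot. All capture settings here are illustrative assumptions.
width, height = 4096, 2160   # roughly 4K per camera
fps = 60
bits_per_pixel = 30          # 10-bit x 3 channels, uncompressed
cameras = 6
seconds = 4 * 60             # a hypothetical four-minute take

bytes_per_frame = width * height * bits_per_pixel // 8
total_bytes = bytes_per_frame * fps * seconds * cameras
total_tb = total_bytes / 1e12  # ~2.87 TB, in the "almost 3TB" ballpark
```

With those assumptions a single multi-camera take lands just under 3TB of raw data before any stitching or intermediate renders, which is consistent with the scale Healer describes.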

When every client suddenly wants to dip their toes into VR, how does a post facility respond? Healer thinks the onus is on production and post services to provide as many options as possible while using their expertise to blaze new paths. “It’s great that everyone wants to experiment in the space, and that poses a certain creative question for our field,” he says. “You have to seriously ask of every project now, does it really just need to be plain-old video? Or is there a game component or interactive component that involves video? We have to explore that. But that means you have to allocate more time in Unity (https://unity3d.com/) building out different concepts for how to present these stories.”

As the client projects get more creative, The Molecule is relying on traditional VFX processes like greenscreen, 3D tracking and shooting plates to solve VR-related problems. “These VFX techniques help us get around a lot of the production issues VR presents. If you’re shooting on a greenscreen, you don’t need a 360 lens, and that helps. You can shoot one person walking around on a stage and then just pan to follow them. That’s one piece of footage that you then composite into some other frame, as opposed to getting that person out there on the day, trying to get their performance right and then worrying about hiding all the other camera junk. Our expertise in VFX definitely gives us an advantage in VR post.”

From a post perspective, Healer still hopes most for new camera technology that would radically simplify the stitching process, allowing more time for concepting and innovative project development. “I just saw a prototype of a toric lens,” shaped like the donut-like torus that results from revolving a circle in three-dimensional space, “that films 360 minus a little patch, where the tripod is, in a single frame,” he says. “That would be huge for us. That would really change the workflow around, and while we’re doing a lot of CG stuff that has to be added to VR, stitching takes the most time. Obviously, I care most about post, but there are also lots of production issues around a new lens like that. You’d need a lot of light to make it work well.”

Local Hero Post
For longtime Scratch users Local Hero Post, in Santa Monica, the move to begin grading and compositing in Assimilate Scratch VR was a no-brainer. “We were one of the very first American companies to own a Scratch when it was $75,000 a license,” says founder and head of imaging Leandro Marini. “That was about 10 years ago and we’ve since done about 175 feature film DIs entirely in Scratch, and although we also now use a variety of tools, we still use it.”

Leandro Marini

Marini says he started seeing client demand for VR projects about two years ago, and he turned to Scratch VR. He says it allows users to do traditional post the way editors and colorists are used to — “with all the same DI tools that let you do complicated paint-outs, visual effects, 50-layer-deep color corrections and Power Windows, in realtime on a VR sphere.”

New Deal Studios’ 2015 Sundance film, Kaiju Fury, was an early project, “when Scratch VR was first really user-friendly and working in realtime.” Now Marini says their VR workflow is “pretty robust. [It’s] currently the only system that I know of that can work in VR in realtime in multiple ways,” which include an equirectangular projection that gives you a YouTube 360-type of feel, and an Oculus headset view.

“You can attach the headset, put the Oculus on and grade and do visual effects in the headset,” he says. “To me, that’s the crux: you really have to be able to work inside the headset if you are going to grade and do VR for real. The difference between seeing a 360 video on a computer screen and seeing it from within a headset and being able to move your head around is huge. Those headsets have wildly different colors than a computer screen.”

The facility’s — and likely the industry’s — highest profile and biggest budget project to date is Invisible, a new VR scripted miniseries directed by Doug Liman and created by 30 Ninjas, the VR company he founded with Julina Tatlock. Invisible premiered in October on Samsung VR and the Jaunt app and will roll out in coming months in VR theaters nationwide. Written by Dallas Buyers Club screenwriter Melisa Wallack and produced by Jaunt and Condé Nast Entertainment, it is billed as the first virtual reality action-adventure series of its kind.

‘Invisible’

“Working on that was a pretty magical experience,” says Marini. “Even the producers and Liman himself had never seen anything like being able to do the grade, VFX, composites and stereo fixes in 3D virtual reality, all with the headset on. That was our initial dilemma for this project, until we figured it out: do you make it look good for the headset, for the computer screen or for iPhones or Samsung phones? Everyone who worked on this understood that every VR project we do now is in anticipation of the future wave of VR headsets. All we knew was that about a third would probably see it on a Samsung Gear VR, another third would see it on a platform like YouTube 360 and the final third would see it on some other headset like Oculus Rift, HTC or Google’s new Daydream.”

How do you develop a grading workflow that fits all of the above? “This was a real tricky one,” admits Marini. “It’s a very dark and moody film, and Liman wanted to make a family drama thriller within that context. A lot of it is dark hallways and shadows and people in silhouette, and we had to sort of learn the language a bit.”

Marini and his team began by grading exclusively in the headset, but that was way too dark on computer monitors. “At the end of the day, we learned to dial it back a bit and make pretty conservative grades that worked on every platform, so that it looked good everywhere. The effect of the headset is that it’s a light shining right into your eyeball, so it just looks a lot brighter. It had to still look moody inside the headset in a dark room, but not so moody that it vanishes on a laptop in a bright room. It was a balancing act.”

Local Hero

Local Hero also had to figure out how to juggle the new VR work with its regular DI workload. “We had to break off the VR services into a separate bay and room that is completely dedicated to it,” he explains. “We had to slice it off from the main pipeline because it needs around-the-clock custom attention. Very quickly we realized we needed to quarantine this workflow. One of our colorists here has become a VR expert, and he’s now the only one allowed to grade those projects.” The facility upgraded to a Silverdraft Demon workstation with specialized storage to meet the exponential demand for processing power and disk space.

Marini says Invisible, like the other VR work Local Hero has done, is in essence a research project in these early days of immersive content. “There is no standard color space or headset or camera. And we’re still in the prototype phase of this. While we are in this phase, everything is an experiment. The experience of being in 3D space is interesting, but the quality of what you’re watching is still very, very low resolution. The color fidelity relative to what we’re used to in the theater and on 4K HDR televisions is like 1980s VHS quality. We’re still very far away from truly excellent VR.”

Scratch VR workflows in Invisible included a variety of complicated processes. “We did things like dimensionalizing 2D shots,” says Marini. “That’s complicated stuff. In 3D with the headset on, we would take a shot that was in 2D, draw a rough roto mask around the person, create a 3D field, pull their nose forward, push their eyes back, push the sky back — all in a matter of seconds. That is next-level stuff for VR post.”
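The “dimensionalizing” Marini describes amounts to displacing pixels by a per-pixel depth field to synthesize parallax for each eye. A crude, hypothetical numpy sketch of the idea (Scratch VR’s actual implementation is not public):

```python
import numpy as np

def shift_by_depth(image, depth, eye_offset):
    """Synthesize one eye's view by shifting pixels horizontally in
    proportion to a per-pixel depth (disparity) map. This is a crude
    forward warp: production tools also fill the holes it leaves."""
    h, w = depth.shape
    out = np.zeros_like(image)
    xs = np.arange(w)
    for y in range(h):
        # Nearer pixels (larger depth values here) shift further.
        targets = np.clip(xs + (eye_offset * depth[y]).astype(int), 0, w - 1)
        out[y, targets] = image[y, xs]
    return out

img = np.array([[10, 20, 30, 40]])
flat = np.zeros((1, 4))              # zero depth everywhere -> no parallax
left = shift_by_depth(img, flat, 2)  # identical to img
```

In practice a roto mask like the one Marini mentions would define where the depth field is non-zero, so only the subject gains parallax while the background stays put.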

Local Hero also used Scratch Web for reviews. “Moments after we finished a shot or sequence it was online and someone could put on a headset and watch it. That was hugely helpful. Doug was in London, Condé Nast in New York. Lexus was a sponsor of this, so their agency in New York was also involved. Jaunt is down the street from us here in Santa Monica. And there were three clients in the bay with us at all times.”

‘Invisible’

As such, there is no way to standardize a VR DI workflow, he says. “For Invisible, it was definitely all hands on deck, and every day was a new challenge. It was 4K 60p stereo, so the amount of data we had to push — 4K 60p to both eyes — was unprecedented.” Strange stereo artifacts would appear for no apparent reason. “A bulge would suddenly show up on a wall and we’d have to go in there and figure out why and fix it. Do we warp it? Try something else? It was like that throughout the entire project: invent the workflow every day and fudge your way through. But that’s the nature of experimental technology.”

Will there be a watershed VR moment in the year ahead? “I think it all depends on the headsets, which are going to be like mobile phones,” he says. “Every six months there will be a new group of them that will be better and more powerful with higher resolution. I don’t think there will be a point in the future when everyone has a self-contained high-end headset. I think the more affordable headsets that you put your phone into, like Gear VR and Daydream, are the way most people will begin to experience VR. And we’re only 20 percent of the way there now. The whole idea of VR narrative content is completely unknown and it remains to be seen if audiences care and want it and will clamor for it. When they do, then we’ll develop a healthy VR content industry in Hollywood.”


Beth Marchant has been covering the production and post industry for 21 years. She was the founding editor-in-chief of Studio/monthly magazine and the co-editor of StudioDaily.com. She continues to write about the industry.

VFX Storage: The Molecule

Evolving to a virtual private local cloud?

By Beth Marchant

VFX artists, supervisors and technologists have long been on the cutting edge of evolving post workflows. The networks built to move, manage, iterate, render and put every pixel into one breathtaking final place are the real superheroes here, and as New York’s The Molecule expands to meet the rising demand for prime-time visual effects, it pulls even more power from its evolving storage pipeline in and out of the cloud.

The Molecule CEO/CTO Chris Healer has a fondness for unusual workarounds. While studying film in college, he built a 16mm projector out of Legos and wrote a 3D graphics library for DOS. In his professional life, he swiftly transitioned from Web design to motion capture and 3D animation. He still wears many hats at his now bicoastal VFX and VR facility, The Molecule — which he founded in New York in 2005 — including CEO, CTO, VFX supervisor, designer, software developer and scientist. In those intersecting capacities, Healer has created the company’s renderfarm, developed and automated its workflow, linking and preview tools, and designed and built out its cloud-based compositing pipeline.

When the original New York office went into growth mode, Healer (pictured at his new, under-construction facility) turned to GPL Technologies, a VFX and post-focused digital media pipeline and data infrastructure developer, to help him build an entirely new network foundation for the new location the company will move to later this summer. “Up to this point, we’ve had the same system and we’ve asked GPL to come in and help us create a new one from scratch,” he says. “But any time you hire anyone to help with this kind of thing, you’ve really got to do your own research and figure out what makes sense for your artists, your workflows and, ultimately, your bottom line.”

The new facility will start with 65 seats and expand to more than 100 within the next 12 to 18 months. Current clients include the major networks, Showtime, HBO, AMC, Netflix and director/producer Doug Liman.

Netflix’s Unbreakable Kimmy Schmidt is just one of the shows The Molecule works on.

Healer’s experience as an artist, developer, supervisor and business owner has given him a seasoned perspective on how to develop VFX pipeline work. “There’s a huge disparity between what the conventional user wants to do, i.e. share data, and the much longer dialog you need to have to build a network. Connecting and sharing data is really just the beginning of a very long story that involves so many other factors: how many things are you connecting to? What type of connection do you have? How far away are you from what you’re connecting to? How much data are you moving, and is it all at once or a continuous stream? Users are so different, too.”

Complicating these questions, he says, is a facility’s willingness to embrace new technology before it’s been vetted in the market. “I generally resist the newest technologies,” he says. “My instinct is that I would prefer an older system that’s been tested for years upon years. You go to NAB and see all kinds of cool stuff that appears to be working the way it should. But it hasn’t been tried in different kinds of circumstances, or it’s being pitched to the broadcast industry and may not work well for VFX.”

Making a Choice
He was convinced by EMC’s Isilon system based on customer feedback, and the hardware has already been delivered to the new office. “We won’t install it until construction is complete, but all the documentation is pointing in the right direction,” he says. “Still, it’s a bit of a risk until we get it up and running.”

Last October, Dell announced it would acquire EMC in a deal that is set to close in mid-July. That should suit The Molecule just fine — most of its artists’ computers are either Dell or HP running Nvidia graphics.

A traditional NAS configuration on a single GigE line can only do up to 100MB per second. “A 10GigE connection running in NFS can, theoretically, do 10 times that,” says Healer. “But 10GigE works slightly differently, like an LA freeway, where you don’t change the speed limit but you change the number of lanes and the on and off ramp lights to keep the traffic flowing. It’s not just a bigger gun for a bigger job, but more complexity in the whole system. Isilon seems to do that very well, and it’s why we chose them.”
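As a rough illustration of the arithmetic behind those numbers, here is a back-of-envelope sketch. The 100MB/s and 10TB/day figures come from the article; the rest is simple assumption (decimal units, sustained throughput, no protocol overhead):

```python
# Back-of-envelope transfer-time check (illustrative only, decimal units).
GIGE_MBPS = 100                    # ~100 MB/s usable on a single GigE line
TEN_GIGE_MBPS = 10 * GIGE_MBPS     # theoretical 10x on 10GigE

daily_sync_tb = 10                          # the ~10TB/day Healer cites
daily_sync_mb = daily_sync_tb * 1_000_000   # TB -> MB

hours_on_gige = daily_sync_mb / GIGE_MBPS / 3600
hours_on_10gige = daily_sync_mb / TEN_GIGE_MBPS / 3600

print(f"GigE:   {hours_on_gige:.1f} hours")    # ~27.8 hours, more than a day
print(f"10GigE: {hours_on_10gige:.1f} hours")  # ~2.8 hours
```

In other words, a single GigE line cannot even move a day’s worth of data in a day, which is why the jump to 10GigE (and the traffic-shaping complexity that comes with it) matters.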

His company’s fast growth, Healer says, has “presented a lot of philosophical questions about disk and RAID redundancy, for example. If you lose a disk in RAID-5 you’re OK, but if two fail, you’re screwed. Clustered file systems like GlusterFS and OneFS, which Isilon uses, have a lot more redundancy built in so you could lose quite a lot of disks and still be fine. If your number is up and on that unlucky day you lost six disks, then you would have backup. But that still doesn’t answer what happens if you have a fire in your office or, more likely, there’s a fire elsewhere in the building and it causes the sprinklers to go off. Suddenly, the need for off-site storage is very important for us, so that’s where we are pushing into next.”
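The redundancy point can be sketched in a few lines. This is a toy model, not Isilon’s actual OneFS implementation: it only captures the idea that a scheme with M units of redundancy survives up to M simultaneous disk failures, while RAID-5’s single parity disk survives only one:

```python
# Toy model of failure tolerance (not any vendor's real implementation).
def survives(failed_disks: int, parity_disks: int) -> bool:
    """Data survives as long as failures do not exceed the redundancy count."""
    return failed_disks <= parity_disks

# RAID-5 keeps one parity disk: one failure is fine, two means data loss.
assert survives(1, parity_disks=1)
assert not survives(2, parity_disks=1)

# A clustered file system with higher (N+M style) protection rides out more.
assert survives(4, parity_disks=4)
```

Off-site replication addresses the separate failure mode Healer raises: no amount of parity inside one machine room helps when the sprinklers go off above it.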

Healer homed in on several criteria to help him determine the right path. “The solutions we looked at had to have the following: DR, or disaster recovery, replication, scalability, off-site storage, undelete and versioning snapshots. And they don’t exactly overlap. I talked to a guy just the other day at Rsync.net, which does cloud storage of off-site backups (not to be confused with the Unix command, though they are related). That’s the direction we’re headed. But VFX is just such a hard fit for any of these new data centers because they don’t want to accept and sync 10TB of data per day.”

A rendering of The Molecule NYC’s new location.

His current goal is simply to sync material between the two offices. “The holy grail of that scenario is that neither office has the definitive master copy of the material and there is a floating cloud copy somewhere out there that both offices are drawing from,” he says. “There’s a process out there called ‘sharding,’ as in a shard of glass, that MongoDB and Scality and other systems use that says that the data is out there everywhere but is physically diverse. It’s local but local against synchronization of its partners. This makes sense, but not if you’re moving terabytes.”
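A minimal sketch of the hash-based sharding idea Healer describes: each file is assigned to some node in the cluster by a stable hash of its key, so data ends up spread everywhere yet every site computes the same placement. The node names and file path here are invented for illustration; this is not MongoDB’s or Scality’s actual API:

```python
import hashlib

# Hypothetical cluster nodes spread across two offices (illustrative names).
NODES = ["nyc-01", "nyc-02", "la-01", "la-02"]

def shard_for(key: str, nodes=NODES) -> str:
    """Map a file key to one node via a stable hash, so placement is
    deterministic: any office computes the same answer for the same file."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Both offices agree on where this (made-up) comp render lives.
print(shard_for("the_affair/ep07/shot_0420/comp_v012.exr"))
```

Real sharded stores layer replication on top of this, so that losing one node, or one office, does not lose the only copy.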

The model Healer is hoping to implement is to “basically offshore the whole company,” he says. “We’ve been working for the past few months with a New York metro startup called Packet, which has a really unique concept of a virtual private local cloud. It’s a mouthful, but it’s where we need to be.” If The Molecule is doing work in New York City, Healer points out, Packet is close enough that network transmissions are fast enough and “it’s as if the machines were on our local network, which is amazing. It’s huge. If the Amazon cloud data center is 500 miles away from your office, that drastically changes how well you can treat those machines as if they are local. I really like this movement of virtual private local that says, ‘We’re close by, we’re very secure and we have more capacity than individual facilities could ever want.’ But they are off-site, and the multiple other companies that use them are in their own discrete containers that never cross. Plus, you pay per use — basically per hour and per resource. In my ideal future world, we would have some rendering capacity in our office, some other rendering capacity at Packet and off-site storage at Rsync.net. If that works out, we could potentially virtualize the whole workflow and join our New York and LA offices and any other satellite office we want to set up in the future.”

The VFX market, especially in New York, has certainly come into its own in recent years. “It’s great to be in an era when nearly every single frame of every single shot of both television and film is touched in some way by visual effects, and budgets are climbing back and the tax credits have brought a lot more VFX artists, companies and projects to town,” Healer says. “But we’re also heading toward a time when the actual brick-and-mortar space of an office may not be as critical as it is now, and that would be a huge boon for the visual effects industry and the resources we provide.”

The Molecule: VFX for ‘The Affair’ and so much more

By Randi Altman

Luke DiTommaso, co-founder of New York City’s The Molecule, recalls “humble” beginnings when he thinks about the visual effects, motion graphics and VR studio’s launch as a small compositing shop. When The Molecule opened in 2005, New York’s production landscape was quite a bit different than the tax-incentive-driven hotbed that exists today.

“Rescue Me was our big break,” explains DiTommaso. “That show was the very beginning of this wave of production that started happening in New York. Then we got Damages and Royal Pains, but we were still just starting to get our feet wet with real productions.”

The Molecule partners (L-R) Andrew Bly, Chris Healer and Luke DiTommaso.

Then, thanks to a healthy boost from New York’s production and post tax incentives, things exploded, and The Molecule was at the right place at the right time. They had an established infrastructure, talent and experience providing VFX for television series.

Since then DiTommaso and his partners Chris Healer and Andrew Bly have seen the company grow considerably, doing everything from shooting and editing to creating VFX and animation, all under one roof. With 35 full-time employees spread between their New York and LA offices — oh, yeah, they opened an office in LA! — they also average 30 freelance artists a day, but can seat 65 if needed.

While some of these artists work on commercials, many are called on to create visual effects for an impressive list of shows, including Netflix’s Unbreakable Kimmy Schmidt, House of Cards and Bloodline, Showtime’s The Affair, HBO’s Ballers (pictured below), FX’s The Americans, CBS’ Elementary and Limitless, VH1’s The Breaks, Hulu’s The Path (for NBC and starring Aaron Paul) and the final season of USA’s Royal Pains. Also completed are the miniseries Madoff and Behind the Magic, a special on Snow White, for ABC.

Before and after shots from HBO’s Ballers.

The Molecule’s reach goes beyond the small screen. In addition to having completed a few shots for Zoolander 2 and a big one involving a digital crowd for Barbershop 3, at the time of this interview the studio was gearing up for Jodie Foster’s Money Monster; they will be supplying titles, the trailer and a ton of visual effects.

There is so much for us to cover, but just not enough time, so for this article we are going to dig into The Molecule’s bread and butter: visual effects for TV series. In particular, the work they provided for Showtime’s The Affair, which had its season finale just a few weeks ago.

The Affair
Viewers of The Affair, a story of love, divorce and despair, might be surprised to know that each episode averages between 50 to 70 visual effects shots. The Molecule has provided shots that range from simple clean-ups to greenscreen driving and window shots — “We’ll shoot the plates and then composite a view of midtown Manhattan or Montauk Highway outside the car window scene,” says DiTommaso — to set extensions, location changes and digital fire and rain.

One big shot for this past season was burning down a cabin during a hurricane. “They had a burn stage so they could capture an amount of practical fire on a stage, but we enhanced that, adding more fire to increase the feeling of peril. The scene then cuts to a wide shot showing the location, which is meant to be on the beach in Montauk during a raging hurricane. We went out to the beach and shot the house day for night — we had flicker lighting on the location so the dunes and surrounding grass got a sort of flickering light effect. Later on, we shot the stage from a similar angle and inserted the burning stage footage into the exterior wide location footage, and then added a hurricane on top of all of that. That was a fun challenge.”

During that same hurricane, the lead character Noah gets his car stuck in the mud but they weren’t able to get the tires to spin practically, so The Molecule got the call. “The tires are spinning in liquid so it’s supposed to kick up a bunch of mud and water and stuff while rain is coming down on top of it, so we had our CG department create that in the computer.”

Another scene featuring a good amount of VFX took place on the patio outside of the fictitious Lobster Roll restaurant. “It was shot in Montauk in October and it wasn’t supposed to be cold in the scene, but it was about 30 degrees at 2:00am and Alison is in a dress. They just couldn’t shoot it there because it was too cold. We shot plates, basically, of the location, without actors. Later we recreated that patio area and lined up the lighting and the angle and basically took the stage footage and inserted it into the location footage. We were able to provide a solution so they could tell the story without having the actors’ breath and their noses all red and shivering.”

Before and after shots of the Lobster Roll patio.

Being on Set
While on-set VFX supervision is incredibly important, DiTommaso would argue “by the time you’re on set you’re managing decisions that have already been set into motion earlier in the process. The most important decisions are made on the tech scouts and in the production/VFX meetings.”

He offers up an example: “I was on a tech scout yesterday. They have a scene where a woman is supposed to walk onto a frozen lake and the ice starts to crack. They were going to build an elaborate catwalk into the water. I was like, ‘Whoa, aren’t we basically replacing the whole ground with ice? Then why does she need to be over water? Why don’t we find a lake that has a flat grassy area leading up to it?’ Now they’re building a much simpler catwalk — imagine an eight-foot-wide little platform. She’ll walk out on that with some blue screens and then we’ll extend the ice and dress the rest of the location with snow.”

According to DiTommaso being there at the start saved a huge amount of time, money and effort. “By the time you’re on set they would have already built it into the water and all that stuff.”

But, he says, being on set for the shoot is also very important because you never know what might happen. “A problem will arise and the whole crew kind of turns and looks at you like, ‘You can fix this, right?’ Then we have to say, ‘Yeah. We’re going to shoot this plate. We’re going to get a clean plate, get the actors out, then put them back in.’ Whatever it is; you have to improvise sometimes. Hopefully that’s a rare instance and that varies from crew to crew. Some crews are very meticulous and others are more freewheeling.”

Tools
The Molecule is shooting more and more of their own plates these days, so they recently invested in a Ricoh Theta S camera for shooting 360-degree HDR. “It has some limitations, but it’s perfect for CG HDRs,” explains DiTommaso. “It gives you a full 360-degree dome, instantly, and it’s tiny like a cell phone or a remote. We also have a Blackmagic 4K Cinema camera that we’ll shoot plates with. There are pros and cons to it, but I like the latitude and the simplicity of it. We use it for a quick run and gun to grab an element. If we need a blood spurt, we’ll set that up in the conference room and we’ll shoot a plate.”

The Molecule added Jon Hamm’s head to this scene for Unbreakable Kimmy Schmidt.

They call on a Canon 7D for stills. “We have a little VFX kit with little LED tracking points and charts that we bring with us on set. Then back at the shop we’re using Nuke to composite. Our CG department has been doing more and more stuff. We just submitted an airplane — a lot of vehicles, trains, planes and automobiles are created in Maya.”

They use Side Effects Houdini for simulations, like fire and rain; for rendering they call on Arnold, and crowds are created in Massive.

What’s Next?
Not ones to sit on the sidelines, The Molecule recently provided post on a few VR projects, but their interest doesn’t end there. Chris Healer is currently developing a single-lens VR camera rig that DiTommaso describes as essentially “VR in a box.”