Category Archives: VFX

postPerspective Impact Award winners from SIGGRAPH 2017

Last April, postPerspective announced the debut of our Impact Awards, celebrating innovative products and technologies for the post production and production industries that will influence the way people work. We are now happy to present our second set of Impact Awards, honoring the outstanding offerings presented at SIGGRAPH 2017.

Now that the show is over, and our panel of VFX/VR/post pro judges has had time to decompress, dig out and think about what impressed them, we are happy to announce our honorees.

And the winners of the postPerspective Impact Awards from SIGGRAPH 2017 are:

  • Faceware Technologies for Faceware Live 2.5
  • Maxon for Cinema 4D R19
  • Nvidia for OptiX 5.0  

“All three of these technologies are very worthy recipients of our first postPerspective Impact Awards from SIGGRAPH,” said Randi Altman, postPerspective’s founder and editor-in-chief. “These awards celebrate companies that define the leading-edge of technology while producing tools that actually make users’ working lives easier and projects better, and our winners certainly fall into that category.

“While SIGGRAPH’s focus is on VFX, animation, VR/AR and the like, the types of gear they have on display vary. Some are suited for graphics and animation, while others have uses that slide into post production. We’ve tapped real-world users in these areas to vote for our Impact Awards, and they have determined what tools might be most impactful to their day-to-day work. That’s what makes our awards so special.”

There were many new technologies and products at SIGGRAPH this year, and while only three won an Impact Award, our judges felt there were other updates people should know about as well.

Blackmagic Design’s Fusion 9 was certainly turning heads, and Nvidia’s VRWorks 360 Video was called out as well. Chaos Group also caught our judges’ attention with V-Ray for Unreal Engine 4.

Stay tuned for future Impact Award winners in the coming months — voted on by users for users — from IBC.

WAR FOR THE PLANET OF THE APES

Editor William Hoy — working on VFX-intensive War for the Planet of the Apes

By Mel Lambert

For William Hoy, ACE, story and character come first. He also likes to use visual effects “to help achieve that idea.” This veteran film editor points to director Alex Proyas’ VFX-heavy I, Robot, director Matt Reeves’ 2014 Dawn of the Planet of the Apes and his new installment, War for the Planet of the Apes, as “excellent examples of this tenet.”

War for the Planet of the Apes, the final part of the current reboot trilogy, follows a band of apes and their leader as they are forced into a deadly conflict with a rogue paramilitary faction known as Alpha-Omega. After the apes suffer unimaginable losses, their leader begins a quest to avenge his kind, setting up an epic battle that will determine the fate of both species and the future of the planet.

Marking the picture editor’s second collaboration with Reeves, Hoy recalls that he initially secured an interview with the director through industry associates. “Matt and I hit it off immediately. We liked each other,” Hoy recalls. “Dawn of the Planet of the Apes had a very short schedule for such a complicated film, and Matt had his own ideas about the script — particularly how the narrative ended. He was adamant that he ‘start over’ when he joined the film project.

“The previous Dawn script, for example, had [the lead ape character] Caesar and his followers gaining intelligence and driving motorized vehicles,” Hoy says. “Matt wanted the action to be incremental which, it turned out, was okay with the studio. But a re-written script meant that we had a very tight shoot and post schedule. Swapping its release date with X-Men: Days of Future Past gave us an additional four or five weeks, which was a huge advantage.”

William Hoy, ACE (left), Matt Reeves (right).

Such a close working relationship on Dawn of the Planet of the Apes meant that Hoy came to the third installment in the current trilogy with a good understanding of the way that Reeves likes to work. “He has his own way of editing from the dailies, so I can see what we will need in the rough cut as the filmed drama is unfolding. We keep different versions in Avid Media Composer, with trusted performances and characters, and can see where they are going” with the narrative. Stan Salfas, ACE, who has worked with Reeves over the past two decades, served as co-editor on the project, joining prior to the Director’s Cut.

A member of the Academy of Motion Picture Arts and Sciences, Hoy also worked with director Randall Wallace on We Were Soldiers and The Man in the Iron Mask, with director Phillip Noyce on The Bone Collector and with director Zack Snyder on Watchmen, a film “filled with emotional complexity and heavy with visual effects,” he says.

An Evolutionary Editing Process
“Working scene-by-scene with motion capture images and background artwork laid onto the Avid timeline, I can show Matt my point of view,” explains Hoy. “We fill in as we go — it’s an evolutionary process. I will add music and some sound effects for that first cut so we can view it objectively. We ask, ‘Is it working?’ We swap around ideas and refine the look. This is a film that we could definitely not have cut on film; there are simply too many layers as the characters move through these varied backgrounds. And with the various actors in motion capture suits giving us dramatic performances, with full face movements [CGI-developed facial animation], I can see how they are interacting.”

To oversee the dailies on location, Hoy set up a Media Composer editing system in Vancouver, close to the film locations used for principal photography. “War for the Planet of the Apes was shot on Arri Alexa 65 digital cameras that deliver 6K images,” the editor recalls. “These files were down-sampled to 4K and delivered to Weta Digital [in New Zealand] as source material, where they were further down-sampled to 2K for CGI work and then up-sampled back to 4K for the final release. I also converted our camera masters to 2K DNxHD 32/36 for editing color-timed dailies within my Avid workstation.”

In terms of overall philosophy, “we did not want to give away Caesar’s eventual demise. From the script, I determined that the key arc was the unfolding mystery of ‘What is going on?’ And ‘Where will it take us?’ We hid that Caesar [played by Andy Serkis] is shot with an arrow, and initially just showed the blood on the hand of the orangutan, Maurice [Karin Konoval]; we had to decide how to hide that until the key moment.”

Because of the large number of effect-heavy films that Hoy has worked on, he is considered an action/visual effects editor. “But I am drawn to performances of actors and their characters,” he stresses. “If I’m not invested in their fate, I cannot be involved in the action. I like to bring an emotional value to the characters, and visualize battle scenes. In that respect Matt and I are very much in tune. He doesn’t hide his emotion as we work out a lot of the moves in the editing room.”

For example, in Dawn of the Planet of the Apes, Koba, a human-hating bonobo who led a failed coup against Caesar, leads the apes against the human population. “It was unsatisfying that the apes would be killing humans while the humans were killing apes. Instead, I concentrated on the POV of Caesar’s oldest son, Blue Eyes. We see the events through his eyes, which changed the overall idea of the battle. We shot some additional material but most of the scene — probably 75% — existed; we also spoke with the FX house about the new CGI material,” which involved re-imaged action of horses and backgrounds within the virtual sets that were fashioned by Weta Digital.

Hoy utilized VFX tools on various layers within his Media Composer sessions that carried the motion capture images, plus the 3D channels, in addition to different backgrounds. “Sometimes we could use one background version and other times we might need to look around for a new perspective,” Hoy says. “It was a trial-and-error process, but Matt was very receptive to that way of working; it was very collaborative.”

Twentieth Century Fox’s War for the Planet of the Apes.

Developing CGI Requests for Key Scenes
By working closely with Weta Digital, the editor could develop new CGI requests for key scenes and then have them rendered as necessary. “We worked with the post-viz team to define exactly what we needed from a scene — maybe to put a horse into a blizzard, for example. Ryan Stafford, the film’s co-producer and visual effects producer, was our liaison with the CGI team. On some scenes I might have as many as a dozen or more separate layers in the Avid, including Caesar, rendered backgrounds, apes in the background, plus other actors in middle and front layers” that could be moved within the frame. “We had many degrees of freedom so that Matt and I could develop alternate ideas while still preserving the actors’ performances. That way of working could be problematic if you have a director who couldn’t make up his mind; happily, Matt is not that way!”

Hoy cites one complex scene that needed to be revised dramatically. “There is a segment in which Bad Ape [an intelligent chimpanzee who lived in the Sierra Zoo before the Simian Flu pandemic] is seen in front of a hearth. That scene was shot twice because Matt did not consider it frenetic enough. The team returned to the motion capture stage and re-shot the scene [with actor Steve Zahn]. That allowed us to start over again with new, more frantic physical performances against resized backgrounds. We drove the downstream activities – asking Weta to add more snow in another scene, for example, or maybe bring Bad Ape forward in the frame so that we can see him more clearly. Weta was amazing during that collaborative process, with great input.”

The editor also received a number of sound files for use within his Avid workstation. “In the beginning, I used some library effects and some guide music — mostly cues from composer Michael Giacchino’s Dawn score, supplied by music editor Paul Apelgren. Later, when the picture was in one piece, I received some early sketches from the sound design team. For the Director’s Cut we had a rough cut with no CGI from Weta Digital. But when we received more sound design, I would create temp mixes on the Avid, with a 5.1-channel mix for the sound-editorial team using maybe 24 tracks of effects, dialog and music elements. It was a huge session, but Media Composer is versatile. After turning over that mix to Will Files, the film’s sound designer, supervising sound editor and co-mixer, I was present with Matt on the re-recording stage for maybe six weeks of the final mix as the last VFX elements came in. We were down to the wire!”

Hoy readily concedes that while he loves to work with new directors — “and share their point of view” — returning to a director with whom he has collaborated previously is a rewarding experience. “You develop a friendly liaison because it becomes easier once you understand the ways in which a director works. But I do like to be challenged with new ideas and new experiences.” He may get to work again with Reeves on the director’s next outing, The Batman, “but since Matt is still writing the screenplay, time will tell!”


Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA. He is also a long-time member of the UK’s National Union of Journalists.

 


Pixomondo streamlines compute management with Thinkbox’s Deadline

There’s never a dull moment at Pixomondo, where artists and production teams juggle feature film, TV, theme park and commercial VFX projects between offices in Toronto, Vancouver, Los Angeles, Frankfurt, Stuttgart, Shanghai and Beijing. The Academy and Emmy Award-winning VFX studio securely manages its on-premises compute resources across its branches and keeps its rendering pipeline running 24/7 using Thinkbox’s Deadline, which it standardized on in 2010.

In recent months, Pixomondo has increasingly been computing workstation tasks on its render farm via Deadline and has moved publishing to Deadline as well. Sebastian Kral, Pixomondo’s global head of pipeline, says, “By offloading more to Deadline, we’re able to accelerate production. Our artists don’t have to wait for publishes to finish before they move onto the next task, and that’s really something. Deadline’s security is top-notch, which is extremely important for us given the secretive nature of some of our projects.”

Kral is particularly fond of Deadline’s Python API, which allows his global team to develop custom scripts to minimize the minutia that artists must deal with, resulting in a productivity boon. “Deadline gives us incredible flexibility. The Python API is fast, reliable and more usable than a command line entry point, so we can script so many things on our own, which is convenient,” says Kral. “We can build submission scripts for texture conversions, and create proxy data when a render job is done, so our artists don’t have to think about whether or not they need a QT of a composite.”
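As a rough illustration of the kind of submission scripting Kral describes, the sketch below uses Deadline’s standalone Python API (DeadlineCon) to queue a texture-conversion job on the farm. This is a minimal, hypothetical example rather than Pixomondo’s actual pipeline code; the host name, plug-in choice, pool, paths and job attributes are all placeholders.

```python
# Minimal sketch: submitting a texture-conversion job via the Deadline
# Standalone Python API. Host, plug-in, pool and paths are placeholders.
from Deadline.DeadlineConnect import DeadlineCon

# Connect to the Deadline Web Service (host and port are site-specific).
con = DeadlineCon("deadline-webservice", 8082)

job_info = {
    "Name": "texture_convert_base_color",
    "Plugin": "CommandLine",   # wrap an arbitrary converter as a farm job
    "Pool": "texture",
    "Priority": 50,
}
plugin_info = {
    "Executable": "/usr/local/bin/maketx",
    "Arguments": "/jobs/show/tex/base_color.tif -o /jobs/show/tex/base_color.tx",
}

# SubmitJob returns the new job's properties once it is queued on the farm.
new_job = con.Jobs.SubmitJob(job_info, plugin_info)
print("Submitted:", new_job)
```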

Power Rangers. Images courtesy of Pixomondo.

The ability to set environment variables for renders, or render as a specific user, allows Pixomondo’s artists to send tasks to the farm with an added layer of security. With seven facilities worldwide, and the possibility of new locations based on production needs, Pixomondo has also found Deadline’s ability to enable multi-facility rendering valuable.

“Deadline is packed with a ton of great out-of-the-box features, in addition to the new features that Thinkbox implements in new iterations; we didn’t even need to build our own submission tool, because Deadline’s submission capabilities are so versatile,” Kral notes. “It also has a very user-friendly interface that makes setup quick and painless, which is great for getting new hires up to speed quickly and connecting machines across facilities.”

Pixomondo’s more than 400 digital artists are productive around the clock, taking advantage of alternating time zones at facilities around the world. Nearly every rendering decision at the studio is made with Deadline in mind, as it presents rendering metrics in an intuitive way that allows the team to more accurately estimate project turnaround. “When opening Deadline to monitor a render, it’s always an enjoyable experience because all the information I need is right there at my fingertips,” says Kral. “It provides a meaningful overview of our rendering resource spread. We just log in, test renders, and we have all the information needed to determine how long each task will take using the available machines.”


Benji Davidson joins Brickyard VFX

Brickyard VFX in Santa Monica has added VFX supervisor Benji Davidson to its staff. Davidson’s experience includes live-action directing, creative direction and VFX supervision. He joins Brickyard from MPC, where he had served as a VFX supervisor since 2008.

In his career, Davidson has worked as an on-set VFX supervisor, lead 2D artist and director, among other things. Born and raised in England, he got his start at acclaimed commercial production company HKM/The Directors Bureau before joining MPC LA.

His notable projects include Activision’s Call of Duty: Black Ops III Seize Glory, EA Sports’ Madden: The Movie for Madden NFL 16, Samsung’s Do What You Can’t and Super Bowl spots for Coca-Cola and Acura. He also contributed to the Los Angeles Olympics bid.

“I think in today’s climate, with rapid turnarounds, there is a benefit to being involved early on,” says Davidson. “I aim to help the director and agency, not hinder them. I’ve enjoyed being on set; sometimes that’s the only place everybody is together, which is invaluable when you get to post. You already know the expectations. There’s a secret shorthand from that shared experience.”

Brickyard is a digital production studio working in high-end visual effects, animation, design and creative development for advertising, feature films and emerging media.


Behind the Title: Union VFX supervisor James Roberts

NAME: James Roberts

COMPANY: London-based Union (@unionvfx)

CAN YOU DESCRIBE YOUR COMPANY?
Union is an independent VFX company founded on a culture of originality, innovation and collaboration.

WHAT’S YOUR JOB TITLE?
VFX Supervisor

WHAT DOES THAT ENTAIL?
Overseeing the VFX for feature films from concept to delivery. This includes concept development, on-set photography and supervision of artists.

WHAT WOULD SURPRISE PEOPLE ABOUT WHAT FALLS UNDER THAT TITLE?
I sometimes get to be an actor.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Working with creative artists both on set and in the studio to develop original artwork.

WHAT’S YOUR LEAST FAVORITE?
Answering emails.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
1am

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Professional dog walker

WHY DID YOU CHOOSE THIS PROFESSION?
My mother was an artist and my father was a computer programmer… I didn’t have many other options.

My Cousin Rachel

CAN YOU NAME SOME RECENT PROJECTS?
T2 Trainspotting and My Cousin Rachel.

WHAT PROJECT ARE YOU MOST PROUD OF?
’71 and The Theory of Everything.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Headphones, Side Effects Houdini and light bulbs.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram — I’m @jjjjjjames

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Yes… anything and everything.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I spend time away from work with nice people.


MTV International’s Flanker Channels get graphic rebrand

LA-based animation and design studio Laundry has rebranded MTV International’s Flanker Channels, seven music-themed channels that broadcast in international markets and complement the MTV flagship channel. They worked closely with MTV World Creative Studio, the network’s international creative unit. The new brand identity is now on-air and online.

The MTV Flanker Channels offer viewers a wide variety of choices across seven different subsets of programming: Live, Hits, Classic, Rock, Music, Dance and Base. While the new branding package has a unified look, each channel’s theme is tailored for that type of music. Within the package there is a series of genre-inspired “party animal” characters that dance, shake and move to the DNA of each channel.

“We were faced with the challenge of finding a conceptual and visual thread that connected everything,” says Maximiliano Borrego, creative director at MTV World Creative Studio. “Something unique and identifiable across the channels that would, above all, entertain our audience. It was a big visual creative puzzle.”

“Adhering to MTV’s ‘Kill Boring’ mantra was a welcome license for us to make bold, creative choices that the network can own,” says PJ Richardson, partner/executive creative director of Laundry. “All seven Flanker identities reveal something distinct and unexpected, yet holistically fit within the larger brand ecosystem of the MTV family of channels.”

Laundry developed a graphics system for the rebrand based on “Wireframe + Skin,” MTV’s visual framework for branding. This conceptual and modular design approach dictated how they composed and arranged graphic content to interact. Assets included IDs, bumpers, key art, on-screen graphics, end boards, background animations, invaders (loopable animated elements), 3D logos (on-air and online), container boxes and crawls for each Flanker Channel.

They called on Maxon Cinema 4D and Adobe’s Creative Suite.

“We pictured MTV as a virtual reality planet where each sub-channel is a genre-specific continent — inhabited by party animals,” says Anthony Liu, partner/executive creative director of Laundry. “They’re the perfect visual metaphor for the diverse music genres and fans of the world; different in their influence and location, but the same in their fandom and human spirit.”

The party animals are 3D characters rendered to look graphic. Each one distantly references a real animal representing the music styles of the specific channel: an eel reflects the smoothness of electronic music like a glow stick, and a crab with a speaker-like shell is a nod to Jamaican dance-party vans. The creatures were designed to provide a lot of latitude across different moments in animation. For MTV Rocks, a 24-hour alternative music channel, Laundry built a frenetic mosh pit-inspired character made of drumsticks and guitar picks. While the animation is not specific to any one band or type of rock music, it captures the overall wild energy of the genre.

In total, Laundry created more than 300 elements for the MTV International Flanker Channels. The team also developed insanely vibrant layouts that reinforce MTV’s “Kill Boring” mission statement by combining the invader graphics with off-the-wall logo treatments and color palettes. Once the entire rebrand was brought to life, Laundry created a style guide with templates, so MTV teams across the world could use the assets consistently, yet with enough flexibility to avoid repetition.

“The MTV World Creative group really understood viewers’ shortening attention span, but increased appreciation of creativity, which was a vision we shared,” concludes Richardson. “Challenging in all the right ways, what made the collaboration so spectacular was the process of evolving the look and feel of the rebrand to nail both of those things and make a final package we’re all super stoked about.”


Foundry’s Nuke and Hiero 11.0 now available

Foundry has made available Nuke and Hiero 11.0, the next major release for the Nuke line of products, including Nuke, NukeX, Nuke Studio, Hiero and HieroPlayer. The Nuke family has been updated to VFX Platform 2017, which includes several major updates to key libraries used within Nuke, including Python, PySide and Qt.

The update also introduces a new type of group node, the Live Group, which offers a powerful new collaborative workflow for sharing work among artists. Live Groups referenced in other scripts automatically update when a script is loaded, without the need to render intermediate stages.

Nuke Studio’s intelligent background rendering is now available in Nuke and NukeX. The Frame Server takes advantage of available resources on your local machine, enabling you to continue working while rendering happens in the background. The LensDistortion node has been completely revamped, with added support for fisheye and wide-angle lenses and the ability to use multiple frames to produce better results. Nuke Studio also has new GPU-accelerated disk caching that allows users to cache part or all of a sequence to disk for smoother playback of more complex sequences.

 

 


Quick Chat: SIGGRAPH’s production sessions chair Emily Hsu

With SIGGRAPH 2017 happening in LA next week, we decided to reach out to Emily Hsu, this year’s production sessions chair, to find out more about the sessions and the process of picking what to focus on. You can check out this year’s sessions here. By the way, Hsu’s day job is production coordinator at Portland, Oregon’s Laika Studios, so she comes at this from an attendee’s perspective.

How did you decide what panels to offer?
When deciding the production sessions line-up, my team and I consider many factors. One of the first is a presentation’s appeal to a wide range of SIGGRAPH attendees, which means that it strikes a nice harmony between the technical and the artistic. In addition, we consider the line-up as a whole. While we retain strong VFX and animated feature favorites, we also want to round out the show with new additions in VR, gaming, television and more.

Ultimately, we are looking for work that stands out — will it inspire and excite attendees? Does it use technology that is groundbreaking or apply existing technologies in a groundbreaking way? Has it received worthy praise and accolades? Does it take risks? Does it tell a story in a unique way? Is it something that we’ve never seen within the production sessions program before? And, of course, does it epitomize the conference theme, “At the Heart of Computer Graphics & Interactive Techniques”?

These must be presentations that truly get to the heart of a project — not just the obvious successes, but also the obstacles, struggles and hard work that made it possible for it all to come together.

How do you make sure there is a balance between creative workflow and technology?
With the understanding that Production Sessions’ subject matter is targeted toward a broad SIGGRAPH audience, the studios and panelists are really able to determine that balance.

Production Session proposals are often accompanied by varied line-ups of speakers from either different areas of the companies or different companies altogether. What’s especially incredible is when studio executives or directors are present on a panel and can speak to over-arching visions and goals and how everything interacts in the bigger picture.

These presentations often showcase the cross-pollination and collaboration that is needed across different teams. The projects are major undertakings by mid-to-large size crews that have to work together in problem solving, developing new systems and tools, and innovating new ways to get to the finish line — so the workflow, technology and art all go hand-in-hand. It’s almost impossible to talk about one without talking about the other.

Can you talk more about the new Production Gallery?
The Production Gallery has been a very special project for the Production Sessions team this year. Over the years since Production Sessions began, we’ve had special appearances by Marvel costumes, props, Laika puppets, and an eight-foot tall Noisy Boy robot from Real Steel, but they have only been available for viewing in the presentation time slots.

In creating a new space that runs Sunday through Wednesday of the conference, we’re hoping to give attendees a true up-close-and-personal experience and also honor more studio work that may often go unnoticed or unseen.

When you go behind-the-scenes of a film set or on a studio tour, there are tens of thousands of elements involved – storyboards, concept artwork, maquettes, costumes, props, and more. This space focuses on those physical elements that are lovingly created for each project, beyond the final rendered piece you see in the movie theater. In peeling back the curtain, we’re trying to bring a bit of the studios straight to the attendees.

The Production Gallery is one of the accomplishments from this year that I’m most proud of, and I hope it grows in future SIGGRAPH conferences.

If someone has never been to SIGGRAPH before, what can you tell them to convince them it’s not a show to miss?
SIGGRAPH is a conference to be experienced, not to hear about later. It opens up worlds, inspires creativity, creates connections and surrounds you in genius. I always come out of it reinvigorated and excited for what’s to come.

At SIGGRAPH, you get a glimpse into the future right now — what non-attendees may only be able to see or experience in many years or even decades. If it’s a show you don’t attend, you’re not just missing — you’re missing out.

If they have been in the past, how is this year different and why should they come?
My first SIGGRAPH was 2011 in Vancouver, and I haven’t skipped a single conference since then. Technology changes and evolves in the blink of an eye and I’ve blinked a lot since last year. There’s always something new to be learned or something exciting to see.

The SIGGRAPH 2017 Committee has put an exceptional amount of effort into the attendee experience this year. There are hands-on must-see-it-to-believe-it kinds of experiences in VR Village, the Studio, E-Tech and the all-new VR Theater, as well as improvements to the overall SIGGRAPH experience to make the conference smoother, more fun, collaborative and interactive.

I won’t reveal any surprises here, but I can say that there will be quite a few that you’ll have to see for yourself! And on top of all that, a giraffe named Tiny at SIGGRAPH? That’s got to be one for the SIGGRAPH history books, so come join us in making history.


Lucasfilm and ILM release open source MaterialX library

Lucasfilm and ILM have launched the first open source release of the MaterialX library for computer graphics. MaterialX is an open standard developed by Lucasfilm’s Advanced Development Group and ILM engineers to facilitate the transfer of rich materials and look-development content between applications and renderers.

Originated at Lucasfilm in 2012, MaterialX has been used by ILM on features including Star Wars: The Force Awakens and Rogue One: A Star Wars Story, as well as realtime immersive experiences such as Trials On Tatooine.

Workflows at computer graphics production studios require multiple software tools for different parts of the production pipeline, and shared and outsourced work requires companies to hand off fully look-developed models to other divisions or studios, which may use different software packages and rendering systems.

MaterialX addresses the current lack of a common, open standard for representing the data values and relationships required to transfer the complete look of a computer graphics model from one application or rendering platform to another, including shading networks, patterns and texturing, complex nested materials and geometric assignments. MaterialX provides a schema for describing material networks, shader parameters, texture and material assignments and color-space associations in a precise, application-independent and customizable way.
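To make the schema concrete, here is a minimal sketch that uses the MaterialX Python bindings to author a small shading network and serialize it to a .mtlx file. The graph, node and file names are invented for the example, and exact calls may vary between MaterialX releases.

```python
# Minimal sketch: authoring and serializing a MaterialX document with the
# library's Python bindings. Names and file paths are illustrative only.
import MaterialX as mx

doc = mx.createDocument()

# A simple node graph: an image texture feeding a color3 output.
graph = doc.addNodeGraph("NG_paint")
tex = graph.addNode("image", "base_color_tex", "color3")
out = graph.addOutput("base_color_out", "color3")
out.setConnectedNode(tex)

# Write the look to an application-independent .mtlx file.
mx.writeToXmlFile(doc, "paint_look.mtlx")
```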

MaterialX is an open source project released under a modified Apache license.

Quick Chat: Filmmaker/DP/VFX artist Mihran Stepanyan

Veteran Armenian artist Mihran Stepanyan has an interesting background. In addition to being a filmmaker and cinematographer, he is also a colorist and visual effects artist. In fact, he won the 2017 Flame Award, which was presented to him during NAB in April.

Let’s find out how his path led to this interesting mix of expertise.

Tell us about your background in VFX.
I studied feature film directing in Armenia from 1997 through 2002. During that time, I also became very interested in being a director of photography. As a self-taught DP, I shot all my own work, as well as films produced by my classmates and colleagues. This was a great experience. Nearly 10 years ago, I started to study VFX because I had some projects that I wanted to do myself, and I fell in love with that world. Some years ago, I started to work in Moscow as a DP and VFX artist on a special project for Comedy Club Production. Today, I work not only as a VFX artist but also as a director and cinematographer.

How do your experiences as a VFX artist inform your decisions as a director and cinematographer?
They are closely connected. As a director, you imagine something that you want to see in the end, and you can realize that because you know what you can achieve in production and post. And, as a cinematographer, you know that if problems arise during the shoot, you can correct them in VFX and post. Experience in cinematography also complements VFX artistry, because your understanding of the physics of light and optics helps you create more realistic visuals.

What do you love most about your job?
The infinity of mind, fantasy and feelings. Also, I love how creative teams work. When a project starts, it’s fun to see how the different team members interact with one another and approach various challenges, ultimately coming together to complete the job. The result of that collective teamwork is interesting as well.

Tell us about some recent projects you’ve worked on.
I’ve worked on Half Moon Bay, If Only Everyone, Carpenter Expecting a Son and Doktor. I also recently worked on a tutorial for FXPHD that’s different from anything I’ve ever done before. It was not only work as an Autodesk Flame artist and lecturer, but it also gave me a chance to practice English, as my first language is Armenian.

Mihran’s Flame tutorial on FXPHD.

Where do you get your inspiration?
First, nature. There is nothing more perfect to me. And I’m a pictorialist, so for various projects I can find inspiration in any kind of art, from cave paintings to pictorial art and music. I’m also inspired by other artists’ work, which helps me stay current with the latest VFX developments.

If you had to choose the project that you’re most proud of in your career, what would it be, and why?
I think every artist’s favorite project is his/her last project, or the one he/she is working on right now. The emotions, feelings and ideas are very fresh and close at that moment. But there are always some projects that stand out more than others. For me, it’s the film Half Moon Bay. I was the DP, post production supervisor and senior VFX artist for the project.

What is your typical end-to-end workflow for a project?
It differs on each project. On some projects, I do everything from story writing to directing and digital intermediate (DI) finishing. For others, I only do editing or color grading.

How did you come to learn Flame?
During my work in Moscow, nearly five years ago, I had the chance to get a closer look at Flame and work on it. I’m a self-taught Flame artist, and since I started using the product it’s become my favorite. Now, I’m back in Armenia working on some feature films and upcoming commercials. I am also a member of Flame and Autodesk Maya Beta testing groups.

How did you teach yourself Flame? What resources did you use?
When I started to learn Flame, there weren’t as many resources and tutorials as we have now. It was really difficult to find training documentation online. In some cases, I got information from YouTube, NAB or IBC presentations. I learned mostly by experimentation, and a lot of trial and error. I continue to learn and experiment with Flame every time I work.

Any tips for using the product?
As for tips, “knowing” the software is not about understanding the tools or shortcuts, but what you can do with your imagination. You should always experiment to find the shortest and easiest way to get the end result. Also, imagine ahead of time how you can construct your schematic without using unnecessary nodes and tools. Exploring Flame is like mixing colors on a painter’s palette to get the perfect tone. In the same way, you must imagine what tools you can “mix” together to get the result you want.

Any advice for other artists?
I would advise that you not be afraid of any task or goals, nor fear change. That will make you a more flexible artist who can adapt to every project you work on.

What’s next for you?
I don’t really know what’s next, but I am sure that it is a new beginning for me, and I am very interested to see where this all takes me tomorrow.