
Game of Thrones: VFX associate producer Adam Chazen

With excitement starting to build for the seventh season of HBO’s Game of Thrones, what better time to take a quick look back at last season’s VFX workflow? HBO associate VFX producer Adam Chazen was kind enough to spend some time answering questions after wrapping production on Season 7.

Tell us about your background as a VFX associate producer and what led you to Game of Thrones.
I got my first job as a PA at VFX studio Pixomondo. I was there for a few years, working under my current boss Steve Kullback (visual effects producer on Game of Thrones). He took me with him when he moved to work on Yogi Bear, and then on Game of Thrones.

I’ve been with the show since 2011, so this is my sixth year on board. It’s become a real family at this point; lots of people have been on since the pilot.

From shooting to post, what is your role working on Game of Thrones?
As the VFX associate producer, in pre-production mode I assist with organizing our previs and concept work. I help run and manage our VFX database and I schedule reviews with producers, directors and heads of departments.

During production I make sure everyone has what they need on set in order to shoot for the various VFX requirements. Also during production, we start to post the show — I’m in charge of running review sessions with our VFX supervisor Joe Bauer. I make sure that all of his notes get across to the vendors and that the vendors have everything they need to put the shots together.

Season 7 has actually been the longest we’ve stayed on set before going back to LA for post. When in Belfast, it’s all about managing the pre-production and production process, making sure everything gets done correctly to make the later VFX adjustments as streamlined as possible. We’ll have vendors all over the world working on that next step — from Australia to Spain, Vancouver, Montreal, LA, Dublin and beyond. We like to say that the sun never sets on Game of Thrones.

What’s the process for bringing new vendors onto the show?
They could be vendors that we’ve worked with in the past. Other times, we employ vendors that come recommended by other people. We check out industry reels and have studios do testing for us. For example, when we have dragon work we ask around for vendors willing to run dragon animation tests for us. A lot of it is word of mouth. In VFX, you work with the people that you know will do great work.

What’s your biggest challenge in creating Game of Thrones?
We’re doing such complex work that we need to use multiple vendors. This can be a big hurdle. In general, whether it be film or TV, when you have multiple vendors working on the same shot, it becomes a potential issue.

Linking in with cineSync helps. We can have a vendor in Australia and a vendor in Los Angeles both working on the same shot, at exactly the same time. I first started using cineSync while at Pixomondo and found it makes the revision process a lot quicker. We send notes out to vendors, but most of the time it’s easier to get on cineSync, see the same image and draw on it.

Even the simple move of hovering a cursor over the frame can answer a million questions. We have several vendors who don’t use English as their first language, such as those in Spain. In these cases, communication is a lot easier via cineSync. By pointing to a single portion of a single frame, we completely bypass the language barrier. It definitely helps to see an image on screen versus just explaining it.

What is your favorite part of the cineSync toolkit?
We’ve seen a lot of cool updates to cineSync. Specifically, I like the notes section, where you can export a PDF that includes the frame each note is attached to.

Honestly, just seeing a cursor move on-screen from someone else’s computer is huge. It makes things so much easier to just point and click. If we’re talking to someone on the phone, trying to tell them about an issue in the upper left-hand corner, it’s going to be hard to get our meaning across. cineSync takes away all of the guesswork.

Besides post, we also heavily use cineSync for shoot needs. We shoot the show in Northern Ireland, Iceland, Croatia, Spain and Calgary. With cineSync, we are able to review storyboards, previs, techvis and concepts with the producers, directors, HODs and others, wherever they are in the world. It’s crucial that everyone is on the same page. Being able to look at the same material together helps everyone get what they want from a day on set.

Is there a specific shot, effect or episode you’re particularly proud of?
The Battle of the Bastards — it was a huge episode. Particularly, the first half of the episode when Daenerys came in with her dragons at the battle of Meereen, showing those slavers who is boss. Meereen City itself was a large CG creation, which was unusual for Game of Thrones. We usually try to stay away from fully CG environments and like to get as much in-camera as possible.

For example, when the dragon breathes fire, we shot an actual flamethrower. Back in Season 5, we started pre-animating the dragon, translating that animation to a motion control rig and attaching a flamethrower to it. The rig moves exactly how the dragon would move, giving us a practical element to use in the shot. CG fire can be done, but it’s really tricky. Real is real, so you can’t question it.

With multiple vendors working on the sequence, we had Rodeo FX do the environment while Rhythm & Hues did the dragons. We used cineSync a lot, reviewing shots between both vendors in order to point out areas of concern. Then in the second half of the episode, which was the actual Battle of the Bastards, the work was brilliantly done by Australian VFX studio Iloura.

Exceptional Minds: Autistic students learn VFX, work on major feature films

After graduation, these artists have been working on projects for Marvel, Disney, Fox and HBO.

By Randi Altman

With an estimated 1 in 68 children in the US identified with an autism spectrum disorder, according to the Centers for Disease Control’s Autism and Developmental Disabilities Monitoring Network, I think it’s fair to say that most people have been touched in some way by a child on the spectrum.

As a parent of a teenager with autism, I can attest to the fact that one of our biggest worries, the thing that keeps us up at night, is the question of independence. Will he be able to make a living? Will there be an employer who can see beyond his deficits to his gifts and exploit those gifts in the best possible way?

Enter Exceptional Minds, a school in Los Angeles that teaches young adults with autism how to create visual effects and animation while working as part of a team. This program recognizes how bright these young people are and how focused they can be, surrounds them with the right teachers and behavioral therapists, puts the right tools in their hands and lets them fly.

The school, which also has a VFX and animation studio that employs its graduates, was started in 2011 by a group of parents who have children on the spectrum. “They were looking for work opportunities for their kids, and quickly discovered they couldn’t find any. So they decided to start Exceptional Minds and prepare them for careers in animation and visual effects,” explains Susan Zwerman, the studio executive producer at Exceptional Minds and a long-time VFX producer whose credits include Broken Arrow, Alien Resurrection, Men of Honor, Around the World in 80 Days and The Guardian.

Since the program began, these young people have had the opportunity to work on some very high-profile films and TV programs. Recent credits include Game of Thrones, The Fate of the Furious and Doctor Strange, which was nominated for an Oscar for visual effects this year.

We reached out to Zwerman to find out more about this school, its studio and how they help young people with autism find a path to independence.

The school came first and then the studio?
Yes. We started training them for visual effects and animation and then the conversation turned to, “What do they do when they graduate?” That led to the idea to start a visual effects studio. I came on board two years ago to organize and set it up. It’s located downstairs from the school.

How do you pick who is suitable for the program?
We can only take 10 students each year, and unfortunately there is a waiting list because we are the only program of its kind anywhere. Our educators and teachers have a review process for assessing a student’s ability to work in this area. You know, not everybody can function working on a computer for six or eight hours. There are different levels of the spectrum, so students who are higher or medium functioning are more suited for this work, which takes a lot of focus.

Students are vetted by our teachers and behavioral specialists, who take into account the student’s ability, as well as their enthusiasm for visual effects and animation — it’s very intense, and they have to be motivated.

Susie Zwerman (in back row, red hair) with artists in the Exceptional Minds studio.

I know that kids on the spectrum aren’t necessarily social butterflies. How do you teach them to work as a team?
Oh, that’s a really good question. We have what’s called our Work Readiness program. They practice interviewing, they practice working as a team, they learn about appearance, attitude, organization and how to problem solve in a work place.

A lot of it is about working in a team and developing their social skills. That’s something we really stress in the behavioral curriculum.

Can you describe how the school works?
It’s a three-year program. In the first year, they learn about the principles of design and using programs like Adobe’s Flash and Photoshop. In Flash, they study 2D animation and in Photoshop they learn how to do backgrounds for their animation work.

During year two, they learn how to work in a production pipeline. They are given a project that the class works on together, and then they learn how to edit using Adobe Premiere Pro and compositing on Adobe After Effects.

In the third year, they are developing their skills in 3D via Autodesk Maya and compositing with The Foundry’s Nuke. So they learn the way we work in the studio and our pipeline, as well as preparing their portfolios for the workplace. At the end of three years, each student completes their training with a demo reel and resume of their work.

Who helps with the reels and resumes?
Their teachers supervise that process and help them with editing and picking the best pieces for their reel. Having a reel is important for many reasons. While many students will work in our studio for a year after graduation, I was able to place some directly into the work environment because their talent was so good… and their reel was so good.

What is the transition like from school to studio?
They graduate in June and we transition many of them to the studio, where they learn about deadlines and get paid for their work. Here, many experience independence for the first time. We do a lot of 2D-type visual effects clean-up work. We give them shots to work on and test them for the first month to see how they are doing. That’s when we decide if they need more training.

The visual effects side of the studio deals with paint work, wire and rod removal and tracker or marker removals — simple composites — plus a lot of rotoscoping and some greenscreen keying. We also do end title credits for the major movies.

We just opened the animation side of the studio in 2016, so it’s still in the beginning stages, but we’re doing 2D animation. We are not a 3D studio… yet! The 2D work we’ve done includes music videos, websites, PowerPoint presentations and some stuff for the LA Zoo. We are gearing up for major projects.

How many work in the studio?
Right now, we have about 15 artists at workstations in our current studio. Some of them will eventually be placed at outside studios; part of our strategic planning is figuring out how much we want to expand over the next five years.

Thanks to your VFX background, you have many existing relationships with the major studios. Can you talk about how that has benefitted Exceptional Minds?
We have had so much support from the studios; they really want to help us get work for the artists. We started out with Fox, then Disney and then HBO for television. Marvel Studios is one of our biggest fans. Marvel’s Victoria Alonso is a big supporter, so much so that we gave her our Ed Asner Award last June.

Once we started doing tracker marker removals and end title credits for Marvel, it opened doors. People say, “Well, if you work for Marvel, you could work for us.” She has been instrumental in our success.

What were the Fox and Marvel projects?
Our very first client was Fox and we did tracker removals for Dawn of the Planet of the Apes — that was about three years ago. Marvel happened about two years ago and our first job for them was on Avengers: Age of Ultron.

What are some of the other projects Exceptional Minds has worked on?
We worked on Doctor Strange, providing tracker marker removals and end credits. We worked on Ant-Man, Captain America: Civil War, Pete’s Dragon, Alvin & the Chipmunks: The Road Chip and X-Men: Apocalypse.

Thanks to HBO’s Holly Schiffer we did a lot of Game of Thrones work. She has also been a huge supporter of ours.

It’s remarkable how far you guys have come in a short amount of time. Can you talk about how you ended up at Exceptional Minds?
I used to be a DGA production manager/location manager and then segued into visual effects as a freelance VFX producer for all the major studios. About three years ago, my best friend Yudi Bennett, one of the founders of Exceptional Minds, convinced me to leave my career and come here to help set up the studio. I was also tasked with producing, scheduling and budgeting the work coming into the studio. For me, personally, this has been a spiritual journey. I have had such a good career in the industry, and this is my way of giving back.

So some of these kids move on to other places?
After they have worked in the studio for about a year, or sometimes longer, I look to have them placed at an outside studio. Some of them will stay here at our studio because they may not have the social skills to work on the outside.

Five graduates have been placed so far, and they are working full time at various production studios and visual effects facilities in Los Angeles. We have also had graduates intern at Cartoon Network and Nickelodeon.

One student is at Marvel, and others are at Stargate Studios, Mr. Wolf and New Edit. To be able to place our artists on the outside is our ultimate goal. We love to place them because it’s sort of life changing. For example, one of the first students we placed, Kevin, is at Stargate. He moved out of his parents’ apartment, he is traveling by himself to and from the studio, he is getting raises and he is moving up as a rotoscope artist.

What is the tuition like?
Students pay about 50 percent and we fundraise the other 50 percent. We also have scholarships for those who can’t afford it. We have to raise a lot of money to support the efforts of the school and studio.

Do companies donate gear?
When we first started, Adobe donated software. That’s how we were able to fund the school before the studio was up and running. Now we’re on an educational plan with them where we pay the minimum. Autodesk and The Foundry also give us discounts or try to donate licenses to us. In terms of hardware, we have been working with Melrose Mac, who is giving us discounts on computers for the school and studio.


Check out the Exceptional Minds website for more info.

The A-list — Kong: Skull Island director Jordan Vogt-Roberts

By Iain Blair

Plucky explorers! Exotic locations! A giant ape! It can only mean one thing: King Kong is back… again. This time, the new Warner Bros. and Legendary Pictures’ Kong: Skull Island re-imagines the origin of the mythic Kong in an original adventure from director Jordan Vogt-Roberts (The Kings of Summer).

Jordan Vogt-Roberts

With an all-star cast that includes Tom Hiddleston, Samuel L. Jackson, Oscar-winner Brie Larson, John Goodman and John C. Reilly, it follows a diverse team of explorers as they venture deep into an uncharted island in the Pacific — as beautiful as it is treacherous — unaware that they’re crossing into the domain of the mythic Kong.

The legendary Kong was brought to life on a whole new scale by Industrial Light & Magic, with two-time Oscar-winner Stephen Rosenbaum (Avatar, Forrest Gump) serving as visual effects supervisor.

To fully immerse audiences in the mysterious Skull Island, Vogt-Roberts, his cast and filmmaking team shot across three continents over six months, capturing its primordial landscapes on Oahu, Hawaii — where shooting commenced in October 2015 — on Australia’s Gold Coast and, finally, in Vietnam, where production took place across multiple locations, some of which had never before been seen on film. Kong: Skull Island was released worldwide in 2D, 3D and IMAX beginning March 10.

I spoke with Vogt-Roberts about making the film and his love of post.

What’s the eternal appeal of doing a King Kong movie?
He’s King Kong! But the appeal is also this burden, as you’re playing with film history and this cinematic icon of pop culture. Obviously, the 1933 film is this impeccable genre story, and I’m a huge fan of creature features and people like Ray Harryhausen. I liked the idea of taking my love for all that and then giving it my own point of view, my sense of style and my voice.

With just one feature film credit, you certainly jumped in the deep end with this — pun intended — monster production, full of complex moving parts and cutting-edge VFX. How scary was it?
Every movie is scary because I throw myself totally into it. I vanish from the world. If you asked my friends, they would tell you I completely disappear. Whether it’s big or small, any film’s daunting in that sense. When I began doing shorts and my own stuff, I did the shooting, the lighting, the editing and so on, and I thrived off all that new knowledge, so even all the complex VFX stuff wasn’t that scary to me. The truly daunting part is that a film like this is two and a half years of your life! It’s a big sacrifice, but I love a big challenge, and this was one.

What were the biggest challenges, and how did you prepare?
How do you make it special — and relevant in 2017? I’m a bit of a masochist when it comes to a challenge, and when I made the jump to The Kings of Summer it really helped train me. But there are certain things that are the same as they always are, such as there’s never enough time or money or daylight. Then there are new things on a movie of this size, such as the sheer endurance you need and things you simply can’t prepare yourself for, like the politics involved, all the logistics and so on. The biggest thing for me was, how do I protect my voice and point of view and make sure my soul is present in the movie when there are so many competing demands? I’m proud of it, because I feel I was able to do that.

How early on did you start integrating post and all the VFX?
Very early on — even before we had the script ready. We had concept artists and began doing previs and discussing all the VFX.

Did you do a lot of previs?
I’m not a huge fan of it. The Third Floor did it, and it’s a great tool for communicating what’s happening and how you’re going to execute it, but there’s also that danger of feeling like you’re already making the movie before you start shooting it. Think of all the great films like Blade Runner and the early Star Wars films, all shot before previs even existed, whereas now it’s very easy to become too reliant on it; you can see a movie sequence where it just feels like you’re watching previs come to life. It’s lost that sense of life and spontaneity. We only did three previs sequences — some only partially — and I really stressed with the crew that it was only a guide.

Where did you do the post?
It was all done at Pivotal in Burbank, and we began cutting as we shot. The sound mix was done at Skywalker and we did our score in London.

Do you like the post process?
I love post. I love all aspects of production, but post is where you write the film again and where it ceases being what was on the page and what you wanted it to be. Instead you have to embrace what it wants to be and what it needs to be. I love repurposing things and changing things around and having those 3am breakthroughs! If we move this and use that shot instead, then we can cut all that.

You had three editors — Richard Pearson, Bob Murawski and Josh Schaeffer. How did that work?
Rick and Bob ran point, and Rick was the lead. Josh was the editor who had done The Kings of Summer with me, and my shorts. He really understands my montages and comedy. It was so great that Rick and Bob were willing to bring him on, and they’re all very different editors with different skills — and all masters of their craft. They weren’t on set, except for Hawaii. Once we were really globe-trotting, they were in LA cutting.

VFX play a big role. Can you talk about working on them with VFX supervisor Jeff White and ILM, who did the majority of the effects work?
He ran the team there, and they’re all amazing. It was a dream come true for me. They’re so good at taking kernels of ideas and turning them into reality. I was able to do revisions as I got new ideas. Creating Kong was the big one, and it was very tricky because the way he moves isn’t totally realistic. It’s very stylized, and Jeff really tapped into my anime and videogame sensibility for all that. We also used Hybride and Rodeo for some shots.

What was the hardest VFX sequence to do?
The helicopter sequence was really very difficult, juggling the geography of that, with this 100-foot creature and people spread all over the island, and also the final battle sequence. The VFX team and I constantly asked ourselves, “Have we seen this before? Is it derivative? Is it redundant?” The goal was to always keep it fresh and exciting.

Where did you do the DI?
At Fotokem with colorist Dave Cole who worked on The Lord of the Rings and so many others. I love color, and we did a lot of very unusual stuff for a movie like this, with a lot of saturation.

Did the film turn out the way you hoped?
A movie never quite turns out the way you hope or think it will, but I love the end result and I feel it represents my voice. I’m very proud of what we did.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Recreating history for Netflix’s The Crown

By Randi Altman

If you, like me, binge-watched Netflix’s The Crown, you are now considerably better educated on the English monarchy, have a very different view of Queen Elizabeth, and were impressed with the show’s access to Buckingham Palace.

Well, it turns out they didn’t actually have access to the Palace. This is where London-based visual effects house One of Us came in. While the number of shots provided for the 10-part series varied, the average was 43 per episode.

In addition to Buckingham Palace, One of Us worked on photoreal digital set extensions, crowd replications and environments, including Downing Street and London Airport. The series follows a young Elizabeth who inherits the crown after her father, King George VI, dies. We see her transition from a vulnerable young married lady to a more mature woman who takes her role as head monarch very seriously.

We reached out to One of Us VFX supervisor Ben Turner to find out more.

How early did you join the production?
One of Us was heavily involved during an eight-month pre-production process, until shooting commenced in July 2015.

Ben Turner

Did they have clear vision of what they needed VFX vs. practical?
As we were involved from the pre-production stage, we were able to engage in discussions about how best to approach shooting the scenes with the VFX work in mind. It was important to us and the production that actors interacted with real set pieces and the VFX work would be “thrown away” in the background, not drawing attention to itself.

Were you on set?
I visited all relevant locations, assisted on set by Jon Pugh who gathered all VFX data required. I would attend all recces at these locations, and then supervise on the shoot days.

Did you do previs? If so, what software did you use?
We didn’t do much previs in the traditional sense. We did some tech-vis to help us figure out how best to film some things, such as the arrivals at the gates of Buckingham Palace and the Coronation sequence. We also did some concept images to help inform the shoot and design of some scenes. This work was all done in Autodesk Maya, The Foundry’s Nuke and Adobe Photoshop.

Were there any challenges in working in 4K? Did your workflow change at all, and how much of your work currently is in 4K?
Working in 4K didn’t really change our workflow too much. At One of Us, we are used to working on film projects that come in all different shapes and sizes (we recently completed work on Terrence Malick’s Voyage of Time in IMAX 5K), but for The Crown we invested in the infrastructure that enabled us to take it in our stride — larger and faster disks to hold the huge amounts of data, as well as a new 4K monitor to review all the work.


What were some of your favorite, or most challenging, VFX for the show?
The most challenging work was the kind of shots that many people are already very familiar with. So the Queen’s Coronation, for example, was watched by 20 million people in 1953, and with Buckingham Palace and Downing Street being two of the most famous and recognizable addresses in the world, there wasn’t really anywhere for us to hide!

Some of my favorite shots are the ones where we were recreating real events for which there are amazing archive references, such as the tilt down on the scaffolding at Westminster Abbey on the eve of the Coronation, or the unveiling of the statue of King George VI.


Can you talk about the tools you used, and did you create any proprietary tools during the workflow?
We used Enwaii and Maya for photogrammetry, Photoshop for digital matte painting and Nuke for compositing. For crowd replication we created our own in-house 2.5D tool in Nuke, which was a card generator that gave the artist a choice of crowd elements, letting them choose the costume, angle, resolution and actions required.
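The tool itself is proprietary, but the selection logic behind a crowd-card generator like this is easy to sketch. Below is a hypothetical, simplified Python illustration — the library entries, tags and function names are all invented, not One of Us’s actual tool. In production, each returned element would be read in and mapped onto a card positioned in Nuke’s 3D scene.

```python
import random

# Hypothetical crowd-element library: each entry is a pre-shot plate of one
# person, tagged with the attributes the artist can filter on.
CROWD_LIBRARY = [
    {"file": "crowd/guard_front_4k_cheer.exr", "costume": "guard",
     "angle": "front", "resolution": "4k", "action": "cheer"},
    {"file": "crowd/guard_side_2k_stand.exr", "costume": "guard",
     "angle": "side", "resolution": "2k", "action": "stand"},
    {"file": "crowd/civilian_front_2k_wave.exr", "costume": "civilian",
     "angle": "front", "resolution": "2k", "action": "wave"},
    {"file": "crowd/civilian_front_4k_cheer.exr", "costume": "civilian",
     "angle": "front", "resolution": "4k", "action": "cheer"},
]

def pick_crowd_cards(costume=None, angle=None, resolution=None,
                     action=None, count=1, seed=0):
    """Return `count` element files matching the requested tags, sampled
    with replacement so a large crowd can reuse a small library."""
    wanted = {"costume": costume, "angle": angle,
              "resolution": resolution, "action": action}
    matches = [e for e in CROWD_LIBRARY
               if all(v is None or e[k] == v for k, v in wanted.items())]
    if not matches:
        raise ValueError("no crowd elements match the requested tags")
    rng = random.Random(seed)  # seeded, so re-renders are repeatable
    return [rng.choice(matches)["file"] for _ in range(count)]
```

For example, `pick_crowd_cards(costume="civilian", angle="front", count=50)` would return 50 front-facing civilian plates to scatter as 2.5D cards behind the action.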

What are you working on now?
We are currently hard at work on Season 2 of The Crown, which is going to be even bigger and more ambitious, so watch this space! Recent work also includes King Arthur: Legend Of The Sword (Warner Bros.) and Assassin’s Creed (New Regency).

Swedish post/VFX company Chimney opens in LA

Swedish post company Chimney has opened a Los Angeles facility, its first in the US and one of its 12 offices in eight countries. Founded in Stockholm in 1995, Chimney produces over 6,000 pieces for more than 60 countries each year, averaging 1,000 projects and 10,000 VFX shots. The company, which is privately held by 50 of its artists, is able to offer 24-hour service thanks to its many locations around the world.

When asked why Chimney decided to open an office in LA, founder Henric Larsson said, “It was not the palm trees and beaches that made us open up in LA. We’re film nerds and we want to work with the best talent in the world, and where do we find the top directors, DPs, ADs, CDs and producers if not in the US?”

The Chimney LA crew.

The Chimney LA team was busy from the start, working with Team One to produce two Lexus campaigns, including one that debuted during the Super Bowl. For the Lexus Man & Machine Super Bowl Spot, they took advantage of the talent at sister facilities in Poland and Sweden.

Chimney also reports that it has signed with Shortlist Mgmt, joining other companies like RSA, Caviar, Tool and No6 Editorial. Charlie McBrearty, founding partner of Shortlist Mgmt, says that Chimney has “been on our radar for quite some time, and we are very excited to be part of their US expansion. Shortlist is no stranger to managing director Jesper Palsson, and we are thrilled to be reunited with him after our past collaboration through Stopp USA.”

Tools used for VFX include Autodesk’s Flame and Maya, The Foundry’s Nuke and Adobe After Effects. Audio is via Avid Pro Tools. Color is done in Digital Vision’s Nucoda. For editing, they call on Avid Media Composer, Apple Final Cut Pro and Adobe Premiere.

Quick Chat: Brent Bonacorso on his Narrow World

Filmmaker Brent Bonacorso has written, directed and created visual effects for The Narrow World, which examines the sudden appearance of a giant alien creature in Los Angeles and the conflicting theories on why it’s there, what its motivations are, and why it seems to ignore all attempts at human interaction. It’s told through the eyes of three people with differing ideas of its true significance. Bonacorso shot on a Red camera with Panavision Primo lenses, along with a bit of Blackmagic Pocket Cinema Camera footage for random B-roll.

Let’s find out more…

Where did the idea for The Narrow World come from?
I was intrigued by the idea of subverting the traditional alien invasion story and using that as a way to explore how we interpret the world around us, and how our subconscious mind invisibly directs our behavior. The creature in this film becomes a blank canvas onto which the human characters project their innate desires and beliefs — its mysterious nature revealing more about the characters than the actual creature itself.

As with most ideas, it came to me in a flash, a single image that defined the concept. I was riding my bike along the beach in Venice, and suddenly saw in my head a giant kaiju as big as a skyscraper sitting on the sand, gazing out at the sea. Not directly threatening, not exactly friendly either, with a mutual understanding with all the tiny humans around it — we don’t really understand each other at all, and probably never will. Suddenly, I knew why he was here and what it all meant. I quickly sketched the image, and the story followed.

What was the process like bringing the film to life as an independent project?
After I wrote the script, I shot principal photography with producer Thom Fennessey in two stages – first with the actor who plays Raymond Davis (Karim Saleh) and then with the actress playing Emily Field (Julia Cavanaugh).

I called in a lot of favors from my friends and connections here in LA and abroad — the highlight was getting some amazing Primo lenses and equipment from Panavision to use because they love the work of our cinematographer, Magdalena Górka. Altogether it was about four days of principal photography, a good bit of it guerrilla style, and then shooting lots of B-roll all over the city.

Kacper Sawicki, head of Papaya Films, which represents me for commercial work in Europe, got on board during post production to help bring The Narrow World to completion. Friends of mine in Paris and Luxembourg designed and textured the creature, and I did the lighting and animation in Maxon Cinema 4D and compositing in Adobe After Effects.

Our editor was the genius Jack Pyland (who cut on Adobe Premiere), based in Dallas. Sound design and color grading (via Digital Vision’s Nucoda) were completed by Polish companies Głośno and Lunapark, respectively. Our composer was Cedie Janson from Australia. So even though this was an indie project, it became an amazing global collaborative effort.

Of course, with any no-budget project like this, patience is key — lack of funds is offset by lots of time, which is free, if sometimes frustrating. Stick with it — directing is generally a war of attrition, and it’s won by the tenacious.

As a director, how did you pull off so much of the VFX work yourself, and what lessons do you have for other directors?
I realized early on in my career as a director that the more you understand about post, and the more you can do yourself, the more you can control the scope of the project from start to finish. If you truly understand the technology and what is possible with what kind of budget and what kind of manpower, it removes a lot of barriers.

I taught myself After Effects and Cinema 4D in graphic design school, and later I figured out how to make those tools work for me in visual effects and to stretch the boundaries of the short films I was making. It has proved invaluable in my career — in the early stages I did most of the visual effects in my work myself. Later on, when I began having VFX companies do the work, my knowledge and understanding of the process enabled me to communicate very efficiently with the artists on my projects.

What other projects do you have on the horizon?
In addition to my usual commercial work, I’m very excited about my first feature project coming up this year through Awesomeness Films and DreamWorks — You Get Me, starring Bella Thorne and Halston Sage.

VFX house Jamm adds Flame artist Mark Holden

Santa Monica-based visual effects boutique Jamm has added veteran Flame artist Mark Holden to its roster. Holden comes to Jamm with over 20 years of experience in post production, including stints in London and Los Angeles.

It didn’t take long for Holden to dive right in at Jamm; he worked on Space 150’s Buffalo Wild Wings Super Bowl campaign directed by the Snorri Bros. and starring Brett Favre. The Super Bowl teaser kicked off the pre-game.

Holden is known not only for his visual effects talent, but also for turning projects around under tight deadlines and offering his clients as many possible solutions within the post process. This has earned him work with leading agencies such as Fallon, Mother, Saatchi & Saatchi, Leo Burnett, 180, TBWA/Chiat/Day, Goodby Silverstein & Partners, Deutsch, David & Goliath, and Team One. He has worked with brands including Lexus, Activision, Adidas, Chevy, Geico, Grammys, Kia, Lyft, Pepsi, Southwest Airlines, StubHub, McDonald’s, Kellogg’s, Stella Artois, Silk, Heineken and Olay.


Ingenuity Studios helps VFX-heavy spot get NASCAR-ready

Hollywood-based VFX house Ingenuity Studios recently worked on a 60-second Super Bowl spot for agency Pereira & O’Dell promoting Fox Sports’ coverage of the Daytona 500, which takes place on February 26. The ad, directed by Joseph Kahn, features people from all over the country gearing up to watch the Daytona 500, including footage from NASCAR races, drivers and, for some reason, actor James Van Der Beek.

The Ingenuity team had only two weeks to turn around this VFX-heavy spot, called Daytona Day. Some CG elements include a giant robot, race cars and crowds. While they were working on the effects, Fox was shooting footage in Charlotte, North Carolina and Los Angeles.

“When we were initially approached about this project, we knew the turnaround would be a challenge,” explains creative director/VFX supervisor Grant Miller. “Editorial wasn’t fully locked until the Thursday before the big game! With such a tight deadline, preparing as much as we could in advance was key.”

Portions of the shoot took place at the Daytona Speedway, and since it was an off day the stadium and infield were empty. “In preparation, our CG team built the entire Daytona stadium while we were still shooting, complete with cheering CG crowds, RVs filling the interior, pit crews, etc.,” says Miller. “This meant that once shots were locked we simply needed to track the camera, adjust the lighting and render all the stadium passes for each shot.”

Additional shooting took place at the Charlotte Motor Speedway, Downtown Los Angeles and Pasadena, California.

In addition to prepping CG for set extensions, Ingenuity also got a head start on the giant robot that shows up halfway through the commercial. “Once the storyboards were approved and we were clear on the level of detail required, we took our ‘concept bot’ out of ZBrush, retopologized and unwrapped it, then proceeded to do surfacing and materials in Substance Painter,” says Miller. “While we had some additional detailing to do, we were able to get the textures 80 percent completed by applying a variety of procedural materials to the mesh, saving a ton of manual painting.”

Other effects work included over 40 CG NASCAR vehicles to fill the track, additional cars for the traffic jam and lots of greenscreen and roto work to get the scenes shot in Charlotte into Daytona. There was also a fair bit of invisible work that included cleaning up sets, removing rain, painting out logos, etc.

Other tools used include Autodesk’s Maya, The Foundry’s Nuke and BorisFX’s Mocha.

Last Chance to Enter to Win an Amazon Echo… Take our Storage Survey Now!

If you’re working in post production, animation, VFX and/or VR/AR/360, please take our short survey and tell us what works (and what doesn’t work) for your day-to-day needs.

What do you need from a storage solution? Your opinion is important to us, so please complete the survey by Wednesday, March 8th.

We want to hear your thoughts… so click here to get started now!


Review: Nvidia’s new Pascal-based Quadro cards

By Mike McCarthy

Nvidia has announced a number of new professional graphics cards, filling out its entire Quadro line-up with models based on its newest Pascal architecture. At the absolute top end is the new Quadro GP100, a PCIe card implementation of the company’s supercomputer chip. It has similar 32-bit (graphics) processing power to the existing Quadro P6000, but adds 16-bit half-precision (AI) and 64-bit double-precision (simulation) compute. It is intended to combine compute and visualization capabilities in a single solution. It has 16GB of new HBM2 (High Bandwidth Memory), and two cards can be paired together with NVLink at 80GB/sec to share a total of 32GB between them.

This powerhouse is followed by the existing P6000 and P5000, announced last July. The next addition to the line-up is the single-slot, VR-ready Quadro P4000. With 1,792 CUDA cores running at 1200MHz, it should outperform a previous-generation M5000 for less than half the price. It is similar to its predecessor, the M4000, in having 8GB RAM, four DisplayPort connectors, and running on a single six-pin power connector. The new P2000 follows with 1,024 cores at 1076MHz and 5GB of RAM, giving it similar performance to the K5000, which is nothing to scoff at. The P1000, P600 and P400 are all low-profile cards with Mini-DisplayPort connectors.

All of these cards run on PCIe Gen3 x16 and use DisplayPort 1.4, which adds support for HDR and DSC (Display Stream Compression). They all support 4Kp60 output, with the higher-end cards driving 5K and 4Kp120 displays. Nvidia also continues to push high-resolution display support forward, allowing up to 32 synchronized displays to be connected to a single system, provided you have enough slots for eight Quadro P4000 cards and two Quadro Sync II boards.

Nvidia also announced a number of Pascal-based mobile Quadro GPUs last month, with the mobile P4000 having roughly comparable specifications to the desktop version. But you can read the paper specs for the new cards elsewhere on the Internet. More importantly, I have had the opportunity to test out some of these new cards over the last few weeks, to get a feel for how they operate in the real world.

DisplayPorts

Testing
I was able to run tests and benchmarks with the P6000, P4000 and P2000 against my current M6000 for comparison. All of these tests were done on a top-end Dell 7910 workstation, with a variety of display outputs, primarily using Adobe Premiere Pro, since I am a video editor after all.

I ran a full battery of benchmark tests on each of the cards using Premiere Pro 2017. I measured both playback performance and encoding speed, monitoring CPU and GPU use, as well as power usage throughout the tests. I had HD, 4K, and 6K source assets to pull from, and tested monitoring with an HD projector, a 4K LCD and a 6K array of TVs. I had assets that were RAW R3D files, compressed MOVs and DPX sequences. I wanted to see how each of the cards would perform at various levels of production quality and measure the differences between them to help editors and visual artists determine which option would best meet the needs of their individual workflow.
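The measurement side of a harness like this can be sketched in a few lines of Python. This is only an illustration of the timing and comparison math, not the actual test rig; the job being timed stands in for launching a real export:

```python
import time

def time_once(job):
    """Wall-clock a single run of a job (e.g., one export), in seconds."""
    start = time.perf_counter()
    job()
    return time.perf_counter() - start

def best_of(job, runs=3):
    """Best-of-N timing smooths out disk caching and background noise."""
    return min(time_once(job) for _ in range(runs))

def percent_faster(baseline_secs, candidate_secs):
    """How much faster (in percent) one card finished the same job."""
    return (baseline_secs - candidate_secs) / baseline_secs * 100

# Hypothetical numbers: a baseline card exporting in 100s vs. another in 70s
print(percent_faster(100.0, 70.0))  # 30.0
```

Monitoring CPU/GPU utilization and power draw during each run happens outside this loop, with the vendor's own tools.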

I started with the intuitive expectation that the P2000 would be sufficient for most HD work, but that a P4000 would be required to effectively handle 4K. I also assumed that a top-end card would be required to play back 6K files and split the image between my three Barco Escape formatted displays. And I was totally wrong.

Except when using the higher-end options within Premiere’s Lumetri-based color corrector, all of the cards were fully capable of every editing task I threw at them. To be fair, the P6000 usually renders out files about 30 percent faster than the P2000, but that is a minimal difference compared to the difference in cost. Even the P2000 was able to play back my uncompressed 6K assets onto my array of Barco Escape displays without issue. It was only when I started making heavy color changes in Lumetri that I began to observe any performance differences at all.

Lumetri

Color correction is an inherently parallel, graphics-related computing task, so this is where GPU processing really shines. Premiere’s Lumetri color tools are based on SpeedGrade’s original CUDA processing engine, and they can really harness the power of the higher-end cards. The P2000 can make basic corrections to 6K footage, but it is possible to max out the P6000 with HD footage if I adjust enough different parameters. Fortunately, most people aren’t looking for footage more stylized than 300’s, so in this case my original assumptions seem to be accurate: the P2000 can handle reasonable corrections to HD footage, the P4000 is probably a good choice for VR and 4K footage, and the P6000 is the right tool for the job if you plan to do a lot of heavy color tweaking or are working with massive frame sizes.

The other way I expected to measure a difference between the cards was in playback while rendering in Adobe Media Encoder. By default, Media Encoder pauses exports during timeline playback, but this behavior can be disabled by reopening Premiere after queuing your encode. Even with careful planning to avoid reading from the same disks the encoder was accessing, I was unable to get significantly better playback performance from the P6000 than from the P2000. This says more about the software than about the cards.

P6000

The largest difference I was able to consistently measure across the board was power usage, with each card averaging about 30 watts more as I stepped up from the P2000 to the P4000 to the P6000. But they are all far more efficient than the previous M6000, which frequently sucked up an extra 100 watts in the same tests. While “watts” may not be a benchmark most editors worry much about, it does, among other things, equate to money for electricity. Lower wattage also means less cooling is needed, which results in quieter systems that can be kept closer to the editor without distracting from the creative process or interfering with audio editing. It also allows these new cards to be installed in smaller systems with smaller power supplies, using fewer power connectors. My HP Z420 workstation has only one 6-pin PCIe power plug, so the P4000 is the ideal GPU solution for that system.
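As a back-of-the-envelope check on the electricity point, the wattage gap converts to dollars straightforwardly. The 100-watt figure comes from my measurements; the daily hours, working days and utility rate below are purely assumed for illustration:

```python
def annual_power_cost(extra_watts, hours_per_day=8.0, days_per_year=250.0,
                      dollars_per_kwh=0.15):
    """Yearly cost of a card's extra power draw, under assumed usage and rates."""
    kwh = extra_watts / 1000.0 * hours_per_day * days_per_year
    return kwh * dollars_per_kwh

# The M6000's roughly 100 W of extra draw, at the assumed rate:
print(round(annual_power_cost(100), 2))
```

At these assumptions that works out to a few tens of dollars per year per workstation, before counting the extra cooling.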

Summing Up
It appears that we have once again reached a point where hardware processing capabilities have surpassed the software’s capacity to use them, at least within Premiere Pro. This led the cards to perform relatively similarly to one another in most of my tests, but true 3D applications might reveal much greater differences in their performance. Further optimization of the CUDA implementation in Premiere Pro might also lead to better use of these higher-end GPUs in the future.


Mike McCarthy is an online editor and workflow consultant with 10 years of experience on feature films and commercials. He has been on the forefront of pioneering new solutions for tapeless workflows, DSLR filmmaking and now multiscreen and surround video experiences. If you want to see more specific details about performance numbers and benchmark tests for these Nvidia cards, check out techwithmikefirst.com.