
Blur Studio uses new AMD Threadripper for Terminator: Dark Fate VFX

By Dayna McCallum

AMD has announced new additions to its high-end desktop processor family. Built for demanding desktop and content creation workloads, the 24-core AMD Ryzen Threadripper 3960X and the 32-core AMD Ryzen Threadripper 3970X processors will be available worldwide November 25.

Tim Miller on the set of Dark Fate.

AMD states that the powerful new processors provide up to 90 percent more performance and up to 2.5 times more available storage bandwidth than competitive offerings, per testing and specifications by AMD performance labs. The 3rd Gen AMD Ryzen Threadripper lineup features two new processors built on 7nm “Zen 2” core architecture, claiming up to 88 PCIe 4.0 lanes and 144MB cache with 66 percent better power efficiency.

Prior to the official product launch, AMD made the 3rd Gen Threadrippers available to LA’s Blur Studio for work on the recent Terminator: Dark Fate, continuing its collaboration with the film’s director (and Blur Studio founder) Tim Miller.

Before the movie’s release, AMD hosted a private Q&A with Miller, moderated by AMD’s James Knight. Please note that we’ve edited the lively conversation for space and taken a liberty with some of Miller’s more “colorful” language. (Also watch this space to see if a wager is won that will result in Miller sporting a new AMD tattoo.) Here is the Knight/Miller conversation…

So when we dropped off the 3rd Gen Threadripper to you guys, how did your IT guys react?
Like little children left in a candy shop with no adult supervision. The nice thing about our atmosphere here at Blur is we have an open layout. So when (bleep) like these new AMD processors drops in, you know it runs through the studio like wildfire, and I sit out there like everybody else does. You hear the guys talking about it, you hear people giggling and laughing hysterically at times on the second floor where all the compositors are. That’s where these machines really kick ass — busting through these comps that would have had to go to the farm, but they can now do it on a desktop.

James Knight

As an artist, the speed is crucial. You know, if you have a machine that takes 15 minutes to render, you want to stop and do something else while you wait for a render. It breaks your whole chain of thought. You get out of that fugue state that you produce the best art in. It breaks the chain between art and your brain. But if you have a machine that does it in 30 seconds, that’s not going to stop it.

But really, more speed means more iterations. It means you deal with heavier scenes, which means you can throw more detail at your models and your scenes. I don’t think we do the work faster, necessarily, but the work is much higher quality. And much more detailed. It’s like you create this vacuum, and then everybody rushes into it and you have this silly idea that it is really going to increase productivity, but what it really increases most is quality.

When your VFX supervisor showed you the difference between the way it was done with your existing ecosystem and then with the third-gen Threadripper, what were you thinking about?
There was the immediate thing — when we heard from the producers about the deadline, shots that weren’t going to get done for the trailer, suddenly were, which was great. More importantly, you heard from the artists. What you started to see was that it allows for all different ways of working, instead of just the elaborate pipeline that we’ve built up — to work on your local box and then submit it to the farm and wait for that render to hit the queue of farm machines that can handle it, then send that render back to you.

It has a rhythm that is at times tiresome for the artists, and I know that because I hear it all the time. Now I say, “How’s that comp coming and when are we going to get it, tick tock?” And they say, “Well, it’s rendering in the background right now, as I’m watching them work on another comp or another piece of that comp.” That’s pretty amazing. And they’re doing it all locally, which saves so much time and frustration compared to sending it down the pipeline and then waiting for it to come back up.

I know you guys are here to talk about technology, but the difference for the artists is that instead of working here until 1:00am, they’re going home to put their children to bed. That’s really what this means at the end of the day. Technology is so wonderful when it enables that, not just the creativity of what we do, but the humanity… allowing artists to feel like they’re really on the cutting edge, but also have a life of some sort outside.

Endoskeleton — Terminator: Dark Fate

As you noted, certain shots and sequences wouldn’t have made it in time for the trailer. How important was it for you to get that Terminator splitting in the trailer?
Marketing was pretty adamant that that shot had to be in there. There’s always this push and pull between marketing and VFX as you get closer. They want certain shots for the trailer, but they’re almost always those shots that are the hardest to do because they have the most spectacle in them. And that’s one of those shots. The sequence was one of the last to come together because we changed the plan quite a bit, and I kept changing shots on Dan (Akers, VFX supervisor). But you tell marketing people that they can’t have something, and they don’t really give a (bleep) about you and your schedule or the path of that artist and shot. (Laughing)

Anyway, we said no. They begged, they pleaded, and we said, “We’ll try.” Dan stepped up and said, “Yeah, I think I can make it.” And we just made it, and it was as close as that sounds; we were in real danger of not getting it done fast enough. All of this was happening in like a two-day window. If you didn’t notice (in the trailer), that’s a Rev-7. Gabriel Luna is a Rev-9, which is the next gen. But the Rev-7s that you see in his future flashback are just pure killers. They’re still the same technology, which is liquid metal on the outside and a carbon endoskeleton that splits. So you have to run the simulation where the skeleton separates through the liquid that hangs off it in strings; it’s a really hard simulation to do. That’s why we thought maybe it wasn’t going to get done, but running the simulation on the AMD boxes was lightning fast.


Julian Clarke on editing Terminator: Dark Fate

By Oliver Peters

Linda Hamilton’s Sarah Connor and Arnold Schwarzenegger’s T-800 are back to save humanity from a dystopian future in this latest installment of the Terminator franchise. James Cameron is also back and brings with him writing and producing credits, which is fitting: Terminator: Dark Fate is in essence Cameron’s sequel to Terminator 2: Judgment Day.

Julian Clarke

Tim Miller (Deadpool) is at the helm to direct the tale. It’s roughly two decades after the time of T2, and a new Rev-9 machine has been sent from an alternate future to kill Dani Ramos (Natalia Reyes), an unsuspecting auto plant worker in Mexico. But the new future’s resistance has sent back Grace (Mackenzie Davis), an enhanced super-soldier, to combat the Rev-9 and save her. They cross paths with Connor, and the story sets off on a mad dash to the finale at Hoover Dam.

Miller brought back much of his Deadpool team, including his VFX shop Blur, DP Ken Seng and editor Julian Clarke. This is also the second pairing of Miller and Clarke with Adobe. Both Deadpool and Terminator: Dark Fate were edited using Premiere Pro. In fact, Adobe was also happy to tie in with the film’s promotion through its own #CreateYourFate trailer remix challenge. Participants could re-edit their own trailer using supplied content from the film.

I recently spoke with Clarke about the challenges and fun of cutting this latest iteration of such an iconic film franchise.

Terminator: Dark Fate picks up two decades after Terminator 2, leaving out the timelines of the subsequent sequels. Was that always the plan, or did it evolve out of the process of making the film?
That had to do with the screenplay. You were written into a corner by the various sequels. We really wanted to bring Linda Hamilton’s character back. With Jim involved, we wanted to get back to first principles and have it based on Cameron’s mythology alone. To get back to the Linda/Arnold character arcs, and then add some new stuff to that.

Many fans were attracted to the franchise by Cameron’s two original Terminator films. Was there a conscious effort at integrating that nostalgia?
I come from a place of deep fandom for Terminator 2. As a teenager I had VHS copies of Aliens and Terminator 2 and watched them on repeat after school! Those films are deeply embedded in my psyche, and both of them have aged well — they still hold up. I watched the sequels, and they just didn’t feel like a Terminator film to me. So the goal was definitely to make it of the DNA of those first two movies. There’s going to be a chase. It’s going to be more grounded. It’s going to get back into the Sarah Connor character and have more heart.

This film has elements of humor, unlike most other action films. That must have posed a challenge to set the right tone without getting campy.
The humor thing is interesting. Terminator 2 has a lot of humor throughout. We have a little bit of humor in the first half and then more once Arnold shows up, but that’s really the way it had to be. The Dani Ramos character — who’s your entry point into the movie — is devastated when her whole family is killed. To have a lot of jokes happening would be terrible. It’s not the same in Terminator 2 because John Connor’s stepparents get very little screen time, and they don’t seem that nice. You feel bad for them, but it’s OK that you get into this funny stuff right off the bat. On this one we had to ease into the humor so you could [experience] the gravity of the situation at the start of the movie.

Did you have to do much to alter that balance during the edit?
There were one or two jokes that we nipped out, but it wasn’t like that whole first act was chock full of jokes. The tone of the first act is more like Terminator, which is more of a thriller or horror movie. Then it becomes more like T2 as the action gets bigger and the jokes come in. So the first half is like a bigger Terminator and the second half more like T2.

Deadpool, which Tim Miller also directed, used a very nonlinear story structure, balancing action, comedic moments and drama. Terminator was always designed with a linear, straightforward storyline. Right?
A movie hands you certain editing tools. Deadpool was designed to be nonlinear, with characters in different places, so there are a whole bunch of options for you. Terminator: Dark Fate is more like a road movie. Certain stops along the road are predetermined; you can’t be in Texas before Mexico. So the structural options you had were where to check in with the Rev-9, as well as the inter-scene structure. Once you are in the detention center, who are you cutting to? Sarah? Dani? However, where that is placed in the movie is pretty much set. All you can do is pace it up, pace it down, adjust how to get there. There aren’t a lot of movable pieces that can be swapped around.

When we had talked after Deadpool, you discussed how you liked the assistants to build string-outs (what some call a Kem roll), where similar action from every take is assembled back to back into a single sequence. Did you use that same organizational method on Terminator: Dark Fate?
Sometimes we were so swamped with material that there wasn’t time to create string-outs. I still like to have those. It’s a nice way to quickly see all the pieces that cover a moment. If you are trying to find the one take or action that’s 5% better than another, then it’s good to see them all in a row, rather than trying to keep it all in your head for a five-minute take. There was a lot of footage that we shot in the action scenes, but we didn’t do 11 or 12 takes for a dialogue scene. I didn’t feel like I needed some tool to quickly navigate through the dialogue takes. We would string out the ones that were more complicated.

Depending on the directing style, a series of takes may have increasingly calibrated performances with successive takes. With other directors, each take might be a lot different than the one before and after it. What is your approach to evaluating which is the best take to use?
It’s interesting when you use the earlier takes versus the later takes and what you get from them. The later takes are usually the ones that are most directed. The actors are warmed up and most closely nail what the director has in mind. So they are strong in that regard, but sometimes they can become more self-conscious. So sometimes the first take is more throwaway and may have less power, but it feels more real, more off the cuff. Sometimes a delivered dialogue line feels less written, and you’ll buy it more. Other times you’ll want that more dramatic quality of the later takes. My instinct is to first use the later takes, but as you start to revise a scene, you often go back to pieces of the earlier takes to ground it a little more.

How long did the production and post take?
It took a little over 100 days of shooting with a lot of units. I work on a lot of mid-budget films, so this seemed like a really long shoot. It was a little relentless for everyone — even squeezing it into those 100 days. Shooting action with a lot of VFX is slow due to the reset time needed between takes. The ending of the movie is 30 minutes of action in a row. That’s a big job shooting all of that stuff. When they have a couple of units cranking through the dialogue scenes plus shooting action sequences — that’s when I have to work hard to keep up. Once you hit the roadblocks of shooting just those little action pieces, you get a little time to catch up.

We had the usual director’s cut period and finished by the end of this September. The original plan was to finish by the beginning of September, but we needed the time for VFX. So everything piled up with the DI and the mix in order to still hit the release date. September got a little crazy. It seems like a long time — a total of 13 or 14 months — but it still was an absolute sprint to get the movie in shape and get the VFX into the film in time. This might be normal for some of these films, but compared to the other VFX movies I’ve done, it was definitely turning things up a notch!

I imagine that there was a fair amount of previz required to lay out the action for the large VFX and CG scenes. Did you have that to work with as placeholder shots? How did you handle adjusting the cut as the interim and final shots were delivered?
Tim is big into previz with his background in VFX and animation and owning his own VFX company. We had very detailed animatics going into production. Depending on a lot of factors, you still abandon a lot of things. For example, the freeway chases are quite a bit different because when you go there and do it with real cars, they do different things. Or only part of the cars look like they are going fast enough. Those scenes became quite different than the previz.

Others are almost 100% CG, so you can drop in the previz as placeholders. Although, even in those cases, sometimes the finished shot doesn’t feel real enough. In the “cartoon” world of previz, you can do wild camera moves and say, “Wow, that seems cool!” But when you start doing it at photoreal quality, then you go, “This seems really fake.” So we tried to get ahead of that stuff and find what to do with the camera to ground it. Kind of mess it up so it’s not too dynamic and perfect.

How involved were you with shaping the music? Did you use previous Terminator films’ scores as a temp track to cut with?
I was very involved with the music production. I definitely used a lot of temp music. Some of it was ripped from old Terminator movies, but there’s only so much Terminator 2 music you can put in. Those scores used a lot of synthesizers that date the sound. I did use “Desert Suite” from Terminator 2, when Sarah is in the hotel room. I loved having a very direct homage to a Sarah Connor moment while she’s talking about John. Then I begged our composer, Tom Holkenborg (aka Junkie XL), to consider doing a version of it for our movie. So it is essentially the same chord progression.

That was an interesting question, musically and in general: how much do you lean into the homage thing? It’s powerful when you do it, but if you do it too much, it starts to feel artificial or pandering. So I tried to hit the sweet spot, so you knew you were watching a Terminator movie, but not so much that it felt like Terminator karaoke. How many times can you go da-dum-dum-da-da-dum? You have to pick your moments for those Terminator motifs. It’s diminishing returns if you do it too much.

Another inspirational moment for me was another part in Terminator 2. There’s a disturbing industrial sound for the T-1000. It sounds more like a foghorn or something in a factory rather than music, and it created this unnerving quality to the T-1000 scenes, when he’s just scoping things out. So we came up with a modern-day electronic equivalent for the Rev-9 character, and that was very potent.

Was James Cameron involved much in the post production?
He’s quite busy with his Avatar movies. Some of the time he was in New Zealand, some of the time he was in Los Angeles. Depending on where he was and where we were in the process, we would hit milestones, like screenings or the first cut. We would send him versions and download a bunch of his thoughts.

Editing is very much a part of his wheelhouse. Unlike many other directors, he really thinks about this shot, then that shot, then the next shot. His mind really works that way. Sometimes he would give us pretty specific, dialed-in notes on things. Sometimes it would just be bigger suggestions, like, “Maybe the action cutting pattern could be more like this …” So we’d get his thoughts — and, of course, he’s Jim Cameron, and he knows the business and the Terminator franchise — so I listened pretty carefully to that input.

This is the second film that you’ve cut with Premiere Pro. Deadpool was first, and there were challenges using it on such a complex project. What was the experience like this time around?
Whenever you set out to use a new workflow, you run into things. Not that Premiere is new; it’s been around a long time and has millions of users. But it’s unusual to use it on large VFX movies, for specific reasons.

L-R: Matthew Carson and Julian Clarke

On Deadpool, that led to certain challenges, and that’s just what happens when you try to do something new. We had to split the movie into separate projects for each reel, instead of one large project. Even so, the size of our project files made it tough. They were so full of media that they would take five minutes to open. Nevertheless, we made it work, and there are lots of benefits to using Adobe over other applications.

In comparison, the interface to Avid Media Composer looks like it was designed 20 years ago, but they have multi-user collaboration nailed, and I love the trim tool. Yet, some things are old and creaky. Adobe’s not that at all. It’s nice and elegant in terms of the actual editing process. We got through it and sat down with Adobe to point out things that needed work, and they worked on them. When we started up Terminator, they had a whole new build for us. Project files now opened in 15 seconds. They are about halfway there in terms of multi-user editing. Now everyone can go into a big, shared project, and you can move bins back and forth. Although, only one user at a time has write access to the master project.

This is not simple software they are writing. Adobe is putting a lot of work into making it a more fitting tool for this type of movie. Even though this film was exponentially larger than Deadpool, from the Adobe side it was a smoother process. Props to them for doing that! The cool part about pioneering this stuff is the amount of work that Adobe is on board to do. They’ll have people work on stuff that is helpful to us, so we get to participate a little in how Adobe’s software gets made.

With two large Premiere Pro projects under your belt, what sort of new features would you like to see Adobe add to the application to make it even better for feature film editors?
They’ve built out the software from being a single-user application to being multi-user software, but the inherent software at the base level is still single-user. Sometimes your render files get unlinked when you go back and forth between multiple users. There’s probably stuff where they have to dig deep into the code to make those minor annoyances go away. Other items I’d like to see — let’s not use third-party software to send change lists to the mix stage.

I know Premiere Pro integrates beautifully with After Effects, but for me, After Effects is this precise tool for executing shots. I don’t want a fine tool for compositing — I want to work in broad strokes and then have someone come back and clean it up. I would love to have a tracking tool to composite two shots together for a seamless, split screen of two combined takes — features like that.

The After Effects integration and the color correction are awesome features for a single user to execute the film, but I don’t have the time to be the guy to execute the film at that high level. I just have to keep going. I want to be able to do a fast and dirty version so I know it’s not a terrible idea, and then turn to someone else and say, “OK, make that good.” After Effects is cool, but it’s more for VFX editors or single users who are trying to make a film on their own.

After all of these action films, are you ready to do a different type of film, like a period drama?
Funny you should say that. After Deadpool I worked on The Handmaid’s Tale pilot, and it was exactly that. I was working on this beautifully acted, elegant project with tons of women characters and almost everything was done in-camera. It was a lot of parlor room drama and power dynamics. And that was wonderful to work on after all of this VFX/action stuff. Periodically it’s nice to flex a different creative muscle.

It’s not that I only work on science-fiction/VFX projects — which I love — but, in part, people start associating you with a certain genre, and then that becomes an easy thing to pursue and get work for.

Much like acting, if you want to be known for doing a lot of different things, you have to actively pursue it. It’s easy to go where momentum will take you. If you want to be the editor who can cut any genre, you have to make it a mission to pursue those projects that will keep your resume looking diverse. For a brief moment after Deadpool, I might have been able to pivot to a comedy career (laughs). That was a real hybrid, so it was challenging to thread the needle of the different tones of the film and make it feel like one piece.

Any final thoughts on the challenges of editing Terminator: Dark Fate?
The biggest challenge of the film was that, in a way, the film was an ensemble: the Dani character, the Grace character, the Sarah character and Arnold’s character, the T-800. All of these characters are protagonists who have their individual arcs. The trick was adequately servicing those arcs without grinding the movie to a halt or going too long without touching base with a character. Figuring out how to dial that in was the major challenge of the movie, plus the scale of the VFX and finessing all the action scenes. I learned a lot.


Oliver Peters is an experienced film and commercial editor/colorist. In addition, he regularly interviews editors for trade publications. He may be contacted through his website at oliverpeters.com

Deadpool’s Premiere Pro editing workflow

By Nicholas Restuccio

Director Tim Miller’s Deadpool is action-packed, vulgar (in a good way) and a ton of fun. It’s also one of the few Hollywood blockbusters to be edited entirely on Adobe’s Premiere Pro.

On the Saturday following the film’s release, Adobe hosted a panel on the Fox Studios lot that included Deadpool’s post supervisor Joan Bierman, first assistant editor Matt Carson and Adobe consultants Vashi Nedomansky and Mike Kanfer. Here are some takeaways…

Why Premiere Pro?
According to Bierman, much of the credit for choosing Premiere Pro for the edit goes to Tim Miller. “Even before we had a crew, Tim knew he wanted to do this,” she said. Miller, a first-time feature director, is no stranger to technology; he is co-founder of Culver City’s Blur Studio, which specializes in visual effects and animation.

Miller’s friend, director David Fincher, is a big advocate of Adobe Premiere, and it’s likely that Fincher’s use of it to edit Gone Girl (the first feature cut with the product) inspired Miller. The rest of the credit goes to Ted Gagliano, president of post production at Fox, for giving the go-ahead for the road less taken.

DEADPOOL

Training and Storage
The first step in this undertaking was getting all the editors and assistants — who were used to editing on Media Composer and Final Cut — trained on Premiere. So they brought in editor Vashi Nedomansky — a Premiere Pro workflow consultant — who spent an initial three weeks training all five editors and established the workflow. He then returned for at least 12 days during the next nine months to further refine the workflow and answer questions both technical and editorial.

Additionally, he showed them features that are unique to Premiere, such as Dynamic Linking to After Effects projects and tapping the tilde (~) key to “full screen” the workspace section. “In our shared editing environment, because the editors were all coming from an Avid workflow, we treated Premiere Pro sequences as Avid bins,” explained Nedomansky. “Because Premiere Pro only allows one open project at a time… we shared sequences like you would share bins in Avid to allow all the editors access to the latest cuts.”

The next step was to get the multi-user editorial environment set up. They wanted to have several users, assistant editors and editors, get in and start working on the film simultaneously, without crashing into each other and corrupting files.

Jeff Brue’s Open Drives provided storage for the film via its product Velocity, which delivered 180TB of solid-state storage. With 5GB/s of “normal” throughput, the team had projects that would open in less than two minutes.

DEADPOOL

The solution to the multi-user access problem was much simpler and lower tech. When someone was working on a project file, they would move it to their named directory so nobody opened it mid-edit. Then, once they were done, they moved it back. So a little discipline went a long way in making sure that sharing media in a multi-user environment was stress-free.
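
That lock-by-move convention is simple enough to script. Purely as an illustration (this is not the Deadpool team’s actual tooling, and the volume paths and file names below are hypothetical), a checkout/checkin helper in Python might look like this:

```python
import shutil
from pathlib import Path

# Hypothetical layout: shared Premiere project files live in SHARED; an
# editor "checks out" a project by moving it into their own named directory.
SHARED = Path("/Volumes/velocity/deadpool/projects")
USERS = Path("/Volumes/velocity/deadpool/users")

def checkout(project_name: str, user: str) -> Path:
    """Move a project file into the user's directory so nobody opens it mid-edit."""
    src = SHARED / project_name
    if not src.exists():
        raise FileNotFoundError(
            f"{project_name} is not in the shared pool; someone may have it checked out."
        )
    dst_dir = USERS / user
    dst_dir.mkdir(parents=True, exist_ok=True)
    dst = dst_dir / project_name
    # On a single volume this move is a rename, so the move itself is the lock.
    shutil.move(str(src), str(dst))
    return dst

def checkin(project_name: str, user: str) -> Path:
    """Move the project file back to the shared pool once the edit is done."""
    src = USERS / user / project_name
    dst = SHARED / project_name
    shutil.move(str(src), str(dst))
    return dst

# Example: check out a reel, edit it in Premiere, then return it.
# checkout("reel_03.prproj", "editor_a")
# ... edit ...
# checkin("reel_03.prproj", "editor_a")
```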

When they needed a sequence in a project, they were able to link to it from another Premiere project without harming the source project. All of this allowed them to keep everything, as Nedomansky put it, “contained, safe and sharable.”

Re-Framing and Multi-Format Shooting
With all this in place, the team was ready to start cutting the wide array of footage the crew was producing. The film was shot primarily on the Arri Alexa at 3.2K RAW, but footage was also captured on 5K and 6K Red cameras and at least one Phantom. All of the footage was downsized to the common container format of 2048×1152 for the offline in Premiere and encoded in ProRes LT. This allowed them to do a center extraction, which gave the director and editor the ability to re-frame when they wanted to.
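
The article doesn’t say which tool generated those offline files, but the recipe itself is easy to picture. As a minimal sketch of the idea (the clip names are hypothetical, and the assumption that an ffmpeg transcode stands in for whatever dailies system the show actually used is mine), creating a 2048×1152 ProRes LT offline file might look like:

```python
import subprocess

# Hypothetical example: downscale one camera clip into the common
# 2048x1152 offline container as ProRes LT. Keeping the full frame in the
# offline is what later allows a center extraction to be repositioned.
cmd = [
    "ffmpeg",
    "-i", "A004_C012_0614.mov",   # hypothetical camera original
    "-vf", "scale=2048:1152",     # common container size for the offline
    "-c:v", "prores_ks",
    "-profile:v", "1",            # prores_ks profiles: 0=proxy, 1=LT, 2=standard, 3=HQ
    "-c:a", "pcm_s16le",          # uncompressed guide audio
    "A004_C012_0614_offline.mov",
]
subprocess.run(cmd, check=True)
```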

For the online, they went back to the Arri RAW, or other RAW formats, depending on their needs. The center extraction gave them a lot of creative freedom, so much so that they reframed the entire movie in the online. “If I had it to do over again, I would have done it [the reframing] in a cheaper room,” said Bierman.

Throughout the edit, the post team was burning its way through Mac Pros; the machines were having an issue with the AMD FirePro D700 cards in OS X. In all, the team burned through 10 of the cards, which would occasionally melt down on renders.

“There were some incredibly complex reels on Deadpool,” says Kanfer. “At one point midway through the production, reel five was taking over 10 minutes to load. Our engineers quickly regrouped, and within a week they were able to optimize the situation; the same reel took only 2 1/2 minutes to load once the fix was made. Other, less complex reels in the film loaded in a minute or less.”

Vashi Nedomansky, Matt Carson and Joan Bierman.

The sound team had to create a slight workaround for audio turnovers. In a traditional Avid workflow, the handoff to Avid Pro Tools is relatively seamless (as you would expect, since they are made by the same company), but going from Premiere required a little more effort. The package was the same as a normal sound turnover, including QuickTimes, guide tracks and EDLs, along with the AIFFs. The trouble occurred when the conform wasn’t always in sync with what had been turned over.

Adobe looks at all of this as an opportunity to make its product even stronger. The company said its engineers “love to tackle problems and there is no better place to tackle those problems than live on an edit.”

Final Takeaway
Even with training the editorial team to use a new program, working through audio conform hiccups and a pile of dead Mac towers, the team produced a polished film that had the best opening weekend for an R-rated film in history.

With improved sound turnover options hinted at for future versions of Premiere, we will very likely see more “Edited with Adobe Premiere Pro” logos in future film end credits.

Blur hires EP Greg Talmage, promotes others

Culver City’s Blur Studio, known for its VFX and design work for games, spots and films, has hired Greg Talmage as executive producer. Talmage brings 15 years of creative services experience to Blur and comes most recently from Iron Claw, an Emmy Award-winning production company that he co-founded in 2008.

In his new role Talmage will help guide Blur’s continued growth in games, commercials, VFX and film projects, and drive opportunities for Blur’s emerging pool of directors.

“I’ve always been a fan of Blur,” he says. “Their work has emotional impact and authentic, human resonance. Blur pushes the creative concept until it’s absolutely memorable and hard-hitting. My goal is to broaden our horizons, elevate brand awareness and expose more people to the passion and talent here — we work with top-level movie directors, develop our own properties, write trailers, direct commercials and create stunning animation. Now, it’s time for Blur to also shine in different markets and new forms of media.”

Talmage began his entertainment career at DreamWorks in marketing in the late ’90s and then gravitated toward design and short-form projects, rising through the ranks as a producer at companies that included Imaginary Forces, Transistor Studios, Troika and Logan, where he managed projects for major brands such as Microsoft, Apple and Electronic Arts.

In addition to bringing Talmage on board, Blur has promoted three longtime artists to leadership roles within the company. Director Dave Wilson assumes creative director responsibilities, providing a guiding vision on all company projects together with Blur co-founder Tim Miller, while former VFX/CG supervisor Kevin Margo will be directing commercials, shorts and content for video game marketing campaigns. Previously the most senior CG supervisor at Blur, Jerome “Jed” Denjean now oversees all department leads as head CG supervisor, in addition to managing projects.

Blur replaces proprietary management tool, installs Shotgun

Los Angeles — Blur Studio, makers of feature films, commercials and game cinematics and trailers, is now using Shotgun’s software for production management and tracking. Shotgun replaces Blur’s in-house production management software, which was tying up valuable internal development resources.

“Pipeline and infrastructure software may not be the sexiest application you’ll find in a VFX and animation studio, but it’s certainly one of the most crucial to producing great work,” says Tim Miller, co-founder of Blur Studio. “The less time our artists and producers spend managing, the more time we can spend creating. As projects get more complicated and schedules get more challenging it’s critical to have tools to efficiently manage the huge amounts of detailed data needed to get the job done. Shotgun is both producer- and artist-friendly, which makes it in our opinion the best tool out there for helping us deliver great work.”

Prior to integrating Shotgun into their pipeline, Blur had two full-time developers working on a proprietary toolset called “Trax” for production management. The developers were allocating a majority of their time to updating the toolset. As project volume increased and the studio expanded with a new facility move, Blur’s production management needs were no longer being met by their in-house system.

“While Trax has certain features and advantages that will continue to be utilized for studio-wide planning and reporting, as a production tool it was falling further behind. Shotgun was gaining prominence for offering a highly equipped, out-of-the-box production management platform for animation, games and VFX studios,” says Jeff Beeland, Blur pipeline supervisor. “Shotgun has over a dozen dedicated developers and a full-time support team, and once we started testing it, we quickly realized we could never match what they deliver with something created in-house.”

Once the decision was made to standardize on Shotgun, Blur seamlessly connected the Shotgun database with their pipeline and in-house database. Both feature a lightweight Python API, allowing Shotgun to link effortlessly with their internal system along with core artist apps including Autodesk 3ds Max. Now fully integrated into their daily operations, Shotgun powers production management for an average user base of 85 artists across all of Blur’s projects which most recently have included Thor: The Dark World and promotional trailers for The Elder Scrolls. Using Shotgun’s Web-based interface, producers, coordinators and artists at all levels can add and link data seamlessly to the database for easy access.
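
Shotgun’s Python API (shotgun_api3) is public, so the kind of linking Beeland describes is easy to sketch. The following is not Blur’s actual integration; the site URL, credentials, project ID and shot name are placeholders. It shows an in-house system pushing a render into Shotgun as a Version attached to an existing Shot:

```python
import shotgun_api3

# Placeholder credentials; a real studio would use its own Shotgun site
# and a script key created through Shotgun's admin pages.
sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",
    script_name="pipeline_sync",
    api_key="0123456789abcdef",
)

# Look up the Shot this render belongs to, then create a linked Version
# so producers and artists see it alongside everything else in Shotgun.
shot = sg.find_one("Shot", [["code", "is", "seq010_sh0040"]])
version = sg.create("Version", {
    "project": {"type": "Project", "id": 64},  # hypothetical project id
    "entity": shot,
    "code": "seq010_sh0040_comp_v012",
    "description": "Pushed automatically from the in-house pipeline database",
})
print("Created Version", version["id"])
```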

In the future, Blur plans on further integrating Shotgun into their daily operations to track budgets and artists’ time sheets. “We’ve come a long way in a short time with Shotgun,” concludes Beeland. “It’s been great for our artists, producers and supervisors alike, and aside from building a great production management platform — Shotgun’s support and customer service is significantly ahead of what is standard in the software business.”

Photo Caption: Blur’s pipeline supervisor, Jeff Beeland, using Shotgun.


Tim Miller and Blur create prologue for ‘Marvel’s Thor: The Dark World’


Culver City — Tim Miller, co-founder of Blur Studio, was tapped by Marvel Studios to head up the three-minute prologue sequence that sets the stage for its upcoming sequel Thor: The Dark World. Miller created the sequence, which is almost entirely CG.

“Blur and Tim Miller have a distinct understanding of the Marvel Universe,” said producer Kevin Feige. “That alone, not to mention their storytelling and CG expertise, made the opening and end titles standout sequences in the film.”

Narrated by Anthony Hopkins (Odin), the prologue establishes context for the story of Marvel’s Thor: The Dark World. The sequence is set in Svartalfheim, during an alignment of the Nine Realms 5000 years ago, when Malekith (Christopher Eccleston) battled Odin’s father, King Bor.
