
ILM’s Richard Bluff talks VFX for Marvel’s Doctor Strange

By Daniel Restuccio

Comic book fans have been waiting for over 30 years for Marvel’s Doctor Strange to come to the big screen, and dare I say it was worth the wait. This is in large part because of the technology now available to create the film’s stunning visual effects.

Fans have the option to see the film in traditional 2D, Dolby Cinema (worthy of an interstate or plane fare pilgrimage, in my opinion) and IMAX 3D. Doctor Strange, Marvel Studios’ 15th film offering, is also receiving good critical reviews and VFX Oscar buzz — it’s currently on the list of 20 films still in the running in the Visual Effects category for the 89th Academy Awards.

The unapologetically dazzling VFX shots, in many cases directly inspired by Steve Ditko’s original comic visuals, were created by multiple visual effects houses, including Industrial Light & Magic, Luma Pictures, Lola VFX, Method Studios, Rise FX, Crafty Apes, Framestore, Perception and previs house The Third Floor. Check out our interview with the film’s VFX supervisor Stephane Ceretti.

Director Scott Derrickson said in a recent Reddit chat that Doctor Strange is “a fantastical superhero movie.”

“Watching the final cut of the film was deeply satisfying,” commented Derrickson. “A filmmaker cannot depend upon critical reviews or box office for satisfaction — even if they are good. The only true reward for any artist is to pick a worthy target and hit it. When you know you’ve hit your target that is everything. On this one, I hit my target.”

Since we got an overview of how the visual effects workflow went from Ceretti, we decided to talk to one of the studios that provided VFX for the film, specifically ILM and their VFX supervisor Richard Bluff.

Richard Bluff

According to Bluff, early in pre-production Marvel presented concept art, reference images and previsualization on “what were the boundaries of what the visuals could be.” After that, he says, they had the freedom to search within those bounds.

During VFX presentations with Marvel, they frequently showed three versions of the work. “They went with the craziest version to the point where the next time we would show three more versions and we continued to up the ante on the crazy,” recalls Bluff.

As master coordinator of this effort for ILM, Bluff encouraged his artists, “to own the visuals and try to work out how the company could raise the quality of the work or the designs on the show to another level. How could we introduce something new that remains within the fabric of the movie?”

As a result, says Bluff, they had some amazing ideas flow from individuals on the film. Jason Parks came up with the idea of traveling through the center of a subway train as it fractured. Matt Cowey invented the notion of continually rotating the camera to heighten the sense of vertigo. Andrew Graham designed the kaleidoscope-fighting arena “largely because his personal hobby is building and designing real kaleidoscopes.”

Unique to Doctor Strange is that the big VFX sequences are all very “self-contained.” For example, ILM did the New York and Hong Kong sequences, Luma did the Dark Dimension and Method did the multi-universe. ILM also designed and developed the original concept for the Eldritch Magic and provided all the shared “digital doubles” — CGI rigged and animatable versions of the actors — that tied sequences together. The digital doubles were customized to the needs of each VFX house.

Previs
In some movies previs material is generated and thrown away. Not so with Doctor Strange. What ILM did this time was develop a previs workflow where they could actually hang assets and continue to develop, so it became part of the shot from the earliest iteration.

There was extensive previs done for Marvel by The Third Floor as a creative and technical guide across the movie, and further iterations internal to ILM done by ILM’s lead visualization artist, Landis Fields.

Warning! Spoiler! Once Doctor Strange moves the New York fight scene into the mirror universe, the city starts coming apart in an M.C. Escher-meets-Chris Nolan-Inception kind of way. To make that sequence, ILM created a massive tool kit of New York set pieces and geometry, including subway cars, buildings, vehicles and fire escapes.

In the previs, Fields started breaking apart, duplicating and animating those objects, like the fire escapes, to tell the story of what a kaleidoscoping city would look like. The artists then fleshed out a sequence of shots, a.k.a. “mini beats.” They absorbed the previs into the pipeline by later switching out the gross geometry elements in Fields’ previs with the actual New York hero assets.

Strange Cam
Fields and the ILM team also designed and built what ILM dubbed the “strange cam,” a custom 3D-printed 360° GoPro rig that had to withstand the rigors of being slung off the edge of skyscrapers. ILM wanted to capture 360 degrees of rolling footage from that vantage point to be used as moving background “plates” that could be reflected in the New York City glass buildings.
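
Using a 360° plate as a reflection source typically means treating the stitched footage as an equirectangular (lat-long) environment map and sampling it by reflection direction. Below is a minimal Python sketch of that lookup; the math is the standard lat-long mapping, but the function names and conventions are illustrative, not ILM's pipeline.

import math

# Minimal sketch: sample an equirectangular (lat-long) environment plate
# by reflection direction. The mapping math is standard; the names and
# conventions here are illustrative, not ILM's pipeline.
def reflect(view, normal):
    dot = sum(v * n for v, n in zip(view, normal))
    return [v - 2.0 * dot * n for v, n in zip(view, normal)]

def latlong_uv(direction):
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)  # longitude maps to U
    v = 0.5 - math.asin(y) / math.pi               # latitude maps to V
    return u, v

# A camera ray hitting a glass facade that faces +Z:
print(latlong_uv(reflect([0.0, 0.0, -1.0], [0.0, 0.0, 1.0])))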

VFX, Sound Design and the Hong Kong Sequence
One of the big challenges with the Hong Kong sequence was that time was reversing and moving forward at the same time. “What we had to do was ensure the viewer understands that time is reversing throughout that entire sequence,” says Bluff. During the tight hand-to-hand action moments that are moving forward in time, there’s not really much screen space to show you time reversing in the background. So they designed the reversing destruction sequence to work in concert with the sound design. “We realized we had to move away from a continuous shower of debris toward rhythmic beats of debris being sucked out of frame.”


Bluff says the VFX shot count on the film — 1,450 shots — was actually a lot lower than on Captain America: Civil War. From a VFX point of view, the Avengers movies lean on the assets generated in Iron Man and Captain America, and the Thor movies help provide the context for what an Avengers movie would look and feel like. In Doctor Strange, “almost everything in the movie had to be designed (from scratch) because it didn’t already exist in a previous Marvel film. It’s a brand-new character to the Marvel world.”

Bluff started development on the movie in October of 2014 and really started doing hands-on work in February of 2016, frequently traveling between Vancouver, San Francisco and London. A typical day, working out of the ILM London office, would see him get in early and immediately deal with review requests from San Francisco. Then he would jump into “dailies” in London and work with that team until the afternoon. After “nightlies” with London there was a “dailies” session with San Francisco and Vancouver; he’d work with them until evening, hit the hotel, grab some dinner, then come back around 11:30pm or midnight to do nightlies with San Francisco. “It just kept the team together, and we never missed a beat.”

2D vs. IMAX 3D vs. Dolby Cinema
Bluff saw the entire movie for the first time in IMAX 3D, and is looking forward to seeing it in 2D. Considering the movie’s sequences are surreal and Escher-like, there’s an argument that IMAX 3D is the better way to see it, because it enhances the already bizarre version of that world. However, he believes the 2D and 3D versions are really “two different experiences.”

Dolby Cinema is the merging of Dolby Atmos — 128-channel surround sound — with the high dynamic range of Dolby Vision, plus really comfortable seats. It is, arguably, the best way to see a movie. Bluff says as far as VFX goes, high dynamic range information has been there for years. “I’m just thankful that exhibition technology is finally catching up with what’s always been there for us on the visual effects side.”

During that Reddit interview, Derrickson commented, “The EDR (Extended Dynamic Range) print is unbelievable — if you’re lucky enough to live where an EDR print is playing. As for 3D and/or IMAX, see it that way if you like that format. If you don’t, see it 2D.”

Doctor Strange is probably currently playing in a theater near you, but go see it in Dolby Cinema if you can.


In addition to being a West Coast correspondent for postPerspective, Daniel Restuccio is the multimedia department chair at California Lutheran University and former Walt Disney Imagineer.

Grading & Compositing Storage: Northern Lights

Speed is key for artist Chris Hengeveld.

By Beth Marchant

For Flame artist Chris Hengeveld of Northern Lights in New York City, high-performance file-level storage and a Fibre Channel connection mean it’s never been easier for him to download original source footage and share reference files with editorial on another floor. But Hengeveld still does 80 percent of his work the old-fashioned way: off hand-delivered drives that come in with raw footage from production.

Chris Hengeveld

The bicoastal editorial and finishing facility Northern Lights — parent company to motion graphics house Mr. Wonderful, the audio facility SuperExploder and production boutique Bodega — has an enviably symbiotic relationship with its various divisions. “We’re a small company but can go where we need to go,” says colorist/compositor Hengeveld. “We also help each other out. I do a lot of compositing, and Mr. Wonderful might be able to help me out or an assistant editor here might help me with After Effects work. There’s a lot of spillover between the companies, and I think that’s why we stay busy.”

Hengeveld, who has been with Northern Lights for nine years, uses Flame Premium, Autodesk’s visual effects finishing bundle of Flame and Flare with grading software Lustre. “It lets me do everything from final color work, VFX and compositing to plain-old finishing to get it out of the box and onto the air,” he says. With Northern Lights’ TV-centric work now including a growing cache of Web content, Hengeveld must often grade and finish in parallel. “No matter how you send it out, chances are what you’ve done is going to make it to the Web in some way. We make sure that what looks good on TV also looks good on the Web. It’s often just two different outputs. What looks good on broadcast you often have to goose a bit to get it to look good on the Web. Also, the audio specs are slightly different.”

Hengeveld provided compositing and color on this spot for Speedo.

Editorial workflows typically begin on the floor above Hengeveld in Avid, “and an increasing number, as time goes by, in Adobe Premiere,” he says. Editors are connected to media through a TerraBlock shared storage system from Facilis. “Each room works off a partition from the TerraBlock, though typically with files transcoded from the original footage,” he says. “There’s very little that gets translated from them to me, in terms of clip-based material. But we do have an Aurora RAID from Rorke (now Scale Logic) off which we run a HyperFS SAN — a very high-performance, file-level storage area network — that connects to all the rooms and lets us share material very easily.”

The Avids in editorial at Northern Lights are connected by Gigabit Ethernet, but Hengeveld’s room is connected by Fibre. “I get very fast downloading of whatever I need. That system includes Mr. Wonderful, too, so we can share what we need to, when we need to. But I don’t really share much of the Avid work except for reference files.” For that, he goes back to raw camera footage. “I’d say about 80 percent of the time, I’m pulling that raw shoot material off of G-Technology drives. It’s still sneaker-net getting those source drives, and I don’t think that’s ever going to change,” he says. “I sometimes get 6TB of footage in for certain jobs, and you’re not going to copy all of that to centrally located storage, especially when you’ll end up using about a hundredth of that material.”

The source drives are typically dupes from the production company, which more often than not is sister company Bodega. “These drives are not made for permanent storage,” he says. “These are transitional drives. But if you’re storing stuff that you want to access in five to six years, it’s really got to go to LTO or some other system.” It’s another reason he’s so committed to Flame and Lustre, he says. Both archive every project locally with its complete media, which can then be easily dropped onto an LTO for safe long-term storage.
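
The LTO step itself can be as simple as streaming a tar archive of the flattened project folder to the tape device. Here is a minimal Python sketch, assuming a Linux host that exposes the drive at /dev/nst0; the device path and folder layout are assumptions for illustration, not Northern Lights' actual pipeline.

import tarfile

# Minimal sketch: stream a project archive folder to an LTO drive as a tar
# archive. Assumes a Linux host exposing the tape drive at /dev/nst0; the
# device path and layout are illustrative, not Northern Lights' pipeline.
def archive_to_tape(project_dir, tape_device="/dev/nst0"):
    with open(tape_device, "wb") as tape:
        with tarfile.open(fileobj=tape, mode="w|") as tar:  # streaming tar write
            tar.add(project_dir, arcname=project_dir.rstrip("/").split("/")[-1])

# Example (hypothetical path):
# archive_to_tape("/archives/flame/spot_2016_001")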

Time or money constraints can shift this basic workflow for Hengeveld, who sometimes receives a piece of a project from an editor that has been stripped of its color correction. “In that case, instead of loading in the raw material, I would load in the 15- or 30-second clip that they’ve created and work off of that. The downside with that is if the clip was shot with an adjustable format camera like a Red or Arri RAW, I lose that control. But at least, if they shoot it in Log-C, I still have the ability to have material that has a lot of latitude to work with. It’s not desirable, but for better stuff I almost always go back to the original source material and do a conform. But you sometimes are forced to make concessions, depending on how much time or budget the client has.”
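
The latitude Hengeveld mentions comes from the log encoding itself: the camera squeezes a wide linear exposure range into the Log-C curve, and grading expands it back out. Below is a minimal Python sketch of the decode, using constants from ARRI's published LogC (EI 800) formula; they are quoted for illustration and worth verifying against ARRI's white paper before relying on them.

# ARRI LogC (EI 800) decode: encoded value t (0..1) back to scene-linear.
# Constants follow ARRI's published LogC formula for EI 800; treat them as
# illustrative and verify against ARRI's white paper before relying on them.
A, B, C, D, E, F, CUT = 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809, 0.010591

def logc_to_linear(t):
    if t > E * CUT + F:                       # log segment
        return (10 ** ((t - D) / C) - B) / A
    return (t - F) / E                        # linear toe near black

# Small steps in code value near the top of the curve map to large steps
# in linear light, which is where the highlight latitude lives.
for t in (0.2, 0.5, 0.8, 1.0):
    print(t, round(logc_to_linear(t), 4))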

A recent spot for IZOD, with color by Hengeveld.

Those same constraints, paired with advances in technology, also mean far fewer in-person client meetings. “So much of this stuff is being evaluated on their computer after I’ve done a grade or composite on it,” he says. “I guess they feel more trust with the companies they’re working with. And let’s be honest: when you get into these very detailed composites, it can be like watching paint dry. Yet, many times when I’m grading, I love having a client here because I think the sum of two is always greater than one. I enjoy the interaction. I learn something and I get to know my client better, too. I find out more about their subjectivity and what they like. There’s a lot to be said for it.”

Hengeveld also knows that his clients can often be more efficient at their own offices, especially when handling multiple projects at once, influencing their preferences for virtual meetings. “That’s the reality. There’s good and bad about that trade off. But sometimes, nothing beats an in-person session.”

Our main image is from NBC’s Rokerthon.

Jon Neill joins Axis as head of lighting, rendering, compositing

Axis Animation in Glasgow, Scotland, has added Jon Neill as their new head of lighting, rendering and compositing (LRC). He has previously held senior positions at MPC and Cinesite, working on such projects as Jungle Book, Skyfall and Harry Potter and the Order of the Phoenix.

His role at Axis will be overseeing the LRC team at both the department and project level, providing technical and artistic leadership across multiple projects and managing the day-to-day production needs.

“Jon’s supervisory skills coupled with knowledge in a diverse range of execution techniques is another step forward in raising the bar in both our short- and long-form projects,” says Graham McKenna, co-founder and head of 3D at Axis.

SGO Mistika now compatible with AJA’s Kona, Corvid

SGO, makers of the color grading/finishing tool Mistika, has partnered with video hardware developer AJA. Mistika is now fully compatible with AJA’s line of Kona and Corvid video capture and playback cards, offering optimized video output. The combination of the latest version of Mistika with the AJA hardware boosts support for extreme video formats, including 4K stereo 3D dual link, even at HFR frame rates up to 60p.

AJA’s Kona capture, display and mastering products for SD, HD, 3G, Dual Link HD, 2K and 4K are a good match with Mistika, which provides a complete post feature set for projects of any practical resolution and frame rate, even beyond 8K. Stereo 3D output in 4K using the Corvid 88 I/O card is already available, along with viable future 8K capabilities for Mistika Ultima 8K systems.
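
To put those “extreme” formats in perspective, a rough uncompressed data-rate estimate for 4K stereo 3D at 60p, assuming 10-bit RGB (the bit depth and sampling here are assumptions for illustration, not a vendor specification), looks like this:

# Rough uncompressed data-rate estimate for 4K stereo 3D at 60p.
# Assumes 4096 x 2160, 10-bit RGB (30 bits per pixel) and two eyes;
# purely illustrative numbers, not a vendor specification.
width, height = 4096, 2160
bits_per_pixel = 30      # 10 bits per channel, RGB
fps, eyes = 60, 2

bits_per_second = width * height * bits_per_pixel * fps * eyes
print(f"{bits_per_second / 1e9:.1f} Gbit/s uncompressed")  # roughly 31.9 Gbit/s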

Blackmagic makes Fusion 8 Studio public beta available, releases Resolve 12.2

Fusion 8 Studio, the full version of Blackmagic’s visual effects and motion graphics software, is available for download for both Mac OS X and Windows. A public beta of the free version of Fusion 8 was released earlier this year at SIGGRAPH. The new Fusion 8 Studio public beta builds upon all of the tools in the free version and adds advanced optical flow tools for retiming, image repair, color smoothing and morphing between different images, along with the ability to render at resolutions larger than Ultra HD.

The Fusion 8 Studio public beta also adds advanced stereoscopic tools for converting 2D shows to 3D, support for third-party plug-ins, remote scripting and Avid Connect, a plug-in that allows customers to use Fusion directly from Media Composer timelines.

Projects created with the free version of Fusion can be opened and finished in Fusion 8 Studio, regardless of which platform they were created on. Fusion 8 Studio also includes Generation — multi-user studio software for managing assets, tracking versions and doing shot-based review and approval.

In addition, Fusion 8 Studio public beta also includes render node software that lets customers install an unlimited number of Fusion render nodes on additional computers for free, saving them thousands of dollars in licensing fees. That means customers working on high-end film and television projects in large multi-user studios can now accelerate their workflow by distributing render jobs across an unlimited number of systems on their network.

Fusion 8 is available in two versions. Fusion 8 Studio, which is now in public beta, will be available for Mac and Windows for $995, with Linux to be released in Q1 2016. Fusion 8 Studio has all of the same features as the free version and adds advanced optical flow image analysis tools for stereoscopic 3D work, retiming and stabilization. Fusion Studio also includes support for third-party OpenFX plug-ins, unlimited distributed network rendering and Generation for studio-wide, multi-user collaboration to track, manage, review and approve shots when working with large creative teams on complex projects.

In other news, there is a free DaVinci Resolve 12.2 update that adds support for the latest color science technologies, along with decoding of HEVC/H.265 QuickTime files on OS X, additional high dynamic range features and more. The DaVinci Resolve 12.2 update is available now for both DaVinci Resolve 12 and DaVinci Resolve 12 Studio customers, and can be downloaded from the Blackmagic Design website.

Resolve

Since November’s release of version 12.1, Blackmagic has been adding features pro editors and colorists need, as well as support for the latest formats with expanded color spaces and wide dynamic range. With this DaVinci Resolve 12.2 update, Blackmagic Design continues to improve the software and extend its lead in color, dynamic range and image processing, putting DaVinci Resolve far ahead of other color correction software.

The DaVinci Resolve 12.2 update adds support for the latest Blackmagic and third-party cameras while also delivering significant improvements to DaVinci Resolve color management. Customers get new support for HDR Hybrid Log Gamma, conversion LUTs for Hybrid Log Gamma, ACES IDTs for Canon C300 Mk II clips, and updated ST 2084 HDR color science. That means colorists have even better tools for finishing high dynamic range projects that are going to be distributed to the latest theaters with the latest projection systems like IMAX Laser and Dolby Vision. This also lets customers prepare content that is ready for next generation HDR 4K televisions.

In addition, the DaVinci Resolve 12.2 update adds support for NewBlue Titler Pro titles using Media Composer AAF sequences, improves ProRes 4444 alpha channel support by defaulting to straight blend mode, retains Power Window opacity and invert settings when converting to Power Curve windows and more.

Quick Chat: ‘Mermaids on Mars’ director Jon V. Peters

Athena Studios, a Bay Area production and animation company, has completed work on a short called Mermaids on Mars, which is based on a children’s book and original music by the film’s producer Nancy Guettier. It was directed by Jon V. Peters and features the work of artists whose credits include the stop-motion offerings Coraline, James and the Giant Peach and The Nightmare Before Christmas, as well as many other feature-length films.

The film is about a young boy who is magically transported to Mars, where he tries to stop an evil Martian from destroying the last of the planet’s mermaids. The entire story was told with stop-motion animation, which was shot on Athena Studios‘ (@AthenaStudios) soundstage.

The 24-minute film comprised 300 shots. Many involved complex compositing, putting heavy demands on Athena’s small team of visual effects artists, who were working within a post schedule of just over three months.

Mermaids on Mars

Kat Alioshin (Coraline, The Nightmare Before Christmas, Monkeybone, Corpse Bride) was co-producer of the film, running stages and animation production. Vince De Quattro (Hellboy, Pirates of the Caribbean, Star Wars, Mighty Joe Young) is the film’s digital post production supervisor.

Let’s find out more from Peters, who, in addition to directing and producing Mermaids on Mars, is also the founder of Athena Studios.

Why did you decide to create Mermaids on Mars as an animated short?
The decision was budget-driven, primarily. We were originally approached by Nancy Guettier, who is the author of the book the film is based on, and one of the film’s producers. She had originally presented us with a feature length script with 12 songs. Given budgetary restrictions, however, we worked with Nancy and her screenwriter, Jarrett Galante, to cut the film down to a 24-minute short that retained five of her original songs.

What are some of the challenges you faced turning a book into an animated short?
The original book is a charming short story that centers more on mermaids conserving water. The first feature-length script had added many other elements, which brought in Martian armies and a much more detailed storyline. The biggest problem we had was trying to simplify the story as much as possible without losing the heart of the material. Because of our budget, we were also limited in the number of puppets and the design of our sets.


Are there wrong ways to go about this?
There are hundreds, perhaps thousands, of ways to approach production on a film like this. The only “wrong” way would have been to ignore the budget. As many other films have shown, limitations (financial or otherwise) can breed creativity. If you walk the budget backward it can help you define your approach. The film’s co-producer, Kat Alioshin, had worked on numerous stop-motion features previously, so she had a good handle on what the cost for each element would be.

Describe your thought process for setting the stage for Mermaids on Mars.
Originally, we looked at doing the entire production as more of a 2D stop-motion down-shooter design, but the producer really wanted 3D characters. We did not have the budget for full sets, however. As we looked at combining a 2D set design with 3D practical stop-motion puppets, it took us all the way back to Georges Méliès, the father of visual effects. He was a stage magician, and his films made use of flats in combination with his actors. We drew inspiration from his work in the design of our production.


While we wanted to shoot as much in-camera as possible we knew that because of the budget we would need to rely almost as much on post production as the production itself. We shot many of the elements individually and then combined them in post. That part of the production was headed up by veteran visual effects artist Vince De Quattro.

What cameras did you use? 
Animation was shot on Canon DSLR cameras, usually 60D, using DragonFrame. The puppeted live-action wave rank shots were done on a Blackmagic Studio Camera in RAW and then graded in DaVinci Resolve to fit with the Canon shots. Live action shots (for the bookends of the film) were shot on Red Epic cameras.

What was used for compositing and animation?
All compositing was done in Adobe After Effects. There was no 3D animation in the film since it was all practical stop-motion, but the 3D models for the puppet bodies (used for 3D printing and casting) were done in Autodesk Maya.

Was the 2D all hand drawn?
Yes, all 2D was hand drawn and hand painted. We wanted to keep a handmade feel to as many aspects of the film as possible.

How much time did you devote to the set-up and which pieces took the longest to perfect?
It was a fairly quick production for a stop-motion piece. Given the number of stages, shop needs, size of the project and other shoots we had scheduled, we knew we could not shoot it in our main building, so we needed to find another space. We spent a lot of our time looking for the right building, one that met the criteria for the production. Once we found it we had stages set up and running within a week of signing the lease agreement.

Our production designer Tom Proost (Galaxy Quest, Star Wars — The Phantom Menace, Lemony Snicket’s, Coraline) focused on set and prop building of the hero elements, always taking a very “stage-like” approach to each. We had a limited crew so his team worked on those pieces that were used in the most shots first. The biggest pieces were the waves of the ocean, used on both Earth and Mars, a dock set, the young boy’s bedroom, the mermaid palace, the Martian fortress and a machine called the “siphonator.”


Initial builds and animation took approximately six months, and post production took an equal amount of time.

What was your favorite set to work with, and why?
There were many great sets, but I think the wave set that Tom Proost and his team built was my favorite. It was very much a practical set that had been designed as a raked stage with slots for each of the wave ranks. It was manually puppeted by the crew as they pulled the waves back and forth to create the proper movement. That was filmed and then the post production team composited in the mermaid characters, since they could not be animated within the many wave ranks.

You did the post at Athena?
Twenty-four minutes of film with an average of five composited iterations per shot equates to approximately 300,000 frames processed to final, all completed by Athena’s small team under tight festival deadlines.

IBC: Autodesk to release Extension 1 for Flame 2016 line

Autodesk will soon release Extension 1 for its Flame 2016 family of 3D VFX software, which includes Autodesk Flame, Autodesk Flare, Autodesk Lustre and Autodesk Flame Assist. Inspired by user feedback, Autodesk added workflow improvements, new creative tools and a performance boost. Flame 2016 Extension 1 will be available to subscription customers on September 23.

Highlights of the Flame 2016 Extension 1 release are:
– Connected Conform: A new, unified media management approach to sharing, sorting and syncing media across different sequences for faster finishing in Flame Premium, Flame and Flare. New capabilities include shared sources, source sequence, shots sequence, shared segment syncing and smart replace.
– Advanced Performance: Realtime, GPU-accelerated debayering of Red and ArriRaw source media using high-performance Nvidia K6000 or M6000 graphics cards. The performance boost allows artists to begin working instantly in Flame Premium, Flame, Flare and Lustre. (A simple sketch of what debayering involves follows this list.)
– GMask Tracer: New to Flame Premium, Flame and Flare, this feature simplifies VFX creation with spline-based shape functionality and a chroma-keying algorithm.
– User-Requested Features: Proxy workflow enhancements, new batch context views, refined cache status, full-screen views, redesigned tools page and more.
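
For context on what debayering involves: a raw frame stores one color sample per photosite in a Bayer mosaic, and demosaicing reconstructs full RGB at every pixel. The Python sketch below uses the simplest bilinear method purely as an illustration; real Red and ArriRaw debayering, GPU-accelerated or not, is far more sophisticated.

import numpy as np
from scipy.ndimage import convolve

# Minimal bilinear demosaic of an RGGB Bayer mosaic: each missing color
# sample is filled by a weighted average of its nearest same-color
# neighbors. Purely illustrative; real Red/ArriRaw debayering is far
# more advanced than this.
def debayer_rggb(raw):
    h, w = raw.shape
    rows, cols = np.indices((h, w))
    masks = [
        (rows % 2 == 0) & (cols % 2 == 0),   # red photosites
        (rows % 2) != (cols % 2),            # green photosites
        (rows % 2 == 1) & (cols % 2 == 1),   # blue photosites
    ]
    kernel = np.array([[1.0, 2.0, 1.0],
                       [2.0, 4.0, 2.0],
                       [1.0, 2.0, 1.0]]) / 4.0
    planes = []
    for mask in masks:
        sparse = raw * mask
        filled = convolve(sparse, kernel, mode="mirror")
        weight = convolve(mask.astype(float), kernel, mode="mirror")
        planes.append(filled / weight)
    return np.stack(planes, axis=-1)

# Flat-gray mosaic sanity check: every channel should come back ~0.5.
mosaic = np.full((6, 8), 0.5)
print(debayer_rggb(mosaic).mean(axis=(0, 1)))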

Behind the Title: Encore VFX’s Robert Minshall

NAME: Robert Minshall

COMPANY: Encore VFX (@encorepost) in Hollywood.

CAN YOU DESCRIBE WHAT ENCORE DOES?
Encore is a post facility that specializes in the picture finishing of episodic television. This includes dailies, online editing, final color and VFX. Encore is a division of Deluxe Entertainment Services.

WHAT’S YOUR JOB TITLE?
Senior Compositor

WHAT DOES THAT ENTAIL?
I create VFX by combining a variety of 2D and 3D elements in a (mostly) 2D environment.

Neverland Stop the Bleeding

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
In my case, it would probably be the things that aren’t included in the title. I have extensive editorial experience dating back to the mid-‘80s, and I tap into those skills on a regular basis, especially when working on NCIS (pictured above). In addition to extensive VFX work, I handle all VFX drop-ins into the masters, drop-in inserts, re-conforms and, occasionally, I even handle minor recuts to better accommodate the VFX.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Finishing difficult shots seamlessly. Of course, if I do it right, the average person would have no idea that anything was done to a particular shot, which is the ultimate objective. Client contact, which can be quite extensive, is also a part of the work that I like.

WHAT’S YOUR LEAST FAVORITE?
Working on difficult production fixes that take a lot of time or iteration, with very little payoff.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
That’s a tough question. This is actually my third career. I was an engineer when I got out of college and then spent a number of years pursuing a career in music before ending up in post. At this point, I’d probably be involved in some other aspect of television.

WHY DID YOU CHOOSE THIS PROFESSION?
It’s more like this profession chose me. I graduated with an engineering degree from MIT and moved to LA from Boston to pursue a music career. When that didn’t take off immediately, I found myself working as a materials engineer on the space shuttle program (at which point, I could actually call myself a rocket scientist from MIT).

After a few years, I chose to shift my focus back into music — with moderate success — mostly in the R&B field, working with artists such as Barry White, Deniece Williams, Johnny Nash and the Motown production team of Holland-Dozier-Holland, both recording and touring.

Since the work was sporadic, I also took side jobs to make ends meet, one of which landed me in a video facility at the time when the “video explosion” was the subject of magazine covers. Eventually, I went from wiring a facility to tape-op to editor to senior editor, with extensive visual effects work, to finally the Inferno workstation, where I still am.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
NCIS, NCIS: New Orleans, Under the Dome, Extant, Newsroom, House M.D., Weeds.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Working on the original The X-Files really helped me build a reputation in the industry. I delivered the pilot personally to Chris Carter and was the finish editor on the show for the first four years. After the fourth season, I jumped to the Inferno, which is where I still am. My involvement with such a wildly popular show provided me with an unusually high profile. I also made significant contributions to Ally McBeal, Deadwood and now NCIS, which are obvious points of pride.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My smartphone is indispensable; I use it to stay in constant contact with clients. My PC is also important because I am always emailing QuickTime files to producers for notes and approval, as well as doing various searches relevant to my work. Obviously, my workstation is key — without it, I wouldn’t be doing any of this.

DO YOU LISTEN TO MUSIC WHILE YOU WORK? CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Sometimes I listen to music but other times it gets in the way. I like everything from classical music to ‘60s rock and ‘70s R&B. I have a penchant for female vocalists, and it’s a great time for them – Rihanna, Katy Perry, Beyonce, Kelly Clarkson, etc., along with a number of more obscure ones. Taylor Swift is also hard to ignore. I enjoy Prince as well and have a soft spot for Sly and the Family Stone. As far as I’m concerned, he pretty much invented funk, which, for a time, was a large part of my life. Folk, blues, guitar-driven rock, even some hip-hop if it’s good. There isn’t much I don’t like.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I’ve delivered hundreds of TV shows either the day before or the day of air, so tight deadlines are just part of what I do, and it doesn’t stress me out the way you might think. It’s just the end product of what I do.

Outside of work, I spend time with my wife. We’ve been together for 37 years and I love her as much as I ever did. I enjoy going to the beach and riding waves. I play some golf and try to play at least an hour of music every day, either piano (mostly classical) or guitar. I love to travel when I can. I am never at a loss for things to do.

Free public beta of Fusion 8 now available for Mac and PC

The public beta of the free version of Blackmagic’s Fusion 8, the company’s visual effects and motion graphics software, is now available for download from the Blackmagic Design website. This beta is for the free version of Fusion 8 and is available for both Mac OS X and Windows.

A beta for the paid version, Fusion 8 Studio, which adds stereoscopic 3D tools and is designed for multi-user workgroups and larger studios, will be available shortly. However, current Fusion Studio customers can download the public beta for the free version of Fusion 8 and start using it today.


This public beta is also the first-ever Mac compatible release of Fusion, which was previously a Windows-only product. In addition, projects can be easily moved between Mac and Windows versions of Fusion so customers can work on the platform of their choice.

In the six months since Fusion 8 was launched at NAB, the user interface has seen many improvements and now has a more modern look, with more to come as the Fusion engineering teams continue to work with the visual effects community.

Featuring a node-based interface, Fusion makes it easy to build high-end visual effects compositions very quickly. Nodes are small icons that represent effects, filters and other image processing operations that can be connected together in any order to create unlimited visual effects. Nodes are laid out logically like a flow chart, so customers won’t waste time hunting through nested stacks of confusing layers with filters and effects. With a node-based interface, it’s easy to see and adjust any part of a project in Fusion by clicking on a node.
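
Fusion's engine is far more sophisticated, but the core idea of a node graph is easy to sketch: each node wraps an operation, takes the outputs of its upstream nodes as inputs, and the final image is produced by pulling values through the connections. The Python below is purely illustrative; the class and operations are made up and are not Fusion's API.

# Minimal node-graph sketch (illustrative only, not Fusion's API).
# Each node wraps an image operation; connecting nodes builds a graph
# that is evaluated by pulling results from upstream nodes.
class Node:
    def __init__(self, op, *inputs):
        self.op = op          # callable applied to the upstream results
        self.inputs = inputs  # upstream nodes feeding this one

    def evaluate(self, cache=None):
        cache = {} if cache is None else cache
        if id(self) not in cache:  # evaluate each node only once
            upstream = [n.evaluate(cache) for n in self.inputs]
            cache[id(self)] = self.op(*upstream)
        return cache[id(self)]

# Toy "images" are plain numbers so the example stays self-contained.
loader = Node(lambda: 0.5)                              # stands in for loading a plate
blur = Node(lambda img: img * 0.9, loader)              # stands in for a blur
grade = Node(lambda img: min(img * 1.4, 1.0), blur)     # stands in for a color grade
merge = Node(lambda a, b: (a + b) / 2, grade, loader)   # composites two branches

print(merge.evaluate())  # pulling the graph produces the final result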

With a massive toolset consisting of hundreds of built-in tools, customers can pull keys, track objects, rotoscope, retouch images, animate titles, create amazing particle effects and much more, all in a true 3D workspace. Fusion can also import 3D models, point cloud data, cameras or even entire 3D scenes from Maya, 3ds Max or LightWave and render them seamlessly with other elements. Deep pixel tools can be used to add volumetric fog, lighting and reflection mapping of rendered objects using world position passes, so customers can create amazing atmospheric effects that render in seconds instead of hours.

Fusion has been used on thousands of feature film and television projects, including Thor, Edge of Tomorrow, The Hunger Games trilogy, White House Down, Battlestar Galactica and others.

Killer wasps and an ’80s look for horror flick ‘Stung’

The Rat Pack Films and XYZ Films movie Stung is an homage to the golden age of VHS, featuring the campy look of the 1980s horror genre. It’s proof positive that monster movies still exist in the world of low-budget horror/comedy.

The filmmakers shot the film on an Arri Alexa in 2K with Hawk anamorphic lenses. The anamorphic lenses produced a distinctive intensity that they felt helped with the strong color definition needed to achieve a 1980s look and feel.

Stung focuses on a fancy garden party gone terribly wrong when a colony of killer wasps mutates into seven-foot tall predators.

German-based freelance colorist Peter Hacker custom built — from top to bottom — his own PC-based compositing/grading workstation, equipped with an Nvidia GTX 760, an internal hardware RAID for storage and some SSDs for realtime playback. The calibrated NEC LCD monitors are supported by a Sony OLED screen in order to accurately judge the final color grading.

Peter Hacker

Hacker (pictured above) has a strong background in visual effects and compositing, which is why he was hired to be the VFX producer and compositor, as well as colorist — more on that in a minute — for Stung (see the trailer here). Hacker has several years’ experience in color grading, working on numerous commercials for Mercedes, Audi and Fanta; a few indie features; and many shorts.

In collaboration with director Benni Diez and VFX supervisor Sebastian Nozon, he took over the post-production management of the movie. He was also in charge of preparing all the footage for the VFX shots and handing it over to the remotely working animation, rigging, modeling and compositing artists.

He also developed the movie’s look and was involved in the compositing of more than a hundred shots. However, schedule conflicts with the original color grading team required a new plan, and Hacker took on the color grading and finishing of the film as well.

Hacker’s weapon of choice for his post work on Stung was Assimilate Scratch. “As a student I had worked in Scratch at Filmakademie Baden-Württemberg and really dug into fully learning the system. I found it to be the most straightforward tool suite, and I still feel that way. As a freelancer working on a variety of imagery projects, it has all the realtime functions I need — conform, color grading, versioning and finishing – as well as some bells and whistles like VR capability. And it’s now at a price I can personally afford, which means that I, as well as all indie productions, can set up an at-home studio and have a second license on the laptop for hitting the road.”

“For Stung I created different looks during the first two days of grading because I didn’t have LUTs as a reference. It’s easy to create multiple looks for review in Scratch, and those LUTs are now in my archive for possible future use. Then I created a separate Scratch Construct (timeline) with all the movie’s master shots to ensure the look would work and to allow me to track the changes within the story, which were bound to occur due to changes of the seasons and different weather/lighting conditions within a sequence.”
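
A look LUT like the ones Hacker describes is essentially a 3D table of output colors indexed by input RGB, and applying it means trilinearly interpolating between the eight surrounding lattice points. The Python sketch below illustrates the idea with an identity LUT; it is an illustration only, not how Scratch applies LUTs internally.

import numpy as np

# Minimal sketch of applying a 3D look LUT with trilinear interpolation.
# `lut` is an (N, N, N, 3) table indexed by (r, g, b); this is purely
# illustrative, not Scratch's implementation.
def apply_lut(rgb, lut):
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    frac = pos - lo

    out = np.zeros(3)
    for corner in range(8):                      # the 8 surrounding lattice points
        idx, weight = [], 1.0
        for axis in range(3):
            if corner >> axis & 1:
                idx.append(hi[axis]); weight *= frac[axis]
            else:
                idx.append(lo[axis]); weight *= 1.0 - frac[axis]
        out += weight * lut[idx[0], idx[1], idx[2]]
    return out

# Identity LUT for a sanity check: output should match the input color.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_lut([0.3, 0.6, 0.9], identity))      # approximately [0.3, 0.6, 0.9]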

Working Remotely, and on a Budget
The horror film genre is synonymous with low-budget production, so there was not a lot of wiggle room, which meant they had to get creative with workflows, especially since they were working and reviewing shots remotely.

The finishing team at work.

The movie was shot in Berlin. Dominik Kattwinkel and Benni Diez edited on Avid Media Composer in Cologne. Also working in Cologne were animators Waldemar and Harry Fast. Sebastian Nozon and Sascha Geddert did the compositing, lighting and rendering in Berlin. “I was doing parts of the compositing and finally graded the entire movie in Ludwigsburg,” explains Hacker. “To make all the data transfer possible among those numerous locations we used BTSync, which kept us all in sync without a hassle.” The Foundry’s Nuke and Adobe After Effects were used for compositing and Autodesk 3ds Max and Maya for 3D animation and rendering.

During editing, the number of visual effects shots increased from 150 to 600. “I had 8TB of storage for the Alexa material and some Red footage from the pick-up shoot. There were 1,600 edits in the film that runs for 84 minutes, so that gives you an idea of the project’s heavy workload — and all while being on a tight budget,” explains Hacker. “To ensure the data’s safety we had back-up RAIDs set up at several locations spread over the country. Furthermore, we separated the data being worked on from the back-ups, and scheduled the day’s work to back up during the night.”

“With a couple weeks left until delivery, the rendered shots (JPEGs, in the end replaced by DPXs) were transferred from Berlin and Cologne to me in Ludwigsburg where I dropped them into the Scratch timeline. With peer-to-peer uploaded previews of the film, or just smaller sequences, we all were continually on the same page.”

They used Skype for review conversations. “Two weeks before delivery we all came together in a small office space in Ludwigsburg to finish the compositing. At that time I switched from compositing to color grading for 12 straight days in a darkened tent in the corner of the room. It was a cheerful time with all of us finally sharing the same space and adding some final touches and even bringing some sequences to life for the first time. For viewing pleasure, I brought in my 55-inch Sony TV for a few relaxed reviews, which also sped up the process and helped to keep the budget in line.”


These sessions included director Benni Diez watching back to back with Hacker. “It was very helpful that he could view and judge the color grading in realtime on a separate monitor without the need to watch over my shoulder all the time,” he says. “It was also crucial for all the VFX shots — Diez and Nozon immediately could discuss how they looked with the grading applied. It’s always a big challenge when it comes to CGI content being integrated into live-action back plates. With the different nature of the content, they either fit together even better after the grading is applied, or not. Once in a while we had shots working completely fine in comps, but they got torn apart in the grading. Altogether, it was a magical experience to see all the elements come together right before your eyes, and literally any changes could be made on the fly.”

Hacker says one particularly helpful Scratch feature, which he used a lot on Stung, was the ability to continue working while Scratch was rendering in the background. “That’s a huge timesaver and I wouldn’t like to work without it.”