
‘Suicide Squad’: Imageworks VFX supervisor Mark Breakspear 

By Randi Altman

In Warner Bros.’ Suicide Squad, a band of captured super-villains is released from prison by the government and tasked with working together to defeat a powerful common enemy. This film, which held top box office honors for weeks, has a bit of everything: comic book antiheroes, super powers, epic battles and redemption. It also features a ton of visual effects work that was supervised by Sony Imageworks’ Mark Breakspear, who worked closely with production VFX supervisor Jerome Chen and director David Ayer (see our interview with Ayer).

Mark Breakspear


Breakspear is an industry veteran with more than 20 years of experience as a visual effects supervisor and artist, working on feature films, television and commercials. His credits include American Sniper, The Giver, Ender’s Game, Thor: The Dark World, The Great Gatsby… and that’s just to name a few.

Suicide Squad features approximately 1,200 shots, with Imageworks doing about 300, including the key fight at the end of the film between Enchantress, the Squad, Incubus and Mega Diablo. Imageworks also provided shots for several other sequences throughout the movie.

MPC worked on the majority of the other visual effects, with The Third Floor creating postviz after the shoot to help with the cutting of the film.

I recently threw some questions at Breakspear about his process and work on Suicide Squad.

How early did you get involved in the project?
Jerome Chen, the production VFX supervisor, involved us from the very beginning, in the spring of 2015. We read the script and started designing one of the most challenging characters — Incubus. We spent a couple of months working with designer Tim Borgmann to finesse the details of his overall look, shape and, specifically, his skin and sub-surface qualities.


How did Imageworks prepare for taking on the film?

We spent time gathering as much information as we could about the work we were looking to do. That involved lengthy calls with Jerome to pick over every aspect of the designs that David Ayer wanted. As it was still pretty early, there was a lot more “something like” than “exactly like” when it came to the ideas. But this is what the prepro was for, and we were able to really focus on narrowing the many ideas down into key selections and give the crew something to work with during the shoot in Toronto.

Can you talk about being on set?
The main shoot was at Pinewood in Toronto. We had several soundstages that were used for the creation of the various sets. Shoot days are usually long and arduous, and this was no exception. For VFX crews, the days are typically hectic, quiet, hectic, very hectic, quiet and then suddenly very hectic again. After wrap, you still have to download all the data, organize it and prep everything for the next day.

I had fantastic help on set from Chris Hebert, who was our on-set photographer. His job was to make sure we had accurate records (photographic and data sets) of anything that could be used in our work later on. That meant actors, props, witness cameras, texture photography and any specific one-off moments that occur 300 times a day. Every movie set needs a Chris Hebert or it’s going to be a huge struggle later on in post!



Ok, let’s dig into the workflow. Can you walk us through it?

Workflow is a huge subject, so I’ll keep the answer somewhat concise! The general day would begin with a team meeting between all the various VFX departments here at Imageworks. The work was split across teams in both Culver City and Vancouver, so we did regular video Hangouts to discuss the daily plan, the weekly targets and generally where we were at, plus specific needs that anyone had. We would usually follow this with department meetings prior to AM dailies, where I would review the latest work from the department leads, give notes, select things to show Jerome and David, and pass along feedback that I may have received from production.

We tried our best to keep our afternoons meeting-free so actual work could get done! Toward the end of the day we would have more dailies, and the final day’s selection of notes and pulls for the client would take place. Most days ended fairly late, as we had to round off the hundreds of emails with meaningful replies, prep for the next day and catch any late submission arrivals from the artists that might benefit from notes before the morning.

What tool, or tools, did you use for remote collaboration?
We used Google Hangouts for video conferencing, and Itview for shot discussion and notes with Jerome and David. Itview is our own software that replaces the need to use [off-the-shelf tools], and allows a much faster, more secure and accurate way to discuss and share shots. Jerome had a system in post and we would place data on it remotely for him to view and comment on in realtime with us during client calls. The notes and drawings he made would go straight into our note tracker and then on to artists as required.

What was the most challenging shot or shots, and why?

Our most challenging work was in understanding and implementing fractals into the design of the characters and their weapons. We had to get up to speed on three-dimensional mandelbulbs and how we could render them into our body of work. We also had to create vortical flow simulations that came off the fractal weapons, which created their own set of challenges due to the nature of how particles uniquely behave when near high-velocity emissions.

So there wasn’t a specific shot that was more challenging than another, but the work that went into most of them required a very challenging pre-design and concept solve involving fractal physics to make them work.
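
Imageworks hasn’t published its fractal pipeline, but the three-dimensional mandelbulbs Breakspear mentions are typically rendered by sphere tracing a distance estimator. Purely as an illustration of the idea (the function and parameter names below are ours, not the production’s), a minimal Python sketch of the standard power-8 estimator and its ray march looks like this:

```python
import math

def mandelbulb_de(px, py, pz, power=8, max_iter=12, bailout=2.0):
    """Estimated distance from point (px, py, pz) to a power-n mandelbulb surface."""
    x, y, z = px, py, pz
    dr = 1.0                                   # running derivative for the distance formula
    r = math.sqrt(px * px + py * py + pz * pz)
    if r == 0.0:
        return 0.0                             # the origin sits inside the set
    for _ in range(max_iter):
        r = math.sqrt(x * x + y * y + z * z)
        if r > bailout or r < 1e-12:
            break
        theta = math.acos(z / r)               # convert to spherical coordinates
        phi = math.atan2(y, x)
        dr = power * r ** (power - 1) * dr + 1.0
        zr = r ** power                        # the "z = z^power + c" step, in spherical form
        x = zr * math.sin(theta * power) * math.cos(phi * power) + px
        y = zr * math.sin(theta * power) * math.sin(phi * power) + py
        z = zr * math.cos(theta * power) + pz
    return 0.5 * math.log(r) * r / dr          # small or negative values mean "inside"

def march(ox, oy, oz, dx, dy, dz, steps=128, eps=1e-4):
    """Sphere tracing: step each camera ray forward by the estimated distance."""
    t = 0.0
    for _ in range(steps):
        d = mandelbulb_de(ox + dx * t, oy + dy * t, oz + dz * t)
        if d < eps:
            return t                           # hit: distance along the ray
        t += d
    return None                                # miss
```

Every pixel’s ray repeats this march, so even a single still means millions of estimator evaluations, which is part of why the fractal look needed so much up-front design work.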

Can you talk about tools — off-the-shelf or proprietary — you used for the VFX? Any rendering in the cloud?
We used Side Effects Houdini and Autodesk Maya for the majority of shots and The Foundry’s Nuke to comp everything. When it came to rendering we used Arnold, and in regards to cloud rendering, we did render remotely to our own cloud, which is about 1,000 miles away — does that count (smiles)?

VFX Supervisor Volker Engel: ‘Independence Day,’ technology and more

Uncharted Territory’s Volker Engel is one of Hollywood’s leading VFX supervisors, working on movies as diverse as White House Down, Hugo and Roland Emmerich’s Shakespeare movie Anonymous. Most recently he was in charge of the huge number of effects for Emmerich’s Independence Day: Resurgence.

Engel was kind enough to make time in his schedule to discuss his 28-year history with Emmerich, his favorite scenes from Independence Day, his experience with augmented reality on set and more.

When did you get involved with Independence Day?
I was probably the earliest person involved after Roland Emmerich himself! He kept me posted over the years while we were working on other projects because we were always going to do this movie.

I think it was 2009 when the first negotiations with 20th Century Fox started, but the important part was early 2014. Roland had to convince the studio regarding the visuals of the project. Everyone was happy with the screenplay, but they said it would be great to get some key images. I hired a company called Trixter — they are based in Germany, but also have an office in LA. They have a very strong art department. In about six weeks we finished 16 images that are what you can call “concept art,” but they are extremely detailed. Most of these concepts can be seen as finished shots in the movie. This artwork was presented to 20th Century Fox and the movie was greenlit.

Concept art via Trixter.

You have worked with Emmerich many times. You must have developed a sort of shorthand?
This is now a 28-year working relationship. Obviously, we haven’t done every movie as a team but I think this is our eighth movie together. There is a shorthand and that helps a lot. I don’t think we really know what the actual shorthand is other than things that we don’t need to talk about because we know what needs to happen.

Technology continues to advance. Does that make life easier, or because you have more options does it make it even more complex?
It’s less a matter of having more options; it’s that the audience is so much more sophisticated. We now have better tools available to make better pictures. We can do things now that we were not able to do before. So, for example, now we can imagine a mothership that’s 3,000 miles in diameter and actually lands on Earth. There is a reason we had a smaller mothership in the first movie and that it didn’t touch down anywhere on the planet.

The mothership touching down in DC.

So it changes the way you tell stories in a really fundamental way?
Absolutely. If you look at a movie like Ex Machina, for example, you can show a half-human/half-robot and make it incredibly visually convincing. So all of a sudden you can tell a story that you wouldn’t have been able to tell before.

If you look at the original Independence Day movie, you really only see glimpses of the aliens because we had to do it with practical effects and men in suits. For Independence Day: Resurgence we had the chance to go much further. What I like actually is that Roland decided not to make it too gratuitous, but at least we were able to fully show the aliens.

Reports vary, but they suggest about 1,700 effects shots in Independence Day: Resurgence. Is that correct?
It was 1,748. Close to two-thirds of the movie!

What was your previs process like?
We had two different teams: one joined us from Method Studios and the other was our own Uncharted Territory team, and we split the task in half. The Method artists were working in our facility, so we were all under one roof.

Method focused on the whole lunar sequence, for example, while our in-house team started with the queen/bus chase toward the end of the movie. Roland loves to work with two specific storyboard artists and has several sessions during the week with them, and we used this as a foundation for the previs.

Trixter concept art.

So Roland was involved at the previs stage looking at how it was all going to fit together?
He had an office where the previs team was working, so we could get him over and go literally from artist to artist. We usually did these sessions twice a day.

What tools were you using?
Our in-house artists are Autodesk 3D Studio Max specialists, and the good folks from Method worked with Autodesk Maya.

The live shoot used camera-tracking technology from Ncam to marry the previs graphics and the live action in realtime to give a precise impression of how the final married shot would work.

How were you using the Ncam exactly?
The advantage is that we took the assets we had already built for previs and then re-used them inside the Ncam set-up, doing this with Autodesk Motion Builder. But some of the animation had to be done right there on set.

After: Area 51

I’ll give you an example. When we’re inside the hangar at Area 51, Roland wanted to pan off an actor’s face to reveal 20 jet fighters lifting off and flying into the distance. The Ncam team and Marion [Spates, the on-set digital effects supervisor] had to do the animation for the fighters right there, on the spot. In about five minutes they came up with something, and what’s more, it worked. That’s why Roland also loves to work with Ncam, because it gives him the flexibility to make some decisions right there in the moment.

So you’re actually updating or even creating shots on set?
Yes, exactly. We have the toolbox there — the assets like the interior of the hangar — but then we do it right there to the picture. Sometimes for both the A-camera and the B-camera.

We did a lot of extensions and augmentations on this movie and what really helped was our experience of working with Ncam on White House Down. For Roland, as the director, it helps him compose his images instead of just looking at a gigantic bluescreen. That’s really what it is, and he’s really good at that.

The Ncam at use on set.

I explain it this way: imagine you already have your first composite right there, which goes straight to editorial. They immediately have something to work with. We just deliver two video files: the clean one with the bluescreen and another from Ncam that has the composite.

Did using Ncam add to the shooting time?
Working with AR on set always adds some shooting time, and it’s really important that the director is briefed and wants to use this tool. The Ncam prep often runs parallel to the rehearsals with the actors, but sometimes it adds two or three additional minutes. When you have someone who’s not prepared for it, two or three minutes can feel like a lifetime. It does, however, save a lot of time in post.

On White House Down, when we used Ncam for the first time, it actually took a little over a week until everything grooved and everyone was aware of it — especially the camera department. After a little while they just knew this is exactly what needed to be done. It all became instant teamwork. It is something that supports the picture and it’s not a hindrance. It’s something that the director really wants.

Do you have a favorite scene from Resurgence?
There is a sequence inside the mothership where our actors are climbing up one of these gigantic columns. We had a small set piece being built for the actors to climb, and it was really important for Roland to compose the whole image. He could ask for a landing platform to be removed and more columns to be added to create a sense of depth, then move the view around another 50 or 60 degrees.

He was creating his images right there, and that’s why the guys have to be really quick on their feet and build these things in and make it work. At the same time, the assistant director is there saying the cameras are ready, the actors are ready and we’re ready to shoot, and of course no one wants them to wait around, so they better have their stuff ready!


The destruction of Singapore.

Some of my other favorite sequences from the film are the destruction of Singapore while the mothership enters the atmosphere and the alien queen chasing the school bus!

What is next for you?
In 1999, when I started Uncharted Territory with my business partner Marc Weigert, we set it up as a production company and started developing our own projects. We joke that Roland interrupts us from developing our projects because he comes with projects of his own that we just cannot say no to! But we have just come back from a trip to Ireland where we scouted two studios and met with several potential production partners for a new project of our own. Stay tuned!


Talking with new Shade VFX NY executive producer John Parenteau

By Randi Altman

John Parenteau, who has a long history working in visual effects, has been named executive producer of Shade VFX’s New York studio. Shade VFX, which opened in Los Angeles in 2009, provides feature and television visual effects, as well as design, stereoscopic, VR and previs services. In 2014, the company opened its New York office to take advantage of the state’s fairly aggressive tax incentives and all that they bring to the city.

“As a native New Yorker, with over a decade of working as an artist there, the decision to open an office back home was an easy one,” explains owner Bryan Godwin. “With John coming on board as our New York executive producer, I feel our team is complete and poised to grow — continuing to provide feature-film-level visuals. John’s deep experience running large facilities, working with top-tier tent-pole clients and access to even more potential talent convinced me that he is the right choice to helm the production efforts out east.”

Shade’s New York office is already flush with work, including Rock that Body for Sony, The OA and The Get Down for Netflix, Mosaic for HBO and Civil for TNT. Not long ago, the shop finished work on Daredevil and Jessica Jones, two of Marvel’s Netflix collaborations. As John helps grow the client list in NYC, he will be supporting NY visual effects supervisor Karl Coyner, and working directly with Shade’s LA-based EP/VP of production Lisa Maher.

John has a long history in visual effects, starting at Amblin Entertainment in the early ‘90s all the way through to his recent work with supercomputer company Silverdraft, which provides solutions for VFX, VR and more. I’ve known him for many years. In fact, I first started spelling John Parenteau’s name wrong when he was co-owner and VFX supervisor at Digital Muse back in the mid to late ‘90s — kidding, I totally know how to spell it… now.

We kept in touch over the years. His passion and love for filmmaking and visual effects has always been at the forefront of our conversations, along with his interest in writing. John even wrote some NAB blogs for me when he was managing director of Pixomondo (they won the VFX Oscar for Hugo during that time) and I was editor-in-chief of Post Magazine. We worked together again when he was managing director of Silverdraft.

“I’ve always been the kind of guy who likes a challenge, and who likes to push into new areas of entertainment,” says John. “But leaving visual effects was less an issue of needing a change and more of a chance to stretch my experience into new fields. After Pixomondo, Silverdraft was a great opportunity to delve into the technology behind VFX and to help develop some unique computer systems for visual effects artists.”

Making the decision to leave the industry a couple years ago to take care of his mother was difficult, but John knew it was the right thing to do. “While moving to Oregon led me away from Hollywood, I never really left the industry; it gets under your skin, and I think it’s impossible to truly get out, even if you wanted to.”

Parenteau realized quickly that the Portland scene wasn’t a hot-bed of film and television VFX, so he took the opportunity to apply his experience in entertainment to a new market, founding marketing boutique Bigfoot Robot. “I discovered a strong need for marketing for small- to mid-sized companies, including shooting and editing content for commercials and marketing videos. But I did keep my hand in media and entertainment thanks to one of my first clients, the industry website postPerspective. Randi and I had known each other for so many years, and our new relationship helped her out technically while allowing me to stay in touch with the industry.”

John’s mom passed over a year ago, and while he was enjoying his work at Bigfoot Robot, he realized how much he missed working in visual effects. “Shade VFX had always been a company I was aware of, and one that I knew did great work,” he says. “In returning to the industry, I was trying to avoid landing in too safe of a spot and doing something I’d already done before. That’s when Bryan Godwin and Dave Van Dyke (owner and president of Shade, respectively) contacted me about their New York office. I saw a great opportunity to help build an already successful company into something even more powerful. Bryan, Lisa and Dave have become known for producing solid work in both feature and television, and they were looking for a missing component in New York to help them grow. I felt like I could fill that role and work with a company that was fun and exciting. There’s also something romantic about living in Manhattan, I have to admit.”

And it’s not just about building Shade for John. “I’m the kind of guy who likes to become part of a community. I hope I can contribute in various ways to the success of visual effects for not only Shade but for the New York visual effects community as a whole.”

While I’ll personally miss working with John on a day-to-day basis, I’m happy for him and for Shade. They are getting a very talented artist, who also happens to be a really nice guy.


Blending Ursa Mini and Red footage for Aston Martin spec spot

By Daniel Restuccio

When producer/director Jacob Steagall set out to make a spec commercial for Aston Martin, he chose to lens it on the Blackmagic Ursa Mini 4.6k and the Red Scarlet. He says the camera combo worked so seamlessly he dares anyone to tell which shots are Blackmagic and which are Red.

L-R Blackmagic’s Moritz Fortmann and Shawn Carlson with Jacob Steagall and Scott Stevens.

“I had the idea of filming a spec commercial to generate new business,” says Steagall. He convinced the high-end car maker to lend him an Aston Martin 2016 V12 Vanquish for a weekend. “The intent was to make a nice product that could be on their website and also be a good-looking piece on the demo reel for my production company.”

Steagall immediately pulled together his production team, which consisted of co-director Jonathan Swecker and cinematographers Scott Stevens and Adam Pacheco. “The team and I collaborated on the vision for the spot, which was to be quick, clean and to the point, but we would also accentuate the luxury and sexiness of the car.”

“We had access to the new Blackmagic Ursa Mini 4.6k and an older Red Scarlet with the MX chip,” says Stevens. “I was really interested in seeing how both cameras performed.”

He set up the Ursa Mini to shoot ProRes HQ at Ultra HD (3840×2160) and the Scarlet at 8:1 compression at 4K (4096×2160). He used both Canon still camera primes and a 24-105mm zoom, switching them from camera to camera depending on the shot. “For some wide shots we set them up side by side,” explains Stevens. “We also would have one camera shooting the back of the car and the other camera shooting a close-up on the side.”

In addition to his shooting duties, Stevens also edited the spot, using Adobe Premiere, and exported the XML into Blackmagic’s DaVinci Resolve Studio 12. Stevens notes that, in addition to loving cinematography, he’s also “really into” color correction. “Jacob (Steagall) and I liked the way the Red footage looked straight out of the camera in the RedGamma4 color space. I matched the Blackmagic footage to the Red footage to get a basic look.”

Blackmagic colorist Moritz Fortmann took Stevens’ base color correction and finessed the grade even more. “The first step was to talk to Jacob and Scott and find out what they were envisioning, what feel and look they were going for. They had already established a look, so we saved a few stills as reference images to work off. The spot was shot on two different types of cameras, and in different formats. Step two was to analyze the characteristics of each camera and establish a color correction to match the two. Step three was to tweak and refine the look. We did what I would describe as a simple color grade, only relying on primaries, without using any Power Windows or keys.”

If you’re planning to shoot mixed footage, Fortmann suggests you use cameras with similar characteristics, matching resolution, dynamic range and format. “Shooting RAW and/or Log provides for the highest dynamic range,” he says. “The more ‘room’ a colorist has to make adjustments, the easier it will be to match mixed footage. When color correcting, the key is to make mixed footage look consistent. One camera may perform well in low light while another one does not. You’ll need to find that sweet spot that works for all of your footage, not just one camera.”
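
Fortmann’s actual Resolve grade isn’t shown here, but the primaries-only balancing he describes can be roughed in statistically: line up each channel’s offset and contrast between the two cameras before any creative grading happens. A minimal NumPy sketch, assuming both clips have already been decoded to float RGB (the function and variable names are ours, not Fortmann’s):

```python
import numpy as np

def match_primaries(src, ref):
    """Roughly match `src` footage to `ref` footage using only primary-style
    per-channel offset and gain -- no Power Windows or keys.
    src, ref: float arrays in [0, 1], shape (height, width, 3), ideally log-encoded."""
    matched = np.empty_like(src)
    for c in range(3):                          # R, G, B
        s_mean, s_std = src[..., c].mean(), src[..., c].std()
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        gain = r_std / max(s_std, 1e-6)         # contrast match (gain-style trim)
        matched[..., c] = (src[..., c] - s_mean) * gain + r_mean  # offset match (lift-style trim)
    return np.clip(matched, 0.0, 1.0)
```

A match like this only gets the two cameras into the same ballpark; the creative look and any per-shot tweaks still happen on top of it, which mirrors the step-two/step-three split Fortmann outlines.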

Daniel Restuccio is a writer and chair of the multimedia department at California Lutheran University.


Sony at NAB with new 4K OLED monitor and 4K, 8x ultra-HFR camera

At last year’s NAB, Sony introduced its first 4K OLED reference monitor for critical viewing — the BVM-X300. This year, Sony added a new monitor, the PVM-X550, a 55-inch OLED panel with 12-bit signal processing, perfect for client viewing. The Trimaster EL PVM-X550 supports HDR through various Electro-Optical Transfer Functions (EOTF), such as S-Log3, SMPTE ST.2084 and Hybrid Log-Gamma, covering applications for both cinematography and broadcast. The PVM-X550 is a quad-view OLED monitor, which allows customized individual display settings across four distinct views in HD. It is equipped with the same signal-processing engine as the BVM-X300, providing a 12-bit output signal for picture accuracy and consistency. It also supports industry-standard color spaces, including the wider ITU-R BT.2020 for Ultra High Definition.

HFR Camera
At NAB 2016, Sony displayed its newest camera system, the HDC-4800, which combines 4K resolution with enhanced high-frame-rate capabilities, capturing up to 8x at 4K and 16x in full HD. “This camera system can do a lot of everything — very high frame rate, very high resolution,” said Rob Willox, marketing manager for content creation, Sony Electronics.

The HDC-4800 uses a new Super 35mm 4K CMOS sensor, supporting a wide color space (both BT.2020 and BT.709), and provides an industry-standard PL lens mount, giving the system the capability of using the highest quality cinematic lenses for clear and crisp high-resolution images.

The new sensor brings the system into the cinematic family of Red and Alexa, making it well suited as a competitor to today’s modern, high-end cinematic digital solutions.

An added feature of the HDC-4800 is that it’s specifically designed to integrate with Sony’s companion system, the HDC-4300, a 2/3-inch image sensor 4K/HD camera. Using matching colorimetry and deep toolset camera adjustments, and with the ability to take advantage of existing build-up kits, remote control panels and master setup units, the two cameras can blend seamlessly.

Archive
Sony also showed the second generation of its Optical Disc Archive System, which adopts new high-capacity optical media rated for a 100-year shelf life, with double the transfer rate and double the capacity of a single cartridge, now 3.3TB. The Generation 2 Optical Disc Archive System also adds an eight-channel optical drive unit, doubling the read/write speeds of the previous generation and helping to meet the data needs of realtime 4K production.


Making our dialogue-free indie feature ‘Driftwood’

By Paul Taylor and Alex Megaro

Driftwood is a dialogue-free feature film that focuses on a woman and her captor in an isolated cabin. We chose to shoot entirely MOS… because we are insane. Or perhaps we were insane to shoot a dialogue-free feature in the first place, but our choice to remove sound recording from the set was both freeing and nerve wracking due to the potential post production nightmare that lay ahead.

Our decision was based on how, without speech to carry along the narrative, every sound would need to be enhanced to fill in the isolated world of our characters. We wanted draconian control over the soundscape, from every footstep to every door creak, but we also knew the sheer volume of work involved would put off all but the bravest post studios.

The film was shot in a week with a cast of three and a crew of three in a small cabin in Upstate New York. Our camera of choice was a Canon 5D Mark II with an array of Canon L-series lenses. We chose the 5D because we already owned it — so more bang for our buck — and also because it gave us a high-quality image, even with such a small body. Its ease of use allowed us to set up very quickly, which was important considering our extremely truncated shooting schedule. Having no sound team on set allowed us to move around freely, without the concerns of planes passing overhead or cars rumbling in the distance delaying a shot.

The Audio Post
The editing was a wonderfully liberating experience in which we cut purely to image, never once needing to worry about speech continuity or a host of other factors that often come into play with dialogue-driven films. Driftwood was edited on Apple’s Final Cut Pro X, a program that can sometimes be a bit difficult for audio editing, but for this film it was a non-issue. The Magnetic Timeline was actually quite perfect for the way we constructed this film and made the entire process smooth and simple.

Once picture was locked, we brought the project to New York City’s Silver Sound Studios, who jumped at the chance to design the atmosphere for an entire feature from the ground up. We sat with the engineers at Silver Sound and went through Driftwood shot by shot, creating a master list of all the sounds we thought necessary to include. Some were obvious, such as footsteps, breathing and ticking clocks; others less so, such as the humming of an old refrigerator or the creaking of a wooden chair.

Once the initial list was set, we discussed whether to use stock audio or rerecord everything at the original location. Again, because we wanted complete control to create something wholly unique, we concluded it was important to return to the cabin and capture its particular character. Over the course of a few days, the Silver Sound gang rerecorded nearly every sound in the film, leaving only some basic Foley work to complete in their studio.

Once their library was complete, one of the last steps before mixing was to ADR all of the breathing. We had the actors come into the studio over a one-week period during which they breathed, moaned and sighed inside Silver Sound’s recording booth. These subtle sounds are taken for granted in most films, but for Driftwood they were of the utmost importance. The way the actors would sigh or breathe could change the meaning behind that sound and change the subtext of the scene. If the characters cannot talk, then their expressions must be conveyed in other ways, and in this case we chose a more physiological track.

By the time we completed the film we had spent over a year recording and mixing the audio. The finished product is a world unto itself, a testament to the laborious yet incredibly exciting work performed by Silver Sound.

Driftwood was written, directed and photographed by Paul Taylor. It was produced and edited by Alex Megaro.


Raytracing today and in the future

By Jon Peddie

More papers, patents and PhDs have been written and awarded on ray tracing than on any other computer graphics technique.

Ray tracing is a subset of the rendering market. The rendering market, in turn, is a subset of software for larger markets, including media and entertainment (M&E), architecture, engineering and construction (AEC), computer-aided design (CAD), scientific, entertainment content creation and simulation-visualization. Not all users who have rendering capabilities in their products use them. At the same time, there are products developed solely as rendering tools, and there are products that include 3D modeling, animation and rendering capabilities; these may be used primarily for rendering, primarily for modeling or primarily for animation.

Because ray tracing is so important, and at the same time computationally burdensome, individuals and organizations have spent years and millions of dollars trying to speed things up. A typical ray-traced scene on an old-fashioned HD screen can tax a CPU so heavily that the image can only be updated maybe every second or two — certainly not the 33ms needed for realtime rendering.

GPUs can’t help much because ray tracing has no frame-to-frame memory; every frame is a new frame, so the computational load can’t be amortized. Also, the branching that occurs in ray tracing defeats the power of a GPU’s SIMD architecture.
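
To make the branching point concrete, here is a toy single-sphere, single-bounce shader (pure illustration, not any production renderer). Every conditional depends on the individual ray’s data, so neighboring pixels in a SIMD wavefront routinely want to take different paths:

```python
import math

def shade_pixel(origin, direction, center, radius, light_dir):
    """Trace one primary ray against one sphere and Lambert-shade the hit.
    `direction` and `light_dir` are assumed normalized."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c                     # a = |direction|^2 = 1
    if disc < 0.0:
        return 0.0                             # miss: this ray is done early
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0.0:
        return 0.0                             # sphere is behind the camera
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - cc) / radius for h, cc in zip(hit, center)]
    ndotl = sum(n * l for n, l in zip(normal, light_dir))
    if ndotl <= 0.0:
        return 0.0                             # facing away: no shadow ray needed at all
    # a fuller tracer would now spawn a secondary (shadow or bounce) ray from `hit`;
    # whether that extra work exists at all is decided per pixel, per frame
    return ndotl
```

An HD frame is roughly two million pixels; multiply by several rays per pixel and 30 frames per second, and the once-every-second-or-two update rate quoted above is no surprise on a CPU.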

Material Libraries Critical
Prior to 2015, all ray tracing engines came with their own materials libraries. Cataloging the characteristics of every type of material in the world is beyond any one company’s resources to develop and support, and the lack of standards has held back cooperative development in the industry. However, a few companies have agreed to work together and share their libraries.

I believe we will see an opening up of libraries and the ability of various ray tracing engines to avail themselves of a much larger library of materials. Nvidia is developing a standard-like capability it calls the Material Definition Language (MDL) and is using it to allow various libraries to work with a wide range of ray tracing engines.

Rendering Becomes a Function of Price
In the near future, I expect to see 3D rendering become a capability offered as an online service. While it’s not altogether clear how this will affect the market, I think it will boost the use of ray tracing and shift the cost to an as-needed basis. It also offers the promise of being able to apply huge quantities of processing power, limited only by the amount of money the user is willing to pay. Ray tracing will resolve to a simple tradeoff: the time to render a scene against the cost of the compute behind it.

That will continue to bring down the time to generate a ray traced frame for an animation for example, but it probably won’t get us to realtime ray tracing at 4K or beyond.

Shortcuts and Semiconductors
Work continues on finding clever ways to short-circuit the computational load by using intelligent algorithms that look at the scene and determine in advance which objects will be seen and which surfaces need to be considered.

Hybrid techniques, where only certain portions of a scene are ray traced, are being improved and evolved. Objects in the distance, for example, don’t need to be ray traced, and neither do flat, dull-colored objects.

Chaos Group says the use of variance-based adaptive sampling on this model of Christmas cookies from Autodesk 3ds Max provided a better final image in record time. (Source: Chaos Group)
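
As a sketch of the hybrid idea described above, the per-object decision might look, in spirit, like this hypothetical heuristic (the thresholds are invented for illustration, not taken from any shipping renderer):

```python
import math

def should_ray_trace(obj_pos, glossiness, camera_pos,
                     max_distance=50.0, gloss_threshold=0.3):
    """Toy per-object test for a hybrid renderer: ray trace only objects that are
    close enough and shiny enough for reflections/refractions to read on screen;
    rasterize everything else."""
    if math.dist(obj_pos, camera_pos) > max_distance:
        return False        # distant object: a rasterized result looks the same
    if glossiness < gloss_threshold:
        return False        # flat, dull material: ray tracing buys almost nothing
    return True
```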

Semiconductors are being developed specifically to accelerate ray tracing. Imagination Technologies, the company that designs the GPU in Apple’s iPhone and iPad, has a specific ray tracing engine that, when combined with the advanced techniques just described, can render an HD scene with partial ray-traced elements several times a second. Siliconarts is a startup in Korea that has developed a ray tracing accelerator, and I have seen demonstrations of it generating images at 30fps. And Nvidia is working on ways to make a standard GPU more ray-tracing friendly.

All these ideas and developments will come together in the very near future and we will begin to realize realtime ray tracing.

Market Size
It is impossible to know how many users there are of ray tracing programs because the major 3D modeling and CAD programs, both commercial and free (e.g., Autodesk, Blender, etc.) have built-in ray tracing engines, as well as the ability to use pluggable add-on software programs for ray tracing.

The potentially available market vs. the totally available market (TAM).

Also, not all users make use of ray tracing on a regular basis — some use it every day, others only occasionally or once per project. Furthermore, some users will use multiple ray tracing programs in a project, depending upon their materials library, user interface, specific functional requirements or pipeline functionality.

Free vs. Commercial
A great deal of the raytracing software available on the market is the result of university projects. Some of the developers of such programs have formed companies, others have chosen to stay in academia or work as independent programmers.

The number of new suppliers has not slowed down, indicating continued demand for ray tracing.

The non-commercial developers continue to offer their ray tracing rendering software as open source and for free — and continue to support it, either individually or as part of a group.

Raytracing Engine Suppliers
The market for ray tracing is entering into a new phase. This is partially due to improved and readily available low-cost processors (thank you, Moore’s law), but more importantly it is because of the demand and need for accurate virtual prototyping and improved workflows.

Rendering in the cloud using GPUs (Source: OneRender).

As with any market, there is a 20/80 rule, where 20 percent of the suppliers represent 80 percent of the market. The ray tracing market may be even more unbalanced. There would appear to be too many suppliers in the market despite failures and merger and acquisition activities. At the same time many competing suppliers have been able to successfully coexist by offering features customized for their most important customers.

Conclusion
Ray tracing is to manufacturing what a storyboard is to film — the ability to visualize the product before it’s built. Movies couldn’t be made today with the quality they have without ray tracing. Think of how good the characters in Cars looked — that imagery made it possible for you to suspend disbelief and get into the story. It used to be: “Ray tracing — Who needs it?” Today it’s: “Ray tracing? Who doesn’t use it?”

Our Main Image: An example of different materials being applied to the same object (Source: Nvidia)

Dr. Jon Peddie is president of Jon Peddie Research, which just completed an in-depth study of the ray tracing market. He is the former president of the Siggraph Pioneers and serves on the advisory boards of several companies. In 2015, he was given a Lifetime Achievement Award from the CAAD society. His most recent book is “The History of Visual Magic in Computers.”


Quick Chat: Ian Stynes on mixing two Sundance films

By Kristine Pregot

A few years back, I had the pleasure of working with talented sound mixer Ian Stynes on a TV sketch comedy. It’s always nice working with someone you have collaborated with before. There is a comfort level and unspoken language that is hard to achieve any other way. This year we collaborated once again for So Yong Kim’s 2016 film Lovesong, which made its premiere at this year’s Sundance and had its grade at New York’s Nice Shoes via colorist Sal Malfitano.

Ian has been busy. In fact, another film he mixed recently had its premiere at Sundance as well — Other People, from director Chris Kelly.

Ian Stynes

Ian Stynes

Since we were both at the festival, I thought what better time to ask him how he approached mixing these two very different films.

Congrats on your two films at Sundance, Lovesong (which is our main image) and Other People. How did the screenings go?
Both screenings were great; it’s a different experience to see the movie in front of an excited audience. After working on a film for a few months, it’s easy to slip into only watching it from a technical standpoint — wondering if a certain section is loud enough, or if a particular sound effect works — but seeing it with an engaged crowd (especially as a world premiere at a place like Sundance) is like seeing it with fresh eyes again. You can’t help but get caught up.

What was the process like to work with each director for the film?
I’ve been lucky enough to work with some wonderful directors, and these movies were no exception. Chris Kelly, the director of Other People, who is a writer on a bunch of TV shows including SNL and Broad City, is so down to earth and funny. The movie was based on the true story of his mother, who died from cancer, so he was emotionally attached to the film in a unique way. He was very focused on what he wanted but also knew when to sit back and let me do my thing. This was Chris’s first movie, but you wouldn’t know it.

For Lovesong, I worked with director So Yong Kim once again. She makes all her films with her husband, Bradley Rust Gray. They switch off directorial duties but are both extremely involved in each other’s movies. This is my third time working on a film with the two of them — the other two were For Ellen with Paul Dano and Jon Heder, and The Exploding Girl with Zoe Kazan. So is an amazing director to work with; it feels like a real collaboration mixing with her. She is creative and extremely focused with her vision, but always inclusive and kind to everyone involved in the crew.

With both films a lot of work was done ahead of time. I try and get it to a very presentable place before the directors come in. This way we can focus on the creative tasks together. One of the fun parts of my job is that I get to sit in a room for a good while and work closely with creative and fun people on something that is very meaningful to them. It’s usually a bit of a bonding experience by the end of it.

How long did each film take you to mix?
I am also extremely lucky to work with some great people at Great City Post. I was the mixer, supervising sound editor and sound designer on both films, but I have an amazing team of people working with me.

Matt Schoenfeld did a huge amount of sound designing on both movies, as well as some of the mixing on Lovesong. Jay Culliton was the dialogue editor on Other People. Renne Bautista recorded Foley and dealt with various sound editing tasks. Shaun Brennan was the Foley artist, and additional editing was done by Daniel Heffernan and Houston Snyder. We are a small team but very efficient. We spent about eight to 10 weeks on each film.

Lovesong

How is it different to mix comedy than it is to mix a drama?
When you add sound to a film, it’s important to think about how it is helping the story — how it augments or moves the story along. The first level of post sound work involves cleaning and removing anything that might take the viewer out of the world of the story (hearing mics, audio distortion, changes in tone, etc.).

Beyond that, different films need different things. Narrative features usually call for the sound to give energy to a film but not get in the way. Of course, there are always specific moments where the sound needs to stand out and take center stage. Most people usually aren’t aware of what post sound specifically entails, but they certainly notice when it is missing or a bad sound job was done. Dramas usually have more intensity to the story, and comedies can be a bit lighter. This often informs the sound design, edit and mix. That said, every movie is still different.

What is your favorite sound design on a film of all time?
I love Ben Burtt, who did all the Star Wars movies. He also did Wall-E, which is such a great sound design movie. The first 40 or so minutes have no direct dialogue — all the audio is sound design. You might not realize it, but it is very effective. On the DVD extras, Ben Burtt did a doc about the sound for that movie. The documentary ends up being about the history of sound design itself. It’s so inspiring, even for non-sound people. Here is the link.

I urge anyone reading this to watch it. I guarantee it will get you thinking about sound for film in a way you never have before.

Kristine Pregot is a senior producer at New York City-based Nice Shoes.



Encore colorist Laura Jans Fazio goes dark with ‘Mr. Robot’

By Randi Altman

After watching Mr. Robot when it premiered on USA Network last year, I changed all of my computer passwords and added a degree of difficulty that I’m proud of. I’m also not 100 percent convinced that my laptop’s camera isn’t on even when there’s no green light. That’s right, I completely and gleefully bought into the paranoia, and I wasn’t alone. Mr. Robot won Best Television Series Drama at this year’s Golden Globes, and one of the show’s supporting actors, Christian Slater, took home a statue.

The show, about a genius New York-based computer hacker (Rami Malek) who believes corporations control, well, everything, has been getting its color grade from Laura Jans Fazio, lead colorist at Deluxe’s Encore, since its second episode.

Laura Jans Fazio

If you watch any TV at all, you’ve very likely seen some of Jans Fazio’s work. Her resume lists House of Cards, Hawaii 5-0, Proof, Empire and The Lottery, and she’s currently gearing up to work on the updated Gilmore Girls and Lady Dynamite.

Jans Fazio was kind enough to take some time out from grading this upcoming season of House of Cards to chat about her work on Mr. Robot.

Were you on Mr. Robot from the very start?
Sam Esmail, the show’s creator, asked me to help out with one of the first scenes in the pilot — the one that took place in Ron’s Coffee Shop. We made some changes, Sam loved it and wanted me to hit the whole show, so I did!

What kind of direction were you given about the look of that scene?
For Ron’s Coffee Shop, the direction was, “just do your thing.” So I was fortunate enough to do my own thing on it, and make it what I felt it should be.

What about when you started the season?
That’s part of what coloring has been — at least in my career — trying to interpret what the client, or the creator, is saying to me, because everybody has a different way of describing things, whether they’re technically savvy or not. I have to take that description and interpret it, and apply that to the image through my tool set on the computer.

That’s the process for this show, like many others I’ve worked on… I’ve been lucky enough to be entrusted to just do what I think feels right, and then I wait for notes. And more often than not, my notes are pretty minimal.

So minimal notes on Mr. Robot?
It was either “go darker” or “let’s change this room in its entirety — I want it to be colder, and I’m not feeling the emotion of the scene.” In other instances, I’ll take a scene that’s lit completely warm and I’ll go cool with it because I think it looks better. Then I’ll send it out and be happily pleased that it’s liked.

(Photo: David Giesbrecht/USA Network)

Can you describe a scene and give me an example?
The All Safe office, where Elliot worked, actually stayed similar to the pilot. The only difference was I took a lot of magenta out of it. So it had the feeling of a cold, sterile, distant corporate environment with a “working for the man” kind of feel. It’s not dark. It’s airy and lofty, but not airy in a good way. It basically allows the talent to come through — to see the emotion of what the characters are going through, and what they’re talking about. The rest just seems to melt behind them.

How do you build on what the DP Tod Campbell captures on set?
This is the way I approach all images — I take what I’ve got to work with, play with different styles of contrast, densities and color tones, and let the image take me where it wants to be. How it feels in the story, what it’s cut against and where it’s going.

Usually I’ll tap into it straight away, but it’s always that way on the first episode or two of a new show, because you don’t really know where it needs to be. It’s kind of like the first color of paint that you put on a canvas that has been prepped — that’s not always the color that’s going to come through. It’s going to start out one way, and evolve as you go.

Sometimes colorists talk about being given stills or told to emulate the look of a certain film. It’s pretty amazing that they’re just saying, “Go.”
But that’s not always the case. There are many times where people come in with a photography coffee table book, and say, “I want this, this or that.” Or they will reference a movie from 1972 or say, “Let’s make it look like this Japanese film shot in 1942,” and I reference those clips.

That’s a common practice. In this situation I was approached based on my work on House of Cards and entrusted with Mr. Robot.

Mr. Robot - Season 1     

How do you prefer to work? Or do you enjoy both?
I enjoy both. It’s always good to get feedback, and I need an idea of what it is. When I saw the pilot for Mr. Robot, I kind of knew automatically what I would do with it.

Is there anything that stuck out from the season that you are most proud of?
The fact that the show is super dark. Dark is good. People are hesitant to do dark because they need to see what’s going on, but I look at it this way: if you’re in a dark forest and see an opening of light, that’s when you want to see more. And going dark was well received, both by the audience and my peers. That was cool.

Your tool of choice is FilmLight Baselight. Why do you like this particular system?
It just makes sense, from the way it allows you to layer colors and grade inside/outside, thereby eliminating keystrokes. It allows me to be really fast, and it deals with different color spaces and gammas. Also, the development always seems to be on the cutting edge of the latest technology coming from the camera manufacturers. They are also great about keeping up with where our business is going, including paying attention to different color spaces, HDR and VR.

Mr. Robot - Pilot

Where do you find your inspiration?
It’s everywhere. I notice everything. I notice what somebody is wearing, what the colors are, where the contrasts lie and how the light is hitting them. I notice the paint sheens in a room and where the light is falling onto objects and creating depth. I get lost online viewing design and color palettes and architecture and photography and gardens. The list goes on.

Growing up in New York, I was walking all the time and was just immersed in visual stimulation — from people, buildings, objects, architecture, art and design. I look to all of the man-made things, but I also look to nature, landscapes and skies… the color contrasts of it all.

What’s next for you, and how many shows do you work on at the same time?
Sometimes I’m on multiple shows within a week, and that overlaps. Right now, I’m doing Hawaii 5-0, House of Cards and Lady Dynamite. House of Cards will end soon, but Hawaii 5-0 will still be going on. Gilmore Girls will start up. Lady Dynamite will still be going, and then Robot will start. Then who knows what else is going to come in between those times.

That’s a lot.
The more the merrier!

The Molecule: VFX for ‘The Affair’ and so much more

By Randi Altman

Luke DiTommaso, co-founder of New York City’s The Molecule, recalls “humble” beginnings when he thinks about the visual effects, motion graphics and VR studio’s launch as a small compositing shop. When The Molecule opened in 2005, New York’s production landscape was quite a bit different than the tax-incentive-driven hotbed that exists today.

“Rescue Me was our big break,” explains DiTommaso. “That show was the very beginning of this wave of production that started happening in New York. Then we got Damages and Royal Pains, but were still just starting to get our feet wet with real productions.”

The Molecule partners (L-R) Andrew Bly, Chris Healer and Luke DiTommaso.

Then, thanks to a healthy boost from New York’s production and post tax incentives, things exploded, and The Molecule was at the right place at the right time. They had an established infrastructure, talent and experience providing VFX for television series.

Since then DiTommaso and his partners Chris Healer and Andrew Bly have seen the company grow considerably, doing everything from shooting and editing to creating VFX and animation, all under one roof. With 35 full-time employees spread between their New York and LA offices — oh, yeah, they opened an office in LA! — they also average 30 freelance artists a day, but can seat 65 if needed.

While some of these artists work on commercials, many are called on to create visual effects for an impressive list of shows, including Netflix’s Unbreakable Kimmy Schmidt, House of Cards and Bloodline, Showtime’s The Affair, HBO’s Ballers (pictured below), FX’s The Americans, CBS’ Elementary and Limitless, VH1’s The Breaks, Hulu’s The Path (for NBC and starring Aaron Paul) and the final season of USA’s Royal Pains. Also completed are the miniseries Madoff and Behind the Magic, a special on Snow White, for ABC.

Ballers: before and after.

The Molecule’s reach goes beyond the small screen. In addition to having completed a few shots for Zoolander 2 and a big one involving a digital crowd for Barbershop 3, at the time of this interview the studio was gearing up for Jodie Foster’s Money Monster; they will be supplying titles, the trailer and a ton of visual effects.

There is so much for us to cover, but just not enough time, so for this article we are going to dig into The Molecule’s bread and butter: visual effects for TV series. In particular, the work they provided for Showtime’s The Affair, which had its season finale just a few weeks ago.

The Affair
Viewers of The Affair, a story of love, divorce and despair, might be surprised to know that each episode averages between 50 to 70 visual effects shots. The Molecule has provided shots that range from simple clean-ups to greenscreen driving and window shots — “We’ll shoot the plates and then composite a view of midtown Manhattan or Montauk Highway outside the car window scene,” says DiTommaso — to set extensions, location changes and digital fire and rain.

One big shot for this past season was burning down a cabin during a hurricane. “They had a burn stage so they could capture an amount of practical fire on a stage, but we enhanced that, adding more fire to increase the feeling of peril. The scene then cuts to a wide shot showing the location, which is meant to be on the beach in Montauk during a raging hurricane. We went out to the beach and shot the house day for night — we had flicker lighting on the location so the dunes and surrounding grass got a sort of flickering light effect. Later on, we shot the stage from a similar angle and inserted the burning stage footage into the exterior wide location footage, and then added a hurricane on top of all of that. That was a fun challenge.”

During that same hurricane, the lead character Noah gets his car stuck in the mud but they weren’t able to get the tires to spin practically, so The Molecule got the call. “The tires are spinning in liquid so it’s supposed to kick up a bunch of mud and water and stuff while rain is coming down on top of it, so we had our CG department create that in the computer.”

Another scene that featured a good amount of VFX took place on the patio outside of the fictitious Lobster Roll restaurant. “It was shot in Montauk in October, and it wasn’t supposed to be cold in the scene, but it was about 30 degrees at 2:00am and Alison is in a dress. They just couldn’t shoot it there because it was too cold. We shot plates, basically, of the location, without actors. Later we recreated that patio area, lined up the lighting and the angle, and basically took the stage footage and inserted it into the location footage. We were able to provide a solution so they could tell the story without having the actors’ breath showing and their noses all red and shivering.”

The Lobster Roll patio: before and after.

Being on Set
While on-set VFX supervision is incredibly important, DiTommaso would argue “by the time you’re on set you’re managing decisions that have already been set into motion earlier in the process. The most important decisions are made on the tech scouts and in the production/VFX meetings.”

He offers up an example: “I was on a tech scout yesterday. They have a scene where a woman is supposed to walk onto a frozen lake and the ice starts to crack. They were going to build an elaborate catwalk into the water. I was like, ‘Whoa, aren’t we basically replacing the whole ground with ice? Then why does she need to be over water? Why don’t we find a lake that has a flat grassy area leading up to it?’ Now they’re building a much simpler catwalk — imagine an eight-foot-wide little platform. She’ll walk out on that with some blue screens, and then we’ll extend the ice and dress the rest of the location with snow.”

According to DiTommaso, being there at the start saved a huge amount of time, money and effort. “By the time you’re on set, they would have already built it into the water and all that stuff.”

But, he says, being on set for the shoot is also very important because you never know what might happen. “A problem will arise and the whole crew kind of turns and looks at you like, ‘You can fix this, right?’ Then we have to say, ‘Yeah. We’re going to shoot this plate. We’re going to get a clean plate, get the actors out, then put them back in.’ Whatever it is; you have to improvise sometimes. Hopefully that’s a rare instance and that varies from crew to crew. Some crews are very meticulous and others are more freewheeling.”

Tools
The Molecule is shooting more and more of their own plates these days, so they recently invested in a Ricoh Theta S camera for shooting 360-degree HDRs. “It has some limitations, but it’s perfect for CG HDRs,” explains DiTommaso. “It gives you a full 360-degree dome, instantly, and it’s tiny like a cell phone or a remote. We also have a Blackmagic 4K Cinema camera that we’ll shoot plates with. There are pros and cons to it, but I like the latitude and the simplicity of it. We use it for a quick run and gun to grab an element. If we need a blood spurt, we’ll set that up in the conference room and we’ll shoot a plate.”

The Molecule added Jon Hamm’s head to this scene for Unbreakable Kimmy Schmidt.

They call on a Canon 7D for stills. “We have a little VFX kit with little LED tracking points and charts that we bring with us on set. Then back at the shop we’re using Nuke to composite. Our CG department has been doing more and more stuff. We just submitted an airplane — a lot of vehicles, trains, planes and automobiles are created in Maya.”

They use Side Effects Houdini for simulations, like fire and rain; for rendering they call on Arnold, and crowds are created in Massive.

What’s Next?
Not ones to be sitting on the sidelines, The Molecule recently provided post on a few VR projects, but their interest doesn’t end there. Chris Healer is currently developing a single lens VR camera rig that DiTommaso describes as essentially “VR in a box.”