Category Archives: rendering

Chaos Group and Adobe partner for photorealistic rendering in CC

Chaos Group’s V-Ray rendering technology is featured in Adobe’s Creative Cloud, allowing graphic designers to easily create photorealistic 3D rendered composites with Project Felix.

Available now, Project Felix is a public beta desktop app that helps users composite 3D assets like models, materials and lights with background images, resulting in an editable render they can continue to design in Photoshop CC. For example, users can turn a basic 3D model of a generic bottle into a realistic product shot that is fully lit and placed in a scene to create an ad, concept mock-up or even abstract art.

V-Ray acts as a virtual camera, letting users test angles, perspectives and placement of their model in the scene before generating a final high-res render. Using the preview window, Felix users get immediate visual feedback on how each edit affects the final rendered image.

By integrating V-Ray, Adobe has brought the same raytracing technology used by companies like Industrial Light & Magic to a much wider audience.

“We’re thrilled that Adobe has chosen V-Ray to be the core rendering engine for Project Felix, and to be a part of a new era for 3D in graphic design,” says Peter Mitev, CEO of Chaos Group. “Together we’re bringing the benefits of photoreal rendering, and a new design workflow, to millions of creatives worldwide.”

“Working with the amazing team at Chaos Group meant we could bring the power of the industry’s top rendering engine to our users,” adds Stefano Corazza, senior director of engineering at Adobe. “Our collaboration lets graphic designers design in a more natural flow. Each edit comes to life right before their eyes.”

GPU-accelerated renderer Redshift now in v.2.0, integrates with 3ds Max

Redshift Rendering has updated its GPU-accelerated rendering software to Redshift 2.0. This new version includes new features and pipeline enhancements to the existing Maya and Softimage plug-ins. Redshift 2.0 also introduces integration with Autodesk 3ds Max. Integrations with Side Effects Houdini and Maxon Cinema 4D are currently in development and are expected later in 2016.

New features across all platforms include realistic volumetrics, enhanced subsurface scattering and a new PBR-based Redshift material, all of which deliver improved final render results. Starting July 5, Redshift is offering 20 percent off new Redshift licenses through July 19.

Age of Vultures

A closer look at Redshift 2.0’s new features:

● Volumetrics (OpenVDB) – Render clouds, smoke, fire and other volumetric effects with production-quality results (initial support for OpenVDB volume containers; see the sketch after this list).

● Nested dielectrics – The ability to accurately simulate the intersection of transparent materials with realistic results and no visual artifacts.

● New BRDFs and linear glossiness response – Users can model a wider variety of metallic and reflective surfaces via the latest and greatest in surface shading technologies (GGX and Beckmann/Cook-Torrance BRDFs).

● New SSS models and single scattering – More realistic results with support for improved subsurface scattering models and single-scattering.

● Redshift material – A more intuitive, PBR-based main material, featuring effects such as dispersion/chromatic aberration.

● Multiple dome lights – Users can combine multiple dome lights to create more compelling lighting.

● alSurface support – There is now full support for the Arnold shader without having to port settings.

● Baking – Users can save a lot of rendering time with baking for lighting and AOVs.
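To make the OpenVDB item above concrete: Redshift reads volume data from .vdb containers, and such a file can be authored with OpenVDB's Python bindings. Below is a minimal sketch, assuming the pyopenvdb module and NumPy are installed; the 'density' grid name is a common convention for fog volumes, not a Redshift-confirmed setting.

```python
import numpy as np
import pyopenvdb as vdb  # OpenVDB's Python bindings (an assumption of this sketch)

# Build a simple spherical density falloff on a 64^3 dense grid.
res = 64
ijk = np.indices((res, res, res)).astype(np.float32)
center = (res - 1) / 2.0
dist = np.sqrt(((ijk - center) ** 2).sum(axis=0))
density = np.clip(1.0 - dist / (res / 2.0), 0.0, 1.0).astype(np.float32)

# Copy the dense array into a sparse VDB grid and tag it as a fog volume.
grid = vdb.FloatGrid()
grid.copyFromArray(density)
grid.name = 'density'  # common channel name for fog volumes; renderers vary
grid.gridClass = vdb.GridClass.FOG_VOLUME

# Write a .vdb container that a volumetric renderer can load.
vdb.write('smoke_puff.vdb', grids=[grid])
```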

Users include Blizzard, Jim Henson’s Creature Shop, Glassworks and Blue Zoo.

Main Image: Rendering example from A Large Evil Corporation.


Sony Imageworks helps take ‘Alice Through the Looking Glass’ through time

By Christine Holmes

Sony Imageworks VFX supervisor Jay Redd’s journey with Alice Through the Looking Glass began at the end of 2013, a full two and a half years before the US domestic release. A seasoned veteran in the visual effects world, Redd partnered closely with director James Bobin, Imageworks VFX supervisor Ken Ralston, production designer Dan Hennah and crew to bring this vibrant adventure to life.

Jay Redd

Time itself plays multiple roles in this new chapter of the Alice in Wonderland story. We see Alice return to Wonderland — or “Underland” as it’s referred to in the film — to help her friend the Mad Hatter find out what happened to his family many years ago. She sets out on a solo quest to find the Chronosphere, a small object located in a castle at the center of a giant clock. The sphere that controls time is guarded by a new character called Time, played by Sacha Baron Cohen, a true personification of time. When taken from the clock and activated for time travel, the Chronosphere brings Alice to a new location where all the moments in Underland’s history are displayed within the waves of the Oceans of Time. Disrupting the Chronosphere has consequences, as both Time and time itself begin to break down.

Redd was kind enough to talk to postPerspective about the challenges of representing Time, the character, and the concept of time in this film.

Did it help your initial process to have had a visual language already established from Alice in Wonderland?
I would say it served as a foundation, but part of what was exciting about working with James Bobin on this one was that he wanted to make it feel really different. The time travel element allowed us to go back into Underland before things became sad — before the Jabberwocky attacked the village, before the Red Queen was in power, before all of these things. It allowed us to expand the palette to be much more saturated and vibrant, and I think you can see that compared to Alice in Wonderland.

How did you begin the challenge of representing time in both human form and in an entirely new time travel world?
The character Time is not in any of Lewis Carroll’s books, and in one of the early drafts of the script, time itself is just a concept. Then James Bobin brought in the idea that time is a character: personify time and create an actual character. The idea of the back of his head being clockwork came from James as well. He wanted Time to be part of the clock. There’s a moment in the film where Time says, “I am he and he is me.” The clock and Time are the same thing, so when we see Time open his chest, there’s a miniature version of the clock in there as well.

The idea of the Oceans of Time was just one line in the script: “Alice traveled through the Oceans of Time.” Then Ken Ralston, James, myself and our team had to figure out what that looked like. James was very adamant about wanting to include images. Pardon the pun, but over time, we experimented with a number of different looks and ended up with this ocean setting that surrounded you — the characters and the audience. You would go in and out of the ocean to enter moments and different times in history.

clock before

clock after

With the moments depicted within the ocean waves, it felt like there was a very painterly style employed there. What was the thought process behind that decision?
Those images actually came from original shots from the first movie. We started with footage — either completed shots from the first movie or footage from other scenes in this sequel. We couldn’t complete some of the Oceans of Time shots until we had finished the others, so that became a weird schedule for us. We processed the footage to make the moments feel like they were in the water. We didn’t want the moments to feel like you were at a drive-in theater, as if they were just projected on a wall or surface. We wanted them to feel volumetric — to have volume and feel thick and deep — 100 feet in the air. It’s a really interesting process that all had to start with 2D processing from our compositing department, which required slowing down the footage to make it feel bigger and more present.

To add scale?
Yes, exactly, scale. Sometimes a shot used for these moments would only be one second, but we needed three seconds. So we would slow it down, process it and use our own optical flow processes from our 2D workflow, which would then feed into our water simulations. Then, in 3D simulations, those moving pieces of footage, using the vector data from the optical flow, would actually move the water and the wave spray around. When you see the Jabberwocky swing its head, it’s actually affecting the surface of the water. That’s why it has that painterly, or liquid, feel to it. I’m really impressed with what our team did in 3D. That’s the stuff you really can’t see as clearly in a flat 2D theater. In a 3D theater, it really comes alive. I’m happy you picked up on it because it was a lot of work.
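Imageworks’ tools are proprietary, but the general idea Redd describes, extracting motion vectors from retimed footage and using them to push a water surface around, can be sketched with off-the-shelf pieces. Here is a minimal illustration assuming OpenCV and NumPy; the function names and the divergence-based coupling are hypothetical simplifications, not Imageworks’ actual pipeline.

```python
import cv2
import numpy as np

def footage_flow_to_velocity(prev_frame, next_frame, slow_factor=3.0):
    """Estimate per-pixel motion in the footage and rescale it for
    slowed-down (retimed) playback."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow: an (H, W, 2) array of per-pixel motion vectors.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Slowing footage 3x means each retimed step sees 1/3 of the motion.
    return flow / slow_factor

def displace_water_surface(height_field, flow, gain=0.05):
    """Nudge a 2D height field with the footage's motion vectors so the
    imagery appears to push the water around (hypothetical coupling)."""
    h, w = height_field.shape
    # Use flow divergence as a source term: converging motion piles water up.
    div = (np.gradient(flow[..., 0], axis=1) +
           np.gradient(flow[..., 1], axis=0))
    return height_field - gain * div[:h, :w]
```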

Those are some of my favorite moments, especially in 3D.
Awesome! Mine too. The Oceans of Time is so much bigger in 3D projection. That little Chronosphere you see, that’s the thing that tells you how big everything is. We can play with the size of the Chronosphere. You can make the Chronosphere bigger and everything feels smaller. Or you can make the Chronosphere smaller and animate it in a slightly different way and suddenly the Oceans of Time is huge. There’s a really interesting relationship between the size of the Chronosphere, how fast it’s moving, how fast we’re moving, and the scale of the world. That was something that took weeks to figure out in animation. If you move too quickly through a large environment, it doesn’t feel that huge. That’s something we wanted to play with and, of course, keep the pacing of the chase sequence to keep it exciting. Those are the kind of things that take months to figure out.

What about the creative evolution of the effect used to represent time breaking down in a tangible way in Underland?
We knew Time’s castle had to get completely rusted over, or frozen over. On a trip to the Los Angeles Natural History Museum, Ralston and I came upon obsidian rock with a kind of mineral growth on it. It was a bright orange and red mineral deposit that had started growing across the crystals, and it probably took a million years to happen. We looked at each other, recognizing just how cool that was.

Fast forward a few months: we had met with the effects lead and the rest of our team about covering the entire world with rust. Ken and I were shooting in London and had been doing dailies with our team for a few weeks, and even while we were shooting we were creating this reference imagery. We gave all of it to our team, led by Imageworks senior effects/simulation supervisor Joseph Pepper. A few weeks later he had put together a very rough test.

After the first viewing I said, “At some point we’ll want to get spikes and dust in there, but don’t do that right now, just keep it simple.” Well, Pepper and team went further with the next test. It’s Alice running down a hallway in the castle — granted this is all digital — and there’s this rust aggressively chasing her. Her hair is flopping all over and she looks back, and there is rust shooting up the walls, down the stairs, and through all the arches. A piece of stone breaks off and all this dust is falling and then the rust catches her by the ankle and freezes her in a second and a half, putting her in this physically impossible shape where she’s balancing on her toe reaching for a window. A lot of chaos and excitement! Ken and I looked at that with our jaws dropped. It really showed us what was possible with this idea.

The last week of shooting at Shepperton Studios was the moment we knew rust was going to be something really cool. We made a couple of small changes to the test and showed it to James Bobin. I was looking at his face when he watched it. It kind of sunk and I thought, “Oh crap.” Then he said, “Wow. That’s scary.” We knew we had hit it! There was a kind of nervous chuckle and then he said something like, “Yeah, that’ll scare the kids.”

What was the most challenging shot in this film?
The toughest shot was the real movie version of the one I just described to you, when the rust comes over the clock cog and big gear, grabs Alice by the ankle, then climbs up her body and freezes her face all the way to her fingertips as she’s trying to drop the Chronosphere back into the holder. That was one of the most difficult shots of the film.

In fact, we reshot the live-action element of Alice a year later because we wanted to do a slightly better version of it to expose more of her ankle and her body. What we were doing was blending from a full live-action version of Alice to a full photoreal digital double. That’s the kind of shot that every single department touched, from paint and stabilization in the 2D world to wire rig removals, from full on modeling to all the textures of the costumes.

Then there was the lighting, the castle, all the effects — very detailed timing, art direction and control of the animation of the rust coming around her face, crossing her eyes, nose and arms at just the right moments. There was also subtle work on the cloth, so it took on weight as the rust went over her arm. It’s incredibly dense and detailed digital work. Our team developed a lot of specialized and cool technology for that — a lot of new animation tools, rendering tools and FX tools to make it happen — definitely pushing boundaries for us at Sony Imageworks.

Christine Holmes is a freelance artist and manager of animated content. She has worked in the film industry for the last six years.


AMD offering FireRender plug-in for 3ds Max

AMD, maker of the FirePro line of graphics cards and engines, has released a free software-based rendering plug-in, FireRender for Autodesk 3ds Max, designed for content creators with 4K workflows who are looking for photorealistic rendering. FireRender for Max offers physically accurate raytracing and comes with an extensive material library.

AMD FireRender is built on OpenCL 1.2, which means it can run on any hardware that supports the standard. It also provides a CPU backend, so FireRender can run on GPU, CPU, CPU+GPU, or combinations of multiple CPUs and GPUs. Within FireRender, integrated materials are editable as nodes in the 3ds Max Slate Material Editor. There is also Active Shade viewport integration, which means you can work with FireRender in realtime and see your changes as you make them. Physically correct materials and lighting support sound design decisions via global illumination, including caustics. Emissive and photometric lighting, as well as lights from HDRI environments, let artists blend a scene in with its surroundings.
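Because FireRender sits on standard OpenCL, it is easy to see which devices a given machine would expose to a renderer like it. A quick sketch using the PyOpenCL bindings (an assumption; any OpenCL enumeration tool would do) lists every CPU and GPU in the pool:

```python
import pyopencl as cl

# Enumerate every OpenCL platform and device on this machine -- the same
# device pool a renderer built on OpenCL 1.2 can draw from (CPUs, GPUs,
# or any mix of them).
for platform in cl.get_platforms():
    print(f'Platform: {platform.name} ({platform.version})')
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f'  {kind}: {device.name}, '
              f'{device.max_compute_units} compute units, '
              f'{device.global_mem_size // (1024 ** 2)} MB global memory')
```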

AMD says to keep an eye out for more upcoming free software plug-ins for other content creation software, including Autodesk Maya and Rhino.


In other AMD news, at the NAB show last month, the company introduced the AMD FirePro W9100 32GB workstation graphics card designed for large asset workflows with creative applications. It will be available in Q2 of this year. The FirePro W9100 16GB is currently available.


Thinkbox addresses usage-based licensing

At the beginning of May, Thinkbox Software launched Deadline 8, which introduced on-demand, per-minute licensing as an option for Thinkbox’s Deadline and Krakatoa, The Foundry’s Nuke and Katana, and Chaos Group’s V-Ray. The company also revealed it is offering free on-demand licensing for Deadline, Krakatoa, Nuke, Katana and V-Ray for the month of May.

Chris Bond

Thinkbox founder/CEO Chris Bond explained, “As workflows increasingly incorporate cloud resources, on-demand licensing expands options for studios, making it easy to scale up production, whether temporarily or on a long-term basis. While standard permanent licenses are still the preferred choice for some VFX facilities, the on-demand model is an exciting option for companies that regularly expand and contract based on their project needs.”

Since the announcement, users have been reaching out to Thinkbox with questions about usage-based licensing. We reached out to Bond to help those with questions get a better understanding of what this model means for the creative community.

What is usage-based licensing?
Usage-based licensing is an additional option to permanent and temporary licenses and gives our clients the ability to easily scale up or scale down, without increasing their overhead, on a project-need basis. Instead of one license per render node, you can purchase minutes from the Thinkbox store (as pre-paid bundles of hours) that can be distributed among as many render nodes as you like. And, once you have an account with the Store, purchasing extra time only takes a few minutes and does not require interaction with our sales team.

Can users still purchase perpetual licenses of Deadline?
Yes! We offer both usage-based licensing and perpetual licenses, which can be used separately or together in the cloud or on-premise.

How is Deadline usage tracked?
Usage is tracked per minute. For example, if you have 10,000 hours of usage-based licensing, that can be used on a single node for 10,000 hours, 10,000 nodes for one hour or anything in between. Minutes are only consumed while the Deadline Slave application is rendering, so if it’s sitting idle, minutes won’t be used.
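The accounting model Bond describes is simple enough to verify with a few lines. Here is a minimal sketch of that arithmetic; the helper name is hypothetical, not a Thinkbox API:

```python
def minutes_remaining(prepaid_hours, node_render_minutes):
    """Model the per-minute accounting described above: a pre-paid pool
    shared by any number of nodes, drawn down only while nodes render.
    (Hypothetical helper, not a Thinkbox API.)"""
    pool = prepaid_hours * 60
    consumed = sum(node_render_minutes)  # idle time costs nothing
    return pool - consumed

# 10,000 pre-paid hours spread over 10,000 nodes rendering 1 hour each
# exhausts the pool -- the same total as one node rendering 10,000 hours.
print(minutes_remaining(10_000, [60] * 10_000))  # -> 0
```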

What types of renderfarms are compatible with usage-based licensing?
Usage-based licensing works with both local- and cloud-based renderfarms. It can be used exclusively or alongside existing permanent and temporary licenses. You configure the Deadline Client on each machine for usage-based or standard licensing. Alternatively, Deadline’s Auto-Configuration feature allows you to automatically assign the licensing mode to groups of Slaves, in the case of machines that might be dynamically spawned via our Balancer application. It’s easy to do, but if anyone is confused they can send us an email and we’ll schedule a session to step them through the process.

Can people try it out?
Of course! For the month of May, we’re providing free licensing hours of Deadline, Krakatoa, Nuke, Katana and V-Ray. Free hours can be used for on-premise or cloud-based rendering, and users are responsible for compute resources. Hours are offered on a first-come, first-served basis and any unused time will expire at 12am PDT on June 1.