Blue Sky Studios’ Mikki Rose named SIGGRAPH 2019 conference chair

Mikki Rose has been named conference chair of SIGGRAPH 2019. A fur technical director at Greenwich, Connecticut-based Blue Sky Studios, Rose chaired the Production Sessions at SIGGRAPH 2016 this past July in Anaheim and has been an active SIGGRAPH volunteer for the last 15 years.

Rose has worked on such films as The Peanuts Movie and Hotel Transylvania. She refers to herself as a “CG hairstylist” due to her specialization in fur at Blue Sky Studios, where her work covers everything from hair and cloth to feathers and even vegetation. She studied general CG production in college and holds BS degrees in Computer Science and Digital Animation from Middle Tennessee State University, as well as an MFA in Digital Production Arts from Clemson University. Prior to Blue Sky, she lived in California and held positions with Rhythm & Hues Studios and Sony Pictures Imageworks.

“I have grown to rely on each SIGGRAPH as an opportunity for renewal of inspiration in both my professional and personal creative work. In taking on the role of chair, my goal is to provide an environment for those exact activities to others,” said Rose. “Our industries are changing and developing at an astounding rate. It is my task to incorporate new techniques while continuing to enrich our long-standing traditions.”

SIGGRAPH 2019 will take place in Los Angeles from July 29 to August 2, 2019.


Main Image: SIGGRAPH 2016 — Jim Hagarty Photography

A look at the new AMD Radeon Pro SSG card

By Dariush Derakhshani

My first video card review was on the ATI FireGL 8800 more than 14 years ago. It was one of the first video cards that could support two monitors with only one card, which to me was a revolution. Up until then I had to jam two 3DLabs Oxygen VX1 cards in my system (one AGP and the other PCI) and wrestle them to handle OpenGL with Maya 4.0 running on two screens. It was either that or sit in envy as my friends taunted me with their two screen setups, like waving a cupcake in front of a fat kid (me).

Needless to say, two cards were not ideal, and the 128MB ATI FireGL 8800 was a huge shift in how I built my own systems from then on. Fourteen years later, I’m fatter, balder and have two 27-inch HP screens sitting on my desk (one at 4K) that are always hungry for new video cards. I run multiple applications at once, and I demand to push around a lot of geometry as fast as possible. And now, I’m even rendering a fair amount on the GPU, so my video card is ever more the centerpiece of my home-built rigs.

So when I stopped by AMD’s booth at SIGGRAPH 2016 in Anaheim recently, I was quite interested in what AMD’s John Swinimer had to say about the announcements the company was making at the show. (AMD acquired ATI in 2006.)

First, I’m just going to jump right into what got me the most wide-eyed, and that is the announcement of the AMD Radeon Pro SSG. This professional card mates a 1TB SSD to the frame buffer of the video card, giving you a huge boost in how much the GPU system can load into memory. Keep in mind that professional card frame buffers range from about 4GB in entry-level cards up to 24-32GB in super-high-end cards, so 1TB is a huge number to be sure.

One of the things that slows down GPU rendering the most is having to flush and reload textures from the card’s frame buffer, so the idea of having a 1TB frame buffer is intriguing, to say the least (i.e., a lot of drooling). In its press release, AMD mentions that “8K raw video timeline scrubbing was accelerated from 17 frames per second to a stunning 90+ frames per second” in the first demonstration of the Radeon Pro SSG.

Details are still forthcoming, but two PCIe 3.0 m.2 slots on the SSG card can get us up to 1TB of frame buffer. But the question is, how fast will it be? In traditional SSD drives, m.2 enjoys a large bandwidth advantage over regular SATA drives as long as it can access the PCIe bus directly. Things are different if the SSG card is an island unto itself, with the storage bandwidth contained on the card, so it’s unclear how well the m.2 bus on the SSG card will communicate with the GPU directly. I tend to doubt we’ll see the same bandwidth between GDDR5 memory and an on-board m.2 card, but only real-world testing will be able to suss that out.
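To put that doubt in rough numbers, here is a back-of-envelope comparison using typical figures (these are assumptions, not AMD-published specs for the SSG: roughly 7 Gbps per pin for GDDR5 on a 256-bit bus, and roughly 0.985 GB/s per PCIe 3.0 lane after encoding overhead):

```python
# Back-of-envelope bandwidth comparison (approximate, assumed figures).
# GDDR5: 256-bit bus at ~7 Gbps per pin -> GB/s
gddr5_gb_s = (256 / 8) * 7           # ~224 GB/s
# Two m.2 slots, each PCIe 3.0 x4 at ~0.985 GB/s per lane
ssd_gb_s = 2 * 4 * 0.985             # ~7.9 GB/s
print(f"GDDR5 ~ {gddr5_gb_s:.0f} GB/s, dual m.2 ~ {ssd_gb_s:.1f} GB/s")
print(f"GDDR5 is roughly {gddr5_gb_s / ssd_gb_s:.0f}x faster")
```

Even with generous assumptions, on-board GDDR5 comes out more than an order of magnitude faster, which is why the SSD tier makes the most sense as overflow capacity rather than a frame-buffer replacement.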

But I believe we’ll immediately see great speed improvements in GPU rendering of huge datasets, since the SSG will circumvent the offloading and reloading times between the GPU and CPU memories, as well as potentially boost multi-frame GPU rendering of CG scenes. In cases where the graphics subsystem doesn’t need to load more than a dozen or so GBs of data, though, on-board GDDR5 memory will certainly still have an edge in communication speed with the GPU.

So, needless to say (but I’m going to say it anyway), I am very much looking forward to slapping one of these into my rig to test GPU render times, as well as interactivity with large datasets in Maya and 3ds Max. As long as the Radeon Pro SSG can avoid hitting up the CPU and main system memory, GPU render gains should be quite large on the whole.

Wait, There’s More
On to other AMD announcements at the show: the affordable Radeon Pro WX line-up (due in the fourth quarter of 2016), which refreshes the FirePro-branded line. The Radeon Pro WX cards are based on AMD’s RX consumer cards (like the RX 480), but with higher-level professional driver support and certification with professional apps. The end-goal of professional work is stability as well as performance, and AMD promises a great dedicated support system around its Radeon Pro line to give us professionals the warm and fuzzies we always need over consumer-level cards.

The top-of-the-line Radeon Pro WX7100 features 8GB of 256-bit memory and workstation-class performance at less than $1,000, and I believe it replaces the FirePro W8100. This puts the four-simultaneous-display-capable WX7100 in line to compete with the Nvidia Quadro M4000 in pricing at least, if not in specs as well. But it’s hard to say where the WX7100 will sit in performance. I do hope it’s somewhere in between the Quadro M4000 and the $1,800 M5000. It’s difficult to judge from paper specs, as the number of (OpenCL) compute units and the number of CUDA cores are hard to compare.

The 8GB Radeon Pro WX5100 and 4GB WX4100 round out the new announcements from SIGGRAPH 2016, putting them in line to compete somewhere between the 8GB Quadro M4000 and the 4GB M2000 and K1200 cards in performance. It seems, though, that AMD’s top-of-the-line card will still be the $3,400+ FirePro W9100 with 16GB of memory, though a 32GB version is also available.

I have always thought AMD offered a really good price-to-performance ratio, and it seems like the Radeon Pro WX line will continue that tradition. I look forward to benchmarking these cards in real-world CG use.

Dariush Derakhshani is a professor and VFX supervisor in the Los Angeles area and author of Maya and 3ds Max books and videos. He is bald and has flat feet.

Pixar open sources Universal Scene Description for CG workflows

Pixar Animation Studios has released Universal Scene Description (USD) as an open source technology in order to help drive innovation in the industry. Used for the interchange of 3D graphics data through various digital content creation tools, USD provides a scalable solution for the complex workflows of CG film and game studios. 

With this initial release, Pixar is opening up its development process and providing code used internally at the studio.

“USD synthesizes years of engineering aimed at integrating collaborative production workflows that demand a constantly growing number of software packages,” says Guido Quaroni, VP of software research and development at Pixar.

USD provides a toolset for reading, writing, editing and rapidly previewing 3D scene data. With many of its features geared toward performance and large-scale collaboration among many artists, USD is ideal for the complexities of the modern pipeline. One such feature is Hydra, a high-performance preview renderer capable of interactively displaying large data sets.
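For a sense of what USD scene description actually looks like, here is a minimal hand-authored layer in USD’s human-readable .usda format, modeled on Pixar’s introductory examples (the prim names are invented for illustration):

```usda
#usda 1.0
(
    defaultPrim = "Hello"
)

def Xform "Hello"
{
    def Sphere "World"
    {
        double radius = 2
    }
}
```

Layers like this can be referenced and overridden by other layers, which is the composition mechanism behind the collaborative, multi-artist workflows described above.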

“With USD, Hydra, and OpenSubdiv, we’re sharing core technologies that can be used in filmmaking tools across the industry,” says George ElKoura, supervising lead software engineer at Pixar. “Our focus in developing these libraries is to provide high-quality, high-performance software that can be used reliably in demanding production scenarios.”

Along with USD and Hydra, the distribution ships with USD plug-ins for some common DCCs, such as Autodesk’s Maya and The Foundry’s Katana.

To prepare for open-sourcing its code, Pixar gathered feedback from various studios and vendors who conducted early testing. Studios such as MPC, Double Negative, ILM and Animal Logic were among those who provided valuable feedback in preparation for this release.

Today: AMD/Radeon event at SIGGRAPH introducing Capsaicin graphics tech

At the SIGGRAPH 2016 show, AMD will webcast a live showcase of new creative graphics solutions during their “Capsaicin” event for content creators. Taking place today at 6:30pm PDT, it’s hosted by Radeon Technologies Group’s SVP and chief architect Raja Koduri.

The Capsaicin event at SIGGRAPH will showcase advancements in rendering and interactive experiences. The event will feature:
▪ Guest speakers sharing updates on new technologies, tools and workflows.
▪ The latest in virtual reality with demonstrations and technology announcements.
▪ Next-gen graphics products and technologies for both content creation and consumption, powered by the Polaris architecture.

A realtime video webcast of the event will be accessible from the AMD channel on YouTube, where a replay will be available a few hours after the live event concludes and will remain accessible for one year.

For more info on the Capsaicin event and live feed, click here.

Vicon at SIGGRAPH with two new motion tracking cameras

Vicon, which makes precision motion tracking systems and match-moving software, will be at SIGGRAPH this year showing its two new camera families, Vero and Vue. The new offerings join Vicon’s flagship camera, Vantage.

Vero is a range of high-def, synchronized optical video cameras for providing realtime video footage and 3D overlay in motion capture. Designed as an economical system for many types of applications, the Vero range includes a custom 6-12 mm variable focus lens that delivers an optimized field of view, as well as 2.2 megapixel resolution at 330Hz.

With these features, users can capture fast sport movements and multiple actors, drones or robots with low latency. The range also includes a 1.3 megapixel camera. Vero is compatible with existing Vicon T-series, Bonita and Vantage cameras as well as Vicon’s Control app, which allows users to calibrate the system and make adjustments on the fly.

With HD resolution and variable focal lengths, the Vicon Vue camera incorporates a sharp video image into the motion capture volume. It also enables seamless calibration between optical and video volumes, ensuring the optical and video views are aligned to capture fine details.

GPU-accelerated renderer Redshift now in v.2.0, integrates with 3ds Max

Redshift Rendering has updated its GPU-accelerated rendering software to Redshift 2.0. This new version includes new features and pipeline enhancements to the existing Maya and Softimage plug-ins. Redshift 2.0 also introduces integration with Autodesk 3ds Max. Integrations with Side Effects Houdini and Maxon Cinema 4D are currently in development and are expected later in 2016.

New features across all platforms include realistic volumetrics, enhanced subsurface scattering and a new PBR-based Redshift material, all of which deliver improved final render results. Starting July 5, Redshift is offering 20 percent off new Redshift licenses through July 19.

Age of Vultures

A closer look at Redshift 2.0’s new features:

● Volumetrics (OpenVDB) – Render clouds, smoke, fire and other volumetric effects with production-quality results (initial support for OpenVDB volume containers).

● Nested dielectrics – The ability to accurately simulate the intersection of transparent materials with realistic results and no visual artifacts.

● New BRDFs and linear glossiness response – Users can model a wider variety of metallic and reflective surfaces via the latest and greatest in surface shading technologies (GGX and Beckmann/Cook-Torrance BRDFs).

● New SSS models and single scattering – More realistic results with support for improved subsurface scattering models and single-scattering.

● Redshift material – A more intuitive, PBR-based main material, featuring effects such as dispersion/chromatic aberration.

● Multiple dome lights – Users can combine multiple dome lights to create more compelling lighting.

● alSurface support – There is now full support for the Arnold shader without having to port settings.

● Baking – Users can save a lot of rendering time with baking for lighting and AOVs.
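For readers curious what a GGX shading model actually computes, here is a minimal sketch of the GGX (Trowbridge-Reitz) normal distribution function, the microfacet term at the heart of that BRDF (a standalone illustration of the published formula, not Redshift’s implementation):

```python
import math

def ggx_ndf(cos_theta_h, alpha):
    """GGX / Trowbridge-Reitz microfacet normal distribution.

    cos_theta_h: cosine of the angle between the surface normal and the half-vector
    alpha: roughness parameter (often the artist roughness squared)
    """
    a2 = alpha * alpha
    denom = cos_theta_h * cos_theta_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# At alpha = 1 (fully rough) the distribution is uniform: 1/pi everywhere.
print(ggx_ndf(0.5, 1.0))  # ~0.3183
```

Lower alpha values concentrate the distribution around the surface normal, producing the tight highlights of polished metals; higher values spread it out for rougher surfaces.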

Users include Blizzard, Jim Henson’s Creature Shop, Glassworks and Blue Zoo.

Main Image: Rendering example from A Large Evil Corporation.

SIGGRAPH hosting character design contest

If you are a character designer and thinking about attending the SIGGRAPH conference on July 24-28 in Anaheim, this is your lucky week. The Spirit of SIGGRAPH contest winner will receive complimentary full conference registration for SIGGRAPH 2016. Travel and incidental expenses are not included. The winning design will also be featured in promotion of the conference through social media, and the designer will be credited.

Designs must be submitted by midnight on April 15. The winner will be announced on May 2. The contest is seeking character designs that “embody the spirit of SIGGRAPH and should be original creations.” Submissions will be judged on any of the following criteria:
– Creativity
– Design
– Relevance to SIGGRAPH
– Suitability to 2D graphic design, 3D animatable design and 3D printable solid design
– Ability to be turned into a wearable costume
– Suitability for use in a variety of on-site promotions

The winner will be chosen by this year’s SIGGRAPH event chair, Mona Kasra, next year’s event chair, Jerome Solomon, and a board of judges comprised of 2016 program chairs and experts across the animation and design industries.

Performance-capture companies Animatrik, DI4D collaborate

Motion- and facial-capture companies Animatrik Film Design and Dimensional Imaging (DI4D) have launched a new collaboration based on their respective mocap expertise. The alliance will deliver facial performance-capture services to the VFX and video game communities across North America.

Animatrik technology has been used on such high-profile projects as Image Engine’s Chappie, Microsoft’s Gears of War series and Duncan Jones’ upcoming Warcraft. DI4D’s technology has appeared in such shows as the BBC’s Merlin and video games like Left 4 Dead 2 and Quantum Break. The new collaboration will allow both companies to bring even more true-to-life animation to similar projects in the coming months.

Animatrik has licensed DI4D’s facial performance-capture software and purchased DI4D systems, which it will operate from its Vancouver, British Columbia, and Toronto motion-capture studios. Animatrik will also offer an “on-location” DI4D facial performance-capture service, which has been used before on projects such as Microsoft’s Halo 4.

Check out our video with Animatrik’s president, Brett Ineson, at SIGGRAPH.

IKinema at SIGGRAPH with tech preview of natural language interface

IKinema, a provider of realtime animation software for motion capture, games and virtual reality using inverse kinematics, has launched a new natural language interface designed to enable users to produce animation using descriptive commands based on everyday language. The technology, code-named Intimate, is currently in prototype as part of a two-year project with backing by the UK government’s Innovate UK program.

The new interface supplements virtual reality technology such as Magic Leap and Microsoft HoloLens, offering new methods for creating animation that are suitable for professionals but also simple enough for a mass audience. The user can bring in a character and then animate the character from an extensive library of cloud animation, simply by describing what the character is supposed to do.

Intimate is targeted to many applications including pre-production, games, virtual production, virtual and augmented reality and more. The technology is expected to become commercially available in 2016 and the aim is to make an SDK available to any animation package. Currently, the company has a working prototype and has engaged with top studios for the purpose of technology validation and development.
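IKinema’s tools are built on inverse kinematics: working backward from a desired end-effector position to the joint angles that reach it. As a toy illustration of the underlying idea (a minimal sketch using the standard analytic two-bone solve, not IKinema’s actual solver), consider a 2D arm with two bone lengths:

```python
import math

def two_bone_ik(tx, ty, l1, l2):
    """Analytic two-bone IK: joint angles placing the end effector at (tx, ty).

    Returns (shoulder, elbow) angles in radians; targets beyond reach
    are clamped to the fully extended pose.
    """
    d = min(math.hypot(tx, ty), l1 + l2)  # clamp to reachable distance
    # Law of cosines gives the elbow bend
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Aim at the target, then correct for the elbow's offset
    shoulder = math.atan2(ty, tx) - math.atan2(l2 * math.sin(elbow),
                                               l1 + l2 * math.cos(elbow))
    return shoulder, elbow

s, e = two_bone_ik(1.0, 1.0, 1.0, 1.0)
# Forward kinematics should land the end effector back on the target
x = math.cos(s) + math.cos(s + e)
y = math.sin(s) + math.sin(s + e)
```

Production solvers generalize this to full skeletons with many constraints solved simultaneously, which is what makes driving them from high-level commands (or natural language) feasible.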

Nvidia takes on VR with DesignWorks VR at SIGGRAPH

At SIGGRAPH in LA, Nvidia introduced DesignWorks VR, a set of APIs, libraries and features that enable both VR headset and application developers to deliver immersive VR experiences. DesignWorks VR includes components that enable VR environments like head-mounted displays (HMDs), immersive VR spaces such as CAVEs and other immersive displays, and cluster solutions. DesignWorks VR builds on Nvidia’s existing GameWorks VR SDK for game developers, with improved support for OpenGL and features for professional VR applications.

Ford VR

At its SIGGRAPH 2015 booth, Nvidia featured a VR demonstration by the Ford Motor Company in which automotive designers and engineers were able to simulate the interiors and exteriors of vehicles in development within an ultra high-definition virtual reality space. By using new tools within DesignWorks VR, Ford and Autodesk realized substantial performance improvements to make the demo smooth and interactive.

In addition, Nvidia highlighted an immersive Lord of the Rings VR experience created by Weta Digital and Epic Games and powered by the Nvidia Quadro M6000. At Nvidia’s “Best of GTC Theater,” companies such as Audi and VideoStitch spoke about their work with VR in design.