
Nvidia GPU Technology Conference 2015: Part I

By Fred Ruckel

Recently, I had the pleasure of attending the Nvidia GPU Technology Conference 2015 in San Jose, California, a.k.a. Silicon Valley. This was not a conference for the faint of heart; it was an in-depth look at where the development of GPU technology is heading and what strides it has made over the last year. In short, it was the biggest geek fest I have ever known, and I mean that as a compliment. The cast of The Big Bang Theory would have fit right in.

While some look at “geek” as having a negative connotation, in the world of technology geeks rule the universe and change the way you live, so hug a geek if you know one. That said, for those who aren’t fully aware of what GPU means, it stands for Graphics Processing Unit, and you likely use it every single day if you work in this industry.

Nvidia had long been known primarily as a gaming acceleration company, but make no mistake about it, they have evolved into a super power in the world of graphics. A few years ago, the power of the GPU was unlocked and since then it has been a constant race for speed. While I was officially out there to cover the media and entertainment sector, I wanted to dive into a bit of everything. That way I’d be fully versed on what this growth market means from a post-production perspective — no pun intended. Over the course of the week, I attended 13 breakout sessions, three keynotes and a mini “Shark Tank” called the Emerging Companies Summit, where the winner was given $100,000 to help grow their idea.

Fred Ruckel

My biggest takeaway from this conference was this: Nvidia wants to empower its developers with a solid core of education in order to help raise the bar in technology. On more than one occasion, this has resulted in a developer using their newfound skills to create a next-generation product. Let's face it, we all work our butts off every day, cranking out project after project for our clients, and we seldom have the time to better our skill set. Nvidia took it a step further at GTC 2015; they created specialized sessions where one could go in and learn more efficient ways of doing things from the masters. Add to this a few thousand of the smartest people on the planet, and it turns into a team-building event. I wouldn't be surprised if the world benefits from this conference alone — new inventions coming to life, companies being formed and maybe the next billion-dollar idea being hatched! Yeah, it was that kind of cool.

If you are a developer at a technology company (or even a budding entrepreneur), I would seriously consider attending this conference in the future. If you are wondering whether it is too technical for you, then it most likely is! It's super technical, and even I, a proud geek, felt lost at some points during the week.

Now that you have some background, I am going to jump in and give a chronological breakdown of what I learned from my week at GTC.

It all began with an outstanding keynote that got the audience all fired up. The entire front wall of the room was a giant screen, over 150 feet long, with a neural network animation showcasing revved-up, electrified synapses (much like the crowd). Nvidia's CEO Jen-Hsun Huang took the stage to a roaring, full-capacity crowd of over 3,800 people. Calm, cool and collected in his black leather jacket, Huang captivated the crowd by revealing just how exponential Nvidia's growth has been since the introduction of CUDA technology seven years ago. With 10x growth and over three million downloads, GPU technology has taken firm root as "the choice method" for all graphics processing needs. From a personal standpoint, every computer at my company, RuckSackNY, is GPU accelerated with CUDA, and it does make a big difference.

The crowd at the keynote

Perhaps you’re wondering if you have this technology on hand but don’t know it? Most likely you do. Both the MacBook Pro and iMac have an Nvidia GPU card built in. The trick is, in some cases, you have to unlock its potential to fully take advantage of it. It’s not a full-blown GPU, but it is one of the reasons you love your little Mac computer.

Deep Learning
Nvidia targeted four key points during this conference. The biggest was "Deep Learning." This single catchphrase dominated the entire conference. Right now, you are probably asking, "What the heck do they mean by Deep Learning?" I know I wasn't sure and started asking around. The answer I got was that it means "machine learning." Yeah, that didn't help me either, so I dug further. In this scenario, a computer can learn a process via input data, analyze it, understand it and repeat it. The more input, the more it learns. The more it learns, the more accurate it becomes. That's about as basic as I can put it. It works in a similar way to your brain: feed your brain and your brain shall grow. It's really just another way to say "artificial intelligence," but that term tends to evoke fear of a computer takeover. (A side note: The first known example of Deep Learning occurred in 1998, when edge detection in images became a reality.)
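To make the "feed it data and it learns" loop above concrete, here is a minimal, illustrative sketch: a single artificial neuron that learns to behave like an AND gate by nudging its weights each time it gets an answer wrong. All names here are my own; real deep learning stacks thousands of such units into layers and trains them on a GPU, but the feedback loop is the same idea.

```python
def train(samples, epochs=50, lr=0.1):
    """Train one artificial neuron on (x1, x2, target) examples."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for x1, x2, target in samples:
            out = 1.0 if (w1 * x1 + w2 * x2 + b) > 0 else 0.0
            err = target - out
            # Each mistake nudges the weights; more data, more accuracy.
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

def predict(params, x1, x2):
    w1, w2, b = params
    return 1.0 if (w1 * x1 + w2 * x2 + b) > 0 else 0.0

# Teach it the AND function from four labeled examples.
params = train([(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)])
```

After training, the neuron has "understood" the pattern well enough to repeat it on its own: `predict(params, 1, 1)` returns `1.0` while the other input pairs return `0.0`.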

To help developers better understand and program Deep Learning, Nvidia announced a self-contained unit called the Digits DevBox. It's a preconfigured, purpose-built system for $15K. This will definitely help companies jump in and learn, and at a great price point.

Toward the end of the keynote, Elon Musk, CEO of Tesla, took the stage. He sat down to discuss the future of autonomous cars with Jen-Hsun. While this was a cool topic to learn about, it had nothing to do with why I was there, so I'll wait until the end of this series of pieces about GTC 2015 to cover it. I will point out that Elon Musk clearly stated it will be a solid 20 years before self-driving cars take over the current fleet of cars! Ironically, the marketing message released later in the week stated that Tesla would be releasing software for all its cars to be autonomous this coming summer, so who knows what's really happening. He also mentioned that currently there are no autonomous driving systems that are safer than human drivers on the road.


Nvidia’s CEO Jen-Hsun Huang with the Titan X

Titan X
During GTC 2015, the Titan X graphics card was announced with great fanfare! At only $999, it is very reasonably priced, and it packs a wallop of a punch. Boasting 3,072 CUDA cores and 12GB of memory, this card is poised to take the graphics world by storm. The Titan is geared toward the gaming community, whereas the M6000 is the enterprise-level card that would be used in a mission-critical environment.

As mentioned earlier, there were many breakout sessions during the conference. For this report, I attended every session related to the post-production side of the business. First up is a tool for smoothing out 3D models, made by artists, for artists.

The ‘Mush’ for 3D Animation
A great product called Delta Mush was presented by VFX house Rhythm and Hues. Before you start wondering, yes, they still exist! They have slimmed down, restructured and are making strides toward a full recovery from the financial debacle of a few years ago.

Delta Mush, as it is called, is a method with which you can smooth out any geometry while maintaining its weight and movement. It has been in development for some time and shows much promise. It will be incorporated into Autodesk Maya 2016, and some of the larger studios have already implemented it into their workflow, although they preferred not to be named directly. As a 3D artist, I can tell you that you will absolutely love this product and will wonder how the hell you worked without it.

Delta mush

Delta Mush cuts the time it takes to rig a character for animation from days or weeks to hours or minutes, depending on its complexity. It's fast and efficient. It allows an artist to focus on the design of the character while removing much of the painstakingly long process of rigging. With that level of efficiency, fewer people can be assigned to the task, maximizing your budget by reducing staff size and adding to the creative time.
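The core trick, as presented, can be sketched in a few lines: smooth the deformed geometry with simple Laplacian smoothing (each vertex moves toward the average of its neighbors), then add back per-vertex "delta" offsets captured from the rest pose, so surface detail survives while skinning artifacts get mushed away. This toy version works on a 2D polyline with fixed endpoints; the production tool operates on full meshes and, to my understanding, stores deltas in per-vertex local frames so they rotate with the surface. All function names here are illustrative, not the actual API.

```python
def smooth(points, iterations=10):
    """Laplacian smoothing: move each interior vertex toward
    the average of its two neighbors; endpoints stay pinned."""
    pts = [list(p) for p in points]
    for _ in range(iterations):
        pts = [pts[0]] + [
            [(pts[i - 1][k] + pts[i + 1][k]) / 2 for k in range(2)]
            for i in range(1, len(pts) - 1)
        ] + [pts[-1]]
    return pts

def compute_deltas(rest):
    """Delta = rest-pose vertex minus its smoothed position."""
    sm = smooth(rest)
    return [[r[k] - s[k] for k in range(2)] for r, s in zip(rest, sm)]

def delta_mush(deformed, deltas):
    """Smooth the deformed pose, then add the stored deltas back."""
    sm = smooth(deformed)
    return [[s[k] + d[k] for k in range(2)] for s, d in zip(sm, deltas)]

# A zigzag "rest pose" whose bumps are the detail we want to keep.
rest = [[0, 0], [1, 1], [2, 0], [3, 1], [4, 0]]
deltas = compute_deltas(rest)

# Sanity check: mushing the unchanged rest pose recovers it exactly,
# because the deltas were defined as rest minus smoothed-rest.
restored = delta_mush(rest, deltas)
```

The payoff is that a deformed (skinned) pose fed through `delta_mush` keeps the zigzag detail encoded in the deltas while the smoothing irons out pinching and collapsing — which is why riggers can get away with far rougher skin weights.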

If you are a rigger, I highly recommend that you check this out — you will be very happy you did. During the session, the presenter noted that it took only ONE hour to fully rig a character for Game of Thrones. Without it, it would have taken at least a week. Time saved equals more time to spend with your family!

This brings us to the end of part one of our three-part series on the Nvidia GPU Technology Conference. Check this space shortly for more!

Fred Ruckel is a technology consultant and entrepreneur. He is also CEO of RuckSack NY, a Manhattan-based soups-to-nuts creative agency. He took all the photos that ran in this three-part series.

