Entrepreneurs, self-driving cars and more
By Fred Ruckel
Welcome to the final installment of my Nvidia GPU Technology Conference experience. If you have read Part I and Part II, I’m confident you will enjoy this wrap-up — from a one-on-one meeting with one of Nvidia’s top dogs to a “shark tank” full of entrepreneurs to my take on the status of self-driving cars. Thanks for following along and feel free to email if you have any questions about my story.
Going One on One
I had the pleasure of sitting down with Nvidia marketing manager Greg Estes, along with Gail Laguna, Nvidia’s PR expert for media and entertainment. They allowed me to pick their brains about new Nvidia tech releases.
Estes explained that, over time, GTC has evolved from a high-performance developer-only show to a show with something for everyone, including attendees who have never written a line of code. Many come to the conference because their business benefits from Nvidia’s technology.
Estes pointed out that some applications — Nuke, Flame or Media Composer — that require advanced color accuracy, multiple 4K displays and multi-GPU performance still demand a high-performance workstation. Other apps for 3D modeling and animation — such as Maya and 3ds Max — are well suited to virtualized systems running Nvidia GRID because they don’t require the multiple GPUs that Flame does, for example.
Estes and I also spoke about advanced rendering, so this next portion will be a bit of a geek-out session. If rendering doesn’t do it for you, skip to the next section (joking).
Last year at GTC, Nvidia and Honda created a fully realized digital car, complete with all its features and materials. They showed it at 3K resolution, and it cost Honda over a million dollars to create.
Nvidia’s vision is to make physically based rendering interactive, scalable and broadly accessible. “There are several things we are going to do in order to make that happen,” explained Estes. “In Iray 2015, we have created a material definition language; the files it creates are called .MDL files. These files allow models to be imported with all materials intact across different high-end software programs.” Think of it as similar to an .FBX file, but much more interchangeable.
“The new version of Iray can tell what kind of processors you have in your system, even multiple processors. It then scales the renders across as many processors, or even other systems, as you choose via a web interface. (That isn’t to say it works via the web, but you can use a web browser to administer the resources locally.) Essentially, you have an infinitely scalable pipeline that, with a click of a button, allows you to dedicate as much processing power as you want, wherever you want.”
This is pretty cool… it works like a true render farm without having to send and receive any original files. During our conversation, I mentioned that to me “it sounded infinite.” Estes corrected me: “I would not yet use the term infinite, but I can tell you we have not yet found an upper limit to what it can do. You’ll run out of money faster than you’ll run out of capability.”
I guess that’s to say, you can scale your renders to your heart’s content and your only limitation is your budget. Big words from a smart man!
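The scaling Estes describes can be pictured as a simple tile-based work queue: split a frame into tiles and hand them to however many processors you choose. This is not Iray’s actual API, just a minimal Python sketch of the general pattern, with a stand-in function in place of real rendering:

```python
# Illustrative only: Iray's real scheduler is proprietary. This sketch shows
# the general pattern -- divide a render into tiles and fan them out across
# however many local processors you choose.
from multiprocessing import Pool


def render_tile(tile):
    """Stand-in for rendering one tile; returns the tile's origin and its
    pixel count instead of actual pixel data."""
    x, y, size = tile
    return (x, y), size * size


def render_frame(width, height, tile_size, workers):
    """Split a frame into tiles and render them on a pool of `workers`
    processes. Bumping `workers` is the local analog of 'dedicating as much
    processing power as you want.'"""
    tiles = [(x, y, tile_size)
             for y in range(0, height, tile_size)
             for x in range(0, width, tile_size)]
    with Pool(processes=workers) as pool:
        results = pool.map(render_tile, tiles)
    return dict(results)


if __name__ == "__main__":
    # A 256x128 frame in 64-pixel tiles yields 4 x 2 = 8 tiles.
    frame = render_frame(256, 128, 64, workers=4)
    print(len(frame))
```

The point of the pattern is that the tile list never changes when you add workers; only the pool size does, which is why this kind of pipeline scales until, as Estes put it, you run out of money.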
Estes had been at SGI, the superpower of supercomputing in the late 1990s and early 2000s, and spent some time at Avid as well. Clearly he is a man who knows what works and what doesn’t within the world of computers and video.
We discussed the new Titan X GPU vs. the new Quadro M6000. On paper, the two are nearly identical; the M6000 is the professional version of the Titan X. Before you get confused on that one, Estes explained that “most of the differences are in the software.”
The Titan X will make gamers very happy, as they typically overclock cards and eventually burn them out! The Quadro M6000, however, is the type of card you use inside an MRI machine. It’s the kind of card that needs to keep working exactly as it is designed, for years on end, without fail… true enterprise-grade machinery.
If you are a Mac user, and many of us are, you’re probably wondering whether this can work in your world. Sadly, it cannot… at least not directly. There is a workaround, though, and I pressed Estes on how we could share this method with our readers. At a company with a Virtual Desktop Infrastructure running Nvidia GRID, users can remotely access their applications from a Mac, with full interactivity. GPU acceleration allows you to remotely drive another computer with realtime interaction. If you’re a Mac person, that is your best bet for now.
Fun fact: Nvidia is the driving force behind Shazam, the song recognition app.
The Emerging Companies Summit
This was by far my favorite part of the conference. This Shark Tank-like challenge allowed small startups using GPU technology to present their businesses for a chance to win money to help them grow. The room was filled with genius-level thinkers, people pushing the boundaries of technology — those who dare to dream and those who strive to achieve.
Some of the biggest “new” ideas in technology have been presented during this summit. Past presenters have continued to grow their businesses or have even been acquired by the likes of Facebook, Google and Microsoft, to name but a few! The prize in this competition was $100,000 of no-strings-attached funding — no equity in the company needed to be given away as part of winning!
There was strong competition, and very few entrants’ solutions were related to the business of post production. The most relevant to M&E was a company called Redshift. It sounds like a new tool for the Red camera, but that was not the case: Redshift is a new method of rendering files that is faster than current approaches.
As for what it takes to win, I asked Jeff Herbst, VP of biz dev at Nvidia, for his take on what he felt a presentation should include. “You’ve got to be crisp,” he explained. “What problem do you intend to solve? What does the world need? What is your solution? Why is this a great market for you to be in?” Herbst was the leader of the Early Stage Challenge and one of the “sharks” on the panel.
Fun fact: Oculus presented at a previous Early Stage Challenge, and was bought by Facebook for $2 billion. Not too shabby!!
This year’s winner of the notable $100K prize was a company called Artomatix. They have found a means of combining a painting with a photograph, using the painting’s brush strokes to restyle the picture. The company demonstrated how it could take a single image and create new images, using the original as a seed. This resulted in similar, but unique, pieces of art with every rendering.
They also said this could be used for crowd replication, but on a larger scale. From my experience, Massive Software dominates this area of the market, and I believe that Artomatix’s product is still a long way behind even Massive’s first iteration from 10 years ago. While it may fill a niche somewhere, I don’t see it having any place in the production/post market unless the technology is acquired and incorporated into something else, which is always a possibility!
Artomatix’s CTO referred to this as “artificial imagination” during his presentation. He talked about how long it took to make Avatar, how much it cost and how insane that was. Artomatix is a computer system that can mimic human-like artistic creativity. “We solved the problem that art costs too much and takes too long,” he said.
Personally, I feel that we are an industry of diverse artists, and as such I figured it was worth mentioning a company that presented a concept looking to replace us all (again, joking). Don’t worry, they’re far away from achieving that reality.
Self-Driving Cars
Yeah, well, not so much, at least not yet. I had hoped I might get inside a few of these cars at the conference, but they just weren’t ready for use yet. On day one of the conference, when Elon Musk, CEO of Tesla, took the stage for the keynote, he discussed what Tesla’s roadmap looked like. The short of it is, these cars aren’t even close to being roadworthy nationwide. There are many test areas around the country, but we aren’t anywhere near full adoption.
There are many factors that will hold this technology back, and all of them are for the right reasons. A few key points to consider: In order to drive a self-driving car, you must pass a test and become certified for an autonomous vehicle. Ironic, isn’t it? Self-driving cars do not work well at speeds between 15 and 25 mph (the most common speeds), and they are not yet accurate in urban areas. Got snow? A self-driving car cannot determine where the lanes are when the road is covered. Even better, if you live in a rural area with no lines on the road, good luck driving one of these — you might as well be driving blind. If the sensors can’t pick up the lines on the road, the car doesn’t know where it is to calibrate distance. Sound ready to you?
Summing Up My GPU Tech Conference Experience
What a truly amazing display of innovative thinking! This is not like going to NAB — I’ve been to plenty of those. GTC was a more technical conference, if you can imagine that.
I suppose the difference between GTC and NAB is that NAB is geared to people working in general production and post. This conference was specific to people with a fair amount of knowledge relating to GPU technology and development.
GTC is certainly worth looking into, and if you are a developer this is the place for you. I see this as somewhat similar to the evolution of SIGGRAPH — how it became the go-to place for artists looking for work. Perhaps GTC will become an arena for developers seeking a new home?
I learned a great deal from Nvidia’s incredibly packed conference agenda, and I’m looking forward to applying all my newfound “deep learning” to my own personal app development — shhhh, it’s called YouSnapz, and it’s coming soon.
Fred Ruckel is a technology consultant and entrepreneur. He is also CEO of RuckSack NY, a Manhattan-based soup-to-nuts creative agency. He took all the photos that ran in this three-part series.