Transcript
A (0:00)
Welcome to Generative Now. I am Michael Mignano. I am a partner at Lightspeed. Nvidia is undeniably a cornerstone of the AI revolution. Their groundbreaking GPUs are the workhorses of modern AI research and development. Nvidia also made some major announcements at CES this year. And that's why I'm revisiting a conversation I had with Bill Dally. Bill is the Chief Scientist and Senior Vice President for Research at Nvidia. He is one of the most forward-thinking minds when it comes to computer hardware and architecture. His decades-long career started in academia at Caltech, then MIT, before later becoming the chair of the Computer Science department at Stanford before transitioning to Nvidia.
A (0:50)
We talk about his early experiences in the 1980s playing around with neural networks at Caltech, the pace of the AI evolution, and why he believes that AI is the technology that will revolutionize all human endeavors. So check out this conversation I had with Chief Scientist and Senior Vice President for Research at Nvidia, Bill Dally.
B (1:12)
Hey Bill, how's it going?
C (1:14)
It's going well, Michael.
B (1:15)
Thank you so much for doing this. Really, really appreciate the time. Very excited to talk to you. I've been looking forward to this one for a while. You obviously have an incredibly impressive background and role at Nvidia, and there's so much we could get into about Nvidia and the state of AI and GPUs and all of the research that you and your team of, I believe, hundreds of researchers are working on. But maybe before we get there, like I said, you have such an impressive career. I think the audience would love to hear a little bit about what you've done over the past several decades across academia, entrepreneurship, and this role as Chief Scientist of Nvidia. Give us a little bit of the background of your story.
C (2:00)
Okay. What's relevant to AI and the like probably started when I was a graduate student at Caltech. This is 40 years ago, in the 1980s. I took a course on neural networks and I thought that was just a really cool technology. We built little multilayer perceptrons and convnets, and these things called Hopfield nets that were little associative memories. But it also impressed me that it was a toy, that it was a great technology, but the compute wasn't there at the time. But that was a formative thing. And then later on I was a professor at MIT and I was building parallel computers, and it kind of struck me that parallelism was the technology. It was a way to scale performance in a way that you couldn't do with serial processors. But at the same time, existing software was a huge inertia. With Moore's Law in effect then, and the Moore's Law about serial processors, not about transistors, people could just wait, and every 18 months or so the performance of their computers would double. And so why rewrite all your software? If you went with parallel computing, your performance would go up by a factor of four. If you just wait, it goes up by a factor of two. And that's just too easy a path to compete with. So it wasn't really until that ended that parallel computing took off.
