Transcript
Alex (0:00)
What's going on in the heart of Google's AI research operation? We'll find out with Google DeepMind's Chief Technology Officer right after this.
Leah Smart (0:08)
From LinkedIn News, I'm Leah Smart, host of Everyday Better, an award-winning podcast dedicated to personal development. Join me every week for captivating stories and research to find more fulfillment in your work and personal life. Listen to Everyday Better on the LinkedIn Podcast Network, Apple Podcasts, or wherever you get your podcasts. From LinkedIn News, I'm Jessi Hempel, host of the Hello Monday podcast. Start your week with the Hello Monday podcast. We'll navigate career pivots. We'll learn where happiness fits in. Listen to Hello Monday with me, Jessi Hempel, on the LinkedIn Podcast Network or wherever you get your podcasts.
Alex (0:45)
Welcome to Big Technology Podcast, a show for cool-headed and nuanced conversation of the tech world and beyond. We have a great show for you today, a bonus show just as Google's I/O news hits the wire. We have so much to talk about, including what's going on with the company and what it's announced today, but also what is happening in the research effort underlying it all. And we have a great guest for you. Joining us today is Koray Kavukcuoglu. He is the chief technology officer of DeepMind. We're going to speak with Koray today, and then tomorrow you'll hear from DeepMind CEO Demis Hassabis. Koray, great to see you. Welcome to the show.
Koray Kavukcuoglu (1:23)
Thank you very much, folks.
Alex (1:24)
By the way, if you're watching on video, Koray and I are in two separate conference rooms in Google's... I don't know, it's a pretty cool new building that they have. It's called, what, Gradient Wave or something?
Koray Kavukcuoglu (1:34)
We call it the Gradient Canopy.
Alex (1:36)
Gradient Canopy. Anyway, we're here, and I wanted to ask you a question that we've been asking on the show a lot, which is the scale question. Google has a tremendous amount of compute at its disposal, so you basically have a choice: is it scale that you want to throw at these models, or is it new techniques? So let me ask it to you as plainly as I can. Is scale the star right now, or is it a supporting actor in terms of getting models to the next step?
Koray Kavukcuoglu (2:07)
It's a good question, and I like the way you framed it, because scale is definitely an important factor. The way I like to think about this is that it's rare, in any research problem, to have a dimension that you're pretty confident will give you improvements, maybe with diminishing returns. Most research isn't like that. When we think about our research right now, in the case of generative AI models, scale is definitely one of those dimensions, but it's equally important alongside other things.

When we're thinking about our architectures, the architectural elements and the algorithms we put in there that make up the model, they are as important as the scale. We of course analyze and understand how these different architectures and algorithms become more and more effective as we scale. That's an important part, because you know you're putting in more computational capacity, and you want to make sure you research the kinds of architectures and algorithms that pay off best under that kind of scaling property.

But as I said, that's not the only one. Data is really important; I think it's as critical as anything else. The algorithms, architectures, and modules that we put into the system are important, and understanding their properties with data and with more compute is just as important. And then of course, inference-time techniques are as important as well, because once you have a particular architecture, a particular model, you can multiply its reasoning capabilities by using that model over and over again through different techniques at inference time.
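To make that last point concrete, one well-known family of inference-time techniques is self-consistency: sample the same model several times on the same prompt and take a majority vote over the final answers. Below is a minimal sketch, assuming a hypothetical generate() function that stands in for any sampling-based model call; it is not a specific Google API, and the stub's answers are invented for illustration.

```python
from collections import Counter
import random

def generate(prompt: str, temperature: float = 0.8) -> str:
    """Hypothetical stand-in for a sampling-based model call.

    This toy stub returns a plausible answer at random; temperature
    is ignored here but kept to mirror the shape of a real API.
    """
    return random.choices(["42", "41"], weights=[9, 1])[0]

def self_consistency(prompt: str, n_samples: int = 16) -> str:
    """Sample the same model n_samples times and majority-vote the answers."""
    answers = [generate(prompt) for _ in range(n_samples)]
    winner, _ = Counter(answers).most_common(1)[0]
    return winner

if __name__ == "__main__":
    # With enough samples, the occasional wrong answer gets outvoted.
    print(self_consistency("What is 6 * 7?"))
```

The vote tends to filter out occasional wrong samples, which is one concrete sense in which reusing the same model over and over at inference time can multiply its effective reasoning capability without changing the model itself.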
