Transcript
A (0:02)
Latitude Media, covering the new frontiers of the energy transition.
B (0:08)
From Latitude Media, this is Open Circuit. I'm Stephen Lacey, and it's just me today. Jigar and Katherine will be back with me next week. As we start to wind down the year, we're in that seasonal reflection mode, and in that spirit, we've got a bit of a different episode this week. We just passed the three-year mark of when ChatGPT dropped on the world, launching a commercial arms race for AI dominance. It's become a magnetic force, some would call it a black hole, for capital, the workforce, and media attention.
B (0:43)
And the narrative of the last few years, driven by figures like Sam Altman, was intoxicatingly simple: large language models are the on-ramp to artificial general intelligence, and the only bottleneck to getting there is scale. Just feed the machine more, more chips and more electricity, and it will get smarter.
C (1:00)
Look, I think compute is going to be the currency of the future. I think it will be maybe the most precious commodity in the world, and I think we should be investing heavily to make a lot more compute.
B (1:11)
In the race to build these LLMs, the top tech companies have collectively teed up more than a trillion dollars in capital expenditures to expand their computing power. And seeing an opportunity there, utilities are planning another trillion dollars to upgrade the grid over the next three years, much of it to serve the appetite of AI data centers. Investors are cheering both parties along, and the entire logic of this multi-trillion-dollar leg of the race has been based on the belief that scale is the determining force for building human-level intelligence. But three years in, this narrative, at least for LLMs, is showing cracks. Ilya Sutskever, the co-founder and former chief scientist at OpenAI, was on the Dwarkesh Podcast about a week ago. This is one of the most influential AI experts in the world, and he questioned the idea of a scaling law for large language models: "From 2012 to 2020, it was the age of research. Now, from 2020 to 2025, it was the age of scaling. Because people say, this is amazing. You gotta scale more. Keep scaling. The one word: scaling. But now the scale is so big. Is the belief really that, oh, it's so big, but if you had 100x more, everything would be so different? It would be different, for sure. But is the belief that if you just 100x'd the scale, everything would be transformed? I don't think that's true."
