Transcript
A (0:00)
What can 160 years of experience teach you about the future? When it comes to protecting what matters, Pacific Life provides life insurance, retirement income and employee benefits for people and businesses building a more confident tomorrow. Strategies rooted in strength and backed by experience. Ask a financial professional how Pacific Life can help you today. Pacific Life Insurance Co. Omaha, Nebraska. And in New York, Pacific Life and Annuity, Phoenix, Arizona.
B (0:29)
A lot of attention is given to Nvidia's chip dominance in the current AI landscape. They are making insane amounts of money, and it seems like they're one of these unstoppable forces for training AI models, and, you know, becoming the most valuable company in the world just further pushed that narrative. But it feels like cracks are appearing in their monopoly on the chip market. Andy Jassy, who is the CEO of Amazon, said that their competitive chip to Nvidia is already a multi-billion-dollar business. And it seems like this is something that is going to continue to grow. So today on the show, we're going to be diving into what AWS has unveiled and what the numbers are on it, which I thought were actually quite impressive. Before we get into all of that, if you want to try all of the models that I talk about on the show, make sure that you go check out AI Box AI. It's my own startup, and you essentially get access to over 40 of the top AI models: Google, OpenAI, Anthropic, DeepSeek, Claude, Grok, ElevenLabs for audio, and a ton of really cool image generation models, including OpenAI's image generation model and Flux, which is what powers Grok. All of that in one platform for $20 a month, plus the ability to make apps by describing them. So a lot of really cool stuff. Go check it out: AI Box AI. All right, so the big question is, can anybody stop Nvidia's AI chip dominance? Most would say no, because they're so ingrained; they have been, you know, pushed to the front. But I think AWS and Amazon definitely have a big competitive advantage here, which goes beyond the chips. Because of AWS, Amazon Web Services, a lot of people are using their cloud for, you know, powering their AI model training. So Amazon will go and buy Nvidia chips, right? The H100s or H200s or whatever, put them into AWS, and people can go and rent them from AWS to train their AI models.
So once Amazon creates their own chips, theoretically they should have a massive distribution advantage, where they're going to, you know, probably promote their own, maybe give a discount, and have people use their chips. That's my bullish case for Amazon and AWS competing at a high level with Nvidia. Now, will they be able to make their chips better than Nvidia's, technologically and hardware-wise? I think that's a completely different discussion. But I do think that they have the distribution and what it takes to compete at a high level if they can get that hardware. So with all of that said, they are in the space and they are creating chips. There are hundreds of billions of dollars in revenue for any company that can peel off even a little bit of this massive industry that Nvidia is tackling. So Amazon CEO Andy Jassy this week was saying, by the way, he said all of this at the AWS re:Invent conference, that the next generation of their Nvidia-competitor AI chips, called the Trainium 3, is about four times faster and uses less power than the current Trainium 2. And apparently these have seen some, quote unquote, substantial traction. He said it is a multi-billion-dollar revenue run-rate business. They have over a million chips in production, they have a hundred thousand companies using them, and they make up a majority of Bedrock usage today. This is absolutely crazy, because obviously Nvidia still absolutely dominates the market. So even by just taking a small percentage of this, they have a multi-billion-dollar business. Bedrock is Amazon's AI development tool. It essentially lets companies pick and choose what AI models they want to use inside of their stack. Jassy said that Amazon's AI chip right now is winning among a bunch of different companies; there's a whole bunch on their cloud customer base.
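To give a concrete sense of that pick-and-choose idea, here's a minimal sketch of how an application might target Bedrock through the AWS SDK for Python (boto3). The model IDs here are illustrative examples, not a guaranteed catalog, and the request is only built, not sent, so no AWS account is needed to follow along:

```python
# Sketch: with Amazon Bedrock, switching models is mostly a matter of
# changing the modelId string -- the request shape stays the same.
# Model IDs below are illustrative examples, not a guaranteed catalog.

def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build the keyword arguments for a bedrock-runtime Converse call."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
    }

# The same helper works for any model offered on Bedrock:
claude_req = build_converse_request("anthropic.claude-3-5-sonnet-20240620-v1:0", "Hello")
titan_req = build_converse_request("amazon.titan-text-express-v1", "Hello")

# In a real application you would send it with boto3, e.g.:
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**claude_req)
```

The point of the sketch is that the surrounding code doesn't change when a company swaps models, which is exactly the flexibility being described here.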
He said the main reason why people are picking it is because it, quote, has price performance advantages over the GPU options that are compelling. So basically he thinks that it costs less to train AI models, and I think because of this a lot of people are using them. That is, of course, Amazon's classic playbook: offer their own homegrown tech at lower prices. I mean, this is basically the same strategy Amazon uses with Amazon Basics, where you can go and buy, you know, an HDMI cord or a USB-C cord or a broom or a mop or some basic item, and Amazon has created the cheapest version of it. They put it on Amazon, it's a few dollars cheaper, and they just try to make it up in the margins when people buy a ton of it. They're kind of applying the same strategy here, where they're like, look, tons of people are using Nvidia's chips to train models on AWS. What if we just made some sort of compute that was a little bit cheaper? People could still get their models trained; they'll just use our chips because they're a little bit cheaper. Maybe they're not the fastest or, you know, the shiniest object in the room, the most powerful, but they are less money, and a lot of people, I think, will go for that option. In addition to that, the CEO of AWS, that's Matt Garman, was talking in an interview with CRN about one customer responsible for a big chunk of those billions of dollars in revenue. And this is not a big shocker: it's Anthropic, a company that has been heavily invested in by AWS. Amazon has put, I think, over $4 billion into Anthropic across different investments, and a lot of these deals essentially pull them over to AWS. So it's not a really big shocker to say, look, hey, we gave Anthropic $4 billion and, you know, shocker, shocker, they spent that money on us and they used it on our chips. Because that's kind of the way these deals work.
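To make that price-performance argument concrete, here's a tiny back-of-the-envelope calculation. Note that the dollar figures and throughput numbers below are invented purely for illustration; they are not real AWS or Nvidia pricing:

```python
# Hypothetical price-performance comparison -- the hourly prices and
# token throughputs here are made up for illustration; they are NOT
# real AWS or Nvidia numbers.

def cost_per_million_tokens(hourly_price: float, tokens_per_hour: float) -> float:
    """Dollars spent to process one million tokens at a given throughput."""
    return hourly_price / tokens_per_hour * 1_000_000

# Imagine a GPU instance that's faster but pricier per hour...
gpu_cost = cost_per_million_tokens(hourly_price=40.0, tokens_per_hour=50_000_000)
# ...versus a custom-silicon instance that's slower but cheaper per hour.
custom_cost = cost_per_million_tokens(hourly_price=20.0, tokens_per_hour=35_000_000)

print(f"GPU:           ${gpu_cost:.2f} per million tokens")
print(f"Custom silicon: ${custom_cost:.2f} per million tokens")
```

The slower chip can still win on cost per unit of work if its hourly price drops faster than its throughput does, and that's the trade Garman is describing customers making.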
If you're giving someone that much money, you're going to try to dictate how they, you know, spend the money back to you. They obviously had terms in there like, we'll give you this money, but you've got to use it on AWS, you've got to use the chip stack that we use. And so I think this is kind of interesting. Yes, they're making billions of dollars back, but it's almost like a forced contract; people didn't have a choice other than to use them. And maybe it was a better option because it was cheaper, but it is an interesting place to be, for sure. Here's something that Garman, the CEO of AWS, said: We've seen enormous traction from Trainium 2, in particular from our partners at Anthropic, who've announced Project Rainier, where there's over 500,000 Trainium 2 chips helping them build the next generation of models for Claude. Project Rainier, I think, is Amazon's most ambitious AI cluster of servers. They have this spread out across a whole bunch of different data centers in the US, and it's basically built for Anthropic, right? Anthropic has this massive, skyrocketing need; they're training the latest Claude models. I think it came online in October, so this is very recent. And Amazon, of course, is a huge investor in Anthropic, and that's why, in exchange for all of those investments, Anthropic has made AWS their primary model training partner, even though Anthropic is also offered on Microsoft's cloud via Nvidia chips. OpenAI is now also using AWS in addition to Microsoft's cloud, but the OpenAI partnership, I think, couldn't have contributed a lot to the Trainium revenue, because that deal is running on Nvidia's chips and systems. And so I don't think that that's where a lot of that came from. I think mostly this is coming from Anthropic.
Only a few US companies, right, Google, Microsoft, Amazon, Meta, have all of the engineering pieces: the silicon chip design expertise, the homegrown high-speed interconnect, and the networking technology to even try to be a true competitor to Nvidia. Right, because Nvidia basically cornered the market on one major high-performance networking tech in 2019, when their CEO outbid Intel and Microsoft to buy Mellanox, a maker of InfiniBand hardware. Something that I think is important to mention on top of all of this is that any of the AI models that are kind of built to be served up on Nvidia's chips are also going to rely on Nvidia's proprietary Compute Unified Device Architecture, or CUDA, which is the software layer. CUDA essentially lets different apps use the GPUs for parallel processing compute, plus a bunch of other tasks, and the situation is basically kind of like the Intel-versus-SPARC chip war that happened, you know, in the past. It's not a small thing to rewrite an AI app for non-CUDA chips; there's a lot that goes into it. So typically these AI models are just going to keep using Nvidia, but there are a couple of players that are trying to do this homegrown. Amazon, Microsoft, Google, even Meta are trying to build their own chips to get away from exclusively relying on Nvidia. But even with all of that, I think Amazon has a plan, as I have talked about in the past. The next generation of its AI chips, which is going to be the Trainium 4, is going to be built to essentially work with Nvidia's GPUs, with both in the same system as the Trainium 4 chips. So whether that helps peel more business away from Nvidia, or whether it simply reinforces their dominance but keeps them on AWS's cloud, I'm not really sure which is going to happen. Either way, it's good for Amazon and AWS.
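To give a toy sense of why "just rewrite it for non-CUDA chips" is harder than it sounds: frameworks typically route every tensor operation to a backend-specific kernel, so supporting a new chip means reimplementing and re-tuning that whole kernel library. This is only an illustrative sketch of that dispatch pattern, not real framework code:

```python
# Toy illustration of backend dispatch. Real frameworks route every
# tensor op to a backend-specific kernel; the hard part of leaving CUDA
# is rewriting and re-tuning thousands of kernels like this one.

KERNELS = {}

def register(op: str, backend: str):
    """Register an implementation of `op` for a given backend."""
    def wrap(fn):
        KERNELS[(op, backend)] = fn
        return fn
    return wrap

@register("matmul", "cuda")
def matmul_cuda(a, b):
    # Stand-in for a hand-tuned CUDA kernel (plain Python here).
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def run(op: str, backend: str, *args):
    key = (op, backend)
    if key not in KERNELS:
        # This is the porting problem: no kernel, no chip support.
        raise NotImplementedError(f"{op} has no {backend} implementation yet")
    return KERNELS[key](*args)

print(run("matmul", "cuda", [[1, 2]], [[3], [4]]))  # works: [[11]]
```

Multiply that missing-kernel error by every operation a model uses, and it's clearer why most teams stay on Nvidia rather than port.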
I think it might not matter to Amazon a lot, because they're already on track to make, you know, multiple billions of dollars from Trainium 2, they're already on Trainium 3, and now they're working on Trainium 4 chips. I think the next generation is going to be a lot better, and that alone just might be enough to make AWS the winner. All right, thank you so much for tuning into the podcast today. If you learned anything new about the wild world of AI and chips and the competition for Nvidia, it would mean the world to me if you could leave a rating and review, and also make sure to go check out AI Box AI. I'll leave a link in the description. Hope you have a fantastic rest of your day.
