
Brett Shafer
Welcome to Chitchat Stocks. On this show, hosts Ryan Henderson and Brett Shafer analyze businesses and riff on the world of investing. As a quick reminder, Chitchat Stocks is a CCM Media Group podcast. Anything discussed on Chitchat Stocks by Ryan, Brett or any other podcast guest is not formal advice or recommendation. Now please enjoy this episode.
Ryan Henderson
Welcome into the Chitchat Stocks podcast, a podcast to help you find your next great investment. Today we bring back on a recurring guest, Rihard Jarc. Rihard is the founder of an AI startup that he eventually sold to a software company and is now a technology investor with fantastic coverage on the big tech space, especially in the age of this AI boom. He writes the Substack UncoverAlpha, which, if you are someone that wants in-depth knowledge on the big tech companies, OpenAI, Nvidia, all of these businesses, and how the AI industry and the boom are going, especially with how fast moving everything is, you know, GPUs, semiconductors, all that good stuff, I would check out. The link will be in the show notes; we'll include a link to that for the listeners. Before we get started, as a note, we are recording this on October 17th. It's kind of the start of earnings season, so if anything happens between now and when you hear this, we're not going to be referencing it on the show, since we're not time travelers. But Rihard, after the intro, we're going to kick things off right with Amazon. We're going to go through all of the big tech companies today. Well, we're going to talk about at least all of them except Tesla, since it's essentially not in this AI race in the same way as some of these companies. We're going to try to hit as much as we can. We're going to go through Amazon, Alphabet, Meta, Microsoft, Nvidia, OpenAI, even Apple, and then some general stuff around the AI boom: GPUs, the potential AI bubble. Let's start out with Amazon, though. We've got to start somewhere. What is Amazon's relationship with Anthropic, and what is AWS's strategy in AI? Because there is a big narrative out there that they're falling behind.
Rihard Jarc
Yeah. First of all, thank you for inviting me again. It's always a pleasure to join you guys. Yeah, with Amazon, you know, there is this narrative that they're falling behind. It's quite public by now, and I think we also had a few reports out there which kind of confirmed the thesis that they're a bit behind. So they do have, as I think most people know by now, a significant stake in Anthropic. It's supposed to be between 10 and 20% or something like that. And Anthropic and AWS are linked, but it's not just Amazon depending on Anthropic, it's also the other way around, because there are reports that something like 80% of the traffic that comes to Anthropic is from Bedrock, Amazon's managed AI model service, which helps with the AI routing. And Amazon is diversifying. So on Bedrock, I think a few months or a year ago it was like 90% of the traffic going towards Anthropic, and now it's a bit different because Amazon is also hosting open-source models. So I would say Amazon is playing it more safely than many of the other cloud providers, and I think you're also seeing other cloud providers like Microsoft starting to play it safe, while on the other hand you have the neoclouds and Oracle being super aggressive, making huge discounts and trying to win over market share. And if you listen to Jeff Bezos, he did a recent interview where he kind of said that he believes we're in a bubble, at least an industrial bubble, so basically an AI bubble. And I think that also shows in the way that Amazon, or AWS, is approaching this. Although I think they will still benefit. As we have this scarcity of compute, I think Amazon will also benefit, because they do have power and they do have data centers which they can use. So, you know, as Microsoft is full, as Google is full, as Oracle and the neoclouds are full, you also see the benefits at AWS.
So, we're recording this before earnings, but I wouldn't be surprised if we see an acceleration of AWS this quarter. But they are playing the safer game, at least for now, when it comes to these AI build-outs and taking risks on companies and stuff like that.
Ryan Henderson
How important is the Anthropic relationship? Because, and I think there was a report yesterday that they are projecting, and this is well in the hundreds of percent revenue growth, $9 billion in revenue this year, or at least reaching annual recurring revenue of $9 billion by the end of this year, and then hoping to get to over $20 to 25 billion in 2026. How important can that relationship be? Or is it maybe overrated by the investment community?
Rihard Jarc
No, I think it is super important, because Amazon doesn't have their own models. The horse which they bet on is open source and Anthropic, so for them, either of those two has to come out on top. But so far the Anthropic relationship does feel natural, because, you know, Amazon or AWS is really strong with developers, and Anthropic is mostly used for coding. Right? So it's a natural fit, because it looks like, at least from what we are seeing right now, Anthropic has kind of gained a foothold in this coding environment. So they are, if you will, the enterprise version of these AI models, where OpenAI is, for now at least, the consumer version. And if you look at Amazon, they're targeting the enterprises, always have been, right, with AWS. So it is a natural kind of fit. And Anthropic is also an important client for their chips, their homegrown Trainium chips, which Amazon is a bit forcing Anthropic to use. But you know, you need a big client using your chips so that you can develop and enhance the chip to be more effective. So I think the relationship is important. But yeah, Anthropic has its own maybe limitations, or is having some trouble now. We will see if they will get in the crosshairs of the government, because they're kind of trying to slow things down in terms of progress, so this might not be what Amazon is really fond of. But yeah, for now it looks like quite a natural relationship. And I think Anthropic's focus on one vertical, or at least visually it seems that way, on coding, does seem like a smart strategy.
Guest Host / Interviewer
Is there any... When I look at Amazon's relationship with Anthropic and Microsoft's relationship with OpenAI, I basically look at it, well, maybe not as much with OpenAI, but as pseudo-ownership. It just kind of feels like, when I think, what's Amazon's big push into AI, I think it's Anthropic. Is there any advantage to having them sit separate, independent, but with a huge stake in the business, as opposed to actually having them under your corporate umbrella? Or is this just basically to appease regulators?
Rihard Jarc
Yeah, I think at first it was maybe to appease the regulators, but right now I don't think, first of all, that Amazon or even Microsoft would want to have an Anthropic or OpenAI on their balance sheets, because they're burning, they're going to burn a ton of cash going forward. And, you know, it helps them that venture capital is funding those companies, because those companies are then spending the checks on their cloud businesses. Right? And even if they wanted to take over those companies, I think both of them are now too big for that. Well, at least if we continue on this path and don't get into bubble territory where we get distressed assets. But I also think, with the ownership stakes of Amazon and even Microsoft, people tend to think that they own huge chunks of the business. But after the restructuring, for example of OpenAI, I think Microsoft is going to get like a third, maybe, or even 30% of OpenAI only, so of the new for-profit entity, which is a lot smaller than the 50% that they had. And even if you look at Amazon, even if it's like 10 to 20%, you're going to get diluted a lot, because if these companies intend to raise these trillions of CapEx, which we're probably going to talk about later in the show as well, I don't think Amazon or Microsoft will be participating in those rounds, because they're already exposed a lot and their free cash flow is tied up in this build-out of the new data centers and GPUs. So I think in the end these percentages will be sub-20 or even lower in terms of just the ownership stake, if everything goes according to plan.
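The dilution mechanics described here can be sketched with purely hypothetical numbers (the stake, round sizes, and valuations below are illustrative assumptions, not disclosed figures):

```python
# Hypothetical sketch: how a minority stake shrinks across funding rounds
# the holder sits out. All figures below are made up for illustration.

def dilute(stake: float, rounds: list[tuple[float, float]]) -> float:
    """Return the stake after each (amount_raised, pre_money_valuation)
    round, assuming the existing holder does not participate."""
    for raised, pre_money in rounds:
        post_money = pre_money + raised
        # New shares go to new investors; existing holders dilute pro rata.
        stake *= pre_money / post_money
    return stake

# E.g. a 15% stake, then two mega-rounds the holder sits out:
# $100B at a $400B pre-money, then $200B at an $800B pre-money.
final = dilute(0.15, [(100e9, 400e9), (200e9, 800e9)])
print(f"{final:.1%}")  # 15% -> 12% -> 9.6%
```

So even a mid-teens stake drifts toward the "sub-20 or even lower" range once a couple of mega-raises happen without the original backer writing new checks.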
Guest Host / Interviewer
Okay, let's shift gears to Google quickly, or Alphabet I should say. I'm going to read one of your tweets here that I saw yesterday, I guess, or posts on X. You said Google is the only frontier LLM provider that has the full stack already in place and working. That includes distribution, AI models, data, their own cloud, and TPUs. You said if Google wins the consumer LLM race, it is not only bad for OpenAI but also for Nvidia, because Google is the only one that is not totally beholden to Nvidia. I guess let's maybe go through the stack there. So what are the different components that Google owns in sort of that AI supply chain, if you will, and then how valuable are those TPUs?
Ryan Henderson
Sure.
Rihard Jarc
So if we go through the stack. First, with model development, we have DeepMind, which is their AI lab, which is producing products like Gemini, like Veo, so all of these different kinds of products or models, if you will. Very similar to OpenAI, we can say, right? So this is the first layer of the stack. Then you have GCP, which is their public cloud offering, and GCP of course also helps DeepMind, because they have easier access to infrastructure. OpenAI is just now trying to build out these data centers where they are the owner or have a direct relationship with the suppliers, so they're not just beholden to the cloud providers, and Google already has this with GCP. In addition, you know, all of the volume on GCP helps DeepMind in terms of infrastructure, because they can optimize it better, because they have better scale, so they can get better deals for TPUs and stuff like that. Then we get to TPUs, which is their hardware. This is their ASIC competitor to Nvidia, in a way. But the thing is that everybody's doing their own ASIC right now, right? You have Meta, you have Microsoft, you have Amazon, but Google is the only one which has a mature enough offering that is actually effective. They're on, I think, the seventh generation right now, and it's by far the most mature and, performance-wise, a real alternative to Nvidia. Because with developing chips you need years, you need product cycles to learn from your mistakes and, you know, enhance these products and make them useful for the workloads that you're powering. So they have TPUs, and they have TPUs at scale. The last interview I read was with a former Google employee who basically said, yeah, we're buying Nvidia, but we're buying Nvidia only because clients are requesting Nvidia on GCP, and for all the internal stuff, including Gemini and Veo, we're actually using TPUs.
So even for training they're using TPUs, which tells you a lot, because, you know, Gemini and their models are on par with, if not even better than, OpenAI's models. So they're frontier, right? And they were not trained or inferenced on Nvidia. And this is helpful because, you know, if the supply chain seizes up and you can't get enough GPUs, or they become too expensive, whatever, Google has their own vertical where they can get the compute, continue to use and inference the models, and provide them to the users. It's also good when they negotiate with Nvidia: you can always say, I have an alternative, right? So you have better negotiating power than if you're just Oracle or Microsoft, who have very early-stage ASICs, where Nvidia can say, what's the alternative? Right? So I'm not gonna give you a discount, because you don't have an alternative. Well, with Google it's different. Even on a recent podcast with Brad Gerstner, Jensen said that we can actually put Google in a different bracket with their TPUs, and that it's only them and Google which are mature enough to offer this kind of stuff. And the TPUs are really important because, in performance per watt, they're supposed to be really effective, especially for AI use cases. And as I said, the proof is in the numbers: if Google is taking all of their internal processes and products and powering them with TPUs, then that's the proof that it actually works. Right?
Ryan Henderson
So yeah, if Google is good at anything, and they are good at a lot of things, it is efficiency in data centers and compute. I want to follow up here, maybe on the consumer side of things. I've seen conflicting data out there. Some stuff, you know, it's moving quickly; it's not public for the most part. Is Gemini, from a usage perspective, from consumers such as ourselves, catching up to OpenAI and ChatGPT, or do they still have a tiny amount of market share?
Sponsor / Advertisement Voice
Before we move on, we want to talk about our friends at Interactive Brokers. Interactive Brokers is our favorite brokerage platform. They make it easy to buy stocks, ETFs, options, futures, currencies, bonds and more, all from a single unified platform. And Interactive Brokers allows you to maximize your returns by minimizing your cost. They offer zero commissions on U.S. stocks and low commissions on international securities. They aren't cutting corners either. Interactive Brokers' one-of-a-kind smart routing technology gets you the lowest price possible in 36 countries and 28 different currencies. Interactive Brokers is our home for stock trading, and we wouldn't go anywhere else. If you're getting serious about investing, it's time to upgrade to a broker that you can trust. Head on over to ibkr.com. Restrictions apply. Interactive Brokers is a member of SIPC.
Rihard Jarc
Yeah, so in the last, let's say, three months they have gained a lot of ground, and it's because of Nano Banana and stuff like that which they released, so products that helped propel them. And for, I think, two weeks, Gemini was on top of the iOS download charts, which tells you a lot, because when people say Google is on top, others say, oh, but it's the default on my phone, whatever. No, on iOS, Gemini is not the default on your phone, so these are actual downloads. Right? And it was above ChatGPT for like two weeks or even more. Now it's in third place after OpenAI and Sora, so OpenAI kind of took the helm again with the release of Sora. So Gemini is not there yet in terms of just the usage, but it has definitely gained a lot of market share. And for Google, it's the right time to launch Gemini 3. And if Gemini 3 is better, which some rumors are suggesting it's really good, than GPT-5... because with GPT-5, in terms of performance, we can argue: what was the leap from GPT-4? In my view, the real leap was the routing system, so that OpenAI can run the models more cost-effectively. So Google has an opening here to reclaim the frontier model crown with Gemini 3. We'll see. I think they're supposed to launch it quite soon, in a month or so, so it will be interesting how the market takes that. But we must not forget AI Overviews and AI Mode; these are products that are high in usage. Google is already showing you what they're serving: I think they put out a stat of a quadrillion tokens or something like that, so higher than trillions of tokens per month, or per year, whatever. And the trajectory of those tokens is really high, because remember, with AI Overviews and AI Mode they are also serving these AI workloads, and showing you that they can do it effectively without raising trillions of new CapEx to serve these models. Right?
So when you talk about Gemini, you have to acknowledge that, in terms of overall usage, I think Google is higher than OpenAI if you look at their surfaces, like AI Overviews and AI Mode.
Ryan Henderson
And it comes back to that infrastructure with TPUs and the stuff they've built up over the last 20 years, just powering Google and YouTube and what have you. Let's jump to Meta. We've got a lot of other companies to get to here. The question I have for them is really: besides advertising optimization, which they've been using AI tools for for a long time now, what is their strategy for AI monetization? Because right now, I guess I don't see much.
Rihard Jarc
Yeah, yeah. So with Meta, as you mentioned, monetizing AI is... yes, you have ad targeting, which they have been doing for quite some time with Advantage+, which is not really, let's say, a gen AI workload. Many would try to frame it that way, but I don't think it's a gen AI workload; it's just an AI workload, right? So it's different. The thing that they can still do, and are in the early stages of, is generating creative for ads. This is different than targeting. It just means that smaller advertisers can, for very little cost, make very compelling ads. And the effectiveness of those ads, because the ad creative is better, can be much higher, and with it the return on ad spend, the ROAS, for advertisers. At the same time this means that over time CPMs go higher, because the ads are more effective. Right? And this is unlocked for small advertisers, and it also helps bigger advertisers, because now they can skip, or at least use less of, ad agencies. And, you know, something like 30 to 40% of the budget goes to ad agencies, and if they can save that 30%, there were surveys done showing that most of those savings would go towards increasing ad budgets, again affecting CPMs. So you have this effect in terms of AI monetization. And then you have the second one, which I think is also really big and important: monetizing WhatsApp and other messaging surfaces. You already see OpenAI trying to do this, with e-commerce folded into these chat surfaces, right, where you can monetize better. So Meta can go from just selling ads to actually selling, and taking a take rate on, everything that's sold to the user through their product. Right? So if somebody asks Meta AI, okay, what are the best sneakers right now, and then says, oh, but I want them for running, and I want them in, I don't know, this color, et cetera, et cetera, and it pulls in the actual purchase order, right, then they can do it.
Meta can take a percentage of those revenues, and it's not an ad, it's actually take-rate revenue. Right? And then you also have WhatsApp customer service. I think this is a big vertical. As I've researched it, customer service, just the outsourced kind, so not internal customer service, is a half-a-trillion-dollar market per year. Right? So all of the customer service companies. And with AI you can replace all of that industry, or at least most of it, and I think that's a big lift for Meta and brings new revenue, which is again more sales-type revenue. We must also not forget that with Meta AI they get higher-intent signals, so they can start to surface ads similar to Google, because they have higher intent, they have other information than just social. And with higher intent they then own the full vertical. So they can go to an advertiser and say, yes, we can sell you brand advertising, we can sell you social, we can also sell you high-intent ads. Right? So you can service the whole ad vertical on one platform, which is really valuable. And the last part I will mention is Meta Ray-Bans and AR. From what we have seen, I think AI will be a really important navigation system for these glasses. And it's more in the voice realm where I think the real usage is, not just in text: having smart glasses and then saying, hey Meta, can you, I don't know, calculate this thing that I'm seeing here, and stuff like that. So having an important and capable AI model is also where you will benefit with smart glasses. Right?
Guest Host / Interviewer
Aside from the smart glasses initiative, which, I imagine, is slightly more speculative in terms of adoption: when I think about all the companies on this list, it feels to me like Meta has the shortest path between AI training and revenue recognition, because of the efforts going towards advertising. Maybe Google as well. But if I'm understanding you right, that first pillar that you talked about, not the ad targeting but the ad generation: is that basically, let's say I'm a small business that sells, I don't know, protein bars and wants to target men in their 20s or something like that. Is the idea then that you can go to Meta and say, here's who I am, here's the customers I'm looking at, create me an ad, and obviously use your targeting efficiencies? Is it basically just offloading all the work to them?
Rihard Jarc
Yeah, exactly. It's even easier than that. They even have, I'm not sure if it's in beta or already in production, a tool where you can just send a URL link of your website or the product that you sell, and then they can suggest to you who your audience is, who additional audiences might be, and what the ad should look like. But then also, as you input it, right, you leave the AI to do its work, and the AI can hyper-personalize the ad, so that if, I don't know, athletes are your target, and somebody likes basketball, somebody likes football, whatever, different ad creatives can be created on the fly and make the ads more effective. So yeah, you are kind of leaving it to the AI system to do targeting and creative at once. But in terms of the company that benefits the fastest, I think the cloud providers will still be the ones, because they recognize revenue faster. And then it's, yeah, the ad companies, which should benefit from this and already are, partially. But also, right now, because Google is transitioning to AI and the usage of search is going down, you are seeing a lot of businesses having trouble with organic traffic from Google. So they are starting to pay up for search ads, and they will also start to pay up wherever they can get exposure. So they're going to start bidding up the CPMs on social media as well, because they're going to try to replace the traffic that has been lost. So I think if search goes away, or partially goes away, then you will see all the other surfaces benefit from higher CPMs, because advertisers will just have a harder time reaching people, since so far OpenAI and Gemini surfaces are not yet displaying ads, or at least not that much. So until that happens, you can see a bump in CPMs for all the other industry players.
Guest Host / Interviewer
Yeah, that's an interesting sort of byproduct: the rise in potential cost per click for both Meta and Google as well. Let's shift to Microsoft real quick. Azure has been seen, and I think that's playing out in the numbers as well, as sort of the market share taker among the big three hyperscalers, at least over the last, I'd say, six months. And I think a lot of people are positioning them, well, them and Google Cloud, as sort of the AI cloud, the cloud that's benefiting the most from the AI workloads. So I guess: A, do you see that as true? Is it the case that they're kind of the leader in an AI workload world? And then, what is Azure's plan as OpenAI has now somewhat publicly gone non-exclusive? I believe they're working with Google Cloud as well. What's the relationship like there?
Rihard Jarc
Yeah, so, correct, Azure has been gaining ground. But I would say that Azure was gaining ground even before we had the AI boom. Before OpenAI, Azure was already taking market share because of their relationship with enterprises; they're bundling with other products like Office, like ERP systems and stuff like that. And yeah, I mean, there are a lot of alternative data sources that show Azure is continuing to build and take market share. But I think it would also be interesting to see Azure's numbers ex-OpenAI, because as much as OpenAI is a positive for Microsoft, it can also be a problem: if you're tied too much to one client, because it's so big and its compute demands are so high, you might end up not serving your other clients. And I think the shift you saw, where OpenAI is now using other cloud providers, is because of that fact. I think Microsoft said, okay, we already have a third of the company, we are serving a lot of their workloads, but now we must also take care of other clients and hedge our bets, so we're not exposed 100% to the success of OpenAI. And I think Microsoft is also being smart, because of what they're seeing in the cloud industry. If you had said three or four years ago that somebody could attack the moats of the three hyperscalers, I would have said you're crazy. But today we're talking about neoclouds and even private companies like OpenAI raising hundreds of billions, 300 billion, half a trillion for data centers. And I think investors need to be careful in analyzing the landscape. If you suddenly don't have three players anymore that can offer you AI compute, or compute in general, but you have the neoclouds, you have OpenAI, who's doing their own data centers, you have xAI, again with their own data centers, but rumors that they're gonna start selling compute as well...
You have Oracle. So suddenly it's not a monopoly or a three-player oligopoly, but actually a distributed market, and then the margins are going to be under pressure. And you're already seeing this with reports out there on Oracle and their margins. They're now saying, yeah, we will be able to achieve 35% gross margin. But let's be honest, a 35% gross margin is not really good for me as an investor in what's supposed to be a monopoly-like business. Right?
Ryan Henderson
AWS has a 35% operating margin, so it's a big difference.
Guest Host / Interviewer
Can I pause here, Rihard, and ask: when you say neocloud, what exactly does that mean? Is that just referring to basically the Oracles and CoreWeaves of the world?
Rihard Jarc
Yes. So Nebius, CoreWeave, all of the smaller ones like Lambda Labs. You have a ton of these smaller clouds, if you will, that Nvidia has funded or helped get GPUs, because Nvidia also wants to reduce the risk of customer concentration. And these neoclouds, if you think about it just from an AI workloads perspective, are not that far behind many of the hyperscalers in terms of just capacity, especially if you now consider the AI CapEx and letters of intent from OpenAI and stuff like that. So if this actually gets played out over the next three to five years, you might end up with neoclouds, or Oracle, and everybody else having a similar data center footprint to some of the hyperscalers. And that's a risk, right? But going back to Microsoft, just to finish that argument, I think what they're seeing is that at this point it doesn't make sense to build at the crazy pace that we're building right now. They say, okay, I want to keep the client relationship, because the client is saying to me, I need compute, I need compute. And Microsoft says, okay, I will get you compute, and then they sign deals with neoclouds who provide the infrastructure. But the client doesn't need to know that it's run on CoreWeave, let's say, right? So Microsoft still keeps the relationship with the client, but they get to de-risk, because if the data center build-out goes wrong, if, I don't know, the amortization costs of GPUs are bigger than projected or anything like that, then that risk sits on the balance sheet of the neocloud, not Microsoft. Right? So I think Microsoft is hedging and saying, if we have a bubble and it pops, then we get distressed assets, and then, you know, if Microsoft is not too exposed, they can buy up those neoclouds. Or, let's say, the value of those neoclouds is mostly in power.
So the power commitments and the leases on data centers that they have: Microsoft can buy up those assets and then again be in a monopoly-like market. But if we continue at this pace, then it's a risk for all three: Microsoft Azure, AWS, and GCP.
Guest Host / Interviewer
So this episode is presented by our brand new partner, Portseido. Do you know what your actual investment returns are? Well, if you're like me, then the answer is probably no. And that means you need to try Portseido. Portseido is the ultimate portfolio performance tracker. With Portseido you can easily aggregate your various brokerage accounts and instantly see your actual investment returns. That includes all your investments too: stocks, ETFs, even crypto, if that's your thing. You no longer have to guess what your returns are. With Portseido, you can find out in minutes whether or not you are actually beating the market. And that includes a dividend tracking dashboard and even tracking of short positions. I have been looking for a tool like this for a long time, because I was sick and tired of tracking it manually, and Portseido has finally built it. If you want to find out what your real returns are, check out Portseido. The link will be in our show notes.
Ryan Henderson
All right, let's move on to Nvidia. It's obviously the giant in the space, a different player because they're selling to all of these companies that we talked about. What's developed over the last few months, maybe a few quarters or longer, are these, I guess the proper term may not be this, but circular financing deals, where, for example, they sign a hundred-billion-dollar commitment with OpenAI, and then OpenAI is going to potentially turn around and take this money that's flowing from Nvidia's balance sheet to OpenAI and buy Nvidia chips. From your perspective, why are they doing this? What's the strategy there, and why is it necessary?
Rihard Jarc
Yeah. So first of all, Nvidia definitely has the cash flow. Right now the only company with really big free cash flow is Nvidia, because everybody is buying their accelerators, their GPUs. Why are they doing it? I think the real reason is this: for OpenAI and all of these AI startups raising, let's say, from 0 to 10 billion, we had venture capital, so we had the Sequoias, all of those venture capital firms, funding these companies. From 10 to 30 billion, you normally get somebody like SoftBank, right? And now you're going into a realm where these companies are trying to raise hundreds of billions or even trillions of dollars. And who is going to invest or lend that money? Well, probably the only company that has enough cash is Nvidia, and maybe Apple, right? And for Nvidia it's important. There was a stat from Dylan Patel, who runs the SemiAnalysis team, that the end customers of one third of the GPUs being ordered are OpenAI and Anthropic. So for Nvidia, it's really, really important that these two clients can continue to get new capital; otherwise you might end up with a problem in terms of your growth slowing down. So I think Nvidia's motive here is to help their most important customers continue to get new capital, because if Nvidia puts its name on the deal, those companies can raise, let's say, more easily, although I'm not sure how easy it is to raise that kind of money. Right? But it is easier with Nvidia as part of the deal. I think that just shows us we're at a quite late stage of the cycle, where, if you will, the lender or investor of last resort is Nvidia. And we're also starting to see a lot of these debt deals where GPUs are sold to an SPV, which then rents the GPUs to somebody like OpenAI or xAI or whatever, right? And the collateral of that debt is the GPU or the data center.
And here is the problem: GPUs are fast-depreciating assets, especially now that Nvidia has moved to a one-year product cycle. So we're starting to see these creative deals, which I am worried about, because I think it shows we are at the late stage of the cycle, at least when it comes to capex and these huge commitments. Nvidia has a lot of cash they probably don't know what to do with, but at the same time they want to prop up the ecosystem so it continues to function. And it's not the first deal. Nvidia is active in supporting CoreWeave, too; they have a deal with CoreWeave, I think for around 7 billion, where if CoreWeave in X years doesn't have enough demand for some capacity of Nvidia chips, Nvidia will be the backstop. They will take on the burden of the unsold compute, which again shows the creativity, or the late stage of the cycle, that we're in.
Ryan Henderson
Yeah, before we move on to another company, let's take a question from Twitter that I thought was quite helpful. Essentially, the listener wanted to know your thoughts on the impact of changing depreciation schedules for GPUs. You mentioned these are rapidly depreciating assets. What are your thoughts there, and how does it play out over the long term for this industry?
Rihard Jarc
Yeah, so I recently wrote an article on Uncover Alpha covering some of these problematic areas, and one of them was the amortization rates of GPUs. The problem is that the cloud companies have mostly been extending the useful life of these GPUs toward five to six years. I ran the numbers for comparison. Microsoft's server and networking equipment has a useful life of four to six years. Oracle in 2025 just bumped their useful life for GPUs from five to six years. Amazon this year reduced their useful life from six to five years because, they say, technological progress is faster. Then we have CoreWeave at six years, Meta at five and a half, and Google at six, and that's networking equipment and GPUs together, so it's not just GPUs. But what's important to understand here is that up until 2024, Nvidia was on a two-year product cycle, and since Blackwell they're on a one-year product cycle. And this changes things, because each generation of accelerator is much more efficient in terms of tokens per watt. From Hopper to Blackwell, Jensen said it himself: Blackwell can do 10 to 20x more tokens per watt than the Hopper generation could. And why is this a problem? Because we're also running out of energy. Everybody is scraping for gigawatts of energy, and if you have limited energy, just think about it: until now we just had incremental selling of GPUs, but we are entering an age where the data centers are built out and you have demand, but you don't have any more energy. So you can't open new data centers, or at least it takes years for them to be opened. So what do you do then? You have a GPU, and you can say, okay, if I replace the old GPU with this one, and I say GPU but it's really any accelerator, I get 10x more tokens, or even 5x or 2x more tokens, right?
Even if we are more conservative, at 2x you can serve 100% more compute demand, which you want to do.
Ryan Henderson
Right?
Rihard Jarc
So the problem becomes that you're saying the GPUs are useful for five or six years. I don't think that's true. I think the real number is more like two to maybe three years of useful life. You also have industry experts saying this; the CEO of Groq says one to two years, right? But if this turns out to be true, then the amortization expense should be double what it is today. And what this means is that every company in this space is not accounting for its costs correctly, and the amortization expenses are double. And why is this so important?
Ryan Henderson
The PE ratio should be higher, right?
Rihard Jarc
Yeah, so the P/E ratios are higher, right. And why is this so important? Before, data centers were not that big; they were big as a capex expense, or on the balance sheet, but now they're really big. They become such an important part of the business that they affect the bottom line very, very much. So the topic of what the correct useful life for GPUs, or accelerators, is should be top of mind for investors. Especially for the business models of many of these neoclouds, CoreWeave and others: if you change it from six years to three, they are already making losses, but the losses get even bigger. And especially if you consider the debt deals where GPUs are the collateral, that's even worse. So this could be a problematic area for the whole industry, something systemic, not just surface level.
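To put rough numbers on the amortization point, here is a minimal straight-line depreciation sketch. The $30B capex figure is a hypothetical placeholder, not any company's actual spend; the takeaway is the ratio, since halving the assumed useful life doubles the annual expense:

```python
# Straight-line depreciation of a GPU fleet under different useful-life
# assumptions. The capex figure is hypothetical; only the ratios matter.

def annual_depreciation(capex: float, useful_life_years: float) -> float:
    """Annual straight-line depreciation expense for a given useful life."""
    return capex / useful_life_years

capex = 30e9  # hypothetical $30B of GPU purchases

for life_years in (6, 5, 3, 2):
    expense = annual_depreciation(capex, life_years)
    print(f"{life_years}-year useful life: ${expense / 1e9:.1f}B depreciation per year")

# Moving from a 6-year to a 3-year schedule exactly doubles the expense.
assert annual_depreciation(capex, 3) == 2 * annual_depreciation(capex, 6)
```

On the six-year schedule this hypothetical fleet expenses $5B a year; on a three-year schedule, $10B, which is the doubling described above.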
Brett Schafer
So I want to spend a little bit of time here, because I think this is a very hotly debated topic and it's obviously very important. When I picture what's going on here, the whole value chain, any time you see a big boost in capex at Amazon, or AWS specifically, what I'm picturing is they're standing up a giant warehouse and filling it with a bunch of server racks, and the biggest asset, the biggest cost there, is GPUs. And I think we had a guest on here a while back who said these are the fastest-depreciating assets in human history. So in my mind, on the one hand there's so much innovation going on, which is great, but it's leading to faster depreciation, I would think. If you, as in Nvidia, just improved your product cycle from two years to every six or eight months, then in my head the depreciation schedules should be shrinking, right? Because you've got new ones. But are they able to repurpose the GPUs? Let's say I bought the newest iteration of Nvidia's GPUs today; two years down the road, can I just offload those to a less compute-intensive workload? Is that what they're basically doing?
Rihard Jarc
Yeah. So you're correct in terms of data centers: at current rates, 60 to 70% of the cost is the GPUs, and the remaining 30 to 40% is the data center itself, which is an asset you can depreciate over longer cycles. And yes, there are people making the argument, oh, but they can use the old GPUs for internal workloads. Yes, but that only works for a handful of companies. Google has its own search, Meta has its own social media, and you have Amazon and Microsoft. But a CoreWeave can't repurpose GPUs for internal workloads; they have to sell them on the market, and the same goes for Oracle. Even companies like Google and Meta were only repurposing them because they couldn't get enough of them and they had space for them. But right now, once they fill out their capacity, the opportunity cost becomes too big to ignore. If they can serve 3x or 5x more than with the old GPU, they should be replacing it. And at the same time, you have electricity prices surging because everybody is building data centers, so with a GPU you don't just have capex, you also have the opex cost of electricity. And let's not forget, it's hard to repurpose: every new Nvidia generation needs liquid cooling, so you need data centers built for liquid cooling. Three years from now, people will still be using Blackwell, yes, but they will have to have liquid-cooled data centers, and there are not a lot of those. Most enterprises don't have them; only the hyperscalers do, plus some neoclouds and Bitcoin miners and the like.
So repurposing only works if you also have internal workloads inside your company, if you're basically selling incremental new GPUs, and if you're not already maxed out in terms of space and energy and everything like that. And even when people say, yeah, but they're still using H100s or even A100s: the A100 is two generations from Blackwell. It's not five or six years, it's two generations, so we're talking about three years in terms of product years, because Nvidia has shortened its product cycle. So it doesn't make sense for GPUs to be six-year assets; a six-year useful life would mean that a GPU launched in, I don't know, 2017 or 2018 is still being used right now. You even have a report today from The Information, and let's take this report with a grain of salt, but they cite an internal document showing that Oracle had problems leasing out H100s until OpenAI came and took the compute. So you also have to ask: if there were no OpenAI and Anthropic, would the prices of H100s, a product that's basically one product cycle old, still be this high? Or are these two companies inflating the whole market because they have so much demand and they just want to meet that demand, while also losing money? I think the stat was that for every dollar being spent on ChatGPT, OpenAI is losing $3. So if this ends, you can end up with a lot of compute that should be repriced very differently than it is today, because you have two companies basically driving the whole economy here.
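The power-constrained replacement argument above can be sketched as back-of-envelope arithmetic. Every number below is an illustrative placeholder, not a vendor specification; the only point is the structure of the trade-off: once power, not capital, is the binding constraint, fleet throughput scales with tokens per watt.

```python
# Back-of-envelope: why a power-constrained operator swaps old GPUs out.
# All figures are illustrative placeholders, not real hardware specs.

POWER_BUDGET_W = 50_000_000  # fixed site power budget (50 MW, hypothetical)

def fleet_throughput(tokens_per_watt: float, power_w: int = POWER_BUDGET_W) -> float:
    """Tokens/sec served by a fleet that saturates a fixed power budget."""
    return tokens_per_watt * power_w

old_gen = fleet_throughput(tokens_per_watt=1.0)  # normalized older generation
new_gen = fleet_throughput(tokens_per_watt=5.0)  # assumed 5x generational gain

# With the same power budget, the swap multiplies the tokens you can serve
# without waiting years for a new data center to come online.
print(f"Throughput multiple from the swap: {new_gen / old_gen:.0f}x")
```

Under these assumptions the swap yields a 5x throughput multiple at identical site power, which is why the opportunity cost of keeping older hardware grows once energy is the bottleneck.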
Ryan Henderson
I think, again, Alphabet seems to come out ahead even though they're not going to "win," quote-unquote, if demand collapses, as could potentially happen. But that's a great segue to the growing elephant in the room, even though their revenue is significantly smaller than everyone else's here: they dominate the news cycle and they dominate the spending projections. It is OpenAI. If you look at the business plan that they've released, or that leaked out there, they're going to be cumulatively burning $100 billion, if not more, through 2030. But they seem to have the financing to back them; they have a trillion dollars' worth of commitments across all these different companies. It seems like they went from just using Azure to, okay, we're going to try to use every company possible to build out not only our own compute but also cloud partners. I say all this because I want to ask: what do you think OpenAI's business looks like in 2030? What could happen here? What are the potential outcomes?
Sponsor / Advertisement Voice
All right folks, before we move on, we need to tell you where we get our financial data: Fiscal AI. Fiscal AI is the complete stock research platform for fundamental investors. I use the platform pretty much every single day. You'll see the charts on our podcast and you'll see it in our newsletter. This is our one-stop shop for stock research. They've got up to 20 years of financial data on all companies globally, including company-specific segment and KPI data. That means Amazon's AWS revenue, SoFi's total members, Google's paid clicks growth, and literally millions more data points. They've also got earnings call transcripts, ownership data, company-specific research reports, and much more. If you want complete financial data at your fingertips, then you need to check out Fiscal AI. And if you use our link, Fiscal AI chitchat, you will get 15% off any paid plan. Again, that is Fiscal AI chitchat. The link will be in the show notes.
Rihard Jarc
Yeah, I mean, first of all, OpenAI definitely is the Google verb for LLMs. It has established a brand, which helps them a lot. But at this point, and you can see it from just the last few months, they're trying to juice everything up to boost engagement and keep the user growth there. Why are they doing it? Naturally, because they will have to raise hundreds of billions, if not trillions, of dollars to meet the purchase intent letters they've sent out to all of these companies. Even Altman, if you listen to him, says yes, we're in a bubble, but at the same time we've got something real here; this is from Ben Thompson's blog. And you can sense he's thinking: if capital markets, whether venture or public, are going to continue to give us money, we're going to take it, because the more money you have, the better your outcome is going to be, whether it's a bubble or not. And you have to think about Microsoft, which was the first big backer of OpenAI; we talked about this before, right? And they have it in their agreement: Microsoft can always match OpenAI's requests from others. So OpenAI says, Oracle has offered me this compute for this price, are you willing to match it? If Microsoft says yes, OpenAI has to go with Azure. But Microsoft is saying no. And Microsoft, of anybody, even has access to OpenAI's IP. Satya has proven over the years that he's very smart; it's not like he plays it super safe. He's trying to get market share, he's trying to make Google dance and everything. So if you think about it, why is Microsoft declining all of this compute that it could be serving? It's not compute that exists right now; it's compute that's going to be built in three to five years, so Microsoft could do it as well.
And I just can't get past the feeling that Microsoft is trying to hedge, because we are entering numbers that don't make sense anymore. OpenAI has said, yeah, please be patient, we're going to come out with new deals. My bet is that they're also going to come out with some electricity deals, whether it's an SMR, so nuclear, or a gas plant or whatever, because they need the power. But still, the capital markets hold the button on all of this. If we get a round from OpenAI where they can't raise what they wanted to raise, the market will start panicking, in my view, because then you suddenly have these two companies that can't fill all of the purchase orders they sent out, which torpedoes all of the expectations for Oracle, whose stock price jumped 40% when the letter was sent, for AMD, for Nvidia, for everyone. Everybody is connected to the fate of these two companies. And if I go back now to what OpenAI will be in 2030: I have no idea, because it can end really badly, or we can end up in some AI god mode or whatever. It's definitely something we haven't seen ever. And even though I'm a technologist at heart, the numbers just don't make sense to me at this point anymore, and that's why I'm more cautious.
Brett Schafer
Yeah, I think that's a really detailed view. It sounds like there's certainly, at a minimum, concentration risk in all this spending. I wanted to talk about Apple, but A, I don't know if there's that much to talk about, and B, we're bumping up on time here, so we want to ask you our final question. If you had to pick one today, and I know this is kind of a tough question because I'm pretty sure you own a couple, which company do you think will be the biggest winner of this AI race by 2030, of the ones we talked about today?
Rihard Jarc
By 2030, okay, are we talking about which company is going to have the biggest market share or are we looking at an investment perspective as well?
Ryan Henderson
So let's look at your favorite investments.
Rihard Jarc
Okay.
Ryan Henderson
Today. You don't have to rank them, but what do you like and what do you dislike?
Rihard Jarc
You know, valuation does matter at this point. My number one is Google, and the reason is what we talked about: it has the full stack. And I think the TPUs will be Google's most important asset to date, and they have a lot of assets, so that's saying a lot. They are already showing us that they can serve a lot of inference at scale, which nobody else can, and they don't have to raise trillions. I think that will be really important, because I think we will come to a point, which OpenAI has already hinted at, where OpenAI will have to raise prices. And at that point, Google can come into the game and say, okay, we're going to undercut that, or we're going to keep it free, or whatever. And if the models are similar in performance and you have to pay for one but not the other, that's a big benefit. Also, by 2030 all of these companies will have to transition at least partially to the ad business model, and besides Meta, the only company that's really good with ads is Google. They have the relationships with advertisers, they know the surface, and then you're in their ballpark, you're playing their game. And also GCP: I think you're going to see GCP win a lot of deals, because TPUs will be priced cheaper and people will not be so reluctant to look beyond Nvidia. Right now, from multiple former-employee interviews and things like that, the notion is that clients are saying, I want Nvidia, because nobody gets fired for choosing Nvidia. But if that's too pricey and you have an alternative, and as the market shifts from training to inference being the bigger portion of the pie...
The costs become really important, and then the TPU can become a really important asset, and people will say, okay, for inference, let's also use the TPUs, which are a lot cheaper. So I think the moment for Google is pretty much here. And it's not yet priced in, because of the risk of AI search disruption. But even with such disruption, we have already figured out that the AI market, the LLM market, has a TAM that's a lot bigger than just search, because it's not just information retrieval; it's agents, agentic use cases, all of the other stuff. And if OpenAI is able to raise trillions, and they're probably going to be valued at, what, 1 to 2 trillion, then you can't say DeepMind isn't worth at least half a trillion or even more. And then you have the scenario of a full replacement of search. But again, search is still the backbone of many of these LLMs, so it might not be the front-end surface for users, but it will still be very important as the back end.
Ryan Henderson
And not to mention all the other AI stuff that Google DeepMind's working on, which includes Waymo, but there's plenty of other things, medical, biotech, what have you. And it comes back to, again, I think this is the biggest theme from this episode is the infrastructure advantage.
Rihard Jarc
Yes. Yeah, I agree, I agree.
Ryan Henderson
Rihard, thank you for joining this episode. Once again, for the listeners that have listened to this episode and enjoyed your thoughts: they should definitely go over to Uncover Alpha. What's your 30-second elevator pitch for what you do over at that newsletter?
Rihard Jarc
So I basically do deep dives into these companies and into specific sub-segments, like TPUs and things like that. I also focus on alternative data sources, so I try to source as much as I can: former-employee interviews, job-postings data, stuff like that. The goal is to be transparent and base everything on as much data as possible, so it's not just my opinion; it's opinion, but backed by a lot of insights from all of these sources.
Ryan Henderson
All right, beautiful, thank you. Once again, let's hit the disclosure and get out of here. We are not financial advisors. Anything we say on the show is not formal advice or a recommendation. Ryan, I, or any podcast guest may hold securities discussed in this podcast, may have held them in the past, and may buy, sell, or hold them in the future. Thank you to the listeners for tuning into this episode. We'll have more fun stuff coming out in the future, and we'll see you next time.
Date: October 29, 2025
Hosts: Ryan Henderson, Brett Schafer
Guest: Rihard Jarc, Technology Investor & Founder of Uncover Alpha
In this episode, Ryan and Brett host returning guest Rihard Jarc, a technology investor and AI startup founder, to dissect the state of competition among "big tech" companies in the modern AI arms race. The discussion ranges from Amazon’s calculated moves to Google’s stacked advantage, Meta’s monetization strategies, and the financial dynamics fueling Nvidia and OpenAI. Jarc brings a critical, data-driven perspective—often referencing insider data and structural market risks—that leaves listeners with a nuanced view of which companies are best positioned for sustainable success in AI.
Amazon & Anthropic's AI Partnership [02:21–11:04]
[11:04–13:34]
Google’s Full-Stack AI & TPUs [10:14–16:42]
Google’s Unique Structure:
Consumer Momentum: Recently, Gemini’s iOS app topped download charts (even over ChatGPT), and Google’s AI products serve “quadrillions” of tokens—a testament to scale.
Meta’s Monetization Path [19:26–27:54]
Advertising Still King: AI’s biggest role remains ad targeting—improving creative content generation for advertisers, especially smaller businesses, and boosting CPMs (cost per thousand impressions).
Messaging Apps/e-commerce: “Monetizing WhatsApp” via e-commerce and direct sales integration (Meta taking a cut from transactions, not just ad revenue).
Customer Service Disruption: Generative AI can automate customer service, a $0.5 trillion market, providing Meta with new sales streams.
Smart Glasses & Voice AI: Meta's AR devices stand to benefit from advanced AI navigation and assistance in real-world contexts.
Quote: “With AI you can replace all of that industry or at least most of it. And I think that's a big lift for Meta and brings new revenue which is like more sales revenue again.” (22:16)
Microsoft, Azure & Neo Cloud Competition [27:54–35:07]
Nvidia, Circular Deals & GPU Depreciation [36:01–47:21]
Circular Commitments: Nvidia funds AI startups (e.g., OpenAI) who then use that money to buy more Nvidia chips.
Late-Stage Signs? Such creative financing signals a possibly overheated, late-stage market.
Rapid Depreciation: Fast product cycles mean GPUs are usable (at top value) for only two to three years vs. the six years cited in company filings—raising questions about capex, amortization schedules, and true profitability.
Quote: “If this is true, if this turns out to be true, then the amortization expense should be double of what it is today. And what this means is that every company that is in this space is not accounting their cost correctly...” (44:00)
OpenAI’s Capital Burn & Systemic Risk [51:51–57:56]
Burning Billions: OpenAI plans to burn $100B+ through 2030, seeking trillions in future commitments.
Brand Power: OpenAI is the “Google verb for LLMs.”
Financing House of Cards? If capital dries up, it could trigger a domino effect impacting Nvidia, AMD, Oracle, and the whole supply chain.
Microsoft’s Risk Management: Microsoft has the right to match external offers to OpenAI but is hesitating at these capital levels—perhaps sensing unsustainable overextension.
Quote: “...the numbers just don't make sense at this point anymore for me. And that's why I'm more cautious.” (57:00)
| Segment | Timestamps |
|-----------------------------------------------|-------------|
| Amazon & Anthropic's AI Partnership | 02:21–11:04 |
| Strategic AI Ownership Models | 07:35–11:04 |
| Google’s Full-Stack AI & TPUs | 11:04–16:42 |
| Meta’s Monetization Path | 19:26–27:54 |
| Microsoft, Azure & Neo Cloud Competition | 27:54–35:07 |
| Nvidia, Circular Deals & GPU Depreciation | 36:01–47:21 |
| OpenAI’s Capital Burn & Systemic Risk | 51:51–57:56 |
| Rihard's #1 Pick for the Next Decade (Google) | 58:35–62:30 |
Rihard's #1 Pick for the Next Decade (Google) [58:35–62:30]
Favorite: Google (Alphabet)
Jarc’s summary:
Rihard Jarc writes "Uncover Alpha," a Substack focused on deep dives into AI tech companies, sub-segments like TPUs, and leverages alternative data such as employment records, interviews, and non-standard datasets.
This summary skips all ads and non-content segments. For anyone seeking a data-driven, skeptical, and insightful take on who’s winning the AI race, this episode is a must-listen.