ProductLed Podcast: The GPU Gold Rush — How Vast.ai Scaled With AI Demand
Host: Wes Bush
Guests: Travis Cannell (COO & First Employee, Vast.ai), Esben Friis-Jensen (Entrepreneur in Residence at ProductLed; Co-founder, UserFlow and Cobalt)
Date: March 27, 2026
Episode Overview
In this episode, Wes Bush dives into the meteoric rise of Vast.ai in the context of the current AI "gold rush," exploring how the company is scaling to meet the unprecedented demand for GPU compute. Travis Cannell shares insights on the company’s two-sided marketplace, rapid growth, customer-centric focus, and strategic decisions behind product and organizational scaling. The conversation also explores industry trends, network effects, and the impact of AI on hiring and future business models.
Key Discussion Points
The Vast.ai Growth Explosion
- Surge in Signups
- Vast.ai recorded 19,000 signups in February alone, representing a 27x YOY growth (01:41).
- Demand is accelerating month-over-month as generative AI and LLMs become mainstream.
- Market Drivers
- The primary growth is driven by inference workloads — more users want to run and interact with AI models, not just train them (02:34, 06:23).
- Growing interest from international markets, with APAC and Europe being leading sources of users (07:20).
- Shift in User Base
- Not only highly technical builders but also a growing class of "editors" can now leverage AI via easy interfaces, vastly expanding the total addressable market (08:49).
What is "Inference"? (Explanation, [04:00])
- Definition
- Inference is running a trained AI model to generate outputs (responses, images, code).
- Distinct from the expensive process of training, inference is “the model’s life” — whenever a user interacts with a model, inference is happening on a GPU.
- Highlight: “When you're talking to this model, every time you prompt a model, it is running on a GPU somewhere. … Inference is life.” — Travis Cannell (05:39)
- Technical Notes
- Models must be loaded into GPU RAM to process input.
- Optimization and "quantization" are active research areas to run models on smaller GPUs (05:13).
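The memory constraint above can be made concrete with a back-of-the-envelope calculation (the formula and numbers below are illustrative assumptions, not figures from the episode): a model's weights must fit in GPU RAM, and quantization shrinks the bytes needed per parameter, which is why it lets models run on smaller GPUs.

```python
# Rough estimate of the GPU memory needed to serve a model's weights.
# Assumption: memory ~ parameters x bytes-per-parameter, plus ~20%
# overhead for activations and KV cache (a common rule of thumb).

def weights_gib(params_billion: float, bits_per_param: int,
                overhead: float = 0.2) -> float:
    """Approximate GPU memory (GiB) to hold a model at a given precision."""
    bytes_total = params_billion * 1e9 * (bits_per_param / 8)
    return bytes_total * (1 + overhead) / 2**30

# A hypothetical 7B-parameter model at different quantization levels:
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: ~{weights_gib(7, bits):.1f} GiB")
# 16-bit needs a ~16 GiB card; 4-bit fits comfortably on a ~4-6 GiB card.
```

This is why quantizing from 16-bit down to 4-bit roughly quarters the memory footprint, opening up a much larger pool of consumer-grade GPUs for inference.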
Why AI Teams Choose Vast.ai Over AWS or Lambda ([10:05])
- Cost Efficiency
- “We’re trying to bill ourselves as the most efficient solution for inference.” — Travis Cannell (10:05)
- Users access a competitive marketplace of independent GPU suppliers who set their own prices, keeping the cost down.
- Marketplace Model
- Vast.ai doesn’t own equipment; instead, it connects buyers and sellers of GPU compute, much like “Airbnb for GPUs” (10:50).
- Suppliers retain control over listing terms, duration, and pricing, facilitating liquidity and flexibility in the GPU market (11:20).
Building and Balancing a Two-Sided Marketplace
- Supply vs. Demand
- Initial challenge: Keeping up with explosive demand, especially as users expect “AWS-level” support at low costs (13:04).
- Support team must handle everything from basic tutorials to high-scale enterprise workloads (15:10).
- Supplier Management
- Suppliers (many small entrepreneurial operators) list GPUs via the platform, avoiding operational headaches like customer support (11:41).
- Marketplace signals (price/availability) guide suppliers to add or reallocate compute supply (13:21).
- Retention is high; hosts reinvest as their earnings grow.
Defending Against Competition
- Beyond Price
- “Bring it on, let's compete. ... Price is important, but everything else ... is very important as well. The customer service, the platform experience, the way the templates work.” — Travis Cannell (17:20)
- Emphasis on full stack product experience, responsive support, and robust software.
- Transparency and Market Data
- Upcoming: Publishing 30, 90, and 180-day pricing histories to provide transparency and industry insight for different GPU types (17:50).
The Economics of GPU Rental ([18:36])
- Key Cost Drivers
- Power costs are significant; location matters (cheap energy = higher margins).
- CapEx (equipment purchase), depreciation/tax advantages, and market timing all factor into profitability.
- Market Surprises
- The transition from crypto mining to AI inference caused unexpected shifts in GPU pricing and rental rates (20:55).
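The cost drivers above can be sketched as a toy profitability model for a single hosted card (every number here is hypothetical, chosen only to illustrate how rental rate, power cost, and CapEx depreciation interact; none come from the episode):

```python
# Toy monthly-margin model for one rented GPU (all figures hypothetical).

def monthly_margin(rental_rate_hr: float, utilization: float,
                   power_kw: float, power_cost_kwh: float,
                   capex: float, depreciation_months: int) -> float:
    """Revenue minus power and straight-line depreciation, per month."""
    hours = 730  # average hours in a month
    revenue = rental_rate_hr * utilization * hours
    # Simplification: assume full power draw even when the card sits idle.
    power = power_kw * hours * power_cost_kwh
    depreciation = capex / depreciation_months
    return revenue - power - depreciation

# Example: $0.40/hr rental, 70% utilized, 0.35 kW draw, $0.08/kWh power,
# a $1,600 card depreciated straight-line over 36 months.
m = monthly_margin(0.40, 0.70, 0.35, 0.08, 1600, 36)
print(f"${m:.2f}/month")
```

Even in this crude sketch, the levers the episode highlights fall out directly: halving the power price barely moves the margin at this wattage, while utilization and the rental rate (both of which swung sharply in the crypto-to-AI transition) dominate profitability.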
Market Trends & Future Demand
- Unquenchable Demand
- As new capabilities emerge, AI use cases multiply—reflected in Travis’s analogy of “digitally recreating living things” and pushing towards digital experiences (23:51).
- Cites speculative future where the “world could become a computer” or simulated reality à la Philip K. Dick (24:00).
Network Effects & Marketplace Moats ([25:02])
- Flywheel Dynamics
- Strong consumer pull attracts more supply; robust supply attracts more consumers, strengthening the marketplace moat.
- Vast.ai becomes the “go-to” for both sides: “...if you're a supplier, you even want to put your spare capacity on Vast rather than try to recreate that whole ecosystem...” (27:20).
- Data as a Moat
- The platform’s pricing feeds are widely referenced, and Vast.ai plans to further leverage this transparency to strengthen industry authority.
- Planned Features
- Bidding system, more granular contract types, and advanced marketplace mechanics are in development for even more liquidity and price discovery (30:58).
Organizational Growth & Culture
- Leadership & Structure
- Hands-on, in-office leadership with the CEO and COO still contributing individually (31:48).
- Adopted a five-day in-person office policy to foster collaboration during rapid growth; opened offices in both SF and LA (32:28).
- Remote-First with a Twist
- Previously maintained an ongoing virtual "open office" (Google Meet) for transparency, motivation, and ease of interaction (33:33).
- Hiring in the Age of AI
- Paused hiring to reassess needs.
- Shift in skill requirements: From traditional coding to managing agents, infrastructure, and technical design management.
- “Being able to write great code is no longer a big differentiator. ... [The future is] more management and systems-level thinking.” — Travis Cannell (36:11)
Advice for Founders in the AI Era ([38:14])
- Build for Agents
- Look at “the agent economy”—products and services that facilitate or plug into autonomous agents (38:45).
- Execution trumps mere ideation.
- Inference Will Explode
- The proliferation of inference workloads is just beginning; future use cases will expand as costs fall and accessibility improves: “Inference is life.” — Travis Cannell (39:32)
Notable Quotes & Moments
- “We're having, like, a ChatGPT-3 moment, but it's bigger.” — Travis Cannell, opening thoughts on the scale of the current AI wave (00:00)
- “Inference is life.” — Travis Cannell on the centrality of inference workloads (05:39)
- “You’re basically the Airbnb for renting GPUs.” — Wes Bush (10:50)
- “Bring it on, let's compete. ... Price is important, but everything else ... is very important as well.” — Travis Cannell (17:20)
- “Being able to write great code is no longer a big differentiator.” — Travis Cannell on shifting hiring priorities in an AI-driven world (36:13)
- “Nothing new... but some things that we touched on during this kind of hour would be the agent economy and agents and building things for agents... Inference, it's going to explode. I think it's life.” — Travis Cannell’s concluding advice (38:45, 39:32)
Segment Timestamps
- [00:00] — Opening thoughts on the scale of AI demand
- [01:33] — Vast.ai’s explosive growth numbers
- [04:00] — What is inference? High-level explanation
- [06:23] — When did hypergrowth start for Vast.ai?
- [08:49] — How user base is shifting from builders to editors/prompters
- [10:05] — Why AI teams select Vast.ai over big cloud providers
- [13:04] — Balancing supply & demand in a two-sided marketplace
- [17:20] — Defending product moats beyond pricing
- [18:36] — The economics of GPU rental: power, CapEx, and unpredictability
- [23:51] — The ever-increasing demand for compute
- [25:02] — Network effects and strengths of Vast.ai’s marketplace
- [31:48] — Organizational scaling, office culture, and hiring philosophy
- [36:11] — The impact of AI on roles and recruitment
- [38:14] — Advice for founders: agent economy & inference explosion
Where to Follow Vast.ai & Travis Cannell
- Vast.ai Website: vast.ai
- X (Twitter): @vast_ai, @TravisCannell
- LinkedIn: Travis Cannell
Final Note:
This episode is a must-listen for anyone interested in the infrastructural backbone of AI, the intricacies of building a two-sided growth marketplace, and the strategic thinking required to ride (and shape) the next wave of digital transformation.
