Top Traders Unplugged – Ideas Lab Ep. IL44: From AI Hype to Transformative AGI ft. Aubrie Pagano
Host: Kevin Coldiron (standing in for Niels Kaastrup-Larsen)
Guest: Aubrie Pagano, General Partner at Alpaca VC
Date: December 31, 2025
Episode Overview
This episode explores the looming transition from the current state of artificial intelligence (AI) hype towards the more profound societal and economic shifts that could result from artificial general intelligence (AGI). Aubrie Pagano, a venture capitalist with a background in entrepreneurship and cross-disciplinary research, shares key findings from her recent white paper. The discussion covers blockers that must be overcome before AGI can be fully integrated into the economy, likely investment opportunities, and longer-term speculative visions of what an “Aquarius economy” could look like.
Guest Introduction & Background (03:54)
- Aubrie Pagano’s Background
- Founder and former CEO of Bow & Drape, a customizable apparel brand.
- Experience in real-world operations: supply chain, manufacturing, retail.
- Early-stage VC focused on foundational industries (supply chain, manufacturing, energy).
- Not an AI “insider” but needs to understand AI’s impact as an investor and entrepreneur.
“I’m obviously now in venture, seeing the front lines of all of this and really started to think about, okay, how do we, how do we think about this for culture...we’re investing in the real world and how it runs.” (05:27 – Aubrie)
Main Discussion Points
1. The Last Economic Cycle Built on Labor Scarcity (07:21)
- Key Insight:
AGI’s core promise is to make labor essentially “infinitely accessible,” thus eliminating labor scarcity.
- Previous cycles transformed the nature of work but didn’t make labor costless; technology was a tool, not a replacement.
- The reality on the ground is a messy mix of overblown hype and slow, real-world adoption.
“We know there’s a lot of change right now, and we know that the facts on the ground are astonishing and contradictory and kind of incoherent and really existential.” (07:33 – Aubrie)
- Investment Angle:
The blockers to abundant, cheap labor via AI are where major near-term opportunities lie.
2. Blockers on the Path to AGI
A. Energy Abundance (12:58)
- AI compute and data centers require massive, stable, new sources of power.
- Current infrastructure is inadequate—energy is the primary scaling bottleneck.
- Opportunities: nuclear and geothermal energy, grid resiliency, demand response, and compute optimization.
“By 2030 we need over 150 gigawatts of new power, which is like double California’s entire grid.” (07:58 – Aubrie)
“Goldman Sachs has said that data centers will require 50% more global power within the next three years...” (13:41 – Aubrie)
Notable Moment:
- Discussion of outlandish solutions like “power stations on the moon” (16:39-16:52).
“Part of me thinks that’s like billionaires realizing that we’re in a potentially post-growth environment...people are going to throw spaghetti at a wall to figure out the fastest, quickest way to get there.” (16:52 – Aubrie)
B. Foundational Industry Resilience (17:44)
- Real-world automation is much harder than demos suggest (Moravec’s Paradox).
- Major sectors—manufacturing, agriculture, construction—lack reliable data infrastructure.
- Skilled/trade labor is in short supply and not easily automated (plumbers, electricians, etc.).
“Manufacturing, 65% of manufacturers don’t have usable data. In nurseries...they still take inventory on paper.” (21:23 – Aubrie)
- Investment Angle:
- Skilled trade enablement, education, and upskilling.
- Solutions for data normalization and supply chain visibility.
C. Agent and Human Coordination (25:32)
- Next layer of complexity: getting various AI agents and humans to interoperate smoothly.
- Major issues: lack of trust, emotional resonance, poor interoperability (“walled gardens”).
- Lack of a “USB-C for AI”—systems do not communicate or remember across platforms.
“If anybody listening has played around with this stuff...there’s a lot of trust issues, there’s a lot of emotional resonance issues.” (25:40 – Aubrie)
“As you’re trying to build with these tools more sophisticated workflows and actually trying to automate away real tasks, you run into this clunkiness...” (26:31 – Aubrie)
- Some early-stage companies are experimenting with “agent marketplaces” (e.g., Dialectica, Yearp) to pit AI agents against each other and vet their outputs.
“It becomes that much more incumbent to basically have like the USB-C for AI and that just doesn’t really exist yet.” (30:18 – Aubrie)
3. Speculative Section: The Aquarius Economy (33:29)
- Why Speculate?
The exact timeline to real AGI impact is unknowable; using narrative and sci-fi tools can help us imagine and prepare for vastly different futures.
“Let’s write about it in a kind of narrative format. Let’s write about it kind of like a sci-fi book...if you think about it that way, one, it’s a little less terrifying because it seems sci-fi.” (36:45 – Aubrie)
The Techno Core and Hegemony (39:34)
- Techno Core:
Centralized, digital AGI “superstructure”—effectively the technical overlords running AGI in society.
- Hegemony:
Traditional centers of institutional human power—elite families, corporations, and political actors.
“The Techno Core is kind of like the digital superstructure that’s running AGI...The Hegemony is basically like institutionalized humans.” (39:34 & 40:08 – Aubrie)
Three Outlier Groups (42:10)
- No Mans:
- Off-the-grid, “modern-day hippies”—reject digital/AGI mainstream. Value human connection, serendipity, independence.
- Gurus:
- Artists, healers, athletes—humans with exceptional relational or creative force; their work is valued for its authenticity and non-AI origins.
- Incels:
- Used here to describe people who are spiritually and socially disconnected: “victims” of Techno Core dominance and the isolating effects of a deeply digital (“AI-native”) society.
“We think that’ll cause a lot of peril and some implications that will hopefully create opportunities too to help.” (47:43 – Aubrie)
4. Investment Opportunities in a Changing Social Order (49:21)
- Using the Aquarius framework to spot opportunities—for example:
- Therapy models mixing human + AI
- Physical/social “third places” for in-person reconnection
- Sensory gyms for tactile development
- Whisper networks: monetizing and safeguarding the value of private, serendipitous human relationships as AI commoditizes more of life (see 51:46)
“As we increasingly have these digital lives...AI can’t capture [certain whispers]—they’re networks of people, relationships, connections that happen through serendipity, that happen through human contact.” (51:50 – Aubrie)
5. AI “Slop,” Platform Insidification, and User Experience (57:27)
- Challenges:
- Increasing “AI slop”—low-value, auto-generated content saturating social networks.
- Concerns that as platforms seek to monetize, user experience may degrade (as seen with Google and Facebook).
- Debate over whether current ChatGPT-level tech is the “worst it’ll ever be,” or as good as it gets for users.
“Assume this is the worst AI you will ever use…But is it really going to be the worst end user experience? Because we’ve seen this over and over again with technology that’s great at first...but just becomes extractive.” (55:27 & 57:08 – Kevin)
“The insidification of these platforms is almost like happening by the users itself…the slop wading I think is creating a real moment.” (59:36 – Aubrie)
Notable Quotes & Memorable Moments
- On AGI’s real timeline:
“We actually don’t think…it’s around the corner. I think we might be a capital cycle at least away from this.” (09:10 – Aubrie)
- On the foundational industry data problem:
“In manufacturing, 65% don’t have usable data. In ag, nurseries still take inventory on paper.” (21:23 – Aubrie)
- On agent interoperability:
“As you’re trying to build with these tools more sophisticated workflows…you run into this clunkiness that we call coordination…” (26:31 – Aubrie)
- On speculative forecasting:
“Let’s write about it kind of like a sci-fi book—imagine sometime in the future, AGI is fully achieved…It allows for better extrapolation because you’re distancing yourself from any current biases...” (36:45–37:30 – Aubrie)
- On whisper networks:
“We call these whisper networks…networks of people, relationships, connections that happen through serendipity, recommendation, that happen through human to human contact that’s very hard for AI to infiltrate.” (51:46 – Aubrie)
Timestamps by Segment
- 03:54 – Aubrie’s background, context for perspective
- 07:21 – Why AGI is the “last cycle” built on labor scarcity
- 12:58 – Blocker 1: Energy Abundance
- 17:44 – Blocker 2: Foundational Industry Resilience
- 25:32 – Blocker 3: Agent and Human Coordination
- 33:29 – Shifting to speculative thinking: the Aquarius Economy
- 39:34 – Defining The Techno Core and Hegemony
- 42:10 – Outlier social groups in the AGI future
- 49:21 – Turning far-future visions into present/future investment theses
- 51:46 – Whisper networks and the future of human connection
- 55:27 – Platform insidification, “AI slop,” and user experience
Final Takeaways
- We’re not on the doorstep of fully transformative AGI—the real economic disruption may be a capital cycle or two away.
- Near-term investment opportunities lie in solving obstacles (“blockers”) like energy supply, foundational industry data, and cross-agent coordination.
- Major social effects—including the human need for connectedness, authenticity, and new forms of agency—will drive both risk and investment opportunity in the longer run.
- A key practical insight: The real edge may lie in understanding how technology and culture shape each other, rather than betting solely on technical breakthroughs.
“If you believe this is our future end state, these opportunities are reflective of that future...within this framework you can start to build a language around that and think about...interesting ways that relationships in society and work will really change.” (51:18 – Aubrie)
