Practical AI Podcast – “Beyond Note-Taking with Fireflies”
Date: November 19, 2025
Hosts: Daniel Whitenack (Prediction Guard CEO), Chris Benson (Lockheed Martin Principal AI Research Engineer)
Guest: Krish Ramineni (Co-founder & CEO, Fireflies AI)
Episode Overview
This episode delves into the evolution, technical challenges, and future of Fireflies AI—a leading AI-powered meeting assistant. Krish Ramineni shares Fireflies’ origin story, how the team navigated a rapidly advancing AI landscape, and the paradigm shift from simple transcription to real-time, in-meeting “Live Assist” functionalities. The conversation is both a founder’s journey and a primer on what it really takes to turn bleeding-edge research into accessible, practical, and scalable AI products.
Key Discussion Points & Insights
1. The Genesis of Fireflies: From AI Wishful Thinking to “AI Note-Taker”
[02:56]
- In 2016–2017, before mainstream LLMs, Krish and his co-founder set out to build a "general AI secretary," relying on pre-LLM NLP tooling such as BERT alongside hand-crafted approaches.
- Initial attempts involved human-in-the-loop workflows—literally being the product for ten friends, validating the need for automated note-taking.
- Realization: human-in-the-loop does not scale (it could support maybe 5,000–10,000 people and was operations-heavy), so the focus shifted to full automation.
- Market validation came before code: "Before we would write code for six months, we would ship something, and no one would want to use it... This is the first time where before we built anything or wrote a line of code, we validated the market." — Krish [05:04]
2. Building the AI Note-Taker Category & Scaling Up
[05:43]
- Early Fireflies let users simply record and search meetings: basic and clunky, but it solved real problems such as limited cloud recording storage on Zoom/Meet.
- Transcription was expensive and inaccurate, but "good enough" as long as users could search transcripts for keywords (a minimal sketch of this kind of search follows this list).
- The team gradually layered in more value: automated transcription, action-item detection, and integrations.
- The product officially launched in 2020, pre-ChatGPT, but user growth accelerated during COVID-19 as virtual meetings exploded.
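To make that early value proposition concrete, here is a minimal, hypothetical sketch of keyword search over stored transcript segments. The data model and function names are illustrative assumptions, not Fireflies' actual code or API:

```python
# Hypothetical sketch: keyword search over stored meeting transcripts.
# Segment and search_transcripts are illustrative names, not Fireflies' API.
from dataclasses import dataclass

@dataclass
class Segment:
    meeting_id: str
    start_sec: float
    text: str

def search_transcripts(segments: list[Segment], keyword: str) -> list[Segment]:
    """Return transcript segments containing the keyword (case-insensitive)."""
    needle = keyword.lower()
    return [s for s in segments if needle in s.text.lower()]

segments = [
    Segment("weekly-sync", 12.5, "Let's revisit the pricing proposal next week."),
    Segment("weekly-sync", 98.0, "Action item: send the pricing deck to the client."),
]
for hit in search_transcripts(segments, "pricing"):
    print(f"[{hit.meeting_id} @ {hit.start_sec:.0f}s] {hit.text}")
```

Even a naive lookup like this captures why imperfect transcripts were still useful: jumping to the right minute of a recording beats re-listening to the whole meeting.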
3. LLMs as an Inflection Point
[08:30]
- In 2022, Fireflies got early access to OpenAI’s GPT-3.5 through connections with Vinod Khosla.
- “It blew our mind. It changed the complete technology. And then... it's been an absolute rocket ship. We've never looked back." — Krish [09:17]
- Revenue scaled from seven to eight figures, and the company broke new ground in the "AI note-taker" space.
4. Lessons in Product-Market Fit & Technical Evolution
[11:00]
- Lesson: Solve for customer pain, not just feasibility. “We said, let’s forget what is even theoretically possible right now, let’s figure out what customers want and we’ll work backwards.”
- Massive engineering challenges: cloud streaming and transcription were expensive and unreliable, so Fireflies built and managed its own infrastructure for cost and scale; it now runs more traffic on its own infrastructure than via AWS/Google Cloud.
- “It was a miracle” — profitably scaling to millions of users while still charging only $10/month per user.
5. Technical Deep Dive: From Hand-Crafted Rules to Plug-and-Play AI
[19:00]
- Early speech-to-text required heavy “massaging” (speaker ID, fixing grammar, removing filler words).
- Now, with models like Whisper, the "base 80%" of note-taking is easy; the real value is in the remaining 15–20% (enterprise features, customizations, deep integrations). A minimal off-the-shelf sketch follows the quote below.
- “If a person was doing something today off the shelf, you don’t have to do all of that. You can get to 80% pretty solid just using off the shelf parts... But the other 15% actually takes a long time. That differentiates good versus great.” — Krish [21:30]
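To illustrate the "off-the-shelf 80%" claim, here is a minimal sketch that transcribes a recording with the open-source Whisper model and applies a naive filler-word cleanup. The file name, model size, and filler list are assumptions for illustration; this is not Fireflies' production pipeline, and the harder 15–20% (speaker ID, enterprise integrations, compliance) is exactly what it leaves out:

```python
# Minimal sketch of the "off-the-shelf 80%": transcribe a meeting recording
# with open-source Whisper, then apply a naive filler-word cleanup.
# Assumes `pip install openai-whisper` and a local file meeting.mp3;
# illustrative only, not Fireflies' pipeline.
import re
import whisper

FILLERS = re.compile(r"\b(um+|uh+|you know|like),?\s*", flags=re.IGNORECASE)

model = whisper.load_model("base")          # small, CPU-friendly checkpoint
result = model.transcribe("meeting.mp3")    # returns text plus timed segments

for seg in result["segments"]:
    clean = FILLERS.sub("", seg["text"]).strip()
    print(f"[{seg['start']:7.1f}s] {clean}")
```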
On distribution as a barrier to entry:
- Fireflies’ scale and early mover advantage make it hard for late entrants to stand out—deep integrations, compliance, and feature diversity (“10, 20 different features or enhancements every week... it compounds.”)
6. Real-Time Functionality – The “Live Assist” Leap
[34:05]
- Fireflies’ next step: from post-call recaps to “Live Assist”—a feature providing real-time suggestions and knowledge in meetings.
- Live Assist capabilities:
- Meeting prep: context, previous conversations pre-loaded.
- In-call cues and context-aware info (e.g., sales coaching, candidate interview reminders).
- “Catch me up” button for missed segments.
- Real-time transcripts and notes as the meeting unfolds (a rough sketch of this kind of loop follows the quote below).
- Partnership with Perplexity to bring relevant web knowledge into meetings.
- Available via a new desktop app: no meeting bot needed, and it works anywhere meetings happen (Zoom, Teams, Slack Huddles, Discord, in person via mobile, etc.).
“Fireflies will give me detailed meeting prep before we even get into the conversation... and then during a meeting, giving me cues and live suggestions while we're talking about different topics... It's like having someone that serves as autocomplete for your meetings." — Krish [34:33]
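As a rough illustration of what a Live Assist-style loop involves, the sketch below consumes a simulated streaming transcript and surfaces a cue after each chunk. The streaming source and suggestion function are placeholders (a real system would plug in a streaming speech-to-text service and an LLM call); nothing here reflects Fireflies' actual implementation:

```python
# Hypothetical sketch of a "Live Assist"-style loop: consume a streaming
# transcript and surface context-aware suggestions as the meeting unfolds.
# stream_transcript_chunks and suggest_next_action are placeholders,
# NOT Fireflies' API.
import asyncio

async def stream_transcript_chunks():
    """Stand-in for a real-time speech-to-text stream."""
    for chunk in ["We should loop legal in", "before signing the renewal."]:
        await asyncio.sleep(1.0)     # simulate audio arriving in real time
        yield chunk

def suggest_next_action(context: str) -> str:
    """Stand-in for an LLM call that turns recent transcript into a cue."""
    if "legal" in context.lower():
        return "Cue: pull up the standard contract-review checklist."
    return "Cue: no suggestion yet."

async def live_assist():
    rolling_context = []
    async for chunk in stream_transcript_chunks():
        rolling_context.append(chunk)
        # Keep only recent context to bound prompt size and latency.
        window = " ".join(rolling_context[-20:])
        print(f"[transcript] {chunk}")
        print(f"[assist]     {suggest_next_action(window)}")

asyncio.run(live_assist())
```

The design point the episode stresses is timing: cues have to arrive while a topic is still being discussed, which is why this sketch works incrementally over a small rolling context window rather than waiting for the call to end.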
7. Behavioral Surprises and Insights from Live Assist
[40:03]
- Usage analytics: “Manual queries have shot up even more than what we had in the past.” Users engage heavily with prompts, then dive deeper via manual follow-ups.
- "Our suggestions are actually helping people talk to Fireflies more. ... It gives me that 'Her' type [of experience], the movie... where you're having this AI that's helping you, and it knows, and it's learning." — Krish [41:45]
8. Looking to the Future: Hardware and Going Beyond Meetings
[44:16]
- No rigid long-term roadmap: “So much can change in six months... With AI we have a general sense of the direction, but no fixed long term plans.”
- Teaser: Fireflies to be embedded on “10 million devices” next year, bringing AI meeting assistance to everyday hardware.
- Anticipates LLMs running cheaply on edge devices, enabling local, low-latency AI agents.
- Fireflies is eyeing other "knowledge work" domains: "What other parts of knowledge work can Fireflies provide value at?"
Notable Quotes & Memorable Moments
- "Before we would write code for six months, we would ship something, and no one would want to use it... This is the first time where before we built anything or wrote a line of code, we validated the market." — Krish [05:04]
- "It blew our mind. It changed the complete technology. And then... it's been an absolute rocket ship. We've never looked back." — Krish on LLMs [09:17]
- "We said, let's forget what is even theoretically possible right now, let's figure out what customers want and we'll work backwards from there." — Krish [11:18]
- "If a person was doing something today off the shelf, you don't have to do all of that. You can get to 80% pretty solid just using off the shelf parts... But the other 15% actually takes a long time. That differentiates good versus great." — Krish [21:30]
- "Fireflies will give me detailed meeting prep before we even get into the conversation... and then during a meeting, giving me cues and live suggestions while we're talking about different topics... It's like having someone that serves as autocomplete for your meetings." — Krish on Live Assist [34:33]
- "Manual queries have shot up even more than what we had in the past. We thought manual queries would go down because everyone would just use the suggested Live Assist." — Krish [41:00]
Timestamps for Key Segments
- [02:56] — Fireflies’ founding story & human-in-the-loop phase
- [05:43] — Market validation before writing code & early growth
- [08:30] — LLMs change product direction and scale
- [11:00] — Technical lessons: focus on customer pain, solving scalability hurdles
- [19:00] — Detailed breakdown of architecture evolution; what differentiates the leaders now
- [34:05] — Real-time “Live Assist” feature announcement and walkthrough
- [40:03] — User surprise: manual queries and new engagement patterns with Live Assist
- [44:16] — Roadmap: Hardware partnerships, AI everywhere, Fireflies beyond meetings
Takeaways & Closing Thoughts
- Practical AI at Scale: The best AI startups validate need before building; the hard work is in that “last mile” of reliability, integrations, and user experience.
- Distribution & Early Mover Advantage: Starting early—and sticking with it—can create powerful moats, especially as the commoditized parts of AI become plug-and-play.
- The Next Phase: Real-time, context-aware AI is moving beyond automation into collaboration, with AI as a teammate or “second brain.”
- Vision: The future is ambient AI—everywhere you work, from desktop to edge devices, assisting not just after but during your most high-impact moments.
For More:
Catch up with Fireflies AI at fireflies.ai and the Practical AI Podcast at PracticalAI.fm.
End of Summary
