
A
Hey there, freedom fighters. My name is Andrew Warner. I'm the founder of Mixergy, where I interview entrepreneurs about how they built their businesses. If you're listening to me, there's a good chance that you've seen all these AI note takers, not just on your computer, but every time you join a meeting. I've had meetings where there are multiple note takers in the call with me. Joining me is the founder of one of the originals of these and a company that I see in a lot of my meetings: David Shim, founder of Read AI. I'm curious about how he built it up, and I'm also curious about the competition, how you survive in a world where so many people are getting into the AI space. So let's find out. David, good to have you here.
B
No, I'm excited to be here and especially I think you're hitting on the right points to say, what's the differentiation that you can deliver in a world where AI solutions come up every single day? Like, if you go to Product Hunt today, you'll probably see six meeting note takers that have launched in the past week. You'll see email solutions, you'll see messaging solutions all across the board. I think there's just a lot of noise in the market right now.
A
And then I even see that Notion now will pop up, oh, do you want to record this thing? And then ChatGPT now has a button to record a meeting and all that. So let me ask you this, before we even get started into it, how many people are actually using Read AI now?
B
Yeah, so it's in the millions, and then on a daily basis, we get about 50,000 new accounts created every single day. So that's a run rate of a million plus on a monthly basis, 12 million annually. So we are the fastest growing meeting note taker in the world today, and we have been for the last two years.
A
I had no idea it was that big. And it is very viral. You get into a meeting, you see that there's someone using Read AI. You probably get notes afterwards from it by email. You see it in the chat. I totally get it. All right, let's answer the most difficult question, which is how does someone survive in a world where everybody seems to be creating all these note takers?
B
So I think this is the beauty. It's a blessing and a curse when it comes to AI. It's very easy to build a basic model. So transcription, you can do open source, you can buy some stuff, you can drop it into ChatGPT or Anthropic and get a summary against it. And those are good. And you'll see there's this kind of table stakes of you should be able to do transcription, you should be able to do summary. And then people start to come in with what's different. And for us, what's really resonated is our ability to actually measure sentiment and engagement in real time and then apply it to the text. So you and I are talking right now. Let's say I start going on about our founding and how we raised money, and you're kind of like, this is sort of interesting. But then I go for five minutes, you might start looking around, get really distracted. We'll pick that up and we'll actually put that in as the narration layer. So imagine if you read a book and you only read the quotes. You get a general understanding of what's happening, but that narration layer is that additional context that really puts characters into play. It goes in and says, this is compelling versus this isn't interesting. And so for us, it was going in and saying, hey, can I start to understand what people are interested in? When David talks about the weather, where are you from, Andrew, how's the weekend, what are your plans? Those things are kind of like, ah, 90% of the audience tunes out. But when you start talking about AI application revenue, people start nodding their heads and agreeing. You can actually readjust the summary, not to go in chronological order, but to say what were the most important topics and how should I actually display that within the summary?
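The reordering David describes, ranking summary topics by measured audience engagement rather than by when they came up, can be sketched roughly like this. Everything here is hypothetical for illustration (the `TopicSegment` shape, the 0-to-1 engagement fraction); Read's actual models and data structures aren't public.

```python
from dataclasses import dataclass

@dataclass
class TopicSegment:
    topic: str
    start_minute: float
    engagement: float  # hypothetical 0.0-1.0 fraction of attendees visibly engaged
    sentiment: float   # hypothetical -1.0 (negative) to 1.0 (positive)

def order_summary(segments: list[TopicSegment]) -> list[TopicSegment]:
    """Rank topics by how much the audience cared, not by when they came up."""
    return sorted(segments, key=lambda s: s.engagement, reverse=True)

meeting = [
    TopicSegment("weekend plans", 0.0, 0.15, 0.3),
    TopicSegment("fundraising history", 5.0, 0.40, 0.1),
    TopicSegment("AI application revenue", 12.0, 0.92, 0.7),
]
ranked = order_summary(meeting)
# "AI application revenue" leads the summary even though it came up last.
```

The point of the sketch is just the sort key: chronology is kept as metadata (`start_minute`) but display order follows engagement.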
A
I had no idea I could see that in a sales call. If I know what I'm doing in a sales call, because I've got a general rhythm that I go through and an agenda that I follow, you're telling me I could spot, or Read could spot, the place where people are interested and highlight that for me, and the places where they moved on. And the way that you know it is because you are watching not just the words they use, you're also tracking people's faces and the way that they're moving their heads.
B
Yeah. So we look at head orientation. We don't look at your face, we don't do any facial recognition, but we go and we say, if you're looking at the camera straight ahead and you're talking, your camera's there. And then if you look over this way and you go to a fixed position when someone else is talking, your head stays in the same spot. We have a model that says that's a second screen. So you're actually paying attention, you're not distracted. But if you start scanning your head back and forth, if you go like this a lot, that goes in and says you're not as engaged in the conversation, because you're not going to a fixed point. So we've got these models in place that understand engagement. Like if you look down and you write your notes, people go to the same place, because your notepad is right here and you write your notes down. Well, we're able to pick that up as well and say, oh, you're going to a fixed position. So that's one of the things. But imagine a sales call. You've got a seller that understands the pitch deck forwards and backwards, but they really don't have context for an audience. They might do the whole pitch and say, this is why it works great. It's just going to save you money. And it only costs this much. Why don't you buy this? And the person says, yep, that's interesting. Let me get back to you. If I put that into an LLM, it's going to say it was a fantastic conversation. But if I can actually go in and see the client was not really engaged, they barely spoke in the conversation. When they did ask a tough question, David went from 150 words per minute to 225 because he was super nervous. That metadata is not picked up in an LLM. That metadata we pick up to go in and say, David's pacing is 150, but when he's excited or nervous, he goes to 225. That then feeds into the narration.
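A toy version of the fixed-point-versus-scanning distinction he describes: if head yaw barely moves while someone else is talking, the viewer is probably locked on a second screen or a notepad; large swings suggest scanning the room. This is a simplification I'm adding for illustration, not Read's model; the degree threshold and the use of a plain population standard deviation are assumptions.

```python
import statistics

def classify_attention(yaw_degrees: list[float], threshold: float = 5.0) -> str:
    """Toy heuristic: low spread in head-yaw samples while another person
    speaks suggests a fixed gaze point (second screen, notepad), which
    reads as engaged; high spread suggests scanning, i.e. distracted."""
    spread = statistics.pstdev(yaw_degrees)
    return "fixed-point" if spread < threshold else "scanning"

print(classify_attention([0.1, 0.2, 0.0, -0.1]))       # steady gaze
print(classify_attention([-20.0, 15.0, -10.0, 25.0]))  # looking around
```

A real system would also need head pitch (the notepad case), per-speaker timing, and smoothing over longer windows; the sketch only captures the core intuition.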
A
How do you know that anyone even wants that? It feels like people, they just want notes and that's it. They just want to know, what do they say? Can I go search for the thing that I wasn't paying attention to?
B
They want effective notes. So when you get notes that summarize it in chronological order, those aren't effective notes. That's also.
A
How do you know that what they want is sentiment analysis? Like, how are you able to tell that that's the thing that drives people to come to Read?
B
They actually don't want to see that on the back end. They want the notes to actually take that into account. So when I present you with the summary, they don't care that this is how the algorithm works on the back end. They want to know that this was the most.
A
Yeah, how do you know that? So here's the thing. The other thing that gets me. I would have thought that at this point in transcriptions and in meeting note takers, we would have one that's just geared for sales. And I think Gong does that, but it's so expensive I haven't even used it. And that there'd be one for sales, one for HR, one for team meetings, one for this, one for that. It doesn't seem like it's broken down that way. And if that was the way it was, then I could totally be with you and realize, hey, you know what, when it comes to sales, you want to know that people are engaged. When it's team meetings, you want to know this. But how are you able to tell that people care without segmenting and having a tight niche that you're going after?
B
Yeah, I think the big metrics that we look at are retention. So if someone tries our product, are they actually using the product 7, 14, 30, 60, 90 days after the fact? And for us, across time, as we've improved the models on a consistent basis, we've seen that retention increase to where most recently in this last month, if you use our product in a meeting and you get the reports, we see that the retention is 80% after 30 days.
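Day-N retention as he's using it (is a user active again N or more days after signup?) is simple to compute. This is a generic sketch of the metric, not Read's pipeline; the cohort shape here is made up.

```python
def day_n_retention(cohort: list[dict], n: int) -> float:
    """Share of a signup cohort that was active again at least
    `n` days after signing up (day 0 = signup day)."""
    retained = sum(1 for user in cohort
                   if any(day >= n for day in user["active_days"]))
    return retained / len(cohort)

# Hypothetical five-user cohort; active_days are days since signup.
cohort = [
    {"user": "a", "active_days": [0, 3, 35]},
    {"user": "b", "active_days": [0, 31]},
    {"user": "c", "active_days": [0, 1, 2]},
    {"user": "d", "active_days": [0, 45, 60]},
    {"user": "e", "active_days": [0, 7, 30]},
]
print(day_n_retention(cohort, 30))  # 4 of 5 users, i.e. 0.8
```

Companies vary on the exact definition (active exactly on day N versus on or after day N, as here), so treat the 80% figure as the "on or after" reading.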
A
So if you use it, it's good. But how do you know that sentiment is a part that they care about?
B
Because it's fed into the model. So it's going in and saying, it has never been easier to compare three different solutions at the same time when it comes to meeting transcripts and meeting summaries. Because you've been on a call where you've got note taker one, note taker two, and then Read. Perfect world, you just look at all three and say, which one do I like the best? And what we found is nine times out of 10, we're going to win that. And we see that from an adoption perspective, where, like I mentioned earlier, we're seeing about 50,000 signups every single day. So that's going in and someone's saying, I find this so compelling, I am going to go in and create an account on Read. And those people, when they use it, are sticking around 80% of the time.
A
Okay, why do you think that this space hasn't yet broken down by niches? They do still tend to be just general meeting note takers.
B
I think that's where it's going. It's going to be one meeting note taker to rule them all, in the sense that you.
A
You do think it'll be that way? You think it's going to be one that does it for everybody?
B
A hundred percent. Because otherwise it's too niche. You're not going to go in and say, oh, I've got a sales call, I'm going to invite this one. I've got a call with my doctor, I'm going to do this one. I've got my family, I've got this other one. That doesn't make sense. That's the old days, where we used to have BlueJeans, we used to have WebEx, we used to have Zoom, we had Google Meet, we had GoToMeeting, we had all these different solutions for different use cases, and ultimately it came down to the top three. Right now it's Zoom, Microsoft Teams, and Google Meet.
A
I don't mean for different platforms. I mean, I think of the people who are on my team. One of them is just doing member success. All he wants to know is, are my members happy with what they're doing? And he would be interested in knowing, did I ask tough questions to let me know if the person understands the material? Does the person seem engaged in the course? That kind of thing. I have another person, all he does is sales. He wants that after-sales analysis of whether he was asking for the sale properly, whether he pushed too much, that kind of thing. He would even want to know afterwards what's a good follow-up suggestion for what to send the person if they don't respond within a week. And then I have one other person, she only does team meetings, and she wants to know certain things about how to engage the team and make sure that they're getting their work done. I would imagine that there would be a note taker for each different role in a company, and I'm not seeing that here. And again, I'm looking at you guys as the model for what's the future of a lot of AI SaaS, because you're so far ahead. So why do you think that it hasn't broken down that way here?
B
I think in that scenario you're applying traditional SaaS approaches, where it's like, I've got a solution for sales, I've got a solution for coaching, I've got a solution for internal meetings. In this AI world, you don't need that. I think that one AI solution needs to go in and say, how do I tackle this for an entire organization, where you're interviewing customers, where you're doing sales calls, where you're having internal standups? And having that one solution actually gives you much more value than an individual standalone. So to give you an example, let's say you've got a sales team of 50 that are going out in the market, they're pitching the product, clients are asking questions. That might be in a solution like Gong today. And you've got Gong, but it's too expensive, like you had mentioned before, it might be too cost prohibitive, so the product team doesn't use it. Well, now all of a sudden the product team doesn't have access to any of that information. It's siloed behind this wall, where salespeople are talking with customers every single day, telling them why they like the product or why they don't like the product or what features they want, and the product team doesn't have access to that. So you want one solution across an organization that could be that silo, that storage of intelligence at the end of the day, that everyone can tap into. And I think that requires a single meeting note taker. That requires a single system for productivity. The way that we pitched ourselves when we first started was, we are the system of record for meetings. Where we're going is the system of record for productivity. And so that's going in and saying, bring in your emails, bring in your messages, bring in your Google Drive, your SharePoint, your OneDrive, your Notion, your Confluence, your Jira. You need all of that information to actually deliver correctly.
So if a salesperson is doing a pitch, you want to give them as much information as possible about the roadmap, even if they didn't sit on all the product calls this week to be able to say, yeah, we actually have something coming out in two weeks that is directly going to address your problem.
A
I see. So you don't just want to suck in data and analyze the meeting. You want to make the meeting more useful during the meeting and afterwards, so the salesperson can go in and do a search to get the answers and come back with a response. Okay, I think I'm getting it. I still feel a little bit skeptical, and still feel like the software will break down by job type, and then all of these tools will have a way to pass this on to other people. You know, you give your developers a seat and they can have access to see what's going on with the salespeople. But you don't see it that way. And if I'm understanding you right, part of the reason you don't see it that way is you say everyone needs to be in the same platform. But also with AI, the AI can just adjust based on who's in the meeting. So if Andrew's doing a meeting and he's doing an interview, the AI will know that it's an interview and adjust to Andrew's interview style and give him feedback. If it knows that Laika is doing a sales call, it adjusts to Laika's needs during a sales call. That's the world that you envision.
B
I think 100% it's going to be all done in the background. Because the way that you're able to stack it is there are strong and weak signals. So we already classify meeting types when that data comes in. If you have an external person on the meeting invite, that all of a sudden changes the dynamics of that meeting, because you're talking to someone outside of your organization. And then if you realize, okay, this is interesting, what's the meeting length? Have I ever actually met this person before? Okay, that's interesting. Now, I've never met this person before, so this is a sales call, potentially, or an interview. Then it goes in and says, hey, let me check your emails. Oh, this email goes in and says you were actually pitching this client cold, saying, can we schedule a call? So now all of that context, if you're able to stack it together, can be applied into the meeting summary, to go in and say, here's additional metadata that's available.
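The signal stacking he walks through (external attendee, never met before, a cold outreach email in the thread) can be expressed as a simple rule cascade. This is an illustrative sketch under my own assumptions; the signal names and the output categories are not Read's actual classifier.

```python
def classify_meeting(attendee_domains: list[str], host_domain: str,
                     met_before: bool, cold_outreach_in_thread: bool) -> str:
    """Stack weak signals into a meeting-type guess, strongest rule first."""
    external = any(d != host_domain for d in attendee_domains)
    if not external:
        return "internal"              # everyone shares the host's domain
    if not met_before and cold_outreach_in_thread:
        return "sales"                 # cold email + first meeting = pitch
    if not met_before:
        return "sales-or-interview"    # first contact, intent still unclear
    return "external-recurring"        # known outside contact

print(classify_meeting(["a.com", "a.com"], "a.com", True, False))
print(classify_meeting(["a.com", "b.com"], "a.com", False, True))
```

In practice a system like this would score signals probabilistically rather than cascade hard rules, but the ordering, strong signals overriding weak ones, is the idea he's describing.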
A
All right, let's go back to how you founded the company, and then it'll give me some opportunities to ask you questions about how other people can create companies like this. From what I understand, this idea came to you when you were in Cabo. Tell me how you ended up in Cabo, hanging out, but still on Zoom.
B
Yeah. So this is peak Covid. Prior to that, I was the CEO of Foursquare. I became CEO of Foursquare through an acquisition.
A
And so they acquired your company, which had been acquired by Snap.
B
Yeah, so short version: in 2011, I started a company called Placed. We did location analytics. We pivoted slightly to attribution, to say, if you saw an ad, did that drive you into the store? In 2017, Snapchat acquired us. In 2019, Snapchat spun us out, not because we were doing poorly, but because we were actually profitable, growing at a 50% plus clip. And there were other companies that wanted to buy the business from Snap, because Snap was refocusing on their ad business. And so Foursquare came along and said, we'd love to incorporate this in. Ultimately, we decided to go with Foursquare. I became CEO there, was there for about two years, did a couple of acquisitions, Factual being one of them. So we kind of built this entire location stack. And then from there, I was like, hey, I'm ready to leave. I've been in this space for 10 years. I'm going to take a little bit of a break. Covid was happening, and Mexico was open to the U.S. I ended up going to Cabo too many times, because that was the only country that was open outside of the U.S. And so when I was in Cabo, I was doing a lot of these calls, where I'm sure you do these as well, where it's like intro calls. Hey, someone's doing a pitch. Someone wants some feedback. And what I realized was, about two minutes into the call, I could tell, should I even be on this call or not? And I was like, but I can't leave, because I turned on my camera. I already said hi to everybody. So now I'm stuck here for an hour to listen in on something I'm not that interested in. And so at first, I did what everybody else does. They surf the web, they multitask, they do other things. Then, after a year, I'd seen pretty much everything on the Internet, and I started to go and say, hey, this person isn't paying attention either. This person isn't paying attention. And I started to do some math. I was like, this is really expensive for how many people aren't paying attention.
Then the moment that really drove me was, I was on ESPN. There was someone's little screen, and their glasses were on, and the colors looked similar. So I clicked on it, and I was like, I can see the exact writing of the ESPN.com website that's on my screen on their glasses. And I was like, okay, this is a very inefficient use of time. I'm able to detect this. So can models also detect sentiment and engagement? And that's where we really started.
A
With Read, who did you think the customer would be?
B
Initially, I thought it was going to be sales. I think what you've alluded to makes sense. On a sales call, you want to know when it isn't going well, because then you can stop, pivot, and ask a question. You can say, I'm going to take a beat. Do you understand this? So we went after sales. But what we realized in 2023 was, AI is making this mainstream, where people are going to find applications that we would have never expected. So one example: someone reached out to us really early in 2023, when we started getting product market fit, and said, hey, I'd love to jump on a call. I've got a use case and I'd really like to talk to someone, and I'm going to have my caregiver join the call as well. We're like, okay, this is interesting. Did not expect this. All right, jump on the call. And he said, hey, I love your product. This is great. Let me tell you how I use it and why I've got the caregiver on my call. I have early onset dementia. I forget things. I know I forget things right now. And I talk with my family once a week, and I want to remember the things that we talked about so they don't feel bad. And so the caregiver was there to actually say, hey, how do we keep this subscription going, while this person is having the conversation, where it is going to get worse over time, because he's looking at these notes and he finds incredible value in making sure his family feels good. And so for us, it was like, we never expected that. Not on the roadmap in any use case. And I was like, damn, this is something where it could have more than just a sales impact. And now we're seeing that across the board, where we're doing really well on the sales side. We've got Mag 7 companies using us with thousands of paid licenses, where they're aggregating 200 product managers, five customer calls every week per product manager.
And they've built this silo, this kind of storage of intelligence of customer interviews, where they're able to go in and ask this database any questions that they want. They've got a thousand interviews every single week. That's stacking. Or you've got people on the ground right now. We are HIPAA compliant, and we've got healthcare workers that are going out in the field and interviewing patients. They're recording the conversation with their consent, and by the time they go back to the office, they have the full notes. They're able to go in and look at this and say, hey, what was the sentiment and engagement score? What were the questions that they were asking? How often did they interrupt when questions were being asked? So now they can actually track that. And even if they have a hundred patients or clients that they work with, now they can go in and just say, hey, this trend looks a little off. Next time I meet with them, I'm going to do this. It saves them time, and it gives them better services. So this is where we saw there's a bigger opportunity than just sales.
A
No, I totally get it. I mean, frankly, for me, one of the reasons why I like remote work is that I could have notes on meetings, where you couldn't in person. But what I've discovered lately is, when we're in person in a meeting, we'll just throw up the meeting recorder and actually have notes afterwards. And we do go back to it and look afterwards. So it shows up everywhere. I can almost imagine in the near future having one of those pendants, but instead of having it be independent, have it be connected to meetings. Anyway, I'm obviously very excited about this space. It really is so fun. I'm excited about your software. But let's go back to the original sales thing. Were you thinking that you would sell this to salespeople, and they would have that sentiment analysis in real time, or for post-game analysis, to see what they could have done differently?
B
Both. So we started as purely real time, and then you could look at it after the fact. Because if you're a salesperson, you get one opportunity to make the pitch in a lot of cases. So we wanted that instant feedback. What's not realistic is your manager sitting on every single call that you have, to give you that feedback on Slack or Teams, to say, hey, slow down your pace, talk about this, you forgot about that. That's not realistic. So from our perspective, we thought that real-time feedback would be incredibly valuable. What we found was, it is valuable for that niche use case, but a lot of times it becomes cognitive overload, where if you are doing poorly and all of a sudden the score goes down from an engagement standpoint, people are like, oh shit, what's going on? What do I do? How do I solve this problem? And what we found was, the feedback was like, this is great. I actually believe this. This is pretty accurate. But you're not telling me what to do next. And so that's where we were like, okay, what do we need to do as a next step, beyond just showing the metrics went from 80 to 75 to 69 to 62? How do we change that outcome?
A
And so you started doing what?
B
So we started a couple of things. One was around summarization, doing it in real time to understand what was being discussed. So as the metrics dropped, we would call it out. When you talked about methodology, they got really confused, because engagement dropped and sentiment was kind of negative. So you might stop and say, okay, I have the data in front of me now, we've linked it to specific content. You might be like, hey, can we take a step back? Let me talk about the methodology again and make sure that you all understand it. And they're like, oh, this is great. I really appreciate that. So that was a big differentiator. The second thing was, after the fact, taking those coaching metrics, because a lot of times it is going to be too much cognitive overload, especially for a more junior seller, where they're not used to taking all these inputs. Because you've got a PowerPoint presentation, you're talking, the client's on the call, you're also looking at that. That's a lot of things. So after the fact, how do we coach them better? We came out with coaching recommendations, where it says, David, your talking speed is a little bit too high. It's at 225 words per minute. It should go down to 175. And what happens a lot of times is people get defensive. And this is human nature. If I critique you, even if it's with the hope of you becoming a better seller, you're like, no, I was not talking that fast. What are you talking about? Or, hey, you swore a little bit too much on the call. I get that you're trying to match their energy, but that's not the right approach. Well, now we can actually identify those moments, and you can play them back. And what we found was people were incredibly surprised, to say, oh wow, I was actually talking really fast. I can barely understand myself.
Hey, I did swear too much, because I said it 16 times on a call. Or I'm interrupting the client when they're giving me an answer to a question, because I'm seeing that interruption rate at a higher number. So those are things where you play it back after. And we found that if you show people that moment where they did an action that resulted in low sentiment, negative sentiment, negative engagement, they're like, okay, this makes sense. I'm going to correct this behavior. I'm going to make adjustments. And they're not defensive, because it's the AI telling you, not your manager, where you're like, okay, is this going to impact my job? This is the AI saying, make a left turn or a right turn, and you're like, okay, totally makes sense. I'm not even having second thoughts about that.
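Two of the coaching metrics mentioned, speaking pace and interruption rate, fall straight out of diarized transcript timestamps. A minimal sketch, with a made-up turn format (`speaker`, `start`, `end` in seconds); Read's actual metric definitions are not public.

```python
def words_per_minute(word_count: int, speaking_seconds: float) -> float:
    """Pace measured over the time the person actually spoke."""
    return word_count / (speaking_seconds / 60.0)

def interruption_rate(turns: list[dict], speaker: str) -> float:
    """Share of `speaker`'s turns that began before the previous
    (different) speaker had finished, i.e. overlapped their speech."""
    cuts = total = 0
    for prev, cur in zip(turns, turns[1:]):
        if cur["speaker"] == speaker and prev["speaker"] != speaker:
            total += 1
            if cur["start"] < prev["end"]:
                cuts += 1
    return cuts / total if total else 0.0

turns = [
    {"speaker": "client", "start": 0, "end": 10},
    {"speaker": "david", "start": 8, "end": 20},   # cut the client off
    {"speaker": "client", "start": 20, "end": 30},
    {"speaker": "david", "start": 31, "end": 40},  # waited for a pause
]
print(words_per_minute(450, 120))         # 225.0, the "too fast" pace above
print(interruption_rate(turns, "david"))  # 0.5
```

The playback feature he describes then just needs the timestamps of the flagged turns, so the seller can jump to the exact moment the metric triggered.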
A
I get that. As someone who plays chess, I'm addicted to Chess.com. I play it every day. One of the reasons why I play on Chess.com is, after each game, you can see analysis of all your moves. And in chess, there is often a right move. And I could see, that would have been the right move. And if you see it over and over again, you can't help but finally make the adjustment and get better. All right. I heard that you actually reached out to Eric, the founder of Zoom, when you came up with the idea, to pitch it to him, and he turned it down, right?
B
He didn't turn it down. He gave me context on the idea. So I said, what do you think about sentiment and engagement in real time? Is that something that the market wants, and is that something that you're building? Because if you're building it, I might be less likely to build something like that, because you've got that built-in base. I knew him because he reached out when I became CEO of Foursquare to say congrats, and we got to talking a little bit. And I just said, what do you think about this? He's like, we were actually thinking about building this, it was on the roadmap, but when Covid hit, it became a smaller priority compared to the billions of users that we need to support on a daily basis, in terms of jumping on calls, managing that. And so I was like, okay, that was validation to say this is a big market opportunity. Maybe not as big as video conferencing as a whole at that time, but it was going in and saying the ability to understand how a call is going is incredibly important. And when you think about it from a measurement and ROI perspective, meetings are still the most inefficient thing out there, where we're spending time on calls, where we're like, I don't need to be here. I wish I could drop off, but I can't. Hey, why did someone send this to me? We've seen accept rates across an organization as high as 90 plus percent when you send a meeting invite. And that's without any context. If someone sends you a meeting invite, you will hit accept.
A
I hate to say it, but you're right. I've got to get better at saying no when somebody on the team sends an invite, and say, can we just move it to Slack? I totally get it. Okay. Did you start to sell this or do any market research before you started building?
B
We didn't do much market research, because there wasn't a market at that point. There wasn't anything for real-time analytics or post-meeting analytics at the time.
A
So you didn't reach out to salespeople and say, look, if I build this thing for you and give you sentiment analysis, would you pay for it? Is this something you're interested in? None of that.
B
We didn't do any of that, because what we said was, can we even do this at first? Like, is this even possible to do?
A
Okay.
B
And because there wasn't any data, there weren't any public open source models, there weren't paid services. The closest that we could find was security cameras that you can kind of look at to say something occurred that looked out of the ordinary. And we're like, okay, these are really old school models. They're not applying it to a world where every single meeting is happening digitally on a video conference call. And so for us, it was more about, is this possible to do? We spent a year building models, really foundational models when it comes to a multimodal approach to communication. And we said, okay, we're in a really good spot. Let's publish it, let's make it free. Because we didn't want to go in and say, I'm going to try to sell you on this, and not just ask you to install this and give me your time, but also give me your money. More importantly, if we can't get traction when it's a free product, how are we going to get traction if it's a paid product?
A
Okay, so what did you use to build the first version? And just for people's context, I always look at mobile as starting in 2007, when the iPhone came out. Right? So that's a good indication of when the start time was. For AI, ChatGPT's launch in November 2022 is obviously not the start of it, but it's the start of suddenly the hype. You launched the year before, to give people a sense of when this was. What did you use to build the first product?
B
Yeah, so for the first product, we actually looked for meetings that were available in the public domain, being able to pull in those different conversations. But then we realized, if you can't see the other person's reaction when someone is saying something, that creates low value, because I'm only analyzing the person that is speaking, and that's not that interesting. I want to see how people react to the words that are being said. So we had to go out and find additional data sources where we could see reactions. We hired actors to go in and read scripts. Then we realized that's too expressive, because they're acting for a movie. No one smiles with a giant smile like the Joker; you're going to do a subtle smile or a smirk, and we need to pick those things up. So we started to go in and try to find these data sources, create these data sources, and that became our training data set. And then from there, we tuned the models as we went. In the early days, we used things like Mechanical Turk to go in and say, hey, which is it, A versus B? Does this look correct from a labeling perspective? Then we started to hire more people to watch full meetings that we generated internally and say, hey, what do you think about this? What is your takeaway from this? What is the action item that you have in mind? So all those things combined got us to that first model.
A
Where did the first AI models come from?
B
We built them in-house. It was all in-house models that we built, and it was really expensive. When we started, we were lucky enough to raise a large round, a $10 million seed round led by Madrona. Most startups can't do that, but we had that in place so that we could invest in building that ground truth data, where we could go in and pay people for their data, where we could say, hey, this is the use case that we want, because we want to build foundational models around engagement and sentiment.
A
What's your background in AI? How did you know that this was possible?
B
I think my background in AI was more machine learning. Early in my career, I worked at a company called Farecast. We did airfare price predictions. In the old days, the really old days, you'd actually have to have boxes in different areas to go on the Internet and pull data down, and you'd build these things called scrapers that would scrape the airline ticket prices from all these websites on a daily, hourly basis. And we actually built a model that could predict airline ticket prices. So that was really early days machine learning, a little bit of AI depending on how you want to look at it: we can predict airline prices based on date, time of day, routes, weather, all these different variables that come into play. Then when we started Placed, it was a very similar thing. If you remember the early days of the iPhone, that blue dot would bounce all over the place. It would normally be a giant blue dot saying you're somewhere in this neighborhood. You'd go to Chicago and it would look like you're on the other side of the river. All those noisy data points were signals for us. So we actually built models to go in and say, hey, it says you're across the street, but if I send you a push notification to check into the nearest location, where are you? Oh, you're at the McDonald's, not the law firm across the street. That's now training data. So we built this check-in aggregation app where you could check into Foursquare, Gowalla, Google Places, Facebook Places, and you could be the mayor on all the services. We had millions of users that were just checking into these places and giving us training data. And for them, it was a fair trade: you get my location data so I can become mayor, we get your location data so we can.
A
Build better models and get more accurate location than what Apple would have to give to its developers. Correct?
B
If we could figure out what the signals were. This is the metadata that's really important. It goes in and says, hey, it shows you across the street, but the Wi-Fi signal is stronger here. This is 2010 or 2011. The Wi-Fi signal is stronger here, but there are 12 different signals; which is the strongest? So you actually want to collect not just that you're connected to the Wi-Fi or that it's available, but which one has the strongest signal. Hey, the accelerometer and gyroscope and compass are moving really fast, so it looks like you're moving at 60 miles per hour. You're probably not here; you're probably on the train tracks downstairs, or underground. That's what's happening. So you're able to stitch these things together. We called it dead reckoning. It's a model, or an approach, where you can say, based on the number of steps that you take, based on the elevation that you go to, based on all these variables, we can predict where you are without actually having your GPS location data. But then if you add GPS on top of that, it becomes even better. So it really is this concept of weak and strong signals to predict where you are.
A
And that's what Placed did, the company that you founded.
B
And I think it's the same thing for meetings. It's going in and saying, are you looking straight ahead? Are you looking away? What's the rate of speed at which you're talking? Five minutes into a conversation, you and I might interrupt each other because we're still finding a cadence. We call it synchrony. But after five minutes, if you're still interrupting each other, that's almost intentional. It's not a good conversation. You both aren't paying attention, or you're both kind of irritated or agitated in that conversation. We can see that; transcripts don't pick that up, for the most part. We're able to take those strong and weak signals. Here's another example. We do more than just meetings; we do emails, messages, et cetera. If someone emails you and you email them back within five minutes almost every single time, because this person is important to you, a significant other, maybe a client, a prospect, we'll go in and we'll recognize that. If you're on a call with them and you're paying attention to them, we'll pick that up as well. So we'll tie those two things together. Those two weak signals now become a very strong signal to say, this person is very important in your life. So whenever something comes in from that person, you need to prioritize it higher. When they're speaking in a meeting, even one that you're not in, you need to prioritize that, because you care about what they say. And you wouldn't be able to do that if you didn't stitch those signals together.
A
Are you thinking of doing email too? It sounds like you're thinking of making email more useful by taking all these signals and then popping the more important emails to the top of an inbox. But you're not thinking of creating, like, a Superhuman, or are you?
B
We're not creating a Superhuman. We think you've got to work within your existing workflows; we're not trying to change them. That's why we send email updates for your meeting reports. But we are bringing in that content, that context. Right now, for Gmail as well as for Outlook, you can actually connect your accounts, and we have millions of accounts that people have connected. So now we've got all the emails that come in and that you send out, on top of the meetings that you have. Imagine you and I are having a meeting and there are four action items. Those action items, in the silo of a meeting, never get completed unless we meet again. But if you followed up in an email and said, hey, here are the three PowerPoints that you asked for, plus here's a quote on pricing, now that action item from the meeting is complete. So you need that full level of context. This goes back to your original question about meeting note takers: why do you believe you're not going to have siloed tools for product, for sales, for client success? Well, that information has to go to different organizations. The sales team says, hey, I'm getting this constant feedback that this feature isn't working correctly; you want product to have easy access to that information. And in that same concept, you want those emails to have easy access to say, did this thing actually get done? Did the follow-up actually occur? Did that file get created? So you can connect your Google Drive to Read now, you can connect your Notion, your Jira, your Confluence, and not just to pull in data, but to create tickets. One final example I'll give is we have a product called Sales Copilot, which pulls all of your data from HubSpot, Salesforce, as well as your emails, messages, et cetera.
And we will actually push you notifications to say, we think, based on all these things that have occurred, that you should move this deal from 25 to 50%. We'll give you the reasons why. And you might not have even updated HubSpot or Salesforce yet. And you could say, oh, this actually makes sense; you push a button and it's done. We've only had that product out for three months, and over $50 million in deals have gone through it where we've initiated a change in status.
A
See, that makes total sense to me. But I guess in my mind it still could be a sales-only tool, and another tool could do something else. But you're right, one of the most annoying parts about sales is having to go and update software. It is so fricking frustrating and nobody wants to. The kind of person who's a good salesperson is not the kind of person who wants to sit down and start dragging a thing over or taking a note. It's so painful. I just want it all to be done, and then to come back to them and say, you know what, it's been a week since you followed up with this person. That's way too long. Go follow up with them. And based on what they said, here's what you should tell them. Ask them about their wedding. Ask them about the thing that they brought up. There's so much there. I would love for all this stuff to get funneled into a CRM. This is my personal dream. I keep asking people personal questions in meetings, right? Just like I do in my interviews. I would love for all that to go into even a personal CRM, so that I remember it, so that I can come back and check in and say, you know, last month you told me that you were going to Hawaii. How's that trip coming? Did you book it yet?
B
And I think those nudges, that kind of recommendation or notification, are really what makes AI personal. That's where you need a human in the middle. It's that orchestration layer that goes in and says, I've got all these data sources, the AI is working for me, it's doing a job, and then it's giving me that nudge to say, hey, you should probably reach out. And if you want to reach out, let me draft that email. And we do this today: let me draft that email for you, let me put some context in there. But we don't hit the send button. I think you still need a human in the middle to go in and say, yeah, but that context is a little weird, I'm going to adjust the sentence, and then, boom, I'm ready to send it out. But prior to AI, that email would have never gotten sent.
A
I had no idea you did that. You will also draft an email for me?
B
We'll draft an email for you. And I think that's the hard part about AI: a lot of it is not just the technology, but how you market it, how you distribute it, how you educate people about it. Because features are coming in so fast and furious. For us, we're constantly launching features, but we sometimes need to take a step back and say, what is the benefit of this? How do you apply it? How do we remind people that it's available to them? We've had a number of calls about the Search Copilot where people have said, I didn't know you had this. You should have told me you had this three weeks ago, because I needed it, and now I'm using it every single day. And we're like, okay, that's something we need to work on.
A
Yeah. That is actually a really difficult thing, knowing what's available. And once you get an answer of no, if it later becomes available, you assume that no is still there. Here's something I think is kind of lucky in this space. Zoom has come out with their own meeting note taker. I'm going to say it so you don't have to, because you're partnering with them: I think it's absolute trash. Just the basics: the email that they send you has no taste. It's just a bunch of text. You can't read it and make sense of it. It's not even as easy to read as a Wikipedia entry. And so you just move on. Their AI interaction is kind of confusing. Do you have a platform risk, though? Because they clearly care about this. At some point, Google is clearly going to care about it.
I don't know if they have Gemini in there yet or not, but I see them kind of at the edges. What's the risk of them coming in and creating an okay-enough product?
B
I think we already know the answer to that: they've got products in place. If you only use Zoom, if you only use Google Meet, if you only use Microsoft Copilot, you might get away with saying, this is enough for me. But honestly, I think you're exactly right that the products aren't necessarily focused on after launch. They say, we've launched this thing, here you go, and the continued investment isn't there, because that's not their core business. They're working on the next thing to introduce Copilot to more people; they're working on the next thing to enable Gemini. And the really interesting thing for us has been that there's not much loyalty on platforms. For the vast majority of our users, on a weekly basis, we see that they use more than one platform. So now I've got a Google Meet meeting with Gemini that has a different voice from Zoom, which has a different voice from Microsoft Teams. They don't talk with each other, the agents don't work together, they're just in these silos. So I'm never going to look at this, and it's a waste of time. And the last thing I'll say on that is, from a growth perspective, when Microsoft Copilot launched in late 2023, everyone was like, oh, you're in trouble, guys. They're going to take over and no one's ever going to use you. We've seen more than a 15x increase in the Microsoft Teams meetings that we've measured since Microsoft Copilot launched for Teams meetings. So it's almost the best thing, if you're in the startup world, to have every single major platform saying, this new technology is available now. People try it out, and there's a population that says, this is good enough. But there's a group that says, I need Tableau, I need Databricks, I need Snowflake.
And that's the market that we're seeing, people coming in and saying, yeah, this is a great starting point, but I need more than this to get my job done. And we don't have to spend anything on marketing when it comes down to that; the platforms are marketing the benefits for us.
A
They're saying, look, here's how a transcript can help you, here's how AI could help you. The user says, okay, but I want more. And then they go look for more. Or they show up in a meeting where suddenly there's a Read AI agent, and they say, let me explore what my friend is using and try that. That's a very viral market.
B
Yeah.
A
And you know what? There is no lock-in with Zoom. There is a lot of lock-in with Read and the transcript companies, because once you have your stuff in there, you want to be able to go and search every past meeting, and it's just too convenient to stick with your platform. So you do have a lot of stickiness built in, right?
B
A hundred percent. The stickiness is built in; all the data is there. And we're starting to release things like the agent we're releasing next week, which actually pushes in these agents that say, hey, your week is over. What were all the open action items that you had last week? What were the deliverables? What was important to you, what was important to your team? And on Monday it comes in and says, hey, here's an update on everything that's going on, and if you want more updates, just let me know. It's automatically doing that job for you. So it's pulling in all that content and saying, hey, Andrew, heads up, these are the four projects. These two things actually got done. This didn't get done; you might want to follow up on this. It's giving you those nudges. So it's kind of what we talked about a little earlier.
A
What are you doing to understand how users are using Read, what else they would want, and what they're not using? I guess what they're not using is easy enough: you check the software and you see what people aren't using, and you maybe market it a little more or understand that it hasn't worked. But the things that they need that they haven't expressed, how do you come up with those?
B
Part of it is the metrics that we just talked about. The other part is just getting feedback. Because we have 50,000 new users every single day coming in, they're constantly trying these things out, and we add new features. We use Statsig, a company that was recently acquired by OpenAI, which lets us test different things out, to say, hey, should we focus on Google Drive Connect or SharePoint or OneDrive Connect? Because people want to search their meetings and the files that they've created, to make sure that we can connect the dots there. So we do these tests, Statsig lets us do that, we're able to see what the behavior changes are very quickly, and we pick a winner and turn that on across the board. Or it's feedback where someone says, hey, I need this in Portuguese. And this is where Brazil is a top-five market for us. We would have never known that had it not been for the customers just coming out and telling us, hey, we need Portuguese. And we're like, okay, are we seeing a big uptick in Portugal? Actually no, it's Brazilian Portuguese; it's a little bit different. And so we asked ourselves, are we going to build two separate models for European Portuguese and Brazilian Portuguese? We didn't know. Then users started saying, hey, I'm using your European Portuguese version and there are certain words that it's not picking up, can you make adjustments? So we make the adjustments. Now I think we are the biggest note taker in that market today, and we didn't exist there a year and a half ago, and these people keep creating accounts. Or Colombia is a great example, a market we never thought we would go after. This is the country of Colombia, not the university. We're like, okay, why are we seeing all this activity from Colombia? And we dove into it a little bit.
We started asking people. We have about 1 to 2% of Colombia's population using Read on a daily basis. And we're like, this is weird; we have not targeted this at all. What was happening was university students would bring Read into their meetings or their classrooms. They would bring it into their study sessions, et cetera. They would go in and create multiple accounts, and we kind of let them do this because we were trying to figure out what was going on. Every single class in Colombian universities is now actually recorded with Read. They're sharing that across the entire student body, they're making it available, and they built this store of intelligence for the university, for the classes. Which drove us to say, okay, if they're using it for that, why don't we make it easier to do, so that you could have this system of record for productivity, for knowledge, for intelligence. So if you have a question about a class that you missed, you can just go in and ask it, in the same way as, hey, I missed that meeting, or, hey, what's the status of this project, because I've been out for two weeks. You don't have to ask 12 people; you just go in and ask that system of intelligence.
A
All right, I still want to understand what other possibilities there are for AI companies. You are someone who, I didn't realize this until I looked at your LinkedIn in preparation for this conversation, has invested in dozens and dozens of companies, in addition to launching a couple of really successful ones. What are you seeing that's working for startups that are building on AI, or building AI-first companies?
B
I think launching very quickly is something people have said all the way back to the early days of Y Combinator: fail quickly, push it out there. If you launch a product and you're not embarrassed by it, you didn't launch it in a timely manner. Those things are all true, but they have never been more true than with AI. You have OpenAI, which has a problem called hallucinations, where it makes stuff up in response to the question you're asking. And that application has close to a billion monthly users now. So that says people are willing to say, I'm okay with mistakes, I'm okay with drastic mistakes, as long as you're clear that this is a new technology, this could happen, so make sure to review these results. If you are able to give me, 98% of the time, something that looks like magic, I will sacrifice the 2% of hallucination. So I would say to early-stage founders with smaller products: be okay that your product is not perfect. Now, at our stage, we have to be more careful. We have to put checks and balances in place, because now we're the system of record. But it's not.
A
Give me more of these. Do you have an example of a smaller business that launched with some, maybe a lot of, hallucinations and mistakes in the beginning, but look at where it is now?
B
Yeah.
A
Give me examples of what you've seen.
B
One is on the creative editing side, in ad tech. My background is a little bit in the ad tech space. What you find is that AI should be applied to every single ad that goes out in the world today, because you can adjust it based on performance in real time: hey, the shirt should be red instead of blue, it should be winter, not summer. It hasn't caught on, because you've got old-school companies saying, well, does it align with the brand tenets? Does it align with what we're doing from a messaging perspective? And who's winning is the people focused on performance. These are people who say, I'm going to drive results, and if the results prove themselves out, eventually we'll deal with, does it align with the brand goals, is this the right color palette? Those things don't matter. So one has been around video editing. There's one company, I can't talk specifics on their numbers, but they've done incredibly well in video editing. At first it was a 50-50 chance you'd get something that was great. But the 50% of the time it was great, it was done four to eight weeks faster than anything you could do in the current world. So you would just go in and create more and more content. And now they are probably a top-10 video editing app out there.
A
We're talking about Opus, I'm assuming? Or not that one?
B
There's another one that you'll probably hear about here very quickly.
A
Has it launched or not?
B
It's launched. It's got a million users, yeah.
A
Okay. And so it does video editing. What I've seen with video editing is they'll take a big video and create short clips. I'm imagining that's what you're talking about, right?
B
It's more like it'll take an 8-second clip and extend it out into a 15-second or 30-second video, and you get a lot of weird stuff, right? If you've ever messed around with Veo 3 or Veo 2.5, or done it in ChatGPT, you get some really interesting things. Now, if you're a startup founder, you'd be like, oh, until it's perfect, I'm not going to launch it. Well, by the time you get it perfect and you launch it, there's going to be a new model that already does that by default. So you need to launch quickly to get it out in the market, and people can then make the decision: is this good enough for my use case or not? You don't need it to be perfect; you just need it good enough that people will say, this is valuable, and I will take the hits when it comes to the hallucinations, the weird outcomes, like, hey, someone jumping up and down when I said sit down. Those are things that you can handle when you're able to get that magic.
A
What are some ideas you wish somebody would create in AI, maybe some simple ones that no one's jumping in and doing?
B
I would say, from a simple perspective, it's not simple, but it is going to be table stakes. I've spoken with a few companies around the last election, where they thought AI was going to be a bigger contributor to noise in the marketplace. And what ultimately happened was, it wasn't; it wasn't ready for prime time. We're all AI-savvy enough that we could say, that image looks a little bit wrong, something looks off there, it's not really happening. Other countries that don't have that actually ran into a lot of issues. The US didn't talk about it that much, and in the US itself, the impact was very marginal. I do believe that in the next year or two, it is going to be almost impossible to figure out, is this AI-generated or not. And at that point it's a simple problem in the sense that you want a certificate of authenticity; you want to say, this is real or this is fake. And the ability to measure that is going to be incredibly important. In the same way that with ad impressions right now you've got viewability, was that ad viewable or not, I think you're going to have the same thing for any type of video or text content: some kind of seal that says real, AI-generated, or a combination of both. I think whoever figures that out is going to build a very big company, and it's going to be something we see every single day, like, hey, on CNN, that clip has a little seal. Okay, that's good. I should believe that.
A
Would you have been able to do it, with your background, if you weren't working on Read?
B
There's a shot I could do that. There's a shot.
A
It's a tough one.
B
It's a tough one. And the big thing is, you have to get adoption. That's the biggest thing. You have to get people bought into it early. You have to have enough distribution that people will contribute, that their content will validate it, that you can get that level of scale. Because I've seen a lot of great ideas flame out. If you look at my portfolio on LinkedIn, half the companies have been wound down because it just didn't work out for whatever reason. It's not that they were dumb, it's not that the technology wasn't great; it's that they didn't have distribution. I think that is key at the end of the day. The other thing I would say, and this is where we are touching on it a little bit, is I do think there is going to be AI that will fill in the gaps when you're not available. Let's say you're on vacation, you're out for seven days. You don't want to get interrupted because you want to unwind. But something happens at the office, and you have the context. Right now, someone would call or text, hey, Andrew, I'm sorry to interrupt your vacation, can you just answer this? And you're like, yeah, it only took a minute of my time, but this is now a distraction. Now I feel bad, they feel bad, and I'm not going to give you the best possible answer. AI can fill those gaps when someone is out of office. Or when someone leaves the company: right now, a lot of times when you leave a company with two weeks' notice, you spend a bunch of time documenting everything, which is a big waste of time versus actually doing meaningful work during your last two weeks.
A
Right.
B
Now imagine if AI could say, Andrew, do whatever you want to do, get the business into a good spot, but the documentation, the training for the person that's going to take over? We've got this, free. Those are small tasks that people haven't built out yet, because that system of intelligence isn't available. Because if I only took your meetings, I don't have enough context. If I only took your emails, I don't have enough context. I need to bring all of those things together to get that full context. I think that's going to happen really quickly, I'd say in the next six to 12 months.
A
Yeah. For me to give some kind of AI access to my email, my Slack, my Google Docs and Notion, with some restrictions so that you can't start asking questions that give you private information while you're on my team, I'd be okay with that.
B
Yeah. That feels like it saves you time. I think people would say, you're not going to bother me, you're not going to call me at 2 o'clock in the morning because something broke, because my AI assistant can actually answer this for you. And if you think about it, out-of-office messages are the dumbest thing right now. It's like, hey, I'm out of office. Okay, what do I do now? Hey, here are four other people you could contact who aren't going to reply to you in a timely manner either. Or I could have my AI just give you that answer, which is kind.
A
Of what happens on, like, Intercom and Zendesk, right? Where an AI can look through all the docs and give an 80% answer 80% of the time, if not more. All right, I think I got where this is going. You didn't tell me the exact number of users, right?
B
On Read, we're in the millions, and we're growing; every single month we're setting new records.
A
All right. Millions. More than 10 million?
B
Not yet.
A
Okay.
B
All right.
A
You also showed me a lot of things that I didn't know were in Read. Here's what I like about Read: video. I can't just have audio. I love the audio-only apps, they're phenomenal, like Otter is really good, but I need to see the person's face when I said the thing, or to clip it for the team and go, look at when I said this, look at how excited they were. That is a really nice feature. Here's what I think is helpful, it's not changing my life, but it's helpful: the Calendly integration inside. Really nice. Here's the part that really stands out: the ability to ask questions of my past notes, to be able to go in and say, what about this? What about that? I don't know, can I get live meeting notes while I'm in a session and start asking questions?
B
We've got a Chrome extension where you can actually see a pop-up that comes in and says, hey, this meeting is going on. Do you want to ask any questions? Do you want to catch up?
A
Can I just go into the Read AI app once I'm in a meeting and see my notes, see my transcript, as we're talking?
B
If it's in Zoom, you have that available today too. Yeah, you can see a live meeting in there and pull in those notes. This is where we have to do a better job with all the features, making them more easily discoverable.
A
All right. And then there's also the thing that tells me about charisma in the call, and all that other analysis, which I like. All right, thanks so much, David, for doing this. The site is Read.ai. I appreciate you, and I'm looking forward to doing more of these AI conversations.
B
No, appreciate the time, Andrew. This was great.
A
See you, man. Bye-bye.
Episode #2280: Read.ai is adding 50k users per day
Host: Andrew Warner
Guest: David Shim, Founder of Read.ai
Date: September 15, 2025
In this episode, Andrew Warner interviews David Shim, founder of Read.ai—a rapidly growing AI meeting note-taker. They explore how Read.ai stands out in a crowded market, the nuances of AI-powered engagement and sentiment analysis, the evolution of meeting productivity tools, and the surprising directions in which users are taking the product. David shares entrepreneurial lessons from building multiple startups, why specialization in AI tools may not be sustainable, and actionable insights for founders building in AI today.
"That's a run rate of a million plus on a monthly basis, 12 million annually." (01:09)
"It's very easy to build a basic model... For us, what's really resonated is our ability to actually measure sentiment and engagement in real time and then apply it to the text." (01:45)
"That narration layer is that additional context... that goes in and says, this is compelling versus this isn't interesting." (02:16)
"If you're looking at the camera straight ahead... if you look over this way and go to a fixed position... We have a model that says that's a second screen." (03:34)
"If you use our product in a meeting and you get the reports, we see that the retention is 80% after 30 days." (06:12)
"They actually don't want to see that on the back end. They want the notes to actually take that into account." (05:21)
"In this AI world, you don't need [separate tools]. I think that one AI solution needs to go in and say how do I tackle this for an entire organization." (09:07)
"So you want one solution across an organization that could be that storage of intelligence... that everyone can tap into." (09:32)
"If Andrew's doing a meeting and he's doing an interview, the AI will know that it's an interview and adjust to Andrew's style." (11:47)
"I could realize, should I even be on this call or not?... I started to do some math. I was like, this is really expensive for how many people aren't paying attention." (12:49)
"I have early onset dementia... I want to remember the things we talked about so they don't feel bad." (15:08)
"Healthcare workers... are recording the conversation with their consent... and now they can go back [to analyze sentiment and engagement]." (16:38)
"What we found was it is valuable for that niche use case. But a lot of times it becomes cognitive overload..." (18:22)
"They're not defensive because it's the AI telling you, it's not your manager telling you..." (19:25)
"We built them in house for us... really expensive. When we started, we were lucky enough to raise a large round." (26:14)
"So it really is this concept of weak and strong signals to recommend or to predict where you are." (28:24)
"Those action items in a silo of a meeting... But in an email if you followed up... now that action item is complete..." (30:54)
"There's not much loyalty on platforms. For our users, on a weekly basis, we see that they use more than one platform." (36:06)
"We've seen more than a 15x increase in Microsoft Teams meetings that we've measured since Microsoft Copilot launched..." (36:06)
"Every single class now in Colombian universities are recorded with Read... and they built this storage of intelligence for the university..." (39:29)
"We use a company that recently was acquired, Statsig from OpenAI... they let us test different things out." (39:29)
"If you launch a product and you're not embarrassed by it, you're not launching it in a timely manner... but it has never been more true when it comes to AI." (42:29)
"Half the companies [I've invested in] have been wound down... it's not that the technology wasn't great. It's that they didn't have distribution." (47:37)
"I do believe in the amount of the next year or two, it is going to be almost impossible to figure out is this AI generated or not..." (46:05)
On Standing Out in AI:
"There's just a lot of noise in the market right now." (00:34)
"For us, what's really resonated is our ability to actually measure sentiment and engagement in real time." (01:45)
On Product Design:
"What we found was people were incredibly surprised to say, Oh, wow, I was actually talking really fast. I can barely understand myself." (19:27)
On Organizational Productivity:
"We are the system of record for meetings. Where we become is the system of record for productivity." (09:32)
On AI’s Evolution in Meetings:
"We classify meeting types when that data comes in... all of that context... can be applied into the meeting summary." (11:47)
On Real-World User Impact:
"I have early onset dementia... I want to remember the things we talked about so they don't feel bad." (15:08)
On Competing with Platforms:
"For our users, on a weekly basis, we see that they use more than one platform... They don't talk with each other... So I'm never going to look at this and it's a waste of time." (36:06)
On Launching Fast in AI:
"If you launch a product and you're not embarrassed by it, you're not launching it in a timely manner, those things are all true, but it has never been more true when it comes to AI." (42:29)
This episode shines a light on how modern AI startups can differentiate and scale in competitive markets. David Shim’s journey demonstrates the power of listening to actual user behaviors, the value of core technological differentiation, and the necessity of rapid iteration and distribution. For founders, Read.ai’s playbook and real-world lessons offer a blueprint for building lasting, impactful products in AI’s fast-evolving landscape.