Transcript
A (0:00)
I've been paying attention to a new design tool called Subframe. So I invited their co-founder, Philip Skreshinski, on to give us a demo and just talk through the vision a little bit. One of my favorite parts is seeing the different patterns they've adopted for interacting with AI — things like contextual edits and a totally new way to think about prototyping. So let's start by learning more about the Subframe journey and what led them to go all in on designers.
B (0:25)
The story with Subframe, and why it's unique, is that it's kind of the last design tool before AI, because we built all of it without any LLMs under the hood to power the design-to-code translation layer. A lot of that had to do with thinking from a technical-first approach, from the constraints of code, and building a visual tool that works around them. Where we are in San Francisco, small startup, walk into any coffee shop, someone's building something. So our product strategy and go-to-market really revolved around early-stage founders. And that's because in a lot of ways, the moment you cross into that code space, even though you're designing with code, you're kind of a builder. You're not just in an abstracted world drawing pictures, you're building the thing. And so the first builders for us were those founders, because they have to do all of product, all of design, all of front end, all of back end, and do it all every day. And many of them were not designers, right, and were not front-end experts. So we built this tool that let them get to code so much faster to build beautiful UI. A lot of it was template-driven: you have colors, you have components out of the box, you have preset spacing, and all of that helps you design with better defaults.

So that was really our first year after we launched, continuing to go down this early-stage, Y Combinator path — last summer we had something like 20% of YC companies use Subframe to just build their MVPs. And then something very interesting happened at the end of last year. We had resisted AI ourselves for a while because we were asking: are the models good enough? What does this mean? How do we integrate that well? And what happened was this AI coding explosion in the market, right?
You had all the builder tools — your Lovables, your Bolts — kind of just blow up and come up from nowhere. And in some ways, going back to what I mentioned about competing with the large language model, a lot of that initial ICP for us, the founders: tools like Cursor and just ChatGPT and Claude kind of took those away from us, because again, these were people who just needed a dashboard and had no way to do that before. Now they can get a dashboard from an LLM.

But we are a design tool, right, where you can craft things. And so suddenly that opened up this market of designers who for so long were stuck being far away from code. Suddenly they're like, wait, I can just prompt this thing and build a real interactive web app? That's pretty freaking cool. I want to be working closer with that. But I still need design and design control, and a tool where I can drag and drop, to be my interface for working with this code and these large language models. So at the end of last year this explosion happened where designers kind of became enlightened, I think, about working from code. We hooked up large language models with our system, which, because it's built on code, worked really well to generate good-looking designs that we were able to train on, and we could give you a canvas for working with them. And at the beginning of this year, yeah, designers came flooding in, because they try out these other tools and they're like, I can't hand this code off to an engineer because it's not using our system. I can't drag and drop stuff around. It doesn't feel like a design tool. And they're finding us because right now we're in that unique position — because we started before AI — that we can do both things.
