B (14:38)
Yeah, it's definitely the most boring part. And it was actually kind of hard to decide what I wanted to talk about, because with Claude Code you're doing so many things that I wanted to go deep rather than wide. But for directories, the moat is definitely data and your SEO, like if you have really strong backlinks. Also, just quickly: another reason I wanted to show you those three directories is because I think the number one question I saw reading the comments from the last podcast was how do you monetize these directories? And the answer is really unsexy: it depends on the niche. I don't think it has to be display ads or lead gen. It can be really creative. I've seen tons of different offers created on the back end of directories.

So anyways, I built this directory in four days, and it literally started from a raw, massive, unorganized CSV of over 70,000 rows that I scraped for the entire country for Porta Potty suppliers. I just wanted to start with an example of what a directory might look like with high-quality data, and what one looks like with low-quality data. So let's start with this one. This is my Claude Code directory: luxury restroom trailers. Boring, but an awesome niche. People are spending a thousand to two thousand dollars a day renting these out for their weddings, corporate events, film sets, really all sorts of stuff. I'm just showing you what you can create in record time. And the listing looks great, super clear, with the amenities and features, some images, the types of stalls you can rent, and then of course the lead form and some service areas. I think this looks solid. But let me take you a year and a half back, before I knew Claude Code, when I created this beauty.
This is portapottymatch.com. I built this in WordPress and it still has Lorem Ipsum on the front page. We've got some listing pages that are looking pretty sparse, and all of the copy is just AI-generated, with the sentence structures all the same. It's a bad directory. It's ugly, it's bad. All the images are the same for every listing. So I built this thinking no one could reasonably trust this directory enough to inbound as a lead, right? But to my surprise, I just built it, left it up, and I got some leads coming in.

Here's one example. I blocked out the email, but I actually got a bunch of leads that just inbounded. This guy named Eric wanted a restroom trailer with multiple stalls for a film shoot. So that was kind of a cool way to look at the use cases. Another guy came in asking for a trailer for a film shoot. Martha was looking for a reasonably priced Porta Potty with a sink or hand-wash station. And then this is the big one: I got a lead from the New Mexico State Fair. This is the order they wanted, $20,000-plus of Porta Potties and luxury restroom trailers. Just an insane lead. I was surprised. Despite my really poor data curation, it showed me there is a massive need in this space. It's not easy to compare the different Porta Potty types if you're in the market for one, and there's a vast range of use cases. But I knew things were falling through the cracks. So if all these leads inbounded with this crappy-looking directory with Lorem Ipsum on the front page, what would it look like if I actually built this thing out now that I know Claude Code? So I vibe coded this page that outlines the seven-step process for how I'm using Claude Code and an open-source GitHub repo called Crawl4AI to handle any kind of data curation.
So this is where I just want to show people how to tackle the data side, because I've seen probably a thousand or two thousand people build directories in the last year, and a hundred percent of the time, this is where people give up. And I totally empathize. There was a point in my life where I was waking up and thinking, all right, I'm going to sit in this chair for six hours, click through websites, and manually verify whether each one is a luxury restroom trailer supplier. And it was hell. An awful, terrible use of time. But as you can see from these stats, I built this in four days. Like I mentioned, I probably saved over 2,000 hours of what would have been manual data cleaning and manual data enrichment, at record-low cost. I used to have to hire a developer to write custom Python scripts, and it's just not necessary nowadays. I built this for under $250: $100 of that was my Claude Code Max subscription, another $100 was for the data, and $50 was for Claude API credits to do some deep cleaning, which we'll talk about in a moment.

So all you really need to get started is your directory niche (mine's luxury restroom trailers), Claude Code, and an understanding of what drives decision making: what do people actually need to make a decision around a luxury restroom trailer? You can get that by looking at Reddit forums, and I think TikTok's great, Facebook groups, Instagram, anywhere the online conversation is happening around the topic you're building a directory around. As you can see, this whole process I'm about to go over is just the second step for me. My structured approach to building directories starts with finding the idea, then collecting the data, building the website, SEO optimization, and finally figuring out monetization. So let's start with the first step.
The first step is just scraping the data. I went straight to Outscraper; there's nothing new here, I went over this in the last pod, but Outscraper is still the cheapest option for me. I've used Apify and other alternatives, but just keep it simple. I think Outscraper is a great place to start, and there are tons of videos online where people can learn how to scrape data from Google Maps using Outscraper. So I went and did this, and I got 71,000 rows of potential listings covering the entire country. I build nationwide directories, and the whole game plan is to sell leads in this high-ticket niche. So pretty straightforward.

The problem is 71,000 rows is massive, and we definitely have to do a lot of cleaning as far as data goes. So this is when I introduced Claude Code into the mix. I used to have to manually clean data; now I have all of the obvious junk data removed automatically: listings with no business name, address, city, or state, permanently closed ones, and any obvious ones that don't relate to my niche, like big-box retailers. So this is the prompt that I wrote, and obviously people are going to have to retrofit this to their own niche. But I'm just telling Claude Code: here are my five CSVs, go look at every single one and use this criteria to clean the data. This simple prompt got me from 71k down to 20,000 listings. And if you are going to use this prompt, I think anyone can benefit from this particular part, which is just getting rid of obvious junk data. What we're left with is still a massive piece of data. So the next round is where it gets a little more interesting, and this was where I had the biggest problem a couple of years back: we have 20,000 potential Porta Potty businesses, and I've exhausted all my ways to superficially clean the data.
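The junk-data pass described above can be sketched in plain Python. This is a minimal, hypothetical example, not the author's actual prompt output: the column names (`name`, `full_address`, `business_status`, etc.) and the big-box list are assumptions you'd adjust to whatever your scraper actually exports; in practice the author simply prompts Claude Code, which writes its own version of a script like this.

```python
import csv
import io

# Hypothetical column names -- adjust to what your scraper export actually uses.
REQUIRED_FIELDS = ["name", "full_address", "city", "state"]
# Obvious big-box retailers that can't be niche suppliers (example values).
JUNK_NAMES = {"home depot", "lowe's", "walmart"}

def is_junk(row: dict) -> bool:
    """Flag rows missing core fields, permanently closed, or big-box retailers."""
    if any(not row.get(field, "").strip() for field in REQUIRED_FIELDS):
        return True
    if row.get("business_status", "").upper() == "CLOSED_PERMANENTLY":
        return True
    if row.get("name", "").strip().lower() in JUNK_NAMES:
        return True
    return False

def clean_csv(raw_text: str) -> list[dict]:
    """Parse CSV text and keep only the rows that survive the junk filter."""
    reader = csv.DictReader(io.StringIO(raw_text))
    return [row for row in reader if not is_junk(row)]
```

Run over each of the five exported CSVs, a filter like this handles the "obvious junk" stage; the harder niche-relevance judgment comes in the next step.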
And so at this point I need a way to automate the process of manually going to every single website and verifying whether or not these are luxury restroom trailer suppliers. The way I did that is I installed Crawl4AI. This is an open-source, LLM-friendly web crawler and scraper, totally free, which is insane, and it's something you just install locally on your computer. I'm brand new to AI coding, like, I'm only six months in. So if you're just getting started, the way I literally did it is I took this link, gave it to Claude Code, and told it to help me install it, and it took about 15 minutes. So now that we have this installed, Crawl4AI is the engine that allows us to look at every single website at scale, and Claude Code is the brain. We can prompt Claude Code and say: look at every single one of these 20,000 websites and identify the luxury restroom trailer suppliers.

As a quick little demo, I'll just pull up my Claude Code here. I wanted to show people the workflow instead of just talking about it, but I'll do it on a smaller sample size. So here's a CSV of 10 restroom trailers, or sorry, here's an example of 10 Porta Potty companies, and basically I'm trying to figure out which ones are restroom trailer suppliers and which ones aren't. So I'm going to go ahead and run that prompt I showed you, and I'll just copy and paste it here.
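The crawl-and-verify loop described above might look roughly like this. It's a sketch under assumptions: the `AsyncWebCrawler`/`arun` usage follows Crawl4AI's basic quickstart, and the keyword check is a crude stand-in for the actual judgment call, which in the author's workflow is made by Claude reading each crawled page, though a cheap pre-filter like this can triage pages before spending tokens.

```python
import asyncio

# Keywords suggesting a listing is a luxury restroom trailer supplier.
# A crude heuristic stand-in for the LLM classification step.
TRAILER_KEYWORDS = ("restroom trailer", "luxury restroom", "bathroom trailer")

def looks_like_trailer_supplier(page_text: str) -> bool:
    """Heuristic check on crawled page text."""
    text = page_text.lower()
    return any(keyword in text for keyword in TRAILER_KEYWORDS)

async def classify_sites(urls: list[str]) -> dict[str, bool]:
    """Crawl each site with Crawl4AI and run the heuristic on its markdown."""
    # Imported here so the pure helper above works without crawl4ai installed.
    from crawl4ai import AsyncWebCrawler

    results: dict[str, bool] = {}
    async with AsyncWebCrawler() as crawler:
        for url in urls:
            page = await crawler.arun(url=url)
            results[url] = looks_like_trailer_supplier(page.markdown or "")
    return results

if __name__ == "__main__":
    print(asyncio.run(classify_sites(["https://example.com"])))
```

For 20,000 sites you'd batch the URLs and persist results as you go, so a crash partway through doesn't lose hours of crawling; the classification column then feeds back into the cleaned CSV.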