Transcript
Jordan Cooney (0:00)
The Voices of Search Podcast is a proud member of the I Hear Everything Podcast Network. Looking to launch or scale your podcast? I Hear Everything delivers podcast production, growth, and monetization solutions that transform your words into profit. Ready to give your brand a voice? Then visit iheareverything.com. Welcome to the Voices of Search Podcast, a member of the I Hear Everything Podcast Network. Ready to expedite your company's organic growth efforts? Sit back, relax, and get ready for your daily dose of search engine optimization wisdom. Here's today's host of the Voices of Search Podcast, Jordan Cooney.
Sam (0:42)
Crystal ball: we're looking into the future now, Martha. Tell me how the relationship between structured data and generative AI will evolve over the next 18 months.
Martha Van Berkel (0:55)
I think this is why MCP was such an interesting topic. I think we're going to use structured data as the feed into the models, and it's going to become even less of just an SEO thing. It's going to become your data strategy for LLMs, but we're going to have to journey on that. I would say we're still pretty early days there. I'm excited because we're going to make that happen.
Sam (1:21)
Absolutely, I agree. Frameworks and protocols are going to be a way for us to organize and structure better, because we don't know whether these models will verticalize themselves or become super critical for certain types of consumers. As that happens, we have to be very prudent about what we supply, and I think these frameworks and models are going to be critical over the next 18 months.
Martha Van Berkel (1:47)
I think it's going to be data feeds with context. And I say that very specifically because I read like four articles this week where I was like, llms.txt, that's not going to be good enough. Right? The reason all the research says knowledge graphs and large language models work well together is because it's data with context. So it's structured data, think of that broad concept, but it has context, which means you're defining relationships between things. It's when those relationships are defined that you can do inferencing. And the problem we're solving for large language models is the inferencing part. That's the expensive part. Right? It's the part that hallucinates. Historically, if you'd asked me a year ago, I'd have said enterprises are going to be investing in knowledge graphs. Even as we get new standards, that's hard to do, and we're seeing people still do that work. But the most important thing, the reason knowledge graphs matter, is that it's data with context.
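[Editor's note: the "data with context" idea above can be sketched as schema.org JSON-LD, where the relationship between two entities is stated explicitly in the markup rather than left for a model to infer. This is a minimal illustrative sketch; the entity names and URL are hypothetical, not from the conversation.]

```python
import json

# Hypothetical schema.org JSON-LD for a product. The "brand" key defines an
# explicit relationship: Product -> brand -> Organization. Because the
# relationship is stated in the data, a consumer (search engine or LLM
# pipeline) can follow it directly instead of inferring it, which is the
# expensive, hallucination-prone step.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Widget",            # hypothetical product name
    "brand": {
        "@type": "Organization",
        "name": "Acme Corp",          # hypothetical organization name
        "url": "https://example.com",
    },
}

def brand_of(entity: dict) -> str:
    """Follow the stated relationship instead of asking a model to guess it."""
    return entity["brand"]["name"]

print(brand_of(product))              # -> Acme Corp
print(json.dumps(product, indent=2))  # the markup as it would appear on a page
```

The same pattern scales up: a knowledge graph is, in effect, many such typed relationships connected together, which is what gives the data its context.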
