Transcript
Jaden Smith (0:00)
Mistral has just rolled out a new tool that is keeping it, I think, competitive with most of the other AI players, and that is Deep Research. Now, for those that have been following Mistral for a while, you'll know that this is an interesting company. It's based in France, over in Europe, and it's kind of the biggest European competitor to ChatGPT and Grok and Gemini. It has a bunch of really interesting features that ChatGPT and the other players over here don't have, which it's using to compete. But it does seem like some of the features it rolls out are a little bit behind. Deep Research is a good example: most of the American companies implemented it many months ago, and Mistral has just implemented it now. But they're doing some interesting things for enterprise customers in different sectors, and so they're actually able to still grab a sizable part of the market. So I want to dive in today on the podcast on what exactly is included in their Deep Research, why it's slightly different from what some of the American competitors are doing, the industries they're targeting, and overall where Mistral is, because I think they have some interesting stuff they're going to be working on in the near future. So let's dive into all of that. Before we do, I wanted to mention that if you want to try out any of the top AI models, I would love for you to go check out AI Box AI. It's my own startup, and we just released a beta playground where you have access to the top 40 AI models, including models from Mistral, Meta, Google DeepMind, Cohere, Anthropic, and OpenAI, plus tons of image models and audio models that you may not have tried. And I find that every single model is good at different tasks, so this is a place that I love to go and compare them. You can get a result regenerated by a new model and compare it side by side.
There are tabs to compare the same response from tons of models side by side. You can switch what model you're talking to mid-chat. You can get images and audio and text all generated in the same thread. It's super useful, and you can store everything in your media storage, so you get access to all of your files in one place for one subscription. So go check it out; the link's in the description: AI Box AI. All right, let's get into what Mistral has released with their new Deep Research mode. The first thing I wanted to share is a quote from Mistral's head of product, Eliza Salamanca. She said: "What the model does is really go and look into a big variety of sources on the web to answer a specific question. We believe this feature is going to be very relevant for both consumer and enterprise use cases. For consumers, it can research travel and provide an exhaustive analysis of the best travel plans. And for enterprise work, it can do exhaustive research." Personally, the exhaustive research is what I'm excited about. If you know me, you know my absolute pet peeve of all time is whenever anybody demos any new AI advancement with "and it can plan travel itineraries." That's always held up as the golden example of planning, but it doesn't take you, I don't know, more than five seconds to go find an itinerary that someone else posted online and check Google Flights. But I digress. I just hate this example, because I think there are so many incredible use cases for these AI tools, and planning your travel itinerary is just not the number one thing I think of. But every single person demos that use case, every single time, without fail: "it can plan your travel itinerary." Okay, big whoop. Maybe it's a bigger thing for some people than for me, I have no idea; I usually just kind of wing it.
In any case, I do think this is interesting. This is available for all of their different tiers: Free, Pro, Teams, and Enterprise. I'm over on their website right now; if you're watching on Spotify or YouTube, you can see I'm sharing my screen, but otherwise I'll explain. Basically, on their website they have something that looks very similar to ChatGPT. You have the ability to do voice mode, so you can dictate whatever you're saying to it. They have a whole bunch of different tools: a code interpreter, image generation, a canvas, which they also just launched, and web search. You can select which tools you'd like it to use. And then, of course, they have this Think button now. I think they're pretty much copying how Grok rolled it out with a think button, which Grok has actually since removed; that capability, I believe, is probably just on by default now. But for a while there was a think button, and if you click the Think button, this is where it's going to do the quote-unquote deep research. So there are the basic fast answers, and then there's the Think button. They actually have something else that's really cool that I haven't seen everybody else do: with Think mode enabled, there's something called Pure Thinking and something called 10x Speed. With Think you're basically saying, beyond just your regular answer, I want your best response. But if you're not doing math or coding, it doesn't have to be so analytical, so you can do 10x Speed. It's like: I want you to think, but do it fast. Basically, it's for a question about something that's not insanely technical.
Now, if you have something that's insanely technical, if you're trying to do code, or math, or something where you need it to basically be a calculator, they have this Pure Thinking mode. You can toggle on Pure Thinking, and that says: you are a computer, a calculator; figure out this problem in a very analytical, math-and-coding way. It disables all the other tools that are trying to make the response happy and friendly and fun and high-EQ and whatever. None of that matters; just be an engineer and figure this thing out. So I do think it's really cool that they have that option. They also have some things they're calling agents, which are pretty cool. Overall, Mistral has a lot of really exciting things they've rolled out, and this new Deep Research is fantastic. It's not new, everyone else is doing it now, but I think it makes them competitive. A lot of people have asked: what do they do that's different? Why do people use Mistral? It's a pretty popular topic; every time I make a video about Mistral on YouTube, it's one of my best-performing topics, something people are always interested in. And Mistral essentially has a very interesting position. A lot of people in Europe, I think, use it because they're patriotic, maybe in France and across Europe. Don't hate me for saying that; I'm sure it's a pretty decent model too, but I think that's why a lot of people use it there. It's like: I don't want to just use American tools. So, you know, I get it, a little patriotic nationalism, or nationalism for the European Union, whatever that's called. In any case, one thing that I think is a very real use case, and a very real set of customers that they serve, is enterprise. They're doing some very interesting things for enterprise across the world.
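To make the mode distinction concrete, here's a toy sketch in Python of how a client might route a request between Fast, 10x Speed, and Pure Thinking. To be clear, everything here, the `Mode` names, the request shape, and the tool list, is my own illustration of the idea, not Mistral's actual API.

```python
from enum import Enum

class Mode(Enum):
    FAST = "fast"              # quick default answers
    THINK_10X = "10x-speed"    # reasoning enabled, but tuned for speed
    PURE = "pure-thinking"     # strict analytical mode for math and code

def build_request(prompt: str, mode: Mode) -> dict:
    """Assemble a hypothetical chat request for the chosen mode."""
    request = {
        "prompt": prompt,
        "mode": mode.value,
        # tools available in the regular chat experience (illustrative names)
        "tools": ["web_search", "image_generation", "code_interpreter", "canvas"],
    }
    if mode is Mode.PURE:
        # Pure Thinking disables the friendly, tool-using layer entirely:
        # no web search, no images, just the analytical engine
        request["tools"] = []
    return request
```

The point of the sketch is just that Pure Thinking strips everything else away, while Fast and 10x Speed keep the full chat experience.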
And this is what Salamanca said about this in particular: "A lot of customers who have very sensitive data don't actually use cloud services. Or if they do, they do it on their own premises with virtual private clouds." So in order to address this, Mistral, with Le Chat and their whole productivity suite, is connecting to the enterprise's data on premises. In other words, companies can use Le Chat, this deep reasoning, and all the other tools and capabilities to look at their own internal data without having to upload anything into the cloud. This is the concept that Apple is moving towards. Apple is, you know, chronically behind on everything AI, but with Apple Intelligence on the iPhone, the whole concept was: look, we're going to roll out an AI model, but it's going to be running on your phone; it's not going to have to connect to the cloud. That was what a lot of people were excited about with Apple Intelligence. Oh, this would be sweet: you basically have ChatGPT, but it's not pinging a server somewhere and sending my data there. Whether ChatGPT says it's secure or not, hacks happen, leaks happen. Who knows what shady stuff Meta has been doing for the last 20 years. That crap happens. So I think people are just generally skeptical of big tech companies with their data. And if you can have something that locks your data down to your phone, I can ask it a question and it's not leaking out. Say I'm talking about some rash I have on my elbow; I don't want that off in the cloud somewhere, saved in ChatGPT's memory, "Jaden gets rashes on his elbow" or something. These are basic things you don't want these AI models storing forever. So that's the cool thing about having it on device. Now, for users, this is obvious.
For companies, it's especially obvious. You have trade secrets, you have IP, you have all sorts of confidential things that you don't want leaked, that you don't want anyone to ever get access to. And so this is one of the big use cases that Mistral is serving, and I think it's a big differentiator against other companies like OpenAI, which is hosted on Azure, and Gemini, which is hosted on Google Cloud. Companies can use Mistral this way, keeping their data secure on premises, because Mistral has been releasing all these open-source models that people could just run on their own infrastructure. And once companies got used to them and started working with them, it also opened the door for Mistral to sell them the other tools in its productivity suite. So it's basically: we're using this as our foundational model, maybe an open-source version for quote-unquote free, but now we want to connect it with other tools, and Mistral will come in and sell them solutions for that. This is specifically what they said, quote: "A big part of our value proposition, especially when we released Le Chat Enterprise, is that we want to make things like Microsoft Excel and Google Docs, that kind of office suite, work seamlessly with our capabilities. So the connectors we released with Le Chat Enterprise are actually one step in that direction. We're building these connectors internally because we believe this is going to be key for using Le Chat as a productivity enabler in the business context." So overall, I'm really excited for this. There's a whole bunch of cool tools that have been rolled out other than just this deep reasoning thing. There are a bunch of other updates that I got today.
One of them is that, in the past, their reasoning model, the one that could do complex tasks, was only available in English. Now it supports French, Spanish, Japanese, and other languages, which is kind of funny, considering it's a French company, that it was only in English. The English-first rollout obviously targets the bigger user base, with French support coming after, but I still think that's kind of funny. It can also switch between coding languages mid-sentence, which is pretty cool. So you might be writing in Python and then all of a sudden switch to another coding language mid-sentence, which is very interesting. And then there's also the addition of Projects, which can essentially help users stay organized. You can group chats, documents, and ideas together, basically organizing everything you're doing. Each project can have its own default library, and it can remember what tools and settings you've enabled, which is really useful. So overall, Mistral is building a lot of these really cool products and tools, things that I've been begging ChatGPT to do for ages, so I'm really bullish on that. And some of the things they're doing with enterprise, the way they're able to host their models offline, locked down, for an enterprise to run on its own servers, I think that's amazing. It's just not something we're seeing OpenAI really serve. I guess they tried to do custom servers with Azure, and they do have some options there, but I think Mistral is doing this in a bigger way. So it's very, very interesting. And I think that with the addition of their new Deep Research, they're going to be a serious competitor in this space, assuming they can keep up. Everything's changing rapidly. Right now they're kind of caught up, but will they fall behind? We'll see.
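As a rough mental model of that Projects feature, here's a tiny Python sketch of a project that groups chats and documents, has its own default library, and remembers which tools you've enabled. The class and field names are my own assumptions for illustration, not Mistral's real data model.

```python
from dataclasses import dataclass, field

@dataclass
class Project:
    """One workspace grouping related chats, documents, and settings."""
    name: str
    chats: list = field(default_factory=list)
    documents: list = field(default_factory=list)
    library: str = "default"                            # each project has its own default library
    tool_settings: dict = field(default_factory=dict)   # remembered per project

    def add_chat(self, title: str) -> None:
        """File a new chat under this project."""
        self.chats.append(title)

# Example: a project that remembers web search is enabled
research = Project(name="Mistral research")
research.tool_settings["web_search"] = True
research.add_chat("Deep Research comparison")
```

The useful part of the idea is the last two fields: instead of re-enabling tools per chat, the settings live with the project.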
It's definitely not the company that releases things first, but it's a pretty serious competitor with some unique value propositions, so it's an interesting company. I'll keep you up to date on everything rolling out with Mistral in the future. Thank you so much for tuning into the podcast today. Make sure to try out AI Box AI if you want to check out the latest from Mistral and all of the other top AI models, all in one platform for $20 a month. I hope you guys all have a great rest of your day. I'll catch you next time.
