Transcript
A (0:00)
Meta has just rolled out a new feature inside of their AI smart glasses, the Meta Ray-Bans, that can now help you hear conversations better. This is a really interesting feature in what I believe is the next hottest product. There have been a lot of products out there that have essentially claimed they're going to replace the smartphone. We had the Humane AI Pin last year, which was a pin you clipped on your shirt. It had a projector that projected a display onto your hand, and the pitch was: never take your smartphone out of your pocket, you can do everything on this device. We had the Rabbit R1, which flopped. Humane obviously went bankrupt. Amazon recently purchased a company called Bee, which I bought and tested; it's a wristband that records your conversations, gives you notes, and helps you with stuff. None of these devices have been very hot topics, in my opinion. None of them have taken off, nor do I think they will take off as a replacement for the iPhone. The form factor that I 100% think will win is glasses, and right now Meta is leading the way in this, probably because Mark Zuckerberg spent billions and billions of dollars on, in my opinion wasted on, the Oculus and on VR, which is a cool technology, but I just don't think it's going to see mass adoption. I think Apple learned that the hard way with the Apple Vision Pro. But I think Zuckerberg stumbled upon an incredibly winning form factor, which is glasses. There's a speaker right next to your ear, a microphone embedded in the frame, and a camera right on the front, and the lenses will eventually be able to display projected augmented reality content so you can read books, watch movies, or see things on your lenses right in front of you.
So everything the Oculus and the Apple Vision Pro could theoretically offer you, but in a much smaller form factor that people are already used to wearing, so you don't have to strap a giant ski mask onto your face to use the technology. So I think this is going to be a winner, and I want to break down the new feature here because I think it's really interesting. But before I do, I wanted to mention: if you want to try all of the AI models I mention on the show, including Meta's, which has a bunch of really cool new open source models, or the latest from OpenAI, Google, Anthropic, Grok, ElevenLabs for audio, and a ton of cool image generators, go check out AI Box AI. This is my own startup that I've built. For $20 a month you get access to all of the top models in one place. You don't have to remember where you put stuff, you don't have to open a hundred tabs, and you don't have to pay subscriptions to a bunch of different models. So go check it out; it's AI Box AI, and there's a link in the description. All right, let's get back to the story about Meta. They just announced an update to their AI glasses that's essentially going to let you hear people better when you're in a really noisy environment. I think we've all been there: whether you're on a train or just in a really noisy crowd, someone next to you is talking and you can barely hear what they're saying, and it's kind of embarrassing when you have to keep asking them to repeat themselves. Initially, this feature is going to be available on the Ray-Ban Meta and the Oakley Meta HSTN smart glasses, just in the US and Canada. In addition, they're getting another update that lets you use Spotify to play a song that matches what's in your current view.
This is kind of a funny gimmick, but I think it highlights a really cool capability, which is that you can play Spotify on your glasses and listen to music through them. You may also be able to pair Bluetooth headphones if you want to do that. It's kind of interesting because, on the one hand, I think Meta does a pretty good job with the speakers inside these glasses not being too disruptive to the people around you, because the frame of the glasses rests right on top of your ear, so the speaker is really close to it. But at the same time, it is a speaker playing out loud. So while they do a great job of making it not too annoying for people nearby, if you're playing Spotify you probably want to have some headphones in as well. Here's a great example of when this new Spotify use case would apply: if you're looking at an album cover, the glasses can play a song by that artist. Or if you're looking at a Christmas tree with a pile of gifts, they could play holiday music. In one way, I think the feature is sort of cheesy, as if I'm going to look at my Christmas tree and suddenly want it to start playing Deck the Halls. But sometimes I do want some Christmas music, and it would be nice if I didn't have to search for it, if I could just look over at my Christmas tree and say, play some music like what I'm looking at. Essentially, it's a faster way of getting what you're trying to hear. So as a gimmicky feature, I don't think it has a lot of sticky value, but it does reduce friction. Say I'm literally looking at the beach and I say, play some music like this, and it starts playing the Beach Boys.
And I'm like, cool, this is what I would have wanted. Or I'm looking at a Christmas tree and it starts playing Christmas music: this is what I would have wanted. If it reduces the friction of having to type out a specific playlist or song, then yes, I do think that's awesome. How good it will be is really what's going to determine whether this is a good tool or not. But some people really do like music inspired by certain things. Recently a family member of mine who's a dentist had a birthday, and his wife was asking everyone to submit a song that reminded them of him. So I was trying to find a bunch of funny songs about dentists or something, and ChatGPT was really doing me dirty, not giving me any good recommendations. My brother ended up finding a whole bunch of hilarious ones and submitting them all. Anyway, I wish I had gotten better recommendations, but ChatGPT didn't give them to me, and it made me think: it really all depends on how good the AI model is at recommending. If Meta can solve that problem, I think it would be a great tool. If not, it could be a flop. So it all depends on that. The conversation focus feature, though, I think is a lot more practical. It's just getting rolled out now; they actually announced it earlier this year. It uses the AI glasses' open-ear speakers and essentially amplifies the voice of the person you're talking to. They say the smart glasses wearer will be able to adjust the amplification level by swiping the right temple of the glasses or in the device settings. So this essentially lets you set the level and make it really precise to match your current environment.
So if you're in a busy restaurant, a bar, a club, a train, or anywhere else, whoever you're looking at, it's going to amplify what they're saying but not the ambient sound of the room around you. Honestly, this is a really cool feature for noisy environments. How well it works will definitely need to be tested, just like the Spotify feature. However, I will say that the idea of using a wearable to help with hearing is not something only Meta is working on. Apple's AirPods already have a conversation boosting feature designed to help you focus on the person you're talking to by making that person a little bit louder, and I believe Apple's latest AirPods Pro models have added support for a clinical-grade hearing aid feature as well. So there are a lot of people working on this, but these are obviously the first glasses to embed the feature, and I do think it has a lot of great use cases. I think the conversation focus feature is limited to the US and Canada right now, while the Spotify feature works in many places where English is spoken. This is going to arrive as a software update; I think right now you have to join a waitlist and get approved, and it's going to roll out to everyone later. But regardless, I think this is a good idea of what's coming down the pipe, what we can expect from the Meta Ray-Bans: translating other languages, amplifying the voice of the person you're talking to, just one really insane use case after another. These glasses, I think, are going to be one of Ray-Ban's and one of Meta's biggest sellers, and of course they're doing it with Oakley as well.
So it's not just Ray-Ban, but I think this is going to be one of Meta's biggest sellers, and it probably will make Zuckerberg back all of the money he lost on the metaverse, which I just don't think is much of a happening place these days. All right, thank you so much for tuning into the podcast. If you enjoyed the episode or learned anything new and interesting, make sure to leave a rating and review on the podcast. It essentially just helps more people find the show; it boosts me in the algorithm and helps the show out a lot. So if you want to say thank you, that would be the best way to do it. Drop some stars on Spotify on the about tab, or leave a review over on Apple. Thanks so much for tuning in. I will catch you guys in the next episode.
