You're watching TVPN. Today is Meta Connect 2025 and we are live from the fortress of followers, the villa of virality. It's Mount Metaverse, baby. We are here in Menlo Park at Meta HQ to break down all the good stuff coming out of Meta Connect 2025. And there's a lot; a massive lineup. But first we wanted to reflect on the past year, which has been remarkable and busy. I remember in one of our first episodes, we reviewed the Meta Ray-Bans. I purchased them myself. So did I. This was coming off of Meta Connect 2024. We hadn't started the show when Meta Connect 2024 happened. We sat down in a conference room at the Jonathan Club, turned on the microphones and the cameras, recorded and chatted for a while. And I put on the Meta Ray-Bans and filmed you talking about it. And you filmed me talking about the Meta Ray-Bans. Yes, yes. And your prediction was that in the future you might have multiple sets of glasses for different occasions: some work glasses, some workout glasses, maybe a VR headset. That we're entering this era of spatial computing and augmented reality devices, head-mounted displays, all sorts of different stuff. And so it's just been a remarkable, remarkable run. And I think we also, in that episode, if I remember correctly, were talking about the importance of leveraging existing silhouettes, which they've done incredibly well. Yeah. With Luxottica. And so we have a wonderful show. We are of course distributing this across the Internet with Restream. Restream.com: one livestream, 30-plus destinations. And we also just wanted to say thank you to the advertisers who have been with us from day one. AdQuick.com: out-of-home advertising made easy and measurable. Getbezel.com: you know Mark Zuckerberg loves his watches. Get yourself one at Bezel. Public.com: investing for those who take it seriously. They have multi-asset investing. They're trusted by millions. Eightsleep.com: exceptional sleep without exception. Fall asleep faster.
Sleep deeper and wake up energized. Wander.com: book a Wander with inspiring views, hotel-grade amenities, dreamy beds, top-tier cleaning and 24/7 concierge service. Find your happy place. Find your happy place. Yeah. These are the advertisers that have made this show possible from day one. And look at where we are now. We're very happy to be here. But first let's go through the lineup today. It's an offensive lineup. The offensive lineup. Yes. So on the Meta team, the Metamates who will be coming on the show: we have Chris Cox. He's the Chief Product Officer. He joined Facebook in 2005, the year after the company was founded. He was one of the first 15 software engineers and played a role in the development of News Feed. Then we've got Adam Mosseri coming on. He's the head of Instagram. He joined Facebook as a product designer in 2008. In 2009 he became the product design manager, and in 2012 he became the design director for the company's mobile apps. Connor Hayes is coming on, too. He's the head of Threads. Great name. Yes, fantastic name. Although he has an E. Yep, added that in there. He did. He joined Facebook in 2011. He served in various product roles across Meta and Instagram over the past 14 years. He was VP of Gen AI before building Threads in 2023. Alexandr Wang's coming on, the Chief AI Officer. Yes. He briefly attended MIT, had a stint as an algorithm developer at the high-frequency trading firm Hudson River Trading, dropped out to co-found Scale AI. US Physics Team. Yeah, US Physics Team, and I think IMO or IOI, one of those two. He was pretty top-tier. Smart kid. Smart kid. Built Scale into a behemoth and wound up doing a deal to come over here and lead the Meta superintelligence team. And so we're going to run through everything that he's doing to build the team at MSL and some of his plans. Although obviously this event is focused on Meta Connect, it's focused on some of the hardware.
A lot of the hardware that we'll see. We're excited to talk to him about how he's thinking about integrating AI into all these different sets. Yeah. Then we have Andrew Bosworth. Boz, Chief Technology Officer, head of Reality Labs. Boz began his career working for Microsoft as a developer on Microsoft Visio. In 2006, Bosworth received a call from a recruiter looking for a candidate with a background in artificial intelligence. In 2006! 2006, they're like, who knows AI? Boz gets the call, joins as one of the first 15 engineers at Facebook. Then we have Eva Chen, VP of fashion partnerships. She joined Instagram in 2015. We have Mark Zuckerberg, the man who needs no introduction. Alex Himel is the VP of Wearables. He's been at Meta for over 15 years. Alex has played a key role in developing products like the Ray-Ban Meta smart glasses and Orion, which we got a demo of recently. Had a lot of fun with that. We've had three significant demos. Yeah. The first one a few months ago, the second one last week, and then one today. One today. And fun fact about Alex Himel: he met his wife at Meta. And then lastly, closing out the team from Meta who's coming on the show today, we have Vishal Shah, the VP of the Metaverse. He joined Meta on the Instagram team back in 2015. So he's also been there a decade. Product legend. Product legend. So if you're looking to go on the offense, go to attio.com. Attio is the AI-native CRM that builds, scales and grows your company to the next level. On the defense, what's going on in the rest of the market? What's going on in the rest of the tech world? Who's paying attention? Who's watching today? Tim Cook. Definitely watching. We saw this with the iPhone launch event. The iPhone Pro Max comes out with a vapor chamber. Very cool.
But a lot of people were focused on what was going on with the iPhone Air, because when they showed the cross sections of what's going on internally, it seemed like they had shrunk the entire computer down to just the bump, and the rest was just screen and battery. And so a lot of people are saying that Apple is going to maybe take that miniaturized phone and put it into another device, maybe glasses. They've already done the Apple Vision Pro. Not a huge success there. They're still finding their footing, licking their wounds, watching today to figure out how they need to react next. Then you have OpenAI. We know Sam Altman didn't hire Jony Ive just to film cinematic coffee chats. They're building something. Spent a couple points of the company to make that happen. They're going to launch something. We've heard rumors. What was the rumor? That it was something like telepathy. So you speak. Totally unclear. Yeah. You speak without actually raising your voice to the point where someone can actually hear you across the room, and that can go into some sort of device. What you do know is that it's positioned as a third device. It's not your laptop, it's not your phone. Something else. Yep. Yeah, yeah, yeah. Sam's been pretty clear that it's not a phone. And then there's a startup that's doing something similar in the telepathy space. We saw their launch video. Blanking on the name. It went very viral recently. The other company that's probably watching is Waves. Yeah, Waves. They went very viral recently. A lot of people were pissed off about the product. It was basically ruining everything. Yes. So they're making a device that will offer perpetual livestreaming on your face. And it's worth noting that the devices today are not focused on livestreaming. Yeah. And it seemed like people were excited about the video for Waves, and they were excited about the
The actual technology and the ability to livestream. They just didn't like the fact that you could turn the light off. Right. Yeah. The privacy light, knowing when someone's recording, seemed to be something the community really wanted. Yeah. So we'll see. Maybe the founder will change his tune and switch up the way the product's built so that you can't turn that off. That might be something that people just demand. But at least until now, I'm sure they'll be watching closely to see what's coming out of Meta today. Then you have Elon. There's big news from SemiAnalysis. The massive Colossus 2 cluster is coming online. Elon simply refused to be GPU poor. He's using a ton of interesting techniques to generate power. We're going to go through some of that. And then of course, whenever you launch anything on the Internet, you're going up against the timeline. Yep. But we were talking about this. This has been interesting, right? There was a leak earlier this week. We didn't cover it closely, but people were very excited about the releases. It was immediately a good reaction. If you're going to have something leak, better to at least get a positive reaction. Yeah. And I was trying to compare this to previous tech launches from this year. Let me grab my page first. Got a little wind here. A little weather. Need to put my Ray-Bans on here. I was thinking about why the response has been positive. Tech people are fickle and skeptical of everything. And it does feel like wearables are underhyped right now. Totally underhyped. Right. But it's delivering science fiction today. Yeah. And the devices that we're talking about today are all pretty much immediately going to be available. Yeah. And at least compared to the previous Meta Ray-Bans, I feel like the original Meta Ray-Ban launch was kind of a surprise. It kind of seemed like this offshoot.
It didn't have the same, oh, this is going to take over everything. Like VR. Like VR. You immediately go into, are you going to be living in a virtual world, are you going to be doing everything in VR? And the Meta Ray-Bans were just like, look, it's something you're already wearing, it's fashionable, and now it just has a little bit more technology on it. It's a great camera. Great camera. Headphones. Yeah, and then the headphones. And then eventually, oh, you can also talk to Meta AI through it. And then people get excited about that and whatnot. But it seems like the timeline is primed for this. I think people have been so used to getting heads-up display demos and then not actually getting to experience it themselves. Right. Not getting to experience something that's at the quality level of the demos provided. Yes. And we've done the demos. Yes, it's real. Yes, you're going to be able to walk around with a heads-up display. And really, I think we're obviously going to watch the livestream ourselves and be able to react to it. But I think people are going to be incredibly impressed by a number of the new features and functionality. Yeah, totally. Other big tech companies will be watching, of course. You've got Google. At I/O they announced something that looked like glasses with potentially a heads-up display. Google of course launched Google Glass years ago. Couldn't get that project fully off the ground. Dipping their toe back into it with a bit of a vision document, a vision presentation, no firm timelines, and it was sort of unclear from the Google I/O presentation whether this would be something that they're merely building software for and then handing off to partners. Like they do with Samsung. Like Samsung, exactly. But you know, obviously some focus there. And then Amazon. There's been other news with Amazon. Right.
So they're trying to create devices for their workforce first, focusing more on effectively being the customer themselves. They have millions of employees globally, but their plan is to leverage the learnings from that, the scale from that, and take it in a consumer direction over time. Yeah. Well, if you're planning your big next move, you need to meet the system for modern software development. Linear is a purpose-built tool for planning and building products. And that takes us into the timeline, the news, what's going on in the tech world. So the huge news today out of the Financial Times is that China has banned the import of US chips. This has been going on for a while. We were talking to Bill Bishop about this a few days ago. The quote that Deedy Das shares is: Beijing's regulators recently summoned domestic chip makers such as Huawei and Cambricon, as well as Alibaba and Baidu, to report how their products compare against Nvidia's China chips. They concluded that China's AI processors had reached a level comparable to or exceeding that of Nvidia's products allowed under export controls. That's currently the H20s. Now, when we were talking to Bill Bishop, it seemed like Nvidia was already working on a successor, a version of Blackwell. And so there's obviously been a ton of pressure out of Beijing to reduce the amount of American chips and continue to drive domestic production, with Huawei going down the learning curve no matter how painful it is. Yeah. And the question was, are they going to rip the band-aid off and just decide, hey, we're willing to set back our industry slightly in order to gain a long-term competitive edge on the manufacturing side? Yeah. This feels like. I don't know. Yeah. Interpreting it in the context of how hot the AI race is: David Sacks has been saying that we're not in this hot AI war. Bill Gurley as well was talking about how we're not in this hot AI war.
This fast takeoff where you have to do it. It's much more like just a little bit of additive value to your economy. Yeah. China's internal AI planning docs reflect this. Right. It's like, we're going to drive efficiency in industry using artificial intelligence. Yeah. It's not necessarily machine god yet. The machine god of war. I mean, at least that does seem to be the interpretation. You would think that you would get as many chips as possible if you thought it's happening this year. Yeah. Like, don't worry about our local manufacturing. Just get the chips, train the model, and then you have it. But clearly this is, and remember, this is all following up a couple of weeks ago, when domestic players were being encouraged not to buy US chips. So it was certainly something where they were saying, hey, we don't want you buying US chips. Yeah. But it wasn't a hard and fast rule. Yeah. This, you know, is them coming in with the ban hammer. Yeah. Well, let's go through this Financial Times article a little bit more. China's Internet regulator has banned the country's biggest technology companies from buying Nvidia's artificial intelligence chips, as Beijing steps up efforts to boost its domestic industry and compete with the US. The Cyberspace Administration of China, the CAC, told companies including ByteDance and Alibaba this week to end their testing and orders of the RTX Pro 6000D, Nvidia's tailor-made product for the country, according to three people with knowledge of the matter. Nvidia's shares fell around 3% on Wednesday. Did you see? Jim Cramer said he's excited about AMD because they're going to be able to sell video game graphics cards into China. That feels extremely temporary. It feels like the headline is no Nvidia, but the broader context here is specifically an Nvidia ban. Yeah, that's what the Financial Times is reporting. Yeah. Theoretically AMD could go around that. Yeah, apparently.
But I mean, we'll have to see how long that lasts. Yeah, this feels like something that's coming out of leaks, not necessarily the final law. Look and see what AMD is doing today. Yeah, yeah, look it up. Several companies had indicated that they would order tens of thousands of the RTX Pro 6000D and had started testing and verification work with Nvidia's server suppliers, the people said. After receiving the CAC order, the companies told their suppliers to stop work. The ban goes beyond earlier guidance from regulators that focused on the H20, Nvidia's other China-only chip widely used for AI. It comes after Chinese regulators concluded that domestic chips had attained performance comparable to those of Nvidia's models used in China. That's of course what their process was: regulators deciding these are actually good enough. Well, it's this weird dynamic, because Huawei was saying the same thing. Huawei's incentivized to say, yes, we're as good as Nvidia. And then the Chinese regulators are saying, well, Huawei says it. So for the rest of the companies, everyone who would be buying from Nvidia: go read the Huawei press release. Do you recall SemiAnalysis doing any type of direct comparison? They did, yeah, they did. And the main result, if I'm remembering roughly, was that you can train on them, but it's not as energy efficient, right? Yeah, yeah, just more energy-costly, more expensive. But China's got energy. They do have energy. Exactly. So Jensen Huang, the chief executive of Nvidia, told reporters in London on Wednesday that he expected to discuss the chip maker's ability to do business with China with Donald Trump that evening during the President's state visit to the UK. Quote from Jensen Huang: he says, we can only be in service of a market if the country wants us to be.
There's not much Trump can do unless he makes this a part of the conversation in Madrid. Yeah. Did you see the other news? Palantir today signed a billion-dollar contract with the UK. 750 million. No, £750 million, which I believe translates to just over 1 billion USD. Yeah. Let's hear it. Beijing is putting pressure on Chinese tech companies to boost the country's homegrown semiconductor industry and break their reliance on Nvidia so it can compete in an AI race against the U.S. The message is now loud and clear, said an executive at one of the tech companies. Earlier, people had hopes of renewed Nvidia supply if the geopolitical situation improved. Now it's all hands on deck to build the domestic system. Nvidia started producing chips tailored for the Chinese market after former U.S. President Joe Biden banned the company from exporting its most powerful chips to China, in an effort to rein in Beijing's progress on AI. Beijing's regulators have recently summoned domestic chip makers such as Huawei and Cambricon, as well as Alibaba and search engine Baidu, which also make their own semiconductors, to report how their products compare against Nvidia's China chips, according to people familiar with the matter. They concluded that China's AI processors have reached a level comparable to or exceeding that of Nvidia products allowed under export controls. And this was sort of the messaging from Jensen and Trump when they were talking about the H20. They were saying, everyone knows the H20 as the China-compliant chip, but it's been years, and so the market has moved on and Nvidia has more advanced product like Blackwell. And so we are talking not only about a chip that was nerfed on memory, interconnect and a few other characteristics that make it more compatible with the trade regime, but it's also just old at this point.
Yeah, the immediate thought I have is, what does Nvidia do with the huge R&D center they've been building out in China, right? At a certain point, I mean, they still have a lot of talent there. Is it more important than ever? Because you've got to be, you know, talking to Beijing and convincing them to buy the next thing. You know, it's an olive branch. So you've got to be pushing it. Right now it's so over. But you think we could get to the point? I mean, we'll have to talk to, you know, the regulars on the show, but this entire year has been back and forth with this story. Yeah. The stock's down roughly 3% today. Do you think it'd be more if. Do you think the market's kind of calling China's bluff? There's just so many different dynamics, where there's, you know, cloud providers that are outside of China that will still be able to buy. There's, you know, ways to funnel chips through to China, like the DeepSeek story: where did all those chips come from? There's so many different dynamics. But even if you cut off China entirely from Nvidia, that's not the bulk of their business. They can still sell to American hyperscalers. Yeah. They can sell to clouds based outside of China. American clouds buy a ton of Nvidia chips and they want to buy more and more and more. So demand from mainland China can still leverage international clouds to some extent. To some extent. Not entirely. You can't just go to AWS. But you know, there are certainly jump-ball countries that are kind of playing both sides. Yeah. The Financial Times reported last month that China's chip makers were seeking to triple the country's total output of AI processors next year. The top-level consensus is there's going to be enough domestic supply to meet demand without having to buy Nvidia chips.
And Nvidia introduced the RTX Pro 6000D in July during Huang's visit to Beijing, when the US company also said Washington was easing its previous ban on the H20 chip. China's regulators, including the CAC, have warned tech companies against buying Nvidia's H20, which we've talked about, asking them to justify having purchased them over domestic products, the FT reported last month. The RTX Pro 6000D, which the company has said could be used in automated manufacturing, was the last product Nvidia was allowed to sell in China in significant volumes. Alibaba, ByteDance and the CAC basically didn't respond to requests for comment, of course. Well, the other news today that we have to cover: the Fed made the first rate cut. 25 American bips. Yes. Polymarket, our sponsor, says breaking: the Polymarket for today's Fed decision has surpassed $200 million in volume, making it the largest FOMC prediction market in history. And Polymarket is projecting two more rate cuts this year. From the Wall Street Journal: Fed lowers rates by a quarter point, signals more cuts are likely. Concerns about a job market slowdown are overriding jitters about inflation. In a pivot toward a shallow sequence of rate reductions, the Federal Reserve approved a quarter-point interest rate cut Wednesday, the first in nine months, with officials judging that recent labor market softness outweighed setbacks on inflation. A narrow majority of officials penciled in at least two additional cuts this year, implying consecutive moves at the Fed's two remaining meetings in October and December. The projections hint at a broader shift toward concern about cracks forming in the job market, in an environment complicated by major policy shifts that have made the economy harder to read.
The recent declines in the growth rate for both the number of people looking for jobs and those gaining employment have, quote, certainly gotten everyone's attention, Fed Chair Jerome Powell said at the news conference. Powell, who referred to, quote, downside risk six times at the news conference in July, said on Wednesday that downside risk is now a reality. The Fed's carefully drafted post-meeting statement pointed to those concerns when it said the rate cut was justified in light of the shift in the balance of risks. The statement no longer described the labor market as solid. Yeah, what's your take on this? I feel like when we looked through the Sun Valley transcript from Powell, we were seeing lots of mentions of inflation. That was mostly because they were kind of redefining the definitions and working through some kind of jargony issues. It wasn't really an inflation-focused talk necessarily. But there is this interesting dynamic where there's not a lot to point to: gold at all-time highs, Bitcoin at all-time highs, stock market at all-time highs, Nasdaq. That's data you can trust, because you can go to your brokerage and see prices. Yes. The data that I don't think people have a lot of faith in at all is job market data, labor market data, because it just gets revised up or down after the fact. And of course it's tricky, but that just makes it, you know. Again, I mean, the big concern is stagflation, right? So 11 of 12 Fed voters backed the quarter-point cut. Fed Governor Stephen Miran, who served as a senior White House advisor until his confirmation by the Senate to the central bank's board this week, was the lone dissenter. He favored a larger half-point cut. That makes sense given the White House connection. The projections underscore how coming decisions could be more contentious.
Seven of 19 meeting participants penciled in no further rate reductions this year, and two more penciled in only one more cut. And they show that most officials don't expect to make many more reductions next year under their current outlook for solid economic activity. And yeah, I mean, the reaction from the timeline has not been fantastic. Have you seen the 10-year resurging back up? Not what you want to see. I mean, the hope with rate cuts is mortgages get more affordable, right? And with a higher 10-year, higher for longer at the long end of the yield curve, you're going to see just less home affordability. And so hopefully this does kind of ease markets to the point where you can solve the hiring problem. But are management teams thinking, oh, we got a quarter-point reduction, let's hire a bunch of people? I don't think so. Maybe, right? Like, the stocks go up, it gets easier to raise money, and so you raise more money, you hire more people. That's certainly the startup world, and rate cuts should work their way through all the way to the venture markets, and every company should be breathing a little bit easier with lower rates. And so it should have an effect on the job market. The question is just how quickly. Yeah, yeah. And there is a lot. President Trump has berated Fed Chair Jerome Powell for months over the central bank's reluctance to cut rates. Senate Republicans confirmed Miran to his seat on Monday night, and he was sworn in just before the Fed's two-day meeting began on Tuesday morning. Miran, who is on unpaid leave from the White House, has said he could go back when his Fed term expires early next year. And it goes into a bit of the Lisa Cook debacle history here.
Between September and December of 2024, the Fed cut rates by 1 percentage point, lowering them from a two-decade high to prevent unnecessary weakness in the economy after a substantial and broad decline in inflation. But officials paused cuts after that amid signs of stronger growth and potentially stickier inflation. Officials are navigating an economy reshaped by sweeping policy experiments. Trump has imposed tariffs that far exceed those of his first term, raising costs for manufacturers and small businesses. The full effects on consumer prices remain unclear as companies adjust supply chains and pricing strategies. Sharper curbs on immigration could be contributing to a slower pace of job gains by reducing labor force growth. So we'll keep tracking this. We'll have to catch up with Joe Weisenthal, hopefully get the update. Our financial brother. Yes, of course. So the other big news is from SemiAnalysis: xAI's Colossus 2 is the first gigawatt data center in the world. And we don't have time. We are going to run into the live keynote commentary in just two minutes. But the interesting takeaway is this picture. Wow. Yes. So Colossus 2 is technically in Memphis, Tennessee, but according to the SemiAnalysis team, Memphis and Tennessee have been getting a lot of pushback. So xAI's genius move was to develop a gigawatt-scale energy hub right across the border in Southaven, Mississippi. And so you can see it on this map. We'll have to pull up the photos. They're hacking the world. Yeah, it really is this crazy arbitrage. I think Dylan Patel called it 4D chess that only Elon can do, or something. 2D chess. It really is 2D chess. It's like, you look at the map and it looks like chess. But yeah. Behind regular chess? Maybe behind enemy lines. I'm not exactly sure what the correct analogy is, but I mean, it's a good bet.
You go over to Mississippi, you talk to the mayor of Southaven, and you say, do you want me to hire a bunch of people in your state? Mississippi says, sounds great, let's do it. The benefits of the American state-based system, right? Every state can compete for jobs, for business, for energy production, whatever it takes to get it done. Also today, I think Elon said or claimed that Grok 5 will begin training in just a few weeks, and he thinks that Grok 5 will be capable of reaching AGI. And so I'm not even sure how we're benchmarking that or quantifying that. Grok has obviously been doing fantastic on ARC-AGI, our favorite. Everybody has their own definition. Everyone does. We have goalposts we're going to keep moving. We're going to keep moving. We are in the goalpost-moving business. What have you done for me lately, foundation labs? That's what I like to say. Exactly. Anyway, the keynote is starting in just 50 seconds. We are going to be broadcasting it live and giving you commentary on the keynote from Meta Connect. I am impressed that we've gotten this far without leaking anything. We have a lot of strong embargoes, but we pulled it off. We did it and we got through. Capital, James. Yes, exactly. Well, Amanda Goodall on X says, if your interview process takes longer than electing a Pope, you're doing it wrong. Currently, the Pope is elected in two days. Two days. Your remote hire doesn't need five rounds of interviews. I said what I said. And Gabe says, yeah, well, I bet the Pope can't debug a distributed system. 40% in 2025. That is crazy. Everything is up. Everything is up. Gold, Bitcoin, the market, everything is ripping. We have some friends who predicted this. I remember in our group chat one of our buddies was saying, golden bull run. Also, somebody put a golden statue of Trump holding a physical bitcoin right near the White House. Can you just put up statues? We talked about that, right? Maybe.
I mean, we talked about this with the original Wall Street bull, right? The guy built the bull, the famous Wall Street bull, in his apartment and then just dropped it off. But there was a Christmas event at the time, so he had to leave and come back and sneak it in there and put it down. I guess you can just make statues and put them down in the real world. The other news in the venture world: artificial intelligence chip startup Groq raised $750 million at a post-funding valuation of $6.9 billion. They said, yeah, Groq will be a hundred-billion-dollar company if it doesn't get bought before then. We've debated on the show before who would be a potential buyer for Groq. Yeah, a number of players. But we'll see. This is a fantastic milestone. We'll have to have Jonathan on again soon. Yeah, we've had a couple of the folks on the show. Alex Cohen had a great post here. Salesforce on Salesforce. So last fall, one of Salesforce's technical teams told large Salesforce customers that using Agentforce, the software firm's artificial intelligence for automating customer service and other functions, would require extensive planning. The product information, the Salesforce. We're going live, Jordan. Going live. It's time. Three, two, one. Here we go. There he is. There he is. All right, here we go. No way. Here we go. Wow. Throw on some tunes. Live demo, ballsy. High risk, high reward. And I will say the speakers in the new Meta Ray-Bans have improved dramatically. Yeah, there you go. You know, just ripping emojis on the way in. Hey, there's Diplo. Good to see you, Wes. So the glasses can support livestream? They must. Maybe that's the one more thing. We'll see. Here we go. Packed house at Meta Connect 2025. Here we go. We'll talk about these in a minute. Here we go. Welcome to Connect. All right. AI glasses and virtual reality.
Our goal is to build great looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms. And these ideas combined are what we call the metaverse. Now, glasses are the ideal form factor for personal superintelligence because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more. Glasses are the only form factor where you can let an AI see what you see, hear what you hear, talk to you throughout the day, and very soon generate whatever UI you need right in your vision, in real time. So it is no surprise that AI glasses are taking off. This is now our third year shipping AI glasses with our great partner, EssilorLuxottica. And the sales trajectory that we've seen is similar to some of the most popular consumer electronics of all time. Now, we are focused on designing glasses with a few clear values. Number one, they need to be great glasses first. Before we get to any of the technology, the glasses need to be well designed and comfortable. And if you're going to wear glasses on your face all day, every day, then they need to be refined in their aesthetics, and they need to be light. So in addition to working with iconic brands, we have spent years of engineering obsessing over how to shave every fraction of a millimeter and portion of a gram that we can from every pair of glasses that we ship. And I think that that shows in the work. Number two, the technology needs to get out of the way. The promise of glasses is to preserve this sense of presence that you have when you're with other people. Now, this feeling of presence, it's a profound thing. And I think that we've lost it a little bit with phones, and we have the opportunity to get it back with glasses.
So when we're designing the hardware and software, we focus on giving you access to very powerful tools when you want them, and then just having them fade into the background otherwise. Number three, take superintelligence seriously. This is going to be the most important technology in our lifetimes. AI should serve people, not just be something that sits in a data center automating large parts of society. So we design our glasses to be able to empower people with new capabilities as soon as they become possible. We think in advance about what kind of sensors are going to be necessary, and we make it so you can just update your software and make your glasses and yourself smarter and direct AI towards what matters most in your life. All right, so with all that said, we do have some new glasses to show you today. I want to start with these: the next generation of Ray-Ban Meta glasses. These are the original iconic design. I think that this is actually the most popular glasses design in history. Underrated how smart the Luxottica partnership was. That was a great partnership. Now, double the battery life. I wear them all day. They never run out of battery. It's got 3K video recording, double our previous resolution, for sharper, smoother and more vivid videos. This feels very fast, Ray-Ban Meta. Really good pacing. And Meta AI keeps on getting better. So last year I did this live demo translating live between two people. We were doing that on stage. Now, today, I am excited to introduce a feature that we call Conversation Focus. It's a new feature coming soon that is going to be able to amplify your friends' voices in your ear. So if you're in a noisy restaurant, you're basically going to be able to turn up the volume on your friends or whoever you're talking to. This feature's crazy. Yeah.
And Conversation Focus, it's not only going to be on the new Ray-Ban Metas, it's going to be available as a software update on all of the existing Ray-Ban Metas too. This feature makes being in a loud restaurant bearable. Or being at a concert. People are into wearables at concerts. Check out how this works. Hi, Johnny. Hello. How are you? Got the Renaissance vibes going on. What's going on, baby? Jack, I just put my name in. It's going to be a couple minutes. Nice. I need your advice. Okay. Every time I get my picture taken, I feel like I'm not being normal. I want to feel like just a regular person. One sec, Jack. Hey, Meta, start Conversation Focus. Starting Conversation Focus. Okay, go on. As soon as the camera comes up. Yeah. I start to have this, like, serious deer-in-headlights thing. Yeah. How do I be more normal? How do I be more natural when I'm getting my picture taken? Sometimes I play around with something, like your collar. Fix your sleeve a little bit. Pretend, sort of act, like nobody's around. You know what I mean? Good demo. All right. That's the value of having a camera on the headphones. Conversation Focus. All right. We are also improving live AI as we optimize battery and energy efficiency. Meta AI is going to transition from being something that you invoke when you have a question to a service that is running all the time and helping you out throughout the day. Now, to be clear, we're not there yet on all-day live AI use. This is one of the major technology challenges that we're still working through. But today you can use live AI for about an hour or two straight. So to get a feeling for what this is like, cut to chef Jack Mancuso, who's coming to us live from a kitchen on Meta's campus, preparing for the after party. How's it going, chef? All right, so what do you think? Maybe let's make.
I don't know, what should we make? Maybe a steak sauce, maybe a Korean-inspired type thing, just to show what the live AI is like. Yeah, let's try it. It's not something I've made before, so I could definitely use the help. Hey, Meta, start live AI. Starting live AI. I love the setup you have here with soy sauce and other ingredients. How can I help? Hey, can you help me make a Korean-inspired steak sauce for my steak sandwich here? You can make a Korean-inspired steak sauce using soy sauce, sesame oil. What do I do first? What do I do first? You've already combined the base ingredients, so now grate a pear to add to the sauce. When I first got my pair of Meta Ray-Bans, I'd wear them when I'd walk around, when I'd take my dog for a walk. They're showing the most cutting-edge thing, the thing that you can only do with a camera looking at the actual ingredients, piecing it all together. But I would just ask it about the history of the Roman Empire and just be talking to it like it was any other LLM. And I'd love to know the actual breakdown of Meta AI queries that come through the glasses. How many are the uniquely unlocked ones? When you talk to a lot of folks that use Meta Ray-Bans, a lot of them will say, yeah, I take calls on them. That's not unique to the glasses, but it's important to have that function. Yeah, yeah. But you need to have the unique unlocks, the key features that can only be done when you put this particular set of technologies together. That's the key demo. But a lot of times, with so much technology, we wind up just using it for messaging, just using it for knowledge retrieval, that type of stuff. Last year at Connect, they also released limited edition clear frames. I got them right here. And they were pretty popular.
They sold out in a few days. So we've got a new edition. You can see the internals here, with two colors. Get them quickly, because they're probably going to be sold out in a few days too. Now, it's been pretty fun to see how people have taken Ray-Ban Meta in a lot of different directions. Some of you are probably familiar with the fashion label Luar, run by Raul Lopez. Are you? I am not, actually. He's a bold designer who's bringing together sportswear and high fashion. He recently debuted a look that's centered on Ray-Ban Meta at New York Fashion Week. Raul's actually here today along with Christy Baez, modeling the look that he created. Here we go. There he is. Awesome. Good to see you. All right. That's the next generation of Ray-Ban Meta. We're really excited about this. They're available now, starting at $379. This summer, we launched our first pair of AI glasses with Oakley, the Oakley Meta HSTN. This next announcement is the one I almost leaked. Did he talk about slow-mo? Oakley: synonymous with sports for 50 years. Now they're available in a number of great colors. And by the way, on the screen right now, you can see a massive skateboard ramp to the left of John. Brand new. And I don't think we want to dox who's going to be skateboarding in a little bit, but we were seeing practice earlier. Look at these: the Vanguard. Now this is the iconic Oakley aesthetic. These glasses are designed for performance. We pushed battery life: you can run a marathon using them the whole time on a single charge, and then you can turn around and run another marathon on the same charge and still not be out of battery. And these are incredibly light, so you don't feel like you're holding a camera at all.
It's got a wider 122-degree field of view, so you can capture all the epicness of your adventure in 3K. And it's got video stabilization, so that means that as you're going down a trail, you're going to be able to capture some really great video. All right. The open-ear speakers are the most powerful speakers that we've shipped yet, 6 decibels louder than Oakley Meta HSTN. So they're great for running on a noisy road or biking in 30 mile an hour winds. You know, I actually took a call on a jet ski a few weeks ago. It was great. Was the other person fine over the engine? Our advanced wind noise reduction makes it so that you can basically be standing in a wind tunnel and you'd still come in clear to the person on the other side. The person had no idea I was on a jet ski, which is good. All right. We've added slow motion and hyperlapse capture modes so you can capture your adventures in new ways. These modes are also going to be available on all the new glasses that we're announcing here: the new Ray-Ban Meta and the new Oakley Meta HSTNs too. And you can trigger these with Meta AI. Have you ever done a hyperlapse? Are you familiar with this? It takes a bunch of photos, stitches them all together, and smooths everything out. We're partnering with Garmin, so the glasses can capture video when you reach certain speeds or different distance intervals, like every mile of a marathon. And then when you're done, we'll just stitch together all the videos for you and you can overlay the stats on top of them, and you get a nice video that you can share wherever you want. That goes wild. And we're also partnering with Strava, so you can overlay your stats from Strava too and share all the same type of content with your Strava community. All right. This is big. We put an LED in them so that it can light up in your peripheral vision to help keep you on your pace target or heart rate zone target.
So that's going to be really useful. I didn't catch that on the demo. That's very cool. It works with your Garmin device too. These are also our most water-resistant glasses yet, with an IP67 rating. They can get wet. I've taken them out surfing. It's fine. It's good. I'm going to put this to the real test. Two-wave hold-down. You know about a two-wave hold-down? No. Normally if you fall, the wave will roll over you and you'll come up. A two-wave hold-down is when another wave. Oh. So you have to wait for the second one to come in. And so you'd potentially be sitting basically under the surface, if you're lucky, kind of waiting for it to roll over and then pop up. Do you wear sunglasses when you surf? No, but I'm going to now that I can. I could see you going surfing and bringing a pair. Is that just the nerdiest thing you could possibly do? What's the right style? Okay, cool. Yeah, I would definitely be caught wearing some scuba gear while surfing. A hat? Yeah. You don't wear a hat? I feel like the hat flies off immediately. Yeah. It's hard to keep on even if you have it strapped on. Yeah. What are those things that go behind your head? A strap or something. Look at this. This is a Red Bull helmet, right? Yeah, the best. The camera's in the center, so I think it fits inside a helmet better. Also lighter. Yeah. I do wonder if they'll work with Oakley on a ski goggle. Oh, yeah. That would make a ton of sense. Probably leaking the next thing. No inside knowledge. No. But it does make sense. Oakley makes ski goggles now. Yep. Now, the announcement you've all been waiting for: Oakley Meta Vanguard.
All right, we are selling them for $499. Pre-orders start now and we're going to ship them on October 21st. Priced to sell and shipping fast. Anons everywhere, rejoice. All right. Now let's check out those glasses I walked on stage with. Here we go. All right. We've been working on glasses for more than 10 years at Meta. And this is one of those special moments where we get to show you something that we've poured a lot of our lives into and that I just think is different from anything that I've seen anyone else work on. I am really proud of this and I'm really proud of our team for achieving this. This is Meta Ray-Ban Display. Here we go. These are glasses with the classic style that you'd expect from Ray-Ban, but they are the first AI glasses with a high-resolution display and a whole new way to interact with them: the Meta Neural Band. That's this thing. He's had it on since the beginning. I got two wrists for a reason. People were teasing it. This isn't a prototype. Here it is, ready to go. And you're going to be able to buy them in a couple of weeks. All right, so we've demoed this on two separate occasions. There are two key innovations. Obviously it pulls in a ton of the stuff that we saw in the Orion demo that people were talking about after the last Meta Connect. At the last Meta Connect, Orion was presented, but it was a demo. Now this is getting ready to ship. The display appears in one eye. It's slightly off center so it doesn't block your view, and it disappears after a few seconds when it's not in use. So it doesn't distract you, and it's not visible from the outside. Yeah, I mean, 42 pixels per degree, which is sharper than any major headset that's out there. And up to 5,000 nits of brightness, so it is crisp whether you're indoors or outdoors on the sunniest day. This required a custom light engine and waveguide to deliver. There's a lot of awesome technology that we are really proud of. And then there's the neural interface.
Every new computing platform has a new way to interact with it. So for the glasses, we are replacing the keyboard, mouse, touchscreen, buttons and dials with the ability to send signals from your brain, with little muscle movements that the Neural Band picks up. Can't wait for people to try this. It's a really wild experience. It is crazy. Meta acquired a company that was doing something like this and then obviously added folks to the team to build it up. But yeah, it is a completely different interaction paradigm. We have built a neural interface into a durable, lightweight, comfortable and good-looking wristband that has 18 hours of battery life and is water resistant. Changing the volume when you're listening to music by just going like this, that was a crazy experience. I want to get into this in more detail. We've got two options. We've got the slides or we've got the live demo. Slides. Slides. Give us slides. We're slide enjoyers here, but we'll take the live demo. Now, one of the most important and frequent things that we all do on our phones is send messages. So when we were designing these Meta Ray-Bans, we wanted to make it really easy to send and receive messages. And look, Boz is messaging me right now. All right, now, okay, I could go ahead and I could dictate with my voice, I could send a voice clip, but I've got this Neural Band and it's silent. And, you know, a lot of the time you're around other people, so it's good to just be able to type without anyone seeing. And he's doing both at the same time. He's talking while he's typing. That is so aggressive. Yeah, when we tried this, we realized quickly we forgot how to write. But you pick it up quick. It's like riding a bike. There's something about actually having a pencil or pen in your hand that makes it easier to come back to writing. All right. It's definitely a new skill. It's like learning to type on a smartphone
keyboard. There we go. It's just interesting to be thinking about: sitting here, I get a message from somebody and I just need to respond like this. Yeah. It is incredibly natural. I don't know what happened. Yeah, I'm interested. I mean, you can see he's dictating text as well. And I'm wondering, what do you think the breakdown will be between people using the handwriting input versus just whispering to it or talking to it? You remember trying the microphones. It was pretty remarkable how you could just whisper and it would still pick it up, because of the location of the microphones on the device. Let's go for a fourth. All right. Try it again. I keep on messing this up, and if not, then we'll go for the less fun option. Okay. I don't know what to tell you guys. All right. Live demos, man. That takes guts. And we're just going to go to the next thing that I wanted to show and hope that will work. All right. Takes guts. The functionality that he's testing: basically, I can call somebody and give them a first-person view of what I'm seeing. Yeah. So you went out of the room. I called you while I was wearing these. And you saw what I saw and I saw you. Which was kind of funny. I mean, it makes maybe more sense when you're both wearing them. And you could imagine that at some point they just do an avatar. Yeah. And then think about, you're at the grocery store and it's like, hey, which one do you want? Right? You can call. Yeah. From Spotify: here's California Dreamin' by the Mamas and the Papas. Here we go. All right. And if I want to adjust the volume, I act like there's a volume control in front of me and I can just turn it. That's pretty good. It is a really good interaction there. I mean, it's not that hard. Don't these have volume rockers on them? You could slide your finger up. Yeah, you can slide your finger. Even that's a little bit easier.
What do you think the difference between the handwriting input and talking to it will be, in terms of usage? Yeah. Like, if you were one year from now and you had access to Meta's internal data. Obviously there's going to be a bunch of people that buy these, try them, use them. Some of them are addicted to the handwriting. Some of them never use the handwriting. Some of them use it 50/50. What do you think will be more popular in a year? I just think the ability to communicate in text without a device is highly useful in certain circumstances. Yeah, but not necessarily the way that you're going to have long drawn-out conversations. That's how we prove it's live. Yeah. Okay, so now, like I was saying. Oh, yeah, this is really cool. It captions the voice, gives you subtitles for the person that you're talking to, obviously in any language. Yeah. When I watch TV, I pretty much always have the subtitles on. I can hear fine, but I find that it just makes it easier to follow along. But if you have an issue hearing, then I think that this is going to be a game changer. Yeah, I agree. And it's also cool, it can do translation. So if I'm talking to somebody who speaks a different language than me, I'll get a translation in my native language right on the display. Real-life subtitles. I do that a lot with subtitles in movies, but I feel bad about it every time I turn them on, because I'm like, is there something wrong with me? Why can't I just enjoy it the way the filmmaker intended? Maybe the filmmaker's a subtitle enjoyer too, baby. Like, I've made this film to be enjoyed with subtitles. I somehow believe that Tom Cruise would not want that. He doesn't believe in frame interpolation. I was trying to call you. Were you busy? Yeah, you know. All right, all right. You got some sick shoes, man. Okay.
This is important. I'll take some photos. You know what, let's go ahead and take a video, just because we missed that opportunity before. Thank you. Say hi. You want to wave? All right, there you go. Just a couple of lads. Yeah. You want to show the case? So the charging case folds nice and flat. Fits in your pocket, fits in your bag. And then look at that. Pops open. Oh, wow. Oh, interesting. It sits flat when it doesn't have them in there. But then when you put them in, it expands. Simple. And then I can just go ahead and browse through the photos and look at them after. That's not all you're going to be able to do one day. That's going to be valuable. Very cool decision to put the heads-up display offset, so I can have a conversation with you right here, and if I'm getting a message or notification about something, it's not blocking your face. Yeah. I was watching the Google I/O keynote and, I mean, it was a little bit more VFX-y. There obviously wasn't a live demo like this, and it felt like they were centering the HUD much more in the middle of the field of vision. And it does feel like the Call of Duty minimap is maybe the correct paradigm. Yeah. That's how the Meta Ray-Ban Display and the Neural Band come together to enable some pretty amazing new things. The last thing that I want to show is a glimpse of how this is going to work with agentic AI. And, you know, the basic idea here is that we all have dozens of conversations throughout the day, and if you're anything like me, then in every conversation there are normally like five things that you want to follow up on. Maybe there's something you're supposed to do, maybe there's a conversation that this reminded you that you need to have. Maybe someone just said something that you weren't sure about and wanted to confirm or want more context on. But, you know, the thing is, it's tough to follow up while you're in the middle of a conversation.
So if you're anything like me, you probably don't. And then you just forget a lot of these things. So the promise of glasses and AI is that they're going to help with this over time. So you just start a live AI session, and the glasses are going to be able to see what you see, hear what you hear, and they're going to be able to go off and think about it. Can you tell if the indicator light is on for that? I feel like this is going to be the same discussion as the AI Pin. It's always listening to me. There's questions about, you know, can you maybe not have it listening to me right now, or I want to know if it's listening to me. Being really clear on that is pretty important. Hey, Jake, I'm so glad you reached out. Hey, yeah, I was hoping you could help me on this board I'm building for my brother. Oh, of course. Hey, Meta, start live AI. So for the board, my brother needs something with a wide tail so it's easy to catch waves with, but the performance of a narrower tail. What about a swallowtail shape? Oh, that's great. Yeah. But maybe three fins. Is that accurate? Fact-check this. You're the surfing expert. Is that what you would recommend? When would you use a swallowtail? I have no idea what any of this means. Actually, a few weeks ago, the supplier confirmed that the fins will be here in October. That's great news. I usually surf a swallowtail with a quad setup. What does that mean? Or twin fins. Oh. So you have three options. You have a traditional thruster, three fins. What's a thruster? A three-fin setup. Okay, three fins. Does no one surf just the normal one fin? The old single fin. Is that popular anymore? Longboards. It's Lindy. It's Lindy. People still do it? Yeah. Okay. Mostly longboarders. Amen. What's the Lindiest surfboard? So there you have it. This is the next chapter in the exciting story of the future of computing.
And so we've got Meta Ray-Ban Display, our first AI glasses with a high-resolution display, and the Meta Neural Band, the world's first mainstream neural interface. The glasses are going to come in two colors, black and sand. And they all come with transition lenses, so you can wear them indoors and they turn into sunglasses when you go outside. And you are going to be able to buy the set for $799 in stores, where you can get demos as well, on September 30th. All right. I'm pumped that people can actually go and try it, buy it immediately. This is gonna be big for John Exley. He's gonna be able to have the show running perpetually on the heads-up display. I mean, there's a big question about that, right? Obviously Meta's starting with first-party apps. They have deep integrations: WhatsApp, Spotify, Instagram. And oh yeah, I mean, we are streaming live on Instagram. We've got the next generation of Ray-Ban Meta, including our special edition. You've got the Oakley Meta HSTNs that we released in the summer. You've got the Oakley Meta Vanguard for performance. And now you've got the Meta Ray-Ban Display. Those are our fall 2025 glasses. If you go to meta.com right now, the first header nav item is AI glasses. That's how important this is in their framing: family of apps, then AI glasses; VR is to the right. Right now AI glasses is the category they want to dominate. We want to help bring about a future where anyone can just dream up any experience that you can think of and then just create it. So even though obviously the Ray-Ban Display is starting with first-party apps, you can immediately start thinking about other apps that developers could build. I mean, people watching live streams, people watching all sorts of stuff, people using essentially third-party apps. Hot dog, not hot dog, running perpetually.
I mean, truly, the Cluely team should want to integrate with this, right? There should be a ton of companies that want to. It's worth noting that their entire thesis, always-on AI, is undeniably directionally correct. Yes. I think the interesting thing is that if you want this to be a platform, you need to be open enough that you are willing to let other companies win in the subcategory, right? Like the iPhone. The whole point of this is that Meta wants to be not just a platform but a hardware platform. Yeah, right, exactly. And so I think you've got to be somewhat open. You've got to be friendly to developers, you've got to let people integrate and build cool experiences on top of it. We haven't gotten a lot of messaging around that yet, but you have to imagine that it's coming, right? And even in gaming, right? You think about these historical online-offline games like Pokémon Go. Yeah. I do wonder what announcements we'll see in the next two weeks. I feel like gaming is an easy give. It's harder for a tech platform when, like, the iPhone says we have a clock app, a flashlight app, and people built these different things. And then the beer app. Apple never made a beer app. They didn't make a beer app. Missed opportunity. But yeah, I mean, as a platform you have to be willing to give up on your first-party apps, or at least allow them to compete in a somewhat free market on top of your platform. And it'll be interesting to see how aggressive developers get about plugging in and figuring out where the actual APIs are, how friendly the ecosystem is. We've spent the last couple of years building from scratch to replace the Unity runtime, which is great, by the way. Think about what you could build.
Something like Guitar Hero, but for the piano, for example, where the heads-up display is flashing. Yep, flashing. I think that's one of the better-selling apps on basically all the VR headsets: passthrough, so you see the actual keys, but then there's virtual elements laid over the keys. And I've actually done that. I forget what it's called. There was some app where you basically just put your laptop on top of the keyboard and then it overlays the keys as they drop down, and you can play the piano. Pretty decent. We just got this demo. Yes, yes. You get to be in the center of the Octagon. Today we are rolling out early access to Hyperscape. Did you see the brand that was on that UFC Octagon in the demo? I don't think it had logos. I believe it did. I need to roll it back, but it looked like there was a Lucy logo there. I don't know if I'm hallucinating, but I'm pretty sure I just saw that. They are a sponsor, so yeah, totally possible. Eventually you're going to be able to bring these worlds into Horizon and have them all be connected too. Yeah, this is cool. We got this demo earlier. All right, this one, this is our new immersive home, rendered entirely in the Meta Horizon engine. Visually, it is a big step forward from where we have been. There is no 8-bit Eiffel Tower here. Good callback. You can pin different apps to the wall, like this Instagram app. It automatically renders posts from creators and friends in 3D. We're in such a weird time with how you actually experience a virtual world. Earlier this week we were talking about Fei-Fei Li's World Labs. She's doing Gaussian splatting.
And so you take a bunch of photos, run it through a training algorithm that cooks for a while, and then you can move around in the browser, and it looks extremely photoreal until you get outside of the house, too far out, and then it kind of breaks down in this really interesting, bizarre way. And so they brought that to the Oculus world, the Quest world. But then you can also generate real worlds using a traditional 3D pipeline. And I feel like these two technologies are on a collision course, because they don't play well together right now, but they're starting to. And we're starting to see demos where you can go take a bunch of photos, it builds the Gaussian splat, and then from there it generates 3D geometry that can be interacted with. Because in those Gaussian splats, you can move around like a camera that's just flying around, but if you pick up a ball and throw it against the wall, it won't bounce. And obviously that's a prerequisite for basically everything. Michael just confirmed: Lucy logo in the UFC ring. Yeah. That's crazy. That's crazy. This happened years ago with my first company. There was a Super Bowl ad, it was for Fast and the Furious, and they needed an ad to go on a billboard in Times Square where the cars are racing through, and they couldn't use an actual ad. And so my ad guy knew someone in Hollywood and was like, you can use our brand for free. We'll send you an image of a billboard that you can Photoshop or VFX into the shot that will go in the Super Bowl ad. We're like, wow, we got our logo in the Super Bowl ad. Also, it's really neat to see how many people are using Quest to watch video content. It's just a lot more immersive. So we think that this category, watching video content, is going to be a huge category, both in virtual reality headsets and on glasses, too.
So we're launching a new entertainment hub that we are calling Horizon TV. And I don't think we actually got this demo. A bunch of great partners to include a bunch of movies and TV and live sports and music. Talking about this. Excited to announce that Disney+ is coming to Horizon TV and bringing along content from Hulu. It's such a basic functionality, but when the Apple Vision Pro launched, I remember seeing you open up the apps, and what was the top-left app? If you read it like a book, left to right, what was the app that they wanted you to open? It wasn't any of the crazy VR video games or 3D worlds. It was Apple TV. Apparently one of the folks Apple had hired to work on the Vision Pro came from Dolby Cinema, and they were like, the one thing that we know we can deliver is a movie-watching experience. And I feel like, I don't know. Palmer Luckey has that quote about how the war fighter will be wearing a VR headset before the average consumer does, because you can spend so much money and you can mandate that they wear it. And there's all these different reasons. I might be wrong on this. We'll have to talk to folks and debate it. But I still feel like there's a world where the VR headset replaces the TV before it replaces the MacBook Pro or the laptop. And I know you never watch movies at all and have probably never watched a full film in VR, but I feel like the screen pixel density for the Quest is on a trajectory where it's going to be cinema-quality pretty quickly. But you can't just show up and be like, yeah, of course you can log into this app through the browser. It needs to just be there natively. One important thing: they're not trying to develop some massive content library. They're not trying to build a film studio dedicated to it. Maybe they should, I don't know.
I mean, I honestly think that there's a world where they should buy Terminator 2 and give it pre-installed in the Quest. When you get one, you should just get a free copy of Titanic or Avatar. Because one of the first movies that I watched in 3D in VR was Avatar. Because I was like, that's a movie that needs to be experienced on a huge screen in a theater, and VR can actually afford you that. It doesn't quite hit the same when you just watch it on a TV or your phone. And so actually having a partnership that allows you to deliver that, at least in just a few clicks with just a few logins, that's better. But I'd like to see a movie pre-installed. Your 3D filmmaking goes back a long way, two decades. Talk to me about where that comes from, why you believe so strongly in this. I've spent my filmmaking career trying to really engage people, draw them in, get them involved in the story and the characters. I was first exposed to 3D filmmaking in 1998, I think, and it was massive film cameras for a ride show. I thought, we've got to be able to do this better. Digital came along, and I was a super early adopter. I think there was George Lucas and then me. And that was in '99, 2000. And I said, why can't we just slap two of these things side by side and make 3D, you know. Well, it turned out to be a lot more complicated than that. And so 25 years later, I'm pleased to say I've got a great 3D team and we've made it all. We not only made my films, we've made the 3D cameras available to a lot of other filmmakers doing concert films and sports for TV, which didn't last long, but. And, you know, lots of big movies, Ridley Scott, that sort of thing. I just love 3D personally. I love authoring in it. I love seeing the end result when it's done properly. And I think it's how we perceive the world.
Why would we throw away 50% of our data, you know, and see everything through a single eye? It makes no sense to me. And I just see a future which I think can be enabled by the new devices that you have, the Quest series and then some of the new stuff. Hopefully down the line we get to talk to him and take him through what's happening in the space. You realize SaaS companies are going to release a cinematic before they release the real product. I mean, if the launch video meta continues, it's going to be like, yeah, we're excited to launch our product. Go to the nearest IMAX theater to see it. I mean, there's companies that are. Buy a ticket, hit the box office. Hit the box office, yeah. I don't know if people know this. James Cameron's a complete purist when it comes to 3D, which means he actually films it with two different cameras. Because once there was a 3D boom, theaters realized that you could just charge more money by saying, hey, there's a 3D version. But then they realized that they could create a 3D film from a 2D production. And so they'd film the whole movie normally. Stolen Valor. Stolen Valor. Yeah. And then they'd have a whole team of rotoscope artists who would basically cut out from the image. Okay, Jordy, you're in front of that background. I'm going to cut you out and put you on a different layer in post, basically, and kind of fill in the background. Blurry says we're doing it live. Yeah, he does do it live. And with AI and stuff, that's got to be easier to do. But I would be surprised if James Cameron is putting down the super heavy 3D IMAX camera anytime soon. He was also famous for operating the camera himself. And there's all these pictures. I don't know if he does it all the time. I think he says that he's not comfortable doing the full Steadicam. Remember we saw that on the New York Stock Exchange floor. But.
But if it's just a shoulder-mounted shot, he will actually be like, I want to do it myself. Founder mode. Founder mode. And that's OG. What is the Founders podcast anecdote about James Cameron? He taught himself visual effects while he was a truck driver. Right. This was scary. This was scary. Oh yeah. You were asking about. He was reading and driving. Yeah, yeah. Basically the story is that whatever books he was reading. Yeah. It kind of made it sound like James was driving trucks while reading books. It did make it sound like that. Who knows, he might have been pulled over, might have been taking a rest. Yeah. The story was that he was driving trucks, and then in his free time he would study visual effects and study cinematography and get up to speed on filmmaking. But very funny. Hey, maybe the next James Cameron is using the Meta Ray-Ban Displays. They've got their books right here while they're truck driving. They're doing great. That's the future. Let's listen to James Cameron a little bit more. We've been able to prove that there's more emotional engagement, there's more sense of presence. You know, if you're going to watch a Blumhouse horror film, your fight-or-flight reflex is more engaged. Right. Hopefully if you're watching one of mine. My first VR experience. I remember which one it was. Back when it was Oculus. It was post-acquisition, but it was the first consumer version. Maybe it was actually Developer Kit 2, DK2. It was this huge block on your face and you had to hook it up to a PC. It would not just run by itself. And I connected it to Half-Life 2, and Half-Life 2 is an action shooter. Not too scary. But there's this one level, Ravenholm, where it gets really dark and the zombies start coming out and jump-scaring you. And I remember turning around, seeing a zombie run at me, and actually jumping out of my seat. And I'd played this game before, and the reaction to a 2D shooter horror film is just not.
It just doesn't scare you that much. It just doesn't hit like that. But in VR it was something pretty crazy. I think our task, the reason that we've partnered, and it's under, you know. Can we get a wrist check? Sarah Milliken and, who, James Cameron? What's he got? Oh, wait, is that on his right hand? A lefty. Is to get showrunners. Because, by the way, I think episodic television, short form, long form, I think that's the low-hanging fruit that people have historically ignored, because so much 3D content was just made for movies. I'm not talking about Avatar. I can't make movies fast enough to feed this pipeline. We do it at Lightstorm Vision, my 3D company. We build cameras and systems and networking and tools to give to other film. It looks great. He looks like he's got a few more. At least a few more Avatars. A few more Founders podcast episodes. Small fee. Job's not finished. Other filmmakers and showrunners and broadcasters. John, the BMX bikers are lining up. They're going, oh, that guy's got like an Evel Knievel helmet. So there's two back there. There are two half-pipes that go like this. Do you think it's possible to transfer from one to the other? The transfer. The angle makes that seem possible. Yeah, right. It wouldn't be possible, because the ramp is actually coming back. Yeah. But there is a section that they have to clear that we're both looking at. Looks like a death trap. Yeah. Who wants to invest? Pretty crazy. And it's not only just bringing down the hardware, but it's making the hardware smarter. There's a lot of software solutions. And if anybody is tuning into this live from Meta Connect, just know that you can walk up here and say hello. It takes care of, you know, the decision-making around what makes good stereo, what makes it easy on our eyes, easy on our brains, where we're not getting eye strain and all those things.
So it's taken us 25 years to figure out the kind of algorithm for that. It is worth noting that this time last year, we were both having the conversation that technology brothers have had in the past, which is: we should start a podcast. Yes. This was the "we should start a podcast" month. Yeah. Here we are. Going from, like, you know, autofocus, you have the ability to make the interocular distance automatic, auto-stereo. So, yeah, this is one of the things that really, I think, has made this partnership so great. And you get a sense, I think, of it from the two of us; we're effusive about the partnership. You are somebody who's had. It is crazy there hasn't been more of a 3D movie push. Do you start with a product, or with the story you want to tell and how you want people to experience that story? Probably because of the display resolution, the audience skews a little bit more for gaming. Does that matter? I mean, if you have the catalog, why not distribute it widely? Right? Like, Avatar's been shown on free TV with commercials. It's also been shown in theaters at super expensive prices, in 3D. Right? Yeah. But I think it goes back to the VR companies really focusing on the long-term promise of what's possible with VR: immersive worlds, huge video games. But yeah, I mean, you have to get the install base up to actually get that. I don't think you need to get the install base up to make 3D, the catalog of great 3D cinema, a fantastic experience on a headset. So I would be. It's starting to feel like it's picking up momentum, not only in the hardware but also on the content side. You are willing a future into existence that you saw clearly. And this field, this moment in history, feels a lot to me like it did back in the late '80s and early '90s when CG was first manifesting itself. And, oh, you're going to replace actors, and it'll never look real, and, you know, analog is the answer.
And that's why I founded a company called Digital Domain. It was revolutionary in its moment, and it's ubiquitous today. So I've actually seen historically, in my own life experience, how you can actually make massive change. And, you know, then that led to 3D. Okay. Everybody accepts the fact that we go to digital movie theaters now, right? Obvious. Right? Except that when the digital technology existed, it wasn't adopted right away. It took 3D to get the theaters to convert to digital projection. It took you. Well, we were in the middle of that when it released. Unless they updated the theaters. Yes. And it was actually talking to the team at Texas Instruments that developed the chip that made digital projection possible, and saying, embed in your servers and in your electronics the ability to carry two image streams. And because they did that, digital projection just rolled out, and now it's everywhere other than the occasional art house someplace with a 35-millimeter print. But when you've lived through enough of these revolutions, you start to see them coming as a wave, like a good surfer. I know you surf. That's right. I watch it from the beach; you watch it from underwater. I watch it from underwater. Listen, we've got one more exciting piece coming. I want to thank you again for coming to Connect. It's really our honor to have you. I can't wait to check out Avatar: Fire and Ash, as I'm sure everyone here will agree, when it hits theaters on December 19th. Thank you. I love Avatar. We have our first guest on the way over. As a special surprise, we have an exclusive, never-before-seen, stunning 3D clip from Avatar: Fire and Ash for everyone to check out in demo stations here for attendees, and available on all Meta Quest devices in Horizon TV for a limited viewing window. So thank you all. Thank you, James, and trust the process. This is all going to be very exciting. Here we go. Take these out.
So we have our first hands-on live with the Meta Ray-Ban Display. You can't even tell. It is remarkable. Thank you. It's so close you couldn't tell at all. Right. You walked in together; you assumed that they were smart glasses. You assumed they were smart glasses, but I didn't know they were the display model. Yeah, it's pretty remarkable. They've really shrunk it down so much. And I mean, we tried Orion, and Orion is blocky. It doesn't look like a full consumer product. And obviously when they announced it, they were messaging, hey, we're going to shrink this down. And the Neural Band, they really did. Really light. Yeah. I mean, people are already wearing bands like this all the time. I see more and more people wearing two devices on their wrists. People are very comfortable with this. All right, we've got an after-party over at Meta's campus. Diplo is going to play. There you go. Please join me in welcoming Diplo. Fantastic. Well, we are moving over to our first guest of the stream, Chris Cox, the chief product officer at Meta. People are also starting to learn that you're a big runner and you've got the whole Diplo Run Club. Exactly. So what do you think? Should we run over? Ready for a run right now? Take these things for a spin? Absolutely. All right, let's do it. Meta, play "Be Right There" from Spotify. Go for a run. And I believe they're going to run right past us, so we will wave to them when they run over here. Going for a light jog before hopping on the show. Love to see it. A warm-up. Great. And we are ready for our first guest of the show. Welcome to the stream, Chris Cox. Let's do it. Thanks so much for hopping on. How you doing? Welcome. This is Jordy. Here, grab a headset. Here you go. Shades or not? Yeah, please throw them on. It's a little hard to wear over the head under the headset, but you can make it work. Nice. Yeah. Which ones are you grabbing? Well, I brought my own. I got these Navy.
What are you daily driving? I like the Navy. Great. And they're transitions. Here, pull up the mic a little bit so we can hear you. There you go. Great. Can you hear me? Loud and clear. Sweet. Yeah. So what does your organization look like right now? I mean, you've been at Meta for 20 years, right? Almost 20 years. Almost 20. Congratulations. I mean, it's a massive company. How do you fit in today? So I'm the CPO, Chief Product Officer. I lead the family of apps. That's Facebook, Instagram, WhatsApp, Messenger, Threads, Edits. Working very closely with Alex and Nat, building out all the AI stuff that we're doing. I also lead our privacy team, the team that thinks about protecting user data. Yep. How has your frame of mind changed in the age of AI? Around the trade-offs, the decisions around how you build the products. It's a new era for product guys. Yeah, it is. I mean, it's changing these days like one week at a time. That's how much is changing: how people engineer. Prototypes that used to take weeks can now be done in hours. And part of what we're trying to do for the company is just encourage everybody, even if they know what they're doing, to take risks on trying to do things differently and to learn as quickly as they can. All the way down to the way infrastructure is built, the way bugs are detected, the way optimizations are made to ranking. For example, we've been ranking News Feed since 2006. We're now starting to deploy agents to think about how to do that themselves, and we're already seeing pretty interesting wins in terms of just making the experience better for people. So I would say it's changing very rapidly, and it requires a huge amount of constant attention to make sure that we're staying on the edge. What about at the product level for consumers, and how you think about product quality? Historically it was easier to be like, does a button work or not? Yep.
And now we're in an era where AI is probabilistic. You don't have the same ability to have consistency. How has that shifted your thinking? A lot of it. I mean, AI can be used to detect edge cases a lot more easily, which is really important. AI can be used to scale a judgment to lots more types of people and lots more languages. For example, one of my favorite features on the glasses is live translation. And one of my favorite features we've started to roll out on Instagram is captioning and lip syncing, so that you can take any video creator's language and translate it into the native language of the viewer, along with lip syncing. This to me is very, very fundamental if you think about what it unlocks. It's kind of like Tower of Babel-level phenomenal to take any voice and translate it into the voice of the listener. So it scales the kind of thing that's just pure human connection, but it does it in a way that's instantaneous, and it could let somebody who speaks a relatively small language family experience the rest of the Internet, or experience the same speaker as anybody out there. Yeah, it'll be interesting to think about new Internet superstars starting out default global just because they're able to be instantly translated across the entire world. Exactly, yeah. I mean, you said something about, you said you could scale a. What was that? A judgment or something. You had some word for it. But I'm interested to know how you think about the trade-offs between rethinking products entirely from the ground up in AI-native ways versus, like, there are so many amazing unlocks with captioning and translation that we take for granted, but you gotta go chop the wood and actually get them out into the products. How are you thinking about balancing those? Are there two different teams, where one is thinking about a greenfield project that could be an entirely new V2?
Or do you see yourself as iterating towards whatever that next version of the product looks like? We do a lot of both. Yeah. We basically ask every team to have a portfolio, to make sure they have something that they're going to deliver in the next year, and then something that's going to come three years from now that's much riskier. That's in the prototype phase, where you're playing around with ideas. You're literally prototyping something that doesn't quite work. And if it does work, you're not thinking far enough in advance. We do this for every single part of the business. So WhatsApp does that, Instagram does that, our ads team does that. And that way you're sort of constantly having a product pipeline of things that require a lot more risk-taking. What you're starting to see now is that the farther-out stuff you can code up a lot more quickly, you can play around with a lot more quickly, and then the nearer-term stuff you're able to scale. What I was saying before is you can take something that works for one set of users and just scale it out a lot more quickly. Yeah. How are you thinking about talent internally? We've seen a couple highly entrepreneurial folks join to build MSL. What does that look like on the product side? Meta has a really rich history of acquiring and bringing some of the greatest founders into the organization, turning them loose, growing them into huge products. Is that something you want to continue on the product side? Is it something that's more important in the age of AI? How are you thinking about that? Yeah, going back to the very earliest days of the company, one of my earlier jobs was building up a product management team. And the way we did it, and this is before product management was really a thing, it wasn't a major discipline in software, so I couldn't go out there and find a lot of experienced product managers.
Aside from Google, which was the only company still standing from the dot-com era. Yeah, yeah, yeah. It's crazy to think that only a little while ago there weren't young people who were like, I want to be a product manager. It just didn't exist. So this is 2007-ish, 2008. And so the way we did it was, let's go find the best small startups and see if they'd want to work with us. So this is Bret Taylor, who was leading FriendFeed. This was Gokul Rajaram, one of the founders of Google AdSense, who was leading a team called Chai Labs. Yep. This is Blake Ross, who built Firefox. For me it was like, these are legendary people. We were all like 25; they were too. But part of the reason that was so interesting, looking back, is we had a lot of founders at the company, and we loved that. The energy that a founder brings, the entrepreneurial energy, is really powerful, especially at a company that is in white space. Like, social media was brand new, smartphones were kind of brand new. So you want as many people as you can, frankly, that can operate, that are comfortable being at a company and dealing with, okay, I need to actually check a bunch of boxes to deliver something to billions of people, I can't just do that in a weekend. But you want the aspirational energy of a founder. So we do acquisitions. We also frequently see people leave and go found a company and then often come back, sort of understanding all the goodness of big companies and all the goodness of the outside world, and trying to get the balance right between the two. What are the buckets? How are you thinking about the value? You know, we talk about superintelligence, we talk about AI. It's extremely broad. It means different things for different people. How do you think about the categories that AI can deliver value in? I can think of pure utility, like summarizing a message in WhatsApp.
I can think of entertainment, I can think of connection. But what's your framework in terms of making AI, you know, integrating it through Meta's platforms and making it valuable for end users, versus this abstract kind of concept? Yeah. So, I mean, just thinking about the displays and the wearables we launched today, a lot of this is going to be about something that is with you all day long. When we talk about personal superintelligence, it's basically this idea that your computer should understand what you care about. It should understand what you're thinking about today. It should understand your values. Yeah. What you're trying to get done, what you're interested in, the people you care about. That's what a superintelligent assistant that you could design for yourself would know. And then when you open Instagram or you open Facebook, everything you see there should be responsive to your interests and values. And if you think about things from that perspective, we're a pretty long way away. I'm still seeing things that may not be interesting to me today, or were interesting to me weeks ago. It's not up to date to the second with the way that people are. Like, if you have a really close friend who knows what you're reading today, you'll talk about today, you'll talk about the news today. And so for me, it's just taking the idea of what our apps do today, they connect you with people, they connect you with your interests, they help you create content, and just bringing the barrier of all of those things down. So that, to me, is how you extend the product forwards. And then with these glasses, once you start wearing them, you do start to have a sense that this could replace a lot of the pulling your phone out of your pocket. Totally. And the handwriting. Yeah. And that type of thing. They're running past right now. Here we go. Look at that. There we go. They did it.
Credit to Mark for going on a run before jumping on. Shout-out to the keynote. Thank you so much. Have a good rest of your day. Cheers. Really quickly, let me tell you about fal. Fal.ai is the website: the world's best generative image, video and audio models, all in one place. Develop and fine-tune models with serverless GPUs and on-demand clusters. Our next guest is Adam Mosseri, the head of Instagram. We will bring him down. It's an app you probably use. Yes. You're probably watching TVPN live on Instagram right now. We are streaming live on Instagram for the first time ever. It's a regular thing now. Yes. Very excited for our vertical layout, but we will bring on Adam. Adam's looking incredibly sharp. We're swapping things out, and the run has concluded. They're taking photos, and we are bringing on new products. Here's that case that you saw. I can't believe it's so cool that it folds up flat. Yeah. Here we go. And we're ready. Let's bring him on. Adam, how you doing? They're gonna have you put on this headset. There you go. No, we can't hear you. Barely locked in. It's noisy here. I like the idea of talking to you before you can hear me. Hi, I'm John. Nice to meet you guys. Welcome to the show. What's happening? There's people running back. Yes. The Run Club, I believe, is complete. It concluded. Mark Zuckerberg's right over there. He barely broke a sweat. Yeah. And Diplo. Yeah, Diplo's here. Yeah. I was like, okay. Yeah, that's cool. Yes. Another day at the office. Take us through how the glasses play with Instagram over the long term. We saw a demo where Mark Zuckerberg was, it seemed like he was live-streaming. Is that something that's going to come to the glasses, you think? Yeah, I think so. In general, I mean, Instagram is about trying to inspire creativity and having people connect over that creativity.
We started as fun square photos with big filters and ridiculous borders that were kind of fun, as a way to help people create things that they wanted to share. And I think in a world where you can take pictures with things like these, or live-stream what's going on when something special is happening in your life, we love the idea of bringing that to Instagram. The platform feels remarkably stable, super feature-complete. There aren't a lot of feature requests that people are angry about, like, oh, why don't you have this feature? It's got a lot. The critical feedback comes in other forms. Oh, yeah, you still get that, I'm sure. Do you ever check my comments? I put on a hazmat suit before I go in there. Yeah. I mean, I go into the requests once a week. I feel like it's important. I usually leave feeling pretty poor about myself. Try to refresh, get a good night's sleep, shake it off. My question is, there's two things going on in AI. There's a ton of stuff going on in generative AI, but what about in core AI? Do you feel like there's still room to get gains out of core AI models, just on better recommendation feeds with bigger models, bigger training runs? What is the gap to just getting people better recommendations? Absolutely. There's a number of different ways, if you want to go into the tech side of things, that these frontier models can change how we recommend content, how we understand people's interests. One big pillar of that is content understanding. These omni models, these models that can work across text, video and photos, can understand things in much more nuanced and complicated ways. Before, if we wanted to build a classifier that understood what something was about, we'd build a different classifier for every topic. We can only come up with so many topics.
Now we can look at what used to be pieces of technology that we couldn't actually read directly as people, and we can use LLMs to make sense of them. And we can say, oh, these two videos are in the same place on a map, or, now we know that this is vintage '90s Arsenal highlights. We never could have done that before. So that empowers things on finding content to help people connect. It's going to empower things on giving people more control over the recommendations and their experience on Instagram and these other apps. There's all sorts of really compelling opportunities that I think are going to come to fruition over the next couple years. On the gen AI side, do you have a view on how much AI content we're going to be seeing? People like to complain about AI slop, but I've seen some incredible AI-generated videos. I'm sure we've all seen Harry Potter Balenciaga. It clearly still had a human element in it. It wasn't just, make me something that gets likes. There was a human touch that was enabled by AI technology. Right. Well, I think you're going to see, like with all other technology, that there's going to be good and there's going to be bad. And the most interesting content that I've seen that has been generated with AI, or that AI has been part of creating, has had a point of view that has come from a person. Sure. I do think you're going to see more purely generated AI content grow over time. And some of that is going to have real risks, things like deepfakes trying to misrepresent what's happening. Some of it's going to be really inspiring and trying to help you. You know, you can imagine things like creating tutorials to learn how to do things that you couldn't do before. A creator might not have been able to do that; now they can use tools that do just that. I also think you're going to see a lot of content that is sort of hybrid.
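The shift Mosseri describes, from one hand-built classifier per topic to models that place every item in a shared space, can be sketched with a toy example. This is illustrative only, not Meta's actual stack; the three-dimensional "embeddings" below are hand-made stand-ins for what a multimodal model would produce.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for three videos; in practice these would be
# high-dimensional vectors from a model that reads the video directly.
videos = {
    "arsenal_90s_highlights": [0.9, 0.1, 0.0],
    "premier_league_retro":   [0.8, 0.2, 0.1],
    "cooking_pasta":          [0.0, 0.1, 0.9],
}

def most_similar(query, items):
    """Return the other item whose embedding is closest to the query's."""
    return max(
        (name for name in items if name != query),
        key=lambda name: cosine(items[query], items[name]),
    )

print(most_similar("arsenal_90s_highlights", videos))  # prints premier_league_retro
```

The design point: with per-topic classifiers, "vintage Arsenal highlights" needs its own trained model before it can be recommended; with a shared embedding space, any new topic is just a region of the space, and relating two videos is one similarity computation.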
We don't talk about this a lot because we're more focused on the extremes, but AI can help people just clean up photos, clean up videos, make every clip in a reel the same lighting. There's a lot of basic stuff that is actually, I think, a super important opportunity for creators. My view is, if you've built an audience that cares about you and cares about your content today, you're gonna do really well over the next 10 years. You're gonna be able to make more content, you're gonna be able to make better content. And then the exciting thing is the entirely new categories of people that never thought to make something because it was really hard, or they never thought to learn. Right. I mean, the beauty of Instagram early on, and even Facebook, was that you could just type out a message and hit return. Yeah, post it. Instagram, you can take a picture, post it. Maybe you add a filter, maybe you don't. And it's just about reducing that friction. The question I have is, how do you think about the push and pull of keeping Instagram, Instagram? Right. People have an idea of what an app is, and then there's constantly pressure to add new things and do new things. How do you think about that push and pull internally? So I think about our reason to be as inspiring creativity and helping people connect over that creativity. I see an amazing piece of stand-up that I know will really hit hard with my brother, and I send it to him and then we talk about it. You're seeing shares are higher than likes on a lot of reels. Yeah. So it's about sharing reels, it's about responding to stories, it's about connecting over your interests. Now, how people do that on Instagram is going to have to change, as how people communicate with their friends and how people entertain themselves inevitably changes.
Often people think of Instagram as a feed of square photos. But if we didn't evolve, if we didn't add video, if we didn't add stories, if we didn't add DMs, if we didn't add reels, we wouldn't be here today. You wouldn't be asking me any questions. And so we have to figure out how we evolve forward but stay true to our core identity, to our reason to exist in the first place. That's a balance. Sometimes we get it right, sometimes we get it wrong. We've pushed too hard sometimes. I've been on the receiving end of a fair amount of that feedback, and I appreciate it. But like I said, if we didn't evolve, we would just slowly become irrelevant. Well, thank you so much for taking the time to talk to us. Yeah. Good to meet you guys. Get you guys a little bit less on X and a little bit more. Oh, we're coming over. We're live streaming on IG right now. We are growing. I know, but we're going to be in your comment section. Yes, we're working on it. I want the heat, I want the real feedback. I want to know what we're doing well and what we're not. For sure. For sure. Yeah. We'll talk soon. Thanks so much. Thanks for coming on. Don't forget to take the headset off before you walk away. I just, you know, take this. Yeah, yeah. Pull the whole. I think it's a good look. Yeah. All right. Fantastic. An Aquanaut, too. I got. Yeah. Beautiful. Excellent taste. Good eye. Let me tell you about Graphite.dev, code review for the age of AI. Graphite helps teams on GitHub ship higher quality software faster. And our next guest is Connor Hayes. Not H-A-Y-S. He's got an E: H-A-Y-E-S. He is the head of Threads. Welcome to the show. How you doing? Throw this headset on. Great to meet you. Throw this on so we can hear you. This is a crazy event. Thank you for having us. The BMX riders are in the air. Everyone's going, you guys are in headsets. We're locked in. Wild environment here.
So, yeah, I mean, everyone knows Threads. Take us through, what is the scale of the platform? It feels like it's massive. Where is the biggest success of Threads? Because, you know, we've heard about other platforms. Instagram famously started with runners, the O.C. and Run Club. Where has Threads really found its footing? Yeah. We recently announced that we crossed the 400 million monthly mark. That's great. And we've done really well, actually, globally. So one of the main ways that people find out about Threads is through promotions that we do in Instagram and Facebook, where we take the content that's most popular on Threads and show it to people there. So basically anywhere those platforms are big, we've been able to attract people to Threads. Speeding back and forth. Yeah, sorry, Adam. We're gonna end up on Instagram now. I appreciate it. This is Threads time. Japan, Korea, the U.S., India, Brazil. It's pretty much a global platform at this point. Alex Heath, who just. Alex Heath is here. We're going to hang out with him. Yeah, we ran into him earlier today. Congrats to Alex, he went independent today. Yeah, he did yesterday. But, yeah. Yeah, it's great. Talk about the different inflection points, because obviously there's the big launch, right? But like building any new product, it's a roller coaster. Yeah. And it feels like you guys are really figuring things out. I certainly am a DAU now, because even if it's not necessarily muscle memory to open Threads, I see it on Instagram and I'm getting pushed over. Yeah. I mean, one of my philosophies as a product person is anytime you can launch something with a bootstrap, do it. I think bootstrapping off of Instagram was 100% the right thing to do in the beginning.
We had this really big pop. I think it kind of established the platform, got a lot of people in there. But what you quickly find out, if you were using Threads at that time versus now, is that not all the people who are best at Instagram are going to be best at Threads; the format is so different. So we had to spend a lot of time getting the Threads-native people onto the platform and then also helping users build a Threads-specific graph. That has been the last year, and it feels like we're starting to break through and have some power users that really love the product now. It's so fascinating, because I feel like Meta has done a few of these greenfield projects before. Facebook Camera. There have been other apps that eventually got rolled into Instagram. Was this always the plan? Is this surprising internally? Am I just out of the loop here? It is a unique story, right? I was on the team that helped build Threads in the beginning. Took a little detour. Yeah. But we actually debated really, really heavily in the beginning: should it just go on Instagram? Right. And that was actually my original pitch, I'll admit, as the ultimate bootstrap. Yeah. No, it makes a ton of sense. You added stories, you added reels. There have been so many times when Instagram was fertile ground for that. Why is it fundamental that you need a separate app? To his credit, Mark was the person that pushed us the hardest on this. His point of view was that the use case is, and over time ends up being, distinct enough that you kind of want it to feel like a separate space. Sure. If Threads is going to be the place with fresh perspectives on what's happening now in the world, that's a little bit different than what's the most entertaining and visually appealing. Yeah. A reading network versus a viewing network. When something happens in the world, you want to go to the place where that
news is breaking and being discussed, right? The discussion is key. Yes. And I do think it's just a wildly different thing. You open Instagram and you might be seeing what Instagram thinks you're going to be most interested in in that moment, not the story that just broke. And so I think it's a distinctly different ecosystem. We actually got criticized for this. Casey was making fun of us a bunch in the beginning, because the Threads feed would be showing him things that were a week old or six days old. A lot of that was because we built on top of Instagram tech. Oh, interesting. Instagram doesn't push for timeliness with the content, because if there's something awesome that you missed six days ago and it's not about some breaking news thing, you're fine seeing a funny video that was posted six days ago. So actually a lot of our Threads-specific relevance investment over the last couple years has been training for timeliness. Yeah. Trying to get it to feel really fresh. And you actually create a constraint for yourself, because the pool of content that you can pull from is then inherently smaller. How big is the Threads team? I don't know if I'm able to share that. Well, Adam's team, relative to the rest of the company, seems small but lean and scrappy. Adam was pitching us on live streaming on Instagram. We're live streaming on Instagram right now. Is live streaming going to come to Threads? Talk to me about where the product goes, because pretty soon you could build all the Instagram features, you know, if you're not careful. Sometimes you go into an interview and you know that there is a question coming. This was the one. Okay. I mean, listen, we just talked about it. You want Threads to be the place where it feels live, what people are saying about what's happening right now. We just have such a long list of basic stuff that has to get done.
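Meta hasn't published how the Threads feed is ranked; as a purely illustrative sketch of what "training for timeliness" could mean, a toy scoring function (the function names and the 24-hour half-life are assumptions, not anything Meta has described) might blend predicted relevance with a recency decay:

```python
import math

def score(relevance: float, age_hours: float, half_life_hours: float = 24.0) -> float:
    """Blend predicted relevance with an exponential recency decay.

    A post's score halves every `half_life_hours`, so week-old content
    needs much higher relevance to outrank something posted today.
    """
    decay = 0.5 ** (age_hours / half_life_hours)
    return relevance * decay

# A fresh, decent post outranks a great post from six days ago.
fresh = score(relevance=0.6, age_hours=2)
stale = score(relevance=0.9, age_hours=6 * 24)
```

The half-life is the knob: shorten it and the feed feels "live" like a news network, lengthen it and evergreen content resurfaces the way it does on Instagram.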
This catchphrase that I've been giving to the team is: be the app that ships. We could sit here for two minutes and come up with a dozen features that are missing. Making replies better, making notifications work really well, getting the profile to feel really good, getting search better, trending. So those are the things that we're going to be focused on in the near term. But I do think there will come a day. Well, if you launch it, call us first. We'll be the first ones on. I promise you and commit to you. Thank you. Well, thank you for coming on the show. We appreciate you. Great to meet you. Thank you guys so much. We'll talk to you soon. Yeah, have a good one. We'll see you on Threads. If you want AI to handle your customer support threads, head over to Fin AI, the number one AI agent for customer service. Our next guest is Roberto Nickson, fantastic Instagram creator, friend of mine. We've been DMing for years now talking about tech. He does these incredibly polished Instagram reels. Firing up the Meta Ray-Bans. There we go. Welcome to the stream. How you doing, man? Good to see you. Good to see you. Throw the headset on so we can hear you. There you go. Oh, beautiful. Made it. I'm on TVPN. Yeah. Long overdue. Great to have you. Take us through your reaction to the event today. Look, the one thing I'll say is, you guys know, in tech. Can you curse on the show? We don't, but you can. There's a holy crap moment every few years in tech, right? Like the iPhone 4 Retina screen. Yeah. The EMG band, the electromyography band, the Meta Neural Band that comes with the new Meta Ray-Ban Display, feels like magic. It's so natural. So we've demoed it a couple times. I was shocked when I picked it up. It just feels like using a phone with no phone. Yeah.
It's like a crazy thing. Yeah. And so I tried it last year with Orion, but Orion has eye tracking. This one doesn't. So it's a little bit more precise, and the new pinch, and yeah, the volume gesture, that's when I was like, that's great. That was my favorite part of the keynote. But I'm also a sucker for those new Oakleys. Yeah, I know. Well, talk to me about the Meta Ray-Bans as a creator tool. We saw after the iPhone keynote, MrBeast said, I'm going all in, all my cameras are going to be iPhone 17 Pros or something. Do you think this will be a daily tool that you use in creating content? Obviously, you're still going to use cinematic footage for a lot of the stuff you do, but how does this fit into actually creating content? I would say, for everybody it's different. Yeah. Here's my thing. A lot of times when it comes to Instagram, some people get frustrated by this, I personally think it's kind of cool, but sometimes the process gets more love than the art, than the final result. Yeah. So me, when I rock these, it's always BTS, it's always POV. Sure. So I put out a piece of art, let's call it a video, and then I'm showing the process behind it, the POV from the glasses. That combination is killer: two pieces of content for one idea. So that's how I personally use them. I think live streaming is another great use case, but every creator is using it for different things. Well, thank you so much for coming on the show. We've got to have you back and hang out more. We can talk so much more about the creator economy and whatnot. All right, have a good one. Have fun. We have our next guest, Alex Dimple, coming onto the stream. Let me tell you about Profound. Try Profound.com: get your brand mentioned in LLM searches. Reach millions of consumers using AI to discover new products and brands. Alex, good to see you. Congratulations on the new gig, my man. How you doing? How are you? Good to see you guys.
Do I do anything? No, no, you're good. We can hear you. Talk to us about the first day on the job. How are you settling in? How's it going? Honestly, it's been incredible. It's a lot of fun. I think, you know, building an AI lab in 60 days flat is kind of an incredible activity. Give me your pitch, if you were trying to hire me, if I'm some hotshot AI scientist. I think Meta has everything necessary to achieve superintelligence. There are no obstacles. We have the business model to support building literally hundreds of billions of dollars of compute to be able to actually produce the technology. We have an incredibly talent-dense team. Our team is smaller and more talented than any of the other labs. The other labs are like 10 times bigger, and our team is about 100 cracked AI scientists. Yes. That's how we're going to get there, and we're going to be incredibly bold. And we have the scale of products and business to be able to deploy superintelligence to every person on the planet. Yeah. What are you looking for in that hundred people? Are you doing two-pizza teams? Who's fitting in really well right now? I think that AI researchers are incredibly kind and lovely people, and so we've been able to just build a team of great people. Everybody's trying to build superintelligence. Everybody is excited to be able to build potentially the most important technology of all time. And my job is ensuring that we have the conditions to be able to do that. Yeah. Talk to me about the pillars, how you're thinking about research, safety, product, how all that comes together, and the position of MSL as opposed to other labs, where you have pretty much every human in the world that you can actually distribute these products to. Yeah. So we split the team into three pieces: infrastructure, research and product.
Research obviously has this job of building these models, which will ultimately be superintelligent over time. Product is responsible for ensuring that over time they do get distributed and used in novel and interesting ways by the world. And infrastructure is this very difficult challenge of building literally the largest data centers in the world and continuing to scale those over time. I think that over time, not only having the distribution of all of Meta's products, but also truly having this incredibly talented team, is going to prove to be a huge differentiator. One of our guidelines for building the team is that people have to be in the very top handful from one of the other labs, and if you just do that, if you just build a team of the very best people from the industry, you're going to be very successful. Talk about the advantage of having a hardware team that's been at it for a decade versus maybe starting in the last year. How closely are you in touch with them in terms of knowing what capabilities will be coming down the pipeline from the MSL side? I mean, the amount of engineering that has gone into this thing is absolutely incredible. They have the transparent versions. We can see all the fucking shit. Yeah. People have painstakingly engineered it over the course of a decade. Yeah. I mean, we've done a couple demos over the last month or so, but this is a product that people are going to be able to have their hands on in two weeks. 100%. And fundamentally, I think glasses are the natural delivery mechanism for superintelligence. You need something that will see what you see, hear what you hear, and can easily deliver information to you. Yeah, it's literally right next to the human sensors.
The human sensors and the human brain, right there next to the digital sensors. Merged. Yeah. Slapped together. Yeah, it's happening. And my view is, it will literally just feel like cognitive enhancement. You'll gain 100 IQ points by having your superintelligence right next to you. Yeah. Or, talk about chatbots, right? It feels like the chatbot meta is here, but it doesn't feel like what's going to be the most important thing a decade from now. Yeah, I mean, I think fundamentally, if you look at the AI industry, there's been relatively low innovation on the product side. ChatGPT was one of the first products that we had, and chat still is the dominant product for AI delivery. And then on code you've seen innovation with Cursor and agents and all these other products. But from a product innovation standpoint, we're still very much in some local maximum, and this is true of any consumer product: there are going to be many innings of innovation that come along the way. And so our bet is that we're going to be able to be pretty bold and iterate and build some very innovative new product experiences. Do you buy into that idea of AI writing 90% of your code? Is that just you're writing 10 times as much code, or you can write the same amount as a thousand-person team with 100 people? What does that actually mean when people throw around that 90% of code will be written by AI? Yeah, I think it's impossible to overstate the degree to which I've been radicalized by AI coding. I think that fundamentally the role of an engineer is just very different now than it was before.
And you know, I think it feels obviously true that for any engineer, including me, all the code I've written in my life will be replaceable by what an AI model will be able to produce within the next five years. Yeah. What's your advice for young people then? I think you just have to figure out how to use the tools maximally. It's actually, in some ways, this incredible moment of discontinuity, where if you just happen to spend 10,000 hours playing with the tools and figuring out how to use them better than other people, that's a huge advantage. And adults all have jobs. So we're not, you know, you guys are on freaking TVPN, you're not vibe coding. Where's your Claude Code? Yeah, it is interesting. We were at YC demo day last week, talking and looking at the eras of the sneaker flippers, the people doing Minecraft servers. And it feels like the people today are going to be leveraging the tools not just to learn them, but actually making money from them while they're in middle school, high school, etc. I think it's exactly that kind of thing. It's almost like when personal computers first came about, or just computing in general: the people who spent the most time with it and grew up with it had this immense advantage in the future economy, like the Bill Gateses or even the Mark Zuckerbergs of the world. So I think that moment is happening right now. And if you are 13 years old, you should spend all of your time vibe coding, and that's how you should live your life. It's amazing. Well, thank you so much for coming on the show. We'll have to have you back for a longer conversation again soon. This is fantastic. Congrats on the new gig. In the meantime, let me tell you about Turbopuffer. Turbopuffer.com: serverless vector and full-text search built from first principles on object storage.
Fast, 10x cheaper, and extremely scalable. Used by the best. We have Andrew Bosworth, Boz. How you doing? Welcome. What shoes are you wearing? We saw them photographed in the keynote. Let's go ahead and do this. Let us know. We got the shoes. There we go. And they say Boz on them. Yeah. Alex Alpert, Brooklyn-based artist, he's on Instagram. Fantastic. Honestly, he was solving a problem for us. We're like, how can Mark use the zoom feature when he's one foot away from me? Like, he is right next to my face. I don't want it to be my face. Yeah, yeah. And so we needed some detailed shoes for it. That's great. On the Oakleys, no less. We're being brand loyal. Yep. Oh, that's great. In the future, do you think I'll be able to just point the glasses at the shoes and say, hey, go pull these up for me, order them, deliver them to me? 100%. Ideally in the future, you don't even have to; later on, you're just like, oh, I wanted those shoes, and it knew that I wanted them. I love that. That's great. I've got a budget-authorized credit card. Yeah, this is my budget. No, this is great. So here's the thing. The thing that feels incredible here is that you walk onto the show and you're wearing the new Displays, and you're not thinking, oh, this guy's wearing a computer on his face. It's crazy. You're saying, this guy's wearing a pair of glasses. We do have this problem, I think, in the tech industry, where we look at a set of features and we're like, oh, these two things are the same. It's not the same. You look at what's come before and you look at this, and you're like, I'm sorry, this is a different thing. And that's where we've been from day one. Mark said it: if they're not great glasses first, we're just not doing them. Yeah.
You told Ben Thompson you shipped the V1, but the V3 is what you want your V1 to be. Has there been a secret V2? Because I feel like there was Orion, and then here, this is the actual V3, but it seems polished. So, Orion is a different tech line. It's a whole tech tree of augmented reality, which is very much following the V1, V2, V3 progression, for sure. I will say this is an uncommonly good V1. It's good. I'm not gonna lie to you, it's good. A lot of that is because the neural interface is a V2. Yeah. And you can really feel it, the neural interface feeling as smooth as it does, as natural as it does. What's funny to me now is if I wear the Ray-Ban Metas and just walk around, I'll be pissed. I'll be using the interface and I'm like, ah, I'm not wearing the display glasses, I can't use the interface. It's crazy how natural it is. We picked it up. I mean, we've done a couple demos now, and you pick it up and it's just an immediate translation from using a regular mobile device. So a year ago, Jordy made the prediction that in the future you might have multiple pairs of glasses. A work pair, a sports pair. Is that gonna hold for a long time? Yeah, people already have a lot of glasses. You've got different styles, you've got different things. I think that's appropriate in the future. Sometimes you want to have the full functionality of augmented reality; that's one zone. Sometimes you're just going to your kid's soccer game and you want to take videos; that's all you need. I do think you're going to end up with strata across the entire line, where you get into full AR, these AI smart glasses with displays, AI glasses that don't have displays, and maybe even some stuff at the lower end. There's a whole range there, and people should be able to dial in what they want out of that. If you were a young founder excited about AR, how would you be planning the next five years?
Well, there's two really important things. The first one is you have to embrace AI, and these are really tightly connected. People didn't see that five, six years ago. Now it's so clear. It's very unlikely to me that in the AR future you're going to have an app store. We've got to give you some credit for that, by the way. I think broadly, tech didn't see the intersection in the same way. Yeah, now it feels natural because you guys are up there pitching live AI, real-time AI, and it's like, oh, makes total sense. But they felt like different tech trees at one point. In the future, I don't want to go figure out what the app for my toaster was called; I just need to say, make toast, man, and let the AI handle the back end. So it's a lot about what's the functionality you're producing and how is that going to integrate with AI. And then I would be thinking a lot about dynamic UI. That's the thing that no one's cracked yet, including us: how do you get it so that the UI that I need is available when I need it, generated in real time, as opposed to this fixed set of things that I've got to go learn every time I have a new appliance in my life? Totally. Talk about the tech tree in VR. James Cameron was on stage. I've said for a long time, and I don't know if you agree with me, but I feel like people will be watching movies in VR before they're playing fully immersive 100-hour AAA games, because you've got to get the install base up, and Avatar in 3D already exists. It's a great experience. You know, he's such a passionate guy and he cares so much about the quality of the product. He really was not a fan of headsets until Quest 3, when it finally got high-res enough, and AVP and all these different things. And then he was like, oh, I'm in. He went from not that interested to all the way in. You saw him on stage, just hyped, fired up. He's totally fired up. And that's because we finally crossed over.
So up until then, you know, listen, watching on your TV was better. It was like, why would you watch it in the headset? That's not true anymore, and it won't be true in the future. He's seen the future. He revealed all my secrets on stage; I haven't even reined them back in. And it just keeps getting better. The future isn't just the tech, though, and people underestimate this. A big part of what is premium in the headset space is lightness, is weight, is comfort. Those are premium features, and that's kind of different than the previous generation. That's not how it was with phones or laptops, where you wanted it to feel solid and sturdy, not plasticky and light. But when it's a wearable, that is one of the most premium things you can deliver. So we are looking at not just the technology, but how you package it into the smallest amount of space and weight. Talk about the decisions around the heads-up display, specifically the display lenses. They allow you to interact with the real world; it's not meant to cut you off. But what went into those decisions? Yeah. So from day one, Mark, you heard him on stage, we wanted this not to be interruptive. If this is a thing that's constantly flashing notifications up on your face, that's a pretty annoying piece of technology. That's not a technology you're going to be delighted by. So literally the way we did this was we thought, what are the top 10 things that you take your phone out of your pocket for? Taking a picture, changing the song, listening to music, getting a playlist, sending messages, getting navigation directions. And we just started working down that list and making sure that we could do those things on the glasses. We're doing it in partnership with the phone; it's all part of the same ecosystem. But that's one less thing
you have to take your phone out of your pocket for. That's one more minute that you're engaged in whatever it is that you're actually doing. I mean, the handwriting stuff, did you guys try that demo? Crazy. You know, that was a maybe-2027 thing. No way. And then in the last year, we just blitzed it. The team did incredible work to blitz that. And that's another thing: now you can respond to messages without having to take your phone out. So who's the fastest handwriter at Meta now? I'm not exaggerating. This will sound like I'm blowing smoke, but that's not who I am, because I'm very competitive. It is Mark. Now, he was not good to start. He got his glasses, he got the handwriting, like, two weeks ago. But he knew he was doing it on stage, so he has been practicing. He runs this company on WhatsApp. Yeah. And so he has been doing every single message any of us have gotten for two weeks. Is it WhatsApp? I literally think he went to like the 99th percentile in words per minute. We have a touch typing demo that we do with no keyboard, nothing, just from cameras. And I was number one till Susan Li, our CFO, an Excel jockey, crushed it. She's always number one; I'm number two at that. But, yeah, no, I'm not exaggerating when I say Mark. We watched him on stage, like, oh, damn. The keyboard demo. You're just. You're leaking that. But that seems incredible. I mean, the name of the company is Meta Platforms. It feels like this is a hardware platform. There weren't that many things where I was like, I want an independent developer to play around with this. But I want the innovation to flourish. I want the kid in the college dorm room to build something that runs on that. What is that going to look like? I want the beer app. Come on, give us the beer app. The pressure has been immense on us ever since the Ray-Ban Metas really became a hit last year to produce there.
We have some exciting announcements tomorrow around developer APIs. For some apps, it's just too tough on the glasses. You know, we worked with Spotify to do the Spotify app, and we really had to rebuild it with them, help them design it to make sure it met the specs. Yep. Even Instagram on the glasses, we had to redesign with Adam. The thermal and space constraints are so tight, and it's so expensive to run the radio; you lose your battery and thermal headroom so fast. So the app situation is the worst it's ever going to be, right? That's right. Everything is exquisite right now. Yeah. Over time, obviously, we want to buy that space back and open it up to developers. But again, I think a lot of it is going to go through the AI. A lot of it is going to be: you invoke the AI to accomplish the task. And that's not just on us. Whether it's MCP or something else, that's on our industry. We've got to continue to build what is the web of interaction design for AI apps. Yeah. Because we all know that is where things are headed. What about on the other side, the big tech partners that could potentially vend messaging into this? We were at YC demo day a few weeks ago and we were talking to a team that said, well, this AI agent will run on your laptop and it'll suck in all your messages. And we're like, that's probably not going to last for that long. But is there any hope that other platforms will play ball and say, yes, there's enough demand? What's it take to get iMessage showing up in here, or Gmail, or anything like that? Yeah. So we would love to work with these partners, as you can imagine. And I get it, we're so early on in the technology. For Google or for Apple, at first they're just thinking, can we do it at all? Can we do our own version? What does that look like?
But we have an opportunity here, I think, as Meta, to establish a consumer category that nobody's in. And the more people play in that category, the more attractive it is for us to work together to make sure all of our use cases are supported there. You never want the platform to get in the way of a great consumer experience, and that's true for us and that's true for them. So it's too early to say. We're literally day zero, actually probably day minus 30, on these things, but we are getting there rapidly. Yeah. It looks like there are cuts on the glasses. Is that a design touch, or is that actually a waveguide? Is that a functional feature? Yeah, these are called the input gratings. Okay. So you've got a little liquid crystal on silicon display right here. It's piping light in via total internal reflection; it spreads that light out across a bunch of different pipes and channels that then shoot it into my eye through what we call a geometric waveguide. And so I have a display on right now. So you can't. You do? And we can't see it. You can't see it. That's crazy. Wait, did you just turn it off? Yeah, I just turned it off. That's remarkable. I can't wait for people to go and just demo these, because it literally is the science fiction we've been promised. The heads-up display is in every sci-fi video. I told my team, it's yesterday's future today. Yeah. It's like the things that we were promised are finally arriving. Thank you guys for doing a demo and letting people buy it this year. That doesn't happen that much. It's a bold statement in AR and we appreciate that. Yeah. What's next for you at Meta? What are you focused on? What do you want to deliver this year? Yeah, for us, look, the really big arc is obviously continuing towards full AR.
So we're really excited that we're supporting the entire strata we talked about before. Full AR display glasses, regular glasses, and even other exciting form factors that I can't yet tease. On the VR side, we're advancing the hardware. We have multiple fronts that we're advancing the hardware on, and also on the software side, supporting creators. GenAI, don't sleep on it. That is the real unlock. You've got Roblox, you've got Minecraft. They're awesome tools. You've got tremendous communities of creators there. But there is a ceiling on how good the rendering can be in those platforms. You know. And it's also. You have to be a certain level of creator to be able to produce good content in those platforms. And with GenAI, you can lower the floor.

We just played around with the, basically, prompt-to-game functionality. It's crazy. I was talking with the team that gave us the demo, and I was like, it was not that long ago that I was coding little iPhone apps. Pong. Yeah. It would take me two days to get a functioning app, and you're able to just prompt it and be like, hey, change this character completely. Change the entire world. And getting a decent 3D texture used to be like, you needed an artist, you needed a whole bunch of things. And then also with the new Horizon engine, making it so it looks better. And we think that's gonna be something that appeals to people not just in headset, but on mobile.

Yeah. Well, I remember, I believe it was a couple years ago, you shared a photo of your home setup and you had the teleprompter and you had the VR headset. What does that look like today? A lot of the focus has been about getting the glasses into the real world with the Run Club. But are you using these in front of your computer as well? Yeah, I use these. At this point, I'm pretty much using these all the time. Me and Mark have kind of just been on nonstop.
Once you get these on, you start using them and you're in. The messaging thing, like, it's pretty next level. Like, even earlier, I was just coming off a stage, Mark was messaging me about, you know, the things we. I will say, the future's here. As a CTO, you feel a certain responsibility to your setup. You gotta really go over the top. Yeah. I'm now shooting. It's hilarious. Like, Leica cinema lenses. Yeah. The ones that Iñárritu shot Birdman on. That's, like, my VC setup. I've gone too far. That's amazing. I've gone too far and there's no turning back. Well, we've got to have you on the show. Call in with those lenses. We'd love to have you remotely. That's amazing.

We're excited to get the glasses because John and I will be on the show and we want to not be on our screens. Right. I usually have a computer open when we're at our studio because the team might text me, hey, this guest is running late, whatever. Being in a world where we can just be live and get a notification, hey, we got a guest running two minutes behind. And it is underrated. I mean, obviously, we're all hoping for, like, App Store, super open development, but I was just talking to some of our team, and I was like, wait, we could actually pipe tons of crazy stuff just through the WhatsApp API. That's not that crazy to do. And WhatsApp has a bunch of primitives that you can build around, so there's going to be some cool stuff. And there's already WhatsApp apps and the whole ecosystem of developers there. Yeah. Awesome. You know Alex Himel, who runs the wearables division? Yeah. A year ago, when he got the first prototypes, he gave a whole speech, and nobody knew he was doing the teleprompter entirely on the glasses. And he was swiping the slides. There you go. Gestures. So there's a lot of potential here for these kinds of integrations. Yeah. Yeah. That's great. Well, thank you so much for coming on the show. This is fantastic, by the way.
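Editor's aside: the "pipe stuff through the WhatsApp API" idea mentioned here would, in practice, go through Meta's WhatsApp Business Cloud API. Below is a minimal sketch of what sending a "guest is running late" notification looks like, following the payload shape in Meta's published Cloud API docs. `PHONE_NUMBER_ID` and the access token are placeholders you would get from a real WhatsApp Business app; the POST is left commented out.

```python
# Sketch of a text message via the WhatsApp Business Cloud API.
# Payload shape follows Meta's Cloud API docs; credentials are placeholders.

GRAPH_BASE = "https://graph.facebook.com/v17.0"

def build_text_message(to_e164: str, body: str) -> dict:
    """Build the JSON payload for a plain-text Cloud API message."""
    return {
        "messaging_product": "whatsapp",  # required literal per the API docs
        "to": to_e164,                    # recipient in E.164 format
        "type": "text",
        "text": {"body": body},
    }

def send_url(phone_number_id: str) -> str:
    """Messages endpoint for the business's registered phone number ID."""
    return f"{GRAPH_BASE}/{phone_number_id}/messages"

payload = build_text_message("+15551234567", "Heads up: guest running 2 min behind")
# In a real integration you would POST it, e.g.:
# requests.post(send_url(PHONE_NUMBER_ID), json=payload,
#               headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
```

The same endpoint also accepts `type: "template"`, `"image"`, and other message primitives, which is what makes WhatsApp a plausible notification channel for glasses without a full app SDK.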
I love the show. Thank you. This thing rocks. We have a lot of fun. And I want to get out shotgun shooting with you after all your practice in Robo Recall. Yeah. Yeah. All right. It really did, like, completely change the game. I never shot a gun in my life. I can't believe that's a true story. It is a true story. It is a true story. And I don't remember Robo Recall. Yeah, no, no. That's a spectacular, spectacular story. Yeah. But it was remarkable. Yeah, it was great. Anyway, thanks so much for coming on the show. This is great. We will talk to him later.

Up next, we have Eva Chen. Next time we do this. Yes. 12 hour stream, 30 minutes per guest. Yeah, for sure, for sure. Not nearly enough time, but we have Eva Chen, the VP of fashion partnerships. Welcome to the stream. Thank you so much. Well, I'm learning. I'm learning. It's good here. Let's throw this on. I'm more of an enthusiast. Yes, Jordy's the enthusiast right there. Okay. Yeah. I'm John. This is Jordy. John and Jordy. Jordy and Jordy. Jordy. Jordy George. Jordy Hayes. Jordy. Okay, here we go. Like Finding Dory. And I was like, something like that. Call us whatever you want. We're just happy to have you here. Thank you so much. For now. You can hear us? I can hear you perfectly. It's quite a vibe here. I mean, it's insane. Guys, it's a music festival. What frames are you wearing? I'm wearing the. Are we live, right? Yeah, we're live. Oh, my goodness. Here we go. We're live. Welcome. All right, I'm wearing the Skyler, which is like a subtle cat eye. Looks good. And they're Transitions. I like it. And you can put prescriptions in them, and it's like a great everyday glass. I wear them all the time, using them for headphones and live streaming, taking content for Instagram. Yeah, there's been tons of fashion partnerships. Mark referenced them online. The first Labubu, actually. Is this a Labubu? Is this the first Labubu? This is the first one live on the show.
We sent one to a guest, Bill Bishop. We sent him a Labubu. That is a wild one. It is a Meta branded Labubu. A little Meta bucket hat. Fantastic. And the jorts. One of twenty. One of 20. Mark actually just got the last one. There you go. Sorry, guys. Gonna be crazy.

Talk about, actually, how to get these products accepted in the fashion community. The fashion community, I don't know that much about it, but, you know, very exclusive. Limited release, only 20 of those. This product's available for everyone. How did you actually think about the steps of what activations, what partnerships you want to do, in what order, to actually get traction within the fashion community? Totally. Well, the first thing is to make a stylish glass, period. Something that, like, people on campus here are wearing them, they look like regular glasses and they blend right into everyone's style. When you partner with a company like EssilorLuxottica and you're working with Ray-Ban, which has, like, the number one, most iconic glass. It's iconic. Think about it. James Dean. Oh, yeah. You know, like the silhouette. Like Bruce Springsteen. Like everyone. Bob Dylan. I remember Casey Neistat. That's what I think of. Yeah. I think about James Dean a little bit more. But, you know, they're an iconic glass, and it just blends into everyone's style. We're here on the Meta campus. You can see hundreds of people wearing these glasses. Looks good on everyone. That's the first foundational thing. Yeah. And then in terms of the technology, once people try these on. I've worn these to the fashion shows, front row, Milan, Paris, London. Everyone who tries them on is blown away, because not only do they look good, the capabilities are just next level. What's coming next? Unbelievable. Yeah. What does the future of the Meta Ray-Ban Display look like in fashion? Am I going to be able to put these on in a physical
changing room, and have it AI generate me in a different outfit? I mean, I was already doing that earlier. Yeah. Like an AR try-on mechanism. Yeah. I was looking in one of the setups that they had over there. Oh, yeah, that's right. Yeah, yeah. What does that look like? I mean, that's the dream, right? That's something that, as someone who works in the fashion industry, is kind of the holy grail. Be able to try on glasses and then see yourself and style yourself in different outfits. I don't know if you've seen the movie Clueless. It is a seminal movie for me in terms of fashion. Oh yeah, fantastic movie. Jordy has never seen a movie in his life, but I've seen all of them, so. Okay. Clueless is great. Start with Clueless. Okay, but there's a scene where Cher Horowitz is going through kind of like virtual try-on. Sure. Yeah. But that's, like, another one of those things that's been sci-fi, right? Like, display feels like science fiction today. We need to get there. But for now, with these new glasses, the ability to kind of have that experience of seeing something, asking Meta AI, hey Meta, how would you describe this dress? And it would say, oh, this is like a 1950s fit-and-flare dress style. That's going to be so, so helpful for people as they're learning and stretching in the fashion industry. Yeah, yeah.

The other thing, I mean, we were talking with, was it Boz or Chris, about this. But just, like, seeing things in the real world and being like, I want that, and being able to basically decrease the friction. Oh, I don't have to Google it or do a reverse image search. It's just, like, instant, right? Yeah, there's a lot of friction right now, just in general, with fashion. Right. Like, I'm on Instagram, I'm scrolling, I see a friend's really cute outfit, I have to tap.
If she didn't tag it, then I have to, like, screenshot it, WhatsApp it to her and be like, hey, Ami, where are these shoes from? And then she has to write back. Then I Google it. And it's just so many steps from inspiration to actual purchase. And I mean, listen, that's my dream in terms of Reality Labs, to get there, to make commerce easier. But for now, I think in terms of the everyday consumer, just being able to see the world around you in 3D, to be able to ask questions and feel like you're being interactive, it's amazing. Well, thank you so much for coming on the show. This is fantastic. Congratulations. More fashion segments. Yes, absolutely. We'd love that. We got that. We got that. Thank you so much. Have a great day. Bye.

Let me tell you about Numeral. numeralhq.com. Sales tax on autopilot. Spend less than 5 minutes per month on sales tax compliance.

Our next guest is Tiffany Jansen. Welcome to the stream. Tiff, how you doing? Good to see you. Welcome to the stream. Can you hear us? What's your day been like? How have you been enjoying Meta Connect 2025? You know, the day's flown by. Let's get the mic up a little bit. Perfect. The day's been great. It's flown by. I mean, the product announcements were phenomenal. I got a demo of the Meta Ray-Ban Displays yesterday, so it was really exciting to see the announcements around it. It blew me away. Do you think that these products are ready to be integrated into creator workflows? Your workflow? How do you think they fit in? Obviously, there's so many creative tools. When would you pull a Meta Ray-Ban product off the shelf? Oh, absolutely. I mean, well, for one, while you're on the go creating content. It's huge for organic content, being able to capture those moments instantaneously. I think it's going to be a game changer. Even thinking about the Meta Ray-Ban Displays, using Meta AI while I'm walking around and ideating, that's huge for a creator to stand out. Sure, sure, sure.
What advice do you have for people who are maybe getting started on creating content on Meta platforms? Utilize AI. Really treat it as almost a coworker. It's one of my favorite ways to work. If I have an idea but I maybe can't fully piece it together and I want someone to bounce it off of. You know, I work by myself, I work from home. I need that collaboration. You really don't have a. How big is your team? My team, I mean, my team is about five people. But you're saying when you're in that creative workflow. Yeah, when you're in that creative workflow. I mean, for scripting, for coming up with ideas, that's my role, that's my job. So, you know, I have some people I can ideate with, but I find, honestly, the more I do that with something like Meta AI, and it can keep track of what I'm thinking and really wanting to put together, it's almost better sometimes. Any advice for content creators that want to interview the Mag7 CEOs? Yeah, you've been doing that. Who have you interviewed so far? So, so far. Last year was Mark, and I know you guys have him up next. And then there was Jensen. I did Satya. Satya Nadella. Yeah, I did. You know, the list goes on, and it's been great. You know what? I think the key or the secret is just be yourself, be authentic, be knowledgeable with what they're building. And yeah, they're also down to earth. It was great. Yeah.

We asked a couple other folks this. How much content do you think on Instagram is going to be AI generated in five years? That's a really great question. Well, I mean, okay. Five years, let's say. What about right now? I would say we're already probably at 40 to 50%, honestly. I think certainly, like, AI enhanced, AI enabled, AI in the loop, but there's still a human somewhere involved. Yeah. It's kind of the question of, like, what company's an Internet company. Yeah, it kind of just plays in the background. Right. Product. But they distribute it with that. Right?
Yeah. You kind of just forget about it. I know. Yeah. I don't know what it will be. Maybe it will be closer to, you know, 80 to 90% where in some way AI touches it. But yeah, human taste and point of view. I like the human touch. Yeah. I mean, if you go back to the original Instagram, it's like, what percentage of photos were not filtered? Hashtag no filter. Like, that was a trend. But most people filtered them. And in the future, most people would be like, yeah, check the box. Yeah, check the box to also fix the lighting or add subtitles automatically. Like, there's a lot of things that AI can do that still keep the core human element, but then add a bunch of stuff on top. Collaboration. Yeah. Yeah, there's also. Have you seen those videos on Instagram where people take some very technical concept and then turn it into a song? I saw one around steel coils. Have you seen this one? I have, I have. And that's one where it came from a human. Clearly that came from the brain of a human. And then AI was just used to make the song and the voiceover and stuff. Anyway, it's been fantastic having you on the show. Thank you so much for hopping on. Great to hang, guys. Cheers. We'll talk to you soon.

And up next we have Vanta. Vanta.com. Automate compliance, manage risk, improve trust continuously. Vanta's trust management platform helps you get compliant fast.

We have Mark Zuckerberg joining in just a few minutes. Founder mode. What has your main reaction been to the overall Meta Connect 2025, Jordy? How you doing? I think it's impressive to see. Like, you know, the immediate reaction I have is how important it is to keep the band together. Right. Like, it is crazy how long some of these folks have been there. Adam Mosseri. Right. It's like you need to keep talent focused. And yeah, I think talking with Boz and, like, understanding. I do feel like, you know, only five years ago it was not.
People were not seeing the connection between. They just weren't seeing the connection between glasses and AR, VR and AI, and the intersection is just beautiful. Yeah, yeah. The original Meta Ray-Bans, it felt like such an add-on, little side project almost. And now it's the center of their annual keynote and they're really building a lot of different stuff on top of it. That's been fascinating. We got to read a post here from. What you got? Atlas Creatine Cycle. You thought we weren't going to print out posts and read them? Here we are. Here you are. Atlas, live at Meta Connect. My prompt: Let's get some ice. Let's get you some ice cream. GF agent: Okay. Yay. Will you have some? My prompt: Probably not. I'm kind of full. GF agent: Okay, fine. Thought for 46 seconds. I'm not hungry. I honestly don't understand this. The classic interaction. If it recreates human interaction perfectly, it will behave exactly like a. We got a rude post here. It says, bro, last night was a testament to our culture and civilization. It really was an absolute party.

We are bringing Mark Zuckerberg on. Before he hops on, let me tell you about Ramp. ramp.com. Time is money. Save both. Use corporate cards, bill payments, accounting and a whole lot more, all in one place.

Here we go. Let's bring him on. Mark Zuckerberg, live on TVPN. Welcome to the stream. How you doing, Mark? Good to see you. Great to see you. Congratulations on a massive day. You got a bunch of fans here. Love to see it. Yeah, it's a fun one. You still winded from the run? No, that was a pretty conversational pace. That's conversational pace. Love it. Yeah. React to the Connect announcements. How do you envision the next phase of this with developers? I mean, there are so many cool ideas that I could imagine happening on Ray-Ban Display, but there's an immense amount of constraints operating in such a small format. What does this look like over the next couple years?
Yeah, well, I mean, I think that there are two platforms here that are interesting. One is the display glasses and the other is the neural band. Sure. And I actually think both of them could evolve into important platforms by themselves. So the glasses, I actually think there, it's pretty clear. Right. I mean, you saw, there's the nav, where there's a bunch of different apps. We're going to try to, you know, start off with partnerships and start off getting some of the most used use cases and really nailing those and getting them in there. And then over time, hopefully we'll, we'll be able to open it up in some way, but I think we need to figure that out. The neural band, I think, is going to be an interesting platform by itself because I mean, right now we're basically, we designed it to be able to power glasses. I mean, that was the purpose. But there's no rule that says that it can only be used to power glasses. So I think that's an interesting thing to explore over time too. I mean, you can imagine, you know, something like this when you're sitting at home and watching TV being pretty cool too. So I think we need to figure out what direction this goes in over time. But this is a pretty good start. We've had this for many years. The display is going to get all the attention, but the neural band is insane. I can't wait for people to try it. I mean, the fact that you can buy this in a couple weeks is just insane. Talk about the team's foresight around the intersection of glasses and AI, because now it seems incredibly obvious, right this, like always on this live AI, but it wasn't that long ago that people thought these were like two different sort of like tech trees and they didn't see the convergence. Yeah, I mean, look, every new important technology needs a new class of devices in order to make it first class. And I think glasses have three main advantages that I think are going to be just make them the ideal candidate to be the next major computing platform. 
One is that they help preserve this sense of presence when you're there with another person. I mean, you take out your phone, you're gone from the moment. Glasses have the ability to bring that back. Two is that glasses, I think, are the ideal form factor for AI, because it's the only device type where you can let an AI see what you see, hear what you hear, talk to you throughout the day. Soon it's gonna be able to just generate a UI visually for you in your vision, in real time. And then the third thing that glasses can do, it's really the only form factor that can bring together the physical world that you have around you with realistic holograms and blend those together. And I think it's one of the crazier things about living in the modern world, that we have this incredibly rich online world and you access it through this 5-inch screen most of the time. So I just think that it's only a matter of time before these two things are basically fully merged, and glasses enable all of that. That's kind of been the plan all along. I mean, when we started Reality Labs, or the kind of precursor to it, I think it was back in 2014, it was basically: we went public and became profitable, and then that's when we started working on these longer term bets. That's when I started FAIR for our AI research and we started the precursor to Reality Labs. But yeah, no, I think that these two tech paths really kind of go together.

I noticed you took a picture of Boz's shoes on stage. Yeah, they were nice shoes. They're fantastic. We saw them on the stream. Talk about what personal superintelligence means longer term. Is there a world where I'd be able to take a picture, look at your watch, say, that looks like a good gift for my business partner. Find it, order it, send it to him. Yeah, well, I mean, look, I think where we're really going with personal superintelligence and the glasses, it's more the live AI vision that I talked about.
So right now with the glasses, you basically, you can invoke that AI. You can say, hey, meta. You can, you know, do the gesture with the glass with the, the neural band, bring it up and you can ask it a question. But I think where this is going over time is basically you're going to have. It's just going to be on all the time, right? You'll be able to turn it off and you'll obviously be able to have control over it and all that, but. But you'll be able to think about AI as more something that is just running all the time, that has context on your conversations. If there's something that comes up in a conversation that it thinks that you should know as you're having the conversation, it'll be able to go off and think and find the answer to that and then just show it in the corner of your vision. If it thinks of something that it thinks that maybe you should be reminded of after you're done with the conversation, it'll be able to go off and process that and come back. So I actually think that this kind of agentic AI vision of it having context to what's going on in your life and then being able to go off and do work for you and then bring that into your view when it makes sense, I think is going to be really powerful. That's a good conversation. It can sense that you're forgetting a word and it just pops it up. Yeah, yeah. Or my version of this. I mean, I just. Ever since I've been thinking about this, I've just been running this thought experiment where every time I'm having a conversation during the day, I'm like, wow, like, there's information that I wish I had during this conversation. I mean, the most annoying thing to me is like, you're having a conversation. It's like you need to go check in with someone else about something and then go back to the person you're talking to here. It's like, all right, you can just like send them a quick message with the neural band, get the information that you need, right? It's kind of like multitasking. 
You're like the best power user for this. I've run a couple brands. I've advertised on Facebook a bunch. I've advertised on Meta platforms. What does the role of a brand look like? Is it shifting in the age of superintelligence? If everyone, if all my customers have personal superintelligence, is my experience running a brand going to be different? Well, I think the brands are going to become more important. Right. I mean, I basically think that, like, all economic theory assumes that people have access to perfect information. And I think the Internet took us a step closer to that, and AI is going to take us another step closer to that. But in a world of perfect information, what matters? It's that people trust you and that you have a good reputation and that they know your work. So, yeah, I think that the evolution of how people think about brands, that will obviously shift with every new technology, but I think it's only going to get more important. Yeah, I feel like there's already a little process where people find a product, they see it on Instagram, but then they might search the comments or go to their favorite creator. What does my creator friend think about this? And superintelligence being able to go around, do some of that for you, surface it all. That makes a ton of sense. Totally.

Talk about the work with James Cameron and the future for virtual reality. How many pairs of glasses do you think people will have in the longer term? That's a good question. Interesting tension between condensing everything into a single pair of glasses, yet at the same time, humans love variety. You don't want to wear the same watch every day. Yeah, yeah. I think that you're clearly going to be able to have a lot of interactive and immersive experiences on glasses. AR glasses. But I think the right analogy is kind of like: augmented reality is the future of phones.
It's the mobile thing that you're going to take out with you. And I think virtual reality is the future of TVs. And the reality is that, you know, the average American. I actually think it's still. They spend about as much time on a TV on a daily basis as they spend on their phone. Yeah. But they're different use cases. Right. One is more immersive and interactive. I think they're both going to be important, and the experiences are going to be limited by how much compute you have. You are just going to have less compute in augmented reality glasses. I mean, you only have so much space to fit a battery and compute, and the connectivity to whatever other device it's running with is not tethered. Right. Because you don't have a wire. Whereas with VR you just have more real estate. So it's kind of like the difference today between mobile games on your phone and much more advanced games on a gaming console or a PC, which can have a lot more processing. I think the same is going to be true here. You're going to be able to have great experiences on the glasses, kind of akin to your phone. You can do pretty much anything. You can watch videos on your phone, do whatever. But if you want the most immersive version of it, I think there's going to be a dedicated thing for that. Yeah. You're using the Meta Ray-Ban Display on the way to the office, reading some emails. You might sit down at the desk, walk in. Right. Yeah.

I think it's time for a size gong. You have some big announcements today. We'd love for you to hit this gong for us. Here we go. Congratulations on Meta Connect 2025. Fantastic. We would love for you to sign it. As big as we want? Signed. We're going to retire this one and we're going to hang the gong. Hang it in the rafters. All right, there you go. Thank you so much. Congratulations. Thank you. Good to see you guys. Have a great rest of your day. Sci-fi into the present.
All right, see you guys. Fantastic watch, by the way. Thank you. We will bring on our next guest, but first let's tell you about Figma. Figma.com. Think bigger, build faster. Figma helps design and development teams build great products together.

We have people standing over here. It is crazy. Waving at Zuck. This is our biggest live show ever, for sure. Lots of people here. We have Alex Himel joining in a couple minutes. Okay. We're going to hang out. Alex Himel is the VP of wearables. We will. Oh, show the gong. We want to show off the gong. Careful here. This will be retired. This one right here. This one. This camera over here. There we go. Look at that. There we go. Mark Zuckerberg's signature on it. Love to see it. We are building the museum of technology business back home. Yes. And that will be a staple. I like that.

I think the agentic commerce thing is going to be a big discussion over the next year. We saw OpenAI teasing it. There was that leaked screenshot, or let's say it was a kind of intentionally leaked screenshot, of ChatGPT having an orders tab. Google's obviously thinking about that. Now, what Zuck was saying about having perfect information. Right. The Internet meant that consumers could easily research a product. Right. You could be at a store, look up reviews. What does the creator think about it? Now it's even less friction, in that you can just be looking at something and you can be pulling up, hey, Andrew Huberman actually doesn't like it. Not a big fan of it. Yeah. It was funny when you were saying, like, I'll have my credit card saved with Meta, and I'm pretty sure they probably have hundreds of millions of people that already have credit cards saved. I already have one saved from the Meta AI app because I put one down to buy stuff in the Oculus Quest store a long time ago. We have Alex Himel coming on the stream next. Award winning. Looks like.
Yeah, it looks like he won an award. Let's bring him on the stream whenever he's ready. Thank you for tuning in to TVPN, live from Meta Connect 2025. We appreciate you. Here we go. Let's bring on Alex Himel. Welcome to the stream. How you doing? Good to see you. Fresh off a run. Throw this on. There you go. Yeah. Having fun. Congratulations on the day. Absolutely massive. And let's get that mic down. There you go. Perfect. All right, we in position? Yeah. Yeah. Okay, take us through your. Look at those. By the way, we were bummed. I wanted to open the show with the Vanguards, but they were under embargo. But those will be Jordy's daily driver. What time did you start? We started 4:30. We could wait like one more hour. I know, I know. Walk me through your role, how you fit into the organization, what you needed to do specifically to get all the products out today. Well, today's a big day for us. I lead the wearables group at Meta, and we announced a whole bunch of wearables today. We had a few things. Yeah. Usually we announce one device, but today was a real stack-up. We had the Ray-Ban Meta glasses, we had software updates for them, and we had brand new Ray-Ban Meta Glasses Generation 2, with tech improvements for the battery life, the image quality, got an AI mode for the camera. Talked about the Oakley Meta HSTNs, the Oakley Meta Vanguards, which I'm wearing, which are designed for sport, which is pretty exciting. And then, of course, our first pair of display glasses. Don't forget the neural band, which I'm wearing. I've been wearing it all day. What lessons are you pulling from previous Meta projects around wearables, hardware, supply chain? Like, there are new challenges, but I feel like you've done a lot of this stuff before. Those are my kids over there. They got better, they got radio, they're looking great. Yeah, we got to get a small pair for the kids at home. I know, I know.
Well, so we've been working with EssilorLuxottica for a few years now, and our first generation was the Ray-Ban Stories, and those didn't do as well as we had hoped. Then Ray-Ban Meta, the Gen 2, really exceeded our expectations. What was the metric? Was it just overall sales or churn? Because I feel like whenever we're talking wearables, you know, Christmas comes, top of the App Store, and then it's, we got to get the retention up. And it feels like the latest products are finally passing that retention bar, and we're not seeing that churn. Real people are using them. Is that how you measure success these days? Yeah, I mean, there's the metric. So we're pretty metrics driven as a company. We do look at retention. We look at J curves is what we do. So the X axis is the days after you've purchased, and the Y axis is the percent still retained. So we're looking for that to be high and be flat. So if you look at the difference between the original Ray-Ban Stories and then the Meta Ray-Bans, was there a jump in the 12 month retention? Yeah, just picture two lines, and one was way higher than the other. You know, I think it's just the image quality was good enough to be able to share on Instagram and WhatsApp, elsewhere on your phone. The audio quality was good enough to answer calls, listen to music. And then the form factor. It was subtle improvements, but we grind away at millimeters and milligrams, and they were, you know, just a little bit more comfortable, a little bit lighter. And it was the small things that added up, and we built on that for the new generation we're launching. We're pretty excited about the full lineup. Yeah, I think these Oakley Meta Vanguards are going to be a hit. I've been using them for a few months now. They're wonderful. This feels like something that, I don't know, every guy in my friend group is going to want. Right. And I think they look great on women, too. Yeah, yeah.
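Editor's aside: the retention "J curve" Himel defines here (X axis: days since purchase; Y axis: percent of buyers still active; healthy means high and flat) is easy to sketch. Below is a minimal illustrative computation with made-up activity data, not any real Meta metrics:

```python
from datetime import date, timedelta

def retention_curve(purchase_dates, activity, horizon_days):
    """Percent of buyers still active on each day after their purchase.

    purchase_dates: {user: date bought}
    activity: {user: set of dates the user wore the device}
    Returns a list where index d is the % of buyers active d days post-purchase.
    """
    curve = []
    n = len(purchase_dates)
    for d in range(horizon_days):
        active = sum(
            1 for user, bought in purchase_dates.items()
            if bought + timedelta(days=d) in activity.get(user, set())
        )
        curve.append(100.0 * active / n)
    return curve

# Toy example: two buyers, one daily user and one who drops off after day 0.
purchases = {"a": date(2025, 9, 1), "b": date(2025, 9, 1)}
usage = {
    "a": {date(2025, 9, 1) + timedelta(days=d) for d in range(7)},  # retained
    "b": {date(2025, 9, 1)},                                        # churned
}
print(retention_curve(purchases, usage, 3))  # [100.0, 50.0, 50.0]
```

The "two lines, one way higher than the other" comparison is then just this curve computed for each product generation's cohort and plotted on the same axes.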
No, but specifically, I mean, for me, when I think about it: as a kid, I was like, I want to buy a pair of Oakleys that just have that iconic shape. Right. And I can just remember wearing them in all the activities, whether it was running or hiking or skiing or snowboarding. Right. This is going to be something that you're going to be seeing everywhere out in the real world. Who's faster, you or Mark? Well, we came in around the same time. I'm still dressed in my run clothes. And we had 30 people wearing the Vanguards and taking video, so we're gonna have some good footage. Mark's pretty fast. Don't bet against him. Yeah, he's got a serious, serious workout routine right now. I knew he was gonna set the pace. I wasn't sure what pace he was gonna set. Well, we know he's the fastest handwriter at Meta, but I don't know if he's also the fastest runner. Oh, man. I mean, not only is he doing 30 words per minute there, I don't know if you noticed he was saying something and writing at the same time. That's a lot going on, which is pretty impressive. Talk about some of the partnerships on the active side with wearables. I know you have the Garmin partnership, Strava. What else? How did those come together, and what else are you thinking on that front? Garmin makes the best smartwatch in the world. I've been wearing Garmins for over 15 years. I do marathons. I've done an Ironman. I've been a heavy Garmin user, and they sell a lot of watches a year, so they've got good penetration in the market. We were thinking, hey, if we're building a pair of glasses designed for sport, who better to partner with than Garmin? And it opens up auto capture, which is going to be a really big thing. That's really fun. So if you go for a long bike ride or if you go for a run. 
We had it set up for this run, so we were taking a short video every quarter mile automatically from the glasses, and then it stitches together into a reel at the end. And my hero scenario is you run a marathon. You set it up to take a short video at every mile marker. That's 26, 27 videos. You want one at the finish line too. And then it stitches them together, and you've got a fun, shareable reel at the end of it. To do that, we're using the location triggers from the Garmin watch to make it possible. I think we can do a lot more with Garmin. There's a lot of scenarios that are enabled, but we're very excited about the initial set. How important is it to actually distribute compute across your body? Because you could potentially stuff all of this in there, but then you get some big heavy headset. Do you want to be leaning on other parts of the body to have sensor data and whatnot? So our strategy is, you know, I believe in familiar form factors. I think that over the course of hundreds and thousands of years, people have gotten used to wearing different devices, and the ergonomics are dialed, and it's taken a lot to get there. So we're trying to lean into that. People wear watches. I can only think of like two people in the world who don't wear glasses, sunglasses, or opticals. Just about everyone does, you know, and we kind of go from there. So we're pretty excited about those form factors and what they enable. And runners and people doing sport have been wearing watches forever. Right. Well, thank you so much for coming on the show. Thanks for having me, guys. Cheers. We'll talk to you soon. Let me tell you about Julius AI, the AI data analyst that works for you. Join millions who use Julius to connect their data, ask questions, and get insights in seconds. Julius AI. Very smart to figure out where to innovate. We don't want to innovate on form factor. We don't even really want to innovate on design. Right. Yeah. 
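The marathon auto-capture flow Alex describes is essentially a list of distance triggers: one clip per mile marker plus one at the finish, later stitched into a reel. The real Garmin/Meta integration API is not public, so the sketch below only models the trigger planning, with invented function names.

```python
def plan_capture_points(total_distance_m, interval_m):
    """Cumulative distances (meters) at which a short clip should fire.

    Mirrors the marathon scenario: a trigger at every interval plus one
    at the finish line, later stitched into a single reel.
    """
    points = list(range(interval_m, total_distance_m, interval_m))
    points.append(total_distance_m)  # always capture the finish
    return points

# A marathon is about 42,195 m; trigger every mile (~1,609 m).
triggers = plan_capture_points(42_195, 1_609)
print(len(triggers))  # 26 mile markers + the finish = 27 clips
```

That count matches the "26, 27 videos" Alex mentions: 26 full mile markers fit inside the course, and the finish-line clip brings it to 27.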
And welcome to the stream. How you doing? Welcome to the show. Welcome to the stream. This is Rocco Basilico, the Chief Wearables Officer at EssilorLuxottica. Hi. There you go. Big day. Congratulations. Very big day. Would you mind taking us through, take us back in history. Tell us the story of that original cold email to Mark Zuckerberg. How did that happen? What inspired it? Walk me through that. Well, you know, the company obviously is the market leader in eyewear, and I always had a passion for technology. So one day, you know, I decided really to pitch a bunch of technology companies. And the way that I was doing it was actually Googling, you know, the emails of executives. One of them was actually Mark, and the email at the time was zuck@facebook. So I wrote this email, which at the end of the day actually became what is Ray-Ban Meta now. But the idea was simple, you know: having an amazing, recognized design, and that's the Wayfarer, and the most recognizable brand in the world, which is Ray-Ban. At the time, I think the collaboration I pitched was only on Instagram. And then Mark convinced me that, you know, we had to do something bigger across all the different platforms. And yeah, after that cold email, Mark replied to me after three days. Pretty fast. Pretty fast. You know, then we met, and then, you know, before we launched something it took probably a couple of years. Yeah, still pretty fast. Yeah, it was pretty fast. You know, but it's nice when you don't have to innovate on the design. Right. It's so key to just be able to focus on what the value is, getting the technology right. Yeah, I think that was honestly the key of the success, and it's still the key of success today. 
Mark today, even in the presentation, said glasses need to be beautiful glasses before anything. And then you almost find the technology as an added value. So yes, we started, you know, with the most recognizable frame, the Wayfarer from Ray-Ban. And the technology is the magic that we baked into the product. How do you think about scaling production? The way that these are priced, I think they're going to be selling fast. Yeah, I mean, you know, let's say today we really launched three architectures. You have the AI glasses, the new generation Ray-Ban Meta Gen 2. Then you have Vanguard, which is a sport architecture. And then you have the display glasses. And you know, I do think the AI glasses, camera plus audio, are already scaling and doing really well. So we are very proud of the success we already have in the market. We're gonna build on that. You know, Oakley is the second brand that we introduced to the family that really defines the category with Ray-Ban. And then, you know, we're gonna probably introduce more brands after that. And you saw, I guess you probably tried, Orion, the more advanced technology. Exactly. That, you know, was always Mark's dream, and we started at the time with Ray-Ban Stories, now Ray-Ban Meta, which is a much simpler product. But the vision is still there, the dream is still there. So that's where we're gonna get to: hopefully the glasses will be the next computing platform. And this is the kind of in-between. You know, do you think there's room for products in your portfolio that still have the Ray-Ban or a classic silhouette, a classic style, but just give the technologists more space to work with? It feels like until we can miniaturize everything, there's value in having more space to work with. No, you're absolutely right. That's the most critical thing, the miniaturization of the technology. 
Thank God in eyewear something interesting is happening: you know, chunkier glasses are actually a trend now. Yep. So I think we are right on the... Good timing. Good timing. So, yes. But you know, the goal is obviously to reduce to a smaller form factor, even with the display. And you saw even, you know, Vanguard: beautiful glasses, great form factor, but everything needs to be smaller. I think we did very well, and we proved it's doing really well with the Ray-Ban Meta product, and we will get there with the other generations and other platforms. One step at a time. Well, thank you so much for coming on the show. Thank you. This is great. We'll talk to you soon. Have a great rest of your day. We are ready. And we have Privy. Privy.io. Privy makes it easy to build on crypto rails: securely spin up white label wallets, sign transactions, integrate wallet infrastructure, all through one simple API. We'll be telling you more about Privy. And we have the James Cameron. Good to meet you. I'm John. Welcome to the show. Pleasure. We're going to have you throw this headset on. I think once you put them on, you'll be able to hear us. Ah, there we go. There we go. Perfect. Thanks so much for taking the time. No problem. Really excited. VR is overhyped one year, underhyped the next year. I remain extremely bullish about the idea that I will be watching cinema in virtual reality. Am I crazy? No, not at all. No. I think you're right on the money. I had kind of an epiphanal experience when I saw the Quest 3 with my own content on it. I mentioned it in my remarks. It's like, okay, I know what that's supposed to look like, and it's this. Yeah, right. Yeah. And you know, theaters are hit or miss in quality, but with the quality control on the device, you're always going to get that brightness level. 
That brightness level can be an order of magnitude greater than a movie theater. Think about it. I had no idea. So movie theaters are supposed to run at 16 foot-lamberts, which is a metric like nits. Right. I don't know how many nits it's the equivalent of. And that's based on the SMPTE engineering standard for the movie industry. But very few of them do, and they're mostly down around ten or nine or three. So at three, you know, you're literally at a tenth of what the Quest series displays do. And to me, that's phenomenal. Now, brightness is obviously not the only metric. You've got spatial resolution, field of view, all that. How close can you be to the screen? And I just think it hits a sweet spot. Yeah. People talk about how you need to watch it as the filmmaker intended, and this stuff didn't exist when you created the film, so it can't satisfy that perfectly. But at the same time, we're getting to a point where you can recreate the theater experience. Right? Yeah. And look, hopefully this becomes a pivot for people to take their entertainment media onto VR, AR, XR, whatever you want to call these devices. Not the glasses. The glasses are obviously a separate thing. And they're cool. They're very cool in their own right. And I saw the newest ones demonstrated today, unfortunately not at the demo on the stage, but, you know, I can vouch for it. The stuff's amazing. You've probably seen it already, right? Yeah, yeah, yeah. What do you think? I mean, yeah, we've been blown away by the demos broadly. My question is, how should the film industry respond to progress in VR? Because clearly you're paying attention, but probably at a level that the rest of the industry isn't. Yeah. I think VR is a broad term, and it's constantly getting redefined. And I think when you hear VR, the average person thinks, okay, gaming, okay, immersive, I can look all around sometimes. 
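Cameron's brightness numbers convert directly: one foot-lambert is about 3.426 cd/m² (a nit). The quick conversion below checks his arithmetic; the ~100-nit headset figure in the comment is an illustrative assumption, not a quoted Quest spec.

```python
FL_TO_NITS = 3.426  # 1 foot-lambert ≈ 3.426 cd/m² (nits)

def foot_lamberts_to_nits(fl):
    """Convert screen luminance from foot-lamberts to nits (cd/m²)."""
    return fl * FL_TO_NITS

print(foot_lamberts_to_nits(16))  # SMPTE reference screen ≈ 54.8 nits
print(foot_lamberts_to_nits(3))   # a dim theater ≈ 10.3 nits
# A headset panel around ~100 nits would be roughly 10x a 3 fL theater,
# matching the "tenth of what the Quest displays do" remark.
```

So the SMPTE 16 fL target is only about 55 nits, and a theater running at 3 fL sits near 10 nits, which is where the order-of-magnitude claim comes from.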
But even just thinking of it as, like, the next television platform. Right. So let's narrow that down to it being essentially a media player in stereo. Because the gorgeous thing, the elegance of this, is that a good VR headset is a stereo display. Right. And it may be the best stereo display. And what did we have previously? We had 3D TVs that didn't work. Right. Where you had to find a sweet spot, you couldn't watch with other people, all that crap. And then you've got cinemas that are hit or miss. Some are dark, some are fine. Right. People love the cinema experience. I hope and pray that never goes away. But I want people to see what I created. And so I think that if you think of VR as an innately stereoscopic display device, then that's a differentiator from the best big 80-, 90-inch flat panel screens. And most people consume their media on smaller devices anyway. And the thing is, this gives you the feeling of a large screen, and you can spatially adjust it. You can move it in close, or you can just keep moving it out and expanding it until it feels like you're in a bigger display space. Right. Yeah. And so some filmmaker today, they need to assume... I didn't answer your question. Yeah, but this is the cool thing: maybe when you were starting your career, you couldn't assume that the Quest 3 was ever gonna come, even though it was sci-fi. But filmmakers today should assume that in 30 years everybody's going to be able to watch in that type of experience. I think filmmakers... and honestly, this is going to sound a little weird coming from me: I'm a little less interested in movie makers, because they can't generate the content quickly enough. What I'm interested in is live feeds of sports, any form of live entertainment, your concerts and so on, and short-form episodic content. 
Because we can get that out there quickly. By next year or the year after, we can get that stuff out there en masse. Right. So I'm interested in showrunners that are doing hit shows. I said this in my remarks: I can't create enough content to move the needle individually. But I can act as a catalyst by providing the tool set to any production anywhere that wants to just say, okay, well, we're already here. We got a crew, we got some actors, we got some lights. We can do it in 3D. The only reason they don't is because there's no defined distribution model. But that's coming. That's what the Disney+ agreement with Meta and Horizon TV, Horizon TV itself... you know, everything's going to change in the next 18 months. Yeah. What are you most excited about in AI at this very moment? Which type of AI are we talking about, specifically in a filmmaking context? Okay, so for filmmaking, we're talking about generative AI. We're talking probably about text to video, video to video, that sort of thing. I'm guarded, because I think it's an answer to how we bring down costs and become more efficient. I was going to ask, do you think there'll be more $10 million films made relative to $100 million films? Does it change the shape of what's getting funded? I think it's going to affect the middle to the high end of the curve in the following way: most films involve VFX now. Right. And it's going to affect the toe of the curve. Yeah, makes sense. But the lower part of the slope is not going to change that much. And I say that because if you're not using VFX, you're not going to enjoy a great reduction in your overall cost. Caterers, actors, grips, dollies, the normal stuff that a small production uses. If they're not doing VFX, you know, how are you going to make catering cheaper? Yeah. With AI, you're not. 
But I say the toe of the curve because where filmmakers used to come in through, I don't know, music videos or low-budget horror films and things like that, that entry portal has shrunk so much in recent years. It's gotten so difficult for filmmakers to get a toehold. But now you can basically make a movie by yourself. Yeah. We have a friend who made a full sci-fi film for a very small budget. Yeah. Which would not have been possible before; it would have been a single-location, one-house horror film a couple of years ago. Exactly. So now we got to ask, what does that guy do next? Yeah, he takes that to a studio and he says, now give me a budget. Right. I don't think anybody that wants to be a filmmaker wants to replace actors and replace the process of filmmaking. But it gives you a new entry point into that business. Well, thank you so much. Last question. How do you work? What makes your approach to your work unique? I don't know. I just ask myself, what would the 14-year-old version of me want to see? And then I do it. I love that. That's amazing. That's a great mantra. Thank you so much for coming on the show. We really enjoyed this. Cheers. Have a great rest of your day. We have Vishal Shah coming on the show next. He is the VP of Metaverse at Meta. The Mayor of the Metaverse. Welcome to the show. How you doing? Good to meet you. How's it going? It's good. You had some amazing predictions four years ago. You said that in the future you'll just be able to prompt and generate an entire world. And it feels like today we're getting very close to being able to do that. Were you just following all the research really closely? Was this just a broad sci-fi thing that you just knew was going to happen? Probably a bit of both. I think if you look at the entire Metaverse vision, it's rooted in where we think the future of immersive entertainment is going. You just talked to the legend James Cameron about where he thinks that's going. 
But also we just know generally, as technology advances, lowering the floor helps more people create things. We did that with video on our phones. We think there's an opportunity to do that for immersive experiences in VR. So we both predicted where that was going, but also helped drive it. That's the work we've been doing on generative AI for years. There's the new engine we built for Horizon. All of this is in service of a prediction, but also a roadmap that we set out four years ago. When we rebranded the company, we said it was a ten-year bet. We're four years in on that journey, and I think we're making a bunch of good progress. I feel like there's sort of a fracturing of the technologies happening right now. We got a demo of a Gaussian splat, where you could take images and then walk around a virtual world. It didn't generate physics, didn't generate geometry. And then simultaneously, in a different demo, we were going from a text prompt to 3D objects in Horizon Engine, in a game engine. So are these two things going to come together at some point? How do these things actually merge, and on what timeline? Yeah, I mean, in general with things like this, we push the research forward on an independent path, and then we find the way to productize that research. So Hyperscape, which is the sort of Gaussian splat representation that's been researched for some time, we productized as an environment. And for the first time, we're bringing capture, so anyone can put on a headset and capture a space. Yeah. And the immediate reaction that I have is, if I'm looking at a house on Zillow, I want to be able to experience it this second. Totally. And I feel like that's right around the corner, and it's so magical, obviously, for things like that. But imagine a place that you know that you can't physically go to anymore. 
And then imagine bringing someone that you care about, who also knows that place, to that place. You're recreating memories. When I was getting the demo, I was thinking, if I have a good enough video of my one-year-old playing around, ten years from now I'm going to be able to just generate a world of that space and almost relive it. And so, back to your question: you have to push the technology boundaries, but then the vision is very much to bring these things closer together. So it's not just an environment; it becomes interactive. It has geometry, you have collisions as you're moving around the space, you bring other people into the space. So we're laying out all the pieces. Some have made more progress, frankly, than we thought even four years ago. But the idea is that they all fit into one general vision for how we bring people together when they can't physically be together. And that's the general thing we set out to do. Yeah. How do you think about the different windows into the metaverse? We saw a demo today where there was a traditional game-engine world, and we were able to interact with it on a phone. I'm not sure if it was streaming from the cloud, but you can clearly see that mobile is a path into a space that you could also explore on a Quest. But is there a world where you could bring that through to the other family of apps? Short answer is yes. Part of the reason we brought Horizon to mobile, by the way: what you played today, that is not just streaming, that's a live game that's being edited while you're playing it live. But the idea is that most people today don't have access to a headset. Sure. So how do we give them a taste of what some of these experiences are like? Not as immersive, not as great as being in headset, but you can start to play with these experiences and see what they feel like. That's in the Horizon app today. We've started to see... 
Well, okay, how else do people discover these experiences across the devices that they have and the experiences that they're in? You're in Instagram, you're on Facebook, someone messages you something on WhatsApp, and can you just jump in really quickly? Again, not as good of an experience as jumping straight into an immersive headset. But this doesn't require you to make that leap on day zero; you get to build a taste of what that looks like. Yeah. Sorry, James said he was extremely bullish on live entertainment in virtual reality. Walk us through what that looks like over the next few years. And it's, you know, so weird to follow James Cameron. He's your friend. My friend Jim. Part of what we are working on together is not just building content. It's updating the entire workflow and tool chain for how content gets made so that it can be stereo by default. So how do you shoot in the field? How do you edit in a truck that's parked outside a stadium? How do you then broadcast that up to the cloud? How do you get that distributed? It's the entire tool chain. You need that to exist if you want to do something immersive. You need more content so the products have better retention, right? So every time you throw on a headset, you have something that's right there that you haven't necessarily seen before. That's the key point. Something you can't do anywhere else. I can watch a game on TV and it's fantastic. But I can't feel like I'm sitting courtside. I can't fly around a Formula 1 track as if I were a drone floating over the track. I can't actually even experience a race like that. You bring up Formula 1: eventually you're going to be able to drive in the real race, right? And there are some sports where you can't actually see the whole arena or track. F1's a great example. 
And so you can't actually experience that in the physical world the same way you could in headset, where you can move around the space, et cetera. So the point is, we have to update the tool chain. We have to bring experiences that you can't get anywhere else, but in use cases that people are familiar with. Gaming is amazing. This device is the best gaming device on the market. But not everyone's a gamer. So how do we expand the things that people can do, and see the differentiation that a fully immersive, 3D-native device can accomplish? And that's a lot of the work we're doing together. How are you talking to brands about the Metaverse these days? I remember the Metaverse was very hyped. Now it's kind of under-hyped. I think there's actually really solid progress being made. We heard a story about IKEA selling a ton of product in the Metaverse and Roblox and stuff. And Meta is known for letting any brand, as small as possible, go and participate. How far away are we? What are your conversations with brands like? Yeah, I mean, look, four years ago, if you didn't have some Metaverse initiative or another in your company, you were failing. Two years later, if you had some Metaverse initiative or another in your company, you were failing. And so I think the hype is dead. Yeah. To your point... Good time to be a failure. We've been making a bunch of progress kind of in the background. Today a brand can come on, they can create a space in Horizon, it's fully open UGC, but they have to have a reason. Is it the case that your physical footprint should just live in the virtual world one to one? Maybe. Can you do things that you can't do in the physical world? Yeah, that's interesting. But in these things, generally we follow the same pattern everywhere: build something great for consumers, make sure it's got some scale, retention, it's growing. 
I think brands will find an opportunity to reach people where they are. But we need people first for them to actually care. And then, are there opportunities for them to build a business, and entirely new sets of businesses that can't even exist in the physical world? That's absolutely the vision. But, you know, it's a ten-year vision, and I think we're making progress. Yeah. Well, what about the Flappy Bird of the VR Metaverse? How long until... I mean, the demo we saw today felt like somebody non-technical could get there. When is this going to get rolled out, and when are we going to see an explosion like the early App Store, with things that you guys could never come up with, no matter how many people you hire, no matter how much you spend? You need the creativity of a billion people. No, this is exactly where we see the generative AI stuff going. The prompt isn't "make something great." You have to have a good idea. You have to have something that is unique and novel, but the speed of iteration can be dramatically faster. Yeah. And you as an individual can do a thing that maybe today would need an entire skill set that you don't have. That's the idea. The other really interesting thing: if you look at Reels and you look at other video content, ideas are all just remixes of one another. Yep. And so it isn't just from scratch. Yeah. It's like, what are four different ingredients put together? And so we have some ideas on what we're going to be doing there that we'll talk about more next year. But the idea is that it isn't all from scratch. It is somehow taking the best ideas, putting them together, putting your own spin on it, and then giving you the tools to do that really easily versus having to build it all from scratch. Well, thank you so much for coming on the show. This was a fantastic conversation. We'd love to have you back. Thanks for having me. Cheers. We will talk to you soon. 
John, think about the opportunity, the Metaverse opportunity. It was overhyped for a while. As the tech improves, imagine a fashion brand being able to set up a retail store. You put on a headset, Meta understands your avatar, you just walk around the store trying stuff on, looking in a mirror like you're in a retail store, seeing yourself wearing the items. Virtual try-ons have been overhyped forever. But having done the demos... you remember we were doing the demos earlier with the Quest. You could look around, and it's not that difficult to swap things out. I could be swapping out your clothing, reacting to it. It is. The competition is going to heat up. I feel like all of the tech firms are going to be taking wearables... I mean, they already are taking it incredibly seriously. But there will be a redoubling of the efforts as these products roll out and actually get in people's hands, and then developers start building on them. You can see with the Ray-Ban Displays, it's going to be hard to ship an app on here, but once you start doing that, then you're really in that platform era, and you have the beginnings of, going back to the idea of building a moat. It's like, how much will it matter to be the first product in market with an incredible, you know, display built in? Right. Can they get to that App Store moment first? Yeah, it'll be a knock-down, drag-out fight, as always. It'll be a lot of fun. Anyway. This has been insane. It's a party out there. There are tons of people here. Thank you so much for tuning in to TVPN at Meta Connect 2025. Hope you enjoyed it. It's been an honor. We will be back in the temple of technology, the fortress of finance, the capital of capital tomorrow: Hollywood, California. We will see you at 11:00 a.m. Pacific. And before we go, we should say thank you to the whole Meta team. 
They have absolutely crushed this. They're putting a little bit of heat on our production team back at home. Obviously, Michael Scott, Ben, and the whole team have been here helping out. But the Meta team has been absolutely incredible and has totally set the bar. TVPN never would have imagined this. When we first set up the microphones and cameras, the whole shtick was that it's just two people, no guests, paper, an hour. Yeah. And then it's this. And you know, these things work in mysterious ways. But tomorrow morning we'll be back to just two people talking shop, hanging out. Anyway, thank you so much for tuning in. Thanks for tuning in. We'll see you tomorrow. We'll see you soon. Have a great rest of your day. Goodbye. Cheers.
Date: September 18, 2025
Hosts: John Coogan & Jordi Hays
Summary by TBPN Podcast Summarizer
TBPN's special episode live from Meta Connect 2025 at Meta HQ in Menlo Park dives deep into Meta's hardware and AI announcements, the broader context of the wearable and AI arms race, and the future of immersive tech. The hosts interview Meta execs (including Mark Zuckerberg), creators, and special guests (such as James Cameron and Alex Wang), providing live reactions to Meta's major product launches: new generations of Ray-Ban and Oakley glasses, the debut of the Meta Ray-Ban Display (with heads-up display and the Meta Neural Band), and much more. The episode weaves in commentary about global tech news (China chip bans, Fed rate cuts, xAI's Colossus 2), developer implications, and the shakeup coming to platforms and content as AI and wearables converge.
00:00-14:00
1:15:00-2:20:00 (Keynote reactions & Meta exec interviews)
~30:00-1:00:00 (Interleaved through episode)
2:20:00–4:00:00
Chris Cox (Chief Product Officer):
Discusses portfolio approach balancing pragmatic iterations (like translation and captioning) with greenfield innovation.
“We ask every team to have a portfolio...something to deliver in the next year & something riskier, further out.” (2:41:00)
AI is already remaking core internal processes (bug detection, ranking), and fundamentally alters product QA (“AI can be used to detect edge cases a lot more easily…”).
Adam Mosseri (Head of Instagram):
AI transforming both content recommendations and content generation/augmentation; hybrid human-AI creativity highlighted.
Connor Hayes (Head of Threads):
Threads’ explosive growth (~400M MAU), “be the app that ships” ethos, focus on freshness/relevance/timeliness, tension between copying Instagram features and defining its own niche.
Alexander Wang (Chief AI Officer, ex-Scale AI):
Meta Superintelligence Lab ("MSL") has “100 people, cracked AI scientists...smaller and more talented than any of the other labs.”
“Meta has everything necessary to achieve superintelligence…There are no obstacles. We have the business model to support building hundreds of billions of dollars of compute.” (03:55:00)
On coding: “The role of an engineer is just very different now than it was before...all the code I’ve written in my life will be replaced by what will be able to have been produced by an AI model within five years.” (03:57:00)
Andrew Bosworth (Boz, CTO):
The hardware journey has finally reached a truly “good enough” V1; the neural interface is at “V2”; agentic AI is inevitable; “dynamic UI” remains unsolved. Predicts AR, VR, and AI will fully merge in form factors and platforms.
“In the future you’re not going to have an app store...I just need to talk to the AI and let it handle the backend.”
“What is the interaction design for AI apps? We all know that is where things are headed.” (04:12:00)
Meta wants to open the ecosystem, but power and thermal constraints currently force tight integrations (e.g., Spotify, WhatsApp).
Eva Chen (VP, Fashion Partnerships):
"To make a stylish glass...something people are already wearing—that's the first thing. Then people try these on…blown away."
Predicts fashion/AR try-on as the next killer use case, inspired by “Clueless.”
4:16:00–4:35:00
4:45:00–5:00:00
Vishal Shah (VP, Metaverse):
Predicts generative AI and new engines will let people remix and create persistent, shareable worlds as easily as making TikToks.
“Lowering the floor for creation always changes everything…today anyone can capture a space in VR. Tomorrow, it’ll merge geometry, physics, real interactivity.”
Brand/Commerce angle:
“In a world of perfect information, what matters? Trust. Reputation. That’s the evolution of brands.” (Zuckerberg, 04:16:00)
5:00:00–END
“Glasses are the ideal form factor for personal superintelligence because they let you stay present…while getting access to all these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses...”
(01:21:10)
“You’re not thinking, ‘oh, this guy’s wearing a computer on his face’...You’re saying, ‘this guy’s wearing a pair of glasses’. That’s different.”
(03:41:00)
“The role of an engineer is just very different now than it was before...all the code I’ve written in my life will be replaced by what will be able to have been produced by an AI model within five years.”
(03:57:00)
“I want people to see what I created...If you think of VR as an innately stereoscopic display device, then that’s a differentiator from the best big 90-inch flat panels...This gives you the feeling of a large screen, and you can spatially adjust it. It hits a sweet spot.”
(04:23:45)
“We ask every team to have a portfolio...something to deliver in the next year and something riskier, further out.”
(2:41:00)
“It’s just going to be on all the time...having context on your conversations…thinks you should know…shows it in the corner of your vision.”
(04:16:00)
“The idea is that [AI/UGC] all fits into one general vision—how we bring people together when they can’t physically be together...If you’re not a gamer, how do we make sure there are things you can do here you can’t do anywhere else?”
(04:53:00)
This episode captures a turning point: the moment AI, wearables, and immersive content finally break through to mainstream reality. Meta’s major announcements and executive interviews reveal not only how quickly hardware and AI are merging, but how this shift is set to rewire both consumer computing and the tech platform wars. The ecosystem is opening, the developer arms race is on, and science fiction is landing now.
(TBPN Episode: Meta Connect 2025 Live – Summary curated for maximum insight and depth. Shareable for those who missed the show, or for anyone who wants the whole visionary arc in one fast read.)