Today on the podcast I want to talk about a company called Peripheral Labs. They're essentially using self-driving car technology to bring you right into a sports game. There's a whole bunch of interesting things going on here, and I think there are a bunch of implications for the AI industry overall. So today on the show I want to dive into some of the cool technology that they've recently launched and unveiled.

Before we get into that, I wanted to mention the sponsor of today's episode, delve.com. If compliance is something that is slowing down your deals, then I'd love for you to check out the sponsor. If SOC 2, HIPAA, or GDPR are important to you, Delve is amazing. They use AI agents to automate compliance, essentially end to end: they collect evidence, they fill out security questionnaires, they customize controls to your actual business, so you can get compliant in days, not months. You also get one-on-one Slack support from real security experts who respond fast. Over a thousand fast-growing companies trust Delve to close deals faster and stay compliant as they scale. So if this sounds interesting to you, you can book a demo at delve.com. I will leave a link in the description.

All right, let's get into the episode. Right now there are a lot of reports out there that basically say live sports viewership is declining for certain leagues, particularly among Gen Z audiences. In response, leagues and broadcasters are experimenting with a whole bunch of new formats where the goal is essentially to make the games more interactive. A lot of them are doing things to make viewing more data-rich and engaging. They're enhancing the viewer experience, trying to add advanced statistics and deep analysis, a bunch of things they think will make this more interesting for people.
One emerging approach is called volumetric video. Essentially, this allows viewers to watch plays from multiple angles, creating kind of a video-game-like experience. The technology uses an array of cameras that capture footage in 3D and then lets fans move freely through a play and view it from a whole bunch of different perspectives. Honestly, this is really cool, right? Typically when you watch a game, you watch it from the angle of the TV camera, or if you're at the actual game, you see it from wherever you're sitting, maybe behind the goal or on the side. You just see it from that one angle. So there's a company out of Toronto called Peripheral Labs, and they are working to make volumetric video affordable enough for widespread adoption, so basically everyone can use it: teams, leagues, broadcasters, and everyone else. Peripheral Labs was founded back in 2024, so it's a very new company, by Kelvin Q and Mustafa Khan. Both of them previously worked on the University of Toronto's autonomous driving team, where they won a bunch of different competitions. Khan later became a researcher at Huawei, and Q worked as a software engineer at Tesla focusing on chassis systems. Here's what Q said about the company: "Mustafa and I are both lifelong sports fans. He's a huge Arsenal supporter, and I've followed the Vancouver Canucks since I was 7." It's funny, I'm personally from Vancouver Island in Canada, so the closest big team to us was also the Vancouver Canucks. So I relate, as the Canucks were my team growing up. In fact, I just saw one of their games about a month ago with some friends in North Carolina, against the Carolina Hurricanes. The Vancouver Canucks did not win, but it was a good game in any case.
Q said, "When he showed me his research on 3D reconstruction, I immediately thought about how incredible it would be to watch hockey this way, with fluid multi-angle controls. That's what led us to start Peripheral Labs." Now, volumetric capture itself is not a new technology, but these founders believe that recent advances in AI and computer vision have essentially made it practical at scale. By applying techniques developed for self-driving cars, including robotics, perception, and 3D vision, their goal is to reduce the number of cameras you actually need for a full volumetric reconstruction. Right now you need more than a hundred, and they think they can get that down to 32, which makes it more affordable and more realistic. Obviously, hundreds of cameras in the arena is just a lot to take on. If they can bring that down to 32, and maybe in the future even fewer, you can use this kind of reconstruction to still get the full experience for much cheaper. According to Q and Khan, this is a lot less expensive on the hardware side, but they're also solving the operational complexity, because just operating a hundred cameras and making everything tie together and work is pretty difficult. So Peripheral Labs plans to keep installation costs as low as possible for teams and broadcasters, and they're trying to get multi-year platform contracts. Their software also captures biomechanical data, so that's player movement and joint flexion, using proprietary sensor stacks similar to those found in autonomous vehicles. All of that data enables new ways for broadcasters and fans to interact with live games. Viewers could isolate and follow a single player, right? So if you have a favorite player, you could follow them whenever they're on the field or on the ice. You could freeze moments in play to look at controversial calls and see exactly what happened.
Or you could look at key plays from a lot of different angles using photorealistic 3D reconstruction. On the coaching side, it's also very interesting: they can essentially let you track detailed joint movements, like your knees, your ankles, even your fingers, which I think could give a lot of really interesting insights into body positioning, flexibility, and performance optimization for players. Here's something Q said about this: "We use off-the-shelf cameras. Our advantage comes from how we combine them with robotics and machine learning, which allows us to scale from small practice facilities to large soccer and football stadiums." They recently raised a $3.6 million seed round led by Khosla Ventures, with participation from Daybreak Capital, Entrepreneur First, and Transpose Platform. Joe Rose, a partner at Entrepreneur First, said the firm was really impressed by the founders and how much they had done within the University of Toronto's autonomous driving community. While sports startups can be really challenging to invest in, he said that Peripheral Labs is both a sports and an entertainment platform, which made it more interesting to them: "To the end consumer, the viewer, demand for sports content is evergreen. Peripheral is helping redefine a new standard for how that content is consumed through immersive volumetric video. The work we're doing now creates a data and deployment advantage that will be hard to replicate." One thing I think was pretty smart: Peripheral Labs was pretty selective in how they chose their investors, basically looking for people who could contribute to product development and their go-to-market strategy. They currently have only 10 engineers.
They're planning to expand their team, and they're going to focus on the platform and on hardware optimization to try to reduce costs, lower the latency, and basically improve the reconstruction resolution. Right now they're trying to bring it from around 100 cameras down to 32. It would be pretty incredible if they could eventually bring that down to, say, 10 cameras, where they just have a handful around the stadium and can still reconstruct the whole game in 3D. There are a lot of impressive things you can do now with AI. While the startup hasn't publicly disclosed who their partners are, like which teams they're working with, they have said that they're in active discussions with multiple teams and leagues across North America, and they are competing with companies like Arcturus Studios in this volumetric sports capture space. So there are other players doing the same thing, and it's going to be interesting to see how it all plays out. Thank you so much for listening to the podcast today. I hope you learned something new; I tried to find a topic that was AI-adjacent but has some really cool technology and applications, and I hope it was exciting or interesting to have this type of company covered. Make sure to leave a rating and review, and let me know your thoughts on this. I hope you have a fantastic rest of your day. And as always, check out the sponsor of today's video, delve.com. There is a link in the description.
Date: January 5, 2026
Host: The AI Podcast
In this episode, The AI Podcast explores the groundbreaking work of Peripheral Labs, a Toronto-based startup blending self-driving car technology and AI to revolutionize sports broadcasting. The conversation highlights the company’s vision for immersive, interactive sports viewing experiences via affordable volumetric video, its $3.6 million funding round, and the broader impacts on fan engagement and the AI industry as a whole.