
What does it take to design a programming language from scratch when the target isn’t just CPUs, but GPUs, accelerators, and the entire AI stack? In this episode, I sit down with legendary language architect Chris Lattner to talk about Mojo — his ambitious attempt to rethink systems programming for the machine learning era. We trace the arc from LLVM and Clang to Swift and now Mojo, unpacking the lessons Chris has carried forward into this new language. Mojo aims to combine Python’s ergonomics with C-level performance, but the real story is deeper: memory ownership, heterogeneous compute, compile-time metaprogramming, and giving developers precise control over how AI workloads hit silicon. Chris shares the motivation behind Modular, why today’s AI infrastructure demands new abstractions, and how Mojo fits into a rapidly evolving ecosystem of ML frameworks and hardware backends. We also dig into developer experience, safety vs performance tradeoffs, and what it means to build a la...
An episode of Hanselminutes with Scott Hanselman.