Podcast Summary: Digital Disruption with Geoff Nielson
Episode: CEO of Liquid AI Ramin Hasani Says a Worm is Changing the Future of AI
Release Date: March 17, 2025
Introduction
In this episode of Digital Disruption, host Geoff Nielson of Info-Tech Research Group welcomes Ramin Hasani, co-founder and CEO of Liquid AI. The conversation delves into the technologies developed by Liquid AI, focusing on how insights from biology are revolutionizing artificial intelligence.
Liquid AI's Origins and Inspiration
Ramin Hasani shares the genesis of Liquid AI, drawing inspiration from biological systems, specifically the nervous system of the C. elegans worm. He explains:
"We started looking into how we can bring in insights from biology and physics into machine learning because we wanted to see what are the mathematical operators that we can find in biology that doesn't exist or we are not using them right now in the space of neural networks." ([00:54])
The C. elegans worm was chosen due to its transparency and genetic similarity to humans, making it a valuable model for understanding nervous systems. This biological foundation set the stage for developing Liquid Neural Networks.
Technology: Liquid Neural Networks
Liquid Neural Networks (LNNs) represent a novel approach to AI, inspired by the dynamics of the worm's nervous system. Ramin elaborates:
"Liquid neural networks are an instantiation or like a type of recurrent neural network that are coming from more mathematics that we use for describing physical processes." ([06:33])
These networks utilize differential equations, a staple in modeling physical systems, to govern neuron interactions. This approach contrasts sharply with traditional neural networks, offering enhanced expressivity and efficiency.
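The continuous-time dynamics behind this idea can be illustrated with a toy liquid time-constant (LTC) neuron. The equation, parameter names, and values below are an illustrative sketch of the general approach, not Liquid AI's actual implementation:

```python
import numpy as np

def ltc_step(x, inputs, W_in, tau, A, dt=0.01):
    """One Euler step of a toy liquid time-constant neuron layer.

    Integrates  dx/dt = -x / tau + f(inputs) * (A - x),
    where f is a nonlinear gate driven by the input, so each
    neuron's effective time constant varies with its input
    (the "liquid" part).
    """
    f = np.tanh(W_in @ inputs)       # input-dependent gate
    dxdt = -x / tau + f * (A - x)    # ODE right-hand side
    return x + dt * dxdt             # explicit Euler update

# Toy usage: 4 neurons driven by a 3-dimensional input signal.
rng = np.random.default_rng(0)
x = np.zeros(4)                      # neuron states
W_in = rng.normal(size=(4, 3))      # random input weights
tau = np.ones(4)                     # base time constants
A = np.ones(4)                       # per-neuron bias term

for t in range(100):
    u = np.sin(0.1 * t) * np.ones(3)  # example input sequence
    x = ltc_step(x, u, W_in, tau, A)
```

In practice such models are trained end to end and solved with more careful numerical schemes than a fixed-step Euler loop; this sketch only shows how an ODE, rather than a static weighted sum, governs each neuron's state.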
Differences from Traditional Neural Networks
A significant distinction between LNNs and traditional models like Transformers lies in their computational scalability and efficiency:
"Our computation scales linearly. Meaning the longer information they process, the better the gap becomes between like 10x could become like a thousandx at use." ([34:58])
Unlike Transformers, which scale quadratically and become computationally intensive with larger contexts, LNNs maintain linear scalability, making them vastly more efficient, especially for extended sequences of data.
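The scaling difference can be made concrete with a back-of-the-envelope operation count. The functions below are purely illustrative of O(n) versus O(n²) growth, not measurements of any real model:

```python
def recurrent_cost(n):
    """A recurrent model touches each of n tokens once: O(n)."""
    return n

def attention_cost(n):
    """Full self-attention compares every token pair: O(n^2)."""
    return n * n

# As context length grows 10x, the efficiency gap between the
# two approaches also grows 10x (the ratio equals n itself):
for n in (1_000, 10_000, 100_000):
    print(n, attention_cost(n) // recurrent_cost(n))
```

This is the sense in which, as Ramin puts it, a 10x advantage at one context length "could become like a thousandx" at a longer one.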
Scalability and Efficiency
Ramin discusses the breakthrough achieved in 2022, where Liquid AI overcame the computational challenges of scaling LNNs:
"In 2022, we published a paper in Nature Machine Intelligence that allowed us to bypass the computational complexity of differential equations, enabling us to scale these models to billions or even trillions of parameters." ([13:48])
This advancement has positioned Liquid AI to compete robustly in the AI landscape, offering models that are not only powerful but also energy-efficient.
Applications and Use Cases
Liquid Neural Networks have a broad range of applications across various sectors:
- Edge Computing: Deploying AI models on devices like smartphones, robots, and IoT devices without relying on cloud-based APIs.
"We are building the best quality foundation model today below on the edge... You can host liquid foundation models on a Raspberry Pi that works with a robot." ([19:49])
- Biotech: Utilizing LNNs for drug discovery by designing DNA sequences that translate into meaningful proteins.
"We can design new structures that match biological proteins... which could be a drug candidate." ([26:42])
- Financial Services: Enhancing fraud detection and portfolio optimization through sequential data processing.
"In the financial sector, we can build predictors that provide financial advice and detect anomalies for fraud detection." ([20:18])
- Robotics and Autonomy: Improving control systems and synthetic scenario generation to enhance robotic interactions and surgeries.
Ramin emphasizes the versatility of LNNs in handling different data modalities, making them suitable for complex, real-world applications.
Explainability and Control Theory
One of the standout features of Liquid Neural Networks is their inherent explainability, rooted in control theory:
"The approach, the mathematics of liquid foundation models are informed by control theory mathematics. That allows us to understand how foundation models come up with decisions." ([38:33])
Unlike Transformers, which operate as black boxes, LNNs provide transparency into decision-making processes. This "white box" nature is crucial for deploying AI in safety-critical applications, where understanding and verifying AI behavior is paramount.
Market Readiness and Deployment
Liquid AI is transitioning from experimental stages to production, collaborating with early adopters across multiple industries. Ramin outlines their strategic focus:
"We are 50 people right now. We are working with enterprises in consumer electronics, e-commerce, financial services, and biotech to implement our models." ([35:44])
Additionally, Liquid AI offers platforms like Liquid AI Playground and partnerships with Perplexity AI and Lambda Labs to provide early access to their models, fostering broader experimentation and feedback.
Future Vision and Long-term Goals
Looking ahead, Ramin envisions Liquid Neural Networks as the foundation for future AI systems:
"I can see liquid foundation models to be deployed on any devices we own... in the near future, we could port them onto satellites and other edge devices." ([43:20])
He emphasizes the importance of sustainable AI development, aiming to balance intelligence with energy efficiency. Furthermore, Liquid AI is exploring the potential of co-designing specialized hardware to further enhance the capabilities and deployment of LNNs.
Personal Insights from Ramin Hasani
Ramin shares his personal journey from a scientist focused on understanding intelligence to leading a pioneering AI company:
"Exposure to venture capital and the real-world application of our technology shifted my focus from purely scientific endeavors to bringing tangible value through scaling." ([51:58])
He highlights the collaborative environment at Liquid AI, teaming up with esteemed scientists like Professor Daniela Rus and leveraging the collective expertise of his team to drive innovation.
Conclusion
The episode sheds light on Liquid AI's groundbreaking approach to artificial intelligence, blending biological insights with advanced mathematical frameworks to create efficient, scalable, and explainable AI models. Ramin Hasani's vision positions Liquid AI as a formidable player in the AI revolution, promising transformative impacts across various industries while maintaining a steadfast commitment to sustainability and transparency.
Notable Quotes:
- "Liquid neural networks are an instantiation or like a type of recurrent neural network that are coming from more mathematics that we use for describing physical processes." — Ramin Hasani ([06:33])
- "Our computation scales linearly... That's one of the nice things about them." — Ramin Hasani ([34:58])
- "We can design new structures that match biological proteins... which could be a drug candidate." — Ramin Hasani ([26:42])
- "Instead of just designing black boxes... we have a white box intelligence." — Ramin Hasani ([38:33])
- "I can see liquid foundation models to be deployed on any devices we own." — Ramin Hasani ([43:20])
