Build Your Own Liquid Neural Network with PyTorch
Why LNNs are so Fascinating — 2024 Overview
For the past 35 years, we have built probabilistic models that output predictions based on data and learned parameters (θ). Each neuron acts as a logistic regression gate. Tie that to backpropagation, the algorithm that adjusts the parameter weights based on the model's loss, and you get neural networks.
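To make that concrete, here is a minimal PyTorch sketch of a single "neuron" as a logistic regression gate trained by backpropagation. The toy data, layer sizes, and hyperparameters are illustrative assumptions, not code from this article.

```python
import torch
import torch.nn as nn

# Illustrative toy data: 32 samples with 4 features and binary targets.
x = torch.randn(32, 4)
y = torch.randint(0, 2, (32, 1)).float()

# A single neuron: a linear combination of inputs through a sigmoid,
# i.e. logistic regression.
neuron = nn.Sequential(nn.Linear(4, 1), nn.Sigmoid())
loss_fn = nn.BCELoss()
optimizer = torch.optim.SGD(neuron.parameters(), lr=0.1)

for step in range(100):
    optimizer.zero_grad()
    prediction = neuron(x)          # forward pass: sigmoid(Wx + b)
    loss = loss_fn(prediction, y)   # compare prediction to target
    loss.backward()                 # backpropagation: gradients of loss w.r.t. θ
    optimizer.step()                # update the parameter weights
```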

Neural networks, however, have some limitations in the modern world:
- They perform well on the individual tasks they are trained for, but cannot generalize knowledge across tasks; their learned parameters are fixed ("solid") after training.
- They process data non-sequentially, making them inefficient at handling real-time data.
Solution: “a type of neural network that learns on the job, not only during the training phase.”
That’s what we refer to as LNNs — Liquid Neural Networks.
Liquid Neural Networks (LNNs) are a type of neural network that processes data sequentially and adapts to changing data in real-time, much like the human brain.

A Liquid Neural Network is a time-continuous Recurrent Neural Network (RNN) that…
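In code, one cell of such a time-continuous RNN might look like the minimal sketch below: the hidden state follows an ODE whose effective time constant depends on the input, integrated here with a single fused Euler step per time step. The class name, parameterisation, and step size are my own assumptions for illustration, not the article's implementation.

```python
import torch
import torch.nn as nn

class LiquidCell(nn.Module):
    """A minimal liquid time-constant style cell (a sketch, not the full model)."""

    def __init__(self, input_size, hidden_size, dt=0.1):
        super().__init__()
        self.w_in = nn.Linear(input_size, hidden_size)
        self.w_rec = nn.Linear(hidden_size, hidden_size, bias=False)
        self.tau = nn.Parameter(torch.ones(hidden_size))   # learnable base time constants
        self.A = nn.Parameter(torch.zeros(hidden_size))    # learnable equilibrium term
        self.dt = dt

    def forward(self, x_t, h):
        # Input-dependent nonlinearity that modulates the time constant.
        f = torch.tanh(self.w_in(x_t) + self.w_rec(h))
        # Fused (semi-implicit) Euler update of the continuous-time hidden state.
        h = (h + self.dt * f * self.A) / (1.0 + self.dt * (1.0 / self.tau + f))
        return h

# Usage: unroll the cell over a sequence, one time step at a time.
cell = LiquidCell(input_size=3, hidden_size=16)
h = torch.zeros(1, 16)
sequence = torch.randn(20, 1, 3)   # 20 time steps, batch of 1, 3 features
for x_t in sequence:
    h = cell(x_t, h)
```

Because the state is updated step by step from a continuous-time equation, the cell naturally handles sequential, irregularly timed data, which is the property the definition above is pointing at.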