
A PyTorch-based module for Liquid Neural Networks


# 🧠 liquidnn

A PyTorch implementation of Liquid Neural Networks (LNNs) and their hybrids (LTC, CNN, RNN, LSTM variants). This library brings continuous-time dynamics into deep learning architectures using learnable time constants.

## ✨ Features

🔹 LiquidNeuralNetwork – stacked liquid neurons

🔹 LTCLayer – Liquid Time-Constant layer

🔹 LiquidCNN – CNN + Liquid update

🔹 LiquidRNN – RNN + Liquid update

🔹 LiquidLSTM – LSTM + Liquid update (with bidirectional support)

🔹 HDNN – Height-Dimensioned Neural Networks

🔹 Per-neuron learnable τ (time constants) with stability clamping
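The last feature can be made concrete with a minimal sketch (plain Python with hypothetical names, not the liquidnn internals): a single liquid neuron advanced by one explicit Euler step, with its time constant clamped from below so the update stays numerically stable:

```python
import math

def liquid_step(h, x, tau, dt=0.1, tau_min=0.05):
    """One Euler step of a single liquid neuron (illustrative sketch).

    h: current hidden state, x: input drive,
    tau: learnable time constant, clamped below by tau_min for stability.
    """
    tau = max(tau, tau_min)          # stability clamping
    f = math.tanh(x + h)             # nonlinearity over input + recurrent term
    return h + dt * (-h / tau + f)   # Euler step of dh/dt = -h/tau + f(x, h)

# Driving the neuron with a constant input settles it to a fixed point.
h = 0.0
for _ in range(50):
    h = liquid_step(h, x=1.0, tau=0.5)
print(round(h, 3))
```

In a trained model each neuron has its own `tau` parameter; the clamp prevents a learned `tau` near zero from blowing up the `-h / tau` term.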

## 📦 Installation

From PyPI:

```bash
pip install liquidnn
```
## 🚀 Quick Start (all models in one script)

Copy-paste this script to test every model in liquidnn:

```python
import torch
from liquidnn import (
    LiquidNeuralNetwork,
    LTCLayer,
    LiquidCNN,
    LiquidRNN,
    LiquidLSTM,
    HDNN,
)

# 1. LiquidNeuralNetwork
x = torch.randn(4, 10, 8)  # batch=4, seq_len=10, input_size=8
model = LiquidNeuralNetwork(input_size=8, hidden_size=16, num_layers=2)
out = model(x)
print("LiquidNeuralNetwork output:", out.shape)  # [4, 16]

# 2. LTCLayer
x = torch.randn(5, 20, 10)  # batch=5, seq_len=20, input_size=10
ltc = LTCLayer(input_size=10, hidden_size=32, num_layers=2)
out = ltc(x)
print("LTCLayer output:", out.shape)  # [5, 32]

# 3. LiquidCNN
x = torch.randn(8, 1, 28, 28)  # MNIST-like input
model = LiquidCNN(input_channels=1, hidden_size=64, num_layers_liq=2, num_layers_conv=2)
out = model(x)
print("LiquidCNN output:", out.shape)  # [8, 64]

# 4. LiquidRNN
x = torch.randn(15, 4, 12)  # seq_len=15, batch=4, input_size=12
model = LiquidRNN(input_size=12, hidden_size=32, num_layers_liq=2, num_layers_rnn=1)
out = model(x)
print("LiquidRNN output:", out.shape)  # [4, 32]

# 5. LiquidLSTM
x = torch.randn(3, 12, 10)  # batch=3, seq_len=12, input_size=10
model = LiquidLSTM(input_size=10, hidden_size=32,
                   num_layers_liq=2, num_layers_lstm=1,
                   batch_first=True, bidirectional=False)
out = model(x)
print("LiquidLSTM output:", out.shape)  # [3, 32]

# 6. HDNN
model = HDNN(input_dim=10, hidden_dim=32, output_dim=2, height=3, depth=2)
x = torch.randn(5, 10)
out = model(x)
print("HDNN output:", out.shape)  # [5, 2]
```
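Most models above return a feature vector of shape `[batch, hidden_size]` rather than class logits. A common pattern (an assumption here, not part of the liquidnn API) is to pair a backbone with a linear classification head:

```python
import torch
import torch.nn as nn

# Hypothetical wrapper: combines any backbone that returns
# [batch, hidden_size] features with a linear classification head.
class LiquidClassifier(nn.Module):
    def __init__(self, backbone, hidden_size, num_classes):
        super().__init__()
        self.backbone = backbone
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        return self.head(self.backbone(x))  # [batch, num_classes] logits

# Stand-in backbone (mean over the time axis) keeps the sketch self-contained;
# in practice you would pass e.g. LiquidNeuralNetwork(input_size=8, hidden_size=8).
backbone = lambda x: x.mean(dim=1)
clf = LiquidClassifier(backbone, hidden_size=8, num_classes=3)
logits = clf(torch.randn(4, 10, 8))
print(logits.shape)  # torch.Size([4, 3])
```

The logits can then feed `nn.CrossEntropyLoss` in an ordinary PyTorch training loop.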

## 📚 Architectures

LiquidNeuralNetwork: Stacked liquid neurons

LTCLayer: Continuous-time RNN update rule

LiquidCNN: Convolutional backbone + LTC dynamics

LiquidRNN: RNN + Liquid refinement

LiquidLSTM: LSTM + Liquid refinement

HDNN: Height-Dimensioned Neural Network
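The continuous-time dynamics behind these layers are usually written as an ODE over the hidden state. This is the standard formulation from the liquid/LTC literature; the exact equations inside liquidnn may differ:

```math
\frac{d\mathbf{h}(t)}{dt} = -\frac{\mathbf{h}(t)}{\tau} + f\big(W\mathbf{x}(t) + U\mathbf{h}(t) + b\big)
```

Here τ is the per-neuron learnable time constant. In the LTC variant, the nonlinearity also modulates the decay term, so the effective time constant depends on the input at each step.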

Install dependencies:

```bash
pip install torch
```

## 🌟 Contribute

PRs and issues are welcome! If you try new liquidized architectures, feel free to share 🚀


