
ECNU-Cross-Innovation-Lab/Mamba-Spike

Mamba-Spike: Enhancing the Mamba Architecture with a Spiking Front-End


This is a PyTorch implementation of the paper "Mamba-Spike: Enhancing the Mamba Architecture with a Spiking Front-End for Efficient Temporal Data Processing" published in CGI 2024 (Computer Graphics International Conference).

Overview

Mamba-Spike is a novel neuromorphic architecture that integrates a spiking front-end with the Mamba backbone to achieve efficient and robust temporal data processing. The architecture leverages:

  • Event-driven processing through Spiking Neural Networks (SNNs)
  • Selective state spaces for efficient sequence modeling
  • Linear-time complexity for processing long temporal sequences
  • Energy-efficient computation through sparse spike representations

Architecture Components

  1. Spiking Front-End: Uses Leaky Integrate-and-Fire (LIF) neurons with recurrent connections to encode event-based data into sparse spike representations
  2. Interface Layer: Converts spikes to continuous activations using fixed time window accumulation and firing rate normalization
  3. Mamba Backbone: Processes temporal sequences using selective state space models with linear-time complexity
  4. Classification Head: Outputs class predictions with layer normalization
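As a rough illustration of how the spiking front-end operates, the sketch below shows a recurrent LIF layer: leaky membrane integration, a recurrent connection fed by the previous step's spikes, thresholding, and reset by subtraction. Class name and layer sizes are hypothetical, not the `mamba_spike.py` API, and a real implementation would use a surrogate gradient through the threshold during training:

```python
import torch
import torch.nn as nn

class LIFFrontEnd(nn.Module):
    """Minimal recurrent LIF front-end sketch (illustrative only)."""

    def __init__(self, in_features, hidden, beta=0.97, threshold=1.0):
        super().__init__()
        self.fc_in = nn.Linear(in_features, hidden)   # feed-forward input
        self.fc_rec = nn.Linear(hidden, hidden)       # recurrent connection
        self.beta = beta           # membrane decay per time step (tau ~ 30 ms)
        self.threshold = threshold

    def forward(self, x):
        # x: (batch, time, in_features) -> spikes: (batch, time, hidden)
        B, T, _ = x.shape
        mem = torch.zeros(B, self.fc_in.out_features)
        spk = torch.zeros_like(mem)
        spikes = []
        for t in range(T):
            # leaky integration plus recurrent feedback from previous spikes
            mem = self.beta * mem + self.fc_in(x[:, t]) + self.fc_rec(spk)
            spk = (mem >= self.threshold).float()   # emit binary spikes
            mem = mem - spk * self.threshold        # soft reset by subtraction
            spikes.append(spk)
        return torch.stack(spikes, dim=1)
```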

Paper Compliance

This implementation strictly follows the paper specifications:

  • Recurrent Connections (Page 6): Added to spiking front-end for temporal feature extraction
  • LIF Time Constant (Figure 5): Optimized to τ ≈ 30ms (beta=0.97) for best performance
  • Spike-to-Activation (Page 7): Implements fixed time window accumulation with firing rate normalization
  • Sequential MNIST (Table 1): Full support with rate coding conversion from standard MNIST
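The spike-to-activation conversion in the interface layer can be sketched as follows: spikes are accumulated over fixed, non-overlapping time windows and normalized by the window length, yielding a per-window firing rate in [0, 1]. The function name and signature are ours for illustration; the repository's exact routine may differ:

```python
import torch

def spikes_to_activations(spikes, window):
    """Convert a binary spike train (B, T, C) into continuous activations
    by averaging over fixed time windows (firing-rate normalization)."""
    B, T, C = spikes.shape
    assert T % window == 0, "T must be a multiple of the window length"
    # (B, T//window, window, C) -> mean over window axis = firing rate
    return spikes.view(B, T // window, window, C).mean(dim=2)
```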

Setup

1. Environment Setup

The project uses a conda environment named mambaspike:

conda create -n mambaspike python=3.9
conda activate mambaspike
pip install -r requirements.txt

2. Dataset Preparation

The project supports four neuromorphic datasets:

  • N-MNIST: Neuromorphic version of MNIST (34x34 resolution)
  • DVS Gesture: Dynamic hand gestures (128x128 resolution)
  • CIFAR10-DVS: Neuromorphic version of CIFAR-10 (128x128 resolution)
  • Sequential MNIST: Standard MNIST converted to spike trains (28x28 resolution)

Datasets will be automatically downloaded when running the training script.
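For Sequential MNIST, the rate-coding conversion can be illustrated with a minimal Bernoulli-sampling sketch: at each time step a pixel fires with probability equal to its normalized intensity. This is a hypothetical helper, not necessarily the exact `dataset_loader.py` logic:

```python
import torch

def rate_code(images, num_steps):
    """Rate-code pixel intensities in [0, 1] into Bernoulli spike trains.
    images: (B, H, W) -> spikes: (B, num_steps, H, W)"""
    probs = images.unsqueeze(1).expand(-1, num_steps, -1, -1)
    return torch.bernoulli(probs)  # sample one spike/no-spike per step
```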

Training

Basic Training

To train on N-MNIST:

python train.py --dataset nmnist --epochs 100 --batch-size 32

To train on DVS Gesture:

python train.py --dataset dvsgesture --epochs 150 --batch-size 16 --lr 5e-4

To train on CIFAR10-DVS:

python train.py --dataset cifar10dvs --epochs 200 --batch-size 32 --lr 1e-3

To train on Sequential MNIST:

python train.py --dataset sequential_mnist --epochs 100 --batch-size 64

Training Parameters

  • --dataset: Choose from nmnist, dvsgesture, cifar10dvs, sequential_mnist
  • --epochs: Number of training epochs
  • --batch-size: Batch size for training
  • --lr: Learning rate (default: 1e-3)
  • --weight-decay: Weight decay for AdamW optimizer (default: 1e-4)
  • --time-window: Time window in microseconds (default: 300000, i.e. 300 ms)
  • --dt: Time bin in microseconds (default: 1000, i.e. 1 ms)
  • --output-dir: Directory to save outputs (default: ./outputs)
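To make the --time-window/--dt relationship concrete: with the defaults, each sample covers 300000 µs split into 1000 µs bins, i.e. 300 time steps. The hypothetical helper below bins raw (t, x, y) events into dense frames at N-MNIST's 34×34 resolution (polarity omitted for brevity; the actual loader may differ):

```python
import torch

def bin_events(timestamps_us, xs, ys, time_window_us=300_000, dt_us=1_000,
               height=34, width=34):
    """Bin an event stream into time_window_us // dt_us dense frames,
    counting events per (bin, y, x) cell; out-of-window events are dropped."""
    num_bins = time_window_us // dt_us          # 300 with the defaults
    frames = torch.zeros(num_bins, height, width)
    for t, x, y in zip(timestamps_us, xs, ys):
        b = int(t // dt_us)                     # which time bin this event hits
        if 0 <= b < num_bins:
            frames[b, y, x] += 1.0
    return frames
```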

Evaluation

To evaluate a trained model:

python evaluate.py --checkpoint outputs/nmnist_*/checkpoint_best.pth

With additional analysis:

python evaluate.py --checkpoint outputs/nmnist_*/checkpoint_best.pth --analyze-temporal --compare-paper

Results

Performance comparison on various neuromorphic datasets:

Dataset             Mamba-Spike   Mamba    SLAYER   DECOLLE   Spiking-YOLO
DVS Gesture         97.8%         96.8%    93.6%    95.2%     96.1%
TIDIGITS            99.2%         98.7%    97.5%    98.3%     -
Sequential MNIST    99.4%         99.3%    -        -         -
CIFAR10-DVS         92.5%         91.8%    87.3%    89.6%     91.2%

Project Structure

mambaspike/
├── data/
│   └── dataset_loader.py      # Dataset loading utilities (includes Sequential MNIST)
├── models/
│   └── mamba_spike.py         # Model architecture (paper-compliant)
├── train.py                   # Training script
├── evaluate.py                # Evaluation script
├── requirements.txt           # Dependencies
└── README.md                  # This file

Key Features

  1. Paper-Compliant Implementation: Strictly follows all architectural specifications from the CGI 2024 paper
  2. Recurrent Spiking Front-End: Temporal feature extraction through recurrent connections (30ms time constant)
  3. Efficient Temporal Processing: Leverages Mamba's selective state spaces for O(L) complexity
  4. Neuromorphic Data Support: Native processing of event-based data (DVS cameras, Sequential MNIST)
  5. Multi-Scale Architecture: Supports different input resolutions (28×28 to 128×128)
  6. Flexible Training: Easy to adapt for different neuromorphic datasets

Citation

If you use this code in your research, please cite:

@inproceedings{qin2025mambaspike,
  title={Mamba-Spike: Enhancing the Mamba Architecture with a Spiking Front-End for Efficient Temporal Data Processing},
  author={Qin, Jiahao and Liu, Feng},
  booktitle={Advances in Computer Graphics: 41st Computer Graphics International Conference, CGI 2024},
  pages={303--315},
  year={2025},
  publisher={Springer},
  address={Cham},
  series={Lecture Notes in Computer Science},
  volume={15339},
  doi={10.1007/978-3-031-82021-2_23},
  url={https://doi.org/10.1007/978-3-031-82021-2_23}
}

Reference: Qin, J., Liu, F. (2025). Mamba-Spike: Enhancing the Mamba Architecture with a Spiking Front-End for Efficient Temporal Data Processing. In: Magnenat-Thalmann, N., Kim, J., Sheng, B., Deng, Z., Thalmann, D., Li, P. (eds) Advances in Computer Graphics. CGI 2024. Lecture Notes in Computer Science, vol 15339. Springer, Cham. https://doi.org/10.1007/978-3-031-82021-2_23

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For questions or issues, please:

  • Open an issue on GitHub
  • Contact the paper authors (see paper for details)

Note: This is a research implementation. For production use, additional testing and optimization may be required.
