NHSMM — Neural Hidden Semi-Markov Models
- Repository: NHSMM on GitHub
- Interfaces: NHSMM-INTERFACES on GitHub
- Documentation: NHSMM Wiki
- Article: Unlocking Hidden Patterns in Time – Meet NHSMM
⚠️ Alpha stage — NHSMM is a proof-of-concept and actively evolving. Public APIs may change before the stable 1.0.0 release.
NHSMM is a modular PyTorch library for context-aware sequential modeling, forming the foundation of the State Aware Engine (SAE). NHSMM-INTERFACES defines domain-level contracts for integrating NHSMM in diverse systems.
Designed for developers, data scientists, and system integrators, NHSMM enables rapid understanding, deployment, and extension of latent state models for domains such as finance, IoT, robotics, health, and cybersecurity.
- Neural HSMM — integrates Hidden Semi-Markov Models with neural parameterization for expressive latent dynamics.
- Context-Aware Modulation — initial, transition, duration, and emission distributions adapt to external covariates.
- Flexible Architectures — supports hierarchical and hybrid models.
- PyTorch & GPU Ready — scalable multi-domain deployment.
- Modular Foundation — for research, experimentation, and production-ready sequence models.
NHSMM explicitly models:
- Context-Dependent State Durations — variable dwell-times per hidden state influenced by covariates.
- Context-Dependent Transitions — dynamic transition probabilities adapting to time-varying features.
Suitable for non-stationary, heterogeneous, and time-aware sequences across real-world applications.
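In a classical HSMM each state has a fixed dwell-time distribution; context-dependence means that distribution is instead produced from covariates. The real NHSMM duration module is neural and learned, but the core idea can be sketched in a few lines of NumPy (all names here, `duration_probs`, `W`, `b`, are hypothetical, not the NHSMM API): a linear map from context to logits over duration bins, followed by a softmax.

```python
import numpy as np

def duration_probs(covariates, W, b):
    """Map a context vector to a distribution over dwell-time bins.

    covariates: (n_features,) external context for the current step
    W: (max_duration, n_features) weights (learned in practice)
    b: (max_duration,) bias
    Returns a probability vector over durations 1..max_duration.
    """
    logits = W @ covariates + b
    exp = np.exp(logits - logits.max())      # numerically stable softmax
    return exp / exp.sum()

rng = np.random.default_rng(0)
W, b = rng.normal(size=(10, 5)), np.zeros(10)
p = duration_probs(rng.normal(size=5), W, b)  # shifts mass as context shifts
```

Because the map is differentiable, gradients from the sequence likelihood can flow back into `W` and `b`, which is what makes the dwell-times trainable end to end.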
- Forever Open Core — `nhsmm` remains fully open-source and actively maintained.
- No Hidden Dependencies — the core library uses only open components; experimental modules (`nhsmm-interfaces`) are optional.
- Transparent Evolution — research previews and pre-release interfaces are clearly marked.
- Community Respect — contributions are acknowledged; experimental previews may close but knowledge remains accessible.
- Clear Upgrade Path — experimental work informs SAE; core NHSMM is stable and independent.
- Contextual HSMM — dynamic modulation of initial, transition, duration, and emission probabilities.
- Duration Models — explicit, context-aware state dwell-times.
- Emission Models — Gaussian, Student-t, or discrete outputs; differentiable and context-aware.
- Transition Models — learnable, covariate-aware with gating and temperature scaling; supports low-rank factorization.
- Hybrid HSMM-HMM Inference — forward-backward and Viterbi adapted for neural latent states.
- Subclassable Distributions — extend Initial, Duration, Transition, Emission modules.
- Differentiable Training — gradient-based optimization, temperature annealing, neural modulation.
- Neural Context Encoders — CNN, LSTM, or hybrid encoders for time-varying covariates.
- GPU-Ready — fully batched operations.
- Multi-Domain Applicability — finance, IoT, robotics, health, cybersecurity.
- Extensible Architecture — foundation for SAE interfaces, API integration, and research projects.
- Hybrid Update Modes — neural gradient-based updates, optional alternative schemes.
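To illustrate two of the transition-model ideas above (low-rank factorization and temperature scaling), here is a minimal NumPy sketch. It is not the NHSMM implementation, and `low_rank_transitions` is a hypothetical name: factoring the K×K logit matrix as `U @ V.T` cuts parameters from K² to 2·K·r, and dividing by a temperature sharpens or flattens each row before the softmax.

```python
import numpy as np

def low_rank_transitions(U, V, temperature=1.0):
    """Temperature-scaled, low-rank transition matrix.

    U, V: (K, r) factors; the logit matrix U @ V.T has rank <= r,
    reducing parameters from K*K to 2*K*r for large state spaces.
    Lower temperature sharpens rows; higher temperature flattens them.
    """
    logits = (U @ V.T) / temperature
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)       # each row is a distribution

rng = np.random.default_rng(1)
K, r = 8, 2
P = low_rank_transitions(rng.normal(size=(K, r)), rng.normal(size=(K, r)))
```

In a neural setting the factors `U` and `V` would themselves be outputs of a context encoder, so the transition matrix changes with the covariates at each step.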
- Vectorized forward-backward for batched likelihood computation.
- Optional low-rank transitions for large state spaces.
- Supports long sequences efficiently.
- Memory-efficient Viterbi optimized for GPU.
- Handles variable-length sequences with padding and masking.
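The padding-and-masking point deserves a concrete sketch. The following NumPy code (an illustration, not the NHSMM internals) runs a batched HMM forward pass in log-space over padded sequences: rows whose true length has been reached stop updating, and each sequence's log-likelihood is read off at its own final step.

```python
import numpy as np

def logsumexp(x, axis=-1):
    m = x.max(axis=axis, keepdims=True)
    return np.squeeze(m + np.log(np.exp(x - m).sum(axis=axis, keepdims=True)), axis=axis)

def batched_forward(log_pi, log_A, log_B, lengths):
    """Vectorized forward pass over a padded batch.

    log_pi: (K,) initial log-probs; log_A: (K, K) transition log-probs
    log_B: (batch, T, K) per-step emission log-probs (padded past each length)
    lengths: (batch,) true sequence lengths
    Returns per-sequence log-likelihoods; padded steps are masked out.
    """
    batch, T, K = log_B.shape
    alpha = log_pi[None, :] + log_B[:, 0]                       # (batch, K)
    loglik = np.where(lengths == 1, logsumexp(alpha), 0.0)
    for t in range(1, T):
        # alpha_t[j] = log_B_t[j] + logsumexp_i(alpha_{t-1}[i] + log_A[i, j])
        step = log_B[:, t] + logsumexp(alpha[:, :, None] + log_A[None], axis=1)
        alpha = np.where((t < lengths)[:, None], step, alpha)   # freeze padded rows
        loglik = np.where(lengths == t + 1, logsumexp(alpha), loglik)
    return loglik

# Sanity check: a uniform model with certain emissions has log-likelihood 0.
K, T = 3, 5
log_pi = np.full(K, -np.log(K))
log_A = np.full((K, K), -np.log(K))
ll = batched_forward(log_pi, log_A, np.zeros((2, T, K)), np.array([5, 2]))
```

The same masking pattern extends to the backward pass and to duration-augmented (semi-Markov) recursions; only the recursion body changes.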
| Stage | Status | Notes |
|---|---|---|
| Proof of Concept | ✅ Done | Alpha release (0.0.1-alpha) |
| Testing/Enhancement | 🚧 In Progress | Improve performance, extend API |
| Production Release | 🔜 Planned | Stable 1.0.0 release with documentation |
Install from PyPI:

```bash
pip install nhsmm
```

Or install from source:

```bash
git clone https://github.com/awa-si/NHSMM.git
cd NHSMM
pip install -e .
```

Editable mode allows modification and testing without reinstalling.
See: State Occupancy & Duration/Transition Diagnostics
Works similarly for IoT signals, health telemetry, robotics, or cybersecurity logs.
```python
from nhsmm.models import HSMM

model = HSMM(n_states=3, n_features=5)
seq_features, canonical = model.encoder.encode(sequences)
```

```
External Input → Neural Initial Module (π)
              → Neural Transition Module (A)
              → Neural Duration Module (D)
              → Emission Module (Gaussian/Student-t/Discrete)
              → Forward-Backward / Viterbi → Backprop
```
- External Input: features, covariates, embeddings.
- Neural Modules: context-conditioned initial, transition, duration, and emission distributions.
- Inference: latent states inferred via forward-backward and Viterbi.
- Backpropagation: updates all neural modules jointly.
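The joint-backpropagation step above can be made concrete with a toy PyTorch example. This is a hedged sketch, not the NHSMM API: a learnable emission mean `mu` (hypothetical), fixed uniform initial and transition distributions, and the negative forward-algorithm log-likelihood as the loss, so gradients flow through the recursion into the emission parameters.

```python
import math
import torch

torch.manual_seed(0)
K, T = 2, 20
obs = torch.randn(T) + 1.5                      # 1-D toy observations
mu = torch.nn.Parameter(torch.zeros(K))         # learnable emission means
log_pi = torch.full((K,), -math.log(K))         # fixed uniform initial probs
log_A = torch.full((K, K), -math.log(K))        # fixed uniform transitions

def neg_loglik():
    log_B = -0.5 * (obs[:, None] - mu) ** 2     # unnormalized Gaussian log-probs
    alpha = log_pi + log_B[0]
    for t in range(1, T):                       # differentiable forward recursion
        alpha = log_B[t] + torch.logsumexp(alpha[:, None] + log_A, dim=0)
    return -torch.logsumexp(alpha, dim=0)

opt = torch.optim.Adam([mu], lr=0.1)
losses = []
for _ in range(50):
    opt.zero_grad()
    loss = neg_loglik()
    loss.backward()                             # backprop through the recursion
    opt.step()
    losses.append(loss.item())
```

In the full model the transition, duration, and initial modules would also be parameterized networks; because everything sits inside one differentiable graph, a single optimizer updates them all jointly, as described above.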
- Security & Cyber-Physical Systems: anomaly and hidden state detection.
- Finance & Trading: regime detection, forecasting, adaptive strategies.
- IoT & Industrial: predictive maintenance, fault detection.
- Health & Wearables: activity and state tracking, multimodal fusion.
- Robotics: behavior monitoring, safe human-robot interaction.
- Telecommunications & Energy: latent state monitoring, resource optimization.
- Research & AI: temporal modeling, neural-probabilistic experiments.
Contributions welcome! Bug reports, feature suggestions, or documentation improvements strengthen NHSMM.
```bash
git clone https://github.com/awa-si/NHSMM.git
cd NHSMM
pip install -e ".[dev]"
pytest -v
black nhsmm
ruff check nhsmm
```

Development is supported via GitHub Sponsors, Patreon, and Medium. See FUNDING.md for details.
Apache License 2.0 © 2024 AWA.SI. Full terms: LICENSE.
If used in academic work, please cite the repository.