
#artificial-intelligence #machine-learning #onnx-runtime

sys no-std ort-sys

Unsafe Rust bindings for ONNX Runtime 1.23 - Optimize and Accelerate Machine Learning Inferencing

16 releases

Uses new Rust 2024

2.0.0-rc.11 Jan 7, 2026
2.0.0-rc.10 Jun 1, 2025
2.0.0-rc.9 Nov 21, 2024
2.0.0-rc.4 Jul 7, 2024
2.0.0-alpha.2 Nov 28, 2023

#581 in Machine learning


475,840 downloads per month
Used in 273 crates (14 directly)

MIT/Apache

220KB
4.5K SLoC



ort is a Rust interface for performing hardware-accelerated inference & training on machine learning models in the Open Neural Network Exchange (ONNX) format.

Based on the now-inactive onnxruntime-rs crate, ort is primarily a wrapper for Microsoft's ONNX Runtime library, but offers support for other pure-Rust runtimes.

ort with ONNX Runtime is super quick, and it supports almost any hardware accelerator you can think of. Even so, it's light enough to run on your users' devices.

When you need to deploy a PyTorch/TensorFlow/Keras/scikit-learn/PaddlePaddle model either on-device or in the datacenter, ort has you covered.
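The basic workflow is to build a Session from an ONNX model file, feed it input tensors, and read back the outputs. The sketch below shows that shape against the ort 2.0 release-candidate API; the model path model.onnx, the input name "input", the output name "output", and the tensor shape are all placeholders, and some method and macro signatures have shifted between release candidates, so treat it as an outline rather than exact code.

```rust
use ort::session::{builder::GraphOptimizationLevel, Session};
use ort::value::Tensor;

fn main() -> ort::Result<()> {
    // Build a session from an ONNX model on disk, with graph optimizations
    // and a small intra-op thread pool. "model.onnx" is a placeholder path.
    let session = Session::builder()?
        .with_optimization_level(GraphOptimizationLevel::Level3)?
        .with_intra_threads(4)?
        .commit_from_file("model.onnx")?;

    // Wrap some f32 data in a tensor; the shape and values here are purely
    // illustrative and must match your model's expected input.
    let input = Tensor::from_array(([1usize, 4], vec![1.0_f32, 2.0, 3.0, 4.0]))?;

    // Run inference. "input" must match the model's actual input name.
    let outputs = session.run(ort::inputs!["input" => input])?;

    // Extract an output as an f32 slice plus its shape. "output" is a
    // placeholder for the model's actual output name, and this extraction
    // call assumes the newer (rc.10+) API.
    let (shape, data) = outputs["output"].try_extract_tensor::<f32>()?;
    println!("output shape {shape:?}: {data:?}");
    Ok(())
}
```

Execution providers (CUDA, TensorRT, CoreML, DirectML, and others) are enabled on the session builder in the same way before committing the model; see the documentation below for the full list.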

πŸ“– Documentation

πŸ€” Support

🌠 Sponsor ort


πŸ’– FOSS projects using ort

Open a PR to add your project here 🌟

  • Koharu uses ort to detect, OCR, and inpaint manga pages.
  • BoquilaHUB uses ort for local AI deployment in biodiversity conservation efforts.
  • Magika uses ort for content type detection.
  • Text Embeddings Inference (TEI) uses ort to deliver high-performance ONNX runtime inference for text embedding models.
  • sbv2-api is a fast implementation of Style-BERT-VITS2 text-to-speech using ort.
  • CamTrap Detector uses ort to detect animals, humans and vehicles in trail camera imagery.
  • oar-ocr A comprehensive OCR library, built in Rust with ort for efficient inference.
  • retto uses ort for reliable, fast ONNX inference of PaddleOCR models on Desktop and WASM platforms.
  • Ahnlich uses ort to power their AI proxy for semantic search applications.
  • Valentinus uses ort to provide embedding model inference inside LMDB.
  • edge-transformers uses ort for accelerated transformer model inference at the edge.
  • FastEmbed-rs uses ort for generating vector embeddings, reranking locally.
  • Ortex uses ort for safe ONNX Runtime bindings in Elixir.
  • SilentKeys uses ort for fast, on-device real-time dictation with NVIDIA Parakeet and Silero VAD.

Dependencies

~0–2MB
~27K SLoC