
ort is an (unofficial) ONNX Runtime 1.22 wrapper for Rust based on the now-inactive onnxruntime-rs. ONNX Runtime accelerates ML inference and training on both CPU and GPU.
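For context, here is a minimal sketch of running inference with ort. It assumes the ort 2.0 API; the model path, tensor shape, and the input/output names "input" and "output" are placeholders, and exact method names vary between release candidates, so consult the documentation linked below.

```rust
use ort::session::{builder::GraphOptimizationLevel, Session};
use ort::value::Tensor;

fn main() -> ort::Result<()> {
    // Load and optimize an ONNX model on the default (CPU) execution provider.
    // "model.onnx" is a placeholder path.
    let session = Session::builder()?
        .with_optimization_level(GraphOptimizationLevel::Level3)?
        .commit_from_file("model.onnx")?;

    // Build a float32 input tensor; the shape and the name "input" must match
    // your model's actual inputs.
    let input = Tensor::from_array(([1usize, 3, 224, 224], vec![0.0_f32; 3 * 224 * 224]))?;

    // Run inference; outputs are addressable by the model's output names.
    let outputs = session.run(ort::inputs!["input" => input])?;
    let (shape, data) = outputs["output"].try_extract_tensor::<f32>()?;
    println!("output shape {shape:?}, first value {}", data[0]);
    Ok(())
}
```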

📖 Documentation

🤔 Support

💖 Projects using ort

Open a PR to add your project here 🌟

  • Bloop uses ort to power their semantic code search feature.
  • edge-transformers uses ort for accelerated transformer model inference at the edge.
  • Ortex uses ort for safe ONNX Runtime bindings in Elixir.
  • Supabase uses ort to remove cold starts for their edge functions.
  • Lantern uses ort to provide embedding model inference inside Postgres.
  • Magika uses ort for content type detection.
  • sbv2-api is a fast implementation of Style-BERT-VITS2 text-to-speech using ort.
  • Ahnlich uses ort to power their AI proxy for semantic search applications.
  • Spacedrive is a cross-platform file manager with AI features powered by ort.
  • BoquilaHUB uses ort for local AI deployment in biodiversity conservation efforts.
  • FastEmbed-rs uses ort for generating vector embeddings, reranking locally.
  • Aftershoot uses ort to power AI-assisted image editing workflows.
  • Valentinus uses ort to provide embedding model inference inside LMDB.
  • retto uses ort for reliable, fast ONNX inference of PaddleOCR models on Desktop and WASM platforms.
  • oar-ocr is a comprehensive OCR library built in Rust, using ort for efficient inference.

🌠 Sponsor ort

