
46 releases (28 breaking)

Uses new Rust 2024

new 0.29.0 Jan 20, 2026
0.27.0 Dec 15, 2025
0.24.0 Nov 10, 2025
0.16.0 Jul 30, 2025
0.0.6 Jun 12, 2024

#43 in Machine learning


34,275 downloads per month
Used in 124 crates (90 directly)

MIT and maybe GPL-3.0

2.5MB
34K SLoC

Rig

Rig is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.

More information about this crate can be found in the crate documentation.

Features

  • Agentic workflows that can handle multi-turn streaming and prompting
  • Full GenAI Semantic Convention compatibility
  • 20+ model providers, all under a single unified interface
  • 10+ vector store integrations, all under a single unified interface
  • Full support for LLM completion and embedding workflows
  • Support for transcription, audio generation and image generation model capabilities
  • Integrate LLMs in your app with minimal boilerplate
  • Full WASM compatibility (core library only)

Installation

cargo add rig-core

Simple example

use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}

Note: using #[tokio::main] requires enabling tokio's macros and rt-multi-thread features, or just full to enable all features (cargo add tokio --features macros,rt-multi-thread).
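
The agentic workflow features listed above build on the same pattern. A minimal sketch, assuming a recent rig-core version where the client exposes an agent builder with a preamble (system prompt), as described in the crate documentation:

use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Requires the `OPENAI_API_KEY` environment variable, as above.
    let openai_client = openai::Client::from_env();

    // An agent bundles a model with a preamble (system prompt) and other settings.
    let comedian = openai_client
        .agent("gpt-4o")
        .preamble("You are a comedian. Answer every question with a joke.")
        .build();

    let response = comedian
        .prompt("What is Rust's borrow checker?")
        .await
        .expect("Failed to prompt the agent");

    println!("Agent: {response}");
}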

Integrations

Rig supports the following LLM providers out of the box:

  • Anthropic
  • Azure
  • Cohere
  • DeepSeek
  • Galadriel
  • Gemini
  • Groq
  • Hugging Face
  • Hyperbolic
  • Mira
  • Mistral
  • Moonshot
  • Ollama
  • OpenAI
  • OpenRouter
  • Perplexity
  • Together
  • Voyage AI
  • xAI
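
Because every provider sits behind the same unified interface, swapping providers mostly means swapping the client module. A rough sketch, assuming the Anthropic client exposes the same from_env constructor as the OpenAI one and that the model name is valid (check the provider module docs; Anthropic also needs an explicit token limit):

use rig::{completion::Prompt, providers::anthropic};

#[tokio::main]
async fn main() {
    // Assumes an `ANTHROPIC_API_KEY` environment variable.
    let client = anthropic::Client::from_env();

    // Same builder pattern as the OpenAI example; only the provider module changes.
    let claude = client
        .agent("claude-3-5-sonnet-latest")
        .max_tokens(1024) // Anthropic requires a max token limit to be set.
        .build();

    let response = claude
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt Claude");

    println!("Claude: {response}");
}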

Vector stores are available as separate companion crates.
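
A retrieval-augmented agent can combine embeddings, a vector index, and an agent in one pipeline. A rough sketch using the in-memory store bundled with rig-core (the EmbeddingsBuilder, InMemoryVectorStore, and dynamic_context names follow recent crate documentation and may differ across the crate's many breaking releases):

use rig::{
    completion::Prompt, embeddings::EmbeddingsBuilder, providers::openai,
    vector_store::in_memory_store::InMemoryVectorStore,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = openai::Client::from_env();
    let embedding_model = client.embedding_model(openai::TEXT_EMBEDDING_ADA_002);

    // Embed a few plain-text documents.
    let embeddings = EmbeddingsBuilder::new(embedding_model.clone())
        .documents(vec![
            "Rig is a Rust library for building LLM-powered applications.".to_string(),
            "Vector store integrations live in separate companion crates.".to_string(),
        ])?
        .build()
        .await?;

    // Load the embeddings into the in-memory vector store and index it.
    let store = InMemoryVectorStore::from_documents(embeddings);
    let index = store.index(embedding_model);

    // The agent pulls the top matching document into its context on every prompt.
    let rag_agent = client
        .agent("gpt-4o")
        .preamble("Answer using the provided context documents.")
        .dynamic_context(1, index)
        .build();

    let answer = rag_agent.prompt("What is Rig?").await?;
    println!("{answer}");
    Ok(())
}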

Who is using Rig?

Below is a non-exhaustive list of companies and people who are using Rig:

  • St Jude - Using Rig for a chatbot utility as part of proteinpaint, a genomics visualisation tool.
  • Coral Protocol - Using Rig extensively, both internally as well as part of the Coral Rust SDK.
  • VT Code - a Rust-based terminal coding agent with semantic code intelligence via Tree-sitter and ast-grep. VT Code uses Rig to simplify LLM calls and to implement its model picker.
  • Dria - a decentralised AI network. Currently using Rig as part of their compute node.
  • Nethermind - Using Rig as part of their Neural Interconnected Nodes Engine framework.
  • Neon - Using Rig for their app.build V2 reboot in Rust.
  • Listen - a framework aiming to become the go-to choice for AI portfolio management agents. Powers the Listen app.
  • Cairnify - helps users find documents, links, and information instantly through an intelligent search bar. Rig provides the agentic foundation behind Cairnify’s AI search experience, enabling tool-calling, reasoning, and retrieval workflows.

Are you also using Rig in production? Open an issue to have your name added!

Dependencies

~13–37MB
~438K SLoC