#io #api #api-client #input #message

onellm

Official Rust crate for communicating with the OneLLM API

5 releases (stable)

Uses new Rust 2024

1.0.3 Aug 13, 2025
1.0.2 Jul 18, 2025
1.0.1 Jul 16, 2025
1.0.0-beta Jul 10, 2025

#736 in Web programming

MIT license

14KB
287 lines

OneLLM API Client

This is a Rust client for interacting with the OneLLM API.

Usage

Add this to your Cargo.toml:

onellm = "1.0.3"
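
The example below is async and relies on #[tokio::main], so the manifest also needs tokio as a dependency. A minimal sketch (the feature selection here is an assumption; "full" is simply the easiest starting point):

```toml
[dependencies]
onellm = "1.0.3"
# tokio provides the async runtime used by #[tokio::main]
tokio = { version = "1", features = ["full"] }
```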

Example

use onellm::input::{self, Message};

#[tokio::main]
async fn main() {
    let output = input::APIInput::new(
        "https://api.deepseek.com/chat/completions".to_string(),
        input::Model::DeepSeekV3,
        vec![Message {
            role: "user".to_string(),
            content: "hi there!".to_string(),
        }],
        200,
    )
    .send(String::from("YOUR API KEY HERE"))
    .await
    .expect("Error obtaining result");
    println!("Output: {output:#?}");
}
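
Rather than hardcoding the API key in source, it can be read from the environment at startup. A minimal sketch; the variable name ONELLM_API_KEY is an assumption, not part of the crate:

```rust
use std::env;

// Hypothetical helper: fetch the OneLLM API key from the environment.
// The variable name ONELLM_API_KEY is an assumption for illustration.
fn api_key() -> String {
    env::var("ONELLM_API_KEY").expect("ONELLM_API_KEY is not set")
}

fn main() {
    // Pass api_key() to .send(...) in place of the string literal.
    println!("key loaded: {} chars", api_key().len());
}
```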

Dependencies

~6–20MB
~222K SLoC