7 releases (breaking)
Uses the Rust 2024 edition.
| Version | Date |
|---|---|
| 0.6.0 | Oct 19, 2025 |
| 0.5.1 | Aug 7, 2025 |
| 0.4.0 | Aug 3, 2025 |
| 0.3.0 | Jul 29, 2025 |
| 0.1.0 | Jul 24, 2025 |
Introduction
This crate exposes an async stream API for the widely-used OpenAI chat completion API.
Supported features:
- Stream generation
- Tool calls
- Reasoning content (Qwen3, DeepSeek R1, etc.)
This crate is built on top of `reqwest` and `serde_json`.
```rust
use nah_chat::{ChatClient, ChatMessage};
use futures_util::{pin_mut, StreamExt};

// `base_url`, `auth_token`, `model_name`, `messages`, and `params` are
// assumed to be defined by the caller.
let chat_client = ChatClient::init(base_url, auth_token);

// Create and pin the stream.
let stream = chat_client
    .chat_completion_stream(model_name, &messages, &params)
    .await
    .unwrap();
pin_mut!(stream);

// Buffer for the new message.
let mut message = ChatMessage::new();

// Consume the stream, applying each chunk to the message buffer.
while let Some(delta_result) = stream.next().await {
    match delta_result {
        Ok(delta) => {
            message.apply_model_response_chunk(delta);
        }
        Err(e) => {
            eprintln!("Error occurred while processing the chat completion: {}", e);
        }
    }
}
```
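For context, each item yielded by the stream corresponds to one chunk of an OpenAI-compatible streaming response, whose `delta` object carries a partial piece of the assistant message (and, for reasoning models such as DeepSeek R1, a separate `reasoning_content` field). The sketch below is not part of this crate; it only illustrates with plain `serde_json` how such deltas can be folded into a complete message, which is roughly what `apply_model_response_chunk` does for you. The chunk layout follows the public OpenAI chat-completion format; the `fold_delta` helper and the sample payloads are hypothetical.

```rust
use serde_json::Value;

/// Hypothetical helper: append the text fields of one streamed `delta`
/// object onto running buffers for content and reasoning content.
fn fold_delta(chunk: &Value, content: &mut String, reasoning: &mut String) {
    // OpenAI-style chunks carry the partial message under choices[0].delta.
    if let Some(delta) = chunk.pointer("/choices/0/delta") {
        if let Some(text) = delta.get("content").and_then(Value::as_str) {
            content.push_str(text);
        }
        // Reasoning models (e.g. DeepSeek R1) stream their thinking in a
        // separate `reasoning_content` field.
        if let Some(text) = delta.get("reasoning_content").and_then(Value::as_str) {
            reasoning.push_str(text);
        }
    }
}

fn main() {
    // Two example chunks, shaped like OpenAI chat-completion stream events.
    let chunks: Vec<Value> = vec![
        serde_json::json!({"choices": [{"index": 0, "delta": {"role": "assistant", "content": "Hello"}}]}),
        serde_json::json!({"choices": [{"index": 0, "delta": {"content": ", world!"}}]}),
    ];

    let (mut content, mut reasoning) = (String::new(), String::new());
    for chunk in &chunks {
        fold_delta(chunk, &mut content, &mut reasoning);
    }
    assert_eq!(content, "Hello, world!");
}
```

In practice you never assemble these chunks yourself: the stream returned by `chat_completion_stream` yields them already deserialized, and `apply_model_response_chunk` performs the merging into the `ChatMessage` buffer.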
Notice
Copyright 2025, Mengxiao Lin.
This crate is part of the nah project; nah stands for "Not A Human". The source code is available under the MPL-2.0 license.