
ORBIT – One API. Any LLM. Your data.

Stop rewriting your app every time you switch LLMs. ORBIT unifies 20+ AI providers with your databases, vector stores, and APIs, all through one self-hosted gateway.

Ship faster. Stay portable. Keep your data private.

Star on GitHub


Highlights

  • Unified AI gateway supporting 20+ LLM providers (OpenAI, Anthropic, Gemini, Cohere, Mistral, Ollama, Groq, DeepSeek, xAI, OpenRouter, and more) plus local models via Ollama, llama.cpp, and vLLM.
  • Data integration with RAG adapters for SQL databases (PostgreSQL, MySQL, SQLite, DuckDB, Oracle, SQL Server, Cassandra), vector stores (Chroma, Qdrant, Pinecone, Milvus, Elasticsearch, Redis), MongoDB, HTTP APIs, and file uploads with multimodal support.
  • Intelligent query processing with intent-based adapters that translate natural language to SQL, Elasticsearch queries, MongoDB queries, and HTTP API calls.
  • Vision capabilities with support for vLLM, OpenAI, Gemini, and Anthropic vision models for image analysis and OCR.
  • Secure by default with token-based auth, role-aware API keys, and pluggable content moderation.
  • Ready for teams thanks to batteries-included clients (CLI, React widget, Node/Python SDKs).

Why ORBIT

  • Avoid vendor lock-in by switching between LLM providers without rewriting your application code: change providers in configuration, not code.
  • Keep your data private with support for on-prem deployments, air-gapped installs, and local models that never leave your infrastructure.
  • Query your data naturally in any language instead of writing SQL, Elasticsearch queries, or API calls; intent-based adapters handle the translation automatically.

Built for

  • Platform & infra teams who need a stable control plane for LLM workloads across multiple providers and data sources.
  • Product teams shipping AI copilots that depend on reliable retrieval, intent-based querying, and guardrails.
  • Data teams building RAG applications that need to query SQL databases, vector stores, and APIs through natural language.
  • Researchers & tinkerers exploring local-first stacks, evaluating different foundation models, or building multimodal AI applications.

⭐ Why Star This Repo?

Your star isn't just a vanity metric; it directly helps the project:

  • Visibility – Stars help other developers discover ORBIT in search results
  • Releases – Get notified when we ship new features and providers
  • Open source – Support independent AI infrastructure development

Star ORBIT


🧪 Sandbox Environment: https://orbit.schmitech.ai/

This environment is unstable and may be unavailable at any time. It is not intended for production use; it only hosts examples that show ORBIT functionality. Prompts are not retained.

🚀 Deployment Guide

There are three ways to get started with ORBIT.

Option 1: Docker (Fastest)

docker pull schmitech/orbit:basic
docker run -d --name orbit-basic -p 5173:5173 schmitech/orbit:basic

Open http://localhost:5173 in your browser and start chatting.

The Docker image includes:

  • ORBIT server (API on port 3000)
  • orbitchat web app (browser UI on port 5173)
  • Ollama with pre-pulled models
  • Pre-configured API key (no setup needed)

For more Docker options, see docker/README.md.
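
If the default host ports clash with something else on your machine, you can remap them when starting the container. A minimal sketch, assuming the image listens on 5173 for the web UI and 3000 for the API as listed above:

# Publish the web UI on host port 8080 and the API on host port 8000
docker run -d --name orbit-basic \
  -p 8080:5173 \
  -p 8000:3000 \
  schmitech/orbit:basic

Then open http://localhost:8080 instead of http://localhost:5173.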

Option 2: Download Latest Release

Download and install the latest stable release. Best for production deployments.

Prerequisites

  • Python 3.12+
  • Node.js 18+ and npm
  • AI Provider (choose one or more):
    • Local: Ollama, llama.cpp, or vLLM
    • Cloud: Your own API keys for OpenAI, Anthropic, Cohere, Gemini, Mistral, etc.
  • Optional: MongoDB, Redis, and a vector DB (Chroma, Qdrant, etc.)

1. Download and Extract Release

# Download the latest release archive
# Replace v2.2.0 with the latest version from https://github.com/schmitech/orbit/releases
curl -L https://github.com/schmitech/orbit/releases/download/v2.2.0/orbit-2.2.0.tar.gz -o orbit-2.2.0.tar.gz

tar -xzf orbit-2.2.0.tar.gz

cd orbit-2.2.0

2. Configure and Install

# Add API keys if using proprietary services like OpenAI, Cohere, Anthropic, etc.
cp env.example .env

# Install packages
./install/setup.sh

# Activate Python environment
source venv/bin/activate
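
After copying env.example, edit .env with credentials for any hosted providers you plan to use. The variable names below are illustrative only; use the exact keys defined in env.example:

# Hypothetical .env entries -- check env.example for the real variable names
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...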

3. Install Ollama and Download a Model

  1. Install Ollama (if not already installed):

    # macOS/Linux
    curl -fsSL https://ollama.com/install.sh | sh
    
    # Windows: Download from https://ollama.com/download
  2. Pull the model:

    ollama pull granite4:1b
  3. Configure Model:

    • The default model is configured as granite4:1b in config/adapters/passthrough.yaml and config/adapters/multimodal.yaml.
    • You can configure model settings in config/ollama.yaml. A quick way to confirm the model is available locally is shown below.
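
Before starting ORBIT, it is worth confirming that Ollama is running and the model was pulled. A minimal check using Ollama's own CLI and local API (port 11434 is Ollama's default, separate from ORBIT):

# The pulled model should appear in the list
ollama list

# Ollama's local HTTP API reports installed models as well
curl http://localhost:11434/api/tags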

4. Start the Server

# Start the ORBIT server
./bin/orbit.sh start 

# Check the logs
cat ./logs/orbit.log

5. Access the Dashboard

Once the server is running, open your browser and navigate to:

🖥️ Dashboard: http://localhost:3000/dashboard

The dashboard provides a visual interface to manage adapters, monitor conversations, and configure your ORBIT instance.
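
If the page does not load, a quick probe from the terminal can confirm the server is listening. This is only a sketch; the exact HTTP status returned may vary with your configuration:

# Check that the dashboard route responds on the default port
curl -I http://localhost:3000/dashboard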

Demo video (dashboard.mp4): The ORBIT Dashboard in action.

Option 3: Clone from Git (Development)

For contributing or modifying ORBIT, clone and run from source.

Prerequisites

  • Python 3.12+
  • Node.js 18+ and npm
  • Docker 20.10+ and Docker Compose 2.0+
  • AI Provider (choose one or more):
    • Local: Ollama, llama.cpp, or vLLM
    • Cloud: Your own API keys for OpenAI, Anthropic, Cohere, Gemini, Mistral, etc.
  • Optional: MongoDB, Redis, and a vector DB (Chroma, Qdrant, etc.)

1. Install ORBIT Server

# Clone the repository
git clone https://github.com/schmitech/orbit.git
cd orbit

# Add API keys if using proprietary services like OpenAI, Cohere, etc.
cp env.example .env

# Install packages
./install/setup.sh

# Activate Python environment
source venv/bin/activate

# Start the ORBIT server
./bin/orbit.sh start 

# Check the logs
tail -f ./logs/orbit.log

Note: After starting the server, you'll need to create an API key using ./bin/orbit.sh key create before you can use the chat clients.
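
A minimal sketch of that step (the exact output and any additional key-management options depend on your ORBIT version):

# Create an API key and copy it from the command output
./bin/orbit.sh key create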


💬 Chat Clients

Once your ORBIT server is running (via Docker or manual installation), you can interact with it using one of these clients:

Using the Python CLI Client

The orbit-chat Python CLI provides a terminal-based chat interface.

# Install the client from PyPI
pip install schmitech-orbit-client

orbit-chat --api-key YOUR_API_KEY

Demo video (orbit-cli-chat.mp4): Using the orbit-chat CLI. Run orbit-chat -h for options.

Using the React Web App

npm install -g orbitchat
orbitchat --api-url http://localhost:3000 --api-key YOUR_API_KEY --open

Demo video (orbit-chat-gui.mp4): Chatting with ORBIT using the React client.

Using the Embeddable Chat Widget

Add an AI chatbot to any website with the @schmitech/chatbot-widget. Supports floating and embedded modes with full theme customization.

<!-- Add to your HTML -->
<script src="https://unpkg.com/react@18/umd/react.production.min.js"></script>
<script src="https://unpkg.com/react-dom@18/umd/react-dom.production.min.js"></script>
<script src="https://unpkg.com/@schmitech/chatbot-widget@latest/dist/chatbot-widget.umd.js"></script>
<link rel="stylesheet" href="https://unpkg.com/@schmitech/chatbot-widget@latest/dist/chatbot-widget.css">

<script>
  window.addEventListener('load', function() {
    window.initChatbotWidget({
      apiUrl: 'http://localhost:3000',
      apiKey: 'YOUR_API_KEY',
      sessionId: 'session-' + Date.now(),
      widgetConfig: {
        header: { title: "ORBIT Assistant" },
        welcome: { title: "Hello! 👋", description: "How can I help you today?" }
      }
    });
  });
</script>

Demo video (theming-app.mp4): The embeddable chat widget in action. Try the theming app to customize and preview your widget.

For full configuration options, themes, and integration guides, see clients/chat-widget/README.md.

Using the Node.js SDK

ORBIT provides a native TypeScript/JavaScript client for seamless integration into Node.js, web, or mobile applications.

npm install @schmitech/chatbot-api

import { ApiClient } from '@schmitech/chatbot-api';

const client = new ApiClient({
    apiUrl: "http://localhost:3000",
    apiKey: "YOUR_API_KEY"
});

async function chat() {
    const stream = client.streamChat("How can I integrate ORBIT into my application?");
    for await (const chunk of stream) {
        process.stdout.write(chunk.text);
    }
}

chat();

Using the OpenAI Python SDK

ORBIT exposes an OpenAI-compatible /v1/chat/completions endpoint, letting you use the official openai Python library with your ORBIT server as a drop-in backend.

from openai import OpenAI

client = OpenAI(
    api_key="ORBIT_API_KEY",
    base_url="http://localhost:3000/v1"
)

completion = client.chat.completions.create(
    model="orbit",  # Value is required but ORBIT routes via your API key
    messages=[
        {"role": "system", "content": "You are a helpful ORBIT assistant."},
        {"role": "user", "content": "Summarize the latest deployment status."}
    ],
)

print(completion.choices[0].message.content)

# ORBIT-specific metadata (sources, threading info, audio, etc.) is available via completion.orbit
if completion.orbit.get("sources"):
    print("Sources:", completion.orbit["sources"])

Streaming works as well:

stream = client.chat.completions.create(
    model="orbit",
    messages=[{"role": "user", "content": "Give me an onboarding checklist."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
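
Because the endpoint follows the OpenAI wire format, a plain HTTP client works too. A hedged curl sketch, assuming the server accepts the standard OpenAI-style Authorization bearer header (adjust if your deployment is configured differently):

curl http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ORBIT_API_KEY" \
  -d '{
    "model": "orbit",
    "messages": [{"role": "user", "content": "Hello from curl"}]
  }'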

🗃️ Chat with Your Data

See the Tutorial – Set up the HR example in 5 minutes and start chatting with your data.


Support the Project

Your support keeps ORBIT independent and focused on open-source innovation.

  • ⭐ Star the repo to signal that ORBIT matters to you.
  • 📣 Share a demo, blog, or tweet so other builders discover it.
  • 🐛 Open issues and PRs; your feedback directly shapes the roadmap.

Documentation

For more detailed information, please refer to the official documentation.

License

Apache 2.0 - See LICENSE for details.
