# cmdAI Terminal

A familiar, easy-to-use terminal interface for OpenAI and Ollama.
cmdAI Terminal brings the elegance of modern chat interfaces to your terminal, with a carefully crafted dark theme, real-time streaming, and intelligent conversation management. Works seamlessly with both Ollama and OpenAI models.
*Main chat interface with sidebar navigation and streaming responses*

*Easy model switching with support for Ollama and OpenAI models*
## Features

- **Beautiful TUI** - Modern terminal interface built with Textual and a custom cmdAI dark theme
- **Real-time Streaming** - Watch AI responses appear token by token as they're generated
- **Rich Markdown** - Full support for code blocks, tables, lists, headers, and more
- **Persistent History** - All conversations automatically saved and resumable
- **Quick Navigation** - Browse and switch between chats in the sidebar
- **Smart Deletion** - Delete individual conversations (`d`) or clear all (`Ctrl+D`)
- **Auto-titling** - First message becomes the conversation title
- **Instant Switching** - Click the model name or press `Ctrl+M` to change models
- **Ollama Support** - Full integration with local Ollama models
- **OpenAI Support** - Connect to OpenAI's GPT models with an API key
- **Model Persistence** - Remembers your last selected model between sessions
- **Dynamic Loading** - Automatically fetches available models from both providers
- **Keyboard Shortcuts** - Efficient navigation without touching the mouse
- **Clean, Minimalist UI** - Distraction-free interface with intuitive interactions
- **Split-pane Layout** - Sidebar for navigation, main area for focused chatting
## Installation

Visit cmdai.io or run:

```bash
curl -fsSL https://cmdai.io/install.sh | bash
```

### Alternative installation methods

Using wget:

```bash
wget -qO- https://cmdai.io/install.sh | bash
```

Manual installation:

```bash
# Clone the repository
git clone https://github.com/dotcomdudee/cmdAI.git
cd cmdAI

# Install
pip install -e .

# Or install dependencies manually
pip install textual rich httpx pyyaml
```

## Configuration

Create `config.yaml` in your current directory or at `~/.cmdai-terminal/config.yaml`:
```yaml
api:
  ollama:
    base_url: http://localhost:11434  # Your Ollama API endpoint
    timeout: 60
  openai:
    api_key: sk-your-key-here  # Optional: add your OpenAI API key
    timeout: 60

ui:
  theme: dark
  sidebar_width: 35

storage:
  conversations_dir: ~/.cmdai-terminal/conversations

default_model: llama2  # Your preferred model
```

Note: both providers are optional. You can use:

- Only Ollama (leave `api_key: null`)
- Only OpenAI (the Ollama endpoint doesn't need to be available)
- Both providers simultaneously for maximum flexibility
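The two config locations above are searched in order, current directory first. A minimal stdlib-only sketch of that lookup; `find_config` is an illustrative helper name, not part of the cmdAI Terminal API:

```python
from pathlib import Path

def find_config(search_paths=None):
    """Return the first existing config file, or None if no config is found."""
    if search_paths is None:
        # Assumed lookup order: project-local config wins over the user-level one.
        search_paths = [
            Path("config.yaml"),
            Path("~/.cmdai-terminal/config.yaml").expanduser(),
        ]
    for path in search_paths:
        if path.is_file():
            return path
    return None
```

If neither file exists, the app would fall back to its built-in defaults.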
## Usage

```bash
cmdai-terminal

# Or if not installed as a package
python3 -m cmdai_terminal
```

## Keyboard Shortcuts

| Key | Action | Description |
|---|---|---|
| Ctrl+N | New Chat | Start a fresh conversation |
| Ctrl+M | Change Model | Open model selector |
| Ctrl+Q | Quit | Exit the application |
| Enter | Send | Send your message |
| Esc | Cancel | Close dialogs/cancel actions |
| d | Delete | Delete selected conversation |
| Ctrl+D | Clear All | Delete all conversations |
| ↑/↓ | Navigate | Browse conversations/options |
## Project Structure

```
cmdai-terminal/
├── pyproject.toml            # Project dependencies & metadata
├── config.yaml               # User configuration
└── cmdai_terminal/
    ├── __main__.py           # Entry point
    ├── app.py                # Main Textual application & UI
    ├── config.py             # Configuration management
    ├── api/
    │   ├── ollama.py         # Ollama API client & streaming
    │   ├── openai_client.py  # OpenAI API client & streaming
    │   └── unified_client.py # Unified client for both providers
    ├── components/
    │   ├── sidebar.py        # Sidebar navigation component
    │   ├── chat_view.py      # Chat display with markdown
    │   └── input_box.py      # Message input component
    ├── models/
    │   ├── conversation.py   # Conversation data model
    │   └── message.py        # Message data model
    └── storage/
        └── history.py        # Conversation persistence (JSON)
```
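The storage layer persists each conversation as a JSON file. A minimal sketch of the idea; the function names and on-disk layout here are illustrative assumptions, not the actual `history.py` API:

```python
import json
from pathlib import Path

# Illustrative sketch only; the real storage/history.py may differ.
def save_conversation(conv_id, title, messages,
                      root=Path("~/.cmdai-terminal/conversations")):
    """Write one conversation to <root>/<conv_id>.json and return the path."""
    root = Path(root).expanduser()
    root.mkdir(parents=True, exist_ok=True)
    path = root / f"{conv_id}.json"
    path.write_text(json.dumps(
        {"id": conv_id, "title": title, "messages": messages}, indent=2))
    return path

def load_conversation(path):
    """Read a conversation back from disk."""
    return json.loads(Path(path).read_text())
```

One file per conversation keeps deletion trivial (`d` removes one file, `Ctrl+D` clears the directory).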
## API Compatibility

cmdAI Terminal works with both Ollama and OpenAI API endpoints. Support for more providers, such as Claude and Gemini, is planned.
### Ollama

| Endpoint | Method | Purpose |
|---|---|---|
| `/api/tags` | GET | List available models |
| `/api/chat` | POST | Chat completions with streaming |

Request format:

```json
{
  "model": "llama2",
  "messages": [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi! How can I help?"}
  ],
  "stream": true
}
```

Response format: NDJSON, one object per line, with tokens under `{"message": {"content": "token"}}`.
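Consuming this stream amounts to reading the response line by line and extracting each token. A stdlib-only sketch of the parsing step (in the app itself the HTTP side is handled by httpx; `extract_ollama_token` is an illustrative name):

```python
import json

def extract_ollama_token(line: str) -> str:
    """Extract the content token from one NDJSON line of /api/chat output."""
    line = line.strip()
    if not line:
        return ""  # skip blank lines between objects
    payload = json.loads(line)
    return payload.get("message", {}).get("content", "")

# Two lines shaped like the documented response format above:
stream = [
    '{"message": {"content": "Hel"}, "done": false}',
    '{"message": {"content": "lo!"}, "done": true}',
]
reply = "".join(extract_ollama_token(line) for line in stream)
print(reply)  # Hello!
```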
### OpenAI

| Endpoint | Method | Purpose |
|---|---|---|
| `/v1/models` | GET | List available models |
| `/v1/chat/completions` | POST | Chat completions with streaming |

Request format:

```json
{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true
}
```

Response format: SSE, with tokens under `data: {"choices": [{"delta": {"content": "token"}}]}`.
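The SSE framing differs from Ollama's NDJSON: each event is prefixed with `data: ` and the stream ends with a `[DONE]` sentinel. A stdlib-only sketch of the per-line parsing (`extract_openai_token` is an illustrative name):

```python
import json

def extract_openai_token(line: str) -> str:
    """Extract the content token from one SSE line of /v1/chat/completions output."""
    if not line.startswith("data: "):
        return ""  # ignore blank keep-alive lines and SSE comments
    data = line[len("data: "):].strip()
    if data == "[DONE]":
        return ""  # end-of-stream sentinel carries no token
    delta = json.loads(data)["choices"][0].get("delta", {})
    return delta.get("content", "")

# A stream shaped like the documented response format above:
stream = [
    'data: {"choices": [{"delta": {"content": "Hi"}}]}',
    'data: {"choices": [{"delta": {"content": " there"}}]}',
    "data: [DONE]",
]
print("".join(extract_openai_token(line) for line in stream))  # Hi there
```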
### Model naming

- Ollama models display as `model-name` (e.g., `llama2`)
- OpenAI models display as `openai/model-name` (e.g., `openai/gpt-4o`)
- The app automatically routes to the correct provider based on the model prefix
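The prefix-based routing can be sketched as follows; `resolve_provider` is an illustrative helper, not necessarily the name used in `unified_client.py`:

```python
def resolve_provider(model: str):
    """Split a display name into (provider, api_model_name)."""
    if model.startswith("openai/"):
        # Strip the prefix before sending the name to the OpenAI API.
        return "openai", model[len("openai/"):]
    # Anything without a provider prefix is assumed to be a local Ollama model.
    return "ollama", model

print(resolve_provider("openai/gpt-4o"))  # ('openai', 'gpt-4o')
print(resolve_provider("llama2"))         # ('ollama', 'llama2')
```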
## Troubleshooting

### Ollama connection issues

Symptoms: Ollama models not loading

Solutions:

- Verify your API is running: `curl http://localhost:11434/api/tags`
- Check `config.yaml` has the correct `ollama.base_url`
- Ensure the firewall allows the connection
- Try increasing the `timeout` value in the config
### OpenAI connection issues

Symptoms: OpenAI models not loading or showing errors

Solutions:

- Verify your API key is valid at https://platform.openai.com/api-keys
- Ensure `api_key` is set in `config.yaml` under `api.openai.api_key`
- Check you have credits/quota available in your OpenAI account
- Try increasing the `timeout` value in the config
### Models not appearing

Symptoms: Only the default model is shown, with no model list

Solutions:

- Check both provider configurations in `config.yaml`
- Verify at least one provider is properly configured
- The app will use fallback models if both APIs are unavailable
- Look for error messages in the terminal output
### Conversations not persisting

Symptoms: Chats disappear after closing the app

Solutions:

- Check the directory exists: `ls ~/.cmdai-terminal/conversations/`
- Verify write permissions: `touch ~/.cmdai-terminal/test`
- Use proper home expansion in the config (`~/.cmdai-terminal/...`)
### Streaming not working

Symptoms: Messages appear all at once instead of token by token

Solutions:

- Verify the API supports streaming responses
- Check the network isn't buffering responses
- Ensure `stream: true` is set in the API request
## Contributing

Contributions are welcome! Feel free to:

- Report bugs
- Suggest features
- Submit pull requests
- Improve documentation
## License

MIT License - feel free to use this project however you'd like!
## Support

If you find cmdAI Terminal useful, please consider:

- Starring the repository
- Reporting bugs
- Suggesting new features
- Sharing with others