cmdAI Terminal v1.0

A familiar and easy-to-use terminal interface for OpenAI and/or Ollama

cmdAI Terminal brings the elegance of modern chat interfaces to your terminal, with a carefully crafted dark theme, real-time streaming, and intelligent conversation management. Works seamlessly with both Ollama and OpenAI models.

📸 Screenshots

Main chat interface with sidebar navigation and streaming responses

Model selection screen: easy model switching with support for Ollama and OpenAI models


✨ Features

🎯 Core Experience

  • 🖥️ Beautiful TUI - Modern terminal interface built with Textual and a custom cmdAI dark theme
  • ⚡ Real-time Streaming - Watch AI responses appear token by token as they're generated
  • 💬 Rich Markdown - Full support for code blocks, tables, lists, headers, and more

🔄 Conversation Management

  • 📚 Persistent History - All conversations are automatically saved and resumable
  • 🗂️ Quick Navigation - Browse and switch between chats in the sidebar
  • 🗑️ Smart Deletion - Delete individual conversations (d) or clear all (Ctrl+D)
  • 📝 Auto-titling - The first message becomes the conversation title

🤖 Model Control

  • 🔀 Instant Switching - Click the model name or press Ctrl+M to change models
  • 🦙 Ollama Support - Full integration with local Ollama models
  • ⭐ OpenAI Support - Connect to OpenAI's GPT models with an API key
  • 💾 Model Persistence - Remembers your last selected model between sessions
  • 📋 Dynamic Loading - Automatically fetches available models from both providers

⌨️ Developer Friendly

  • ⚡ Keyboard Shortcuts - Efficient navigation without touching the mouse
  • 🎨 Clean, Minimalist UI - Distraction-free interface with intuitive interactions
  • 📐 Split-pane Layout - Sidebar for navigation, main area for focused chatting

🚀 Quick Start

One-Line Installation

Visit cmdai.io or run:

curl -fsSL https://cmdai.io/install.sh | bash

Alternative installation methods

Using wget:

wget -qO- https://cmdai.io/install.sh | bash

Manual Installation:

# Clone the repository
git clone https://github.com/dotcomdudee/cmdAI.git
cd cmdAI

# Install
pip install -e .

# Or install dependencies manually
pip install textual rich httpx pyyaml

Configuration

Create config.yaml in your current directory or at ~/.cmdai-terminal/config.yaml:

api:
  ollama:
    base_url: http://localhost:11434  # Your Ollama API endpoint
    timeout: 60
  openai:
    api_key: sk-your-key-here  # Optional: Add your OpenAI API key
    timeout: 60

ui:
  theme: dark
  sidebar_width: 35

storage:
  conversations_dir: ~/.cmdai-terminal/conversations

default_model: llama2  # Your preferred model

Note: Both providers are optional - you can use:

  • Only Ollama (leave api_key: null)
  • Only OpenAI (Ollama endpoint doesn't need to be available)
  • Both providers simultaneously for maximum flexibility
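
Because both providers are optional, the loader only needs to check which keys are actually present. A minimal sketch of how config.yaml might be located and inspected (illustrative only; cmdAI's config.py may differ):

# Minimal sketch: load config.yaml from the locations described above.
# Illustrative only; not taken from cmdAI's config.py.
from pathlib import Path
import yaml

CONFIG_LOCATIONS = [
    Path("config.yaml"),                                 # current directory
    Path("~/.cmdai-terminal/config.yaml").expanduser(),  # user config
]

def load_config() -> dict:
    for path in CONFIG_LOCATIONS:
        if path.is_file():
            return yaml.safe_load(path.read_text()) or {}
    return {}

config = load_config()
api = config.get("api", {})
has_openai = bool(api.get("openai", {}).get("api_key"))  # OpenAI is optional
ollama_url = api.get("ollama", {}).get("base_url", "http://localhost:11434")
print(f"OpenAI enabled: {has_openai}, Ollama endpoint: {ollama_url}")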

Run It

cmdai-terminal

# Or if not installed as a package
python3 -m cmdai_terminal
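
The module form works because the package ships an entry point in __main__.py. A minimal, illustrative sketch of such an entry point (the ChatApp name is an assumption, not the project's actual class):

# Illustrative sketch of a package entry point (cmdai_terminal/__main__.py).
# The ChatApp import is hypothetical; cmdAI's actual app class may be named differently.
from .app import ChatApp

def main() -> None:
    ChatApp().run()

if __name__ == "__main__":
    main()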

⌨️ Keyboard Shortcuts

Key       Action          Description
Ctrl+N    New Chat        Start a fresh conversation
Ctrl+M    Change Model    Open model selector
Ctrl+Q    Quit            Exit the application
Enter     Send            Send your message
Esc       Cancel          Close dialogs / cancel actions
d         Delete          Delete selected conversation
Ctrl+D    Clear All       Delete all conversations
↑/↓       Navigate        Browse conversations/options
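
Shortcuts like these map naturally onto Textual key bindings. A minimal sketch of how they could be declared (illustrative only; the ChatApp and action names are assumptions, not cmdAI's actual app.py):

# Minimal sketch: declaring keyboard shortcuts with Textual's BINDINGS.
# Class and action names are illustrative, not taken from cmdAI's source.
from textual.app import App
from textual.binding import Binding

class ChatApp(App):
    BINDINGS = [
        Binding("ctrl+n", "new_chat", "New Chat"),
        Binding("ctrl+m", "change_model", "Change Model"),
        Binding("ctrl+d", "clear_all", "Clear All"),
        Binding("ctrl+q", "quit", "Quit"),  # "quit" is a built-in App action
    ]

    def action_new_chat(self) -> None:
        # Placeholder: start a fresh conversation.
        pass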

📁 Project Structure

cmdai-terminal/
├── 📄 pyproject.toml              # Project dependencies & metadata
├── ⚙️ config.yaml                 # User configuration
├── 📚 cmdai_terminal/
│   ├── 🚀 __main__.py            # Entry point
│   ├── 🎨 app.py                 # Main Textual application & UI
│   ├── ⚙️ config.py              # Configuration management
│   ├── 🌐 api/
│   │   ├── ollama.py             # Ollama API client & streaming
│   │   ├── openai_client.py      # OpenAI API client & streaming
│   │   └── unified_client.py     # Unified client for both providers
│   ├── 🧩 components/
│   │   ├── sidebar.py            # Sidebar navigation component
│   │   ├── chat_view.py          # Chat display with markdown
│   │   └── input_box.py          # Message input component
│   ├── 📦 models/
│   │   ├── conversation.py       # Conversation data model
│   │   └── message.py            # Message data model
│   └── 💾 storage/
│       └── history.py            # Conversation persistence (JSON)
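
Conversations are stored as JSON files in the conversations directory. A minimal sketch of what that persistence might look like (file layout and field names are assumptions, not the actual schema in history.py):

# Minimal sketch of JSON conversation persistence.
# Paths and field names are assumptions, not cmdAI's actual schema.
import json
from pathlib import Path

CONV_DIR = Path("~/.cmdai-terminal/conversations").expanduser()

def save_conversation(conv_id: str, title: str, messages: list) -> None:
    """Write one conversation to <conv_id>.json under the conversations dir."""
    CONV_DIR.mkdir(parents=True, exist_ok=True)
    payload = {"id": conv_id, "title": title, "messages": messages}
    (CONV_DIR / f"{conv_id}.json").write_text(json.dumps(payload, indent=2))

def load_conversation(conv_id: str) -> dict:
    """Read a conversation back into a plain dict."""
    return json.loads((CONV_DIR / f"{conv_id}.json").read_text())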

🔌 API Compatibility

cmdAI Terminal works with both Ollama and OpenAI API endpoints. Support for additional providers, such as Claude and Gemini, is planned.

Ollama API

Endpoint      Method   Purpose
/api/tags     GET      List available models
/api/chat     POST     Chat completions with streaming

Request Format:

{
  "model": "llama2",
  "messages": [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi! How can I help?"}
  ],
  "stream": true
}

Response Format: NDJSON with {"message": {"content": "token"}}
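
A minimal sketch of consuming this NDJSON stream with httpx (an illustration of the protocol above, not cmdAI's actual ollama.py):

# Minimal sketch: stream tokens from Ollama's /api/chat NDJSON response.
# Illustrative only; not taken from cmdAI's ollama.py.
import json
import httpx

def stream_ollama(prompt, model="llama2", base_url="http://localhost:11434"):
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }
    with httpx.stream("POST", f"{base_url}/api/chat", json=payload, timeout=60) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():  # one JSON object per line (NDJSON)
            if not line:
                continue
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            yield chunk["message"]["content"]

for token in stream_ollama("Hello!"):
    print(token, end="", flush=True)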

OpenAI API

Endpoint                Method   Purpose
/v1/models              GET      List available models
/v1/chat/completions    POST     Chat completions with streaming

Request Format:

{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true
}

Response Format: SSE with data: {"choices": [{"delta": {"content": "token"}}]}
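
A minimal sketch of parsing that SSE stream with httpx (illustrative only, not cmdAI's actual openai_client.py):

# Minimal sketch: stream tokens from OpenAI's /v1/chat/completions SSE response.
# Illustrative only; not taken from cmdAI's openai_client.py.
import json
import os
import httpx

def stream_openai(prompt, model="gpt-4o", api_key=None):
    api_key = api_key or os.environ["OPENAI_API_KEY"]
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }
    headers = {"Authorization": f"Bearer {api_key}"}
    with httpx.stream("POST", "https://api.openai.com/v1/chat/completions",
                      json=payload, headers=headers, timeout=60) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line.startswith("data: "):
                continue
            data = line[len("data: "):]
            if data == "[DONE]":  # end-of-stream sentinel
                break
            delta = json.loads(data)["choices"][0]["delta"]
            if "content" in delta:
                yield delta["content"]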

Model Selection

  • Ollama models: Display as 🦙 model-name (e.g., 🦙 llama2)
  • OpenAI models: Display as ⭐ openai/model-name (e.g., ⭐ openai/gpt-4o)
  • The app automatically routes to the correct provider based on the model prefix, as sketched below
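
A minimal sketch of that prefix-based routing (the function name is illustrative; cmdAI's unified_client.py may differ):

# Minimal sketch of prefix-based provider routing.
# The function name is illustrative, not taken from cmdAI's source.
def route_model(selected: str):
    """Return (provider, model_name) for a selected model string."""
    if selected.startswith("openai/"):
        return "openai", selected.split("/", 1)[1]
    return "ollama", selected

assert route_model("openai/gpt-4o") == ("openai", "gpt-4o")
assert route_model("llama2") == ("ollama", "llama2")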

🐛 Troubleshooting

🔴 Cannot connect to Ollama API

Symptoms: Ollama models not loading

Solutions:

  1. ✅ Verify your API is running: curl http://localhost:11434/api/tags
  2. ✅ Check config.yaml has the correct ollama.base_url
  3. ✅ Ensure firewall allows the connection
  4. ✅ Try increasing the timeout value in config

🔴 Cannot connect to OpenAI API

Symptoms: OpenAI models not loading or showing errors

Solutions:

  1. ✅ Verify your API key is valid: Check at https://platform.openai.com/api-keys
  2. ✅ Ensure api_key is set in config.yaml under api.openai.api_key
  3. ✅ Check you have credits/quota available in your OpenAI account
  4. ✅ Try increasing the timeout value in config

🔴 Models not loading

Symptoms: Only seeing the default model, no model list

Solutions:

  1. ✅ Check both provider configurations in config.yaml
  2. ✅ Verify at least one provider is properly configured
  3. ✅ The app falls back to a default model list if both APIs are unavailable
  4. ✅ Look for error messages in the terminal output

🔴 Conversations not saving

Symptoms: Chats disappear after closing app

Solutions:

  1. ✅ Check directory exists: ls ~/.cmdai-terminal/conversations/
  2. ✅ Verify write permissions: touch ~/.cmdai-terminal/test
  3. ✅ Use proper home expansion in config (~/.cmdai-terminal/...)

🔴 Streaming not working

Symptoms: Messages appear all at once instead of token by token

Solutions:

  1. ✅ Verify the API supports streaming responses
  2. ✅ Check that the network isn't buffering responses
  3. ✅ Ensure stream: true in the API request

🤝 Contributing

Contributions are welcome! Feel free to:

  • 🐛 Report bugs
  • 💡 Suggest features
  • 🔧 Submit pull requests
  • 📖 Improve documentation

📄 License

MIT License - Feel free to use this project however you'd like!


⭐ Show Your Support

If you find cmdAI Terminal useful, please consider:

  • ⭐ Starring the repository
  • 🐛 Reporting bugs
  • 💡 Suggesting new features
  • 📢 Sharing with others
