🧠 Codex OpenLLM CLI

A blazing-fast, terminal-first AI coding assistant powered by Ollama, with full offline model support. Built as a simplified alternative to the OpenAI Codex CLI, this tool provides natural language coding capabilities via locally running LLMs like mistral, llama2, codellama, and more.

🚀 Features

  • 🤖 Local model support via Ollama
  • 💻 Terminal-based conversational interface
  • 📁 Full-context mode: loads the entire project directory into the model's context
  • 🧠 Single-pass edit mode (experimental)
  • 🔧 Easy to extend with your own tools and workflows
  • 🛠️ Shell command execution support (opt-in)

⚠️ Note: This project is still in active development. Expect breaking changes. Contributions are welcome!

📦 Requirements

  • Node.js v18+
  • TypeScript (the examples below run the CLI with tsx)
  • Ollama with at least one supported model already installed (e.g., mistral, codellama, deepseek-coder); see the example below
  • Git (optional but recommended for context-aware operations)
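
If no model is installed yet, you can pull one with Ollama's own CLI first (the model name below is only an example):

```bash
# Download a code-oriented model; substitute any model your machine can run.
ollama pull codellama

# Verify which models are available locally.
ollama list
```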

📥 Installation

git clone https://github.com/moulish-dev/codex-openllm-cli.git
cd codex-openllm-cli
npm install

Make cli.tsx executable if you want to run it directly from the terminal:

chmod +x src/cli.tsx
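
If tsx is not installed globally, the CLI can also be invoked through npx (this assumes tsx is available as a project dependency, which the usage examples below rely on):

```bash
# Run the CLI without a global tsx install.
npx tsx src/cli.tsx "Explain what this repository does"
```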

🧪 Usage

Run a Single Prompt

tsx src/cli.tsx "Write a Python function to reverse a string"

Full Context Mode

Load an entire codebase into memory and let the model reason about it and edit it in a single pass:

tsx src/cli.tsx --full-context "Convert all async functions to regular functions"

Quiet Mode

Great for scripting and CI:

tsx src/cli.tsx -q "Fix broken imports"
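
A minimal sketch of wiring quiet mode into a script, assuming the model's reply is written to stdout (the file name is illustrative):

```bash
#!/usr/bin/env bash
# Capture the suggestion non-interactively for later review.
tsx src/cli.tsx -q "Fix broken imports" > suggested-fix.txt

# Always inspect the output before applying anything.
cat suggested-fix.txt
```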

Model Selection

You can specify a local Ollama model:

tsx src/cli.tsx -m deepseek-coder "Refactor this script"
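
The flags above should also compose, e.g. a specific model together with quiet mode (an untested combination, shown only as a sketch):

```bash
tsx src/cli.tsx -m deepseek-coder -q "Refactor this script"
```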

⚠️ Warnings & Notes

  • ⚠️ LLMs may hallucinate. Always verify generated code before use.
  • ⚠️ Shell commands can be dangerous. The CLI may suggest terminal commands; always review them before executing.
  • 🚫 No sandbox by default. Commands will run directly if you approve them. Use with caution.
  • 🖼️ Image support is disabled. For simplicity and compatibility, image inputs are not currently used.
  • 🧪 Full-context mode is experimental. It can consume a large amount of memory on big repositories; use it wisely.

🧠 Inspirations

This CLI is inspired by the OpenAI Codex CLI but rebuilt to run entirely on local LLMs using Ollama.

🙌 Contributing

Pull requests are welcome. Open an issue first if you’d like to suggest a major feature or change.

📄 License

See the LICENSE file in this repository.


🚧 Development Roadmap

✅ Completed

  • Removed OpenAI API key dependency
  • Integrated Ollama client (sketched below)
  • Stubbed out model checks
  • CLI prints output with --full-context
  • Refactored AgentLoop.ts to remove OpenAI dependency
  • Defined and used custom ResponseItem types
  • Adapted UI handlers to work with raw streaming data

🔜 Planned

  • Add model selector UI
  • Add image input support (optional)
  • Implement persistent prompt history
  • Plugin architecture for custom workflows
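
To make the completed Ollama integration and the custom ResponseItem types more concrete, here is a hypothetical TypeScript sketch. It is not this project's actual code: the type shape, function name, and field mapping are illustrative, and only Ollama's documented /api/generate streaming endpoint is assumed.

```ts
// Hypothetical sketch: stream a completion from a local Ollama server and
// map each chunk into a custom ResponseItem. The real types in AgentLoop.ts
// may differ; only the Ollama HTTP API shape is taken from Ollama's docs.

type ResponseItem =
  | { kind: "text"; content: string } // one streamed fragment of the reply
  | { kind: "done" };                 // end-of-response marker

async function* streamOllama(
  model: string,
  prompt: string
): AsyncGenerator<ResponseItem> {
  // Ollama listens on localhost:11434 by default; /api/generate streams
  // newline-delimited JSON objects when stream is true.
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: true }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Each complete line in the buffer is one JSON chunk from Ollama.
    let nl: number;
    while ((nl = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, nl).trim();
      buffer = buffer.slice(nl + 1);
      if (!line) continue;
      const chunk = JSON.parse(line);
      if (chunk.response) yield { kind: "text", content: chunk.response };
      if (chunk.done) yield { kind: "done" };
    }
  }
}
```

A caller (for example, a UI handler adapted to raw streaming data) could then render items as they arrive with `for await (const item of streamOllama("mistral", "..."))`.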
