Delta is an intelligent shell wrapper that enhances your command-line experience with AI-powered suggestions, encrypted command history, and seamless shell compatibility.
- Universal Shell Compatibility: Works with bash, zsh, and fish, and preserves your existing shell functions and aliases
- Multilingual Support: Available in 6 languages with runtime language switching (🆕 v0.1.0-alpha)
- AI-Powered Suggestions: Context-aware predictions and insights using local Ollama models
- Secure Command History: Encrypted storage with privacy filtering
- Advanced Memory System: Learn from your command patterns and improve over time
- Custom Model Training: Train personalized models on your command history
- Smart Navigation: Quick directory jumping and path completion
- Vector Search: Fast semantic search through your command history
Download the latest release from GitHub Releases:
# Download the latest release (replace VERSION with actual version like v0.1.0-alpha)
curl -L -o delta-VERSION-linux-amd64.tar.gz \
https://github.com/DeltaCLI/Delta/releases/download/VERSION/delta-VERSION-linux-amd64.tar.gz
# Extract the binary
tar -xzf delta-VERSION-linux-amd64.tar.gz
# Make executable and install
chmod +x delta-linux-amd64
sudo mv delta-linux-amd64 /usr/local/bin/delta
# Verify installation
echo ":help" | delta

Alternatively, build Delta from source:

# Clone the repository
git clone https://github.com/DeltaCLI/Delta.git
cd Delta
# Build the application
make build
# Install to your system
make install

Requirements:
- Go 1.16 or higher
- Ollama (for AI features)
- SQLite with vector extensions (automatically handled)
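Before building, you can confirm the toolchain is in place (these checks are just a convenience, not part of Delta itself):

# Verify Go is installed and meets the minimum version
go version
# Verify Ollama is available for the AI features
ollama --version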
Simply run delta to start the interactive shell:
delta

Built-in commands:
- exit or quit - Exit Delta
- sub - Enter subcommand mode
- end - Exit subcommand mode
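A minimal session might look like this (prompts and output omitted; the commands are illustrative):

# Start Delta and run a normal shell command
delta
ls -la
# Enter subcommand mode, then return to the normal prompt
sub
end
# Leave Delta
exit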
Delta uses a colon (:) prefix for internal commands (similar to Vim):
- :ai on - Enable AI suggestions
- :ai off - Disable AI suggestions
- :ai status - Check if AI suggestions are enabled
- :i18n - Internationalization commands (🆕 v0.1.0-alpha)
- :i18n locale zh-CN - Switch to Chinese (🆕 v0.1.0-alpha)
- :i18n list - List available languages (🆕 v0.1.0-alpha)
- :memory - Memory system commands
- :tokenizer - Command tokenization utilities
- :help - Show all available commands
- :jump <location> - Quick directory navigation
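For example (the location passed to :jump is a hypothetical name, standing in for one of your own directories):

# Show all available internal commands
:help
# Jump to a saved location
:jump projects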
Delta CLI now supports multiple languages with runtime switching:
# List available languages
:i18n list
# Switch to Chinese (Simplified)
:i18n locale zh-CN
# Switch to Spanish
:i18n locale es
# Switch back to English
:i18n locale en
# Show i18n status
:i18n

Supported Languages:
- English (en) - Default
- 䏿–‡ç®€ä½“ (zh-CN) - Chinese Simplified
- Español (es) - Spanish
- Français (fr) - French
- Italiano (it) - Italian
- Nederlands (nl) - Dutch
Delta includes AI-powered contextual suggestions using Ollama with the llama3.3:8b model.
Requirements:
- Ollama installed and running locally
- The llama3.3:8b model pulled (ollama pull llama3.3:8b)
The AI analyzes your recent commands and displays a single line of "thinking" above the prompt. This provides contextual insights or suggestions based on your work. All processing happens locally via Ollama, ensuring your command data never leaves your machine.
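A minimal setup sketch, assuming a stock Ollama install (skip the serve step if Ollama already runs as a background service):

# Start the Ollama server if it is not already running
ollama serve &
# Download the model Delta expects (one-time)
ollama pull llama3.3:8b
# Then, inside Delta, enable suggestions and confirm they are active
:ai on
:ai status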
Delta includes a sophisticated memory and learning system that can remember your command history and learn from your usage patterns over time.
Delta can safely store your command history with privacy filtering:
# Enable memory collection
:memory enable
# Check memory status
:memory status
# View detailed memory statistics
:memory stats
# List available data shards
:memory list
# Export data for a specific date
:memory export YYYY-MM-DD

Before training, commands need to be tokenized:
# Check tokenizer status
:tokenizer status
# Process command history into training data
:tokenizer process
# Test tokenization on a sample command
:tokenizer test "git commit -m 'Update README'"

Delta supports training custom models on your command history using Docker.

Requirements:
- Docker and Docker Compose installed
- NVIDIA GPU with CUDA support (optional, but recommended)
- NVIDIA Container Toolkit installed (for GPU support)
Training workflow:

1. Collect Command Data: Use Delta CLI regularly with memory collection enabled.

2. Process Command Data: Convert raw commands to training data:

   :tokenizer process

3. Start Training: Launch the Docker training environment:

   :memory train start

4. Configure Training (Optional): Modify training parameters in ~/.config/delta/training/docker-compose.yml:

   environment:
     - MODEL_SIZE=small   # small, medium, or large
     - BATCH_SIZE=32      # batch size per GPU
     - MAX_ITERS=30000    # maximum training iterations

5. Monitor Training: Training logs are stored in ~/.config/delta/training/logs.
Training automatically uses all available GPUs via distributed training. The resulting model is saved to ~/.config/delta/memory/models and used by Delta's AI system.
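To sanity-check the GPU setup and follow progress (the log file names under logs/ are an assumption):

# Confirm Docker can see the GPUs (requires the NVIDIA Container Toolkit)
docker run --rm --gpus all ubuntu nvidia-smi
# Follow training output as it is written
tail -f ~/.config/delta/training/logs/*.log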
To build Delta from source for development:

Requirements:
- Go 1.16 or higher
- Make
# Build
make build
# Run without installing
make run
# Install to system
make install

Delta uses your existing shell's configuration files (.bashrc, .zshrc, etc.) for compatibility with your customized environment.
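For example, an alias defined in your rc file keeps working inside Delta (a minimal sketch, assuming a bash environment):

# In ~/.bashrc
alias gs='git status'
# Inside Delta, the alias resolves exactly as it does in bash
gs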
DeltaCLI is supported by continued investment from Source Parts Inc. (https://source.parts / https://sourceparts.eu).
MIT License
Copyright (c) 2025 Source Parts Inc.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
This project does not grant GitHub or any other party permission to train on the source code contained herein without explicit consent.