Snow CLI is an AI-powered command-line tool that brings advanced AI capabilities directly into your terminal. It provides lightweight access to multiple AI models, giving you the most direct path from your prompt to powerful AI assistance.
English | 中文
- π― Multi-Model Support: Compatible with OpenAI, Anthropic, Gemini, and any OpenAI-compatible APIs
- π§ Built-in Tools: File operations, shell commands, web fetching, and search capabilities
- π Extensible: MCP (Model Context Protocol) support for custom integrations
- π» Terminal-First: Designed for developers who live in the command line
- π‘οΈ Open Source: Fully open source and community-driven
- π¦ IDE Integration: VSCode and JetBrains plugins for seamless workflow
- Query and edit large codebases with AI assistance
- Generate new applications from natural language descriptions
- Debug issues and troubleshoot with intelligent suggestions
- Multi-file context awareness for better code understanding
- Automate operational tasks with AI-powered workflows
- Use MCP servers to connect new capabilities and tools
- Run non-interactively in scripts for workflow automation
- IDE integration for seamless development experience
- Multiple Configuration Profiles: Switch between different API and model configurations
- Conversation Checkpointing: Save and resume complex sessions with `/resume`
- Custom System Prompts: Tailor AI behavior for your specific needs
- File Snapshots: Automatic rollback capability for AI-made changes
- Yolo Mode: Unattended execution for trusted operations
- Token Caching: Optimize token usage with intelligent caching
- Node.js version 16 or higher
- npm >= 8.3.0
- macOS, Linux, or Windows
Check your Node.js version first:
```bash
node --version
```
If your version is below 16.x, please upgrade:
```bash
# Using nvm (recommended)
nvm install 16
nvm use 16

# Or download from the official website
# https://nodejs.org/
```
Install Snow CLI globally from npm:
```bash
npm install -g snow-ai
```
Or install from source:
```bash
git clone https://github.com/MayDay-wpf/snow-cli
cd snow-cli
npm install
npm run link     # builds and globally links `snow`
# to remove the link later: npm run unlink
```
To install the VSCode extension:
- Download snow-cli-x.x.x.vsix
- Open VSCode, click Extensions → Install from VSIX... → select the downloaded file
- Download JetBrains plugins
- Follow JetBrains plugin installation instructions
After installing Snow CLI and the editor extension/plugin, start Snow CLI in your terminal.
```bash
snow
snow --update
snow --version
snow -c
```
Snow CLI supports multiple AI providers and allows you to save multiple configuration profiles. From v0.3.2 onward, the bundled vendor SDKs were removed to keep the tool lightweight, so everything is configured through API & Model Settings.
After starting Snow CLI, enter API & Model Settings to configure:
- Profile: Switch or create new configurations for different API setups
- Base URL: Request endpoint for your AI provider (see the examples after this list)
  - OpenAI/Anthropic: Requires `/v1` suffix
  - Gemini: Requires `/v1beta` suffix
- API Key: Your API key for authentication
- Request Method: Choose one of:
- Chat Completions - OpenAI-Compatible API
- Responses - OpenAI's Responses API (Codex CLI)
- Gemini - Google Gemini API
- Anthropic - Anthropic Claude API
- Anthropic Beta: Enable beta features for Anthropic requests
- Model Configuration:
- Advanced Model: High-performance model for complex tasks
- Basic Model: Smaller model for summarization
- Compact Model: Efficient model for context compression
- All three model slots share the configured Base URL and API Key. Snow auto-fetches available models from the `/models` endpoint (with filtering); use Manual Input to specify a model name when the provider's list is incomplete.
- Max Context Tokens: Model's maximum context window (e.g., 1000000 for Gemini). This only affects UI calculations for context percentage and does not change the actual model context.
- Max Tokens: Maximum tokens per response (added to API requests)
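As a concrete illustration of the Base URL suffix rules above, these are the shapes the common hosted endpoints take. The values go into the Base URL field; the shell variable names below are just labels for this example, not documented Snow settings:

```bash
# Well-known public endpoints that satisfy the suffix rules above;
# replace the host with your own provider or proxy as needed.
OPENAI_COMPATIBLE_BASE_URL="https://api.openai.com/v1"
ANTHROPIC_BASE_URL="https://api.anthropic.com/v1"
GEMINI_BASE_URL="https://generativelanguage.googleapis.com/v1beta"
```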
After configuring, click Start to open the conversation view. When launched from VSCode or other editors, Snow automatically connects to the IDE via the Snow CLI plugin and shows a connection message.
- File Selection: Use `@` to select files for context
  - In VSCode: Hold `Shift` and drag files for quick selection
- Slash Commands: Use `/` to access built-in commands
  - `/init` - Build project documentation (AGENTS.md)
  - `/clear` - Create a new session
  - `/resume` - Restore conversation history
  - `/mcp` - Check MCP connection status and reconnect
  - `/yolo` - Unattended mode (auto-approve all tool calls; use with caution)
  - `/ide` - Manually connect to IDE
  - `/compact` - Compress context (use sparingly)
- Windows: `Alt+V` - Paste image
- macOS/Linux: `Ctrl+V` - Paste image (with prompt)
- `Ctrl+L` - Clear input from cursor to left
- `Ctrl+R` - Clear input from cursor to right
- `Shift+Tab` - Toggle Yolo mode
- `ESC` - Stop AI generation
- Double-click `ESC` - Rollback conversation with file checkpoints
The input area shows real-time token statistics:
- Context usage percentage
- Total token count
- Cache hit tokens
- Cache creation tokens
Configure system proxy and search engine preferences:
- Automatic system proxy detection (usually no changes needed)
- Browser selection for web search (Edge/Chrome auto-detected unless you changed installation paths)
- Custom proxy port configuration
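If automatic detection does not pick up your proxy, one common workaround (an assumption here, not a documented Snow feature) is to export the standard proxy environment variables before launching Snow, matching the custom proxy port configured above:

```bash
# Standard proxy environment variables honored by much Node.js tooling;
# whether Snow reads them directly is an assumption - the port should
# match the custom proxy port set in Snow's settings.
export HTTP_PROXY="http://127.0.0.1:7890"
export HTTPS_PROXY="http://127.0.0.1:7890"
snow
```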
Customize AI behavior with your own system prompts:
- Supplements (does not replace) Snow's built-in prompt; the default prompt is downgraded to a user message and appended to your first user message
- Opens the system text editor for editing (Notepad on Windows; default terminal editor on macOS/Linux)
- Requires restart after saving (shows: `Custom system prompt saved successfully! Please use 'snow' to restart!`)
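Because the prompt is stored as plain text under `~/.snow/` (see the file layout below), you can also view or edit it outside of Snow. The restart requirement above suggests the file is read on startup, so a sketch like the following should be equivalent to using the in-app editor:

```bash
# View the current custom system prompt
cat ~/.snow/system-prompt.txt

# Edit it with your preferred editor instead of the default one
${EDITOR:-nano} ~/.snow/system-prompt.txt

# Restart Snow so the updated prompt is picked up
snow
```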
Add custom HTTP headers to API requests:
- Extends default headers (cannot override built-in headers)
- Useful for custom authentication or routing
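For illustration, a minimal `~/.snow/custom-headers.json` might map header names to values as sketched below. The flat name/value layout and the header names are assumptions, so check the file Snow generates (or the in-app editor) for the authoritative format:

```bash
# Hypothetical example - the header names and the flat JSON layout are assumptions
cat > ~/.snow/custom-headers.json <<'EOF'
{
  "X-Request-Source": "snow-cli",
  "X-Team-Id": "your-team-id"
}
EOF
```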
Configure Model Context Protocol servers:
- JSON format compatible with Cursor
- Extends Snow CLI with custom tools and capabilities
- Same editing workflow as system prompts
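Since the format is Cursor-compatible, a minimal `~/.snow/mcp-config.json` could look like the sketch below; the server name and command are placeholders, and you can just as well edit the file from Snow's settings menu:

```bash
# Cursor-style mcpServers layout; "filesystem" and the npx package are placeholder examples
cat > ~/.snow/mcp-config.json <<'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    }
  }
}
EOF
```

After editing, use `/mcp` inside Snow to check the connection status and reconnect.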
All Snow CLI files are stored in `~/.snow/`:
```
.snow/
├── log/                   # Runtime logs (local only, safe to delete)
├── profiles/              # Multiple API/model configurations
├── sessions/              # Conversation history for /resume
├── snapshots/             # File backups for rollback
├── todo/                  # Persisted todo lists
├── active-profile.txt     # Current active profile
├── config.json            # Main API configuration
├── custom-headers.json    # Custom request headers
├── mcp-config.json        # MCP service configuration
└── system-prompt.txt      # Custom system prompt
```
- Logs: Local-only runtime logs; safe to delete for cleanup
- Sessions: Stored locally and required for conversation history features like `/resume`
- Snapshots: Automatic file checkpoints that enable rollback functionality
- Todo: Persists tasks so they survive unexpected exits
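Because everything lives under `~/.snow/`, routine maintenance is plain file management; for example:

```bash
# Logs are local-only and safe to clear
rm -rf ~/.snow/log/*

# See which profile is currently active
cat ~/.snow/active-profile.txt

# Back up profiles and sessions before reinstalling or moving machines
tar czf snow-backup.tar.gz -C ~ .snow/profiles .snow/sessions
```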
We welcome contributions! Snow CLI is fully open source, and we encourage the community to:
- Report bugs and suggest features
- Improve documentation
- Submit code improvements
- Share your MCP servers and extensions
Visit our GitHub repository to get started.
- GitHub Repository - Source code and issues
- NPM Package - Package registry
- Releases - Download IDE extensions
- License: Open source (License Type TBC)
- Privacy: All data stored locally, no telemetry
Built with ❤️ by the open source community