Seek your answers 💯% LOCALLY within VSCode
LocalSeek is a powerful, privacy-first AI chat extension for Visual Studio Code that brings conversational AI directly to your development environment - completely locally. Chat with your code, leverage your knowledge base, and get AI assistance without ever leaving your editor or compromising your privacy.
- Chat with AI models through Ollama without sending your data anywhere
- Choose between sidebar panel or standalone window
- Watch responses stream in real-time with full markdown rendering and syntax highlighting
- Switch between different models instantly
- Index your files to give AI context about your project
- Toggle "Use RAG" on/off per query
- AI automatically searches your Knowledge Base to provide relevant, project-specific answers instead of generic responses
- Select any code → right-click → "Send to LocalSeek Chat" for instant context
- AI responds with code? Click "Insert" to put it directly in your editor at cursor position
- Copy code blocks with one click
- All code gets proper syntax highlighting
- All chats automatically saved with generated titles
- Resume any previous conversation exactly where you left off
- Browse your chat history with timestamps and message counts
- Clean up conversations you don't need anymore
- Download new Ollama models directly from the extension interface
- Watch real-time download progress with detailed status updates
- View model information like size and modification dates
- Remove unused models to free up disk space
- Dark theme that matches your editor perfectly
- Responsive design that works on any screen size
- Smooth animations and intuitive controls
- Everything feels native to VSCode - no jarring external interfaces
- Visual Studio Code (latest version recommended)
- Ollama installed and running locally (see the quick check below)
- Minimum 8GB RAM (16GB+ recommended for larger models)
- Available Storage for AI models (varies by model size)
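If you're unsure whether the Ollama requirement is met, a quick call to its HTTP API (listening on port 11434 by default) will tell you. This sketch uses Ollama's documented `/api/tags` endpoint; the script itself is just an illustration, not part of the extension:

```typescript
// Sanity check that a local Ollama server is reachable (default port 11434).
// Uses Ollama's documented GET /api/tags endpoint, which lists installed models.
async function checkOllama(): Promise<void> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = (await res.json()) as { models: { name: string; size: number }[] };
    console.log(`Ollama is running with ${data.models.length} model(s) installed:`);
    for (const m of data.models) {
      console.log(`  ${m.name} (${(m.size / 1e9).toFixed(1)} GB)`);
    }
  } catch (err) {
    console.error("Ollama does not appear to be running:", err);
  }
}

checkOllama();
```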
```bash
# Install Ollama (visit https://ollama.com for platform-specific instructions)
# Then pull some recommended models:
ollama pull gpt-oss          # OpenAI's open-weight model
ollama pull deepseek-r1:14b  # Excellent reasoning model
ollama pull llama3.2:latest  # Versatile and reliable
ollama pull phi3:mini        # Lightweight and fast
ollama pull mistral:latest   # Great for coding tasks
ollama pull qwen2.5-coder    # Specialized for code generation
```

- Open VSCode
- Go to Extensions (`Ctrl+Shift+X` / `Cmd+Shift+X`)
- Search for "LocalSeek"
- Click "Install"
- Download the latest `.vsix` file from GitHub Releases
- Open the VSCode Extensions view
- Click the "..." menu β "Install from VSIX"
- Select the downloaded file
**Sidebar Panel (Recommended)**
- Click the LocalSeek icon in the Activity Bar (left sidebar)
- The chat panel opens in the sidebar for easy access while coding
**Standalone Window**
- Open Command Palette (`Ctrl+Shift+P` / `Cmd+Shift+P`)
- Type "LocalSeek: Open AI Chat"
- Chat opens in a separate panel
- **Select Model**: Choose your preferred Ollama model from the dropdown
- **Enable Knowledge Base**: Toggle the "Use RAG" switch to include your indexed documents (off by default)
- **Type Message**: Enter your question or request
- **Send**: Press `Enter` or click the Send button
- **View Response**: Watch the AI response stream in real-time
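Under the hood, real-time streaming from a local model looks roughly like the following sketch, which uses Ollama's documented `/api/chat` endpoint (newline-delimited JSON chunks); it illustrates the streaming pattern, not LocalSeek's actual implementation:

```typescript
// Stream a chat completion from a local Ollama server.
// POST /api/chat emits one JSON object per line until "done": true.
async function streamChat(model: string, prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Each complete line is one JSON chunk carrying a partial assistant message.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      process.stdout.write(chunk.message?.content ?? "");
    }
  }
}

streamChat("llama3.2:latest", "Explain closures in one paragraph.");
```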
**Send Code to Chat**
- Select any code in your editor
- Right-click → "Send Selected Code to LocalSeek Chat"
- The code appears in your chat input with proper formatting
- Add your question and send
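For the curious, a context-menu command like this is typically wired up through the VSCode extension API. The command ID and `sendToChat` helper below are hypothetical placeholders for illustration, not LocalSeek's real identifiers:

```typescript
import * as vscode from "vscode";

// Placeholder for whatever transports text into the chat webview.
declare function sendToChat(text: string): void;

export function activate(context: vscode.ExtensionContext) {
  const cmd = vscode.commands.registerCommand("localseek.sendSelectedCode", () => {
    const editor = vscode.window.activeTextEditor;
    if (!editor || editor.selection.isEmpty) return;
    const code = editor.document.getText(editor.selection);
    const lang = editor.document.languageId;
    // Wrap the selection in a fenced block so the chat renders it with highlighting.
    sendToChat("```" + lang + "\n" + code + "\n```");
  });
  context.subscriptions.push(cmd);
}
```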
**Insert AI Code**
- Click "Insert" button on any code block in AI responses
- Code is automatically inserted at your cursor position
- Replaces selected text if you have a selection
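The insert-or-replace behavior maps naturally onto VSCode's editing API; a minimal sketch (illustrative, not LocalSeek's actual code):

```typescript
import * as vscode from "vscode";

// Replace the current selection with AI-generated code,
// or insert at the cursor when nothing is selected.
async function insertAtCursor(code: string): Promise<void> {
  const editor = vscode.window.activeTextEditor;
  if (!editor) return;
  await editor.edit((edit) => {
    if (editor.selection.isEmpty) {
      edit.insert(editor.selection.active, code);
    } else {
      edit.replace(editor.selection, code);
    }
  });
}
```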
1. **Configure Knowledge Base Path (Required)**
   - Open Settings (`Ctrl+,` / `Cmd+,`) and search "LocalSeek"
   - Set "Knowledge Base Path" to a specific directory - this is required to use RAG
2. **Index Your Documents**
   - Open Command Palette (`Ctrl+Shift+P`)
   - Type "LocalSeek: Index Knowledge Base"
   - The extension scans for files in the specified path
3. **Use in Chat**
   - Toggle the "Use RAG" switch on in the chat interface (off by default)
   - Ask questions related to your documentation
   - The AI will automatically include relevant context from indexed files
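Conceptually, indexing and retrieval come down to embedding text and ranking chunks by similarity to your query. The sketch below assumes Ollama's documented `/api/embeddings` endpoint and the `nomic-embed-text` embedding model; the `Chunk` shape and helper names are illustrative, and LocalSeek's actual pipeline may differ:

```typescript
// One indexed piece of a knowledge-base document.
type Chunk = { text: string; embedding: number[] };

// Embed a piece of text via Ollama's /api/embeddings endpoint.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  return (await res.json()).embedding;
}

// Cosine similarity between two embedding vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the top-k chunks most relevant to the query; a RAG chat then
// prepends these to the prompt so the model answers with project context.
async function retrieve(index: Chunk[], query: string, k = 3): Promise<string[]> {
  const q = await embed(query);
  return index
    .map((c) => ({ c, score: cosine(q, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((x) => x.c.text);
}
```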
**View Chat History**
- Click the history button (clock icon) in the chat interface
- Browse all your previous conversations
- Click any conversation to resume it
**Start New Chat**
- Click the new chat button (+ icon)
- Starts a fresh conversation
- Previous chat is automatically saved
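A plausible way for an extension to persist chats locally is VSCode's `globalState` storage, which lives on disk in your user profile; the storage key and data shape below are assumptions for illustration, not LocalSeek's actual schema:

```typescript
import * as vscode from "vscode";

// Hypothetical shape for a saved conversation.
interface SavedChat {
  id: string;
  title: string;
  timestamp: number;
  messages: { role: string; content: string }[];
}

// Upsert a chat into locally persisted extension state.
function saveChat(context: vscode.ExtensionContext, chat: SavedChat): Thenable<void> {
  const history = context.globalState.get<SavedChat[]>("localseek.chats", []);
  return context.globalState.update(
    "localseek.chats",
    [...history.filter((c) => c.id !== chat.id), chat],
  );
}

// List saved chats, newest first, for a history view.
function listChats(context: vscode.ExtensionContext): SavedChat[] {
  return context.globalState
    .get<SavedChat[]>("localseek.chats", [])
    .sort((a, b) => b.timestamp - a.timestamp);
}
```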
**Download New Models**
- Click the models button (layers icon) in chat interface
- Enter model name (e.g., "llama3.2", "deepseek-r1:7b")
- Click "Download"
- Monitor download progress in real-time
**Remove Models**
- Open Model Management modal
- Click "Remove" next to any installed model
- Confirm deletion to free up disk space
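Model downloads with live progress map onto Ollama's documented streaming `/api/pull` endpoint, which emits JSON status lines with byte counts; a minimal sketch, not LocalSeek's actual code:

```typescript
// Download a model and report progress as Ollama streams status lines.
async function pullModel(name: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/pull", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: name, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const status = JSON.parse(line);
      // Download phases include total/completed byte counts.
      if (status.total && status.completed) {
        const pct = ((status.completed / status.total) * 100).toFixed(1);
        console.log(`${status.status}: ${pct}%`);
      } else {
        console.log(status.status);
      }
    }
  }
}

pullModel("llama3.2:latest");
```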
- `LocalSeek: Open AI Chat` - Open standalone chat window
- `LocalSeek: Send Selected Code` - Send selected code to chat
- `LocalSeek: Index Knowledge Base` - Index workspace documents
- **Context Management**: Use the KB toggle strategically - turn it off for general questions, on for project-specific queries
- **Model Selection**:
  - Use smaller models (phi3) for quick questions
  - Use larger models (deepseek-r1) for complex reasoning
  - Use code-specific models for programming tasks
- **Efficient Workflows**:
  - Keep the sidebar chat open while coding
  - Use "Send Selected Code" for quick code reviews
  - Leverage chat history to build on previous conversations
- **Performance Optimization**:
  - Close unused models to free RAM (see the sketch after this list)
  - Index only essential documents for faster search
  - Use smaller models for better response times
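On the "close unused models to free RAM" tip: Ollama unloads a model from memory when a request sets `keep_alive` to 0 (the model stays on disk and can be reloaded at any time). A minimal sketch using that documented behavior:

```typescript
// Ask Ollama to evict a model from RAM without deleting it from disk.
// A generate request with no prompt and keep_alive: 0 unloads the model.
async function unloadModel(model: string): Promise<void> {
  await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, keep_alive: 0 }),
  });
}

unloadModel("deepseek-r1:14b");
```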
- ✅ **100% Local Processing** - All AI inference happens on your machine
- ✅ **No Data Transmission** - Your code and conversations never leave your computer
- ✅ **No Telemetry** - Zero tracking or analytics
- ✅ **Offline Capable** - Works completely without an internet connection
- ✅ **Your Data, Your Control** - Full ownership of all conversations and data
- No External Dependencies for AI processing
- Local Storage Only for chat history and settings
- No API Keys Required - no risk of key exposure
- Open Source - transparent and auditable code
We welcome contributions from the community!
- **Report Bugs** - Help us identify and fix issues
- **Suggest Features** - Share ideas for new functionality
- **Improve Docs** - Help make documentation clearer
- **Submit Code** - Contribute bug fixes or new features
- **Star the Repo** - Show your support
- **More File Types** - Support for additional document formats
- **Advanced Search** - Enhanced knowledge base search capabilities
- **Theme Customization** - Multiple UI themes and customization options
- **Plugin System** - Extensible architecture for custom integrations
- **Analytics Dashboard** - Usage insights and conversation analytics
- **Multi-language Support** - Interface localization
Check the Changelog for detailed version history and updates.
MIT License - see LICENSE file for details.
Developed with ❤️ by HariHaren
LocalSeek - Your local AI companion for VSCode. Seek your answers, locally and privately. ✨