A privacy-focused Android client for local LLM servers running Ollama
Your conversations never leave your local network
- 100% Local Network: All communication stays on your local network - no data ever leaves your devices or network
- Offline-First: Full functionality works without internet connection once configured
- Local Storage: All conversations saved locally using Room database with automatic persistence
- Real-Time Streaming: See AI responses appear token-by-token in real-time for instant feedback
- Multiple Chat Sessions: Create, manage, and switch between unlimited conversation threads
- Markdown Rendering: Beautifully formatted responses with full Markdown support including code blocks, lists, and formatting
- Chat History: Persistent chat history with automatic title generation from the first message
- Model Selection: Choose from any model available on your Ollama server (llama, qwen, mistral, etc.)
- Reasoning Mode: Enable "thinking" mode for models that support it - see the AI's reasoning process (toggle with the 💡 icon)
- Tool Calling: Optional web search integration via SearXNG for real-time information when needed
- Smart System Prompts: Intelligent system prompts that automatically adapt based on enabled features
- Multiple Themes: Choose between Space (near-black) and Dark (grey) themes
- Modern Design: Clean, terminal-inspired aesthetic with Material 3 components
- Custom Typography: JetBrains Mono and Space Grotesk fonts for a distinctive, professional look
- Edge-to-Edge: Immersive full-screen experience that adapts to your device
- Responsive Layout: Optimized for all Android screen sizes from phones to tablets
- Start a New Chat: Tap the menu icon (☰) to open the sidebar, then tap "New Chat"
- Send Messages: Type your message in the input field and tap send (or press Enter)
- View Responses: Responses stream in real-time as the AI generates them
Toggle the lightbulb icon to enable/disable:
- Enabled: AI can use reasoning mode (shows thinking process) and web search tools
- Disabled: Standard chat mode without tools or reasoning
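Under the hood this maps onto Ollama's chat API. The sketch below is only an approximation, not the app's actual payload: the server IP and model name are placeholders, and the `think` flag requires a recent Ollama release plus a reasoning-capable model.

```bash
# Hypothetical request illustrating reasoning mode; replace the IP and model with your own.
curl http://192.168.1.50:11434/api/chat -d '{
  "model": "qwen3",
  "messages": [{"role": "user", "content": "Why is the sky blue?"}],
  "think": true,
  "stream": true
}'
# Streamed chunks carry the model's reasoning separately from the final answer text.
```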
- Tap the model name in the input bar to see available models
- Select a different model for each chat session
- Your model preference is saved automatically
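The picker can only show models that the Ollama server has already installed. To add more, pull them on the server; the model name below is just an example from the Ollama library.

```bash
# Run on the machine hosting Ollama.
ollama list          # models currently installed and selectable in the app
ollama pull qwen2.5  # download another model; it appears in the picker afterwards
```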
- Swipe right or tap the menu icon (☰) to access your chat list
- Tap any chat to resume the conversation
- Long-press a chat to delete it
- Use "Delete All" in settings to clear all chats
- Access settings via the gear icon
- Choose between Space (near-black, default) and Dark (grey) themes
- Theme preference is saved automatically
See SETUP.md for more detailed usage instructions.
- Android: 8.0 (API 26) or higher
- Ollama Server: Running on your local network (see setup below)
- Network: Both devices (Android phone and server) must be on the same Wi-Fi network
- SearXNG (Optional): For web search capabilities when brain toggle is enabled
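If you want to try the optional web search, one common way to run SearXNG locally is its official Docker image; the port and volume path below are assumptions, and SETUP.md covers pointing the app at the instance.

```bash
# Minimal local SearXNG instance (assumed port 8080; adjust to taste).
docker run --rm -d -p 8080:8080 \
  -v "${PWD}/searxng:/etc/searxng" \
  searxng/searxng
# Using SearXNG as a search API typically also needs the "json" format enabled in settings.yml.
```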
The app requires custom fonts. Run the provided script:
```bash
cd scripts
chmod +x download_fonts.sh
./download_fonts.sh
```

See SETUP.md for manual font installation instructions.
On your server machine (PC, Mac, or Linux):
```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model
ollama pull llama3.2

# Start Ollama with network access (required!)
OLLAMA_HOST=0.0.0.0 ollama serve
```
⚠️ Important: By default, Ollama only listens on localhost. You must set `OLLAMA_HOST=0.0.0.0` to allow connections from other devices.
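Before configuring the app, it's worth confirming the server is reachable from another device on the same network; the IP below is a placeholder for your server's address.

```bash
# Replace 192.168.1.50 with your server's LAN IP.
curl http://192.168.1.50:11434/api/tags
# A JSON list of installed models means other devices can reach Ollama;
# "connection refused" usually means OLLAMA_HOST was not set to 0.0.0.0.
```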
- Open localgrok on your Android device
- Tap the Settings icon (⚙️) in the top right
- Enter your server IP address (find it with `ip addr` or `ifconfig` on Linux/Mac; see the commands after these steps)
- Port should be `11434` (the default Ollama port)
- Tap "Test Connection" to verify
- Tap "Save Settings"
See SETUP.md for detailed setup instructions including SearXNG configuration.
- Android Studio: Hedgehog (2023.1.1) or later
- JDK: 17 or higher
- Android SDK: API 35 (Android 15)
- Gradle: 8.13.1 (included via wrapper)
```bash
# Clone the repository
git clone https://github.com/yourusername/localgrok.git
cd localgrok

# Download required fonts
cd scripts && ./download_fonts.sh && cd ..

# Build debug APK
./gradlew assembleDebug

# Install on connected device
./gradlew installDebug
```

The APK will be located at `app/build/outputs/apk/debug/app-debug.apk`.
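Alternatively, the debug APK can be sideloaded directly with `adb`:

```bash
# Install (or reinstall) the debug build on a connected device with USB debugging enabled.
adb install -r app/build/outputs/apk/debug/app-debug.apk
```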
- Open Android Studio
- Select File β Open
- Navigate to the `localgrok` directory
- Wait for Gradle sync to complete
- Run the app with Shift+F10 or the Run button
- SETUP.md - Detailed setup, configuration, and usage guide
- ARCHITECTURE.md - Project architecture, tech stack, and design patterns
- TROUBLESHOOTING.md - Common issues and solutions
- CONTRIBUTING.md - Guidelines for contributors
MIT License - see LICENSE file for details.
localgrok - Privacy-first AI on your local network
Made with ❤️ for the privacy-conscious