An AI-powered search engine with a generative UI - Configured for Local AI Models
This is a customized version of Morphic configured to work exclusively with local AI models through OpenAI-compatible endpoints (like KoboldCpp). All cloud AI providers have been removed for a fully self-hosted experience.
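In practice, "OpenAI-compatible" means the model server answers the standard OpenAI-style REST routes under /v1. As a rough illustration (the path follows the OpenAI convention; single-model servers such as KoboldCpp typically ignore the model field), a request looks like this:

curl http://localhost:5001/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer any-key-will-work" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello"}]}'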
- All models configured to use a local AI endpoint
- Interactive logo with eye-tracking mouse movement
- Docker deployment optimized for local model usage
- Removed dependency on cloud AI providers
- Features
- Stack
- Quickstart
- Docker Deployment
- Building Custom Image
- Search Engine
- License
- AI-powered search with Generative UI using local models
- Natural language question understanding
- Support for multiple search providers (Tavily, SearXNG)
- Interactive logo with eye-tracking animation
- Reasoning models with visible thought process
- Chat history functionality
- Share search results
- Redis support (Local/Docker)
- URL-specific search
- Video search support
- SearXNG integration (see the example configuration after this list) with:
  - Customizable search depth (basic/advanced)
  - Configurable engines
  - Adjustable results limit
  - Safe search options
  - Custom time range filtering
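For orientation, these SearXNG options are driven by environment variables. The entries below are an illustrative sketch only; the variable names are assumptions, so check .env.local.example in the repository for the exact keys and accepted values.

SEARCH_API=searxng                 # assumed switch selecting SearXNG as the search provider
SEARXNG_API_URL=http://localhost:8080
SEARXNG_DEFAULT_DEPTH=basic        # basic or advanced
SEARXNG_ENGINES=google,bing        # engines to query
SEARXNG_MAX_RESULTS=20             # results limit
SEARXNG_SAFESEARCH=0               # 0 off, 1 moderate, 2 strict
SEARXNG_TIME_RANGE=None            # e.g. day, week, month, year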
- Next.js - App Router, React Server Components
- TypeScript - Type safety
- Vercel AI SDK - Text streaming / Generative UI
- Local AI Model via OpenAI-compatible endpoint
- KoboldCpp - Recommended local AI server
- Tavily AI - Default search provider
- SearXNG - Self-hosted search alternative
- Redis - Caching and session storage
- Tailwind CSS - Utility-first CSS framework
- shadcn/ui - Re-usable components
- Radix UI - Unstyled, accessible components
- Lucide Icons - Beautiful & consistent icons
- Docker and Docker Compose installed
- A local AI model server (KoboldCpp recommended)
- Node.js 18+ and Bun (for development)
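To confirm the prerequisites are in place, a quick check:

docker --version
docker compose version
node --version   # should report v18 or newer
bun --version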
git clone https://github.com/Peterfish/morphic.git morphic-custom
cd morphic-custom

# Clone and build KoboldCpp
git clone https://github.com/LostRuins/koboldcpp
cd koboldcpp
make
# Run with your preferred model
./koboldcpp --model path/to/your/model.gguf --port 5001

cp .env.local.example .env.local

Edit .env.local:
# Local AI Configuration
OPENAI_COMPATIBLE_API_KEY=any-key-will-work
OPENAI_COMPATIBLE_API_BASE_URL=http://localhost:5001/v1
# Search Configuration
TAVILY_API_KEY=your-tavily-api-key # Get from https://app.tavily.com
# Optional
BASE_URL=http://localhost:3000

# Start all services
docker compose up -d
# Or if using separate KoboldCpp container
docker compose up -d && docker compose -f docker-compose.kobold.yaml up -d

Visit http://localhost:3030 in your browser.
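For orientation only, a file like docker-compose.kobold.yaml might look roughly as follows. This is a sketch, not the file shipped in the repository: the image name, command, and volume layout are placeholders to adapt to your own KoboldCpp setup, and the container needs to share a Docker network with Morphic if it is to be reached as http://koboldcpp:5001/v1 (see Troubleshooting).

services:
  koboldcpp:
    image: your-koboldcpp-image:latest        # placeholder image
    command: --model /models/your-model.gguf --port 5001
    volumes:
      - ./models:/models                      # host folder containing the .gguf file
    ports:
      - '5001:5001'

Alternatively, a prebuilt Morphic image can be pulled from GHCR, as shown next.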
docker pull ghcr.io/peterfish/morphic-custom:latest
# Run with docker-compose
docker compose up -d

services:
  morphic:
    image: ghcr.io/peterfish/morphic-custom:latest
    env_file: .env.local
    ports:
      - '3030:3000'
    depends_on:
      - redis
      - searxng

See BUILD_CUSTOM_IMAGE.md for detailed instructions on the following (a brief command sketch also appears after this list):
- Modifying the codebase for your needs
- Building your own Docker image
- Pushing to GitHub Container Registry
- Deploying your custom build
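As a minimal sketch of the build-and-push step (the registry path and username are placeholders; BUILD_CUSTOM_IMAGE.md remains the authoritative guide):

# Log in to GitHub Container Registry with a personal access token that has the write:packages scope
echo "$GITHUB_TOKEN" | docker login ghcr.io -u <your-github-username> --password-stdin

# Build the image from the repository root and tag it for your own namespace (lowercase required)
docker build -t ghcr.io/<your-github-username>/morphic-custom:latest .

# Push the image
docker push ghcr.io/<your-github-username>/morphic-custom:latest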
- Open your browser settings
- Navigate to search engine settings
- Add a new search engine:
  - Name: Morphic Local
  - Keyword: morphic
  - URL: http://localhost:3030/search?q=%s
- Set it as your default search engine
- Connection to AI model failed
  - Ensure KoboldCpp is running on port 5001
  - Check OPENAI_COMPATIBLE_API_BASE_URL in .env.local
- Search not working
  - Verify TAVILY_API_KEY is set correctly
  - Check that the SearXNG container is running
- Docker networking issues
  - Use container names for inter-container communication
  - Example: http://koboldcpp:5001/v1 instead of localhost (see the sketch below)
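For example, when Morphic itself runs in Docker alongside a KoboldCpp container, the base URL in .env.local should use the container's service name rather than localhost. A minimal sketch, assuming the services are named koboldcpp and searxng and share a Docker network (check .env.local.example for the exact variable names):

# Reach KoboldCpp by its container/service name, not localhost
OPENAI_COMPATIBLE_API_BASE_URL=http://koboldcpp:5001/v1

# Likewise for a SearXNG container (variable name assumed; verify against .env.local.example)
SEARXNG_API_URL=http://searxng:8080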
# Install dependencies
bun install
# Run development server
bun dev
# Build for production
bun run build

- Original Morphic by @miurla
- Built with Vercel AI SDK
- Interactive logo inspired by classic animated eyes
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.