A conversational chatbot API built with FastAPI, PostgreSQL, ChromaDB, and LM Studio for local LLM inference.
## Features

- **AI-Powered Chat**: Connect to LM Studio for local LLM inference
- **User Management**: Create, read, update, and delete users
- **Conversation Management**: Organize chats into conversations
- **Message History**: Store and retrieve conversation history
- **Vector Search**: ChromaDB integration for RAG capabilities
- **Docker Ready**: Full Docker Compose setup for easy deployment
## Tech Stack

| Component | Technology |
|---|---|
| API Framework | FastAPI |
| Database | PostgreSQL (asyncpg) |
| Vector Store | ChromaDB |
| LLM Backend | LM Studio (OpenAI-compatible API) |
| ORM | SQLAlchemy 2.0 (async) |
| Validation | Pydantic v2 |
| Container | Docker / Docker Compose |
## Project Structure

```
botgpt/
├── app/
│   ├── main.py          # FastAPI application entry point
│   ├── core/            # Business logic (LLM client, embedder)
│   ├── db/              # Database clients (Postgres, ChromaDB)
│   ├── models/          # SQLAlchemy ORM models
│   ├── routes/v1/       # API endpoints
│   ├── schemas/         # Pydantic schemas
│   ├── utils/           # Utilities (logging, security)
│   └── docs/            # Documentation
├── tests/               # Test files
├── docker-compose.yml   # Docker services
├── Dockerfile           # Application container
└── requirements.txt     # Python dependencies
```
## Prerequisites

- Docker and Docker Compose
- LM Studio running on your host machine
- Python 3.13+ (for local development)
## Quick Start

Clone the repository:

```bash
git clone https://github.com/maharanasarkar/bot-gpt.git
cd bot-gpt
```

Set up LM Studio:

- Download and install LM Studio
- Load a model (e.g., `qwen/qwen3-4b`)
- Start the local server on port `1234`
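Before starting the stack, it can help to confirm the LM Studio server is actually reachable. The sketch below probes the OpenAI-compatible `/models` endpoint using only the standard library (`list_models` is a hypothetical helper, not part of this repo):

```python
import json
from urllib import request, error

LM_STUDIO_URL = "http://localhost:1234/v1"  # LM Studio's default server address

def list_models(base_url=LM_STUDIO_URL, timeout=3.0):
    """Return the model IDs LM Studio is serving, or None if unreachable."""
    try:
        with request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            payload = json.load(resp)
    except (OSError, ValueError):
        return None  # server down, wrong port, or non-JSON reply
    # OpenAI-compatible servers wrap results as {"object": "list", "data": [...]}
    return [model["id"] for model in payload.get("data", [])]
```

If this returns `None`, check that LM Studio's local server is started on port `1234`.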
Start the stack:

```bash
# Build and start all services
./docker.sh

# Or manually:
docker build -t botgpt:1.0.0 .
docker-compose up -d
```

Once running:

- API Base URL: `http://localhost:5000`
- Swagger Docs: `http://localhost:5000/docs`
- ReDoc: `http://localhost:5000/redoc`
## Environment Variables

Configure these in `docker-compose.yml` or a `.env` file:
| Variable | Description | Example |
|---|---|---|
| `DATABASE_URL` | PostgreSQL connection string | `postgresql+asyncpg://user:pass@host:5432/db` |
| `LLM_BASE_URL` | LM Studio API endpoint | `http://host.docker.internal:1234/v1` |
| `OPENAI_API_KEY` | API key for LM Studio | `lm-studio` |
| `LLM_MODEL` | Model name to use | `qwen/qwen3-4b` |
| `SYSTEM_PROMPT` | System prompt for the AI | `You are a helpful assistant...` |
| `CHROMA_API_HOST` | ChromaDB hostname | `chromadb` |
| `CHROMA_API_PORT` | ChromaDB port | `8000` |
| `CREATE_TABLES` | Auto-create tables on startup | `1` |
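As a rough illustration of how these variables map into configuration, here is a stdlib-only sketch (the helper name and defaults are illustrative; the real app may load settings differently, e.g. via pydantic-settings):

```python
import os

def load_settings(env=None):
    """Collect the environment variables above, applying example defaults."""
    env = os.environ if env is None else env
    return {
        "database_url": env.get("DATABASE_URL",
                                "postgresql+asyncpg://user:pass@postgres:5432/db"),
        "llm_base_url": env.get("LLM_BASE_URL",
                                "http://host.docker.internal:1234/v1"),
        "openai_api_key": env.get("OPENAI_API_KEY", "lm-studio"),
        "llm_model": env.get("LLM_MODEL", "qwen/qwen3-4b"),
        "chroma_api_host": env.get("CHROMA_API_HOST", "chromadb"),
        "chroma_api_port": int(env.get("CHROMA_API_PORT", "8000")),
        "create_tables": env.get("CREATE_TABLES", "1") == "1",  # "1" enables auto-create
    }
```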
## Usage

Create a user:

```bash
curl -X POST http://localhost:5000/v1/users/ \
  -H "Content-Type: application/json" \
  -d '{
    "email": "[email protected]",
    "password": "mypassword",
    "first_name": "John",
    "last_name": "Doe"
  }'
```

Create a conversation:

```bash
curl -X POST http://localhost:5000/v1/conversations/ \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "<USER_ID>",
    "title": "My First Chat"
  }'
```

Send a message:

```bash
curl -X POST http://localhost:5000/v1/messages/ \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "<USER_ID>",
    "conv_id": "<CONVERSATION_ID>",
    "message_sender": "user",
    "content": "Hello, how are you?"
  }'
```

Response:

```json
{
  "user_message": {
    "id": "msg123",
    "content": "Hello, how are you?",
    "message_sender": "user",
    ...
  },
  "model_response": "Hello! I'm doing well. How can I assist you today?"
}
```

List messages in a conversation:

```bash
curl "http://localhost:5000/v1/messages/?conv_id=<CONVERSATION_ID>"
```

## Development

Set up a local environment:

```bash
python -m venv .venv
source .venv/bin/activate  # Linux/Mac
# or
.venv\Scripts\activate     # Windows

pip install -r requirements.txt
```

Run the services and the app:

```bash
# Start PostgreSQL and ChromaDB with Docker
docker-compose up -d postgres chromadb

# Run FastAPI locally
uvicorn app.main:app --reload --port 5000
```

Run the tests:

```bash
pytest tests/
```

## Services

| Service | Port | Description |
|---|---|---|
| `app` | 5000 | FastAPI application |
| `postgres` | 5432 | PostgreSQL database |
| `chromadb` | 8000 | ChromaDB vector store |
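The curl examples in the Usage section can also be driven from Python. This is a stdlib-only sketch (the helper names are hypothetical, and sending the requests assumes the stack from Quick Start is running on `localhost:5000`):

```python
import json
from urllib import request

BASE_URL = "http://localhost:5000/v1"

def build_post(path, payload):
    """Build a JSON POST request mirroring the curl examples (not yet sent)."""
    return request.Request(
        f"{BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send(req):
    """Send a request and decode the JSON response."""
    with request.urlopen(req) as resp:
        return json.load(resp)

# With the stack running, a round trip looks like:
# user = send(build_post("/users/", {"email": "[email protected]",
#                                    "password": "mypassword",
#                                    "first_name": "John", "last_name": "Doe"}))
# conv = send(build_post("/conversations/", {"user_id": "<USER_ID>",
#                                            "title": "My First Chat"}))
```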
## Documentation

- API Reference - Complete API documentation
- Architecture - System design and structure
## Troubleshooting

### LM Studio Connection

If you see "Client not initialized" errors:

- Ensure LM Studio is running on your host machine
- Verify the server is started on port `1234`
- Check that `LLM_BASE_URL` is set to `http://host.docker.internal:1234/v1`

### Database Connection

- Check that the PostgreSQL container is running: `docker-compose ps`
- Verify the `DATABASE_URL` format: `postgresql+asyncpg://user:pass@postgres:5432/dbname`
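For the database checks above, a quick TCP probe can tell reachability problems apart from configuration errors. A stdlib sketch (`db_target` and `port_open` are hypothetical helpers, not part of this repo):

```python
import socket
from urllib.parse import urlsplit

def db_target(database_url):
    """Extract (host, port) from a DATABASE_URL such as
    postgresql+asyncpg://user:pass@postgres:5432/dbname."""
    parts = urlsplit(database_url)
    return parts.hostname, parts.port or 5432

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe the Postgres container
# host, port = db_target("postgresql+asyncpg://user:pass@postgres:5432/db")
# print(port_open(host, port))
```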
If ports are already in use, modify the port mappings in docker-compose.yml.
## License

MIT License
## Contributing

1. Fork the repository
2. Create a feature branch
3. Commit your changes
4. Push to the branch
5. Open a Pull Request