
BotGPT

A conversational chatbot API built with FastAPI, PostgreSQL, ChromaDB, and LM Studio for local LLM inference.

Features

  • 🤖 AI-Powered Chat: Connect to LM Studio for local LLM inference
  • 👥 User Management: Create, read, update, and delete users
  • 💬 Conversation Management: Organize chats into conversations
  • 📝 Message History: Store and retrieve conversation history
  • 🔍 Vector Search: ChromaDB integration for RAG capabilities
  • 🐳 Docker Ready: Full Docker Compose setup for easy deployment

Tech Stack

Component        Technology
---------------  ---------------------------------
API Framework    FastAPI
Database         PostgreSQL (asyncpg)
Vector Store     ChromaDB
LLM Backend      LM Studio (OpenAI-compatible API)
ORM              SQLAlchemy 2.0 (async)
Validation       Pydantic v2
Container        Docker / Docker Compose

Project Structure

botgpt/
├── app/
│   ├── main.py              # FastAPI application entry point
│   ├── core/                # Business logic (LLM client, embedder)
│   ├── db/                  # Database clients (Postgres, ChromaDB)
│   ├── models/              # SQLAlchemy ORM models
│   ├── routes/v1/           # API endpoints
│   ├── schemas/             # Pydantic schemas
│   ├── utils/               # Utilities (logging, security)
│   └── docs/                # Documentation
├── tests/                   # Test files
├── docker-compose.yml       # Docker services
├── Dockerfile               # Application container
└── requirements.txt         # Python dependencies

Prerequisites

  • Docker and Docker Compose
  • LM Studio running on your host machine
  • Python 3.13+ (for local development)

Quick Start

1. Clone the Repository

git clone https://github.com/maharanasarkar/bot-gpt.git
cd bot-gpt

2. Start LM Studio

  1. Download and install LM Studio
  2. Load a model (e.g., qwen/qwen3-4b)
  3. Start the local server on port 1234

3. Run with Docker

# Build and start all services
./docker.sh

# Or manually:
docker build -t botgpt:1.0.0 .
docker-compose up -d

4. Access the API

  • API Base URL: http://localhost:5000
  • Swagger Docs: http://localhost:5000/docs
  • ReDoc: http://localhost:5000/redoc

Environment Variables

Configure these in docker-compose.yml or a .env file:

Variable          Description                    Example
----------------  -----------------------------  -------------------------------------------
DATABASE_URL      PostgreSQL connection string   postgresql+asyncpg://user:pass@host:5432/db
LLM_BASE_URL      LM Studio API endpoint         http://host.docker.internal:1234/v1
OPENAI_API_KEY    API key for LM Studio          lm-studio
LLM_MODEL         Model name to use              qwen/qwen3-4b
SYSTEM_PROMPT     System prompt for the AI       You are a helpful assistant...
CHROMA_API_HOST   ChromaDB hostname              chromadb
CHROMA_API_PORT   ChromaDB port                  8000
CREATE_TABLES     Auto-create tables on startup  1
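In application code, the variables above can be loaded roughly as follows. This is a minimal stdlib sketch using os.environ with the table's example values as defaults; the repository's actual settings mechanism (e.g., a Pydantic BaseSettings class) may differ:

```python
import os
from dataclasses import dataclass, field


@dataclass
class Settings:
    # Defaults mirror the example column above; override via environment variables.
    database_url: str = field(
        default_factory=lambda: os.environ.get(
            "DATABASE_URL", "postgresql+asyncpg://user:pass@host:5432/db"
        )
    )
    llm_base_url: str = field(
        default_factory=lambda: os.environ.get(
            "LLM_BASE_URL", "http://host.docker.internal:1234/v1"
        )
    )
    llm_model: str = field(
        default_factory=lambda: os.environ.get("LLM_MODEL", "qwen/qwen3-4b")
    )
    # CREATE_TABLES=1 enables table auto-creation on startup.
    create_tables: bool = field(
        default_factory=lambda: os.environ.get("CREATE_TABLES", "0") == "1"
    )


settings = Settings()
```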

API Usage

Create a User

curl -X POST http://localhost:5000/v1/users/ \
  -H "Content-Type: application/json" \
  -d '{
    "email": "[email protected]",
    "password": "mypassword",
    "first_name": "John",
    "last_name": "Doe"
  }'

Create a Conversation

curl -X POST http://localhost:5000/v1/conversations/ \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "<USER_ID>",
    "title": "My First Chat"
  }'

Send a Message

curl -X POST http://localhost:5000/v1/messages/ \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "<USER_ID>",
    "conv_id": "<CONVERSATION_ID>",
    "message_sender": "user",
    "content": "Hello, how are you?"
  }'

Response:

{
  "user_message": {
    "id": "msg123",
    "content": "Hello, how are you?",
    "message_sender": "user",
    ...
  },
  "model_response": "Hello! I'm doing well. How can I assist you today?"
}

Get Conversation Messages

curl "http://localhost:5000/v1/messages/?conv_id=<CONVERSATION_ID>"
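The curl calls above can also be scripted. The sketch below wraps the "Send a Message" request with Python's standard-library urllib; the endpoint path and field names are taken from the examples above, but the helper functions themselves are illustrative, not part of the repository:

```python
import json
from urllib import request

BASE_URL = "http://localhost:5000"


def build_message_payload(user_id: str, conv_id: str, content: str) -> dict:
    # Field names mirror the "Send a Message" example above.
    return {
        "user_id": user_id,
        "conv_id": conv_id,
        "message_sender": "user",
        "content": content,
    }


def post_json(path: str, payload: dict) -> dict:
    # POST a JSON body and decode the JSON response.
    req = request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Example (requires the API to be running):
# reply = post_json("/v1/messages/", build_message_payload(user_id, conv_id, "Hello!"))
# print(reply["model_response"])
```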

Local Development

Setup Virtual Environment

python -m venv .venv
source .venv/bin/activate  # Linux/Mac
# or
.venv\Scripts\activate     # Windows

pip install -r requirements.txt

Run Locally

# Start PostgreSQL and ChromaDB with Docker
docker-compose up -d postgres chromadb

# Run FastAPI locally
uvicorn app.main:app --reload --port 5000

Run Tests

pytest tests/

Docker Services

Service    Port  Description
---------  ----  ---------------------
app        5000  FastAPI application
postgres   5432  PostgreSQL database
chromadb   8000  ChromaDB vector store

Documentation

Additional documentation lives in the app/docs/ directory.

Troubleshooting

LM Studio Connection Issues

If you see "Client not initialized" errors:

  1. Ensure LM Studio is running on your host machine
  2. Verify the server is started on port 1234
  3. Check that LLM_BASE_URL is set to http://host.docker.internal:1234/v1
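Because LM Studio exposes an OpenAI-compatible API, a GET against the /models endpoint is a quick connectivity check. The helper below is a hypothetical stdlib sketch, not part of the repo:

```python
from urllib import request


def models_url(base_url: str) -> str:
    # The OpenAI-compatible /models endpoint lists the models LM Studio has loaded.
    return base_url.rstrip("/") + "/models"


def llm_reachable(base_url: str, timeout: float = 2.0) -> bool:
    # Returns True if the endpoint answers at all, False on connection failures.
    try:
        with request.urlopen(models_url(base_url), timeout=timeout):
            return True
    except OSError:
        return False


# From inside a container:
# llm_reachable("http://host.docker.internal:1234/v1")
```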

Database Connection Issues

  1. Check PostgreSQL container is running: docker-compose ps
  2. Verify DATABASE_URL format: postgresql+asyncpg://user:pass@postgres:5432/dbname
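To check that a DATABASE_URL has the expected shape, the stdlib urllib.parse module splits it into driver, credentials, host, port, and database name (SQLAlchemy has its own URL parser; this is just a quick sanity-check sketch):

```python
from urllib.parse import urlsplit

url = "postgresql+asyncpg://user:pass@postgres:5432/dbname"
parts = urlsplit(url)

# Driver, host, and port should come apart cleanly:
assert (parts.scheme, parts.hostname, parts.port) == ("postgresql+asyncpg", "postgres", 5432)

# The database name is the path with the leading slash stripped:
db_name = parts.path.lstrip("/")
```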

Port Conflicts

If ports are already in use, modify the port mappings in docker-compose.yml.

License

MIT License

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Open a Pull Request
