This project is a multi-service application that provides a mental support chatbot experience. The system consists of a frontend user interface (UI), a backend API, a dedicated MySQL database, and an external Language Model (LLM) service, all orchestrated with Docker Compose for streamlined setup and deployment.
The Mental Support Chat system allows users to register, log in, and engage in chat conversations with an AI bot powered by a Large Language Model. User data and chat history are stored persistently. The system architecture separates concerns into distinct services:
- UI: The user-facing web application.
- API: The backend layer handling user authentication, chat logic, and orchestrating interactions between the UI, Database, and LLM Service.
- DB: A dedicated MySQL database for storing application data.
- LLM Service: An external service responsible for running the language model and generating bot responses.
The UI communicates with the API, which in turn interacts with the Database for data storage and the LLM Service for generating chat responses. Docker Compose manages the lifecycle and networking of these interconnected services.
- User Authentication: Secure user registration, login, and session management using JWT. Includes a password recovery mechanism via security questions.
- Chat Functionality: Users can send messages to the bot, view conversation history, and clear their chat sessions.
- AI-Powered Responses: Integration with an external LLM service to generate empathetic and supportive bot responses based on chat context.
- Chat History Persistence: All chat messages and user data are stored reliably in a MySQL database.
- Chat Export: Ability to export chat conversation history as a PDF document.
- User Account Management: Functionality for authenticated users to update their profile details and delete their account.
- Containerized Deployment: The entire system is designed to be built and run easily using Docker and Docker Compose.
The system leverages a variety of technologies across its components:
- Frontend (UI): React, TypeScript, Tailwind CSS, React Router DOM, React Hook Form, Headless UI, React Aria/Stately, Framer Motion, Axios.
- Backend (API): Node.js, Express.js, MySQL (via `mysql2` and Sequelize), bcryptjs, jsonwebtoken, uuid, express-validator, pdfkit, axios, dotenv, morgan, cors.
- Database (DB): MySQL 8.0.
- LLM Service: Python, Flask, llama-cpp-python.
- Orchestration: Docker, Docker Compose.
The entire system can be easily set up and run using Docker Compose.
- Docker and Docker Compose must be installed on your system.
- Configure Environment Variables for the API: Navigate to the `mental-support-api` project directory and create a `.env` file if one doesn't exist. Populate it with the necessary configuration, ensuring `DB_HOST` and `LLM_API_URL` point to the service names defined in the `docker-compose.yml`:

  ```env
  # Inside mental-support-api/.env
  NODE_ENV=development
  PORT=3001
  DB_HOST=db                # Service name in docker-compose.yml
  DB_PORT=3306
  DB_USER=chatbot_user
  DB_PASSWORD=chatbot_password
  DB_NAME=chatbotdb
  JWT_SECRET=YOUR_SUPER_SECURE_RANDOM_JWT_KEY   # <<< Replace with a strong, unique key
  JWT_EXPIRES_IN=7d
  LLM_API_URL=http://llm-service:5000/chat      # Service name and port in docker-compose.yml
  ```

  It is critical to replace `YOUR_SUPER_SECURE_RANDOM_JWT_KEY` with a strong, unique secret.
- Prepare Database Initialization Script: Ensure the `init.sql` file containing the database schema (`CREATE TABLE` statements for `Users`, `Chats`, `Messages`) is located at the path referenced in the `docker-compose.yml` volume mount for the `db` service (e.g., `./mental-support-db/init.sql`). This script automatically sets up the database schema the first time the `db` container starts.
- Obtain the LLM Model File: Download your desired Llama language model file (in `.gguf` format) and place it in the `models` directory within your `mental-support-llm` project directory. Ensure the `MODEL_PATH` environment variable in the `docker-compose.yml` for the `llm-service` correctly points to this file within the container (e.g., `/app/models/your_model_file.gguf`). The provided compose file expects `/app/models/tinyllama-1.1b-chat-v1.0.Q2_K.gguf`.
- Run with Docker Compose: Navigate to the directory containing the main `docker-compose.yml` file and run:

  ```sh
  docker-compose up --build -d
  ```

  - `up`: Starts all services defined in the compose file.
  - `--build`: Builds the Docker images for the UI, API, and LLM Service from their Dockerfiles before starting the containers.
  - `-d`: Runs the containers in detached mode (in the background).
- Verify Services are Running: Check the status of all running containers:

  ```sh
  docker-compose ps
  ```

  You should see containers for `ui`, `api`, `db`, and `llm-service` listed with `Up` status. The `api` service declares a `service_healthy` dependency on the `db` service, so the API waits for the database's health check to pass before starting.
- Access the Application: Once all services are healthy and running, open your web browser and navigate to `http://localhost:3000`, the port where the UI service is exposed.
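For reference, the `init.sql` mentioned in the setup steps contains `CREATE TABLE` statements for `Users`, `Chats`, and `Messages`. A minimal illustrative sketch is shown below; the column names and types here are assumptions, so use the schema actually shipped with the project:

```sql
CREATE TABLE IF NOT EXISTS Users (
  id CHAR(36) PRIMARY KEY,              -- UUIDs generated by the API (uuid package)
  username VARCHAR(255) NOT NULL UNIQUE,
  password_hash VARCHAR(255) NOT NULL,  -- bcryptjs hash, never the plain password
  security_question VARCHAR(255),       -- used by the password recovery flow
  security_answer_hash VARCHAR(255),
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS Chats (
  id CHAR(36) PRIMARY KEY,
  user_id CHAR(36) NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (user_id) REFERENCES Users(id) ON DELETE CASCADE
);

CREATE TABLE IF NOT EXISTS Messages (
  id CHAR(36) PRIMARY KEY,
  chat_id CHAR(36) NOT NULL,
  sender ENUM('user', 'bot') NOT NULL,
  content TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (chat_id) REFERENCES Chats(id) ON DELETE CASCADE
);
```

The `ON DELETE CASCADE` constraints mirror the account-deletion and clear-chat features: removing a user or chat row also removes its dependent rows.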
The backend API service (`mental-support-api`) provides the interface for the UI and interacts with the database and LLM service. Its endpoints are typically accessed under the `/api` path.
- `/api/auth/*`: Authentication and user management endpoints.
- `/api/chat/*`: Chat message and history management endpoints.
- `/api/user/*`: Authenticated user profile management endpoints.
Refer to the mental-support-api project's README for a detailed list and description of its API endpoints.
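Calls from the UI to the authenticated endpoint groups above attach the JWT in an `Authorization: Bearer` header. The helper below is a hypothetical sketch of how such a request could be assembled before being passed to `fetch` or axios; the `/api/chat/message` path and request body shape are assumptions, so consult the API project's README for the real contract:

```javascript
// Build a (url, options) pair for an authenticated JSON request to the API.
// `baseUrl` is where the API is exposed (http://localhost:3001 in this setup).
function authRequest(baseUrl, path, token, body) {
  return [
    `${baseUrl}${path}`,
    {
      method: body ? 'POST' : 'GET',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${token}`, // JWT obtained from the /api/auth login flow
      },
      ...(body ? { body: JSON.stringify(body) } : {}),
    },
  ];
}

// Usage: send a chat message (hypothetical endpoint and payload shape).
const [url, opts] = authRequest(
  'http://localhost:3001',
  '/api/chat/message',
  '<jwt-from-login>',
  { text: 'Hello' }
);
// fetch(url, opts).then(...)
```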
System-wide configuration is primarily managed through environment variables and the `docker-compose.yml` file:
- API (`mental-support-api`): Configured via its `.env` file for database credentials, the JWT secret, and the LLM service URL.
- Database (`db`): Configured via environment variables directly in `docker-compose.yml` for the root password, database name, user, and password.
- LLM Service (`llm-service`): Configured via environment variables in `docker-compose.yml` for the model path and inference parameters.
- UI (`ui`): Configured via a build-time environment variable in `docker-compose.yml` (`REACT_APP_API_BASE_URL`) to point the UI at the API service within the Docker network.
The `docker-compose.yml` file itself defines the service dependencies, port mappings, and volumes needed for persistent data storage (MySQL data and LLM models).
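As a point of reference, a minimal compose file matching the services, ports, and dependencies described in this README might look like the following sketch. Build contexts, healthcheck details, and credential values are assumptions for illustration; your actual `docker-compose.yml` is authoritative:

```yaml
services:
  db:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: root_password        # illustrative values only
      MYSQL_DATABASE: chatbotdb
      MYSQL_USER: chatbot_user
      MYSQL_PASSWORD: chatbot_password
    volumes:
      - db_data:/var/lib/mysql                  # persistent MySQL data
      - ./mental-support-db/init.sql:/docker-entrypoint-initdb.d/init.sql
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost"]
      interval: 10s
      retries: 5

  llm-service:
    build: ./mental-support-llm
    environment:
      MODEL_PATH: /app/models/tinyllama-1.1b-chat-v1.0.Q2_K.gguf
    volumes:
      - ./mental-support-llm/models:/app/models # persistent LLM model files

  api:
    build: ./mental-support-api
    env_file: ./mental-support-api/.env         # DB credentials, JWT secret, LLM URL
    ports:
      - "3001:3001"
    depends_on:
      db:
        condition: service_healthy              # wait for MySQL before starting

  ui:
    build:
      context: ./mental-support-ui
      args:
        REACT_APP_API_BASE_URL: http://localhost:3001/api  # adjust to your setup
    ports:
      - "3000:3000"

volumes:
  db_data:
```

Note how `DB_HOST=db` and `LLM_API_URL=http://llm-service:5000/chat` in the API's `.env` resolve via Docker's internal DNS to the service names defined here.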