
Mental Support Chat System

This project comprises a multi-service application designed to provide a mental support chatbot experience. The system consists of a frontend User Interface (UI), a backend API, a dedicated MySQL database, and an external Large Language Model (LLM) service. It is orchestrated with Docker Compose for streamlined setup and deployment.

✨ Overview

The Mental Support Chat system allows users to register, log in, and engage in chat conversations with an AI bot powered by a Large Language Model. User data and chat history are stored persistently. The system architecture separates concerns into distinct services:

  • UI: The user-facing web application.
  • API: The backend layer handling user authentication, chat logic, and orchestrating interactions between the UI, Database, and LLM Service.
  • DB: A dedicated MySQL database for storing application data.
  • LLM Service: An external service responsible for running the language model and generating bot responses.

The UI communicates with the API, which in turn interacts with the Database for data storage and the LLM Service for generating chat responses. Docker Compose manages the lifecycle and networking of these interconnected services.
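As a minimal sketch, this topology might look roughly as follows in docker-compose. The service names (ui, api, db, llm-service) match those referenced later in this README; the build contexts, image tag, and volume paths are assumptions, not the project's actual compose file:

```yaml
version: "3.8"
services:
  ui:
    build: ./mental-support-ui        # build context is an assumption
    ports:
      - "3000:3000"                   # UI exposed on localhost:3000 (see Setup)
    depends_on:
      - api
  api:
    build: ./mental-support-api
    depends_on:
      db:
        condition: service_healthy    # API waits for the DB health check
  db:
    image: mysql:8.0
    volumes:
      - ./mental-support-db/init.sql:/docker-entrypoint-initdb.d/init.sql
      - db_data:/var/lib/mysql        # persistent MySQL data
  llm-service:
    build: ./mental-support-llm
    volumes:
      - ./mental-support-llm/models:/app/models   # .gguf model files
volumes:
  db_data:
```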

🚀 Key Features

  • User Authentication: Secure user registration, login, and session management using JWT. Includes a password recovery mechanism via security questions.
  • Chat Functionality: Users can send messages to the bot, view conversation history, and clear their chat sessions.
  • AI-Powered Responses: Integration with an external LLM service to generate empathetic and supportive bot responses based on chat context.
  • Chat History Persistence: All chat messages and user data are stored reliably in a MySQL database.
  • Chat Export: Ability to export chat conversation history as a PDF document.
  • User Account Management: Functionality for authenticated users to update their profile details and delete their account.
  • Containerized Deployment: The entire system is designed to be built and run easily using Docker and Docker Compose.

🌐 Technologies Used

The system leverages a variety of technologies across its components:

  • Frontend (UI): React, TypeScript, Tailwind CSS, React Router DOM, React Hook Form, Headless UI, React Aria/Stately, Framer Motion, Axios.
  • Backend (API): Node.js, Express.js, MySQL (via mysql2 and Sequelize), bcryptjs, jsonwebtoken, uuid, express-validator, pdfkit, axios, dotenv, morgan, cors.
  • Database (DB): MySQL 8.0.
  • LLM Service: Python, Flask, llama-cpp-python.
  • Orchestration: Docker, Docker Compose.

🔧 Setup and Installation

The entire system can be easily set up and run using Docker Compose.

Prerequisites

  • Docker and Docker Compose must be installed on your system.

Steps

  1. Configure Environment Variables for the API: Navigate to the mental-support-api project directory and create a .env file if one doesn't exist. Populate it with the necessary configuration, ensuring DB_HOST and LLM_API_URL point to the service names defined in the docker-compose.yml.

    # Inside mental-support-api/.env
    NODE_ENV=development
    PORT=3001
    
    DB_HOST=db # Service name in docker-compose.yml
    DB_PORT=3306
    DB_USER=chatbot_user
    DB_PASSWORD=chatbot_password
    DB_NAME=chatbotdb
    
    JWT_SECRET=YOUR_SUPER_SECURE_RANDOM_JWT_KEY # <<< Replace with a strong, unique key
    JWT_EXPIRES_IN=7d
    
    LLM_API_URL=http://llm-service:5000/chat # Service name and port in docker-compose.yml

    It is critical to replace YOUR_SUPER_SECURE_RANDOM_JWT_KEY with a strong, unique secret.
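One convenient way to generate such a key, assuming openssl is available on your machine, is:

```shell
# Generate 64 random bytes as 128 hex characters, suitable for JWT_SECRET
openssl rand -hex 64
```

Paste the output into the JWT_SECRET line of your .env file.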

  2. Prepare Database Initialization Script: Ensure the init.sql file containing the database schema (CREATE TABLE statements for Users, Chats, Messages) is located at the path referenced in the docker-compose.yml volume mount for the db service (e.g., ./mental-support-db/init.sql). This script will automatically set up the database schema when the db container starts for the first time.
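If you need to recreate init.sql from scratch, its shape is roughly the following. This is an illustrative sketch inferred from the features described above; the exact column names and types are assumptions, not the project's actual schema:

```sql
-- Illustrative schema sketch; real column names/types may differ.
CREATE TABLE IF NOT EXISTS Users (
  id CHAR(36) PRIMARY KEY,               -- uuid is listed among the API dependencies
  email VARCHAR(255) NOT NULL UNIQUE,
  password_hash VARCHAR(255) NOT NULL,   -- bcryptjs hash
  security_question VARCHAR(255),        -- used for password recovery
  security_answer_hash VARCHAR(255),
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE IF NOT EXISTS Chats (
  id CHAR(36) PRIMARY KEY,
  user_id CHAR(36) NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (user_id) REFERENCES Users(id) ON DELETE CASCADE
);

CREATE TABLE IF NOT EXISTS Messages (
  id CHAR(36) PRIMARY KEY,
  chat_id CHAR(36) NOT NULL,
  sender ENUM('user', 'bot') NOT NULL,
  content TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (chat_id) REFERENCES Chats(id) ON DELETE CASCADE
);
```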

  3. Obtain the LLM Model File: Download your desired Llama language model file (in .gguf format) and place it in the models directory within your mental-support-llm project directory. Ensure the MODEL_PATH environment variable in the docker-compose.yml for the llm-service correctly points to this file within the container (e.g., /app/models/your_model_file.gguf). The provided compose file expects /app/models/tinyllama-1.1b-chat-v1.0.Q2_K.gguf.

  4. Run with Docker Compose: Navigate to the directory containing the main docker-compose.yml file and run the following command:

    docker-compose up --build -d
    • up: Starts all services defined in the compose file.
    • --build: Builds the Docker images for the services (UI, API, LLM Service) based on their Dockerfiles before starting containers.
    • -d: Runs the containers in detached mode (in the background).
  5. Verify Services are Running: You can check the status of all running containers:

    docker-compose ps

    You should see containers for ui, api, db, and llm-service listed with Up status. The api service declares a service_healthy dependency on the db service, so the API does not start until the database's health check passes.

  6. Access the Application: Once all services are healthy and running, open your web browser and navigate to http://localhost:3000. This is the port where the UI service is exposed.

📄 API Endpoints

The backend API service (mental-support-api) provides the interface for the UI and interacts with the database and LLM service. Its endpoints are typically accessed under the /api path.

  • /api/auth/*: Authentication and user management endpoints.
  • /api/chat/*: Chat message and history management endpoints.
  • /api/user/*: Authenticated user profile management endpoints.

Refer to the mental-support-api project's README for a detailed list and description of its API endpoints.
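As a rough illustration of how a client talks to these endpoints, the requests below show the general shape. The exact paths, field names, response bodies, and whether port 3001 is published to the host are all assumptions; consult the API README for the real contract:

```shell
# Register a new user (field names are illustrative)
curl -X POST http://localhost:3001/api/auth/register \
  -H "Content-Type: application/json" \
  -d '{"username": "alice", "password": "s3cret"}'

# Log in to receive a JWT
curl -X POST http://localhost:3001/api/auth/login \
  -H "Content-Type: application/json" \
  -d '{"username": "alice", "password": "s3cret"}'

# Use the token on a protected chat endpoint
curl http://localhost:3001/api/chat/history \
  -H "Authorization: Bearer <token-from-login-response>"
```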

⚙️ Configuration

System-wide configuration is primarily managed through environment variables and the docker-compose.yml file:

  • API (mental-support-api): Configured via its .env file for database credentials, JWT secret, and the LLM service URL.
  • Database (db): Configured via environment variables directly in docker-compose.yml for root password, database name, user, and password.
  • LLM Service (llm-service): Configured via environment variables in docker-compose.yml for the model path and inference parameters.
  • UI (ui): Configured via build-time environment variables in docker-compose.yml (REACT_APP_API_BASE_URL) to point to the API service within the Docker network.

The docker-compose.yml file itself defines the service dependencies, port mappings, and volumes needed for persistent data storage (MySQL data and LLM models).
