Centauri Chat Service is a lightweight, scalable WebSocket-based AI chat service. It maintains multiple conversational sessions in memory, with a focus on modularity and extensibility. Built with FastAPI, it follows a Hexagonal Architecture to ensure flexibility and maintainability.
The service uses an in-memory singleton pattern to store data, suitable for low-demand scenarios. However, the architecture is designed to allow easy integration with Redis or traditional databases if higher scalability is required.
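The in-memory singleton described above can be sketched roughly as follows; the class and method names here are illustrative, not taken from the actual codebase:

```python
# Hypothetical sketch of an in-memory singleton session store.
class InMemorySessionStore:
    """Single shared object holding chat histories keyed by chat_id."""

    _instance = None

    def __new__(cls):
        # Create the instance on first use; every later call returns it.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance._sessions = {}
        return cls._instance

    def append_message(self, chat_id: str, message: str) -> None:
        # Store a message under its conversation id.
        self._sessions.setdefault(chat_id, []).append(message)

    def get_history(self, chat_id: str) -> list[str]:
        # Return a copy so callers cannot mutate internal state.
        return list(self._sessions.get(chat_id, []))
```

Because every construction returns the same instance, all handlers share one store; swapping in Redis would mean replacing this class with an adapter exposing the same methods.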
- WebSocket Support: Real-time communication through `/chat/` and `/chat/{chat_id}` endpoints.
- Session Management: Multiple conversations stored and managed in memory.
- Health Check Endpoint: `/check` to verify service status.
- Hexagonal Architecture: Clear separation of concerns, enabling future extensibility.
- In-Memory Storage: Uses a singleton pattern for low-demand use cases.
- Extendable to Redis or Databases: Ready for scaling with minimal changes.
- Python: Version 3.9 or higher.
- Docker: Ensure Docker is installed and running.
- Make: For convenience in executing commands (optional, but recommended).
Follow these steps to get the project up and running:
```bash
git clone [email protected]:Endika/Centauri.git
cd Centauri
echo "OPENAI_API_KEY=your-token" > .env
make build
make up
# or
docker-compose up --build -d
```

```bash
make chat
# or
docker exec -it centauri-client wscat -c ws://centauri:8000/chat/
```

```bash
make chat-flight
# or
docker exec -it centauri-client wscat -c ws://centauri:8000/chat/flight_attendant
```