The Chat UI project is a minimalist and efficient way to create a fully functional chatbot interface. It integrates with various backend solutions, including OpenAI, HuggingFace, and more, all with minimal setup. This repository allows for quick experimentation with chatbot functionality while maintaining simplicity.
- OpenAI-format compatibility: easily integrates with `HuggingFace`, `vLLM`, and other backends.
- Multiple response formats: supports `OpenAI`, `Cloudflare AI`, and `plain text` responses without additional configuration.
- Custom backend support: configure your own endpoints for universal chatbot usage across projects.
- Chat history download: save chat history for future reference or testing.
- Multimodal support: send image inputs to vision models.
- Markdown support: toggle between `original format` and `Markdown` display.
- Internationalization: full support for localization (`i18n`) to reach a wider audience.
Option 1: Go to the AIQL demo

The demo uses `Llama-3.2` by default; image upload is only supported for vision models.
Option 2: Download Index and open it locally (recommended)
Option 3: Download the Index and deploy it with Python

```shell
cd /path/to/your/directory
python3 -m http.server 8000
```

Then open your browser and go to http://localhost:8000
Option 4: Fork this repo and link it to Cloudflare Pages
Option 5: Deploy your own Chatbot by Docker
```shell
docker run -p 8080:8080 -d aiql/chat-ui
```

Option 6: Deploy within Hugging Face

Don't forget to add `app_port: 8080` in `README.md`
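On Hugging Face Spaces, the port is declared in the YAML front matter at the top of the Space's `README.md`. A minimal sketch for a Docker Space (the `title` value is illustrative):

```yaml
---
title: Chat UI        # illustrative name for the Space
sdk: docker           # run the Space from the Dockerfile/image
app_port: 8080        # port the container listens on
---
```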
Option 7: Deploy within K8s
By default, the Chat UI uses OpenAI's API format. You can point it at other vendors by configuring the API Key and Endpoint.
- Download the configuration template from the example folder.
- Insert your own API Key for quick configuration.
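As a sketch of what the OpenAI-compatible format looks like on the wire, the snippet below builds the headers and request body such an endpoint expects (the API key and model name are placeholders, not values from this project):

```python
import json

def build_chat_request(api_key: str, model: str, user_message: str):
    """Build headers and body for an OpenAI-format chat completion request."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # API Key from the configuration
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return headers, body

headers, body = build_chat_request("sk-...", "my-model", "Hello!")
print(json.dumps(body))
```

Any backend that accepts this shape (vLLM, HuggingFace TGI in OpenAI mode, etc.) can be used as a custom endpoint.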
If you're having trouble accessing the page or experiencing issues, try the following:
- Click the `Refresh` icon in the upper-right corner of the Interface Configuration.
- Click the hidden button on the right side of the index page.
- Click the `Reset All Config` icon.
- Right-click on the page and open the `Network` section.
- Clear your browser's cache and cookies to ensure you're using the latest version.
- Check the browser's `Network` section to find any failing resources and see if the issue is location-specific.
- Introduce the image as a sidecar container:

```yaml
spec:
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: chat-ui
          image: aiql/chat-ui
          ports:
            - containerPort: 8080
```

- Add a Service:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: chat-ui-service
spec:
  selector:
    app: my-app
  ports:
    - protocol: TCP
      port: 8080
      targetPort: 8080
  type: LoadBalancer
```

- You can access the port directly or add an Ingress:

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: my-app-ingress
  annotations:
    nginx.ingress.kubernetes.io/rewrite-target: /$1
spec:
  rules:
    - host: chat-ui.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: chat-ui-service
                port:
                  number: 8080
```

Author: Haider Manzoor
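Assuming the manifests above are saved to a single file (the filename is illustrative), they can be applied and checked like this:

```shell
# Apply the Deployment patch, Service, and Ingress
kubectl apply -f chat-ui.yaml

# Verify the Service received an external IP (LoadBalancer type)
kubectl get service chat-ui-service

# Once DNS points chat-ui.example.com at the Ingress, the UI should answer over HTTP
curl -I http://chat-ui.example.com/
```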
