A single-file tkinter-based Ollama GUI project with no external dependencies.
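As a rough illustration of the approach, the sketch below drives a local Ollama server from plain tkinter using only the standard library. The endpoint URL, the /api/generate request format, and the model name reflect a typical default Ollama install; they are assumptions for this sketch, not code taken from the project.

```python
import json
import tkinter as tk
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed default Ollama endpoint
MODEL = "llama3"  # placeholder; use any model already pulled locally

def ask_model():
    """Send the prompt to Ollama and show the full (non-streaming) reply."""
    payload = json.dumps(
        {"model": MODEL, "prompt": entry.get(), "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    # Blocks the UI for the duration of the request; a real app would use a thread.
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())["response"]
    output.delete("1.0", tk.END)
    output.insert(tk.END, reply)

root = tk.Tk()
root.title("Ollama GUI sketch")
entry = tk.Entry(root, width=60)
entry.pack(padx=8, pady=4)
tk.Button(root, text="Ask", command=ask_model).pack(pady=4)
output = tk.Text(root, height=20, width=80)
output.pack(padx=8, pady=4)
root.mainloop()
```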
Easy to set up, self-hosted, multi-model AI chatbot & API server.
A lightweight system tray application that monitors NVIDIA GPU VRAM usage in real time. Perfect for AI/ML developers, LLM users, and anyone who needs to keep an eye on their GPU memory usage without opening Task Manager.
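The polling at the heart of such a monitor can be sketched by querying nvidia-smi; the flags below are standard nvidia-smi options, while the two-second interval and console output stand in for the tray icon and are only illustrative.

```python
import subprocess
import time

def vram_usage():
    """Return a list of (used_MiB, total_MiB) tuples, one per GPU, via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [tuple(int(v) for v in line.split(",")) for line in out.strip().splitlines()]

if __name__ == "__main__":
    while True:  # a tray app would update an icon/tooltip here instead of printing
        for i, (used, total) in enumerate(vram_usage()):
            print(f"GPU {i}: {used}/{total} MiB ({100 * used / total:.0f}%)")
        time.sleep(2)
```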
A desktop application built with customtkinter that provides an interactive chat interface for local Large Language Models (LLMs) served via Ollama.
Gradio ChatUI for Ollama with basic RAG.
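A minimal sketch of a Gradio chat front end wired to Ollama, with the retrieval (RAG) step omitted. gr.ChatInterface is Gradio's built-in chat component; the endpoint, model name, and tuple-style history format are assumptions about a typical local setup.

```python
import gradio as gr
import requests

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # assumed local Ollama server

def respond(message, history):
    """Convert Gradio's (user, assistant) history pairs into Ollama chat messages."""
    messages = []
    for user_msg, bot_msg in history:  # assumes Gradio's default tuple-style history
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    resp = requests.post(
        OLLAMA_CHAT_URL,
        json={"model": "llama3", "messages": messages, "stream": False},
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

gr.ChatInterface(fn=respond, title="Ollama ChatUI sketch").launch()
```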