A simple and lightweight chat interface written in Vue for Ollama models, connecting to the Ollama server running on the client's machine.
Warning
This project is not finished yet; it is only a testing demo. Views and components are not finalized, not optimized, and currently do not support chat history or session management. This is a minimal proof of concept for real-time interaction with Ollama models.
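As a rough illustration of what "real-time interaction" means here, the sketch below streams a chat completion from a local Ollama server over its /api/chat endpoint. The server URL (http://localhost:11434), the model name ("llama3"), and the streamChat helper are assumptions for illustration, not part of this repository.

```ts
// Minimal sketch: stream a chat response from a local Ollama server.
// Assumptions: Ollama is listening on http://localhost:11434 and the
// "llama3" model has already been pulled; adjust both to your setup.
async function streamChat(prompt: string, onToken: (token: string) => void): Promise<void> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",                                   // assumed model name
      messages: [{ role: "user", content: prompt }],
      stream: true,                                      // Ollama streams newline-delimited JSON
    }),
  });

  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is one JSON chunk; keep any partial line in the buffer.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      if (chunk.message?.content) onToken(chunk.message.content);
    }
  }
}

// Example usage: append each token to the chat view as it arrives.
streamChat("Why is the sky blue?", (token) => console.log(token));
```

Because Ollama streams newline-delimited JSON, each line can be parsed independently and tokens can be appended to the chat view as they arrive, which is what gives the interface its real-time feel.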
Visit Local Ollama for a demo preview. You can freely use this Git source and host it for free on Cloudflare Pages.
Fix it using this tutorial.