
Local Ollama

A simple, lightweight chat interface written in Vue that talks to the Ollama models running on the client's own Ollama server.
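
As a rough sketch of the kind of request the interface depends on, the snippet below asks a local Ollama server for its installed models via the `/api/tags` endpoint. The default port 11434 and the response shape follow Ollama's documented API; the function name is illustrative and not taken from this repository.

```ts
// Minimal sketch: list the models installed on a local Ollama server.
// Assumes Ollama is running on its default port 11434.
// `listLocalModels` is an illustrative name, not code from this repository.
interface OllamaModel {
  name: string;
  size: number;
  modified_at: string;
}

async function listLocalModels(baseUrl = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with status ${res.status}`);
  }
  const data: { models: OllamaModel[] } = await res.json();
  return data.models.map((m) => m.name);
}

// Usage: the returned names could populate a model picker in the UI.
listLocalModels().then((names) => console.log("Available models:", names));
```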

Warning

This project is not finished yet; it is a testing demo. Views and components are not finalized or optimized, and there is currently no support for chat history or session management. It is a minimal proof of concept for real-time interaction with Ollama models.
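
For context on what real-time interaction means here, a streaming request against Ollama's `/api/chat` endpoint looks roughly like the sketch below. The endpoint and newline-delimited JSON framing follow Ollama's documented API; the helper name, model name, and prompt are illustrative, not taken from this repository.

```ts
// Minimal sketch of streaming a chat reply from a local Ollama server
// (Node 18+ or a browser). /api/chat emits newline-delimited JSON chunks
// until a final object with done: true.
// `streamChat` is an illustrative helper, not code from this repository.
async function streamChat(
  model: string,
  prompt: string,
  onToken: (text: string) => void,
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? ""; // keep any incomplete trailing line
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      // Each chunk carries a fragment of the assistant's reply.
      if (chunk.message?.content) onToken(chunk.message.content);
    }
  }
}

// Usage: in a Vue UI each token would append to reactive state;
// here it just prints the fragments as they arrive.
streamChat("llama3", "Say hello in one sentence.", (t) => console.log(t));
```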

Demo

Visit Local Ollama for a demo preview. You can freely use this source and host it for free on Cloudflare Pages.

Models not showing? (Ollama CORS error)

Fix it by following this tutorial.
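
The usual cause is that the Ollama server rejects cross-origin requests from the page. Assuming the fix is the one the linked tutorial describes, allowing the page's origin through Ollama's `OLLAMA_ORIGINS` environment variable, a client can at least surface a hint when the model list cannot be fetched. The function name below is illustrative, not taken from this repository.

```ts
// Minimal sketch: detect a failed model fetch (in a browser a CORS block
// shows up as a TypeError from fetch) and surface a hint to the user.
// `fetchModelsWithCorsHint` is an illustrative name, not from this repository.
async function fetchModelsWithCorsHint(baseUrl = "http://localhost:11434"): Promise<string[]> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`);
    const data: { models: { name: string }[] } = await res.json();
    return data.models.map((m) => m.name);
  } catch {
    console.warn(
      "Could not reach Ollama. If the server is running, this is usually a CORS block:",
      "allow this page's origin via the OLLAMA_ORIGINS environment variable and restart Ollama.",
    );
    return [];
  }
}
```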

