A Python application that provides a user-friendly interface for interacting with Qwen language models running locally via Ollama.
- Chat with Qwen language models running locally via Ollama
- Select from available Qwen models
- Customize system prompts to set the context for the model
- Save and load chat histories
- Clean, intuitive Streamlit interface
- Python 3.8+
- Ollama installed and running locally (default: http://localhost:11434)
- A Qwen model installed in Ollama (qwen:4b, qwen2:4b, qwen3:4b, or other variants)
- Clone this repository:

  ```bash
  git clone <repository-url>
  cd <repository-directory>
  ```

- Install the required Python packages:

  ```bash
  pip install -r requirements.txt
  ```
- Clone this repository:

  ```bash
  git clone <repository-url>
  cd <repository-directory>
  ```

- Install the package:

  ```bash
  pip install -e .
  ```

  This will install the application and its dependencies, and create a command-line entry point.
- Make sure Ollama is installed and running. You can download it from Ollama's website.

- Install a Qwen model in Ollama:

  ```bash
  ollama pull qwen:4b
  ```

  (You can replace `qwen:4b` with any other Qwen model variant available in Ollama.)
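To confirm that Ollama is reachable and that a Qwen model is installed, you can query Ollama's `/api/tags` endpoint, which lists locally installed models. The helper below is a minimal sketch using only the standard library; the function name and return convention are illustrative, not the application's actual code:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError


def list_qwen_models(base_url="http://localhost:11434"):
    """Return names of installed Qwen models, or None if Ollama is unreachable."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
    except (URLError, OSError):
        return None  # Ollama is not running at base_url
    return [m["name"] for m in data.get("models", []) if m["name"].startswith("qwen")]
```

If this returns an empty list, Ollama is running but no Qwen model has been pulled yet; if it returns `None`, Ollama itself is not reachable.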
- Start the application.

  If you installed from source (Option 1):

  ```bash
  python run.py
  ```

  Or directly with Streamlit:

  ```bash
  streamlit run app.py
  ```

  If you installed as a Python package (Option 2):

  ```bash
  qwen-chatbot
  ```

- The application will open in your web browser. If Ollama is running and a Qwen model is installed, you'll see the chat interface.
- You can also run the test script to verify that your Ollama setup is working correctly:

  ```bash
  python test_ollama_api.py
  ```

- Use the sidebar to:
  - Select a different Qwen model
  - Customize the system prompt
  - Clear the chat history
  - Save and load chat histories

- Type your messages in the input field at the bottom of the chat window and press Enter to send them.
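The save/load feature in the sidebar stores chats as files under `saved_chats/`. A minimal sketch of how such persistence might work, assuming a simple JSON format (the function names and file layout here are illustrative, not necessarily what `chat_persistence.py` does):

```python
import json
from pathlib import Path


def save_chat(messages, name, directory="saved_chats"):
    """Write a chat history (a list of {"role", "content"} dicts) to a JSON file."""
    path = Path(directory)
    path.mkdir(exist_ok=True)  # the app creates saved_chats/ automatically
    (path / f"{name}.json").write_text(json.dumps(messages, indent=2))


def load_chat(name, directory="saved_chats"):
    """Read a previously saved chat history back into a list of messages."""
    return json.loads((Path(directory) / f"{name}.json").read_text())
```

Storing one JSON file per chat keeps histories human-readable and easy to back up or delete individually.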
The application includes error handling for common issues:
- If Ollama is not running, you'll see an error message with instructions to start it.
- If no Qwen models are found, you'll see instructions for installing one.
- If there are issues with API requests, error messages will be displayed in the chat.
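One way to implement this kind of error handling is to catch connection and HTTP failures around the Ollama request and turn them into user-facing messages instead of tracebacks. The sketch below uses Ollama's `/api/generate` endpoint; the function name and message wording are illustrative assumptions, not the application's actual code:

```python
import json
from urllib.request import urlopen, Request
from urllib.error import URLError, HTTPError


def chat_once(prompt, model="qwen:4b", base_url="http://localhost:11434"):
    """Send one prompt to Ollama; return (reply, error_message)."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = Request(f"{base_url}/api/generate", data=payload,
                  headers={"Content-Type": "application/json"})
    try:
        with urlopen(req, timeout=60) as resp:
            return json.load(resp).get("response", ""), None
    except HTTPError as e:
        # Ollama answered, but the request failed (e.g. the model is not installed)
        return None, f"Ollama returned an error (HTTP {e.code}); is '{model}' installed?"
    except (URLError, OSError):
        # Ollama is not running or not reachable at base_url
        return None, "Could not reach Ollama; start it with 'ollama serve'."
```

Returning an `(reply, error_message)` pair lets the UI layer display the error in the chat window rather than crashing the app.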
- `app.py`: Main application file with the Streamlit UI
- `ollama_api.py`: Module for communicating with the Ollama API
- `chat_persistence.py`: Module for saving and loading chat histories
- `run.py`: Script to check requirements and start the application
- `test_ollama_api.py`: Script to test the Ollama API connection
- `__init__.py`: Package initialization file
- `requirements.txt`: List of required Python packages
- `setup.py`: Setup script for installing the application as a Python package
- `MANIFEST.in`: Manifest file for package distribution
- `LICENSE`: MIT License file
- `.gitignore`: Git ignore file
- `saved_chats/`: Directory where chat histories are saved (created automatically)