Command line tool to summarize text files using AI with a local Ollama instance. The project is mainly developed with AI itself, using Visual Studio Code, the Continue.dev extension, a local Ollama instance, and Python. So it is more of a playground for developing with an AI assistant (as agent and chat), using AI tools, and learning about AI itself.
- Summarize text files using AI
  - `--model MODEL` argument to select the model (default `llama3.1:8b`)
  - `--file FILE` argument to select a file
  - `--text "a text"` argument to provide a text directly
  - `--question QUESTION` argument to select the question (default `Summarize the following text:`)
  - `--chunked` argument to chunk the input file/text based on a (currently hardcoded) chunk size
- List all available models with `--list-models`/`-lsm`
- Set the Ollama host with the optional `--host HOST` argument. Default is `http://127.0.0.1:11434`
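The flag set above can be reconstructed with `argparse`. This is a hypothetical sketch, not the project's actual source: the flag names and defaults come from the list above, while the help texts and the `build_parser` function name are assumptions.

```python
import argparse

def build_parser():
    # Hypothetical reconstruction of the CLI described above.
    parser = argparse.ArgumentParser(
        description="Summarize text files with a local Ollama instance")
    parser.add_argument("--model", default="llama3.1:8b",
                        help="Ollama model to use")
    parser.add_argument("--file", help="path of the text file to summarize")
    parser.add_argument("--text", help="summarize this text instead of a file")
    parser.add_argument("--question", default="Summarize the following text:",
                        help="question/prompt prefixed to the input")
    parser.add_argument("--chunked", action="store_true",
                        help="split the input into fixed-size chunks")
    parser.add_argument("--list-models", "-lsm", action="store_true",
                        help="list models available on the Ollama host")
    parser.add_argument("--host", default="http://127.0.0.1:11434",
                        help="Ollama host URL")
    return parser

args = build_parser().parse_args(["--file", "example/python-ai.md"])
print(args.model)  # → llama3.1:8b
```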
To summarize a text file using the default settings, or to vary the question, model, or input, you can run:

```shell
# default settings
python src/main.py --file example/python-ai.md
# custom question
python src/main.py --file example/python-ai.md --question "Count how many times AI is mentioned in this text:"
# different model
python src/main.py --file example/python-ai.md --model qwen2.5
# inline text instead of a file
python src/main.py --text "strawberry" --question "How many letter 'R' are in this text:"
```
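Under the hood, each of these invocations presumably combines the question and the input into one prompt and posts it to Ollama's `/api/generate` endpoint. A minimal sketch of that flow; the prompt separator and the function names are assumptions, only the endpoint and payload fields follow Ollama's documented API:

```python
import json
import urllib.request

def build_payload(model, question, text):
    # Assumption: the tool joins the question and the input text into a
    # single prompt; the exact separator is illustrative.
    return {"model": model, "prompt": f"{question}\n\n{text}", "stream": False}

def summarize(host, model, question, text):
    # Posts the payload to the local Ollama instance and returns the answer.
    req = urllib.request.Request(
        host + "/api/generate",
        data=json.dumps(build_payload(model, question, text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `summarize("http://127.0.0.1:11434", "llama3.1:8b", "Summarize the following text:", text)` requires a running Ollama instance.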
List all available models:
```shell
python src/main.py --list-models
```
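The `--list-models` flag likely queries Ollama's `GET /api/tags` endpoint, which returns the locally installed models. A small sketch of extracting the model names from that response; the `model_names` helper and the sample data are illustrative, not real output:

```python
def model_names(tags_response):
    # Ollama's /api/tags endpoint returns {"models": [{"name": ...}, ...]};
    # this pulls out just the model names.
    return [m["name"] for m in tags_response.get("models", [])]

sample = {"models": [{"name": "llama3.1:8b"}, {"name": "qwen2.5:latest"}]}
print(model_names(sample))  # → ['llama3.1:8b', 'qwen2.5:latest']
```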
- Install Python (3.8 or higher)
- Clone this repository
- Run `pip install -r requirements.txt`
- Run `python src/main.py`
- Ollama installed and running locally
- Optional: the default model is `llama3.1:8b`; to use a different one, it is necessary to specify it with the `--model` argument.
Feel free to contribute via pull requests.
The project is licensed under the Apache 2.0 license.