WebAISum is a Python script that allows you to summarize web pages using AI models. It supports both local models like Ollama and remote services like OpenAI.
## Features

- Summarize web pages using AI models
- Support for local models (Ollama) and remote services (OpenAI)
- Customizable model selection
- Debug mode for verbose output
## Requirements

- Python 3.6 or higher
- `requests` library
- `langchain_community` library
- `langchain_openai` library
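The `requests` and LangChain packages above hint at the general approach. Below is a minimal, hypothetical sketch of how a page could be summarized with either backend; it is not the actual `webaisum.py` implementation, and the prompt, text extraction, and model names shown are illustrative assumptions:

```python
# Hypothetical sketch, not the actual webaisum.py code.
import re
import sys

import requests
from langchain_community.llms import Ollama
from langchain_openai import ChatOpenAI


def summarize(url: str, use_openai: bool = False) -> str:
    html = requests.get(url, timeout=30).text
    # Crude tag stripping for illustration; a real tool would extract text properly.
    text = re.sub(r"<[^>]+>", " ", html)[:8000]
    prompt = f"Summarize the following web page content:\n\n{text}"
    if use_openai:
        # Requires the OPENAI_API_KEY environment variable to be set.
        return ChatOpenAI(model="gpt-4-turbo").invoke(prompt).content
    # Talks to a local Ollama server (default http://localhost:11434).
    return Ollama(model="llama3", base_url="http://localhost:11434").invoke(prompt)


if __name__ == "__main__":
    print(summarize(sys.argv[1], use_openai="--use-openai" in sys.argv))
```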
## Installation

- Clone the repository:

```
git clone https://github.com/dkruyt/webaisum.git
```

- Install the required dependencies:

```
pip install -r requirements.txt
```
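Alternatively, the packages listed under Requirements can be installed directly (the repository's requirements.txt may pin additional packages or specific versions):

```
pip install requests langchain_community langchain_openai
```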
## Usage

To summarize a web page, run the `webaisum.py` script with the desired URL:

```
python webaisum.py https://example.com
```

### Options

- `--model`: Specify the AI model to use (default: `llama3` for Ollama or `gpt-4-turbo` for OpenAI)
- `--server`: Specify the base URL of the remote AI server to use (for Ollama)
- `--debug`: Enable debug mode to print verbose output
- `--use-openai`: Use OpenAI instead of Ollama for summarization
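For instance, to print verbose diagnostics while summarizing with the default model:

```
python webaisum.py https://example.com --debug
```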
## Examples

Summarize a web page using the default Ollama model:

```
python webaisum.py https://example.com
```

Summarize a web page using a specific Ollama model and remote server:

```
python webaisum.py https://example.com --model llama2 --server http://ollama-server.com
```

Summarize a web page using OpenAI:

```
python webaisum.py https://example.com --use-openai
```

Summarize a web page using OpenAI with a specific model:

```
python webaisum.py https://example.com --use-openai --model gpt-3.5-turbo-16k
```

## Contributing

Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.
## License

This project is licensed under the MIT License.