A self-hosted, privacy-first document ingestion service that captures and processes documents, extracts metadata, and archives them to Paperless-ngx.
Updated Feb 1, 2026 - Python
Run AI models offline without relying on internet access or cloud infrastructure.
Ansible collection for deploying vLLM on AMD Ryzen AI Max "Strix Halo" (gfx1151) APUs. Supports toolbox and systemd service modes, kernel tuning, model prefetching, and an optional Open WebUI frontend.
Download, manage, and chat with LLMs, completely private and local.