- 🎮 Discord: https://discord.gg/pcPuWcbY
- 📺 YouTube: https://www.youtube.com/@VibrantEinstein
A deep private RAG core for macOS/iOS, inspired by the depths of the ocean and now empowered with multi-modal intelligence.
- 🛡️ Privacy-first, fully on-device RAG with extended support for visual data (images, diagrams, charts)
- 💻 Runs natively on Apple Silicon (macOS/iOS, M1+) with full feature parity across platforms
- 🐟 Pre-packaged with embedded models and sample "RAGfish DB" for instant exploration
- 🗂️ Supports user-imported RAGpacks (chunked & embedded docs, ZIP format; a loading sketch follows this list)
- Multi-RAGpack support
- Threaded Q&A history
- Full Apple Silicon optimization
- ⚡️ Fast local search, summarization, and QA with no cloud and no tracking
- 🖼️ Multi-modal capabilities: image recognition, diagram/chart interpretation, and visual search
- 🚀 New model support: Jan-v1-4B/4V with GGUF model option for enhanced performance and flexibility
- Future plans: web & cloudless multi-user collaboration
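To make the RAGpack format concrete, here is a minimal loading sketch. The file names inside the ZIP (`embeddings.npy`, `chunks.csv`) mirror the pipeline's export formats but are assumptions here; see the RAGfish spec for the authoritative layout:

```python
import io
import zipfile

import numpy as np

# Minimal RAGpack loading sketch. The file names inside the ZIP are
# assumptions for illustration; see the RAGfish spec for the real layout.
def load_ragpack(path: str):
    with zipfile.ZipFile(path) as zf:
        print("RAGpack contents:", zf.namelist())
        # Chunk embeddings, one row per chunk (assumed .npy export)
        embeddings = np.load(io.BytesIO(zf.read("embeddings.npy")))
        # Chunked source text (assumed .csv export, one chunk per row)
        chunks = zf.read("chunks.csv").decode("utf-8").splitlines()
    assert len(chunks) == embeddings.shape[0], "chunk/embedding count mismatch"
    return chunks, embeddings

chunks, embeddings = load_ragpack("sample_ragpack.zip")
print(f"{len(chunks)} chunks, embedding dimension {embeddings.shape[1]}")
```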
- Download and launch the NoesisNoema app for macOS/iOS
- (Optional) Preprocess your own docs and images using the RAGpack Colab Preprocessor Notebook from the noesisnoema-pipeline project, download the RAGpack ZIP, and import it in-app
- Start asking multi-modal questions in natural language or with images. Enjoy!
Note: Official release builds are coming soon. For now, use the main branch of the NoesisNoema app. See RAGfish for specs and dev tools.
RAGfish is the core RAGpack specification and toolkit.
It is designed to be used with the following companion projects:
- Reference implementation: A private RAG client for macOS and iOS that leverages the RAGfish RAGpack format with full feature parity.
  - Features:
    - Full offline, on-device retrieval-augmented generation (RAG) with multi-modal query support (text + image)
    - QA history & thread management (your questions are stored and retrievable)
    - Multi-RAGpack support (import as many knowledge packs as you want)
    - Modern two-pane UI (QA history/threads, detail view, future extensibility)
    - Privacy-first, Apple Silicon optimized, now including visual data privacy
  - Get started:
    - Download NoesisNoema
    - Import RAGpacks (sample or your own)
    - Ask questions, explore knowledge, upload images, and get instant answers with cited chunks and visual insights (a sketch of the answer step follows this section)
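As a hint of how the cited-chunk answer step could work with a GGUF model, here is a hedged sketch using llama-cpp-python. The model file name, prompt template, and helper function are illustrative assumptions, not the app's actual internals:

```python
from llama_cpp import Llama  # llama-cpp-python, a common GGUF runtime

# Sketch of the answer step: retrieved chunks are numbered and stitched into
# the prompt so the model can cite them. Model path and prompt format are
# assumptions for illustration only.
llm = Llama(model_path="jan-v1-4b.Q4_K_M.gguf", n_ctx=4096)

def answer(question: str, retrieved_chunks: list[str]) -> str:
    # Number each chunk so the model can reference it as a citation
    context = "\n\n".join(
        f"[{i + 1}] {chunk}" for i, chunk in enumerate(retrieved_chunks)
    )
    prompt = (
        "Answer using only the numbered context below and cite chunk numbers.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    out = llm(prompt, max_tokens=256)
    return out["choices"][0]["text"].strip()
```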
- Reference pipeline: Create your own RAGpack (.zip) from PDF/text/images using Google Colab or the CLI.
  - Features:
    - Chunking & embedding for any document (English or multilingual) and for images (including diagrams and charts)
    - Outputs a compatible RAGpack (.zip) for direct import in RAGfish/NoesisNoema
    - Supports export of both `.npy` and `.csv` for easy integration (see docs)
    - Fast prototyping: no local environment required when using Colab
  - Get started:
    - Go to noesisnoema-pipeline
    - Follow the notebook or CLI steps to generate your own multi-modal RAGpack from PDF, text, or images
    - Download and import the RAGpack (.zip) in NoesisNoema or RAGfish (a minimal build sketch follows these steps)
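For a feel of what the pipeline does, here is a minimal build sketch: chunk a text file, embed the chunks, and export both `.npy` and `.csv` into a RAGpack-style ZIP. The embedding model (sentence-transformers `all-MiniLM-L6-v2`), the naive chunking strategy, and the file names are assumptions for illustration; the real notebook and CLI may differ:

```python
import zipfile

import numpy as np
from sentence_transformers import SentenceTransformer  # assumed embedding backend

def build_ragpack(text: str, out_path: str, chunk_size: int = 500) -> None:
    # 1. Chunk: naive fixed-size character windows, for illustration only
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    # 2. Embed every chunk (model choice is an assumption, not the pipeline's)
    model = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = model.encode(chunks)  # shape: (n_chunks, dim)
    # 3. Export both formats and bundle them as a RAGpack-style ZIP
    np.save("embeddings.npy", embeddings)
    with open("chunks.csv", "w", encoding="utf-8") as f:
        f.write("\n".join(c.replace("\n", " ") for c in chunks))
    with zipfile.ZipFile(out_path, "w") as zf:
        zf.write("embeddings.npy")
        zf.write("chunks.csv")

with open("mydoc.txt", encoding="utf-8") as f:
    build_ragpack(f.read(), "my_ragpack.zip")
```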
RAGfish consists of three main parts:
- The pipeline (for preprocessing and generating multi-modal RAGpacks)
- The RAGfish core engine (which performs fast local retrieval and QA across text and visual data)
- User-facing apps (NoesisNoema for macOS/iOS)
All knowledge flows through the RAGpack format for seamless integration and privacy. The architecture now embraces a multi-modal flow, combining text embeddings with visual embeddings extracted from images, diagrams, and charts. This enables rich, context-aware retrieval-augmented generation that understands and reasons over both textual and visual information—all fully on-device and privacy-preserving.
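As a conceptual sketch of that multi-modal flow, the snippet below ranks one mixed index of text and visual embeddings against a query vector, assuming a shared (CLIP-style) embedding space; the actual engine's retrieval may differ:

```python
import numpy as np

# Conceptual sketch: one index mixing text-chunk and image embeddings,
# assuming a shared (CLIP-style) embedding space. Not the actual engine.
def top_k(query_vec: np.ndarray, index: np.ndarray, k: int = 5) -> np.ndarray:
    # Cosine similarity = dot product of L2-normalized vectors
    q = query_vec / np.linalg.norm(query_vec)
    rows = index / np.linalg.norm(index, axis=1, keepdims=True)
    scores = rows @ q
    return np.argsort(scores)[::-1][:k]  # best-matching rows first

# Rows can come from text chunks *and* images/diagrams/charts, so a single
# query returns one ranked list spanning both modalities.
index = np.random.rand(1000, 512).astype(np.float32)  # toy multi-modal index
query = np.random.rand(512).astype(np.float32)        # toy query embedding
print(top_k(query, index))
```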
See the detailed architecture documentation and diagrams.
See how RAGfish revolutionizes document Q&A and knowledge workflows compared to conventional approaches:
Key Benefits:
- Unified, instant document and visual data Q&A (no more fragmented search)
- On-device privacy—no cloud, no leaks, now including images and diagrams
- Fully offline, always available
- Easy to extend with your own knowledge packs (RAGpacks)
- Built for business productivity and compliance
RAGfish is built on the principle that knowledge work should remain private, powerful, and beautiful—like the silent depths of the ocean. We believe LLM-powered RAG should require no server, no login, and no risk. Your documents, images, and brain—on your device.
- iOS universal app with full macOS/iOS feature parity
- Advanced chunk highlighting, source trace, and cross-document QA
- In-app RAGpack generation (pipeline-less)
- UI/UX: Enhanced right-pane for document preview, chunk roots, annotation, and visual data display
- Cloudless peer-to-peer knowledge sharing (private multi-device sync)
- More LLM model support (future Apple ML, third-party LLMs)
- API for plugin and automation extension
- In-app image-to-text analysis, diagram reasoning, and chart Q&A for seamless multi-modal interaction
- More: See Issues