RLAMA is a powerful AI-driven question-answering tool for your documents, seamlessly integrating with your local Ollama models. It enables you to create, manage, and interact with Retrieval-Augmented Generation (RAG) systems tailored to your documentation needs.
- Ollama installed and running
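Before installing, you can sanity-check this prerequisite from the command line. This is a minimal sketch: it assumes Ollama's default port (11434); adjust the URL if your Ollama runs elsewhere.

```shell
# Sanity-check: is the Ollama CLI on PATH, and does the server
# answer on its default port (11434)?
if command -v ollama > /dev/null 2>&1; then
  echo "ollama CLI: installed"
else
  echo "ollama CLI: missing"
fi
if curl -fsS --max-time 2 http://localhost:11434/ > /dev/null 2>&1; then
  echo "ollama server: running"
else
  echo "ollama server: not running"
fi
```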
```shell
curl -fsSL https://raw.githubusercontent.com/dontizi/rlama/main/install.sh | sh
```

You can get help on all commands by using:

```shell
rlama --help
```

Creates a new RAG system by indexing all documents in the specified folder.
```shell
rlama rag [model] [rag-name] [folder-path]
```

Parameters:
- `model`: Name of the Ollama model to use (e.g., llama3, mistral, gemma).
- `rag-name`: Unique name to identify your RAG system.
- `folder-path`: Path to the folder containing your documents.
Example:
```shell
rlama rag llama3 documentation ./docs
```

Starts an interactive session to interact with an existing RAG system.

```shell
rlama run [rag-name]
```

Parameters:
- `rag-name`: Name of the RAG system to use.
Example:
```shell
rlama run documentation
> How do I install the project?
> What are the main features?
> exit
```

Displays a list of all available RAG systems.
```shell
rlama list
```

Permanently deletes a RAG system and all its indexed documents.

```shell
rlama delete [rag-name] [--force/-f]
```

Parameters:
- `rag-name`: Name of the RAG system to delete.
- `--force` or `-f`: (Optional) Delete without asking for confirmation.
Example:
```shell
rlama delete old-project
```

Or to delete without confirmation:

```shell
rlama delete old-project --force
```

Checks if a new version of RLAMA is available and installs it.

```shell
rlama update [--force/-f]
```

Options:
- `--force` or `-f`: (Optional) Update without asking for confirmation.
Displays the current version of RLAMA.
```shell
rlama --version
```

or

```shell
rlama -v
```

To uninstall RLAMA:

If you installed via `go install`:

```shell
rlama uninstall
```

RLAMA stores its data in `~/.rlama`. To remove it:

```shell
rm -rf ~/.rlama
```

RLAMA supports many file formats:
- Text: `.txt`, `.md`, `.html`, `.json`, `.csv`, `.yaml`, `.yml`, `.xml`
- Code: `.go`, `.py`, `.js`, `.java`, `.c`, `.cpp`, `.h`, `.rb`, `.php`, `.rs`, `.swift`, `.kt`
- Documents: `.pdf`, `.docx`, `.doc`, `.rtf`, `.odt`, `.pptx`, `.ppt`, `.xlsx`, `.xls`, `.epub`
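As a rough way to preview which files RLAMA would pick up, you can match a folder against these extensions. This is only a sketch: the `./docs` path is an example, and the pattern covers just a subset of the list above.

```shell
# List files matching a few of the supported extensions.
# Extend the -name patterns to cover the full list above.
find ./docs -type f \( -name '*.md' -o -name '*.txt' \
  -o -name '*.pdf' -o -name '*.go' \) 2>/dev/null
```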
Installing dependencies via `install_deps.sh` is recommended to improve support for certain formats.
If you encounter connection errors to Ollama:
- Check that Ollama is running.
- Ollama must be accessible at `http://localhost:11434`.
- Check Ollama logs for potential errors.
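You can also probe the Ollama HTTP API directly. The root endpoint replies "Ollama is running" when the daemon is up, and `/api/tags` lists the locally installed models; both are part of Ollama's standard API.

```shell
# Probe the Ollama daemon on its default port; print a hint on failure.
curl -s --max-time 2 http://localhost:11434/ || echo "no response on port 11434"
echo
curl -s --max-time 2 http://localhost:11434/api/tags || echo "API not reachable"
```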
If you encounter problems with certain formats:
- Install dependencies via `./scripts/install_deps.sh`.
- Verify that your system has the required tools (`pdftotext`, `tesseract`, etc.).
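A quick way to check whether those extraction tools are present on your PATH:

```shell
# Report which external extraction tools are available.
for tool in pdftotext tesseract; do
  if command -v "$tool" > /dev/null 2>&1; then
    echo "$tool: found ($(command -v "$tool"))"
  else
    echo "$tool: missing"
  fi
done
```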
If the answers are not relevant:
- Check that the documents are properly indexed with `rlama list`.
- Make sure the content of the documents is properly extracted.
- Try rephrasing your question more precisely.
For any other issues, please open an issue on the GitHub repository, providing:
- The exact command used.
- The complete output of the command.
- Your operating system and architecture.
- The RLAMA version (`rlama --version`).