Create unlimited AI Chatbot Agents for your websites
Qarīn.ai lets you create unlimited AI chatbot agents for your websites — no coding required.
It works with any LLM provider that supports the OpenAI-Compatible API, including self-hosted providers like llama.cpp or Ollama.
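In practice, "OpenAI-compatible" means the provider exposes the same chat-completions interface as OpenAI, so a standard client only needs its base URL changed. A minimal sketch, assuming a local Ollama instance on its default port (the model name is purely illustrative):

```ts
import OpenAI from "openai";

// Point the standard OpenAI client at a self-hosted, OpenAI-compatible endpoint.
// Assumption: Ollama is running locally and serves its OpenAI-compatible API under /v1.
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // self-hosted providers typically ignore the key, but the SDK requires one
});

const response = await client.chat.completions.create({
  model: "llama3.1", // hypothetical model name; use whichever model your provider serves
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);
```

Any provider that speaks this protocol, hosted or self-hosted, can be connected to Qarīn.ai in the same way.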
With Qarīn.ai, you can:
- Define an agent's name, identity, and instructions.
- Instantly generate a chat bubble widget to embed on your site.
- Enhance agents by connecting to MCP Servers or importing your own API specs.
- Build vector stores from documents for retrieval-augmented generation (RAG).
- Expose vector stores or MCP Servers to external AI agents.
Features
- No Coding Needed!
- Supports RAG & MCP out of the box.
- Works with any OpenAI-Compatible LLM provider (including self-hosted).
- One-click MCP Server generation from existing REST API specs (Swagger/OpenAPI); see the example spec after this list.
- Native vector storage with optional MCP Server exposure for each store.
- Easy-to-use chat bubble widget for quick website integration.
- Simple Docker or Kubernetes deployment.
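As an illustration of the spec import mentioned above, any standard Swagger/OpenAPI document should be usable as the starting point for MCP Server generation. The minimal spec below is hypothetical; the API title, path, and operation are made up purely for this example:

```yaml
openapi: 3.0.3
info:
  title: Orders API        # hypothetical API, used only as an example
  version: 1.0.0
paths:
  /orders/{id}:
    get:
      operationId: getOrderById
      summary: Fetch a single order by its ID
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: The requested order
```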
Requirements
- Docker & Docker Compose installed.
Run Qarīn.ai
```bash
git clone https://github.com/qarinai/qarinai.git
cd qarinai
docker compose up -d
```

This starts Qarīn.ai with all required environment variables pre-configured.
Default login credentials:
- Username: admin
- Password: admin
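If the app does not come up as expected, the standard Docker Compose commands are enough to check on the stack (nothing here is specific to Qarīn.ai):

```bash
docker compose ps        # list the services and their status
docker compose logs -f   # follow the logs to spot startup errors
```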
Once running:
- Connect your desired LLM provider (OpenAI, Ollama, llama.cpp, etc.).
- Select the models you want to use.
- Set a default model for minor app tasks (summarization, descriptions, etc.).
- (Optional) Import MCP Servers into Qarīn.ai.
- (Optional) Import Swagger/OpenAPI specs to auto-generate MCP Servers.
- (Optional) Create vector stores and upload documents for RAG.
- Create your AI agent — define name, identity, and instructions.
- Embed it — click “Add to Website” to get your dynamic snippet code.
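The actual snippet is generated for you when you click "Add to Website", so the example below is only a rough sketch of what a chat bubble embed typically looks like; the script URL and attribute names are hypothetical, not the real generated code:

```html
<!-- Hypothetical illustration only: use the snippet generated by "Add to Website" -->
<script
  src="https://your-qarinai-host/chat-widget.js"
  data-agent-id="your-agent-id"
  async>
</script>
```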
Qarīn.ai is still in active development — several existing features are only partially implemented or missing certain CRUD operations and use cases.
The near-term focus will be completing and stabilizing all current functionality before expanding further.
Planned enhancements:
- 🛠 Complete existing CRUD operations and missing use cases for certain resources.
- 🔒 Enhanced security:
  - User management with multiple accounts and ACLs (Access Control Lists).
  - Personal Access Tokens with scopes instead of full unrestricted access.
  - Securing the public bubble APIs.
- 📊 Conversation tracking UI:
  - A visual interface to inspect each conversation and message per agent (currently these are only stored in the DB).
- 🎨 Agent UI customization:
  - Style, color, and branding customization for the chat bubble and iframe widget.
- 🚀 Additional quality-of-life improvements after stabilization.
This project is licensed under the Apache 2.0 License — see LICENSE for details.
Qarīn.ai would not have been possible without the incredible work of the open-source community.
This project stands on the shoulders of countless developers and contributors who make their tools, libraries, and frameworks available for everyone to learn from and build upon.
A special thanks to the maintainers of the amazing technologies powering Qarīn.ai, including (but not limited to):
- Frontend & UI: Angular, PrimeNG, TailwindCSS, ngx-markdown
- Backend & APIs: NestJS, @modelcontextprotocol/sdk, OpenAI SDK, Zod
- AI & NLP: @huggingface/transformers, @langchain/textsplitters
- Database & Storage: TypeORM, PostgreSQL, pgvector
- Build & Tooling: Webpack, Gulp, Docker
- And many more... every dependency plays a role in making this project possible.
If you maintain one of these libraries: thank you for your dedication to open-source! ❤️
Contributions are not open at the moment as the project is still in an early stage.
However, suggestions and feedback are welcome — feel free to open an issue or discussion.