Self-hosted MCP server for hybrid semantic code search and repository intelligence.
A .NET-native inference engine with Vulkan acceleration.
A lightweight, cross-platform .NET library for building RAG (Retrieval-Augmented Generation) pipelines with local embedding models and SQLite vector storage. Perfect for developers who need privacy-focused, offline-capable document search and AI-powered question answering without external API dependencies.
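The core idea behind such a pipeline — local embeddings stored in SQLite and queried by cosine similarity — can be sketched generically. This is a minimal illustration, not the library's actual API: the hash-based embed() is a stand-in for a real local embedding model, and the schema and function names are invented for the example.

```python
# Minimal local RAG retrieval sketch: toy embeddings stored in SQLite,
# queried by cosine similarity. No external services or APIs involved.
import sqlite3
import json
import math
import zlib

def embed(text, dim=256):
    # Toy deterministic bag-of-words hashing embedding (placeholder for
    # a real local embedding model).
    vec = [0.0] * dim
    for tok in text.lower().split():
        vec[zlib.crc32(tok.encode()) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    # Vectors are unit-normalized, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, text TEXT, vec TEXT)")
for doc in ["SQLite stores vectors as JSON here",
            "Local embedding models avoid external APIs",
            "Angular renders the chat UI"]:
    con.execute("INSERT INTO docs (text, vec) VALUES (?, ?)",
                (doc, json.dumps(embed(doc))))

def search(query, k=2):
    # Brute-force scan; a real store would index or shard the vectors.
    qv = embed(query)
    rows = con.execute("SELECT text, vec FROM docs").fetchall()
    scored = [(cosine(qv, json.loads(v)), t) for t, v in rows]
    return [t for _, t in sorted(scored, reverse=True)[:k]]

results = search("offline embedding models without APIs")
```

The retrieved snippets would then be prepended to the LLM prompt for the question-answering step; everything above runs offline with only the standard library.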
AI-powered information retrieval with an integrated AI assistant, MCP client, and local LLM server.
A full-stack AI chatbot using local (Ollama) and cloud (OpenRouter) LLMs, built with a .NET 9 API and an Angular 20 UI. Easily run models such as Phi-3, Mistral, Gemma, and Llama 3 locally or online.