With Skald you don't need to implement RAG ever again. Push context to our API, and get chat, search, document generation and more out of the box.
Node SDK example
```typescript
import { Skald } from '@skald-labs/skald-node';

const skald = new Skald('your-api-key-here');

// Push context into Skald as a memo
const memo = await skald.createMemo({
  title: 'Meeting Notes',
  content: 'Full content of the memo...'
});

// Chat with your knowledge
const answer = await skald.chat({
  query: 'What were the main points discussed in the Q1 meeting?'
});
```

Also available for: Python - Ruby - Go - PHP - C# - MCP - CLI
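Sticking with the Node SDK, semantic search works the same way: push memos in, then query them. The snippet below is a minimal sketch only; the `search` method name and its parameters are placeholders, not the documented SDK surface, so check the SDK reference for the exact call.

```typescript
// Hypothetical sketch: `search` and its parameters are placeholders for
// illustration, not the documented SDK surface.
const results = await skald.search({
  query: 'action items from the Q1 meeting'
});
```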
```bash
git clone https://github.com/skaldlabs/skald
cd skald
echo "OPENAI_API_KEY=<your_key>" > .env
docker-compose up
```

For a production self-hosted deploy, check out our self-hosting docs.
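Once the stack is up, you can point the Node SDK at your local instance instead of the cloud API. This is a minimal sketch only: the constructor option name and port below are placeholders for illustration, so check the self-hosting docs for the real configuration.

```typescript
import { Skald } from '@skald-labs/skald-node';

// Hypothetical: the `baseUrl` option and port are placeholders for illustration;
// see the self-hosting docs for how to target your own deployment.
const skald = new Skald('your-api-key-here', { baseUrl: 'http://localhost:8000' });
```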
Running Skald without any third-party services
You can deploy Skald without any third-party dependencies (including OpenAI), but that requires hosting your own LLM inference server and using a local embeddings service (we've provided one for you in the local-embedding Docker Compose profile). This is advanced usage and is considered experimental; check out our docs for more details.
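As a sketch, assuming the profile is enabled with the standard Compose `--profile` flag, starting the stack with the bundled embeddings service looks like this; how you point Skald at your own inference server depends on your setup, so consult the docs for the required environment variables.

```bash
# Start Skald together with the bundled local embeddings service
docker-compose --profile local-embedding up
```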
- Chat: Chat with your knowledge in Skald with just one API call.
- Search: Use semantic search to find relevant context based on user queries.
- Generate: Generate content from your knowledge like documentation and reports.
- Powerful filtering: Speed up and improve responses by filtering the accessible knowledge in every query (see the sketch after this list).
- Amazing DX, no bullsh*t: Implement in minutes with SDKs for every major language. Don't see yours? Open an issue and we'll build it!
- Truly open-source: Our open-source version is fully featured, easy to deploy, and can even run with no third-party dependencies.
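As an illustration of filtering, a chat call might scope the accessible knowledge like the sketch below. The `filter` parameter and its field names are placeholders, not the documented API; see the docs for the actual filtering syntax.

```typescript
// Hypothetical sketch: `filter` and its fields are placeholders for illustration.
const answer = await skald.chat({
  query: 'What were the main points discussed in the Q1 meeting?',
  filter: { tags: ['meetings'], createdAfter: '2025-01-01' }
});
```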
- Cloud: Free tier with no credit card required.
- Self-hosted: Get a fully-featured production deploy with SSL live in less than an hour.
We'd be glad to have your contributions! See CONTRIBUTING.md for instructions on how to run Skald locally and how to contribute.
MIT 🤸