
Talimio

An all-in-one, AI-powered learning platform that adapts to you.

Talimio homepage showing course creation and the AI side chat

Why Talimio?

  • Turn any topic into a full course in seconds. Upload your own documents to ground the AI with RAG, or let the model build from scratch.
  • Create adaptive courses where the AI builds a dependency graph (DAG) and only unlocks new lessons once you’ve mastered the prerequisites.
  • We don’t do boring flashcards. A LECTOR-based scheduler tracks concepts you struggle with and resurfaces them in future lessons exactly when you need them.
  • Lessons are fully interactive. Like Claude Artifacts or the demos on Brilliant, you get hands-on widgets you can click and play with right inside the lesson.
  • Executable code blocks are built in: run, tweak, and break code directly inside the lesson.
  • Answer quick questions and self-assessment quizzes inside lessons; your responses continuously shape and adapt the course to you.
  • Upload a PDF book or drop in a YouTube link and chat with it instantly.
  • Run the whole stack 100% offline with Ollama, or plug in your own cloud API keys (supports many providers via LiteLLM).
  • Persistent memory of your preferences and learning style, so it gets better at adapting to you the more you use it.
  • Extensible via the Model Context Protocol (MCP), so you can connect your own tools and APIs to interpret data or take actions.
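The prerequisite-gating idea behind adaptive courses can be sketched in a few lines. This is an illustrative example only, not Talimio's actual data model: lessons form a DAG, and a lesson unlocks once every prerequisite has been mastered.

```python
# Minimal sketch of prerequisite gating on a course DAG.
# Lesson names and structure are invented for illustration.
lessons = {
    "intro": [],                  # no prerequisites
    "variables": ["intro"],
    "loops": ["variables"],
    "functions": ["variables"],
    "projects": ["loops", "functions"],
}


def unlocked(lesson: str, mastered: set[str]) -> bool:
    """A lesson unlocks once every prerequisite is mastered."""
    return all(prereq in mastered for prereq in lessons[lesson])


mastered = {"intro", "variables"}
available = [name for name in lessons if unlocked(name, mastered)]
print(available)  # "projects" stays locked until both branches are mastered
```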

Quick Start (Docker Compose)

Prerequisites: Docker and Docker Compose installed.

Environment: All required .env values are defined directly in docker-compose.yml. You do not need backend/.env. To customize, edit the backend service environment block and optionally uncomment provider API keys.

  1. Clone the repo
git clone https://github.com/SamDc73/Talimio.git
cd Talimio
  2. Start the stack
docker compose up -d
  3. Open the apps
  4. Optional: Pull Ollama models (first run)
  • Skip if you use cloud LLMs and set provider keys in docker-compose.yml.
docker exec -it ollama ollama pull gpt-oss:20b
docker exec -it ollama ollama pull nomic-embed-text

Note: To disable the local LLM, comment out the ollama service in docker-compose.yml and set PRIMARY_LLM_MODEL to a cloud model. Provide the relevant API key(s) in the same environment block.
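Switching to a cloud model might look roughly like this in the backend service's environment block. This is a sketch: aside from PRIMARY_LLM_MODEL, the key names and model string are illustrative assumptions, so check docker-compose.yml for the exact variables your provider needs.

```yaml
services:
  backend:
    environment:
      # Example LiteLLM-style model string; the exact format depends on
      # the providers configured in docker-compose.yml.
      PRIMARY_LLM_MODEL: "openai/gpt-4o-mini"
      # OPENAI_API_KEY: "sk-..."  # uncomment and set for your provider
```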

  5. Stop/Update the stack
docker compose down
# update later
docker compose pull && docker compose up -d

Local Development (without Docker)

  1. Clone the repo
git clone https://github.com/SamDc73/Talimio.git
cd Talimio
  2. Backend (Python 3.12+, uv):
cd backend
uv sync
cp .env.example .env
uv run uvicorn src.main:app --reload --port 8080
  3. Frontend (Node + pnpm):
  • In a different tab/window:
cd web
cp .env.example .env
pnpm install
pnpm dev
  4. Now you can open the apps:
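The frontend's .env comes from web/.env.example; a typical Vite-style file points the UI at the local backend. The variable name below is a hypothetical example, not confirmed by this README, so defer to .env.example:

```env
# web/.env — copied from .env.example; adjust if your backend runs elsewhere.
# Variable name is illustrative; use whatever .env.example actually defines.
VITE_API_URL=http://localhost:8080
```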

Contributing

Any type of contribution is greatly appreciated!

First timer flow:

  1. Pick one tiny improvement
  • Fix a typo, clarify a log/error message, tidy a function name, or improve copy.
  • Good places: README.md, backend/src/** (messages/docs), web/src/** (copy/UI nits).
  2. Run it locally
  • Use Quick Start above, and test it out.
  3. Run checks before PR
  • Backend: cd backend && ruff check src --fix
  • Backend types: cd backend && uvx ty check src
  • Frontend: cd web && pnpm run lint
  4. Open a draft PR
  • Two bullets are enough: why it helps, what changed.

Support

Questions, help, or feedback? Join our Discord