Starred repositories
"AI-Trader: Can AI Beat the Market?" Live Trading: https://hkuds.github.io/AI-Trader/
An open-source runtime for composable workflows. Great for AI agents and CI/CD.
Python ETL framework for stream processing, real-time analytics, LLM pipelines, and RAG.
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
Convert Any OpenAPI V3 API to MCP Server
Connect APIs, remarkably fast. Free for developers.
[CSUR 2025] Continual Learning of Large Language Models: A Comprehensive Survey
Synthetic data curation for post-training and structured data extraction
OpenTelemetry Instrumentation for AI Observability
Build PowerPoint presentations with JavaScript. Works with Node, React, web browsers, and more.
The open-source LLMOps platform: prompt playground, prompt management, LLM evaluation, and LLM observability all in one place.
A platform for packaging and launching blockchain infrastructure. Think Docker Compose for blockchains.
A toolkit for detecting and protecting against vulnerabilities in Large Language Models (LLMs).
An awesome & curated list of best LLMOps tools for developers
Open-source observability for your GenAI or LLM application, based on OpenTelemetry
LLM-powered framework for deep document understanding, semantic retrieval, and context-aware answers using RAG paradigm.
Olive: Simplify ML Model Finetuning, Conversion, Quantization, and Optimization for CPUs, GPUs and NPUs.
🔥 The Web Data API for AI - Turn entire websites into LLM-ready markdown or structured data
Deepchecks: Tests for Continuous Validation of ML Models & Data. Deepchecks is a holistic open-source solution for all of your AI & ML validation needs, enabling you to thoroughly test your data and models.
Embedding Atlas is a tool that provides interactive visualizations for large embeddings. It allows you to visualize, cross-filter, and search embeddings and metadata.
🚀 The fast, Pythonic way to build MCP servers and clients
Python package and backend for the Elysia platform app.