IBM
- Get your documents ready for gen AI
- Evaluate and enhance your LLM deployments for real-world inference needs
- A Model Context Protocol (MCP) Gateway & Registry that serves as a central management point for tools, resources, and prompts accessible to MCP-compatible LLM applications. Converts REST API …
- Community-maintained hardware plugin for vLLM on Spyre
- Achieve state-of-the-art inference performance with modern accelerators on Kubernetes
- IBM development fork of https://github.com/huggingface/text-generation-inference
- KubeStellar - a flexible solution for multi-cluster configuration management for edge, multi-cloud, and hybrid cloud