LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
Firefly: a training toolkit for large language models, supporting Qwen2.5, Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other models.
OpenAI-style API for open large language models: use open LLMs just as you would ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. A unified backend interface for open-source large models.
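Because such backends expose the standard OpenAI chat-completions schema, any client can target them by pointing at a different base URL. Below is a minimal sketch of the request shape, using only the Python standard library; the host, port, and model name are illustrative placeholders, not details from this listing.

```python
import json
from urllib import request

# Hypothetical local endpoint served by an OpenAI-compatible backend;
# the host, port, and model name are placeholders for illustration.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, user_message: str) -> request.Request:
    """Build a POST request following the OpenAI chat-completions schema."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # set True for server-sent streaming chunks
    }
    return request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("chatglm3", "Hello!")
# Actually sending the request needs a running server, e.g.:
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```

Swapping `BASE_URL` between such a backend and the official OpenAI endpoint is the whole point of the unified interface: existing client code keeps working unchanged.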
Famous Vision Language Models and Their Architectures
🐋MindChat (漫谈): a large language model for mental-health support, whose tagline reads "chat through life's journey, and smile through frost and wind."
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without extensive dependencies.
Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).
A multilingual AI model evaluation platform built with Next.js, supporting multi-model comparison and real-time streaming responses; users can compare responses from multiple models and receive a final judgment.