Wegent

🚀 An open-source AI-native operating system to define, organize, and run intelligent agent teams

English | 简体中文



✨ Core Features

💬 Chat Mode

A fully open-source chat agent with powerful capabilities:

  • Multi-Model Support: Compatible with Claude, OpenAI, Gemini, DeepSeek, GLM and other mainstream models
  • Conversation History: Create new conversations, hold multi-turn dialogues, and save or share chat history
  • Group Chat: Group chats where the AI responds based on conversation context and @mentions
  • Attachment Parsing: Send txt, pdf, ppt, doc, images, and other file formats in one-on-one or group chats
  • Follow-up Mode: AI asks clarifying questions to help refine your requirements
  • Error Correction Mode: Multiple AI models automatically detect and correct response errors
  • Long-term Memory: Supports mem0 integration for conversation memory persistence
  • Sandbox Execution: Execute commands or modify files in a sandbox (E2B protocol compatible)
  • Extensions: Customize prompts, MCP tools, and Skills (includes a chart-drawing skill)

💻 Code Mode

A cloud-based Claude Code execution engine:

  • Multi-Model Configuration: Configure various Claude-compatible models
  • Concurrent Execution: Run multiple coding tasks simultaneously in the cloud
  • Requirement Clarification: AI analyzes code and asks questions to generate specification documents
  • Git Integration: Integrate with GitHub/GitLab/Gitea/Gerrit to clone repositories, modify code, and create PRs
  • MCP/Skill Support: Configure MCP tools and Skills for agents
  • Multi-turn Conversations: Continue conversations with follow-up questions

📡 Feed Mode

A cloud-based AI task trigger system:

  • Full Capability Access: Tasks can use all Chat and Code mode capabilities
  • Scheduled/Event Triggers: Set up cron schedules or event-based AI task execution
  • Information Feed: Display AI-generated content as an information stream
  • Event Filtering: Apply filter conditions such as "only notify me if it will rain tomorrow"

📚 Knowledge Mode

A cloud-based AI document repository:

  • Document Management: Upload and manage txt/doc/ppt/xls and other document formats
  • Web Import: Import web pages and DingTalk multi-dimensional tables
  • NotebookLM Mode: Select documents directly in notebooks for Q&A
  • Online Editing: Edit text files directly in notebook mode
  • Chat Integration: Reference knowledge bases in single/group chats for AI responses

🔧 Customization

All features above are fully customizable:

  • Custom Agents: Create custom agents in the web UI and configure prompts, MCP tools, Skills, and multi-agent collaboration
  • Agent Creation Wizard: Create agents in four steps: describe requirements → AI asks questions → fine-tune in real time → create with one click
  • Organization Management: Create and join groups, and share agents, models, and Skills within them

🔧 Extensibility

  • Collaboration Modes: Four out-of-the-box multi-agent collaboration modes (Sequential / Parallel / Router / Loop)
  • Skill Support: Dynamically load skill packages to improve token efficiency
  • MCP Tools: Model Context Protocol support for calling external tools and services
  • Execution Engines: ClaudeCode and Agno with sandboxed isolation, a Dify API proxy, and a direct Chat mode
  • YAML Config: Kubernetes-style CRDs for defining Ghost / Bot / Team / Skill resources
  • API: OpenAI-compatible interface for easy integration with other systems (see the example after this list)
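
As a rough sketch of how that OpenAI-compatible interface can be consumed, the snippet below uses the standard openai Python client. The base URL, API key, and model name are illustrative assumptions rather than documented values; check your deployment's API settings for the actual ones.

from openai import OpenAI

# Point the standard client at a Wegent deployment. The URL, key, and
# model name below are placeholders for illustration only.
client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed API endpoint of a local install
    api_key="YOUR_WEGENT_API_KEY",         # use a key issued by your instance
)

response = client.chat.completions.create(
    model="chat-team",  # assumed: target a built-in team by name
    messages=[{"role": "user", "content": "Summarize this repository in one sentence."}],
)
print(response.choices[0].message.content)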

🚀 Quick Start

curl -fsSL https://raw.githubusercontent.com/wecode-ai/Wegent/main/install.sh | bash

Then open http://localhost:3000 in your browser.

Optional: Enable RAG features with docker compose --profile rag up -d


📦 Built-in Agents

Team         Purpose
chat-team    General AI assistant + Mermaid diagrams
translator   Multi-language translation
dev-team     Git workflow: branch → code → commit → PR
wiki-team    Codebase Wiki documentation generation

🏗️ Architecture

Frontend (Next.js) → Backend (FastAPI) → Executor Manager → Executors (ClaudeCode/Agno/Dify/Chat)

Core Concepts:

  • Ghost (prompt) + Shell (environment) + Model = Bot
  • Multiple Bots + Collaboration Mode = Team

See Core Concepts | YAML Spec
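
As a purely illustrative sketch of how these concepts compose, here is a hypothetical Python data model. It is not the project's actual implementation; the real definitions are written as the Kubernetes-style YAML CRDs described in the YAML Spec.

from dataclasses import dataclass
from typing import List

@dataclass
class Ghost:          # the prompt / persona of an agent
    system_prompt: str

@dataclass
class Bot:            # Ghost + Shell (execution environment) + Model
    ghost: Ghost
    shell: str        # e.g. "ClaudeCode", "Agno", or "Chat"
    model: str        # e.g. a Claude- or OpenAI-compatible model name

@dataclass
class Team:           # multiple Bots plus a collaboration mode
    bots: List[Bot]
    mode: str         # "sequential" | "parallel" | "router" | "loop"

# Hypothetical example: a two-bot team that codes, then reviews, in sequence.
dev_team = Team(
    bots=[
        Bot(Ghost("You write code."), shell="ClaudeCode", model="claude-sonnet"),
        Bot(Ghost("You review code."), shell="Chat", model="gpt-4o"),
    ],
    mode="sequential",
)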


🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

📞 Support

👥 Contributors

Thanks to the following developers for their contributions and efforts to make this project better. 💪

qdaxb
Axb
feifei325
Feifei
Micro66
MicroLee
cc-yafei
YaFei Liu
johnny0120
Johnny0120
kissghosts
Yanhe
joyway1978
Joyway78
moqimoqidea
Moqimoqidea
2561056571
Xuemin
yixiangxx
Yi Xiang
junbaor
Junbaor
icycrystal4
Icycrystal4
FicoHu
FicoHu
maquan0927
Just Quan
fingki
Fingki
parabala
Parabala
fengkuizhi
Fengkuizhi
jolestar
Jolestar
qwertyerge
Erdawang
sunnights
Jake Zhang
DeadLion
Jasper Zhong
andrewzq777
Andrewzq777
graindt
Graindt
salt-hai
Salt-hai

Made with ❤️ by WeCode-AI Team