A production-ready Rust implementation of the Model Context Protocol, with comprehensive built-in tools and a web-based inspector.
mcp-framework is a complete Rust implementation of the Model Context Protocol, enabling you to:
- Build AI Agents - Create intelligent agents with LLM integration (Claude, OpenAI) and multi-step reasoning
- Create MCP Servers - Register tools, resources, and prompts easily
- Connect to MCP Servers - HTTP client for programmatic tool access
- Debug with Inspector - Web-based dashboard for testing tools interactively
- High Performance - Fast async Rust implementation built on Tokio
- Type-Safe - Leverage Rust's type system for safety and reliability
| Feature | Status | Details |
|---|---|---|
| MCP Server | ✅ Complete | Register tools, handle execution, JSON-RPC protocol |
| MCP Client | ✅ Complete | Multi-transport client (HTTP, HTTPS, stdio) with session management |
| AI Agent | ✅ Complete | Agentic loop with pluggable LLM providers |
| Web Inspector | ✅ Complete | Interactive UI at http://localhost:8123 |
| Claude Integration | ✅ Complete | AnthropicAdapter for Claude models with tool use |
| OpenAI Integration | ✅ Complete | OpenAIAdapter with Responses API and internal tool loop |
| Browser Automation | ✅ Complete | Playwright MCP integration for web automation |
| Protocol Types | ✅ Complete | Tools and Messages (core MCP protocol) |
| Session Management | ✅ Complete | Multi-server sessions with connectors |
| Resources | ⏳ Planned | For serving files and data to clients |
| Prompts | ⏳ Planned | Callable prompt templates with dynamic generation |
| Authentication | ⏳ Planned | Bearer tokens, OAuth 2.0 support |
| .env Support | ✅ Complete | Load API keys from environment files |
- echo - String echo utility
- calculator - Math: add, subtract, multiply, divide, power, sqrt
- get_weather - Weather lookup for cities worldwide
- search_text - Find pattern occurrences in text
- string_length - Get character count
- text_reverse - Reverse text strings
- json_parser - Validate and format JSON
- http_status - Look up HTTP status codes
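Once the server_with_tools example is running, any of these tools can be called over HTTP with the MCP client. The sketch below exercises the calculator tool; the base URL (http://localhost:3000) and the argument names ("operation", "a", "b") are assumptions made for illustration, so check the tool's input schema in the web Inspector before relying on them.

```rust
use mcp_framework::prelude::*;
use serde_json::json;

#[tokio::main]
async fn main() -> Result<()> {
    // Assumption: the example server is reachable at this URL, as in the
    // client examples later in this README; adjust to match your setup.
    let client = McpClient::new("http://localhost:3000");

    // Assumption: these argument names are illustrative; verify them against
    // the calculator tool's schema in the Inspector.
    let result = client.call_tool("calculator", json!({
        "operation": "add",
        "a": 15,
        "b": 27
    })).await?;

    println!("{:?}", result);
    Ok(())
}
```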
# Requires Rust 1.70+
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

git clone https://github.com/koki7o/mcp-framework
cd mcp-framework
# Create .env for API keys (optional but recommended)
cp .env.example .env
# Edit .env and add ANTHROPIC_API_KEY or OPENAI_API_KEY

Minimal Server (1 tool):
cargo run

Server with 8 Tools + Inspector UI:
cargo run --example server_with_tools
# Visit: http://localhost:8123

AI Agent with Claude:
# Requires ANTHROPIC_API_KEY in .env
cargo run --example anthropic_agent_demo_with_tools --release

AI Agent with OpenAI:
# Requires OPENAI_API_KEY in .env
cargo run --example openai_agent_demo_with_tools --release

Browser Automation (OpenAI):
# Requires OPENAI_API_KEY in .env
# Install: npm install -g @playwright/mcp@latest && npx playwright install firefox
cargo run --example browser_agent_openai

Browser Automation (Claude):
# Requires ANTHROPIC_API_KEY in .env
# Install: npm install -g @playwright/mcp@latest && npx playwright install firefox
cargo run --example browser_agent_anthropic

Create intelligent agents that can use MCP tools to accomplish complex tasks.
Quick Example:
use mcp_framework::prelude::*;
use std::sync::Arc;
#[tokio::main]
async fn main() -> Result<()> {
mcp_framework::load_env();
let client = McpClient::new("http://localhost:3000");
let llm = AnthropicAdapter::from_env("claude-sonnet-4-5-20250929".to_string())?;
let mut agent = Agent::new(client, Arc::new(llm), AgentConfig::default());
let response = agent.run("What is 15 + 27?").await?;
println!("{}", response);
Ok(())
}

Run Examples:
- cargo run --example anthropic_agent_demo_with_tools --release - Claude demo
- cargo run --example openai_agent_demo_with_tools --release - OpenAI demo
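The README does not include an inline OpenAI snippet, so here is a minimal sketch mirroring the Claude example above. It assumes OpenAIAdapter exposes a from_env constructor analogous to AnthropicAdapter::from_env and that the model id is passed the same way; verify both against examples/openai_agent_demo_with_tools.rs.

```rust
use mcp_framework::prelude::*;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<()> {
    mcp_framework::load_env();
    let client = McpClient::new("http://localhost:3000");

    // Assumption: OpenAIAdapter mirrors AnthropicAdapter::from_env, and the
    // model id is illustrative; see examples/openai_agent_demo_with_tools.rs.
    let llm = OpenAIAdapter::from_env("gpt-4.1".to_string())?;

    let mut agent = Agent::new(client, Arc::new(llm), AgentConfig::default());
    let response = agent.run("What is 15 + 27?").await?;
    println!("{}", response);
    Ok(())
}
```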
Build your own MCP servers with custom tools.
Quick Example:
use mcp_framework::prelude::*;
use mcp_framework::server::{McpServer, ServerConfig, ToolHandler};
use std::sync::Arc;
struct MyToolHandler;
#[async_trait::async_trait]
impl ToolHandler for MyToolHandler {
async fn execute(&self, name: &str, arguments: serde_json::Value)
-> Result<Vec<ResultContent>> {
match name {
"greet" => Ok(vec![ResultContent::Text {
text: format!("Hello, {}!", arguments.get("name").and_then(|v| v.as_str()).unwrap_or("stranger"))
}]),
_ => Err(Error::ToolNotFound(name.to_string())),
}
}
}
#[tokio::main]
async fn main() -> Result<()> {
let config = ServerConfig {
name: "My Server".to_string(),
version: "1.0.0".to_string(),
capabilities: ServerCapabilities {
tools: Some(ToolsCapability { list_changed: Some(false) }),
resources: None, // Not implemented yet
prompts: None, // Not implemented yet
},
};
let server = McpServer::new(config, Arc::new(MyToolHandler));
server.register_tool(Tool {
name: "greet".to_string(),
description: Some("Greet someone".to_string()),
input_schema: None,
});
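    // Serving is omitted in this snippet; see examples/simple_server.rs for
    // the full server startup.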
Ok(())
}

Examples:
- cargo run - Minimal server (1 tool)
- cargo run --example server_with_tools - Comprehensive example (8 tools + Inspector)
Connect to MCP servers and call tools programmatically.
Quick Example:
use mcp_framework::prelude::*;
use serde_json::json;
#[tokio::main]
async fn main() -> Result<()> {
let client = McpClient::new("http://localhost:3000");
// List all tools
let tools = client.list_tools().await?;
println!("Available tools: {:?}", tools);
// Call a tool
let result = client.call_tool("echo", json!({
"message": "Hello, MCP!"
})).await?;
println!("Result: {:?}", result);
Ok(())
}

Example:
- cargo run --example client_usage - Full client usage example
Test and debug MCP servers interactively with a web-based UI.
cargo run --example server_with_tools
# Open browser to: http://localhost:8123

The Inspector provides:
- View all registered tools with descriptions
- Test tools interactively with auto-generated forms
- See full request/response history
- Inspect tool outputs and errors in real-time
mcp-framework/
├── src/
│   ├── lib.rs - Main library entry point (prelude + exports)
│   ├── protocol.rs - MCP type definitions (Tools, Messages, Protocol)
│   ├── server.rs - McpServer implementation & tool registration
│   ├── client.rs - McpClient implementation (HTTP-based)
│   ├── agent.rs - AI Agent with agentic loop & LLM integration
│   ├── inspector.rs - Web-based debugging UI (localhost:8123)
│   ├── error.rs - Error types and JSON-RPC codes
│   └── adapters/
│       ├── mod.rs
│       ├── anthropic.rs - Claude (Anthropic) LLM adapter
│       └── openai.rs - OpenAI GPT LLM adapter
├── examples/
│   ├── server_with_tools.rs - 8-tool server with Inspector
│   ├── anthropic_agent_demo_with_tools.rs - Claude agent example
│   ├── openai_agent_demo_with_tools.rs - OpenAI agent example
│   ├── browser_agent_openai.rs - Browser automation with OpenAI
│   ├── browser_agent_anthropic.rs - Browser automation with Claude
│   ├── client_usage.rs - Client usage example
│   └── simple_server.rs - Minimal server example
├── assets/
│   └── banner.png
├── Cargo.toml
├── Cargo.lock
├── LICENSE - MIT License
├── .env.example - Environment variables template
├── .gitignore
└── README.md
let config = ServerConfig {
name: "My Server".to_string(),
version: "1.0.0".to_string(),
capabilities: ServerCapabilities {
tools: Some(ToolsCapability { list_changed: Some(false) }),
resources: None, // Not implemented yet
prompts: None, // Not implemented yet
},
};
let handler = Arc::new(MyToolHandler);
let server = McpServer::new(config, handler);

use std::collections::HashMap;
use serde_json::json;
let mut properties = HashMap::new();
properties.insert("param".to_string(), json!({"type": "string"}));
server.register_tool(Tool {
name: "my_tool".to_string(),
description: Some("Does something useful".to_string()),
input_schema: Some(ToolInputSchema {
schema_type: "object".to_string(),
properties,
required: Some(vec!["param".to_string()]),
}),
});

#[async_trait::async_trait]
impl ToolHandler for MyHandler {
async fn execute(&self, name: &str, arguments: Value)
-> Result<Vec<ResultContent>> {
match name {
"my_tool" => {
                // Extract the argument (adjust error handling to your Error type)
                let param = arguments.get("param")
                    .and_then(|v| v.as_str())
                    .unwrap_or_default();
// Implement your logic
let result = do_something(param);
Ok(vec![ResultContent::Text {
text: result.to_string()
}])
}
_ => Err(Error::ToolNotFound(name.to_string())),
}
}
}

use mcp_framework::prelude::*;
use std::sync::Arc;
#[tokio::main]
async fn main() -> Result<()> {
mcp_framework::load_env();
let client = McpClient::new("http://localhost:3000");
let llm = AnthropicAdapter::from_env("claude-sonnet-4-5-20250929".to_string())?;
let mut agent = Agent::new(client, Arc::new(llm), AgentConfig {
max_iterations: 10,
max_tokens: Some(2048),
});
let response = agent.run("Your query here").await?;
println!("Response: {}", response);
Ok(())
}

let client = McpClient::new("http://localhost:3000");
// List available tools
let tools = client.list_tools().await?;
// Call a tool
let result = client.call_tool("echo", json!({
"message": "Hello!"
})).await?;

# Run all tests
cargo test
# Run with output
cargo test -- --nocapture
# Run specific test
cargo test test_name
# Run with release optimizations
cargo test --release

Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see LICENSE file for details
- Model Context Protocol - Official MCP website
- MCP Specification - Official protocol specification
- Rust Book - Learn Rust
- Tokio Docs - Async runtime documentation
- Serde Documentation - Serialization framework
Made with ❤️ for the MCP community