MCP Framework Banner

🚀 MCP Framework - Rust Implementation


Production-Ready Rust Implementation of the Model Context Protocol with blazing-fast performance, comprehensive tools, and a web-based inspector.




🌐 What is mcp-framework?

mcp-framework is a complete, production-ready Rust implementation of the Model Context Protocol, enabling you to:

  • 🤖 Build AI Agents - Create intelligent agents with LLM integration (Claude, OpenAI) and multi-step reasoning
  • 🛠️ Create MCP Servers - Register tools, resources, and prompts easily
  • 📡 Connect to MCP Servers - HTTP client for programmatic tool access
  • 🔍 Debug with Inspector - Beautiful web-based dashboard for testing tools
  • ⚡ High Performance - Blazing-fast Rust implementation
  • 🛡️ Type-Safe - Leverage Rust's type system for safety and reliability

✨ Key Features

🎯 Core Components

| Feature | Status | Details |
|---------|--------|---------|
| MCP Server | ✅ Complete | Register tools, handle execution, JSON-RPC protocol |
| MCP Client | ✅ Complete | Multi-transport client (HTTP, HTTPS, stdio) with session management |
| AI Agent | ✅ Complete | Agentic loop with pluggable LLM providers |
| Web Inspector | ✅ Complete | Interactive UI at http://localhost:8123 |
| Claude Integration | ✅ Complete | AnthropicAdapter for Claude models with tool use |
| OpenAI Integration | ✅ Complete | OpenAIAdapter with Responses API and internal tool loop |
| Browser Automation | ✅ Complete | Playwright MCP integration for web automation |
| Protocol Types | ✅ Complete | Tools and Messages (core MCP protocol) |
| Session Management | ✅ Complete | Multi-server sessions with connectors |
| Resources | ⏳ Planned | For serving files and data to clients |
| Prompts | ⏳ Planned | Callable prompt templates with dynamic generation |
| Authentication | ⏳ Planned | Bearer tokens, OAuth 2.0 support |
| .env Support | ✅ Complete | Load API keys from environment files |

πŸ› οΈ 8 Built-in Example Tools

• echo           - String echo utility
• calculator     - Math: add, subtract, multiply, divide, power, sqrt
• get_weather    - Weather lookup for cities worldwide
• search_text    - Find pattern occurrences in text
• string_length  - Get character count
• text_reverse   - Reverse text strings
• json_parser    - Validate and format JSON
• http_status    - Look up HTTP status codes
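As a rough sketch of the kind of dispatch a tool like `calculator` performs (the operation names mirror the list above, but the real tool's argument parsing and error handling live in the example source and may differ):

```rust
// Minimal dispatch in the spirit of the built-in calculator tool.
// Returns None for unknown operations or invalid inputs; the real
// tool reports errors through the MCP result types instead.
fn calculate(op: &str, a: f64, b: f64) -> Option<f64> {
    match op {
        "add" => Some(a + b),
        "subtract" => Some(a - b),
        "multiply" => Some(a * b),
        "divide" => (b != 0.0).then(|| a / b),
        "power" => Some(a.powf(b)),
        "sqrt" => (a >= 0.0).then(|| a.sqrt()), // second argument unused
        _ => None,
    }
}

fn main() {
    assert_eq!(calculate("add", 15.0, 27.0), Some(42.0));
    assert_eq!(calculate("divide", 1.0, 0.0), None);
}
```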

📦 Quick Start

Prerequisites

# Requires Rust 1.70+
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

1. Clone & Setup

git clone https://github.com/koki7o/mcp-framework
cd mcp-framework

# Create .env for API keys (optional but recommended)
cp .env.example .env
# Edit .env and add ANTHROPIC_API_KEY or OPENAI_API_KEY

2. Run Examples

Minimal Server (1 tool):

cargo run

Server with 8 Tools + Inspector UI:

cargo run --example server_with_tools
# Visit: http://localhost:8123

AI Agent with Claude:

# Requires ANTHROPIC_API_KEY in .env
cargo run --example anthropic_agent_demo_with_tools --release

AI Agent with OpenAI:

# Requires OPENAI_API_KEY in .env
cargo run --example openai_agent_demo_with_tools --release

Browser Automation (OpenAI):

# Requires OPENAI_API_KEY in .env
# Install: npm install -g @playwright/mcp@latest && npx playwright install firefox
cargo run --example browser_agent_openai

Browser Automation (Claude):

# Requires ANTHROPIC_API_KEY in .env
# Install: npm install -g @playwright/mcp@latest && npx playwright install firefox
cargo run --example browser_agent_anthropic

🎯 What Do You Want to Build?

🤖 Build an AI Agent

Create intelligent agents that can use MCP tools to accomplish complex tasks.

Quick Example:

use mcp_framework::prelude::*;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<()> {
    mcp_framework::load_env();

    let client = McpClient::new("http://localhost:3000");
    let llm = AnthropicAdapter::from_env("claude-sonnet-4-5-20250929".to_string())?;
    let mut agent = Agent::new(client, Arc::new(llm), AgentConfig::default());

    let response = agent.run("What is 15 + 27?").await?;
    println!("{}", response);

    Ok(())
}
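Under the hood, an agent like this runs an agentic loop: it sends the conversation plus the server's tool list to the LLM, executes any tool calls the model requests, appends the results, and repeats until the model produces a final answer or the iteration budget runs out. A dependency-free sketch of that control flow (every type here is a stand-in for illustration, not the crate's real `Agent`/LLM API):

```rust
// Stand-in types to illustrate the agentic loop pattern; the crate's
// actual Agent, LLM adapter, and McpClient types differ.
enum LlmReply {
    ToolCall { name: String, args: String },
    Final(String),
}

trait Llm {
    fn step(&self, transcript: &[String]) -> LlmReply;
}

fn run_agent(llm: &dyn Llm, query: &str, max_iterations: usize) -> Option<String> {
    let mut transcript = vec![format!("user: {query}")];
    for _ in 0..max_iterations {
        match llm.step(&transcript) {
            LlmReply::Final(answer) => return Some(answer),
            LlmReply::ToolCall { name, args } => {
                // In the real agent this goes through the MCP client's
                // tool-call path; here we just record a placeholder result.
                transcript.push(format!("tool {name}({args}) -> ok"));
            }
        }
    }
    None // iteration budget exhausted without a final answer
}

// A stub LLM: request one tool call, then answer.
struct Stub;
impl Llm for Stub {
    fn step(&self, transcript: &[String]) -> LlmReply {
        if transcript.len() == 1 {
            LlmReply::ToolCall { name: "calculator".into(), args: "15+27".into() }
        } else {
            LlmReply::Final("42".into())
        }
    }
}

fn main() {
    assert_eq!(run_agent(&Stub, "What is 15 + 27?", 5), Some("42".to_string()));
}
```

The `max_iterations` field in `AgentConfig` caps this loop so a misbehaving model cannot spin forever.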

Run Examples:

  • cargo run --example anthropic_agent_demo_with_tools --release - Claude demo
  • cargo run --example openai_agent_demo_with_tools --release - OpenAI demo

πŸ› οΈ Create an MCP Server

Build your own MCP servers with custom tools.

Quick Example:

use mcp_framework::prelude::*;
use mcp_framework::server::{McpServer, ServerConfig, ToolHandler};
use std::sync::Arc;

struct MyToolHandler;

#[async_trait::async_trait]
impl ToolHandler for MyToolHandler {
    async fn execute(&self, name: &str, arguments: serde_json::Value)
        -> Result<Vec<ResultContent>> {
        match name {
            "greet" => Ok(vec![ResultContent::Text {
                text: format!("Hello, {}!", arguments.get("name").and_then(|v| v.as_str()).unwrap_or("stranger"))
            }]),
            _ => Err(Error::ToolNotFound(name.to_string())),
        }
    }
}

#[tokio::main]
async fn main() -> Result<()> {
    let config = ServerConfig {
        name: "My Server".to_string(),
        version: "1.0.0".to_string(),
        capabilities: ServerCapabilities {
            tools: Some(ToolsCapability { list_changed: Some(false) }),
            resources: None,  // Not implemented yet
            prompts: None,    // Not implemented yet
        },
    };

    let server = McpServer::new(config, Arc::new(MyToolHandler));

    server.register_tool(Tool {
        name: "greet".to_string(),
        description: Some("Greet someone".to_string()),
        input_schema: None,
    });

    Ok(())
}

Examples:

  • cargo run - Minimal server (1 tool)
  • cargo run --example server_with_tools - Comprehensive example (8 tools + Inspector)

📡 Use MCP Client

Connect to MCP servers and call tools programmatically.

Quick Example:

use mcp_framework::prelude::*;
use serde_json::json;

#[tokio::main]
async fn main() -> Result<()> {
    let client = McpClient::new("http://localhost:3000");

    // List all tools
    let tools = client.list_tools().await?;
    println!("Available tools: {:?}", tools);

    // Call a tool
    let result = client.call_tool("echo", json!({
        "message": "Hello, MCP!"
    })).await?;
    println!("Result: {:?}", result);

    Ok(())
}

Example:

  • cargo run --example client_usage - Full client usage example

πŸ” Debug with Inspector

Test and debug MCP servers interactively with a web-based UI.

cargo run --example server_with_tools
# Open browser to: http://localhost:8123

The Inspector provides:

  • 📋 View all registered tools with descriptions
  • 🧪 Test tools interactively with auto-generated forms
  • 📊 See full request/response history
  • 🔍 Inspect tool outputs and errors in real-time

πŸ“ Project Structure

mcp-framework/
├── src/
│   ├── lib.rs                ← Main library entry point (prelude + exports)
│   ├── protocol.rs           ← MCP type definitions (Tools, Messages, Protocol)
│   ├── server.rs             ← McpServer implementation & tool registration
│   ├── client.rs             ← McpClient implementation (HTTP-based)
│   ├── agent.rs              ← AI Agent with agentic loop & LLM integration
│   ├── inspector.rs          ← Web-based debugging UI (localhost:8123)
│   ├── error.rs              ← Error types and JSON-RPC codes
│   └── adapters/
│       ├── mod.rs
│       ├── anthropic.rs      ← Claude (Anthropic) LLM adapter
│       └── openai.rs         ← OpenAI GPT LLM adapter
├── examples/
│   ├── server_with_tools.rs               ← 8-tool server with Inspector
│   ├── anthropic_agent_demo_with_tools.rs ← Claude agent example
│   ├── openai_agent_demo_with_tools.rs    ← OpenAI agent example
│   ├── browser_agent_openai.rs            ← Browser automation with OpenAI
│   ├── browser_agent_anthropic.rs         ← Browser automation with Claude
│   ├── client_usage.rs                    ← Client usage example
│   └── simple_server.rs                   ← Minimal server example
├── assets/
│   └── banner.png
├── Cargo.toml
├── Cargo.lock
├── LICENSE                   ← MIT License
├── .env.example              ← Environment variables template
├── .gitignore
└── README.md

🚀 Core API Reference

Create a Server

let config = ServerConfig {
    name: "My Server".to_string(),
    version: "1.0.0".to_string(),
    capabilities: ServerCapabilities {
        tools: Some(ToolsCapability { list_changed: Some(false) }),
        resources: None,  // Not implemented yet
        prompts: None,    // Not implemented yet
    },
};

let handler = Arc::new(MyToolHandler);
let server = McpServer::new(config, handler);

Register a Tool

use std::collections::HashMap;
use serde_json::json;

let mut properties = HashMap::new();
properties.insert("param".to_string(), json!({"type": "string"}));

server.register_tool(Tool {
    name: "my_tool".to_string(),
    description: Some("Does something useful".to_string()),
    input_schema: Some(ToolInputSchema {
        schema_type: "object".to_string(),
        properties,
        required: Some(vec!["param".to_string()]),
    }),
});

Implement ToolHandler

#[async_trait::async_trait]
impl ToolHandler for MyHandler {
    async fn execute(&self, name: &str, arguments: Value)
        -> Result<Vec<ResultContent>> {
        match name {
            "my_tool" => {
                // Extract the argument, falling back to an empty string
                // (`?` on an Option does not compile in a Result-returning fn)
                let param = arguments.get("param")
                    .and_then(|v| v.as_str())
                    .unwrap_or_default();

                // Implement your logic
                let result = do_something(param);

                Ok(vec![ResultContent::Text {
                    text: result.to_string()
                }])
            }
            _ => Err(Error::ToolNotFound(name.to_string())),
        }
    }
}

Create an Agent

use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<()> {
    mcp_framework::load_env();

    let client = McpClient::new("http://localhost:3000");
    let llm = AnthropicAdapter::from_env("claude-sonnet-4-5-20250929".to_string())?;

    let mut agent = Agent::new(client, Arc::new(llm), AgentConfig {
        max_iterations: 10,
        max_tokens: Some(2048),
    });

    let response = agent.run("Your query here").await?;
    println!("Response: {}", response);

    Ok(())
}

Use the Client

let client = McpClient::new("http://localhost:3000");

// List available tools
let tools = client.list_tools().await?;

// Call a tool
let result = client.call_tool("echo", json!({
    "message": "Hello!"
})).await?;

🧪 Testing

# Run all tests
cargo test

# Run with output
cargo test -- --nocapture

# Run specific test
cargo test test_name

# Run with release optimizations
cargo test --release

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.


📄 License

MIT License; see the LICENSE file for details.


🔗 Resources


Made with ❤️ for the MCP community

Report Issues • Discussions
