MCPJam Inspector is the local-first development platform for MCP servers. Visually test your server's tools, resources, and prompts. Try your server against different models in the LLM playground. Now with support for the OpenAI Apps SDK.
Start up the MCPJam inspector:
npx @mcpjam/inspector@latest
| Feature | Description |
|---|---|
| Protocol handshake testing | Visually test your MCP server's tools, resources, prompts, elicitation, and OAuth 2.0. MCPJam is compliant with the latest MCP specs. |
| All transports | Connect to any MCP server. The MCPJam inspector supports STDIO, SSE, and Streamable HTTP transports. |
| LLM playground | Integrated chat playground with support for OpenAI, Anthropic Claude, Google Gemini, and Ollama models. Test how your MCP server behaves against an LLM. |
| Test OAuth | Test your server's OAuth and Dynamic Client Registration implementation. |
| View JSON-RPC | View every JSON-RPC message sent over the network, for granular observability and debugging. |
| MCP-UI and OpenAI Apps SDK | Test your MCP server's implementation of MCP-UI or the OpenAI Apps SDK. |
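If you don't have a server to point the inspector at yet, a minimal one is enough to exercise most of the features above. Here is a small sketch using the official TypeScript MCP SDK; the server name, tool, and file name are placeholders, not part of MCPJam:

```typescript
// server.ts — a toy MCP server over STDIO with a single tool to inspect.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "demo-server", version: "1.0.0" });

// One deterministic tool so the Tools tab and LLM playground have something to call.
server.tool(
  "add",
  "Add two numbers",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  })
);

// STDIO transport: the inspector launches this process and exchanges JSON-RPC over stdin/stdout.
await server.connect(new StdioServerTransport());
```

Once built, you can launch it through the inspector the same way as the local STDIO examples later in this README.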
Developing with the Apps SDK is fairly restricted right now: it requires ChatGPT developer mode access and approval from an OpenAI partner. We wanted to make that more accessible to developers today by putting it in an open-source project and giving y'all a head start.
Test your Apps SDK app with:
- The Tools tab: deterministically call tools and view your UI
- The LLM playground: see your Apps SDK UI in a chat environment
The feature is in beta and still needs polishing. Please report any bugs in the Issues tab. We encourage the community to contribute!
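To give a feel for what the inspector renders, here is a hedged sketch of the MCP-UI flavor: a tool whose result embeds an HTML resource under a ui:// URI. The server name, URI, and markup are made up for illustration, and the Apps SDK flow differs (it registers a component template plus extra tool metadata), so treat this only as a starting point:

```typescript
// ui-server.ts — sketch of an MCP-UI style tool result (embedded HTML resource).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const server = new McpServer({ name: "ui-demo", version: "1.0.0" });

server.tool("show_greeting", "Render a small HTML greeting card", {}, async () => ({
  content: [
    {
      // MCP-UI clients look for embedded resources with a ui:// URI and an HTML payload.
      type: "resource",
      resource: {
        uri: "ui://ui-demo/greeting", // placeholder URI for this sketch
        mimeType: "text/html",
        text: "<h1>Hello from MCP-UI</h1>",
      },
    },
  ],
}));

await server.connect(new StdioServerTransport());
```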
We recommend starting the MCPJam inspector via npx:
npx @mcpjam/inspector@latest
or download the Mac / Windows desktop app on our site.
Run MCPJam Inspector using Docker:
# Run the latest version from Docker Hub
docker run -p 3001:3001 mcpjam/mcp-inspector:latest
# Or run in the background
docker run -d -p 3001:3001 --name mcp-inspector mcpjam/mcp-inspector:latest
The application will be available at http://localhost:3001.
# Launch with custom port
npx @mcpjam/inspector@latest --port 4000
# Shortcut for starting MCPJam and an Ollama model
npx @mcpjam/inspector@latest --ollama llama3.2
# Local FastMCP STDIO example
npx @mcpjam/inspector@latest uv run fastmcp run /Users/matt8p/demo/src/server.py
# Local Node example
npx @mcpjam/inspector@latest npx -y /Users/matt8p/demo-ts/dist/index.js
You can import your mcp.json MCP server configs from Claude Desktop and Cursor with the command:
npx @mcpjam/inspector@latest --config mcp.json
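The file uses the same mcpServers layout as Claude Desktop and Cursor. A hypothetical example, where the server names, path, and URL are placeholders:

```json
{
  "mcpServers": {
    "local-demo": {
      "command": "npx",
      "args": ["-y", "/path/to/demo/dist/index.js"]
    },
    "remote-demo": {
      "url": "https://example.com/mcp"
    }
  }
}
```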
Spin up the MCPJam inspector:
npx @mcpjam/inspector@latest
In the UI "MCP Servers" tab, click add server, select HTTP, then paste in your server URL. Support for OAuth 2.0 testing.
# Clone the repository
git clone https://github.com/mcpjam/inspector.git
cd inspector
# Install dependencies
npm install
# Start development server
npm run dev
The development server will start at http://localhost:6274 with hot reloading enabled.
# Build the application
npm run build
# Start production server
npm run start
We welcome contributions to MCPJam Inspector V1! Please read our Contributing Guide for development guidelines and best practices.
- Discord: Join the MCPJam Community
- MCP Protocol: Model Context Protocol Documentation
- AI SDK: Vercel AI SDK
- FastApps: DooiLabs/FastApps, the Python framework to build OpenAI Apps
- xMCP: the TypeScript MCP framework. Ship on Vercel instantly.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
MCPJam Inspector V1 • Built with Hono.js and ❤️ for the MCP community
Website • Docs • Issues