
MCPJam/inspector


MCPJam inspector is the testing and debugging platform for MCP servers & OpenAI apps. Visually inspect your server's tools, resources, prompts, and OAuth flow. Try your server against different models in the LLM playground.

🚀 Quick Start

Start up the MCPJam inspector:

npx @mcpjam/inspector@latest

MCPJam Inspector Demo

Table of contents

  • Installation Guides
  • Key Features
  • Contributing
  • Links
  • Community
  • Shoutouts
  • License

Installation Guides

Requirements

  • Node.js
  • TypeScript

Install via NPM

We recommend starting the MCPJam inspector via npx:

npx @mcpjam/inspector@latest

We also have a desktop app for Mac and Windows.

Docker

Run MCPJam Inspector using Docker:

# Using Docker Compose (recommended)
docker-compose up -d

# Or using Docker run
docker run -d \
  -p 6274:6274 \
  --env-file .env.production \
  -e NODE_ENV=production \
  --add-host host.docker.internal:host-gateway \
  --name mcp-inspector \
  --restart unless-stopped \
  mcpjam/mcp-inspector:latest

The application will be available at http://127.0.0.1:6274.
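
If you go the Docker Compose route, the file only needs to mirror the docker run flags above. Here is a minimal docker-compose.yml sketch; the repository's own file may differ:

# docker-compose.yml -- minimal sketch mirroring the docker run flags above
services:
  mcp-inspector:
    image: mcpjam/mcp-inspector:latest
    ports:
      - "6274:6274"
    env_file:
      - .env.production
    environment:
      - NODE_ENV=production
    extra_hosts:
      - "host.docker.internal:host-gateway"
    restart: unless-stopped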

Important for macOS/Windows users:

  • Access the app via http://127.0.0.1:6274 (not localhost)
  • When connecting to MCP servers on your host machine, use http://host.docker.internal:PORT instead of http://127.0.0.1:PORT

Example:

# Your MCP server runs on host at: http://127.0.0.1:8080/mcp
# In Docker, configure it as: http://host.docker.internal:8080/mcp

Key Features

OpenAI Apps & MCP-UI

Develop OpenAI apps or MCP-UI apps locally. No ngrok needed. MCPJam is the only local-first OpenAI app emulator.

MCPJam LLM playground

OAuth Debugger

View every step of the OAuth handshake in detail, with guided explanations.

MCPJam OAuth Flow Debugger

LLM Playground

Try your server against any LLM. We provide frontier models like GPT-5, Claude Sonnet, and Gemini 2.5. No API key needed; it's on us.

MCPJam LLM playground
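
Any MCP server with at least one tool is enough to exercise the playground. Below is a minimal stdio server sketch; it assumes the official @modelcontextprotocol/sdk TypeScript package and zod, but any MCP SDK or framework (including the ones under Shoutouts) works just as well.

// server.ts -- hypothetical minimal MCP server to point the inspector at
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// One server with a single "add" tool that the inspector can list and the playground can call
const server = new McpServer({ name: "demo-server", version: "1.0.0" });

server.tool(
  "add",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  })
);

// Serve MCP over stdio so the inspector can launch and connect to it as a STDIO server
await server.connect(new StdioServerTransport());

Run it with a TypeScript runner such as tsx, add it in the inspector as a STDIO server, then ask the playground model to call the add tool.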

Contributing 👨‍💻

We're grateful that you're considering contributing to MCPJam. Please read our contributing guide.

You can also reach out to the contributors who hang out in our Discord channel.

Links 🔗

Community 🌍

Shoutouts 📣

Some of our partners and favorite frameworks:

  • Stytch - Our favorite MCP OAuth provider
  • DooiLabs/FastApps - The Python framework for building OpenAI Apps
  • xMCP - The TypeScript MCP framework; ship on Vercel instantly
  • Alpic - Hosting for MCP servers

License 📄

This project is licensed under the Apache License 2.0. See the LICENSE file for details.