Your MCP Concierge - The complete platform for discovering, managing, and accessing MCPs.
MCP turns AI into your assistant. NCP turns your assistant into an executive.
From desperation to delegation - your AI goes from overwhelmed to unstoppable.
NCP transforms N scattered MCP servers into 1 intelligent orchestrator. Your AI sees just 2 simple tools instead of 50+ complex ones, while NCP handles all the routing, discovery, and execution behind the scenes.
🚀 NEW: Project-level configuration - each project can define its own MCPs automatically
Result: Same tools, same capabilities, but your AI becomes focused, efficient, and cost-effective again.
What's MCP? The Model Context Protocol by Anthropic lets AI assistants connect to external tools and data sources. Think of MCPs as "plugins" that give your AI superpowers like file access, web search, databases, and more.
You gave your AI assistant 50 tools to be more capable. Instead, you got desperation:
- Paralyzed by choice ("Should I use read_file or get_file_content?")
- Exhausted before starting ("I've spent my context limit analyzing which tool to use")
- Costs explode (50+ tool schemas burn tokens before any real work happens)
- Asks instead of acts (used to be decisive, now constantly asks for clarification)
Think about it:
A child with one toy → Treasures it, masters it, creates endless games with it
A child with 50 toys → Can't hold them all, loses pieces, gets overwhelmed, stops playing entirely
Your AI is that child. MCPs are the toys. More isn't always better.
Or picture this: You're craving pizza. Someone hands you a pizza → Pure joy! 🍕
But take you to a buffet with 200 dishes → Analysis paralysis. You spend 20 minutes deciding, lose your appetite, leave unsatisfied.
Same with your AI: Give it one perfect tool → Instant action. Give it 50 tools → Cognitive overload.
The most creative people thrive with constraints, not infinite options. Your AI is no different.
Think about it:
- A poet with "write about anything" → Writer's block
- A poet with "write a haiku about rain" → Instant inspiration
- A developer with access to "all programming languages" → Analysis paralysis
- A developer with "Python for this task" → Focused solution
Your AI needs the same focus. NCP gives it constraints that spark creativity, not chaos that kills it.
When your AI assistant manages 50 tools directly:
🤖 AI Assistant Context:
├── Filesystem MCP (12 tools) ─ 15,000 tokens
├── Database MCP (8 tools) ─── 12,000 tokens
├── Web Search MCP (6 tools) ── 8,000 tokens
├── Email MCP (15 tools) ───── 18,000 tokens
├── Shell MCP (10 tools) ───── 14,000 tokens
├── GitHub MCP (20 tools) ──── 25,000 tokens
└── Slack MCP (9 tools) ────── 11,000 tokens
💀 Total: 80 tools = 103,000 tokens of schemas
What happens:
- AI burns 50%+ of context just understanding what tools exist
- Spends 5-8 seconds analyzing which tool to use
- Often picks wrong tool due to schema confusion
- Hits context limits mid-conversation
With NCP as Chief of Staff:
🤖 AI Assistant Context:
└── NCP (2 unified tools) ──── 2,500 tokens
🎯 Behind the scenes: NCP manages all 80 tools
📈 Context saved: 100,500 tokens (97% reduction!)
⚡ Decision time: Sub-second tool selection
🎪 AI behavior: Confident, focused, decisive
Real results from our testing:
Your MCP Setup | Without NCP | With NCP | Token Savings |
---|---|---|---|
Small (5 MCPs, 25 tools) | 15,000 tokens | 8,000 tokens | 47% saved |
Medium (15 MCPs, 75 tools) | 45,000 tokens | 12,000 tokens | 73% saved |
Large (30 MCPs, 150 tools) | 90,000 tokens | 15,000 tokens | 83% saved |
Enterprise (50+ MCPs, 250+ tools) | 150,000 tokens | 20,000 tokens | 87% saved |
Translation:
- 5x faster responses (8 seconds → 1.5 seconds)
- 12x longer conversations before hitting limits
- 90% reduction in wrong tool selection
- Zero context exhaustion in typical sessions
- Node.js 18+ (download from nodejs.org)
- npm (included with Node.js) or npx for running packages
- Command line access (Terminal on Mac/Linux, Command Prompt/PowerShell on Windows)
Choose your preferred installation method:
Method | Best For |
---|---|
📦 .dxt Bundle | Claude Desktop users |
📥 npm Package | All MCP clients, CLI users |
For Claude Desktop users - Download and double-click to install:
- Download NCP Desktop Extension: ncp.dxt
- Double-click the downloaded ncp.dxt file - Claude Desktop will prompt you to install - click "Install"
- Auto-sync with Claude Desktop - NCP continuously syncs MCPs:
  - Detects MCPs from claude_desktop_config.json
  - Detects .dxt-installed extensions
  - Runs on every startup to find new MCPs
  - Uses the internal add command for cache coherence
🔄 Continuous sync: NCP automatically detects and imports new MCPs every time you start it! Add an MCP to Claude Desktop → NCP auto-syncs it on next startup. Zero manual configuration needed.
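To confirm the sync picked up a newly added MCP, you can check from a terminal (this assumes the optional npm CLI is installed, since the .dxt bundle itself ships without CLI commands):

# After adding an MCP to claude_desktop_config.json and restarting Claude Desktop,
# verify that NCP imported it:
ncp list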
If you want to add more MCPs later, configure manually by editing ~/.ncp/profiles/all.json:
# Edit the profile configuration
nano ~/.ncp/profiles/all.json
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/yourname"]
},
"github": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-github"],
"env": {
"GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_xxx"
}
}
}
}
- Restart Claude Desktop and NCP will load your configured MCPs
ℹ️ About .dxt (Desktop Extension) format:
- Slim & Fast: Desktop extension is MCP-only (126KB, no CLI code)
- Manual config: Edit JSON files directly (no ncp add command)
- Power users: Fastest startup, direct control over configuration
- Optional CLI: Install npm install -g @portel/ncp separately if you want CLI tools

Why .dxt is slim: The .dxt (Desktop Extension) format excludes all CLI code, making it 13% smaller and faster to load than the full npm package. Perfect for production use where you manage configs manually or via automation.
Already have MCPs? Don't start over - import everything instantly:
# Install NCP globally (recommended)
npm install -g @portel/ncp
# Copy your claude_desktop_config.json content to clipboard:
# 1. Open your claude_desktop_config.json file (see locations above)
# 2. Select all content (Ctrl+A / Cmd+A) and copy (Ctrl+C / Cmd+C)
# 3. Then run:
ncp config import
# ✨ Magic! NCP auto-detects and imports ALL your MCPs from clipboard
Note: All commands below assume global installation (npm install -g). For npx usage, see the Alternative Installation section.
Replace your entire MCP configuration with this single entry:
{
"mcpServers": {
"ncp": {
"command": "ncp"
}
}
}
Your AI now sees just 2 simple tools instead of 50+ complex ones:
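As a rough sketch, here is what that unified interface looks like from the AI's side (the tool names and descriptions below are illustrative, not the exact schemas NCP publishes):

{
  "tools": [
    { "name": "find", "description": "Describe what you need; returns the best-matching tools" },
    { "name": "run", "description": "Execute a chosen tool, e.g. filesystem:read_file, with parameters" }
  ]
}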
🎉 Done! Same tools, same capabilities, but your AI is now focused and efficient.
Want to experience what your AI experiences? NCP has a human-friendly CLI:
# Ask like your AI would ask:
ncp find "I need to read a file"
ncp find "help me send an email"
ncp find "search for something online"
Notice: NCP understands intent, not just keywords. That's exactly what your AI needs.
# See your complete MCP ecosystem:
ncp list --depth 2
# Get help anytime:
ncp --help
# Test any tool safely:
ncp run filesystem:read_file --params '{"path": "/tmp/test.txt"}'
Why this matters: You can debug and test tools directly, just like your AI would use them.
# 1. Check NCP is installed correctly
ncp --version
# 2. Confirm your MCPs are imported
ncp list
# 3. Test tool discovery
ncp find "file"
# 4. Test a simple tool (if you have filesystem MCP)
ncp run filesystem:read_file --params '{"path": "/tmp/test.txt"}' --dry-run
✅ Success indicators:
- NCP shows version number
- ncp list shows your imported MCPs
- ncp find returns relevant tools
- Your AI client shows only NCP in its tool list
Prefer not to install globally? Use npx for any client configuration:
# All the above commands work with npx - just replace 'ncp' with 'npx @portel/ncp':
# Import MCPs
npx @portel/ncp config import
# Add MCPs
npx @portel/ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents
# Find tools
npx @portel/ncp find "file operations"
# Configure client (example: Claude Desktop)
{
"mcpServers": {
"ncp": {
"command": "npx",
"args": ["@portel/ncp"]
}
}
}
When to use npx: Perfect for trying NCP, CI/CD environments, or when you can't install packages globally.
- Desperate Assistant: "I see 50 tools... which should I use... let me think..."
- Executive Assistant: "I need file access. Done." (NCP handles the details)
- Before: 100k+ tokens burned on tool confusion
- After: 2.5k tokens for focused execution
- Result: 40x token efficiency = 40x longer conversations
- Desperate: AI freezes, picks wrong tool, asks for clarification
- Executive: NCP's Chief of Staff finds the RIGHT tool instantly
- Before: 8-second delays, hesitation, "Which tool should I use?"
- After: Instant decisions, immediate execution, zero doubt
Bottom line: Your AI goes from desperate assistant to executive assistant.
Prefer to build from scratch? Add MCPs manually:
# Add the most popular MCPs:
# AI reasoning and memory
ncp add sequential-thinking npx @modelcontextprotocol/server-sequential-thinking
ncp add memory npx @modelcontextprotocol/server-memory
# File and development tools
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents # Path: directory to access
ncp add github npx @modelcontextprotocol/server-github # No path needed
# Search and productivity
ncp add brave-search npx @modelcontextprotocol/server-brave-search # No path needed
💡 Pro tip: Browse Smithery.ai (2,200+ MCPs) or mcp.so to discover tools for your specific needs.
# Community favorites (download counts from Smithery.ai):
ncp add sequential-thinking npx @modelcontextprotocol/server-sequential-thinking # 5,550+ downloads
ncp add memory npx @modelcontextprotocol/server-memory # 4,200+ downloads
ncp add brave-search npx @modelcontextprotocol/server-brave-search # 680+ downloads
# Popular dev tools:
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/code
ncp add github npx @modelcontextprotocol/server-github
ncp add shell npx @modelcontextprotocol/server-shell
# Enterprise favorites:
ncp add gmail npx @mcptools/gmail-mcp
ncp add slack npx @modelcontextprotocol/server-slack
ncp add google-drive npx @modelcontextprotocol/server-gdrive
ncp add postgres npx @modelcontextprotocol/server-postgres
ncp add puppeteer npx @hisma/server-puppeteer
Configuration File Location:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
Replace your entire claude_desktop_config.json with:
{
"mcpServers": {
"ncp": {
"command": "ncp"
}
}
}
📌 Important: Restart Claude Desktop after saving the config file.
Note: Configuration file locations are current as of this writing. For the most up-to-date setup instructions, please refer to the official Claude Desktop documentation.
NCP works automatically! Just run:
ncp add <your-mcps>
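For example, to give a project access to local files (using the same server package shown throughout this guide):

# Add a filesystem MCP scoped to the current project directory
ncp add filesystem npx @modelcontextprotocol/server-filesystem ./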
Settings File Location:
- macOS: ~/Library/Application Support/Code/User/settings.json
- Windows: %APPDATA%\Code\User\settings.json
- Linux: ~/.config/Code/User/settings.json
Add to your VS Code settings.json:
{
"mcp.servers": {
"ncp": {
"command": "ncp"
}
}
}
📌 Important: Restart VS Code after saving the settings file.
Disclaimer: Configuration paths and methods are accurate as of this writing. VS Code and its extensions may change these locations or integration methods. Please consult the official VS Code documentation for the most current information.
For Cursor IDE, add NCP to its MCP configuration:
{
"mcp": {
"servers": {
"ncp": {
"command": "ncp"
}
}
}
}
Disclaimer: Configuration format and location may vary by Cursor IDE version. Please refer to Cursor's official documentation for the most up-to-date setup instructions.
NCP includes powerful internal MCPs that extend functionality beyond external tool orchestration:
Schedule any MCP tool to run automatically using cron or natural language schedules.
# Schedule a daily backup check
ncp run schedule:create --params '{
"name": "Daily Backup",
"schedule": "every day at 2am",
"tool": "filesystem:list_directory",
"parameters": {"path": "/backups"}
}'
Features:
- ✅ Natural language schedules ("every day at 9am", "every monday")
- ✅ Standard cron expressions for advanced control (see the example after this list)
- ✅ Automatic validation before scheduling
- ✅ Execution history and monitoring
- ✅ Works even when NCP is not running (system cron integration)
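For the cron form mentioned above, the same schedule:create tool takes a standard five-field expression; everything except the schedule string mirrors the natural-language example (the name, tool, and path below are just placeholders):

# Weekly check every Monday at 9am using a cron expression
ncp run schedule:create --params '{
  "name": "Weekly Report Check",
  "schedule": "0 9 * * 1",
  "tool": "filesystem:list_directory",
  "parameters": {"path": "/reports"}
}'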
Install and configure MCPs dynamically through natural language.
# AI can discover and install MCPs for you
ncp find "install mcp"
# Shows: mcp:install, mcp:search, mcp:configure
Features:
- ✅ Search and discover MCPs from registries (example below)
- ✅ Install MCPs without manual configuration
- ✅ Update and remove MCPs programmatically
- ✅ AI can self-extend with new capabilities
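A hypothetical search invocation might look like this; the parameter name is an assumption, so run ncp find "install mcp" first to see the actual schema:

# Hypothetical example - the "query" parameter name is assumed, check the real schema first
ncp run mcp:search --params '{"query": "postgres"}'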
Configuration: Internal MCPs are disabled by default. Enable in your profile settings:
{
"settings": {
"enable_schedule_mcp": true,
"enable_mcp_management": true
}
}
NCP automatically detects broken MCPs and routes around them:
ncp list --depth 1 # See health status
ncp config validate # Check configuration health
🎯 Result: Your AI never gets stuck on broken tools.
Organize MCPs by project or environment:
# Development setup
ncp add --profile dev filesystem npx @modelcontextprotocol/server-filesystem ~/dev
# Production setup
ncp add --profile prod database npx production-db-server
# Use specific profile
ncp --profile dev find "file tools"
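To point an AI client at a specific profile, the same --profile flag can be passed in the client configuration (a sketch; the args placement is assumed to match the CLI usage above):

{
  "mcpServers": {
    "ncp": {
      "command": "ncp",
      "args": ["--profile", "dev"]
    }
  }
}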
New: Configure MCPs per project with automatic detection - perfect for teams and Cloud IDEs:
# In any project directory, create local MCP configuration:
mkdir .ncp
ncp add filesystem npx @modelcontextprotocol/server-filesystem ./
ncp add github npx @modelcontextprotocol/server-github
# NCP automatically detects and uses project-local configuration
ncp find "save file" # Uses only project MCPs
How it works:
- 📁 Local .ncp directory exists → Uses project configuration
- 🏠 No local .ncp directory → Falls back to global ~/.ncp
- 🎯 Zero profile management needed → Everything goes to the default all.json
Perfect for:
- 🤖 Claude Code projects (project-specific MCP tooling)
- 👥 Team consistency (ship the .ncp folder with your repo)
- 🔧 Project-specific tooling (each project defines its own MCPs)
- 📦 Environment isolation (no global MCP conflicts)
# Example project structures:
frontend-app/
.ncp/profiles/all.json # → playwright, lighthouse, browser-context
src/
api-backend/
.ncp/profiles/all.json # → postgres, redis, docker, kubernetes
server/
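A project-local profile uses the same mcpServers format as the global one. Here is a minimal sketch for the api-backend example above, reusing server packages mentioned elsewhere in this guide (the redis/docker entries would follow the same pattern, and server-specific arguments such as a database connection string would go in args):

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./"]
    }
  }
}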
NCP supports both stdio (local) and HTTP/SSE (remote) MCP servers:
Stdio Transport (Traditional):
# Local MCP servers running as processes
ncp add filesystem npx @modelcontextprotocol/server-filesystem ~/Documents
HTTP/SSE Transport (Remote):
{
"mcpServers": {
"remote-mcp": {
"url": "https://mcp.example.com/api",
"auth": {
"type": "bearer",
"token": "your-token-here"
}
}
}
}
🔋 Hibernation-Enabled Servers:
NCP automatically supports hibernation-enabled MCP servers (like Cloudflare Durable Objects or Metorial):
- Zero configuration needed - Hibernation works transparently
- Automatic wake-up - Server wakes on demand when NCP makes requests
- State preservation - Server state is maintained across hibernation cycles
- Cost savings - Only pay when MCPs are actively processing requests
How it works:
- Server hibernates when idle (consumes zero resources)
- NCP sends a request → Server wakes instantly
- Server processes request and responds
- Server returns to hibernation after idle timeout
Perfect for:
- 💰 Cost optimization - Only pay for active processing time
- 🌐 Cloud-hosted MCPs - Metorial, Cloudflare Workers, serverless platforms
- ♻️ Resource efficiency - No idle server costs
- 🚀 Scale to zero - Servers automatically sleep when not needed
Note: Hibernation is a server-side feature. NCP's standard HTTP/SSE client automatically works with both traditional and hibernation-enabled servers without any special configuration.
# From clipboard (any JSON config)
ncp config import
# From specific file
ncp config import "~/my-mcp-config.json"
# From Claude Desktop (auto-detected paths)
ncp config import
# Check what was imported
ncp list
# Validate health of imported MCPs
ncp config validate
# See detailed import logs
DEBUG=ncp:* ncp config import
- Check connection: ncp list (should show your MCPs)
- Test discovery: ncp find "your query"
- Validate config: Ensure your AI client points to the ncp command
# Check MCP health (unhealthy MCPs slow everything down)
ncp list --depth 1
# Clear cache if needed
rm -rf ~/.ncp/cache
# Monitor with debug logs
DEBUG=ncp:* ncp find "test"
Like Yin and Yang, everything relies on the balance of things.
Compute gives us precision and certainty. AI gives us creativity and probability.
We believe breakthrough products emerge when you combine these forces in the right ratio.
How NCP embodies this balance:
What NCP Does | AI (Creativity) | Compute (Precision) | The Balance |
---|---|---|---|
Tool Discovery | Understands "read a file" semantically | Routes to exact tool deterministically | Natural request → Precise execution |
Orchestration | Flexible to your intent | Reliable tool execution | Natural flow → Certain outcomes |
Health Monitoring | Adapts to patterns | Monitors connections, auto-failover | Smart adaptation → Reliable uptime |
Neither pure AI (too unpredictable) nor pure compute (too rigid).
Your AI stays creative. NCP handles the precision.
Want the technical details? Token analysis, architecture diagrams, and performance benchmarks:
Learn about:
- Vector similarity search algorithms
- N-to-1 orchestration architecture
- Real-world token usage comparisons
- Health monitoring and failover systems
Help make NCP even better:
- 🐛 Bug reports: GitHub Issues
- 💡 Feature requests: GitHub Discussions
- 🔄 Pull requests: Contributing Guide
Elastic License 2.0 - Full License
TLDR: Free for all use including commercial. Cannot be offered as a hosted service to third parties.