An intelligent proxy server designed for Claude Code with dual routing modes: direct pass-through to Anthropic API or smart conversion to OpenRouter API, featuring comprehensive monitoring and analytics.
This project provides a local proxy server that sits between Claude Code and the Anthropic API, acting as an intelligent middleware layer that enhances your development experience with Claude.
When using Claude Code, you typically send requests directly to Anthropic's servers. That works fine, but you miss out on valuable insights into your API usage. This proxy server closes that gap by providing:
🔍 Complete Request Visibility: See every API call made by Claude Code, including full request/response data, token usage, and timing metrics.
📊 Usage Analytics: Track your API consumption patterns, model usage, success rates, and performance metrics over time.
🐛 Debugging Support: Inspect failed requests, analyze response times, and troubleshoot API integration issues with detailed logging.
💰 Cost Monitoring: Monitor token usage and estimate API costs to better manage your Claude Code usage.
📈 Performance Optimization: Identify slow requests, optimize your prompts, and improve overall workflow efficiency.
```mermaid
graph LR
    A[Claude Code] -->|API Requests| B[Anthropic Proxy]
    B -->|Forwards Requests| C[Anthropic API]
    C -->|Returns Responses| B
    B -->|Returns Responses| A
    B -->|Stores & Analyzes| D[Monitoring Dashboard]
```
- Transparent Proxying: Claude Code sends requests to your local proxy instead of directly to Anthropic
- Request Interception: The proxy captures all request/response data for analysis
- API Forwarding: Requests are forwarded to the Anthropic API without modification
- Real-time Monitoring: All interactions are logged and analyzed in a web dashboard
- Data Export: Export usage data for further analysis or reporting
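The forwarding step above can be sketched as a small helper that rebuilds the upstream URL and copies request headers, dropping the hop-by-hop ones a proxy must not pass along (an illustrative sketch; the function names are hypothetical, not the project's actual code):

```javascript
// Hypothetical sketch of the proxy's forwarding step (not the actual source).
// Rebuilds the upstream URL and copies headers, dropping hop-by-hop headers
// that an HTTP proxy must not forward.
const HOP_BY_HOP = new Set([
  'connection', 'keep-alive', 'proxy-authorization',
  'te', 'trailer', 'transfer-encoding', 'upgrade', 'host',
]);

function buildUpstreamRequest(baseUrl, path, headers) {
  const url = new URL(path, baseUrl).toString();
  const forwarded = {};
  for (const [name, value] of Object.entries(headers)) {
    if (!HOP_BY_HOP.has(name.toLowerCase())) forwarded[name] = value;
  }
  return { url, headers: forwarded };
}

// Example: a Claude Code request arriving at the local proxy
const req = buildUpstreamRequest(
  'https://api.anthropic.com',
  '/v1/messages',
  { 'x-api-key': 'sk-ant-xxx', Host: 'localhost:8082', 'content-type': 'application/json' }
);
console.log(req.url);               // https://api.anthropic.com/v1/messages
console.log('Host' in req.headers); // false - the Host header is not forwarded
```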
- Claude Code Power Users: Developers who want to optimize their AI-assisted workflows
- API Cost Conscious Users: Those who need to monitor and control their API spending
- Development Teams: Teams that need visibility into AI tool usage across projects
- API Integration Developers: Developers building applications with the Anthropic API
- Performance Analysts: Users who want to analyze and optimize their prompt efficiency
✅ Zero Code Changes: Works with your existing Claude Code installation - just set one environment variable
✅ Real-time Insights: Live dashboard with immediate feedback on API usage
✅ Privacy Focused: All data stays on your local machine
✅ Production Ready: Includes Docker, PM2, and deployment configurations
✅ Export Capabilities: Get your data out in standard formats for further analysis
```
anthropic-proxy/
├── src/                       # Source code
│   ├── server.js              # Main server application
│   ├── monitor/               # Monitoring modules
│   │   ├── store.js           # Request/response data storage
│   │   └── ui.js              # Web monitoring interface
│   └── utils/                 # Utility functions
├── docs/                      # Documentation
├── examples/                  # Configuration examples
│   ├── Dockerfile             # Docker container setup
│   ├── docker-compose.yml     # Docker Compose configuration
│   └── pm2.config.js          # PM2 process management
├── package.json
├── README.md
├── .env.example               # Environment variables template
├── .gitignore
└── LICENSE
```
- 🚀 Node.js-based proxy for Anthropic API
- 🔀 Dual routing modes: Direct Anthropic API or OpenRouter with automatic format conversion
- 🛠 Advanced configuration UI at `/config` with real-time updates
  - Switchable proxy modes (Anthropic/OpenRouter)
  - Simplified 3-model family mapping (Sonnet/Opus/Haiku)
  - Dynamic model loading from OpenRouter with 1-hour caching
  - Searchable model selection with HTML5 datalist filtering
  - Toast notifications for all user actions
  - Navigation links between monitor and config pages
- 🔄 Live configuration reload - All changes take effect immediately without restart
- 🔐 Secure API key management via environment variables only
- 📊 Comprehensive monitoring dashboard with real-time SSE updates
- 🔍 Enhanced logging - Detailed logs for both Anthropic and OpenRouter modes
- 🔒 API key masking in logs and UI for security
- 📈 Performance metrics and token usage tracking with filtering
- 🌊 Full streaming support with chunk-by-chunk analysis
- 💾 Smart data export with compressed export and filtering
- 🐳 Docker support with production examples
- ⚡ Production ready with PM2 and systemd configurations
Purpose: Acts as a transparent proxy, forwarding Claude Code requests directly to the Anthropic API while providing complete monitoring
How to Use:
- Start the proxy server:
  ```bash
  npx github:kingoliang/anthropic-proxy
  ```
- Configure Claude Code to use the proxy:
  ```bash
  export ANTHROPIC_BASE_URL=http://localhost:8082
  ```
- Continue using Claude Code normally - the Anthropic API key is passed through via headers (`x-api-key` or `authorization`)
- Monitor all requests in real-time at `http://localhost:8082/monitor`
One-liner Example:

```bash
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

Benefits:
- ✅ Zero code changes, fully compatible with existing Claude Code setup
- ✅ Maintains native Anthropic API experience
- ✅ Complete request/response monitoring and analysis
- ✅ Supports all Anthropic models and features
Purpose: Intelligently converts Claude Code's Anthropic API requests to OpenRouter format, enabling use of cheaper third-party models
How to Use:
- Set your OpenRouter API key:
  ```bash
  export OPENROUTER_API_KEY=sk-or-v1-your_key_here
  ```
- Start the proxy server:
  ```bash
  npx github:kingoliang/anthropic-proxy
  ```
- Open the configuration interface to switch modes:
  ```bash
  open http://localhost:8082/config
  ```
- Select "OpenRouter" mode in the web interface
- Configure model mappings (optional):
- Sonnet → Choose an OpenRouter model
- Opus → Choose an OpenRouter model
- Haiku → Choose an OpenRouter model
- Save configuration and continue using Claude Code normally
Complete Example:

```bash
# 1. Set environment variables
export OPENROUTER_API_KEY=sk-or-v1-your_key_here
export ANTHROPIC_BASE_URL=http://localhost:8082

# 2. Start the proxy
npx github:kingoliang/anthropic-proxy

# 3. Visit http://localhost:8082/config in a browser and switch to OpenRouter mode

# 4. Use Claude Code (it will automatically use OpenRouter models)
claude
```

Benefits:
- 💰 Cost Savings: Use OpenRouter's cheaper third-party models
- 🔄 Automatic Conversion: Request/response formats automatically converted, Claude Code is unaware
- 🌐 More Choices: 99+ models available, including various open source and commercial models
- 📊 Full Monitoring: Transparent conversion process, view original requests and converted OpenRouter requests
```bash
# Run immediately without cloning
npx github:kingoliang/anthropic-proxy

# Or with custom configuration
PORT=3000 LOG_LEVEL=DEBUG npx github:kingoliang/anthropic-proxy
```

```bash
# Clone the repository
git clone https://github.com/kingoliang/anthropic-proxy.git
cd anthropic-proxy

# Install dependencies
npm install

# Configure environment (optional)
cp .env.example .env
# Edit the .env file with your settings

# Run the server
npm start
# or
npx .
```

```bash
# Install globally from GitHub
npm install -g github:kingoliang/anthropic-proxy

# Run anywhere
anthropic-proxy

# Or with environment variables
PORT=3000 LOG_LEVEL=DEBUG anthropic-proxy
```

```bash
# Clone and link for development
git clone https://github.com/kingoliang/anthropic-proxy.git
cd anthropic-proxy
npm install
npm link

# Run from anywhere
anthropic-proxy
```

Create a `.env` file or set environment variables:
```bash
# Server configuration
HOST=0.0.0.0
PORT=8082

# API base URLs
ANTHROPIC_BASE_URL=https://api.anthropic.com

# Request timeout (milliseconds)
REQUEST_TIMEOUT=120000

# Log level
LOG_LEVEL=INFO

# OpenRouter configuration (required for OpenRouter mode)
OPENROUTER_API_KEY=your_openrouter_api_key_here
```

Configuration changes take effect immediately without restart:
- Proxy Mode: Switch between Anthropic/OpenRouter instantly via web UI
- Model Mappings: Update 3-family mappings (Sonnet/Opus/Haiku), changes apply to new requests
- Model Lists: Dynamic loading from OpenRouter API with 1-hour intelligent caching
- API Keys: Read from environment variables on each request for maximum security
- Manual Reload: Call `POST /api/config/reload` to force a configuration and environment refresh
- UI Feedback: Toast notifications confirm all configuration changes
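The 1-hour model-list caching described above can be approximated with a simple TTL memoizer. This is an illustrative sketch under assumed behavior (refresh on expiry, serve from cache otherwise), not the project's actual implementation:

```javascript
// Hypothetical sketch of a TTL cache like the one used for the
// OpenRouter model list (default TTL: one hour).
function ttlCache(fetchFn, ttlMs = 60 * 60 * 1000, now = Date.now) {
  let value;
  let fetchedAt = -Infinity;
  return () => {
    if (now() - fetchedAt >= ttlMs) {
      value = fetchFn();   // refresh when the entry is older than the TTL
      fetchedAt = now();
    }
    return value;          // otherwise serve the cached copy
  };
}

// Simulated clock and fetch counter make the caching behavior visible:
let clock = 0;
let fetches = 0;
const getModels = ttlCache(
  () => { fetches++; return ['model-a', 'model-b']; },
  3600000,
  () => clock
);

getModels();          // first call fetches
getModels();          // within the hour: served from cache
clock += 3600001;     // advance the clock past the TTL
getModels();          // refetches
console.log(fetches); // 2
```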
```bash
# Run with custom port
PORT=3000 npx .

# Run in debug mode
LOG_LEVEL=DEBUG npx .

# Combine multiple environment variables
PORT=3000 LOG_LEVEL=DEBUG npx .
```

- `POST /v1/messages` - Main messages endpoint (supports streaming)
- `POST /v1/messages/count_tokens` - Token counting endpoint
- `GET /health` - Health check
- `GET /` - Redirect to the monitoring dashboard
- `GET /monitor` - Web monitoring dashboard
- `GET /api/monitor/requests` - Get the request list with filtering
- `GET /api/monitor/requests/:id` - Get a single request by ID
- `GET /api/monitor/stats` - Get real-time statistics (supports filter parameters)
- `GET /api/monitor/stream` - Server-sent events for real-time updates
- `POST /api/monitor/clear` - Clear all monitoring data
- `GET /api/monitor/export` - Export monitoring data as JSON (supports filter parameters)
- `GET /api/monitor/analyze` - Generate an analysis report (supports filter parameters)
- `GET /api/monitor/config` - Get server info (logLevel, port, host)
- `GET /config` - Advanced configuration UI with searchable model selection
- `GET /api/config` - Get the current configuration with live environment integration
- `POST /api/config` - Update and persist configuration with instant effect
- `POST /api/config/reset` - Reset configuration to defaults with confirmation
- `POST /api/config/test-openrouter` - Validate the OpenRouter API key from the environment
- `GET /api/config/models` - List available OpenRouter models (1-hour smart cache)
- `POST /api/config/reload` - Force reload of configuration and environment variables
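Several monitor endpoints accept filter parameters. A small helper can assemble such a filtered URL; note the parameter names below (`status`, `model`, `since`) are illustrative guesses based on the filters the dashboard describes, not a documented contract:

```javascript
// Sketch: building a filtered export URL for the monitor API.
// Query parameter names are assumptions for illustration only.
function monitorExportUrl(base, filters = {}) {
  const url = new URL('/api/monitor/export', base);
  for (const [key, value] of Object.entries(filters)) {
    // Skip empty filters so they do not clutter the query string
    if (value !== undefined && value !== null && value !== '') {
      url.searchParams.set(key, String(value));
    }
  }
  return url.toString();
}

const exportUrl = monitorExportUrl('http://localhost:8082', {
  status: 'error',
  model: 'claude-sonnet-4',
  since: '',   // empty filters are skipped
});
console.log(exportUrl);
// http://localhost:8082/api/monitor/export?status=error&model=claude-sonnet-4
```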
Access the built-in monitoring interface at: http://localhost:8082/monitor
- Real-time request/response tracking for both Anthropic and OpenRouter modes
- Performance metrics dashboard - dynamically updates based on filter conditions
- Stream chunk timeline visualization with detailed timing analysis
- Dual-mode logging - comprehensive logs for both proxy modes with raw response data
- API key masking for security (environment variables only)
- Smart filtering system (status, model, time range, provider)
- Filter conditions apply to all functions (statistics, export, analysis)
- Model list shows all available models from both providers
- Advanced data export - filtered data with compressed export options
- Detailed analysis reports - provider-specific insights and performance metrics
- Auto-refresh with SSE - real-time updates without page reload
- Provider transparency - clear indication of which API backend was used
- Smart Statistics Panel - Real-time statistics that update based on filter conditions
- Advanced Filters - Multi-dimensional filtering by status, model, and time range
- Request List - Filterable table of API calls with real-time updates
- Detail View - Complete request/response inspection
- Stream Analysis - Chunk-by-chunk streaming visualization
- Export Tools - JSON data export with filtering support
- Analysis Reports - Comprehensive analysis based on filtered data
- API Keys: Stored in environment variables only - never in configuration files
- Automatic masking in logs and monitoring interface (first 10 + "..." + last 4 characters)
- No authentication required for monitoring dashboard (designed for local development use)
- Sensitive headers automatically filtered in request logging
- Safe to commit: `config.json` contains no secrets - only model mappings and settings
- Environment isolation: All sensitive data lives in the `.env` file (excluded from version control)
- Local data: All configuration and monitoring data stays on your machine
- API key validation: Real-time testing ensures keys are valid before use
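The "first 10 + \"...\" + last 4 characters" masking rule can be sketched as a tiny helper (a hypothetical function for illustration; the short-key fallback is an assumption, not the project's documented behavior):

```javascript
// Sketch of the masking rule described above: keep the first 10 and
// last 4 characters, replace the middle with "...".
function maskApiKey(key) {
  if (typeof key !== 'string' || key.length <= 14) {
    return '***';   // too short to mask meaningfully (assumed fallback)
  }
  return key.slice(0, 10) + '...' + key.slice(-4);
}

console.log(maskApiKey('sk-or-v1-0123456789abcdef')); // sk-or-v1-0...cdef
console.log(maskApiKey('short'));                     // ***
```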
- Node.js 18+
- API Keys:
  - Anthropic API key (for Claude Code) via headers (`x-api-key` or `authorization`)
  - OpenRouter API key (optional) via the `OPENROUTER_API_KEY` environment variable
- Network: Outbound access to Anthropic API and/or OpenRouter API
- Browser: Modern browser with HTML5 datalist support for configuration UI
Once the proxy server is running, configure Claude Code to use it:
```bash
# Start on the default port 8082
npx github:kingoliang/anthropic-proxy

# Or start on a custom port (e.g., 3000)
PORT=3000 npx github:kingoliang/anthropic-proxy
```

Set the environment variable to point Claude Code to your proxy:
```bash
# For the default port 8082
export ANTHROPIC_BASE_URL=http://localhost:8082

# For a custom port (e.g., 3000)
export ANTHROPIC_BASE_URL=http://localhost:3000
```

```bash
# Claude Code will now use your proxy server
claude
```

Option 1: Inline environment variable

```bash
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

Option 2: Add to your shell profile

```bash
# Add to ~/.bashrc, ~/.zshrc, or ~/.profile
echo 'export ANTHROPIC_BASE_URL=http://localhost:8082' >> ~/.bashrc
source ~/.bashrc
```

Option 3: Create a startup script

```bash
#!/bin/bash
# start-claude-with-proxy.sh
export ANTHROPIC_BASE_URL=http://localhost:8082
claude
```

- Check the proxy is running: Visit `http://localhost:8082/monitor`
- Test Claude Code: Make any request in Claude Code
- Monitor requests: Watch real-time requests in the monitoring dashboard
- Set API Key: Add `OPENROUTER_API_KEY=your_key_here` to the `.env` file (required)
- Access Configuration: Open `http://localhost:8082/config`
- Switch Mode: Select "OpenRouter" mode in the web interface
- Configure Models: Set up 3-family model mappings (Sonnet→Model, Opus→Model, Haiku→Model)
- Search and select from 99+ available OpenRouter models
- Models are fetched dynamically with 1-hour caching
- Use searchable dropdowns with built-in filtering
- Save & Test: Save configuration and test connection (instant feedback via toast)
- Live Updates: All changes take effect immediately - no restart required
- Monitor Usage: View detailed logs and metrics in the monitoring dashboard
How it works:
- Requests automatically converted from Anthropic format to OpenAI/OpenRouter format
- Responses converted back to Anthropic format for full Claude Code compatibility
- Full streaming support with chunk-by-chunk processing
- Comprehensive logging shows both original OpenRouter response and converted output
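A minimal sketch of the request-side conversion (illustrative only; the real converter also handles streaming, tool use, content blocks, and many more fields, and the model mapping shown is a made-up example):

```javascript
// Hypothetical sketch of converting an Anthropic /v1/messages request body
// into OpenAI-style chat format for OpenRouter.
function toOpenRouterRequest(anthropicBody, modelMap) {
  const messages = [];
  if (anthropicBody.system) {
    // Anthropic keeps the system prompt in a top-level field;
    // OpenAI-style APIs expect it as the first chat message.
    messages.push({ role: 'system', content: anthropicBody.system });
  }
  messages.push(...anthropicBody.messages);
  return {
    // Apply the configured family mapping, falling back to the original name
    model: modelMap[anthropicBody.model] ?? anthropicBody.model,
    messages,
    max_tokens: anthropicBody.max_tokens,
  };
}

const converted = toOpenRouterRequest(
  {
    model: 'claude-sonnet-4',
    system: 'You are terse.',
    max_tokens: 256,
    messages: [{ role: 'user', content: 'hi' }],
  },
  { 'claude-sonnet-4': 'qwen/qwen-2.5-72b-instruct' } // example mapping
);
console.log(converted.model);            // qwen/qwen-2.5-72b-instruct
console.log(converted.messages[0].role); // system
```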
- Framework: Express.js with ES modules
- Monitoring: In-memory storage with circular buffer (max 1000 requests)
- Real-time Updates: Server-Sent Events (SSE)
- Stream Processing: Full chunk tracking and content merging
- Error Handling: Comprehensive error catching and logging
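The bounded in-memory store can be sketched as a fixed-capacity list that evicts its oldest entry once the cap (1000 in the proxy) is reached. A simplified illustration with an assumed class name, not the project's `store.js`:

```javascript
// Sketch of a fixed-capacity request store: once the capacity is
// reached, the oldest entry is dropped to bound memory usage.
class RequestStore {
  constructor(capacity = 1000) {
    this.capacity = capacity;
    this.entries = [];
  }
  add(entry) {
    this.entries.push(entry);
    if (this.entries.length > this.capacity) {
      this.entries.shift(); // evict the oldest request
    }
  }
  size() { return this.entries.length; }
}

// With a tiny capacity the rotation is easy to see:
const store = new RequestStore(3);
[1, 2, 3, 4].forEach((id) => store.add({ id }));
console.log(store.size());        // 3
console.log(store.entries[0].id); // 2 - request 1 was evicted
```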
```bash
# Build and run with Docker
docker build -t anthropic-proxy .
docker run -p 8082:8082 anthropic-proxy

# Or use Docker Compose
docker-compose -f examples/docker-compose.yml up
```

```bash
# Install PM2
npm install -g pm2

# Start with PM2
pm2 start examples/pm2.config.js

# Monitor
pm2 monit

# Stop
pm2 stop anthropic-proxy
```

```bash
# Create the service file
sudo nano /etc/systemd/system/anthropic-proxy.service
```

```ini
# Service configuration
[Unit]
Description=Anthropic API Proxy
After=network.target

[Service]
Type=simple
User=nodejs
WorkingDirectory=/path/to/anthropic-proxy
ExecStart=/usr/bin/node src/server.js
Restart=always
Environment=NODE_ENV=production
Environment=PORT=8082

[Install]
WantedBy=multi-user.target
```

```bash
# Enable and start
sudo systemctl enable anthropic-proxy
sudo systemctl start anthropic-proxy
```

- Port already in use: Change the `PORT` environment variable or kill existing processes
PORTenvironment variable or kill existing processes - API key not working:
- Verify Anthropic key format in Claude Code headers
- Check OpenRouter key is set in
.envfile asOPENROUTER_API_KEY - Use configuration page to test OpenRouter connection
- OpenRouter connection fails: Verify API key and network access to openrouter.ai
- Model mapping issues: Use configuration UI to select valid models from dropdown
- Configuration not saving: Check file permissions and disk space
- Timeout errors: Increase the `REQUEST_TIMEOUT` value for slow models
- Memory usage: Monitoring data auto-rotates after 1000 requests
- Module not found: Ensure you're running from the correct directory
- Toast notifications not working: Clear browser cache and reload configuration page
```bash
LOG_LEVEL=DEBUG npx github:kingoliang/anthropic-proxy
```

```bash
curl http://localhost:8082/health
```

- GitHub Repository: https://github.com/kingoliang/anthropic-proxy
- NPM Package: `npx github:kingoliang/anthropic-proxy`
- Anthropic API Documentation: https://docs.anthropic.com/
- Docker Hub: (coming soon)
This proxy server is designed for development and testing purposes. Contributions are welcome:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
MIT License - see LICENSE for details.
- Built with Express.js
- Monitoring UI powered by Alpine.js and Tailwind CSS
- Generated with assistance from Claude Code