A proxy service that allows Anthropic API requests (especially from Claude Code) to be routed through an OpenAI-compatible URL to access alternative models.
Claude Proxy provides a compatibility layer between Claude Code and alternative models available through OpenRouter.ai or your chosen base URL. It dynamically reroutes LLM requests from Claude Code to the providers you want to use.
Key features:
- FastAPI web server exposing Anthropic-compatible endpoints
- Format conversion between Anthropic and OpenAI requests/responses (see mapping for translation details)
- Support for both streaming and non-streaming responses
- Dynamic model selection based on requested Claude model
- Detailed request/response logging
- Token counting
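The Anthropic-to-OpenAI request translation can be pictured roughly as follows. This is an illustrative sketch (text content only), not the proxy's actual code — see the mapping and `src/main.py` for the real translation:

```python
def anthropic_to_openai(anthropic_req: dict) -> dict:
    """Convert an Anthropic Messages API request into an OpenAI
    chat-completions request (simplified sketch: text content only)."""
    messages = []
    # Anthropic carries the system prompt in a top-level field;
    # OpenAI expects it as the first chat message.
    if anthropic_req.get("system"):
        messages.append({"role": "system", "content": anthropic_req["system"]})
    for msg in anthropic_req["messages"]:
        content = msg["content"]
        # Anthropic content may be a list of typed blocks rather than a string.
        if isinstance(content, list):
            content = "".join(b["text"] for b in content if b.get("type") == "text")
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": anthropic_req["model"],
        "messages": messages,
        "max_tokens": anthropic_req.get("max_tokens", 1024),
        "stream": anthropic_req.get("stream", False),
    }
```

The response path runs the same mapping in reverse, including re-chunking OpenAI streaming deltas into Anthropic-style server-sent events.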
Using Docker is recommended; uv is recommended for local development only.
- Claude Code installed (as of July 2025 the proxy is tested on version 1.0.56, which you can install with `npm install -g @anthropic-ai/claude-code@1.0.56`)
- An OpenRouter API key
- Optional, if you don't want to use Docker:
  - Python 3.10+
  - uv
- Download the repo:

```shell
git clone https://github.com/ujisati/claude-code-provider-proxy/
```
- Either modify the `environment:` section in `docker-compose` or create a `.env` file at the root of the repo with, for example:

```shell
OPENAI_API_KEY=<your openrouter api key>
BIG_MODEL_NAME=google/gemini-2.5-pro-preview
SMALL_MODEL_NAME=google/gemini-2.0-flash-lite-001
LOG_LEVEL=DEBUG
```

A list of known working models can be found at the bottom of this README.
For more configuration options, see the Settings class in src/main.py.
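For example, the dynamic model selection boils down to mapping the Claude model name that `claude` requests onto `BIG_MODEL_NAME` or `SMALL_MODEL_NAME`. A hypothetical sketch (the function name and exact matching rule are illustrative, not necessarily what `src/main.py` does):

```python
def select_model(requested: str, big_model: str, small_model: str) -> str:
    """Route haiku-class requests to the small model and everything
    else (sonnet, opus, ...) to the big one. Illustrative rule only."""
    return small_model if "haiku" in requested.lower() else big_model

select_model("claude-3-5-haiku-20241022",
             "google/gemini-2.5-pro-preview",
             "google/gemini-2.0-flash-lite-001")
# -> "google/gemini-2.0-flash-lite-001"
```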
Note: you can use environment variables instead of a `.env` file, but any value in the `.env` file that is also set in the environment will be overridden (the environment takes priority). In particular, if you set `OPENAI_API_KEY` in `.env` to your OpenRouter key while `OPENAI_API_KEY` is also set in your environment (e.g. to a real OpenAI key), `src/main.py` will receive the OpenAI key instead of the OpenRouter one unless you unset `OPENAI_API_KEY` in the environment first.
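The priority rule can be illustrated with a small sketch (`resolve_settings` is a hypothetical helper, not part of the proxy):

```python
import os

def resolve_settings(dotenv_values: dict) -> dict:
    """Merge .env values with the process environment; the environment wins."""
    merged = dict(dotenv_values)
    for key in merged:
        if key in os.environ:
            merged[key] = os.environ[key]
    return merged

os.environ["OPENAI_API_KEY"] = "sk-real-openai-key"   # already set in the shell
dotenv = {"OPENAI_API_KEY": "sk-or-openrouter-key"}   # what .env says
resolve_settings(dotenv)["OPENAI_API_KEY"]
# -> "sk-real-openai-key": the shell value shadows the .env value
```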
- Run the proxy:

Docker (recommended):

```shell
# Build and run with docker-compose
docker-compose up --build --detach

# Or build and run manually
docker build -f docker/Dockerfile -t claude-code-proxy .
docker run -p 8080:8080 --env-file .env claude-code-proxy
```

Local development:

```shell
uv run src/main.py
```

- Run Claude Code:

```shell
ANTHROPIC_BASE_URL=http://localhost:8080 claude
```

To make a more permanent alias you can run, for example, `echo 'alias claude="ANTHROPIC_BASE_URL=http://localhost:8080 claude"' >> ~/.bashrc`.
- Optional: to further customize `claude`, there are environment variables you can modify, for example:
  - `CLAUDE_CODE_EXTRA_BODY`
  - `MAX_THINKING_TOKENS`
  - `API_TIMEOUT_MS`
  - `ANTHROPIC_BASE_URL`
  - `DISABLE_TELEMETRY`
  - `DISABLE_ERROR_REPORTING`
Note: These variables modify claude, not the proxy.
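For example, to raise the request timeout and opt out of telemetry when launching `claude` (the variable names come from the list above; the values here are illustrative):

```shell
# Environment for the `claude` CLI itself, not for the proxy
export API_TIMEOUT_MS=600000   # illustrative: a 10-minute request timeout
export DISABLE_TELEMETRY=1
ANTHROPIC_BASE_URL=http://localhost:8080 claude
```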
The proxy server exposes the following endpoints:
- `POST /v1/messages`: Create a message (main endpoint)
- `POST /v1/messages/count_tokens`: Count tokens for a request
- `GET /`: Health check endpoint
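Once the proxy is running, the main endpoint accepts ordinary Anthropic Messages API request bodies. A minimal payload builder (`build_messages_request` is a hypothetical helper; POST the resulting JSON to `http://localhost:8080/v1/messages`):

```python
import json

def build_messages_request(model: str, user_text: str, max_tokens: int = 256) -> dict:
    """Build an Anthropic-style /v1/messages body (text-only sketch)."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": user_text}],
    }

body = build_messages_request("claude-sonnet-4-20250514", "Say hello.")
print(json.dumps(body))
# POST this JSON to http://localhost:8080/v1/messages;
# the same body also works against /v1/messages/count_tokens.
```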
As `BIG_MODEL_NAME`:

- anthropic/claude-sonnet-4
- Problematic:
  - google/gemini-2.5-pro: seems to sometimes struggle to respect the edit format expected by `claude`

As `SMALL_MODEL_NAME`:

- anthropic/claude-3.5-haiku
- google/gemini-2.5-flash