Use Anthropic clients (like Claude Code) with Gemini, OpenAI, or direct Anthropic backends. 🤝
A proxy server that lets you use Anthropic clients with Gemini, OpenAI, or Anthropic models themselves (a transparent proxy of sorts), all via LiteLLM. 🌉
- OpenAI API key (if using the OpenAI provider) 🔑
- Google AI Studio (Gemini) API key (if using the Google provider) 🔑
- `uv` installed
- Clone this repository:

  ```bash
  git clone https://github.com/hangts/claude-code-proxy.git
  cd claude-code-proxy
  ```
- Install `uv` (if you haven't already):

  ```bash
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

  (`uv` will handle dependencies based on `pyproject.toml` when you run the server)
- Configure environment variables. Copy the example environment file:

  ```bash
  cp .env.example .env
  ```

  Edit `.env` and fill in your API keys and model configurations:

  - `ONEAPI_BASE_URL`: (Optional) ONEAPI base URL. Default: `http://oneapi.blockbeat.hk/v1`
  - `SAVING_MODEL`: (Optional) The model used when Claude Code would automatically call Haiku (`claude-haiku-4-5-20251001`) for simple tasks. Default: `deepseek/deepseek-v3.2-exp`
  - `HEIGHT_MODEL`: (Optional) The model used when Claude Code would automatically call Sonnet (`claude-sonnet-4-5-20250929`). Default: `anthropic/claude-sonnet-4.5`
- Run the server:

  ```bash
  uv run uvicorn server:app --host 0.0.0.0 --port 8082 --reload
  ```

  (`--reload` is optional, for development)
- Install Claude Code (if you haven't already):

  ```bash
  npm install -g @anthropic-ai/claude-code
  ```
- Claude Code settings:

  ```json
  {
    "env": {
      "ANTHROPIC_BASE_URL": "{your proxy url}",
      "ANTHROPIC_AUTH_TOKEN": "{your oneapi auth token}",
      "ANTHROPIC_MODEL": "oneapi/{your model name}",
      "ANTHROPIC_SMALL_FAST_MODEL": "oneapi/{your model name}",
      "API_TIMEOUT_MS": "3000000",
      "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
    }
  }
  ```

- That's it! Your Claude Code client will now use the configured backend models through the proxy. 🎯
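To sanity-check the setup, you can build the kind of payload Claude Code sends to the proxy. The sketch below is illustrative: the model name is a stand-in for whatever you configured, and the `/v1/messages` path follows Anthropic's Messages API, which the proxy mimics.

```python
import json

# Hypothetical Anthropic-format request body, like the one Claude Code
# POSTs to the proxy's /v1/messages endpoint. "oneapi/your-model" is a
# placeholder for the model name from your settings.
payload = {
    "model": "oneapi/your-model",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello"}],
}

body = json.dumps(payload)
print(body)
```

POSTing this body to `{your proxy url}/v1/messages` with your auth token should return an Anthropic-format response if the proxy is running.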
This proxy works by:
- Receiving requests in Anthropic's API format 📥
- Translating the requests to OpenAI format via LiteLLM 🔄
- Sending the translated request to the configured backend (OpenAI, Gemini, or Anthropic) 📤
- Converting the response back to Anthropic format 🔄
- Returning the formatted response to the client ✅
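The translation step can be sketched in Python. This is an illustrative simplification, not the proxy's actual code (the real conversion is delegated to LiteLLM): it moves Anthropic's top-level `system` field into the OpenAI message list and flattens typed content blocks into plain strings.

```python
def anthropic_to_openai(req: dict) -> dict:
    """Map an Anthropic /v1/messages payload to OpenAI chat-completions
    shape. Simplified sketch: a real translation also handles tool use,
    images, and stop sequences."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first message.
    if req.get("system"):
        messages.append({"role": "system", "content": req["system"]})
    for m in req["messages"]:
        content = m["content"]
        # Anthropic content may be a list of typed blocks; keep the text ones.
        if isinstance(content, list):
            content = "".join(
                b.get("text", "") for b in content if b.get("type") == "text"
            )
        messages.append({"role": m["role"], "content": content})
    return {
        "model": req["model"],
        "messages": messages,
        "max_tokens": req.get("max_tokens", 1024),
        "stream": req.get("stream", False),
    }


# Example: a minimal Anthropic-style request with a block-style message
converted = anthropic_to_openai({
    "model": "gpt-4o",
    "system": "Be terse.",
    "max_tokens": 100,
    "messages": [{"role": "user", "content": [{"type": "text", "text": "Hi"}]}],
})
print(converted)
```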
The proxy handles both streaming and non-streaming responses, maintaining compatibility with all Claude clients. 🌊
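For streaming, the chunk-level mapping can be sketched as follows (simplified: a full implementation also emits `message_start`, `content_block_start`, and the matching stop events around the deltas).

```python
def delta_to_anthropic_event(delta_text: str, index: int = 0) -> dict:
    """Wrap one OpenAI streaming text delta as an Anthropic
    content_block_delta SSE event. Illustrative sketch only, not the
    proxy's actual streaming code."""
    return {
        "type": "content_block_delta",
        "index": index,
        "delta": {"type": "text_delta", "text": delta_text},
    }


event = delta_to_anthropic_event("Hel")
print(event)
```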
Contributions are welcome! Please feel free to submit a Pull Request. 🎁
