This repository lets you use Anthropic's Claude Code CLI with OpenAI's GPT-5 via a local LiteLLM proxy.
⚠️ ATTENTION ⚠️ If you're here to set up your own LiteLLM Server (potentially with LibreChat or a similar UI), head over to the main-boilerplate branch. It contains a "boilerplate" version of this repo with the Claude Code CLI stuff stripped away for simplicity, and with a version of README.md that specifically explains how to build on top of this repo as a boilerplate.
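If that's you, one way to grab that branch directly is sketched below (a hypothetical one-liner, assuming the boilerplate lives on the `main-boilerplate` branch of this same repository):

```bash
# Illustrative only: clone the boilerplate branch instead of the default one
git clone --branch main-boilerplate https://github.com/teremterem/claude-code-gpt-5.git
```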
- OpenAI API key 🔑
- Anthropic API key 🔑 - optional (only needed if you decide not to remap some Claude models to OpenAI; see the example after this list)
- Either uv or Docker Desktop, depending on your preferred setup method
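For instance, if you want Claude Opus requests to keep going to Anthropic rather than GPT-5, a hypothetical `.env` fragment could look like the sketch below (the remap value is illustrative; see `.env.template` for the exact variables and supported values):

```
# Hypothetical: keep "Opus" served by Anthropic instead of remapping it to GPT-5
ANTHROPIC_API_KEY=your-anthropic-api-key-here
REMAP_CLAUDE_OPUS_TO=claude-opus-4-1
```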
If you are going to use GPT-5 via the API for the first time, OpenAI may require you to verify your identity via Persona. You may encounter an OpenAI error asking you to “verify your organization.” To resolve this, go through OpenAI's organization verification process in your OpenAI platform settings.
- Clone this repository:

  ```bash
  git clone https://github.com/teremterem/claude-code-gpt-5.git
  cd claude-code-gpt-5
  ```
- Configure Environment Variables:

  Copy the template file to create your `.env`:

  ```bash
  cp .env.template .env
  ```

  Edit `.env` and add your OpenAI API key:

  ```
  OPENAI_API_KEY=your-openai-api-key-here

  # Optional: only needed if you plan to use Anthropic models
  # ANTHROPIC_API_KEY=your-anthropic-api-key-here

  # Optional (see .env.template for details):
  # LITELLM_MASTER_KEY=your-master-key-here

  # Optional: override the default remaps if you need to (the values you see
  # below are the defaults - see .env.template for more info)
  # REMAP_CLAUDE_HAIKU_TO=gpt-5-mini-reason-minimal
  # REMAP_CLAUDE_SONNET_TO=gpt-5-reason-medium
  # REMAP_CLAUDE_OPUS_TO=gpt-5-reason-high

  # Some more optional settings (see .env.template for details)
  ...
  ```
- Run the proxy:

  - EITHER via `uv` (make sure to install uv first):

    OPTION 1: Use a script for `uv`:

    ```bash
    ./uv-run.sh
    ```

    OPTION 2: Run via a direct `uv` command:

    ```bash
    uv run litellm --config config.yaml
    ```
  - OR via `Docker` (make sure to install Docker Desktop first):

    OPTION 3: Run `Docker` in the foreground:

    ```bash
    ./run-docker.sh
    ```

    OPTION 4: Run `Docker` in the background:

    ```bash
    ./deploy-docker.sh
    ```

    OPTION 5: Run `Docker` via a direct command:

    ```bash
    docker run -d \
      --name claude-code-gpt-5 \
      -p 4000:4000 \
      --env-file .env \
      --restart unless-stopped \
      ghcr.io/teremterem/claude-code-gpt-5:latest
    ```
    NOTE: To run this command in the foreground instead of the background, remove the `-d` flag.

    To see the logs, run:

    ```bash
    docker logs -f claude-code-gpt-5
    ```

    To stop and remove the container, run:

    ```bash
    ./kill-docker.sh
    ```
    NOTE: The `Docker` options above will pull the latest image from `GHCR` and will ignore all your local files except `.env`. For more detailed `Docker` deployment instructions and more options (like building the `Docker` image from source yourself, using `Docker Compose`, etc.), see docs/DOCKER_TIPS.md.
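    For illustration, building the image locally instead of pulling it from `GHCR` could look roughly like the sketch below (the tag name is made up; follow docs/DOCKER_TIPS.md for the supported workflow):

    ```bash
    # Hypothetical sketch: build the image from this repo and run it locally
    docker build -t claude-code-gpt-5:local .
    docker run -d \
      --name claude-code-gpt-5 \
      -p 4000:4000 \
      --env-file .env \
      --restart unless-stopped \
      claude-code-gpt-5:local
    ```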
- Install Claude Code (if you haven't already):

  ```bash
  npm install -g @anthropic-ai/claude-code
  ```
- Connect to GPT-5 instead of Claude:

  ```bash
  ANTHROPIC_BASE_URL=http://localhost:4000 claude
  ```

  If you set `LITELLM_MASTER_KEY` for the proxy (see `.env.template` for details), pass it as the Anthropic API key for the CLI:

  ```bash
  ANTHROPIC_API_KEY="<LITELLM_MASTER_KEY>" \
  ANTHROPIC_BASE_URL=http://localhost:4000 \
  claude
  ```

  NOTE: In this case, if you've previously authenticated, run `claude /logout` first.

- That's it! Your Claude Code client will now use the selected GPT-5 variant(s) with your chosen reasoning effort level(s). 🎯
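If you want a quick sanity check that the proxy is up before launching Claude Code, LiteLLM normally exposes health endpoints on the same port (a sketch assuming the default LiteLLM routes and port 4000):

```bash
# Assumed LiteLLM liveness endpoint - should return a short "alive" message
curl http://localhost:4000/health/liveliness
```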
The proxy exposes the following GPT-5 model aliases, one per reasoning effort level:

- GPT-5: `gpt-5-reason-minimal`, `gpt-5-reason-low`, `gpt-5-reason-medium`, `gpt-5-reason-high`
- GPT-5-mini: `gpt-5-mini-reason-minimal`, `gpt-5-mini-reason-low`, `gpt-5-mini-reason-medium`, `gpt-5-mini-reason-high`
- GPT-5-nano: `gpt-5-nano-reason-minimal`, `gpt-5-nano-reason-low`, `gpt-5-nano-reason-medium`, `gpt-5-nano-reason-high`
NOTE: Generally, you can use arbitrary models from arbitrary providers, but for providers other than OpenAI or Anthropic, you will need to specify the provider in the model name, e.g. `gemini/gemini-pro`, `gemini/gemini-pro-reason-disable`, etc. (as well as set the respective API keys, along with any other environment variables that the provider might require, in your `.env` file).
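For example, a hypothetical `.env` fragment that routes Claude Sonnet traffic to Gemini might look like the sketch below (the API key variable name follows LiteLLM's usual conventions; see `.env.template` for the remap variables):

```
# Hypothetical: serve "Sonnet" requests with Gemini through the proxy
GEMINI_API_KEY=your-gemini-api-key-here
REMAP_CLAUDE_SONNET_TO=gemini/gemini-pro
```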
The Web Search tool currently does not work with this setup. You may see an error like:

```
API Error (500 {"error":{"message":"Error calling litellm.acompletion for non-Anthropic model: litellm.BadRequestError: OpenAIException - Invalid schema for function 'web_search': 'web_search_20250305' is not valid under any of the given schemas.","type":"None","param":"None","code":"500"}}) · Retrying in 1 seconds… (attempt 1/10)
```
This is planned to be fixed soon.
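Until then, a possible workaround (assuming your Claude Code version supports the `--disallowedTools` flag) is to start the CLI with the WebSearch tool turned off:

```bash
# Hypothetical workaround: disable Claude Code's built-in WebSearch tool
ANTHROPIC_BASE_URL=http://localhost:4000 claude --disallowedTools "WebSearch"
```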
NOTE: The `Fetch` tool (getting web content from specific URLs) is not affected and works normally.