A proxy server/adapter layer that relays requests to VS Code APIs.
Listens on localhost:5555 by default.
The token clients must use to authenticate with the proxy is printed to the vscode-cp-proxy channel in VS Code's Output panel.
- OpenAI
- /openai/v1/models
- Lists the supported models.
- /openai/v1/chat/completions
- Translates incoming OpenAI-format requests into VS Code chat API invocations.
- Streams responses by default; non-streaming behavior is untested.
- Supports tool calling.
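A minimal sketch of a chat completion request against the proxy, using only the Python standard library. The default address and endpoint path are from this README; the Bearer `Authorization` scheme and the model id are assumptions (real model ids come from `/openai/v1/models`):

```python
import json
import urllib.request


def build_chat_request(token, model, messages, stream=True):
    """Build an OpenAI-format chat completion request for the proxy.

    The Bearer auth scheme is an assumption; adjust it to whatever
    the proxy actually expects, using the token from the
    vscode-cp-proxy output channel.
    """
    url = "http://localhost:5555/openai/v1/chat/completions"
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {token}",
    }
    body = json.dumps({"model": model, "messages": messages, "stream": stream})
    return urllib.request.Request(
        url, data=body.encode("utf-8"), headers=headers, method="POST"
    )


req = build_chat_request(
    "TOKEN-FROM-OUTPUT-CHANNEL",          # placeholder token
    "gpt-4o",                             # hypothetical model id
    [{"role": "user", "content": "Hello"}],
)

# Sending requires the proxy to be running inside VS Code:
# with urllib.request.urlopen(req) as resp:
#     for line in resp:  # streamed chunks
#         print(line.decode("utf-8"), end="")
```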
- Emacs (gptel)
- Load/require vscode-cp-proxy.el
- M-x vscode-cp-proxy-set-gptel-backend
- Enter the token shown in the vscode-cp-proxy channel of VS Code's Output panel.
- Choose from the available models.
- If a model (e.g. Sonnet 4) emits partial output and stops prematurely, try upgrading VS Code.
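For reference, the interactive setup above roughly corresponds to a manual gptel backend definition like the following sketch. `M-x vscode-cp-proxy-set-gptel-backend` does this for you; the token and model list here are placeholders:

```elisp
;; Manual equivalent of M-x vscode-cp-proxy-set-gptel-backend (a sketch).
;; Host/port and endpoint are the proxy defaults from this README;
;; the key and model symbols are placeholders.
(gptel-make-openai "vscode-cp-proxy"
  :host "localhost:5555"
  :protocol "http"
  :endpoint "/openai/v1/chat/completions"
  :stream t
  :key "TOKEN-FROM-OUTPUT-CHANNEL"
  :models '(gpt-4o claude-sonnet-4))
```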