sliplane/openwebui-helicone-proxy
Open WebUI Helicone Proxy

A Caddy-based proxy server that sits between Open WebUI and Helicone and translates header formats, so user properties forwarded by Open WebUI are tracked correctly in Helicone.

Problem

When using Open WebUI with Helicone (an LLMOps platform), there's a header compatibility issue:

  • Helicone expects custom properties in headers with the format: Helicone-Property-*
  • Open WebUI sends user information in headers with the format: X-OpenWebUI-* (when ENABLE_FORWARD_USER_INFO_HEADERS=true)
  • Neither platform supports custom header key configuration

This proxy resolves the mismatch by rewriting the headers in flight.

How It Works

The proxy intercepts requests from Open WebUI to Helicone and:

  1. Maps Open WebUI headers to Helicone property headers
  2. Removes the original Open WebUI headers to avoid clutter
  3. Forwards the request to Helicone with properly formatted headers
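The steps above can be sketched as a Caddyfile. This is a hypothetical illustration of the technique, not the repository's actual config: the placeholder syntax `{header.*}` copies a request header's value, `header_up -Name` deletes a header, and the `Host` override is typically needed when proxying to an external HTTPS upstream.

```caddyfile
# Hypothetical sketch of the header rewrite (the real Caddyfile in
# this repo may differ in details).
:80 {
	reverse_proxy https://oai.helicone.ai {
		# 1. Map Open WebUI headers to Helicone property headers
		header_up Helicone-Property-UserName {header.X-OpenWebUI-User-Name}
		header_up Helicone-Property-UserId {header.X-OpenWebUI-User-Id}
		header_up Helicone-Property-UserEmail {header.X-OpenWebUI-User-Email}
		header_up Helicone-Property-UserRole {header.X-OpenWebUI-User-Role}
		header_up Helicone-Property-ChatId {header.X-OpenWebUI-Chat-Id}

		# 2. Remove the original Open WebUI headers
		header_up -X-OpenWebUI-User-Name
		header_up -X-OpenWebUI-User-Id
		header_up -X-OpenWebUI-User-Email
		header_up -X-OpenWebUI-User-Role
		header_up -X-OpenWebUI-Chat-Id

		# 3. Forward to Helicone with the correct Host header
		header_up Host {upstream_hostport}
	}
}
```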

Header Mappings

Open WebUI Header        Helicone Property Header
X-OpenWebUI-User-Name    Helicone-Property-UserName
X-OpenWebUI-User-Id      Helicone-Property-UserId
X-OpenWebUI-User-Email   Helicone-Property-UserEmail
X-OpenWebUI-User-Role    Helicone-Property-UserRole
X-OpenWebUI-Chat-Id      Helicone-Property-ChatId
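The table above is a pure rename of header keys. As an illustration only (the real rewrite happens inside Caddy, not Python), the transformation amounts to:

```python
# Illustrative sketch of the header rewrite the proxy performs.
# The actual implementation is the proxy's Caddy configuration.

HEADER_MAP = {
    "X-OpenWebUI-User-Name": "Helicone-Property-UserName",
    "X-OpenWebUI-User-Id": "Helicone-Property-UserId",
    "X-OpenWebUI-User-Email": "Helicone-Property-UserEmail",
    "X-OpenWebUI-User-Role": "Helicone-Property-UserRole",
    "X-OpenWebUI-Chat-Id": "Helicone-Property-ChatId",
}


def rewrite_headers(headers: dict) -> dict:
    """Rename Open WebUI headers to Helicone property headers,
    dropping the originals and passing all other headers through."""
    return {HEADER_MAP.get(name, name): value for name, value in headers.items()}


incoming = {
    "Authorization": "Bearer sk-...",
    "X-OpenWebUI-User-Id": "u-123",
    "X-OpenWebUI-Chat-Id": "c-456",
}
print(rewrite_headers(incoming))
```

Note that unmapped headers (such as Authorization) pass through untouched, which is why your OpenAI API key still reaches Helicone as usual.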

Prerequisites

  • Docker
  • Open WebUI instance with ENABLE_FORWARD_USER_INFO_HEADERS=true
  • Helicone API key

Deployment

Pull and run the pre-built container from GitHub Container Registry:

docker run -d -p 8080:80 --name helicone-proxy ghcr.io/sliplane/openwebui-helicone-proxy:latest

Or using docker-compose:

services:
  proxy:
    image: ghcr.io/sliplane/openwebui-helicone-proxy:latest
    ports:
      - "8080:80"
    restart: unless-stopped

Configuration

Open WebUI Configuration

  1. Set the environment variable in your Open WebUI deployment:

    ENABLE_FORWARD_USER_INFO_HEADERS=true
  2. Configure your OpenAI API connection to point to the proxy instead of Helicone directly:

    • Instead of: https://oai.helicone.ai/v1/helicone-api-key
    • Use: http://your-proxy-host:8080/v1/helicone-api-key
  3. Keep your OpenAI API key in the Authorization header as usual
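Put together, an Open WebUI service wired to the proxy might look like the following compose fragment. This is a sketch under assumptions: OPENAI_API_BASE_URL and OPENAI_API_KEY are Open WebUI's standard OpenAI connection settings, "proxy" is the service name from the Deployment section above, and the key values are placeholders you must replace.

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Forward user info headers so the proxy has something to rewrite
      - ENABLE_FORWARD_USER_INFO_HEADERS=true
      # Point the OpenAI connection at the proxy instead of Helicone;
      # substitute your actual Helicone API key in the path
      - OPENAI_API_BASE_URL=http://proxy:80/v1/helicone-api-key
      # Your OpenAI API key, sent in the Authorization header as usual
      - OPENAI_API_KEY=sk-your-openai-key
```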

Security Considerations

⚠️ Important: This proxy implements no security of its own, so deploy it carefully:

  • It does not implement any authentication mechanisms
  • It forwards all requests it receives to Helicone
  • Keep it within a private network (e.g., a Docker network or VPC)
  • If you must expose it publicly, put it behind a reverse proxy that enforces authentication and HTTPS

Example Deployment Architecture

[Open WebUI] → [Private Network] → [Helicone Proxy] → [Internet] → [Helicone API]

Verification

After deployment, you can verify the proxy is working by:

  1. Making a request through Open WebUI
  2. Checking your Helicone dashboard
  3. Confirming that custom properties appear:
    • UserName
    • UserId
    • UserEmail
    • UserRole
    • ChatId

License

MIT
