OpenRouter Provider not loaded #2287
Closed
arrrrny started this conversation in Bug Reports
Replies: 2 comments
-
opened a PR: #2288
-
Thanks for the PR! We just merged the OpenRouter model provider, so the current release doesn't support it yet (in the meantime, you can still use the OpenAI-compatible provider or build from source).
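For reference, the suggested workaround could look roughly like the following, assuming TensorZero's OpenAI provider accepts a custom `api_base`; the model name, env var, and exact field names here are illustrative, not taken from this thread:

```toml
# Sketch: route an OpenRouter model through the OpenAI-compatible provider.
[models.deepseek_r1_free]
routing = ["openrouter_primary"]

[models.deepseek_r1_free.providers.openrouter_primary]
type = "openai"                               # OpenAI-compatible provider
model_name = "deepseek/deepseek-r1:free"      # OpenRouter model id (assumed)
api_base = "https://openrouter.ai/api/v1"     # OpenRouter's OpenAI-compatible endpoint
api_key_location = "env::OPENROUTER_API_KEY"  # read the key from an env var (assumed field)
```

Once a release ships with the merged OpenRouter provider, the `type` field should be able to name it directly instead.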
-
Here is the error I get when loading my config file:
2025-05-29T20:02:01.029694Z INFO gateway: Starting TensorZero Gateway 2025.5.7 (commit: f6ff580)
2025-05-29T20:02:01.032184Z ERROR tensorzero_internal::error: models.deepseek_r1_free.providers.openrouter_primary: unknown variant `openrouter`, expected one of `anthropic`, `aws_sagemaker`, `azure`, `gcp_vertex_anthropic`, `gcp_vertex_gemini`, `google_ai_studio_gemini`, `github_copilot`, `hyperbolic`, `fireworks`, `mistral`, `openai`, `together`, `vllm`, `xai`, `tgi`, `sglang`, `deepseek`
I couldn't find where the enum for providers is defined.