Status: Closed
Labels: bug
Description
Basic checks
- I searched existing issues - this hasn't been reported
- I can reproduce this consistently
- This is a RubyLLM bug, not my application code
What's broken?
With the options `model_id: 'gpt-5', provider: 'openai'` I'm getting the error:

> Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.

I'm not providing a `temperature` value in the request, so the 0.7 appears to come from the gem's defaults. I'm on ruby_llm version 1.6.0. Refreshing the model registry with `RubyLLM.models.refresh!` didn't help.
How to reproduce
Create a chat with the options:
model_id: 'gpt-5',
provider: 'openai'
Expected behavior
Normal response
What actually happened
DEBUG -- RubyLLM: Accumulating error chunk: {
"error": {
"message": "Unsupported value: 'temperat
[2025-08-11T15:59:53.364489 #47425] DEBUG -- RubyLLM: error: /usr/local/rvm/gems/ruby-3.4.3/gems/ruby_llm-1.6.0/lib/ruby_llm/error.rb:62:in 'RubyLLM::ErrorMiddleware.parse_error': Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported. (RubyLLM::BadRequestError)
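The error indicates the provider rejects any `temperature` other than its default for this model, so the fix is presumably to omit the parameter entirely for such models rather than send the gem's 0.7 default. A minimal sketch of that idea (the helper name and model list are hypothetical, not RubyLLM internals):

```ruby
# Hypothetical workaround: some models, reportedly including gpt-5,
# reject any 'temperature' other than the provider default, so strip
# the key from the request payload before sending.
DEFAULT_TEMPERATURE_ONLY = ['gpt-5'].freeze

def normalize_params(model_id, params)
  return params unless DEFAULT_TEMPERATURE_ONLY.include?(model_id)

  # Drop :temperature so the API falls back to its default (1)
  params.reject { |key, _| key == :temperature }
end

request = normalize_params('gpt-5', { temperature: 0.7, max_tokens: 100 })
# request => { max_tokens: 100 }
```

For models outside the list the params pass through unchanged, so existing behavior is unaffected.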
Environment
ruby_llm version 1.6.0