
[BUG] Unsupported value temperature when using gpt-5 #339

@mnort9

Description


Basic checks

  • I searched existing issues - this hasn't been reported
  • I can reproduce this consistently
  • This is a RubyLLM bug, not my application code

What's broken?

With the options model_id: 'gpt-5', provider: 'openai', I'm getting this error:

Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.

I'm not providing a temperature value in the request. I'm on ruby_llm version 1.6.0.

I also tried RubyLLM.models.refresh!.

How to reproduce

model_id: 'gpt-5',
provider: 'openai'
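Since no temperature is passed in the request, the error suggests the client applies a default temperature (0.7) to every request. A minimal sketch of that failure mode, using a hypothetical build_payload helper (an assumption for illustration, not RubyLLM's actual internals):

```ruby
# Sketch of how a client-side default temperature can leak into every
# request, even when the caller never sets one.
DEFAULT_TEMPERATURE = 0.7 # assumed client default; not confirmed from RubyLLM's source

def build_payload(model:, messages:, temperature: DEFAULT_TEMPERATURE)
  # Because the keyword argument has a default, the :temperature key is
  # always present in the payload; models that only accept their own
  # default (gpt-5 reportedly accepts only 1) then reject the request.
  { model: model, messages: messages, temperature: temperature }
end

payload = build_payload(model: "gpt-5",
                        messages: [{ role: "user", content: "hi" }])
payload[:temperature] # => 0.7, even though the caller never asked for it
```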

Expected behavior

Normal response

What actually happened

DEBUG -- RubyLLM: Accumulating error chunk: {
 "error": {
 "message": "Unsupported value: 'temperat
 [2025-08-11T15:59:53.364489 #47425] DEBUG -- RubyLLM: error: /usr/local/rvm/gems/ruby-3.4.3/gems/ruby_llm-1.6.0/lib/ruby_llm/error.rb:62:in 'RubyLLM::ErrorMiddleware.parse_error': Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported. (RubyLLM::BadRequestError)
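One possible client-side workaround (a sketch under assumptions, not RubyLLM's actual fix): omit the temperature key entirely for models that only accept their default, so the API falls back to its own default of 1. The model list and normalize_payload helper below are hypothetical:

```ruby
# Hypothetical workaround: strip :temperature from the request body for
# models that reject any explicit value other than their default.
MODELS_WITH_FIXED_TEMPERATURE = ["gpt-5"].freeze # assumed list, not from RubyLLM

def normalize_payload(payload)
  if MODELS_WITH_FIXED_TEMPERATURE.include?(payload[:model])
    # Hash#reject returns a new hash without the matching keys, leaving
    # the original payload untouched.
    payload.reject { |key, _| key == :temperature }
  else
    payload
  end
end

cleaned = normalize_payload({ model: "gpt-5", temperature: 0.7, messages: [] })
cleaned.key?(:temperature) # => false
```

Sending no temperature at all lets the API apply its default, which is what the error message says gpt-5 requires.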

Environment

ruby_llm version 1.6.0

Metadata

Assignees

No one assigned

Labels

bug (Something isn't working)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests