fix: LLMConfig Validation Error on 'stream=true' #1953
Conversation
Force-pushed from ede719f to 9723424 (Compare): fix: formatting and test, fix: formatting, fix: formatting, fix: groupchat test, fix: formatting
Force-pushed the …ld-in-config_list-raises-validation-error branch from 9723424 to f214984 (Compare)
@priyansh4320 are you able to resolve the branch conflicts with client.py so this can be merged?
Thanks @priyansh4320!
Codecov Report: ✅ All modified and coverable lines are covered by tests.
... and 65 files with indirect coverage changes
Why are these changes needed?
A ValidationError is raised when stream=True is set in an LLM config entry.

Root Cause Analysis: the provider-specific config entry classes do not declare a stream field, so a config_list entry that includes stream fails Pydantic validation.

Solution: add stream: bool = False to the OpenAILLMConfigEntry, AzureOpenAILLMConfigEntry, and DeepSeekLLMConfigEntry classes (see the sketch below).
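A minimal sketch of the change, assuming the entry classes are Pydantic models. The base class, module layout, and extra fields shown here are illustrative stand-ins, not ag2's actual definitions; only the added stream field reflects the change described in this PR.

```python
from typing import Optional

from pydantic import BaseModel


class LLMConfigEntry(BaseModel):
    """Illustrative stand-in for ag2's provider-specific config entry base class."""

    model: str
    api_key: Optional[str] = None


class OpenAILLMConfigEntry(LLMConfigEntry):
    api_type: str = "openai"
    # The fix: declare the field so that a config entry containing `stream`
    # passes Pydantic validation instead of raising a ValidationError.
    stream: bool = False


class AzureOpenAILLMConfigEntry(LLMConfigEntry):
    api_type: str = "azure"
    stream: bool = False


class DeepSeekLLMConfigEntry(LLMConfigEntry):
    api_type: str = "deepseek"
    stream: bool = False


# Before the field was declared, constructing an entry with stream=True failed
# validation; with a default of False, streaming stays opt-in.
entry = OpenAILLMConfigEntry(model="gpt-4o-mini", api_key="sk-...", stream=True)
print(entry.stream)  # True
```

With the field declared, a config_list entry that sets "stream": True validates cleanly, and the False default keeps streaming off unless explicitly requested.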
Related issue number

Closes #1912
Checks