Recursively rebuild models in openai.types #967
base: main
Conversation
```diff
@@ -1736,3 +1736,22 @@ async def test_workflow_method_tools(client: Client):
         execution_timeout=timedelta(seconds=10),
     )
     await workflow_handle.result()
+
+
+async def test_response_serialization():
```
I assume this test fails until openai/openai-agents-python#1131 (or equivalent) is merged?
So... it does work if we explicitly rebuild all the models, but I'm hoping they will fix their dataclasses so we don't have to do this at all.
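For anyone skimming the thread, here is a minimal sketch of what "explicitly rebuild all the models" could look like. This is an illustration only, not the code in this PR; `rebuild_openai_models` is a hypothetical name, and real code may need error handling for modules that fail to import or models whose forward references cannot be resolved at rebuild time.

```python
# Illustration only: recursively rebuild every Pydantic model defined under
# openai.types so nothing is left partially built at serialization time.
import importlib
import pkgutil

import openai.types
from pydantic import BaseModel


def rebuild_openai_models() -> None:
    """Walk the openai.types package and force-rebuild each model it defines."""
    for module_info in pkgutil.walk_packages(
        openai.types.__path__, prefix="openai.types."
    ):
        module = importlib.import_module(module_info.name)
        for attr in vars(module).values():
            if (
                isinstance(attr, type)
                and issubclass(attr, BaseModel)
                # Only rebuild models defined in this module, skipping
                # re-exported base classes and avoiding duplicate work.
                and attr.__module__ == module_info.name
            ):
                # force=True rebuilds even models Pydantic considers complete.
                attr.model_rebuild(force=True)
```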
Can you give an idea of the cost of that full model rebuild? Just confirming it doesn't take like a whole second would be good. Do you think we should wait until they fix this issue or should we model rebuild now? Also, can discuss off-PR the current status of that PR vs their other possible solutions.
Not exactly rigorous statistics, but I ran it a few times locally and got a result of ~0.2 seconds.
I'm conflicted on fixing it at the moment. I think we need a resolution for public preview, and I don't know how likely it is that we get a fix from them soon. Their fix has to go through Stainless, and they were unable to share the PR with me.
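For reference, a quick local check along these lines (assuming the hypothetical `rebuild_openai_models` helper sketched above; timings will vary by machine and by how much of openai.types is already imported):

```python
import time

start = time.perf_counter()
rebuild_openai_models()  # hypothetical helper from the sketch above
print(f"full rebuild took {time.perf_counter() - start:.3f}s")  # ~0.2s reported here
```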
Similarly conflicted; it's rough to keep working around broken dependencies, lest that become our default posture. Up to you.
What was changed
Recursively rebuild models in openai.types
Why?
In some scenarios, encoding would fail because the models had only been partially built on import.
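As a rough illustration of the symptom (the specific model and trigger shown here are assumptions for the example, not taken from this PR), serializing a model whose core schema was never fully built surfaced the error reported in #965:

```python
from openai.types.chat import ChatCompletion

# Placeholder values, not real API output.
completion = ChatCompletion(
    id="chatcmpl-123",
    choices=[],
    created=0,
    model="gpt-4o",
    object="chat.completion",
)

# In the affected scenarios this raised:
#   pydantic_core._pydantic_core.PydanticSerializationError: Error serializing
#   to JSON: TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'
print(completion.model_dump_json())
```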
Checklist
Closes [Bug] pydantic_core._pydantic_core.PydanticSerializationError: Error serializing to JSON: TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer' #965
How was this tested:
New test
Any docs updates needed?