capture azure-only properties #10
Conversation
@@ -353,8 +353,14 @@ async def create(
            extra_body=extra_body,
            timeout=timeout
        )
        if isinstance(response, AsyncStream):
            response._cast_to = AzureChatCompletionChunk  # or rebuild the stream?
Yeah, this is where we want to add a compat clause / understand what the actual public contract of the streaming API is. If it is only intended to be an async iterable, then a yield for each item would have been enough. But since it is typed as a Stream, presumably the intent is also that the response attribute is available.
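The distinction can be sketched with toy stand-ins (the class and attribute names below are illustrative, not the real SDK API): a bare re-yield satisfies an "async iterable" contract but silently drops the stream's `response` attribute.

```python
import asyncio
from typing import AsyncIterator, List

class Chunk:
    def __init__(self, text: str) -> None:
        self.text = text

class AsyncStreamToy:
    """Toy stream: iterable like a plain async generator, but it also
    carries the raw HTTP response (the `response` attribute above)."""
    def __init__(self, response: object, items: List[str]) -> None:
        self.response = response
        self._items = items

    def __aiter__(self) -> AsyncIterator[Chunk]:
        async def gen() -> AsyncIterator[Chunk]:
            for item in self._items:
                yield Chunk(item)
        return gen()

async def rewrap(stream: AsyncStreamToy) -> AsyncIterator[Chunk]:
    # If the public contract were only "async iterable", this re-yield
    # would be enough -- but the wrapper loses access to stream.response.
    async for chunk in stream:
        yield chunk
```

So a generator-based wrapper only works if callers never reach for the underlying response; the `Stream` typing suggests they might.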
...in the meantime, we can look at overriding client._process_response_data.
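As a sketch of that pattern: the method name comes from the comment above, but the signature and the classes below are hypothetical stand-ins, not the real SDK. The idea is to swap the cast target in the hook rather than mutating `stream._cast_to` after the fact.

```python
from typing import Any, Dict, Type

# Toy stand-ins; names and signatures are assumed for illustration only.
class ChatCompletionChunk:
    def __init__(self, data: Dict[str, Any]) -> None:
        self.data = data

class AzureChatCompletionChunk(ChatCompletionChunk):
    @property
    def content_filter_results(self) -> Any:
        return self.data.get("content_filter_results")

class ToyBaseClient:
    def _process_response_data(self, *, data: Dict[str, Any], cast_to: Type[Any]) -> Any:
        return cast_to(data)

class ToyAzureClient(ToyBaseClient):
    def _process_response_data(self, *, data: Dict[str, Any], cast_to: Type[Any]) -> Any:
        # Substitute the Azure chunk type so streamed items expose the
        # Azure-only properties, without touching the stream internals.
        if cast_to is ChatCompletionChunk:
            cast_to = AzureChatCompletionChunk
        return super()._process_response_data(data=data, cast_to=cast_to)
```

This keeps the override at a single seam instead of reaching into each stream instance.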
Agreed on understanding the contract. It works as-is now, but I've definitely upped the number of _internal things I've been touching...
src/openai/azure/_azuremodels.py
Outdated
class AzureChatCompletionChoice(ChatChoice):
    content_filter_results: Optional[ContentFilterResults] = None
    message: AzureChatCompletionMessage  # TODO typing hates this
Yeah, subclassing invariant types can indeed be an issue here. This will be a gnarly issue I suspect.
I've sprinkled in some `type: ignore`s for now...
src/openai/azure/_azuremodels.py
Outdated
    choices: List[AzureChatCompletionChoice]  # TODO typing hates this
    # TODO service is still returning prompt_filter_results OR prompt_annotations
    # prompt_filter_results: Optional[List[PromptFilterResult]] = None
    prompt_annotations: Optional[List[PromptFilterResult]] = None
I think we can try to use typing.Sequence to limit the untypedness we expose ourselves to. If we get negative feedback, it will give us an opportunity to better understand why someone would want to mutate the return value.
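For context on why `Sequence` helps the type checker where `List` does not: `List` is invariant (it allows mutation, so `List[Sub]` is not a valid `List[Base]`), while `Sequence` is read-only and therefore covariant. A minimal self-contained sketch, with illustrative class names:

```python
from typing import List, Sequence

class Choice: ...

class AzureChoice(Choice):
    content_filter_results: dict = {}

def append_plain_choice(choices: List[Choice]) -> None:
    # Legal for a List[Choice] -- and exactly why List must be invariant:
    choices.append(Choice())

def first_choice(choices: Sequence[Choice]) -> Choice:
    # Read-only access, so Sequence is covariant:
    return choices[0]

azure_choices: List[AzureChoice] = [AzureChoice()]
# append_plain_choice(azure_choices)  # rejected by mypy: it could smuggle
#                                     # a plain Choice into a List[AzureChoice]
result = first_choice(azure_choices)  # accepted: List[AzureChoice] is a
                                      # valid Sequence[Choice]
```

So typing the field as `Sequence[...]` lets the Azure subclass narrow the element type without the `type: ignore`s, at the cost of advertising a read-only collection.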
Using typing.Sequence in place of typing.List seems to break deserialization.
Hmmm... interesting. Pydantic supports sequence for deserialization. So there has to be something else that is going on there?
I definitely blamed myself first, but toggling only that is what breaks it. Will need to investigate more...
Not urgent.
Just to be clear, when I say "breaks" I don't mean that it explodes. I just mean that it returns a dict instead of the model.
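One plausible mechanism for that symptom (an assumption, not confirmed from this thread): Pydantic's validation-bypassing `construct()` stores fields exactly as given, so any model-building path that skips or mis-handles a field shape leaves the nested payload as a raw dict. A minimal Pydantic demo of the "dict instead of the model" behavior, assuming the v1-style `parse_obj`/`construct` API (still available as deprecated aliases in v2):

```python
from typing import List
from pydantic import BaseModel

class Item(BaseModel):
    name: str

class Payload(BaseModel):
    items: List[Item]

# Normal validation recursively converts nested dicts into models:
validated = Payload.parse_obj({"items": [{"name": "a"}]})
assert isinstance(validated.items[0], Item)

# construct() skips validation and stores fields as given, so the nested
# payload stays a plain dict -- the same "dict instead of the model" symptom:
raw = Payload.construct(items=[{"name": "a"}])
assert isinstance(raw.items[0], dict)
```

If the SDK builds response models through a construct-style fast path that only special-cases certain container types, a `Sequence` field could fall through it the same way.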
We are going to have to do some additional clean-up. But it illustrates open issues well.