
Fixed CLI streamed chat completions. #319


Merged 2 commits into openai:main on May 8, 2023

Conversation

@MikeAmy (Contributor) commented on Mar 19, 2023

Streamed chat completions use a different response structure than non-streamed ones (deltas instead of full messages), which caused KeyError exceptions. These token deltas may also carry roles, and there may be empty deltas at the end of a completion. Handle these cases sensibly without breaking the non-streamed path.

I manually tested both the streamed and non-streamed paths, but didn't add unit tests, sorry, I was busy.
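The fix described above can be sketched as a shape-tolerant accessor. This is a minimal illustration, not the PR's actual diff; the dict shapes below are hypothetical stand-ins mirroring the v0.x-era response format:

```python
def choice_text(choice):
    """Return the text of one choice from either response shape.

    Non-streamed responses put text under "message"; streamed chunks
    use "delta", which may be role-only or empty at end of stream, so
    naive indexing with choice["delta"]["content"] raises KeyError.
    """
    if "message" in choice:  # non-streamed path
        return choice["message"].get("content", "")
    return choice.get("delta", {}).get("content", "")  # streamed path

# Hypothetical deltas mimicking a streamed response:
stream = [
    {"delta": {"role": "assistant"}},   # role-only delta
    {"delta": {"content": "Hello"}},
    {"delta": {"content": " world"}},
    {"delta": {}},                      # empty delta at end of stream
]
print("".join(choice_text(c) for c in stream))  # prints: Hello world
```

Using `.get()` with a default keeps the non-streamed path untouched while tolerating role-only and empty deltas.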

Streamed chat completions use a different response structure per returned token, also they may have roles and empty tokens at the end. Handle this sensibly.
@MikeAmy changed the title from Fixed streamed chat completions. to Fixed CLI streamed chat completions. on Mar 19, 2023
@hallacy requested a review from athyuttamre on March 30, 2023 04:43
@athyuttamre

Thanks for this PR (and apologies for the delay)! Our CLI today only renders the content of the completion, so I removed the part that renders the role.
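The content-only rendering mentioned above can be sketched as a small loop over already-parsed chunks. This is an illustrative sketch with hypothetical dict shapes, not the CLI's actual code:

```python
import io
import sys

def render_stream(chunks, out=sys.stdout):
    """Write only each delta's "content" to out, skipping role-only
    and empty deltas, then finish with a newline."""
    for chunk in chunks:
        content = chunk["choices"][0]["delta"].get("content")
        if content:
            out.write(content)
    out.write("\n")

# Usage with hypothetical chunks:
buf = io.StringIO()
render_stream(
    [
        {"choices": [{"delta": {"role": "assistant"}}]},  # skipped
        {"choices": [{"delta": {"content": "Hi!"}}]},
        {"choices": [{"delta": {}}]},                     # skipped
    ],
    out=buf,
)
print(buf.getvalue())  # prints: Hi!
```

Because the loop only writes truthy `content`, role announcements and the empty terminal delta never reach the terminal.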

@athyuttamre athyuttamre merged commit 91a63f2 into openai:main May 8, 2023
megamanics pushed a commit to devops-testbed/openai-python that referenced this pull request Aug 14, 2024
* Fixed streamed chat completions.

Streamed chat completions use a different response structure per returned token, also they may have roles and empty tokens at the end. Handle this sensibly.

* Only render content

---------

Co-authored-by: Atty Eleti <[email protected]>
cgayapr pushed a commit to cgayapr/openai-python that referenced this pull request Dec 14, 2024
* Fixed streamed chat completions.

Streamed chat completions use a different response structure per returned token, also they may have roles and empty tokens at the end. Handle this sensibly.

* Only render content

---------

Co-authored-by: Atty Eleti <[email protected]>
safa0 pushed a commit to safa0/openai-agents-python that referenced this pull request Apr 27, 2025
We don't really need mypy on 3.9 (unit tests would catch any real issues), and it causes problems with the rest of this stack.


---
* openai#324
* openai#322
* openai#321
* openai#320
* __->__ openai#319
safa0 pushed a commit to safa0/openai-agents-python that referenced this pull request Apr 27, 2025
### Summary:
1. Add the MCP dep for Python 3.10, since it doesn't support 3.9 and below
2. Create MCPServer, which is the Agents SDK representation of an MCP server
3. Create implementations for HTTP-SSE and StdIO servers, directly copying the [MCP SDK example](https://github.com/modelcontextprotocol/python-sdk/blob/main/examples/clients/simple-chatbot/mcp_simple_chatbot/main.py)
4. Add a util to transform MCP tools into Agent SDK tools

Note: I added optional caching support to the servers. That way, if you
happen to know a server's tools don't change, you can just cache them.

### Test Plan:

Checks pass. I added tests at the end of the stack.

--- 

openai#324
openai#322
openai#321
-> openai#320
openai#319
3 participants