Commit 92e44aa

Add new include option for verbose chunks in streaming response (#21)
1 parent f2ee2fa commit 92e44aa

File tree

1 file changed: +8, -0


proto/xai/api/v1/chat.proto

Lines changed: 8 additions & 0 deletions

@@ -470,6 +470,14 @@ enum IncludeOption {
 
   // Include the inline citations in the final response.
   INCLUDE_OPTION_INLINE_CITATIONS = 7;
+
+  // Stream back any chunks that are generated by the model or the agent tools
+  // even if there is no user-visible content in the chunk, e.g. only the usage
+  // statistics are being updated.
+  // By default, when this option is not included, chunks without user-visible
+  // content are not streamed to the client.
+  // This option is only available for streaming responses.
+  INCLUDE_OPTION_VERBOSE_STREAMING = 8;
 }
 
 // A message in a conversation. This message is part of the model input. Each
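The semantics the new option describes can be sketched as a simple filter over a chunk stream. This is a minimal illustration, not the actual server or client code: the `filter_stream` helper and the dict-shaped chunks are hypothetical; only the enum value `INCLUDE_OPTION_VERBOSE_STREAMING = 8` comes from the diff above.

```python
# Hypothetical sketch of the INCLUDE_OPTION_VERBOSE_STREAMING semantics:
# by default, chunks with no user-visible content (e.g. usage-only updates)
# are suppressed; including the option streams every chunk through.

INCLUDE_OPTION_VERBOSE_STREAMING = 8  # value from chat.proto

def filter_stream(chunks, include_options):
    """Yield only the chunks a client should receive (illustrative only)."""
    verbose = INCLUDE_OPTION_VERBOSE_STREAMING in include_options
    for chunk in chunks:
        # Treat a chunk as user-visible if it carries content text.
        if chunk.get("content") or verbose:
            yield chunk

chunks = [
    {"content": "Hello"},
    {"usage": {"total_tokens": 5}},   # usage-only update, no visible content
    {"content": " world"},
]

# Default: the usage-only chunk is dropped.
assert len(list(filter_stream(chunks, []))) == 2
# With the verbose option: all chunks are streamed.
assert len(list(filter_stream(chunks, [INCLUDE_OPTION_VERBOSE_STREAMING]))) == 3
```

Note this only applies to streaming responses, per the proto comment; a non-streaming response would not carry incremental chunks to filter.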
