Add fallback to Converse API for models without streaming support
Add an automatic fallback to the non-streaming Converse API when a model
doesn't support ConverseStream.
- Detect streaming validation errors and retry with Converse API
- Remember non-streaming models to avoid future retry attempts
This change allows llmcli to work with newly added gpt-oss models.
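The fallback described above can be sketched as follows. This is an illustrative stand-in, not llmcli's actual code: the client object, the `StreamingNotSupportedError` exception name, and the module-level cache are all hypothetical, standing in for the Bedrock runtime client, its streaming validation error, and whatever bookkeeping llmcli uses to remember non-streaming models.

```python
class StreamingNotSupportedError(Exception):
    """Stand-in for the validation error Bedrock raises when a model
    does not support the ConverseStream API."""


# Models already known to lack streaming support; checked before each
# call so we never retry ConverseStream for a model that failed once.
_non_streaming_models: set[str] = set()


def converse(client, model_id: str, messages):
    """Try ConverseStream first; on a streaming validation error,
    remember the model and fall back to the non-streaming Converse API."""
    if model_id not in _non_streaming_models:
        try:
            return client.converse_stream(model_id, messages)
        except StreamingNotSupportedError:
            _non_streaming_models.add(model_id)
    return client.converse(model_id, messages)


class FakeClient:
    """Hypothetical test double: its streaming call always fails,
    mimicking a model without ConverseStream support."""

    def __init__(self):
        self.calls = []

    def converse_stream(self, model_id, messages):
        self.calls.append("stream")
        raise StreamingNotSupportedError(model_id)

    def converse(self, model_id, messages):
        self.calls.append("plain")
        return "ok"
```

With this shape, the first request to a non-streaming model pays one failed ConverseStream attempt; every later request goes straight to Converse.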
README.md (1 addition, 1 deletion)
```diff
@@ -18,7 +18,7 @@ llmcli -h

 ## Requirements

-- AWS account with Bedrock access and at least one [model that supports ConverseStream API and system prompt](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html) enabled.
+- AWS account with Bedrock access and at least one [model that supports ConverseStream (or Converse) API and system prompt](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html) enabled.
   Configure which model to use with `LLMCLI_MODEL` environment variable (example: `us.amazon.nova-micro-v1:0`).
 - Properly configured AWS credentials.
   This tool tries to use AWS profile named “llmcli”, and falls back to default AWS credentials if profile is not found.
```