Conversation


@daniel-lxs daniel-lxs commented Oct 28, 2025

Fixes immediate retry loop for mid-stream failures by applying the same exponential backoff logic used for first-chunk errors.

Changes:

  • Extract shared backoffAndAnnounce helper to prevent code duplication
  • Add retryAttempt counter to stack items for proper backoff progression
  • Apply exponential backoff when autoApprovalEnabled and alwaysApproveResubmit are enabled
  • Include debug throw for testing the mid-stream retry path

Previously, mid-stream failures would retry immediately without any delay, potentially overwhelming failing APIs. Now they follow the same rate-limiting policy as first-chunk failures.
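To make the behavior concrete, here is a minimal sketch of what the shared helper could look like; the parameter names and the say callback are assumptions for illustration, and the real implementation lives in Task.ts:

```ts
// Minimal sketch only — not the actual Task.ts implementation. baseDelaySeconds and the
// say() callback stand in for the task's real configuration and UI surface.
async function backoffAndAnnounce(
	retryAttempt: number,
	errorMsg: string,
	baseDelaySeconds: number,
	say: (text: string) => Promise<void>,
): Promise<void> {
	// Exponential backoff: base * 2^attempt, capped so the wait never grows unbounded.
	const delaySeconds = Math.min(baseDelaySeconds * Math.pow(2, retryAttempt), 600)

	// Announce a per-second countdown so the user can see the retry is being rate limited.
	for (let remaining = delaySeconds; remaining > 0; remaining--) {
		await say(`Retrying in ${remaining}s: ${errorMsg}`)
		await new Promise<void>((resolve) => setTimeout(resolve, 1000))
	}
}
```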


Important

Adds exponential backoff for mid-stream retry failures in Task.ts, aligning retry behavior with first-chunk errors.

  • Behavior:
    • Implements exponential backoff for mid-stream retry failures in recursivelyMakeClineRequests() in Task.ts, aligning with first-chunk error handling.
    • Adds retryAttempt counter to StackItem for tracking retry attempts.
    • Applies backoff when autoApprovalEnabled and alwaysApproveResubmit are true.
  • Functions:
    • Extracts backoffAndAnnounce() for shared backoff logic, reducing code duplication.
  • Misc:
    • Adds debug throw for testing mid-stream retry path.

This description was created by Ellipsis for 45ed85d.
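For context, the StackItem extension mentioned in the summary might look roughly like this; every field other than retryAttempt is elided or assumed:

```ts
// Sketch only — the real StackItem in Task.ts has more fields; the shapes here are assumed.
interface StackItem {
	userContent: unknown[]
	includeFileDetails?: boolean
	// Number of consecutive mid-stream retries so far; drives the exponential backoff
	// and is incremented each time the request is retried after a failure.
	retryAttempt?: number
}
```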

- Extend StackItem with retryAttempt counter
- Extract shared backoffAndAnnounce helper for consistent retry UX
- Apply exponential backoff to mid-stream failures when auto-approval enabled
- Add debug throw for testing mid-stream retry path
@daniel-lxs daniel-lxs requested review from cte, jr and mrubens as code owners October 28, 2025 15:51
@dosubot dosubot bot added the size:L (This PR changes 100-499 lines, ignoring generated files) and bug (Something isn't working) labels Oct 28, 2025

roomote bot commented Oct 28, 2025

Review Complete

No issues found. The implementation correctly adds exponential backoff for mid-stream retry failures.


Allows early exit from exponential backoff if task is cancelled during delay
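In other words, the countdown checks for cancellation between ticks rather than always sleeping the full delay. A minimal sketch, assuming a cancellation callback in place of the task's real abort state:

```ts
// Sketch only — isCancelled() and announce() are assumptions standing in for the
// task's real abort flag and its say() UI method.
async function countdownWithEarlyExit(
	delaySeconds: number,
	isCancelled: () => boolean,
	announce: (text: string) => Promise<void>,
): Promise<boolean> {
	for (let remaining = delaySeconds; remaining > 0; remaining--) {
		if (isCancelled()) {
			return false // task was cancelled mid-backoff; caller should not retry
		}
		await announce(`Retrying in ${remaining}s...`)
		await new Promise<void>((resolve) => setTimeout(resolve, 1000))
	}
	return true // full delay elapsed; caller may proceed with the retry
}
```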
@hannesrudolph hannesrudolph added the Issue/PR - Triage label (New issue. Needs quick review to confirm validity and assign labels.) Oct 28, 2025
@dosubot dosubot bot added the lgtm label (This PR has been approved by a maintainer) Oct 28, 2025
@mrubens mrubens merged commit be119bc into main Oct 28, 2025
15 checks passed
@mrubens mrubens deleted the feat/mid-stream-retry-backoff branch October 28, 2025 16:55
@github-project-automation github-project-automation bot moved this from Triage to Done in Roo Code Roadmap Oct 28, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Oct 28, 2025

mini2s commented Oct 29, 2025

@daniel-lxs @mrubens

1. Make sure both autoApprovalEnabled and alwaysApproveResubmit are enabled, so the if (autoApprovalEnabled && alwaysApproveResubmit) branch is taken.
2. Send a bad message (for example, give the provider an invalid API key).
3. Click Cancel or click New Task.
4. The program is stuck in a loop:

```ts
// note that this api_req_failed ask is unique in that we only present this option if the api
// hasn't streamed any content yet (ie it fails on the first chunk due), as it would allow them
// to hit a retry button. However if the api failed mid-stream, it could be in any arbitrary
// state where some tools may have executed, so that error is handled differently and requires
// cancelling the task entirely.
if (autoApprovalEnabled && alwaysApproveResubmit) { // 😱😱 loop 😱😱
	let errorMsg

	if (error.error?.metadata?.raw) {
		errorMsg = JSON.stringify(error.error.metadata.raw, null, 2)
	} else if (error.message) {
		errorMsg = error.message
	} else {
		errorMsg = "Unknown error"
	}

	// Apply shared exponential backoff and countdown UX
	await this.backoffAndAnnounce(retryAttempt, error, errorMsg) // 😱😱 loop 😱😱

	// Delegate generator output from the recursive call with
	// incremented retry count.
	yield* this.attemptApiRequest(retryAttempt + 1) // 😱😱 loop 😱😱

	return
} else {
```
