
Conversation

Contributor

@roomote roomote bot commented Jan 29, 2026

Summary

This PR attempts to address Issue #11071 by enabling the mergeToolResultText option for LM Studio and BaseOpenAiCompatibleProvider to align their message formatting behavior with the Z.ai provider.

Problem

GLM4.5 and other models get stuck in repeated file read loops when using LM Studio or OpenAI-compatible endpoints. This is because text content after tool_results (like environment_details) creates separate user messages that can disrupt the model's conversation flow.
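
To make the failure mode concrete, here is a minimal sketch of how one tool round can serialize when the trailing text is split out. The message shapes follow the OpenAI chat-completions format; the tool call, IDs, and content are illustrative assumptions, not code from this repository:

```typescript
// Illustrative only: approximate OpenAI-format messages for one tool round
// when the trailing text block is NOT merged. The extra user message can read
// to the model like a fresh request, prompting it to read the file again.
type ChatMessage = {
	role: "assistant" | "tool" | "user"
	content: string | null
	tool_call_id?: string
	tool_calls?: Array<{ id: string; type: "function"; function: { name: string; arguments: string } }>
}

const withoutMerge: ChatMessage[] = [
	{
		role: "assistant",
		content: null,
		tool_calls: [{ id: "call_1", type: "function", function: { name: "read_file", arguments: '{"path":"src/app.ts"}' } }],
	},
	{ role: "tool", tool_call_id: "call_1", content: "...file contents..." },
	// The text that followed the tool_result (e.g. environment_details)
	// becomes its own user message:
	{ role: "user", content: "<environment_details>...</environment_details>" },
]
```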

Solution

Enable the mergeToolResultText: true option when calling convertToOpenAiMessages() in:

  • src/api/providers/lm-studio.ts
  • src/api/providers/base-openai-compatible-provider.ts

This matches the behavior of the Z.ai provider, which already uses this option to preserve conversation flow by merging text content after tool_results into the last tool message.
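
With the option enabled, the same round serializes roughly as below (same assumptions as the previous sketch, including how the merged text is concatenated): the trailing text is appended to the last tool message rather than emitted as a new user turn.

```typescript
// Illustrative only: the same tool round WITH mergeToolResultText enabled.
// The trailing text rides inside the tool message, so the exchange stays a
// single assistant -> tool round with no extra user turn.
type ChatMessage = {
	role: "assistant" | "tool"
	content: string | null
	tool_call_id?: string
	tool_calls?: Array<{ id: string; type: "function"; function: { name: string; arguments: string } }>
}

const withMerge: ChatMessage[] = [
	{
		role: "assistant",
		content: null,
		tool_calls: [{ id: "call_1", type: "function", function: { name: "read_file", arguments: '{"path":"src/app.ts"}' } }],
	},
	{
		role: "tool",
		tool_call_id: "call_1",
		content: "...file contents...\n\n<environment_details>...</environment_details>",
	},
]
```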

Changes

  • LM Studio provider: Added mergeToolResultText: true to the convertToOpenAiMessages() call
  • BaseOpenAiCompatibleProvider: Added mergeToolResultText: true to the convertToOpenAiMessages() call (both call sites are sketched below)
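
At each call site the change amounts to passing one option. The sketch below shows the rough shape of the updated call; the import path, the surrounding helper, and the exact options signature of convertToOpenAiMessages() are assumptions based on this description rather than code quoted from the repository.

```typescript
// Sketch only: import path and options signature are assumed, not quoted
// from the repository.
import type { Anthropic } from "@anthropic-ai/sdk"
import { convertToOpenAiMessages } from "../transform/openai-format"

function buildOpenAiMessages(systemPrompt: string, messages: Anthropic.Messages.MessageParam[]) {
	return [
		{ role: "system" as const, content: systemPrompt },
		// The change in this PR: pass mergeToolResultText so trailing text
		// blocks are folded into the last tool message.
		...convertToOpenAiMessages(messages, { mergeToolResultText: true }),
	]
}
```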

Testing

  • All existing tests pass (65 tests across the affected files)
  • TypeScript type checking passes
  • Linting passes

Related

  • Issue #11071

Feedback and guidance are welcome.


Important

Enable mergeToolResultText for LM Studio and OpenAI-compatible providers to prevent conversation flow disruption.

  • Behavior:
    • Enables mergeToolResultText: true in convertToOpenAiMessages() for LM Studio and BaseOpenAiCompatibleProvider to prevent conversation flow disruption.
  • Files:
    • lm-studio.ts: Updated createMessage() to use mergeToolResultText: true.
    • base-openai-compatible-provider.ts: Updated createStream() to use mergeToolResultText: true.
  • Testing:
    • All existing tests pass (65 tests across affected files).
    • TypeScript type checking and linting pass.

This description was created by Ellipsis for 74deee8.

…roviders

Enable the mergeToolResultText option for LM Studio and BaseOpenAiCompatibleProvider
to align their message formatting behavior with the Z.ai provider. This change helps
prevent models like GLM4.5 from getting stuck in repeated file read loops by merging
text content after tool_results into the last tool message instead of creating separate
user messages that can disrupt the model's conversation flow.

Fixes #11071
Contributor Author

roomote bot commented Jan 29, 2026

Rooviewer: See task on Roo Cloud

Review complete. No issues found.

The changes correctly enable mergeToolResultText for LM Studio and OpenAI-compatible providers, which aligns their message formatting behavior with the Z.ai provider. This helps prevent models like GLM4.5 from getting stuck in repeated file read loops by merging text content after tool_results into the last tool message instead of creating separate user messages.

Mention @roomote in a comment to request specific changes to this pull request or fix all unresolved issues.


Development

Successfully merging this pull request may close these issues.

[BUG] GLM4.5 via LMStudio as well as via an OpenAI-compatible endpoint stuck repeating file reads
