feat: add zai-org/GLM-4.5-turbo model to Chutes provider#8157

Merged
mrubens merged 1 commit into main from feature/add-glm-4.5-turbo-chutes
Sep 23, 2025

Conversation


@roomote roomote bot commented Sep 18, 2025

This PR adds support for the zai-org/GLM-4.5-turbo model to the Chutes API provider.

Changes

  • Added zai-org/GLM-4.5-turbo to the ChutesModelId type definition
  • Configured the model with correct metadata from the Chutes API:
    • Context window: 131,072 tokens (128K)
    • Input pricing: $1 per 1M tokens
    • Output pricing: $3 per 1M tokens
    • Description: "GLM-4.5-turbo model with 128K token context window, optimized for fast inference."
  • Added comprehensive test coverage for the new model
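
A minimal TypeScript sketch of what the chutes.ts change might look like, based on the fields listed above. The ModelInfo shape and the maxTokens value are assumptions; only the model id, context window, pricing, and description come from this PR.

```typescript
// Hypothetical sketch of the chutes.ts addition. ModelInfo's exact shape and
// the maxTokens value are assumptions; the id, context window, pricing, and
// description are taken from the PR description.
type ChutesModelId = "zai-org/GLM-4.5-turbo" // | ...the provider's other model ids

interface ModelInfo {
	maxTokens: number
	contextWindow: number
	supportsPromptCache: boolean
	inputPrice: number // USD per 1M input tokens
	outputPrice: number // USD per 1M output tokens
	description: string
}

const chutesModels: Record<ChutesModelId, ModelInfo> = {
	"zai-org/GLM-4.5-turbo": {
		maxTokens: 32768, // assumed; the exact value is not shown on this page
		contextWindow: 131072, // 128K token context window
		supportsPromptCache: false,
		inputPrice: 1, // $1 per 1M tokens
		outputPrice: 3, // $3 per 1M tokens
		description:
			"GLM-4.5-turbo model with 128K token context window, optimized for fast inference.",
	},
}

console.log(chutesModels["zai-org/GLM-4.5-turbo"].contextWindow) // 131072
```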

Testing

  • ✅ All existing tests pass
  • ✅ New test case added and passing
  • ✅ TypeScript compilation successful
  • ✅ Linting checks pass

Related Issues

Fixes #8155

Context

This implementation uses the exact model metadata provided by @mugnimaestra from the Chutes API endpoint (https://llm.chutes.ai/v1/models).

Feedback and guidance are welcome!


Important

Add zai-org/GLM-4.5-turbo model to Chutes provider with specific configuration and test coverage.

  • Behavior:
    • Add zai-org/GLM-4.5-turbo to ChutesModelId in chutes.ts.
    • Configure model with 131,072 token context window, $1 input and $3 output pricing per 1M tokens.
    • Description: "GLM-4.5-turbo model with 128K token context window, optimized for fast inference."
  • Testing:
    • Add test in chutes.spec.ts to verify zai-org/GLM-4.5-turbo model configuration.
    • Ensure test checks for correct maxTokens, contextWindow, inputPrice, outputPrice, and description.
  • Misc:
    • All existing tests pass.
    • TypeScript compilation and linting checks pass.

This description was created by Ellipsis for 5a068d4. You can customize this summary. It will automatically update as commits are pushed.

- Added GLM-4.5-turbo to ChutesModelId type definition
- Configured model with 128K context window and $1/$3 pricing
- Added comprehensive test coverage for the new model
- Verified all tests pass and TypeScript compilation succeeds

Fixes #8155
@roomote roomote bot requested review from cte, jr and mrubens as code owners September 18, 2025 16:59
@dosubot dosubot bot added the size:M and Enhancement labels Sep 18, 2025
@hannesrudolph hannesrudolph added the Issue/PR - Triage label Sep 18, 2025
@roomote roomote bot left a comment
Reviewing my own code is like grading my own homework - suspiciously perfect yet somehow still wrong.

supportsPromptCache: false,
inputPrice: 1,
outputPrice: 3,
description: "GLM-4.5-turbo model with 128K token context window, optimized for fast inference.",
@roomote roomote bot left a comment

The implementation correctly adds the GLM-4.5-turbo model with accurate metadata from the Chutes API. The pricing ($1 input, $3 output per 1M tokens) and context window (131,072 tokens) match the specification provided by @mugnimaestra.

@roomote roomote bot left a comment

Good test coverage! The test properly verifies all the model configuration parameters including the default temperature (0.5) for non-DeepSeek models.
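
The default temperature the reviewer mentions could be selected along these lines; the function name and the DeepSeek default value are assumptions, not the actual implementation.

```typescript
// Hypothetical sketch of picking a default temperature per model family.
// Only the 0.5 default for non-DeepSeek models comes from the review comment;
// the function name and the DeepSeek branch value are assumptions.
const DEFAULT_TEMPERATURE = 0.5

function getDefaultTemperature(modelId: string): number {
	// DeepSeek models get a model-specific default (value assumed here)
	return modelId.toLowerCase().includes("deepseek") ? 0.6 : DEFAULT_TEMPERATURE
}

console.log(getDefaultTemperature("zai-org/GLM-4.5-turbo")) // 0.5
```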

@daniel-lxs daniel-lxs (Member) left a comment

LGTM

@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Review] in Roo Code Roadmap Sep 23, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Sep 23, 2025
@mrubens mrubens merged commit 382ab63 into main Sep 23, 2025
23 checks passed
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Sep 23, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Sep 23, 2025
@mrubens mrubens deleted the feature/add-glm-4.5-turbo-chutes branch September 23, 2025 21:33

Labels

  • Enhancement: New feature or request
  • Issue/PR - Triage: New issue. Needs quick review to confirm validity and assign labels.
  • lgtm: This PR has been approved by a maintainer
  • size:M: This PR changes 30-99 lines, ignoring generated files.


Development

Successfully merging this pull request may close these issues.

[ENHANCEMENT] Add zai-org/GLM-4.5-turbo to Chutes API provider

4 participants