feat: add zai-org/GLM-4.5-turbo model to Chutes provider #8157
base: main
Conversation
- Added GLM-4.5-turbo to the ChutesModelId type definition
- Configured the model with a 128K context window and $1/$3 pricing
- Added test coverage for the new model
- Verified all tests pass and TypeScript compilation succeeds

Fixes #8155
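For readers unfamiliar with the provider definitions, the first bullet amounts to extending a string-literal union. The sketch below is illustrative only; the surrounding entries and file layout are assumptions, not taken from the diff.

```typescript
// Sketch only: the neighboring entry is a placeholder, not the real contents of chutes.ts.
export type ChutesModelId =
	| "deepseek-ai/DeepSeek-R1" // placeholder for an existing entry
	| "zai-org/GLM-4.5-turbo" // new model added by this PR
```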
Reviewing my own code is like grading my own homework - suspiciously perfect yet somehow still wrong.
supportsPromptCache: false,
inputPrice: 1,
outputPrice: 3,
description: "GLM-4.5-turbo model with 128K token context window, optimized for fast inference.",
The implementation correctly adds the GLM-4.5-turbo model with accurate metadata from the Chutes API. The pricing ($1 input, $3 output per 1M tokens) and context window (131,072 tokens) match the specification provided by @mugnimaestra.
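For context, a hedged reconstruction of what the full model entry might look like; only contextWindow, supportsPromptCache, the pricing, and the description are confirmed by the quoted diff and the review comment, and the maxTokens value is a placeholder.

```typescript
// Hedged reconstruction; fields not shown in the diff or review comments are assumptions.
const glm45TurboInfo = {
	maxTokens: 32768, // assumed placeholder; the PR only states that this field is covered by tests
	contextWindow: 131072, // 128K tokens, per the review comment
	supportsPromptCache: false,
	inputPrice: 1, // $1 per 1M input tokens
	outputPrice: 3, // $3 per 1M output tokens
	description: "GLM-4.5-turbo model with 128K token context window, optimized for fast inference.",
}
```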
supportsPromptCache: false,
inputPrice: 1,
outputPrice: 3,
description: "GLM-4.5-turbo model with 128K token context window, optimized for fast inference.",
Good test coverage! The test properly verifies all the model configuration parameters including the default temperature (0.5) for non-DeepSeek models.
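As a rough illustration of what such a spec could look like, here is a minimal sketch in the style of chutes.spec.ts; the chutesModels export, the import path, and the exact test layout are assumptions, and the handler-level default-temperature (0.5) check mentioned above is omitted.

```typescript
// Hedged sketch of an assertion block for the new model; names are assumed for illustration.
import { describe, it, expect } from "vitest"

import { chutesModels } from "../chutes" // assumed import path

describe("zai-org/GLM-4.5-turbo", () => {
	it("exposes the expected model configuration", () => {
		const model = chutesModels["zai-org/GLM-4.5-turbo"]
		expect(model.contextWindow).toBe(131072)
		expect(model.supportsPromptCache).toBe(false)
		expect(model.inputPrice).toBe(1)
		expect(model.outputPrice).toBe(3)
		expect(model.description).toContain("128K token context window")
	})
})
```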
This PR adds support for the zai-org/GLM-4.5-turbo model to the Chutes API provider.
Changes
Added zai-org/GLM-4.5-turbo to the ChutesModelId type definition

Testing
Related Issues
Fixes #8155
Context
This implementation uses the exact model metadata provided by @mugnimaestra from the Chutes API endpoint (https://llm.chutes.ai/v1/models).
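One way to double-check that metadata is to query the endpoint above directly. The sketch below is illustrative only: the response shape (a data array of objects with an id field) follows the common OpenAI-compatible /v1/models convention and is an assumption, not something stated in this PR.

```typescript
// Illustrative helper for looking up a model entry from the Chutes models endpoint.
async function findChutesModel(modelId: string) {
	const res = await fetch("https://llm.chutes.ai/v1/models")
	if (!res.ok) throw new Error(`Chutes API returned ${res.status}`)
	const body = (await res.json()) as { data?: Array<{ id: string }> }
	return body.data?.find((model) => model.id === modelId)
}

// Example: await findChutesModel("zai-org/GLM-4.5-turbo")
```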
Feedback and guidance are welcome!
Important
Add zai-org/GLM-4.5-turbo model to Chutes provider with specific configuration and test coverage.

- Adds zai-org/GLM-4.5-turbo to ChutesModelId in chutes.ts.
- Adds a test in chutes.spec.ts to verify the zai-org/GLM-4.5-turbo model configuration: maxTokens, contextWindow, inputPrice, outputPrice, and description.