fix(ollama): ensure stable message ID for streaming responses#1008
Merged
LearningGp merged 3 commits into agentscope-ai:main on Mar 26, 2026
Conversation
Contributor
Pull request overview
Fixes Ollama streaming so all emitted ChatResponse chunks share a single stable id, addressing downstream consumers (e.g., AG-UI) that require a consistent messageId across start/content/end events.
Changes:
- Cache and reuse the first streamed `ChatResponse.id` for the remainder of the stream in `OllamaChatModel`.
- Add a unit test to assert non-null and stable IDs across all streamed chunks.
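The id-caching change can be sketched with standard-library types only (`ChatResponse` below is a minimal stand-in for the real class; the actual implementation lives in `OllamaChatModel` and may differ in detail):

```java
import java.util.List;
import java.util.UUID;
import java.util.concurrent.atomic.AtomicReference;
import java.util.stream.Collectors;

public class StableIdSketch {
    // Minimal stand-in for the real ChatResponse type.
    record ChatResponse(String id, String content) {}

    public static void main(String[] args) {
        // Simulated Ollama stream: the backend returns no id, so each
        // chunk would otherwise be assigned a fresh UUID.
        List<ChatResponse> raw = List.of(
                new ChatResponse(UUID.randomUUID().toString(), "Hel"),
                new ChatResponse(UUID.randomUUID().toString(), "lo"),
                new ChatResponse(UUID.randomUUID().toString(), "!"));

        // Cache the first chunk's id and reuse it for the rest of the stream.
        AtomicReference<String> cachedId = new AtomicReference<>();
        List<ChatResponse> stable = raw.stream()
                .map(chunk -> {
                    String id = cachedId.updateAndGet(
                            prev -> prev != null ? prev : chunk.id());
                    return new ChatResponse(id, chunk.content());
                })
                .collect(Collectors.toList());

        // Every emitted chunk now shares the first chunk's id.
        String first = stable.get(0).id();
        for (ChatResponse c : stable) {
            if (!first.equals(c.id())) throw new AssertionError("unstable id");
        }
        System.out.println("all chunks share id: " + first.equals(raw.get(0).id()));
    }
}
```

The same pattern applies unchanged inside a reactive `Flux.map` operator, since the `AtomicReference` carries the per-stream state across emissions.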
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| agentscope-core/src/main/java/io/agentscope/core/model/OllamaChatModel.java | Ensures a stable `ChatResponse.id` is applied across the full Ollama streaming lifecycle. |
| agentscope-core/src/test/java/io/agentscope/core/model/OllamaChatModelTest.java | Adds coverage verifying streaming responses have a non-null, consistent id across chunks. |
agentscope-core/src/main/java/io/agentscope/core/model/OllamaChatModel.java
Codecov Report: ✅ All modified and coverable lines are covered by tests.
LearningGp requested changes on Mar 24, 2026
agentscope-core/src/main/java/io/agentscope/core/model/OllamaChatModel.java
Description
Closes #897
Fixed an issue where Ollama streaming output generated a different UUID for each chunk, because the underlying layer does not return an ID. By caching the ID of the first chunk and reusing it for the lifetime of the stream, downstream AG-UI clients now receive a stable messageId across start/content/end events.
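The downstream impact can be illustrated from the consumer's side: an AG-UI-style client groups streamed deltas by messageId, so unstable ids fragment one reply into several messages. This is a hypothetical sketch (the `Chunk` type and event shape are assumptions, not the real AG-UI API):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class MessageAccumulator {
    // Stand-in for a streamed content event carrying a messageId.
    record Chunk(String messageId, String delta) {}

    // Consumers assemble deltas into messages keyed by messageId.
    static Map<String, StringBuilder> accumulate(List<Chunk> chunks) {
        Map<String, StringBuilder> byId = new LinkedHashMap<>();
        for (Chunk c : chunks) {
            byId.computeIfAbsent(c.messageId(), k -> new StringBuilder())
                .append(c.delta());
        }
        return byId;
    }

    public static void main(String[] args) {
        // Before the fix: each chunk carried a fresh UUID, so the
        // consumer sees three unrelated one-piece messages.
        Map<String, StringBuilder> broken = accumulate(List.of(
                new Chunk("id-1", "Hel"),
                new Chunk("id-2", "lo"),
                new Chunk("id-3", "!")));
        System.out.println("messages before fix: " + broken.size());

        // After the fix: all chunks reuse the first chunk's id, so the
        // deltas assemble into a single message.
        Map<String, StringBuilder> fixed = accumulate(List.of(
                new Chunk("id-1", "Hel"),
                new Chunk("id-1", "lo"),
                new Chunk("id-1", "!")));
        System.out.println("messages after fix: " + fixed.size()
                + ", content: " + fixed.get("id-1"));
    }
}
```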
Checklist
Please check the following items before the code is ready to be reviewed.
- Code is formatted (`mvn spotless:apply`)
- Tests pass (`mvn test`)