Conversation

logicalor

Q              A
Bug fix?       yes
New feature?   yes
Docs?          yes/no
Issues         Fix #338
License        MIT

Both a bug fix and a feature: this adds support for streaming output from Ollama and prevents an error from being thrown when streaming output is requested with a chat request.
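
For context, here is a minimal sketch of what enabling streaming could look like from the consumer side. The class names, the factory call, and the 'stream' option are assumptions based on the repository's existing Ollama examples, not something this PR defines; only the chunk loop mirrors the example added here, and the exact return/iteration API may differ.

use Symfony\AI\Platform\Bridge\Ollama\Ollama;
use Symfony\AI\Platform\Bridge\Ollama\PlatformFactory;
use Symfony\AI\Platform\Message\Message;
use Symfony\AI\Platform\Message\MessageBag;

// Assumed setup: a platform pointing at a local Ollama instance.
$platform = PlatformFactory::create('http://localhost:11434');

$messages = new MessageBag(Message::ofUser('Write a haiku about streams.'));

// Assumed option: ask for a streamed response instead of a single payload.
$result = $platform->invoke(new Ollama('llama3.2'), $messages, ['stream' => true]);

// Emit each chunk as it is received (mirrors the example added in this PR).
foreach ($result as $chunk) {
    echo $chunk->getContent();
}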

@carsonbot carsonbot added the Bug (Something isn't working), Feature (New feature) and Status: Needs Review labels Aug 21, 2025
@carsonbot carsonbot changed the title 338 Support streaming output from Ollama 338 Support streaming output from Ollama Aug 21, 2025
@logicalor logicalor changed the title 338 Support streaming output from Ollama to [Platform] Support streaming output from Ollama Aug 21, 2025
@OskarStark
Contributor

Can you please add an example in the examples/ directory? Thanks

@OskarStark OskarStark changed the title [Platform] Support streaming output from Ollama to [Platform][Ollama] Support streaming output Aug 21, 2025
@OskarStark OskarStark added the Platform (Issues & PRs about the AI Platform component) label Aug 21, 2025
@logicalor
Author

> Can you please add an example in the examples/ directory? Thanks

@OskarStark updated with an example

@OskarStark
Contributor

Thanks

It looks like your committer email is not associated with your GitHub account

@logicalor
Author

> Thanks
>
> It looks like your committer email is not associated with your GitHub account

@OskarStark thanks, hadn't noticed that. Added.

@logicalor logicalor requested a review from Guikingone August 23, 2025 05:26
    if ($msg) {
        yield $msg;
    }
}
Contributor

Question for @chr-hertel (not relevant for the review): isn't this the same situation that we're facing in #324?

Member

It's not, since this only works on the response in the ResultConverter, which is only called when the content is finally fetched. In #324 the stream is already started in the client.

@logicalor logicalor requested a review from Guikingone August 23, 2025 10:00
// Emit each chunk as it is received
foreach ($result as $chunk) {
    echo $chunk->getContent();
}
Member

Suggested change
}
echo \PHP_EOL;

Member

Let's call the file just stream.php - it's more consistent with the other streaming examples, and ollama is already the folder name.

@chr-hertel
Member

which ollama version do you use?

Comment on lines +38 to +41

$data = json_decode($json, true);
if (!\is_array($data)) {
    return null;
}
Member

Please move this to the result converter - it's its job to convert to data. Fine with having the value object, but let's keep it simple then 👍
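
For illustration, a minimal sketch of what that decoding could look like once it lives in the result converter. Ollama streams one JSON object per line, with the chunk text under message.content and a final object flagged done: true; the method name, the generator input, and the surrounding class are assumptions for this sketch, not the PR's actual code.

// Hypothetical converter helper: turn Ollama's newline-delimited JSON stream
// into a stream of text chunks.
private function convertStream(iterable $lines): \Generator
{
    foreach ($lines as $json) {
        $data = json_decode($json, true);

        // Ignore lines that are not valid JSON objects.
        if (!\is_array($data)) {
            continue;
        }

        // Each streamed chat chunk carries its text in message.content.
        $content = $data['message']['content'] ?? '';
        if ('' !== $content) {
            yield $content;
        }

        // The last chunk of the stream is flagged with "done": true.
        if ($data['done'] ?? false) {
            return;
        }
    }
}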

Member

@chr-hertel chr-hertel left a comment

Thanks @logicalor for reporting and working on this - we need this :)

Made some comments, but I think all in all we're close to merging it 👍
