Support reasoning generation property #39

Open

gerukin opened this issue Feb 11, 2025 · 0 comments

Is your feature request related to a problem? Please describe.
Reasoning models output both a final answer and a reasoning (chain-of-thought) part. The format depends on the provider and model. This Ollama provider does not currently appear to expose this for any model, so manual parsing is required to separate the reasoning from the actual answer.

Describe the solution you'd like
Some models and providers (e.g. R1) support a reasoning property on the generated response object. It would be ideal if this were automatically set based on the selected model.

Describe alternatives you've considered
It is possible to use extractReasoningMiddleware from the AI SDK, but this requires keeping track of model specifics (such as model-name regexes and reasoning tag names) in the app. This is more of a fallback solution, and it is normally not required for well-supported models in other providers.
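To illustrate the fallback described above, here is a minimal sketch of what tag-based reasoning extraction has to do, assuming R1-style `<think>...</think>` tags (the tag name is exactly the kind of model-specific detail the app would have to track; this is not the AI SDK's actual implementation):

```typescript
// Sketch of tag-based reasoning extraction, similar in spirit to the
// AI SDK's extractReasoningMiddleware. ASSUMPTION: the model wraps its
// chain of thought in <think>...</think> tags; other models use
// different tags, so the tagName must be configured per model.
function splitReasoning(
  text: string,
  tagName = "think",
): { reasoning: string; answer: string } {
  const pattern = new RegExp(`<${tagName}>([\\s\\S]*?)</${tagName}>`, "g");
  const reasoningParts: string[] = [];
  let match: RegExpExecArray | null;
  while ((match = pattern.exec(text)) !== null) {
    reasoningParts.push(match[1].trim());
  }
  // Everything outside the reasoning tags is treated as the final answer.
  const answer = text.replace(pattern, "").trim();
  return { reasoning: reasoningParts.join("\n"), answer };
}
```

For example, `splitReasoning("<think>2+2=4</think>The answer is 4.")` yields `{ reasoning: "2+2=4", answer: "The answer is 4." }`. Having the provider set the reasoning property automatically would remove the need for this per-model parsing logic in every app.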

Additional context
N/A