Conversation

@ilopezluna
Contributor

This pull request refactors how Ollama API options are mapped to the OpenAI-compatible request format in the HTTPHandler. The main improvement is the introduction of a dedicated helper function to centralize and streamline option mapping.

@ilopezluna ilopezluna marked this pull request as ready for review December 1, 2025 12:11
@ilopezluna ilopezluna requested a review from a team December 1, 2025 12:11

@sourcery-ai sourcery-ai bot left a comment


Hey there - I've reviewed your changes - here's some feedback:

  • Consider making the backendSpecificOptions list a package-level var or const (or at least a shared slice) so it’s not reallocated on every call to mapOllamaOptionsToOpenAI, especially if this path is hot.
  • The debug log inside the loop over backendSpecificOptions will emit a line per option per request; if this handler is high-traffic, you might want to either aggregate this logging, gate it behind a higher log level, or remove it to avoid log noise.
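The first suggestion above can be sketched as follows. This is a hypothetical rearrangement, not the PR's actual code: `mapOllamaOptionsToOpenAI` is simplified to plain maps, and only the option names are taken from the review's quoted code context.

```go
package main

import "fmt"

// Package-level slice: allocated once at init time instead of being
// rebuilt on every call. The option names mirror the list quoted in
// the review's code context.
var backendSpecificOptions = []string{
	"repeat_last_n",
	"typical_p",
	"min_p",
	"num_keep",
	"num_batch",
	"num_gpu",
	"main_gpu",
	"use_mmap",
	"num_thread",
}

// mapOllamaOptionsToOpenAI is a simplified stand-in for the PR's helper:
// it copies any recognized backend-specific option from the Ollama
// options map into the OpenAI-compatible request map.
func mapOllamaOptionsToOpenAI(opts, openAIReq map[string]any) {
	for _, key := range backendSpecificOptions {
		if v, ok := opts[key]; ok {
			openAIReq[key] = v // pass through; the backend decides whether to honor it
		}
	}
}

func main() {
	opts := map[string]any{"min_p": 0.05, "temperature": 0.7}
	req := map[string]any{}
	mapOllamaOptionsToOpenAI(opts, req)
	fmt.Println(len(req)) // prints 1: only the recognized backend-specific key is copied
}
```

Because the slice is never mutated, sharing a single package-level copy across goroutines is safe; a per-call literal only matters if this handler sits on a hot path, as the review notes.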
## Individual Comments

### Comment 1
<location> `pkg/ollama/http_handler.go:691-700` </location>
<code_context>
+
+	// Backend-specific options that may not be supported by all OpenAI-compatible backends
+	// We'll pass these through and let the backend decide whether to use them
+	backendSpecificOptions := []string{
+		"repeat_last_n",
+		"typical_p",
+		"min_p",
+		"num_keep",
+		"num_batch",
+		"num_gpu",
+		"main_gpu",
+		"use_mmap",
+		"num_thread",
+	}
+
</code_context>

<issue_to_address>
**issue (bug_risk):** Passing backend-specific options through as top-level fields may break stricter OpenAI-compatible backends

Forwarding `repeat_last_n`, `num_gpu`, `num_thread`, etc. as top-level `openAIReq` fields assumes all OpenAI-compatible backends ignore unknown keys. Some may instead reject the request. Consider either scoping this to specific backends, namespacing under a dedicated options field, or making the passthrough configurable so stricter backends can disable it.
</issue_to_address>
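The reviewer's alternatives can be sketched like this. Everything here is illustrative, not from the PR: the function name `applyBackendOptions`, the `extra_options` field, and the `allowPassthrough` flag are all assumed names showing one way to namespace the passthrough and make it configurable.

```go
package main

import "fmt"

// Hypothetical sketch of the review's suggestion: instead of copying
// backend-specific keys to the top level of the OpenAI request (where a
// strict backend may reject unknown fields), nest them under a single
// dedicated field and gate the passthrough behind a flag.
func applyBackendOptions(openAIReq, opts map[string]any, allowPassthrough bool) {
	if !allowPassthrough {
		return // strict backend: forward nothing non-standard
	}
	extra := map[string]any{}
	for _, key := range []string{"repeat_last_n", "typical_p", "min_p", "num_gpu", "num_thread"} {
		if v, ok := opts[key]; ok {
			extra[key] = v
		}
	}
	if len(extra) > 0 {
		// one well-known field is easier for a backend to ignore or reject
		// deliberately than a scattering of unknown top-level keys
		openAIReq["extra_options"] = extra
	}
}

func main() {
	opts := map[string]any{"num_gpu": 2}
	lenient := map[string]any{"model": "llama3"}
	strict := map[string]any{"model": "llama3"}
	applyBackendOptions(lenient, opts, true)
	applyBackendOptions(strict, opts, false)
	fmt.Println(len(lenient), len(strict)) // prints 2 1
}
```

The flag could be driven by per-backend configuration, which matches the review's suggestion of letting stricter backends disable the passthrough entirely.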


@ilopezluna ilopezluna merged commit badb86e into main Dec 2, 2025
9 checks passed
@ilopezluna ilopezluna deleted the map-ollama-options-to-openai branch December 2, 2025 12:26
