Add option to change the base_url of openai provider #58
base: main
Conversation
Walkthrough: A new optional base_url setting is introduced and threaded from the CLI and environment through the configuration into the OpenAI provider.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant CLI
    participant Config
    participant Main
    participant Provider
    participant OpenAIConfig
    User->>CLI: Provide --base-url (optional)
    CLI->>Config: Pass base_url (from CLI or env)
    Config->>Main: Build LumenConfig with base_url
    Main->>Provider: Call LumenProvider::new(..., base_url)
    Provider->>OpenAIConfig: Call OpenAIConfig::new(..., base_url)
    OpenAIConfig->>OpenAIConfig: Set api_base_url (custom or default)
```
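The flow in the diagram can be sketched as a minimal Rust fragment. The constructor shape mirrors the diagram, but the field names and the default endpoint shown here are assumptions, not the project's actual code:

```rust
// Sketch of the base_url flow from the diagram. Field names and the
// default endpoint are assumptions, not the project's actual code.
const DEFAULT_OPENAI_BASE_URL: &str = "https://api.openai.com/v1";

pub struct OpenAIConfig {
    pub api_key: String,
    pub model: String,
    pub api_base_url: String,
}

impl OpenAIConfig {
    // A Some(base_url) overrides the default endpoint; None keeps the default.
    pub fn new(api_key: String, model: String, base_url: Option<String>) -> Self {
        OpenAIConfig {
            api_key,
            model,
            api_base_url: base_url.unwrap_or_else(|| DEFAULT_OPENAI_BASE_URL.to_string()),
        }
    }
}
```

The key point is the last step of the diagram: api_base_url is set to the custom value when one was provided anywhere upstream, and falls back to the default otherwise.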
Actionable comments posted: 0
🔭 Outside diff range comments (1)
src/provider/mod.rs (1)
75-111: 💡 Verification agent 🧩 Analysis chain
Consider extending base_url support to other applicable providers.
While the OpenAI implementation is correct, other providers like Ollama (which often runs locally) and OpenRouter (which provides multiple endpoints) could also benefit from configurable base URLs. This would make the feature more comprehensive and the API more consistent.
Consider extending base_url support to providers that could benefit from it:
```diff
 ProviderType::Ollama => {
     let model = model.ok_or(LumenError::MissingModel("Ollama".to_string()))?;
-    let config = OllamaConfig::new(model);
+    let config = OllamaConfig::new(model, base_url);
     let provider = LumenProvider::Ollama(Box::new(OllamaProvider::new(client, config)));
     Ok(provider)
 }
```

Let me check which other providers could benefit from configurable base URLs:
🏁 Script executed:
```bash
#!/bin/bash
# Check the constructor signatures of other provider configs to see if they support base_url
fd -e rs . src/provider/ --exec grep -l "Config::new" {} \; | xargs grep -A 5 "pub fn new"
```

Length of output: 277

```bash
#!/bin/bash
# List each provider config file and its `new` constructor signature
grep -R -lE 'struct [A-Za-z0-9]+Config' src/provider | while read -r file; do
  echo "File: $file"
  grep -R -n 'pub fn new' "$file"
  echo
done
```
Extend base_url support to all providers
Currently only the OpenAI provider respects a custom base_url. To keep the API consistent, add base_url: Option<String> to each provider's config and pass it through in src/provider/mod.rs.

Providers to update:
- PhindConfig::new(model)
- GroqConfig::new(api_key, model)
- ClaudeConfig::new(api_key, model)
- OllamaConfig::new(model)
- OpenRouterConfig::new(api_key, model)
- DeepSeekConfig::new(api_key, model)
Example diff in src/provider/mod.rs:
```diff
 ProviderType::Ollama => {
     let model = model.ok_or(LumenError::MissingModel("Ollama".to_string()))?;
-    let config = OllamaConfig::new(model);
+    let config = OllamaConfig::new(model, base_url.clone());
     let provider = LumenProvider::Ollama(Box::new(OllamaProvider::new(client, config)));
     Ok(provider)
 }
```

Repeat the same pattern for Phind, Groq, Claude, OpenRouter, and DeepSeek, updating each Config::new signature and constructor call to accept and forward base_url.
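As a self-contained illustration of the pattern the review suggests, a hypothetical OllamaConfig might look like this. The struct fields and the default endpoint are assumptions for the sketch, not the project's actual code:

```rust
// Hypothetical OllamaConfig showing the same optional-base_url pattern
// suggested for the remaining providers. Fields and default are assumptions.
pub struct OllamaConfig {
    pub model: String,
    pub api_base_url: String,
}

impl OllamaConfig {
    pub fn new(model: String, base_url: Option<String>) -> Self {
        OllamaConfig {
            model,
            // Fall back to Ollama's conventional local endpoint when no
            // custom base URL is supplied.
            api_base_url: base_url.unwrap_or_else(|| "http://localhost:11434".to_string()),
        }
    }
}
```

Each remaining provider config would gain the same trailing Option<String> parameter, so the provider match arms in src/provider/mod.rs stay uniform.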
🧹 Nitpick comments (4)
src/config/cli.rs (1)
24-25: Consider adding provider-specific documentation to the help text.

The implementation is correct, but the help text should clarify that this option currently only affects the OpenAI provider to set user expectations properly.
Consider updating the help documentation:
```diff
-    #[arg(short = 'u', long = "base-url")]
+    #[arg(short = 'u', long = "base-url", help = "Custom base URL for OpenAI provider")]
```

src/main.rs (1)
36-36: Consider adding URL validation for the base_url parameter.

The implementation correctly passes the base_url to the provider constructor. However, consider validating that the base_url is a properly formatted URL before initialization to provide better error messages to users.
Consider adding URL validation:
```rust
// Before line 35, add validation if base_url is present
if let Some(ref url) = config.base_url {
    if url.parse::<reqwest::Url>().is_err() {
        return Err(LumenError::InvalidArguments(
            format!("Invalid base URL format: {}", url)
        ));
    }
}
```

src/provider/mod.rs (1)
66-66: Consider documenting that base_url only affects OpenAI provider.

The method signature accepts base_url for all provider types, but the parameter is currently only used by the OpenAI provider. Consider adding documentation to clarify this limitation.
Add documentation to the method:
```diff
 impl LumenProvider {
+    /// Creates a new LumenProvider instance.
+    /// Note: base_url parameter currently only affects OpenAI provider.
     pub fn new(
```

src/config/configuration.rs (1)
27-28: LGTM! Consider adding URL validation.

The field follows the established pattern for optional configuration fields. Consider adding URL validation to ensure the provided base_url is well-formed.
You could add validation in the build method:

```diff
 let base_url = cli.base_url.clone().or(config.base_url);
+
+// Validate base_url if provided
+if let Some(ref url) = base_url {
+    if url::Url::parse(url).is_err() {
+        return Err(LumenError::InvalidConfiguration(
+            format!("Invalid base_url: {}", url)
+        ));
+    }
+}
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
src/config/cli.rs (1 hunks)
src/config/configuration.rs (4 hunks)
src/main.rs (1 hunks)
src/provider/mod.rs (1 hunks)
src/provider/openai.rs (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/provider/mod.rs (1)
src/provider/openai.rs (2)
new (15-21)
new (30-32)
🔇 Additional comments (5)
src/provider/openai.rs (1)
15-21: LGTM! Clean implementation with appropriate fallback.

The implementation correctly handles the optional base_url parameter while maintaining backward compatibility with the default OpenAI endpoint. The use of unwrap_or_else with a closure is appropriate here.

src/provider/mod.rs (1)
69-74: LGTM! OpenAI provider correctly receives the base_url parameter.

The implementation properly passes the base_url parameter to the OpenAI configuration, enabling custom endpoint support for OpenAI-compatible APIs.
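The unwrap_or_else fallback the review approves of can be isolated as a small sketch: unlike unwrap_or, the closure defers building the default String until it is actually needed. The function name and default URL here are illustrative assumptions:

```rust
// Illustrative helper: resolve the effective base URL for the OpenAI
// provider. unwrap_or_else only allocates the default String when the
// caller passed None; unwrap_or would construct it eagerly every call.
pub fn pick_base_url(base_url: Option<String>) -> String {
    base_url.unwrap_or_else(|| "https://api.openai.com/v1".to_string())
}
```

For a cheap default like a string literal the difference is minor, but unwrap_or_else is the idiomatic choice whenever the fallback involves an allocation.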
src/config/configuration.rs (3)
85-87: LGTM! Environment variable naming is consistent.

The function follows the same pattern as other default functions and uses appropriate environment variable naming that aligns with the existing LUMEN_* convention.
126-126: LGTM! CLI override logic is correct.

The precedence logic correctly implements CLI args > config file > environment variable, which matches the expected behavior and is consistent with other configuration fields.
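That precedence (CLI args > config file > environment variable) is exactly what chained Option::or calls produce, since or keeps the first Some it meets. A minimal sketch with illustrative parameter names:

```rust
// Illustrative precedence resolution: the first Some value wins, in
// order CLI argument, then config file, then environment variable.
pub fn resolve_base_url(
    cli: Option<String>,
    file: Option<String>,
    env: Option<String>,
) -> Option<String> {
    cli.or(file).or(env)
}
```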
132-132: LGTM! Struct field integration is complete.

The base_url field is correctly included in both the build method result and the Default implementation, ensuring consistency across all configuration paths.

Also applies to: 158-158
This PR adds an option to change the OpenAI base URL (for use with proxies or local LLMs that expose an OpenAI-style API).
The CLI option is -u.
The environment variable is LUMEN_BASE_URL.
Sorry, I'm not proficient in Rust, so I'm not sure about the correct coding conventions or style.