Conversation

@kiwnix kiwnix commented May 31, 2025

This PR adds an option to change the OpenAI base URL (for use with proxies or local LLMs that expose an OpenAI-style API).

The CLI option is -u.
The environment variable is LUMEN_BASE_URL.

Sorry, I'm not proficient in Rust, so I'm not sure about the correct coding conventions or style.

Summary by CodeRabbit

  • New Features
    • Added support for specifying a custom API base URL via a new command-line option and configuration field.
    • Users can now set the API base URL using a CLI flag, configuration file, or environment variable.
    • The application will use the provided base URL for API requests, or default to the standard URL if none is set.

coderabbitai bot commented May 31, 2025

Walkthrough

A new optional base_url configuration was introduced and propagated throughout the application's configuration system, CLI, and provider initialization logic. This allows users to specify a custom API base URL via CLI, environment variable, or configuration file. Related constructors and methods were updated to accept and handle this new parameter.
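Concretely, the fallback described here boils down to a small pattern (a sketch only; the constant value and function name are assumptions, not the actual definitions in src/provider/openai.rs):

```rust
// Assumed default endpoint; the real constant in the codebase may differ.
const DEFAULT_API_BASE_URL: &str = "https://api.openai.com/v1/chat/completions";

// Resolve the effective base URL: use the caller-supplied override if present,
// otherwise fall back to the default endpoint.
fn resolve_base_url(base_url: Option<String>) -> String {
    base_url.unwrap_or_else(|| DEFAULT_API_BASE_URL.to_string())
}
```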

Changes

File(s) Change Summary
src/config/cli.rs Added base_url as an optional CLI argument to the Cli struct.
src/config/configuration.rs Added base_url to LumenConfig, defaulting from LUMEN_BASE_URL; updated build and Default.
src/main.rs Updated LumenProvider::new call to pass config.base_url as an additional parameter.
src/provider/mod.rs Modified LumenProvider::new to accept and forward an optional base_url parameter.
src/provider/openai.rs Updated OpenAIConfig::new to accept and use an optional base_url for api_base_url initialization.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant CLI
    participant Config
    participant Main
    participant Provider
    participant OpenAIConfig

    User->>CLI: Provide --base-url (optional)
    CLI->>Config: Pass base_url (from CLI or env)
    Config->>Main: Build LumenConfig with base_url
    Main->>Provider: Call LumenProvider::new(..., base_url)
    Provider->>OpenAIConfig: Call OpenAIConfig::new(..., base_url)
    OpenAIConfig->>OpenAIConfig: Set api_base_url (custom or default)

Poem

In the warren of code, a new path unfurls,
Now you can set your own base URLs!
From CLI to config, the value flows through,
To OpenAI’s door, it’s handed to you.
Hop, skip, and jump—custom endpoints in sight,
The code’s now more flexible, and oh, what delight!
🐇✨


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 0

🔭 Outside diff range comments (1)
src/provider/mod.rs (1)

75-111:

Consider extending base_url support to other applicable providers.

While the OpenAI implementation is correct, other providers like Ollama (which often runs locally) and OpenRouter (which provides multiple endpoints) could also benefit from configurable base URLs. This would make the feature more comprehensive and the API more consistent.

Consider extending base_url support to providers that could benefit from it:

            ProviderType::Ollama => {
                let model = model.ok_or(LumenError::MissingModel("Ollama".to_string()))?;
-                let config = OllamaConfig::new(model);
+                let config = OllamaConfig::new(model, base_url);
                let provider = LumenProvider::Ollama(Box::new(OllamaProvider::new(client, config)));
                Ok(provider)
            }

Let me check which other providers could benefit from configurable base URLs:


🏁 Script executed:

#!/bin/bash
# Check the constructor signatures of other provider configs to see if they support base_url
fd -e rs . src/provider/ --exec grep -l "Config::new" {} \; | xargs grep -A 5 "pub fn new"



#!/bin/bash
# List each provider config file and its `new` constructor signature
grep -R -lE 'struct [A-Za-z0-9]+Config' src/provider | while read -r file; do
  echo "File: $file"
  grep -R -n 'pub fn new' "$file"
  echo
done

Extend base_url support to all providers

Currently only the OpenAI provider respects a custom base_url. To keep the API consistent, add base_url: Option<String> to each provider’s config and pass it through in src/provider/mod.rs.

Providers to update:

  • PhindConfig::new(model)
  • GroqConfig::new(api_key, model)
  • ClaudeConfig::new(api_key, model)
  • OllamaConfig::new(model)
  • OpenRouterConfig::new(api_key, model)
  • DeepSeekConfig::new(api_key, model)

Example diff in src/provider/mod.rs:

 ProviderType::Ollama => {
     let model = model.ok_or(LumenError::MissingModel("Ollama".to_string()))?;
-    let config = OllamaConfig::new(model);
+    let config = OllamaConfig::new(model, base_url.clone());
     let provider = LumenProvider::Ollama(Box::new(OllamaProvider::new(client, config)));
     Ok(provider)
 }

Repeat the same pattern for Phind, Groq, Claude, OpenRouter, and DeepSeek—updating each Config::new signature and constructor call to accept and forward base_url.
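As a sketch of that pattern applied to Ollama (the struct fields and the default endpoint are assumptions for illustration, not the crate's actual definitions):

```rust
// Hypothetical OllamaConfig extended with an optional base_url, following the
// same fallback pattern as the OpenAI change in this PR.
struct OllamaConfig {
    model: String,
    api_base_url: String,
}

impl OllamaConfig {
    fn new(model: String, base_url: Option<String>) -> Self {
        Self {
            model,
            // Assumed default for a local Ollama instance.
            api_base_url: base_url
                .unwrap_or_else(|| "http://localhost:11434".to_string()),
        }
    }
}
```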

🧹 Nitpick comments (4)
src/config/cli.rs (1)

24-25: Consider adding provider-specific documentation to the help text.

The implementation is correct, but the help text should clarify that this option currently only affects the OpenAI provider to set user expectations properly.

Consider updating the help documentation:

-    #[arg(short = 'u', long = "base-url")]
+    #[arg(short = 'u', long = "base-url", help = "Custom base URL for OpenAI provider")]
src/main.rs (1)

36-36: Consider adding URL validation for the base_url parameter.

The implementation correctly passes the base_url to the provider constructor. However, consider validating that the base_url is a properly formatted URL before initialization to provide better error messages to users.

Consider adding URL validation:

// Before line 35, add validation if base_url is present
if let Some(ref url) = config.base_url {
    if url.parse::<reqwest::Url>().is_err() {
        return Err(LumenError::InvalidArguments(
            format!("Invalid base URL format: {}", url)
        ));
    }
}
src/provider/mod.rs (1)

66-66: Consider documenting that base_url only affects OpenAI provider.

The method signature accepts base_url for all provider types, but the parameter is currently only used by the OpenAI provider. Consider adding documentation to clarify this limitation.

Add documentation to the method:

impl LumenProvider {
+    /// Creates a new LumenProvider instance.
+    /// Note: base_url parameter currently only affects OpenAI provider.
    pub fn new(
src/config/configuration.rs (1)

27-28: LGTM! Consider adding URL validation.

The field follows the established pattern for optional configuration fields. Consider adding URL validation to ensure the provided base_url is well-formed.

You could add validation in the build method:

        let base_url = cli.base_url.clone().or(config.base_url);
+        
+        // Validate base_url if provided
+        if let Some(ref url) = base_url {
+            if url::Url::parse(url).is_err() {
+                return Err(LumenError::InvalidConfiguration(
+                    format!("Invalid base_url: {}", url)
+                ));
+            }
+        }
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between c9ae97a and 622c659.

📒 Files selected for processing (5)
  • src/config/cli.rs (1 hunks)
  • src/config/configuration.rs (4 hunks)
  • src/main.rs (1 hunks)
  • src/provider/mod.rs (1 hunks)
  • src/provider/openai.rs (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
src/provider/mod.rs (1)
src/provider/openai.rs (2)
  • new (15-21)
  • new (30-32)
🔇 Additional comments (5)
src/provider/openai.rs (1)

15-21: LGTM! Clean implementation with appropriate fallback.

The implementation correctly handles the optional base_url parameter while maintaining backward compatibility with the default OpenAI endpoint. The use of unwrap_or_else with a closure is appropriate here.

src/provider/mod.rs (1)

69-74: LGTM! OpenAI provider correctly receives the base_url parameter.

The implementation properly passes the base_url parameter to the OpenAI configuration, enabling custom endpoint support for OpenAI-compatible APIs.

src/config/configuration.rs (3)

85-87: LGTM! Environment variable naming is consistent.

The function follows the same pattern as other default functions and uses appropriate environment variable naming that aligns with the existing LUMEN_* convention.


126-126: LGTM! CLI override logic is correct.

The precedence logic correctly implements CLI args > config file > environment variable, which matches the expected behavior and is consistent with other configuration fields.
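That precedence can be expressed with Option::or chaining (a minimal sketch; the function name is hypothetical, and the real build method handles many other fields):

```rust
// Illustrates the precedence rule: CLI flag > config file > environment
// variable. Each source is an Option; the first Some wins.
fn effective_base_url(
    cli: Option<String>,
    file: Option<String>,
    env: Option<String>,
) -> Option<String> {
    cli.or(file).or(env)
}
```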


132-132: LGTM! Struct field integration is complete.

The base_url field is correctly included in both the build method result and the Default implementation, ensuring consistency across all configuration paths.

Also applies to: 158-158
