
luw2007 commented on Dec 19, 2025

🎯 Overview

Add support for customizing OpenAI API endpoint via OPENAI_BASE_URL environment variable, enabling users to:

  • Use OpenAI-compatible proxies and enterprise gateways
  • Point to local development services (Ollama, LocalAI, etc.)
  • Route through corporate network infrastructure

💡 Motivation

Many users need to:

  1. Use OpenAI through corporate proxies that require custom endpoints
  2. Develop locally with OpenAI-compatible services
  3. Test against staging/development OpenAI environments
  4. Route through network gateways for security/compliance

Currently, the OpenAI provider uses a hardcoded endpoint, so any of these setups requires code changes or a separate custom provider.

🔧 Solution

This PR adds minimal, backward-compatible support for endpoint customization:

Changes Made

  1. New constants (2 lines):

    • OPENAI_BASE_URL_ENV: Environment variable name
    • DEFAULT_OPENAI_API_URL: Renamed from OPENAI_API_URL
  2. New helper function (16 lines):

    • get_api_endpoint(): Resolves the URL from the environment variable or falls back to the default (see the sketch after this list)
  3. Request execution update (2 lines):

    • Modified to use dynamic endpoint
  4. Comprehensive tests (77 lines):

    • 7 unit tests covering all edge cases

Total: 96 lines added, 2 lines modified
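
For reference, here is a minimal sketch of the resolution logic described above. The constant and function names mirror the PR description, but this is an illustration of the intended behavior, not the exact code:

```rust
// Illustrative sketch only; the PR's actual implementation may differ.
const OPENAI_BASE_URL_ENV: &str = "OPENAI_BASE_URL";
const DEFAULT_OPENAI_API_URL: &str = "https://api.openai.com/v1/chat/completions";

fn get_api_endpoint() -> String {
    match std::env::var(OPENAI_BASE_URL_ENV) {
        // A set, non-blank base URL wins; trim a trailing slash so
        // "/v1/" and "/v1" resolve to the same endpoint.
        Ok(base) if !base.trim().is_empty() => {
            format!("{}/chat/completions", base.trim().trim_end_matches('/'))
        }
        // Unset, empty, or whitespace-only values fall back to the default.
        _ => DEFAULT_OPENAI_API_URL.to_string(),
    }
}
```

The request path (execute_openai_request()) then calls this helper instead of reading a hardcoded constant, which corresponds to the two-line change in item 3.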

Design Principles

  • Minimal changes: only three touch points in the provider code (plus tests)
  • 100% backward compatible: Default behavior unchanged
  • No breaking changes: Existing users unaffected
  • No logic changes: Pricing, validation, caching logic untouched

📊 Testing

All tests passing:

running 7 tests
test test_get_api_endpoint_default ... ok
test test_get_api_endpoint_custom_base_url ... ok
test test_get_api_endpoint_trailing_slash ... ok
test test_get_api_endpoint_empty_string ... ok
test test_get_api_endpoint_whitespace ... ok
test test_get_api_endpoint_https ... ok
test test_get_api_endpoint_http ... ok

test result: ok. 7 passed; 0 failed

Test Coverage

  • ✅ Default URL (no env var set)
  • ✅ Custom base URL
  • ✅ Trailing slash normalization
  • ✅ Empty string fallback
  • ✅ Whitespace-only fallback
  • ✅ HTTPS support
  • ✅ HTTP support (local dev)
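
To illustrate a couple of these cases, here is a hedged sketch. The resolve_endpoint helper below is a hypothetical, pure-function stand-in introduced only so the example avoids mutating process environment variables; the PR's actual tests exercise get_api_endpoint() against OPENAI_BASE_URL directly:

```rust
// Hypothetical pure stand-in for the env-var resolution, for illustration only.
fn resolve_endpoint(base: Option<&str>) -> String {
    const DEFAULT: &str = "https://api.openai.com/v1/chat/completions";
    match base {
        Some(b) if !b.trim().is_empty() => {
            format!("{}/chat/completions", b.trim().trim_end_matches('/'))
        }
        _ => DEFAULT.to_string(),
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn falls_back_to_default_when_unset_or_blank() {
        let default = "https://api.openai.com/v1/chat/completions";
        assert_eq!(resolve_endpoint(None), default);
        assert_eq!(resolve_endpoint(Some("")), default);
        assert_eq!(resolve_endpoint(Some("   ")), default);
    }

    #[test]
    fn normalizes_trailing_slash_on_custom_base() {
        assert_eq!(
            resolve_endpoint(Some("http://localhost:8080/v1/")),
            "http://localhost:8080/v1/chat/completions"
        );
    }
}
```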

📖 Usage Examples

Default OpenAI (unchanged)

export OPENAI_API_KEY="sk-..."
# Uses: https://api.openai.com/v1/chat/completions

Local proxy

export OPENAI_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="local-key"
# Uses: http://localhost:8080/v1/chat/completions

Enterprise gateway

export OPENAI_BASE_URL="https://gateway.company.com/openai/v1"
export OPENAI_API_KEY="company-key"
# Uses: https://gateway.company.com/openai/v1/chat/completions

🔍 Edge Cases Handled

Case                    Behavior
Env var not set         Use the default OpenAI URL
Empty string ""         Fall back to the default
Whitespace "  "         Fall back to the default
Trailing slash /v1/     Auto-normalized to /v1
HTTP vs. HTTPS          Both supported

⚠️ Important Notes

  1. Model validation still applies: This only changes the endpoint URL. The provider still validates OpenAI model names (gpt-4o, gpt-4.5, etc.)

  2. For non-OpenAI models: If you need to use models with different names (e.g., llama-3), consider using a separate custom provider or relaxing supports_model() (see the sketch after these notes).

  3. Embeddings not included: This PR only covers chat completions. If needed, a follow-up PR can add similar support to src/embedding/provider/openai.rs.
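
For note 2, a minimal sketch of what relaxing the model check could look like. supports_model() is named in the PR, but its real signature and allowlist are not shown here, so treat this as an assumption-laden illustration rather than the provider's actual code:

```rust
// Assumed signature and allowlist; the provider's real supports_model() may differ.
fn supports_model(model: &str) -> bool {
    // If a custom base URL is configured, accept any non-empty model name,
    // since the upstream gateway decides what it can serve.
    let custom_base = std::env::var("OPENAI_BASE_URL")
        .map(|v| !v.trim().is_empty())
        .unwrap_or(false);
    if custom_base {
        return !model.trim().is_empty();
    }
    // Otherwise keep the strict OpenAI allowlist (illustrative subset).
    matches!(model, "gpt-4o" | "gpt-4.5")
}
```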

📝 Checklist

  • Code changes are minimal and focused
  • All existing tests pass
  • New tests added for new functionality
  • Backward compatibility maintained
  • No breaking changes
  • Documentation included in commit message

🤝 Design Review

This implementation was designed after consulting with multiple AI models (Gemini and Codex) to ensure:

  • Minimal code complexity
  • Maximum backward compatibility
  • Clean separation of concerns
  • Proper edge case handling

Co-Authored-By: Claude Sonnet 4.5 [email protected]

Add support for customizing OpenAI API endpoint via OPENAI_BASE_URL
environment variable, enabling users to use OpenAI-compatible proxies,
enterprise gateways, or local services.

Changes:
- Add OPENAI_BASE_URL_ENV constant for environment variable name
- Add get_api_endpoint() helper function to resolve URL dynamically
- Update execute_openai_request() to use dynamic endpoint
- Add 7 comprehensive unit tests for URL resolution

Features:
- 100% backward compatible (default behavior unchanged)
- Automatic trailing slash normalization
- Empty/whitespace fallback to default URL
- Support both HTTP and HTTPS endpoints

Usage example:
  export OPENAI_BASE_URL="http://localhost:8080/v1"
  export OPENAI_API_KEY="your-key"

Co-Authored-By: Claude Sonnet 4.5 <[email protected]>