Add OpenRouter provider support for unified LLM API access #2
base: main
Conversation
Co-authored-by: malaksedarous <[email protected]>
Pull Request Overview
This PR adds OpenRouter as a new LLM provider to enable unified API access to multiple upstream models through a single API key. OpenRouter acts as a router that provides access to OpenAI, Anthropic, Google, and other models via an OpenAI-compatible API format.
- Implements an `OpenRouterProvider` class with native `fetch` for lightweight HTTP requests
- Adds OpenRouter configuration support with environment variables and validation
- Integrates OpenRouter into the existing provider factory system
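The factory integration described above is not shown in this overview; as a rough sketch (class, function, and type names here are assumptions, not the PR's actual identifiers), the new case might look like this:

```typescript
// Hypothetical sketch of the provider factory wiring (src/providers/factory.ts);
// the PR's actual class and method names may differ.
type ProviderName = "gemini" | "claude" | "openai" | "openrouter";

interface LLMProvider {
  name: ProviderName;
}

function createProvider(name: ProviderName, apiKey: string): LLMProvider {
  switch (name) {
    case "openrouter":
      // New case added by this PR: route requests through OpenRouter's unified API.
      return { name: "openrouter" };
    case "gemini":
    case "claude":
    case "openai":
      // Existing providers remain untouched.
      return { name };
    default:
      throw new Error(`Unknown LLM provider: ${name satisfies never}`);
  }
}

console.log(createProvider("openrouter", "sk-test").name); // "openrouter"
```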
Reviewed Changes
Copilot reviewed 5 out of 6 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `src/providers/openrouter.ts` | New OpenRouter provider implementation with error handling and optional branding headers |
| `src/providers/factory.ts` | Added OpenRouter case to provider factory for instantiation |
| `src/config/schema.ts` | Extended provider union type and config interface to include OpenRouter |
| `src/config/manager.ts` | Added OpenRouter environment variable support and validation logic |
| `test/openrouter.test.ts` | Comprehensive test suite covering all provider functionality and error cases |
- Added a `VALID_LLM_PROVIDERS` constant to avoid code duplication
- Updated `getLLMProvider()` and `validateConfiguration()` to use the constant
- Error messages are now generated dynamically from the constant
- Maintains the same functionality while improving maintainability
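A minimal sketch of the shared-constant approach described in this commit summary (exact names and signatures are assumptions, not the repository's actual code):

```typescript
// Hypothetical sketch of a shared provider constant (src/config/manager.ts).
// One array drives both the type and the validation error message, so adding
// a provider only requires touching this line.
const VALID_LLM_PROVIDERS = ["gemini", "claude", "openai", "openrouter"] as const;
type LLMProviderName = (typeof VALID_LLM_PROVIDERS)[number];

function validateProvider(value: string): LLMProviderName {
  if (!(VALID_LLM_PROVIDERS as readonly string[]).includes(value)) {
    // Error message generated dynamically from the constant.
    throw new Error(
      `Invalid LLM provider "${value}". Valid options: ${VALID_LLM_PROVIDERS.join(", ")}`
    );
  }
  return value as LLMProviderName;
}

console.log(validateProvider("openrouter")); // "openrouter"
```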
This PR adds OpenRouter as a first-class LLM provider, enabling users to route requests through OpenRouter's unified API to access multiple upstream models (OpenAI, Anthropic, Google, open models, etc.) with a single API key.
What's Added
**New OpenRouter Provider**: Implements `OpenRouterProvider` extending `BaseLLMProvider`, using native `fetch` for explicit control and a lighter footprint. Features include:
- Default model `openai/gpt-4o-mini` (cost-effective choice)
- Optional branding headers (`HTTP-Referer`, `X-Title`) configurable via environment variables

**Configuration Support**:
- Added `openrouter` to the provider union type in the schema
- New environment variables: `CONTEXT_OPT_LLM_PROVIDER=openrouter` and `CONTEXT_OPT_OPENROUTER_KEY`
- Added a `hasOpenrouterKey` boolean to configuration validation

**Factory Integration**: Updated `LLMProviderFactory` to create OpenRouter provider instances seamlessly alongside the existing providers (gemini, claude, openai).

**Comprehensive Testing**: Added 11 unit tests covering provider functionality and error cases.
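The provider's core request path could be sketched roughly as follows. This is an illustration of the approach described above (native `fetch`, optional branding headers), not the PR's actual code; the helper names and the `OPENROUTER_REFERER`/`OPENROUTER_TITLE` variable names are assumptions:

```typescript
// Illustrative sketch of an OpenRouter chat-completion call with native fetch.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build request headers, attaching the optional branding headers
// (HTTP-Referer, X-Title) only when they are configured.
function buildOpenRouterHeaders(
  apiKey: string,
  referer?: string,
  title?: string
): Record<string, string> {
  const headers: Record<string, string> = {
    Authorization: `Bearer ${apiKey}`,
    "Content-Type": "application/json",
  };
  if (referer) headers["HTTP-Referer"] = referer;
  if (title) headers["X-Title"] = title;
  return headers;
}

async function openRouterComplete(
  apiKey: string,
  messages: ChatMessage[],
  model = "openai/gpt-4o-mini" // cost-effective default named in this PR
): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: buildOpenRouterHeaders(
      apiKey,
      process.env.OPENROUTER_REFERER, // assumed env var names for illustration
      process.env.OPENROUTER_TITLE
    ),
    body: JSON.stringify({ model, messages }),
  });
  if (!res.ok) {
    throw new Error(`OpenRouter request failed: ${res.status} ${await res.text()}`);
  }
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```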
Usage
Technical Details
- Endpoint: `https://openrouter.ai/api/v1/chat/completions`
- Request parameters: `model`, `messages`, `temperature`, `max_tokens`
- No new dependencies (uses native `fetch`)

All existing functionality remains unchanged. The implementation follows the same patterns as the other providers and maintains backward compatibility.
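As a concrete illustration of the request fields listed above (the example values are placeholders; the provider may send additional fields):

```typescript
// Sketch of the OpenAI-compatible request body OpenRouter accepts.
interface OpenRouterRequestBody {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  temperature?: number;
  max_tokens?: number;
}

const body: OpenRouterRequestBody = {
  model: "openai/gpt-4o-mini",
  messages: [{ role: "user", content: "Summarize this context." }],
  temperature: 0.2,
  max_tokens: 256,
};

console.log(JSON.stringify(body));
```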
Fixes #1.
Warning
Firewall rules blocked me from connecting to one or more addresses
I tried to connect to the following addresses, but was blocked by firewall rules:
- `generativelanguage.googleapis.com`

If you need me to access, download, or install something from one of these locations, you can either: