ConduitLLM is a comprehensive LLM management and routing system that allows you to interact with multiple LLM providers through a unified interface. It provides advanced routing capabilities, virtual key management, and a web-based configuration UI.
- .NET 9.0 SDK or later (for local development)
- SQLite or PostgreSQL for database storage
- Docker (recommended for deployment)
- API keys for any LLM providers you plan to use (OpenAI, Anthropic, Cohere, Gemini, Fireworks, OpenRouter)
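If you want to confirm the prerequisites are in place before continuing, a quick check such as the following works (a minimal sketch; it only assumes the standard `dotnet` and `docker` CLIs are on your PATH):

```bash
# Verify the .NET SDK (9.0 or later) and Docker are available
dotnet --version
docker --version

# Optional: confirm Docker can actually run containers
docker run --rm hello-world
```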
- Pull the latest public Docker image:

```bash
docker pull ghcr.io/knnlabs/conduit:latest
```
- Create a directory for persistent data:

```bash
mkdir -p ./data
```
- Run the container (SQLite example):

```bash
docker run -d \
  -p 5000:5000 \
  -v $(pwd)/data:/data \
  -e DB_PROVIDER=sqlite \
  -e CONDUIT_SQLITE_PATH=/data/conduit.db \
  -e CONDUIT_MASTER_KEY=your_secure_master_key \
  ghcr.io/knnlabs/conduit:latest
```
For PostgreSQL, use:

```bash
docker run -d \
  -p 5000:5000 \
  -e DB_PROVIDER=postgres \
  -e CONDUIT_POSTGRES_CONNECTION_STRING="Host=yourhost;Port=5432;Database=conduitllm;Username=youruser;Password=yourpassword" \
  -e CONDUIT_MASTER_KEY=your_secure_master_key \
  ghcr.io/knnlabs/conduit:latest
```
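Once the container is running, you can send a quick request to confirm the gateway responds. This is only a sketch: it assumes the standard OpenAI-compatible chat completions path, that the master key (or a virtual key) is accepted as a Bearer token, and the model name `gpt-4o-mini` is just a placeholder for whatever model mapping you configure.

```bash
# Sanity check against the OpenAI-compatible endpoint (path and auth scheme assumed)
curl http://localhost:5000/v1/chat/completions \
  -H "Authorization: Bearer your_secure_master_key" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from ConduitLLM!"}]
      }'
```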
To run from source instead of Docker:

- Clone the repository:

```bash
git clone https://github.com/your-org/ConduitLLM.git
cd ConduitLLM
```
- Run the start script:

```bash
./start.sh
```
Alternatively, to build and run the solution manually:

- Clone the repository (if you haven't already):

```bash
git clone https://github.com/your-org/ConduitLLM.git
cd ConduitLLM
```
- Build the solution:

```bash
dotnet build
```
- Run the WebUI project:

```bash
cd ConduitLLM.WebUI
dotnet run
```
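The application reads its settings from environment variables; if you want to mirror the Docker SQLite example when running from source, a sketch like the following can be set before `dotnet run`. The assumption that these variables are honored identically in local runs is mine, so adjust names and values to your setup.

```bash
# Assumed: the same variables as the Docker example also configure local runs
export DB_PROVIDER=sqlite
export CONDUIT_SQLITE_PATH=./data/conduit.db
export CONDUIT_MASTER_KEY=your_secure_master_key

cd ConduitLLM.WebUI
dotnet run
```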
As of April 2025, ConduitLLM is distributed as two separate Docker images:
- WebUI image: the Blazor-based admin dashboard (`ConduitLLM.WebUI`)
- HTTP image: the OpenAI-compatible REST API gateway (`ConduitLLM.Http`)
Each image is published independently:
- `ghcr.io/knnlabs/conduit-webui:latest` (WebUI)
- `ghcr.io/knnlabs/conduit-http:latest` (API Gateway)
Create a `docker-compose.yml`:
```yaml
services:
  webui:
    image: ghcr.io/knnlabs/conduit-webui:latest
    ports:
      - "5001:8080"
    environment:
      # ... WebUI environment variables
  http:
    image: ghcr.io/knnlabs/conduit-http:latest
    ports:
      - "5000:8080"
    environment:
      # ... API Gateway environment variables
```
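If it helps to see those placeholder sections filled in, below is one possible expansion. It only reuses variables mentioned elsewhere in this guide (`CONDUIT_MASTER_KEY`, `DB_PROVIDER`, `CONDUIT_SQLITE_PATH`, `CONDUIT_API_BASE_URL`); the exact variables each image expects, and the internal URL the WebUI uses to reach the gateway, are assumptions to verify against your deployment.

```yaml
services:
  webui:
    image: ghcr.io/knnlabs/conduit-webui:latest
    ports:
      - "5001:8080"
    environment:
      CONDUIT_MASTER_KEY: your_secure_master_key
      # Assumed: points the WebUI at the API gateway service on the Compose network
      CONDUIT_API_BASE_URL: http://http:8080
  http:
    image: ghcr.io/knnlabs/conduit-http:latest
    ports:
      - "5000:8080"
    environment:
      CONDUIT_MASTER_KEY: your_secure_master_key
      DB_PROVIDER: sqlite
      CONDUIT_SQLITE_PATH: /data/conduit.db
    volumes:
      - ./data:/data
```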
Then start both services:

```bash
docker compose up -d
```
Or run them separately:

```bash
docker run -d --name conduit-webui -p 5001:8080 ghcr.io/knnlabs/conduit-webui:latest
docker run -d --name conduit-http -p 5000:8080 ghcr.io/knnlabs/conduit-http:latest
```
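Either way, a quick check that the containers came up is worthwhile before configuring anything. The container names below match the `docker run` examples above; Compose generates its own names, so adjust accordingly.

```bash
# List running Conduit containers and their port mappings
docker ps --filter "name=conduit"

# Tail the API gateway logs to confirm a clean startup
docker logs -f conduit-http
```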
Note: Update all deployment scripts and CI/CD workflows to use the new image tags. See `.github/workflows/docker-release.yml` for reference.
- Open your browser and navigate to the WebUI.
  - Local Development (`./start.sh`): http://localhost:5001
  - Docker/Deployed: access via the URL configured in the `CONDUIT_API_BASE_URL` environment variable (e.g., `https://conduit.yourdomain.com`), typically behind an HTTPS reverse proxy, or http://localhost:5000 if running locally.
- Navigate to the Configuration page to set up:
  - LLM providers (API keys and endpoints)
  - Model mappings
  - Global settings, including the master key
- Set up your first provider:
  - Select a provider (e.g., OpenAI)
  - Enter your API key (obtained from the provider's website)
  - Configure any additional provider-specific settings
The router allows you to distribute requests across different model deployments:
- Configure model deployments through the WebUI
- Select a routing strategy (simple, random, round-robin)
- Set up fallback configurations between models
- Explore the Architecture Overview to understand the system components
- Check the Configuration Guide for detailed configuration options
- See the API Reference for available endpoints
- Learn about Virtual Keys for managing access and budgets