The UI for the inference-gateway, providing a user-friendly interface to interact with and visualize inference results and manage models
Updated Apr 30, 2025 · TypeScript
An SDK written in Rust for the Inference Gateway
Extensive documentation of the inference-gateway
An SDK written in Python for the Inference Gateway
An SDK written in TypeScript for the Inference Gateway
An open-source, high-performance gateway unifying multiple LLM providers, from local solutions like Ollama to major cloud providers such as OpenAI, Groq, Cohere, Anthropic, Cloudflare and DeepSeek.
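Because the gateway unifies providers behind one interface, a client can send the same request shape whether the model is served locally by Ollama or by a cloud provider. The TypeScript sketch below illustrates that idea; it is a hypothetical example, and the endpoint path, model identifier, and payload field names are assumptions for illustration (modeled on common OpenAI-compatible APIs), not the gateway's documented API.

```typescript
// Hypothetical client sketch for a unified LLM gateway.
// Assumption: the gateway exposes an OpenAI-compatible chat endpoint;
// the path "/v1/chat/completions" and field names are illustrative only.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
}

// The same request shape is used for every backend; only the model
// identifier changes (e.g. a local Ollama model vs. a cloud provider).
function buildChatRequest(model: string, messages: ChatMessage[]): ChatRequest {
  return { model, messages };
}

const body = buildChatRequest("ollama/llama3", [
  { role: "user", content: "Hello from the gateway" },
]);

// Sending the request would look like this (not executed here):
// await fetch("http://localhost:8080/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });

console.log(JSON.stringify(body));
```

Swapping `"ollama/llama3"` for a hypothetical cloud model identifier would leave the rest of the client code unchanged, which is the point of routing through a single gateway.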