Customize and Extend Claude Code with ccproxy: Route to OpenAI, Gemini, Qwen, OpenRouter, and Ollama. Gain full control of your Claude Max/Pro Subscription with your own router.
Updated Nov 21, 2025 - Python
Setup scripts for using TensorBlock Forge with Claude Code - access any AI model through Claude's interface
An AI chat proxy with universal tool access, protocol conversion, load balancing, key isolation, prompt enhancement, centralized MCP hub, and built-in WebSearch & WebFetch — more than an AI assistant for chat, translation, mind maps, flowcharts, and search.
Experimental application that integrates Spring AI and CodeGate
Lightweight AI inference gateway - local model registry & parameter transformer (Python SDK) - with optional Envoy proxy processor and FastAPI registry server deployment options.
A lightweight proxy for LLM API calls with guardrails, metrics, and monitoring. A vibe coding experiment.
AI Proxy Server - A high-performance, secure unified API gateway for multiple LLM providers (OpenAI, Gemini, Groq, OpenRouter, Cloudflare) with intelligent routing, rate limiting, and streaming support. Features modular architecture, enhanced security, and optimized performance.
🚀 Enterprise AI Proxy: Claude Code SDK + LiteLLM integration with AWS masking, Redis persistence, TDD architecture. Complete headless mode support for production deployment.
Low-cost relay APIs: why relay APIs are cheap, why AI relay platforms can offer low prices, and recommended API relay platforms — low-cost ChatGPT relay API, OpenAI API relay, Gemini relay API, Claude relay API.
Hybrid AI routing: LOCAL Ollama + CLOUD GitHub Copilot
PoC to test the feasibility of a simple AI proxy with monitoring.
OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.
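Several of the projects above advertise automatic failover across multiple LLM providers. The core idea — try each configured upstream in order and return the first successful response — can be sketched as below. The provider names, the `healthy` flag, and the helper functions are illustrative assumptions for the sketch, not the API of any specific project on this page.

```python
# Minimal failover sketch: iterate over providers, return the first success.
# Provider entries here are stand-ins for real upstream clients/endpoints.
PROVIDERS = [
    {"name": "anthropic", "healthy": False},   # simulated outage
    {"name": "gemini", "healthy": True},
    {"name": "cloudflare", "healthy": True},
]

def call_provider(provider, prompt):
    """Simulated upstream call: raises when the provider is down."""
    if not provider["healthy"]:
        raise ConnectionError(f"{provider['name']} unavailable")
    return f"{provider['name']}: echo {prompt}"

def chat_with_failover(prompt, providers=PROVIDERS):
    """Try each provider in order; fall through to the next on failure."""
    last_error = None
    for provider in providers:
        try:
            return call_provider(provider, prompt)
        except ConnectionError as err:
            last_error = err  # remember the failure, try the next upstream
    raise RuntimeError("all providers failed") from last_error
```

A real gateway would layer rate limiting, key rotation, and health-check-driven reordering on top of this loop, but the fallback chain itself is this simple.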