Build mods for Claude Code: Hook any request, modify any response, /model "with-your-custom-model", intelligent model routing using your logic or ours
Setup scripts for using TensorBlock Forge with Claude Code - access any AI model through Claude's interface
🧬 The adaptive model routing system for exploration and exploitation.
An AI chat proxy with universal tool access, protocol conversion, load balancing, key isolation, prompt enhancement, a centralized MCP hub, and built-in WebSearch & WebFetch; more than just an AI assistant for chat, translation, mind maps, flowcharts, and search.
OpenAI-compatible AI proxy: Anthropic Claude, Google Gemini, GPT-5, Cloudflare AI. Free hosting, automatic failover, token rotation. Deploy in 1 minute.
Experimental application that integrates Spring AI and CodeGate
Lightweight AI inference gateway - local model registry & parameter transformer (Python SDK) - with optional Envoy proxy processor and FastAPI registry server deployment options.
Bridge your local AI tools to the web. Exposes Claude Code CLI, Gemini CLI, and Antigravity sessions as a standard API, allowing you to use these accounts with any OpenAI/Claude/Gemini-compatible API client.
A lightweight proxy for LLM API calls with guardrails, metrics, and monitoring. A vibe coding experiment.
A unified AI proxy server for free access to multiple LLM providers through the Puter.js SDK - no expensive API keys needed!
AI Proxy Server - a high-performance, secure, unified API gateway for multiple LLM providers (OpenAI, Gemini, Groq, OpenRouter, Cloudflare) with intelligent routing, rate limiting, and streaming support, built on a modular architecture (see the client sketch after this list).
OpenAI-compatible proxy for OpenClaw agents. Enables seamless use of your local OpenClaw assistant in web UIs like OpenWebUI, SillyTavern, and LM Studio with full streaming, Docker support, and secure deployment.
Hybrid AI routing: LOCAL Ollama + CLOUD GitHub Copilot (see the routing sketch after this list)
🚀 Use Claude Code CLI with Gemini, GPT-5, Grok & 20+ AI models — Multi-account load balancing, real-time dashboard, per-window model switching
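Several of the entries above share the same client-side contract: you point a standard OpenAI client at the proxy's base URL and let the gateway handle provider selection, keys, failover, and streaming. A minimal sketch of that usage, assuming a hypothetical proxy listening at http://localhost:8080/v1 that accepts a placeholder API key and a model name it knows how to route (none of these values come from a specific repo above):

```python
# Minimal sketch: calling an OpenAI-compatible proxy/gateway.
# Assumptions (hypothetical, not taken from any repo above): the proxy listens
# at http://localhost:8080/v1, accepts any API key, and maps the requested
# model name to a real backend provider.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical proxy endpoint
    api_key="proxy-key",                  # placeholder; the proxy holds the real provider keys
)

response = client.chat.completions.create(
    model="claude-sonnet-4-5",  # the proxy decides which provider serves this name
    messages=[{"role": "user", "content": "Say hello from behind the proxy."}],
)
print(response.choices[0].message.content)
```

Because only `base_url` and `api_key` change, the same client code works unmodified whether the gateway forwards to OpenAI, Anthropic, Gemini, or a local model.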
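The routing-focused entries (hybrid local/cloud routing, multi-account load balancing, adaptive model selection) all reduce to a per-request decision about which backend and model should serve a prompt. A purely illustrative sketch of that decision, assuming two hypothetical backends and a toy rule that keeps short prompts on a local model and sends longer ones to a cloud model:

```python
# Purely illustrative routing sketch (not taken from any repo above).
# Assumption: two hypothetical backends, a local Ollama model and a cloud model,
# each reachable through its own OpenAI-compatible base URL.
from dataclasses import dataclass

@dataclass(frozen=True)
class Backend:
    name: str
    base_url: str
    model: str

LOCAL = Backend("ollama", "http://localhost:11434/v1", "llama3.1")
CLOUD = Backend("cloud", "https://api.example.com/v1", "gpt-4o")

def route(prompt: str, prefer_local_under: int = 500) -> Backend:
    """Toy routing rule: short prompts stay local, longer ones go to the cloud."""
    return LOCAL if len(prompt) < prefer_local_under else CLOUD

backend = route("Summarize this short note.")
print(f"routing to {backend.name} ({backend.model}) at {backend.base_url}")
```

Real routers typically weigh cost, latency, context length, and per-key rate limits rather than prompt length alone; the sketch only shows the shape of the decision point.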