Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
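As a quick illustration of the "OpenAI format" claim above, here is a minimal sketch of calling LiteLLM's `completion()` against two different providers. The model names and API keys are placeholders, not requirements of this listing.

```python
# Minimal sketch: one OpenAI-style call shape for many providers via LiteLLM.
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder; set your real key

messages = [{"role": "user", "content": "Say hello in one sentence."}]

# The call shape stays the same; only the model string changes per provider.
response = completion(model="gpt-4o", messages=messages)
print(response.choices[0].message.content)

# e.g. the same request routed to Anthropic (requires ANTHROPIC_API_KEY):
# response = completion(model="claude-3-5-sonnet-20240620", messages=messages)
```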
A model-driven approach to building AI agents in just a few lines of code.
The most accurate document search and store for building AI apps
Evaluate your LLM's response with Prometheus and GPT-4 💯
A set of tools that gives agents powerful capabilities.
Claude Code settings, commands and agents for vibe coding
Agent samples built using the Strands Agents SDK.
A website where you can compare every AI Model ✨
An example agent demonstrating streaming, tool use, and interactivity from your terminal. This agent builder can help you build your own agents and tools.
Co-create PowerPoint presentations with AI
This MCP server provides documentation about Strands Agents to your GenAI tools, so you can use your favorite AI coding assistant to vibe-code Strands Agents.
GPT-5 adapter for Claude Code CLI + boilerplate for custom LiteLLM servers with LibreChat UI - a dual-purpose repo.
Documentation for the Strands Agents SDK. A model-driven approach to building AI agents in just a few lines of code.
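The Strands Agents entries above advertise a model-driven approach "in just a few lines of code"; the sketch below shows what that typically looks like, assuming the `strands-agents` package (imported as `strands`) with its quick-start `Agent` class and `@tool` decorator. The custom tool and prompt are illustrative only.

```python
# Minimal sketch of the Strands Agents quick-start style (assumed API:
# `strands` package exposing Agent and a @tool decorator).
from strands import Agent, tool


@tool
def word_count(text: str) -> int:
    """Hypothetical custom tool: count the words in a piece of text."""
    return len(text.split())


# The agent is model-driven: the model decides when to invoke the tool.
agent = Agent(tools=[word_count])

# Calling the agent with a prompt runs its reasoning loop and returns a result.
agent("How many words are in the sentence 'Strands keeps agents simple'?")
```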