LiteLLM

Open-source unified API for 100+ LLM providers with OpenAI compatibility

LiteLLM is an open-source Python library and proxy server that provides a unified OpenAI-compatible API across 100+ LLM providers including Anthropic, Google, Azure, Bedrock, and more. It adds load balancing, cost tracking, rate limiting, and fallbacks. Developers and enterprises use LiteLLM to standardize LLM calls across their codebase and manage multi-provider AI infrastructure.
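A minimal sketch of what the unified interface looks like in practice, assuming the `litellm` package is installed and the relevant provider API key (e.g. `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`) is set in the environment; model strings below are illustrative:

```python
# Sketch of LiteLLM's unified completion call (assumes `litellm` is
# installed and provider API keys are configured in the environment).

def ask(model: str, prompt: str) -> str:
    """One call shape for every provider; only the model string changes."""
    from litellm import completion  # deferred import: litellm is an assumed dependency

    response = completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    # Responses follow the OpenAI schema regardless of the backing provider.
    return response.choices[0].message.content

# The provider is selected by the model-string prefix, e.g.:
#   ask("gpt-4o", "Hello")
#   ask("anthropic/claude-3-5-sonnet-20240620", "Hello")
#   ask("gemini/gemini-1.5-pro", "Hello")
```

Because every response comes back in the OpenAI schema, downstream code that reads `response.choices[0].message.content` does not need per-provider branches.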

Key Features

  • 100+ LLM provider support
  • OpenAI-compatible API
  • Load balancing and fallbacks
  • Cost and usage tracking
  • Self-hosted proxy server
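The load balancing and fallback features are typically configured on the self-hosted proxy via a YAML file. A hedged sketch, with illustrative model names and deployment aliases (check LiteLLM's documentation for the exact schema your version supports):

```yaml
# Illustrative proxy config: two deployments share one client-facing alias,
# so the proxy can load-balance between them.
model_list:
  - model_name: gpt-4o                 # alias that clients request
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4o                 # same alias -> second deployment
    litellm_params:
      model: azure/my-gpt4o-deployment # hypothetical Azure deployment name
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY

router_settings:
  routing_strategy: simple-shuffle     # spread traffic across deployments
```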
Tags: LLM infrastructure, open source, multi-model, API gateway, developer tools

Get Started

Visit LiteLLM
Free
Completely free to use

Quick Info

Category
Code & Development
Pricing
Free
