OneLLM Documentation
A “drop-in” replacement for OpenAI’s client that offers a unified interface for interacting with large language models from various providers, with support for hundreds of models, built-in fallback mechanisms, and enhanced reliability features.
This project was created by Ran Aroussi, and is released under the Apache 2.0 license.
Support this project by starring it on GitHub
More stars → more visibility → more contributors → better features → more robust tool for everyone 🎉
Welcome to the OneLLM documentation! OneLLM is a unified interface for 300+ LLMs across 18+ providers, designed as a drop-in replacement for the OpenAI Python client.
🚀 Get Started
- Installation - Install OneLLM in seconds
- Quick Start - Your first OneLLM script
- Configuration - Set up API keys and options
- Provider Setup - Configure your providers
🌟 Key Features
- Drop-in OpenAI Replacement: Use the same code with 300+ models
- Unified Interface: One API for all providers (OpenAI, Anthropic, Google, etc.)
- Smart Fallbacks: Automatic failover between providers (sketched after this list)
- Type Safety: Full type hints and IDE support
- Async Support: Native async/await capabilities (see the async sketch in the Example section)
- Provider Agnostic: Switch models with just a string change (sketched after this list)
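The "Smart Fallbacks" and "Provider Agnostic" bullets are easiest to see in code. The sketch below uses the same `provider/model` string format as the example further down; the specific Anthropic and Google model identifiers and the `fallback_models` keyword are illustrative assumptions, so confirm the exact names in the Provider Capabilities and Advanced Features pages.

```python
# Minimal sketch of provider switching and fallbacks.
# Assumptions: the Anthropic/Google model identifiers and the
# `fallback_models` keyword are illustrative -- confirm them in the
# Advanced Features and Provider Capabilities docs.
from onellm import ChatCompletion

messages = [{"role": "user", "content": "Hello!"}]

# Switching providers is just a different "provider/model" string:
for model in ["openai/gpt-4", "anthropic/claude-3-opus", "google/gemini-pro"]:
    response = ChatCompletion.create(model=model, messages=messages)
    print(model, "->", response.choices[0].message["content"])

# Automatic failover between providers (assumed `fallback_models` parameter):
response = ChatCompletion.create(
    model="openai/gpt-4",
    messages=messages,
    fallback_models=["anthropic/claude-3-opus", "google/gemini-pro"],
)
```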
📖 Documentation
Core Concepts
- Architecture - How OneLLM works under the hood
- Provider System - Understanding providers and models
- Error Handling - Handling errors gracefully
- Advanced Features - Fallbacks, retries, and more
API Reference
- Client API - OpenAI-compatible client interface
- Chat Completions - Chat completion methods
Providers
- Available Providers - List of supported providers
- Provider Capabilities - Feature support matrix
- Azure OpenAI - Azure-specific configuration
- AWS Bedrock - Bedrock setup guide
- Local Models - Run models locally
💡 Example
```python
from onellm import ChatCompletion

# Basic usage (identical to OpenAI's client)
response = ChatCompletion.create(
    model="openai/gpt-4",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

print(response.choices[0].message["content"])
```
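For the async path mentioned under Key Features, the sketch below assumes an `acreate` coroutine that mirrors the synchronous `create` call (as in the pre-1.0 OpenAI client this interface follows); check the Client API reference for the actual async entry point.

```python
import asyncio

from onellm import ChatCompletion

# Assumed async entry point: `acreate`, mirroring the sync `create` call.
async def main() -> None:
    response = await ChatCompletion.acreate(
        model="openai/gpt-4",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message["content"])

asyncio.run(main())
```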
🤝 Support
- GitHub Issues - Report bugs or request features
- Discussions - Ask questions and share ideas