Call all LLM APIs using the OpenAI format (Anthropic, Hugging Face, Cohere, Azure OpenAI, etc.). LiteLLM also supports streaming the model response back: pass stream=True to get a streaming iterator in the response. Streaming is supported for OpenAI, Azure, Anthropic, and Hugging Face models.
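As a minimal sketch (assuming the `litellm` package is installed and the relevant provider API key is exported; the model name is illustrative), a streaming call looks like:

```python
import os

# OpenAI-format chat messages work for every provider LiteLLM routes to.
messages = [{"role": "user", "content": "Write a haiku about clouds."}]

# Only attempt the network call when a key is available; the call shape,
# not the live request, is the point of this sketch.
if os.environ.get("OPENAI_API_KEY"):
    from litellm import completion

    # stream=True returns an iterator of incremental chunks
    # instead of a single completed response.
    for chunk in completion(model="gpt-3.5-turbo", messages=messages, stream=True):
        print(chunk)
```

Without stream=True, the same call returns one complete response object.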
Features
- Translating inputs to the provider's completion and embedding endpoints
- Consistent output: a text response is always available in the same place in the response object
- Exception mapping: common exceptions across providers are mapped to the OpenAI exception types
- LiteLLM Client: debugging and one-click addition of new LLMs
- Streaming: pass stream=True to receive the model response back as an iterator of chunks
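The first two features can be sketched together: the same `completion()` call shape covers multiple providers, and the reply text sits at the same path in each response. This sketch assumes `litellm` is installed, the provider keys are exported, and the model names are illustrative:

```python
import os

messages = [{"role": "user", "content": "Say hello in one word."}]


def ask(model: str) -> str:
    """Same OpenAI-format call regardless of which backend the model string names."""
    from litellm import completion

    response = completion(model=model, messages=messages)
    # Consistent output: the text is always at choices[0].message.content.
    return response["choices"][0]["message"]["content"]


if os.environ.get("OPENAI_API_KEY"):
    print(ask("gpt-3.5-turbo"))     # routed to OpenAI
if os.environ.get("ANTHROPIC_API_KEY"):
    print(ask("claude-instant-1"))  # same call shape, routed to Anthropic
```

Because errors are mapped to the OpenAI exception types, one except clause can handle the equivalent failure from any provider.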
License
MIT License