diff --git a/README.md b/README.md
index 306f07ec2..2153ae948 100644
--- a/README.md
+++ b/README.md
@@ -11,7 +11,7 @@
 
 Call all LLM APIs using the OpenAI format [Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq etc.]
 
-OpenAI Proxy Server | Hosted Proxy (Preview) | Enterprise Tier
+LiteLLM Proxy Server | Hosted Proxy (Preview) | Enterprise Tier
 
 PyPI Version
@@ -35,7 +35,7 @@ LiteLLM manages:
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Set Budgets & Rate limits per project, api key, model [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Set Budgets & Rate limits per project, api key, model [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 
 [**Jump to OpenAI Proxy Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs)
[**Jump to Supported LLM Providers**](https://github.com/BerriAI/litellm?tab=readme-ov-file#supported-providers-docs) diff --git a/cookbook/litellm_router/error_log.txt b/cookbook/litellm_router/error_log.txt index 6853ef465..983b47cbb 100644 --- a/cookbook/litellm_router/error_log.txt +++ b/cookbook/litellm_router/error_log.txt @@ -1,10 +1,10 @@ -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -21,13 +21,13 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -49,7 +49,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -61,7 +61,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -70,7 +70,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -79,7 +79,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -109,7 +109,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. 
Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -128,7 +128,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -148,7 +148,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -162,7 +162,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -174,7 +174,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -184,7 +184,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -193,19 +193,19 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -214,7 +214,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. 
Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -234,7 +234,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -244,7 +244,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -253,7 +253,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -267,31 +267,31 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -305,7 +305,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -330,7 +330,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. 
Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -339,7 +339,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -360,7 +360,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -369,7 +369,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -378,7 +378,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -388,7 +388,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -409,7 +409,7 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -422,13 +422,13 @@ Exception: Expecting value: line 1 column 1 (char 0) Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -438,7 +438,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. 
Exception: Expecting value: line 1 column 1 (char 0) -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -462,7 +462,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -482,7 +482,7 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -492,7 +492,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -516,7 +516,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -529,7 +529,7 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -546,13 +546,13 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -580,13 +580,13 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -624,7 +624,7 @@ Question: Given this context, what is litellm? 
LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -638,13 +638,13 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -660,7 +660,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -681,7 +681,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -691,31 +691,31 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -771,7 +771,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -780,7 +780,7 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. 
Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -800,7 +800,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -820,7 +820,7 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -830,7 +830,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -840,7 +840,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -850,7 +850,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -862,13 +862,13 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -877,7 +877,7 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -898,7 +898,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. 
Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -919,7 +919,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -936,19 +936,19 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -961,25 +961,25 @@ Exception: 'Response' object has no attribute 'get' Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 @@ -993,7 +993,7 @@ Question: Given this context, what is litellm? LiteLLM about: About Call all LLM APIs using the OpenAI format. Exception: 'Response' object has no attribute 'get' -Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server +Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server LiteLLM Server manages: Calling 10 diff --git a/cookbook/litellm_router/request_log.txt b/cookbook/litellm_router/request_log.txt index 0aed74904..821d87ab5 100644 --- a/cookbook/litellm_router/request_log.txt +++ b/cookbook/litellm_router/request_log.txt @@ -20,7 +20,7 @@ Call all LLM APIs using the OpenAI format. 
 Response ID: 52dbbd49-eedb-4c11-8382-3ca7deb1af35 Url: /queue/response/52dbbd49-eedb-4c11-8382-3ca7deb1af35
 Time: 3.50 seconds
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 
 Calling 10
@@ -35,7 +35,7 @@ Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. C
 Response ID: ae1e2b71-d711-456d-8df0-13ce0709eb04 Url: /queue/response/ae1e2b71-d711-456d-8df0-13ce0709eb04
 Time: 5.60 seconds
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 
 Calling 10
diff --git a/cookbook/litellm_router/test_questions/question3.txt b/cookbook/litellm_router/test_questions/question3.txt
index a12278750..d6006f9c7 100644
--- a/cookbook/litellm_router/test_questions/question3.txt
+++ b/cookbook/litellm_router/test_questions/question3.txt
@@ -1,4 +1,4 @@
-What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 
 Calling 100+ LLMs Huggingface/Bedrock/TogetherAI/etc. in the OpenAI ChatCompletions & Completions format
diff --git a/docs/my-website/docs/budget_manager.md b/docs/my-website/docs/budget_manager.md
index 1a2c7e7ee..6bea96ef9 100644
--- a/docs/my-website/docs/budget_manager.md
+++ b/docs/my-website/docs/budget_manager.md
@@ -7,14 +7,14 @@ Don't want to get crazy bills because either while you're calling LLM APIs **or*
 
 :::info
 
-If you want a server to manage user keys, budgets, etc. use our [OpenAI Proxy Server](./proxy/virtual_keys.md)
+If you want a server to manage user keys, budgets, etc. use our [LiteLLM Proxy Server](./proxy/virtual_keys.md)
 
 :::
 
 LiteLLM exposes:
 * `litellm.max_budget`: a global variable you can use to set the max budget (in USD) across all your litellm calls. If this budget is exceeded, it will raise a BudgetExceededError
 * `BudgetManager`: A class to help set budgets per user. BudgetManager creates a dictionary to manage the user budgets, where the key is user and the object is their current cost + model-specific costs.
-* `OpenAI Proxy Server`: A server to call 100+ LLMs with an openai-compatible endpoint. Manages user budgets, spend tracking, load balancing etc.
+* `LiteLLM Proxy Server`: A server to call 100+ LLMs with an openai-compatible endpoint. Manages user budgets, spend tracking, load balancing etc.
 
 ## quick start
diff --git a/docs/my-website/docs/index.md b/docs/my-website/docs/index.md
index 6b472ee6c..a560ecf76 100644
--- a/docs/my-website/docs/index.md
+++ b/docs/my-website/docs/index.md
@@ -10,11 +10,11 @@ https://github.com/BerriAI/litellm
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Track spend & set budgets per project [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Track spend & set budgets per project [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 
 ## How to use LiteLLM
 You can use litellm through either:
-1. [OpenAI proxy Server](#openai-proxy) - Server to call 100+ LLMs, load balance, cost tracking across projects
+1. [LiteLLM Proxy Server](#openai-proxy) - Server to call 100+ LLMs, load balance, cost tracking across projects
 2. [LiteLLM python SDK](#basic-usage) - Python Client to call 100+ LLMs, load balance, cost tracking
 
 ## LiteLLM Python SDK
diff --git a/docs/my-website/docs/proxy/deploy.md b/docs/my-website/docs/proxy/deploy.md
index 35fc0a508..c7617196e 100644
--- a/docs/my-website/docs/proxy/deploy.md
+++ b/docs/my-website/docs/proxy/deploy.md
@@ -246,7 +246,7 @@ helm install lite-helm ./litellm-helm
 kubectl --namespace default port-forward $POD_NAME 8080:$CONTAINER_PORT
 ```
 
-Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
+Your LiteLLM Proxy Server is now running on `http://127.0.0.1:4000`.
@@ -301,7 +301,7 @@ docker run \
 --config /app/config.yaml --detailed_debug
 ```
 
-Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.
+Your LiteLLM Proxy Server is now running on `http://0.0.0.0:4000`.
@@ -399,7 +399,7 @@ kubectl apply -f /path/to/service.yaml
 kubectl port-forward service/litellm-service 4000:4000
 ```
 
-Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.
+Your LiteLLM Proxy Server is now running on `http://0.0.0.0:4000`.
@@ -441,7 +441,7 @@ kubectl \
 4000:4000
 ```
 
-Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
+Your LiteLLM Proxy Server is now running on `http://127.0.0.1:4000`.
 
 If you need to set your litellm proxy config.yaml, you can find this in [values.yaml](https://github.com/BerriAI/litellm/blob/main/deploy/charts/litellm-helm/values.yaml)
@@ -486,7 +486,7 @@ helm install lite-helm ./litellm-helm
 kubectl --namespace default port-forward $POD_NAME 8080:$CONTAINER_PORT
 ```
 
-Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
+Your LiteLLM Proxy Server is now running on `http://127.0.0.1:4000`.
diff --git a/docs/my-website/docs/proxy_server.md b/docs/my-website/docs/proxy_server.md
index ef9352ab1..0d08db744 100644
--- a/docs/my-website/docs/proxy_server.md
+++ b/docs/my-website/docs/proxy_server.md
@@ -1,7 +1,7 @@
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
 
-# [OLD PROXY 👉 [NEW proxy here](./simple_proxy)] Local OpenAI Proxy Server
+# [OLD PROXY 👉 [NEW proxy here](./simple_proxy)] Local LiteLLM Proxy Server
 
 A fast, and lightweight OpenAI-compatible server to call 100+ LLM APIs.
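The deploy docs above all end on the line "Your LiteLLM Proxy Server is now running on ...". A minimal sketch of calling such a proxy from Python with the standard `openai` SDK; the base URL and the `sk-1234` key are placeholders for whatever the actual deployment exposes, not values taken from this diff:

```python
# Minimal sketch: call a running LiteLLM Proxy Server with the OpenAI SDK.
# Assumes a proxy (e.g. from the deploy docs) is listening on http://0.0.0.0:4000
# and that "sk-1234" is a placeholder key that proxy accepts.
import openai

client = openai.OpenAI(
    api_key="sk-1234",               # virtual key issued by the proxy (placeholder)
    base_url="http://0.0.0.0:4000",  # point the SDK at the proxy instead of api.openai.com
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any model name routed by the proxy's config.yaml
    messages=[{"role": "user", "content": "what llm are you"}],
)
print(response.choices[0].message.content)
```

Because the proxy speaks the OpenAI wire format, the same request also works with plain `curl` against the proxy's `/chat/completions` route.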
diff --git a/docs/my-website/docs/routing.md b/docs/my-website/docs/routing.md
index 905954e97..d83755e68 100644
--- a/docs/my-website/docs/routing.md
+++ b/docs/my-website/docs/routing.md
@@ -14,7 +14,7 @@ In production, litellm supports using Redis as a way to track cooldown server an
 
 :::info
 
-If you want a server to load balance across different LLM APIs, use our [OpenAI Proxy Server](./proxy/load_balancing.md)
+If you want a server to load balance across different LLM APIs, use our [LiteLLM Proxy Server](./proxy/load_balancing.md)
 
 :::
@@ -1637,7 +1637,7 @@ response = router.completion(
 
 ## Deploy Router
 
-If you want a server to load balance across different LLM APIs, use our [OpenAI Proxy Server](./simple_proxy#load-balancing---multiple-instances-of-1-model)
+If you want a server to load balance across different LLM APIs, use our [LiteLLM Proxy Server](./simple_proxy#load-balancing---multiple-instances-of-1-model)
 
 ## Init Params for the litellm.Router
diff --git a/docs/my-website/docs/secret.md b/docs/my-website/docs/secret.md
index 91ae38368..c44f2cd10 100644
--- a/docs/my-website/docs/secret.md
+++ b/docs/my-website/docs/secret.md
@@ -90,7 +90,7 @@ litellm.secret_manager = client
 litellm.get_secret("your-test-key")
 ```
 
-### Usage with OpenAI Proxy Server
+### Usage with LiteLLM Proxy Server
 
 1. Install Proxy dependencies
 ```bash
@@ -129,7 +129,7 @@ litellm --config /path/to/config.yaml
 
 Use encrypted keys from Google KMS on the proxy
 
-### Usage with OpenAI Proxy Server
+### Usage with LiteLLM Proxy Server
 
 ## Step 1. Add keys to env
 ```
diff --git a/docs/my-website/docs/simple_proxy_old_doc.md b/docs/my-website/docs/simple_proxy_old_doc.md
index 195728d1b..2d68db329 100644
--- a/docs/my-website/docs/simple_proxy_old_doc.md
+++ b/docs/my-website/docs/simple_proxy_old_doc.md
@@ -2,7 +2,7 @@ import Image from '@theme/IdealImage';
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
 
-# 💥 OpenAI Proxy Server
+# 💥 LiteLLM Proxy Server
 
 LiteLLM Server manages:
diff --git a/docs/my-website/sidebars.js b/docs/my-website/sidebars.js
index 6674d91ac..e57f340c7 100644
--- a/docs/my-website/sidebars.js
+++ b/docs/my-website/sidebars.js
@@ -20,10 +20,10 @@ const sidebars = {
     { type: "doc", id: "index" }, // NEW
     {
       type: "category",
-      label: "💥 OpenAI Proxy Server",
+      label: "💥 LiteLLM Proxy Server",
       link: {
         type: "generated-index",
-        title: "💥 OpenAI Proxy Server",
+        title: "💥 LiteLLM Proxy Server",
         description: `Proxy Server to call 100+ LLMs in a unified interface & track spend, set budgets per virtual key/user`,
         slug: "/simple_proxy",
       },
diff --git a/docs/my-website/src/pages/index.md b/docs/my-website/src/pages/index.md
index 308ed0831..36d47aedf 100644
--- a/docs/my-website/src/pages/index.md
+++ b/docs/my-website/src/pages/index.md
@@ -10,7 +10,7 @@ https://github.com/BerriAI/litellm
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Track spend & set budgets per project [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Track spend & set budgets per project [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 
 ## Basic usage
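The two index pages changed above both cut off at the `## Basic usage` heading. A minimal sketch of that basic usage, assuming placeholder API keys set via environment variables; it illustrates the bullets in the hunk: one `completion()` call shape across providers, with the text always available at `['choices'][0]['message']['content']`:

```python
# Minimal sketch of the "Basic usage" the surrounding docs describe: one
# completion() call shape for every provider, with a consistent response format.
# The API key values below are placeholders, not real credentials.
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"

messages = [{"role": "user", "content": "Hello, how are you?"}]

# OpenAI call
response = completion(model="gpt-3.5-turbo", messages=messages)

# Anthropic call -- same function, same message format
response = completion(model="claude-2", messages=messages)

# Consistent output: the generated text is always at this path
print(response["choices"][0]["message"]["content"])
```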