docs - use consistent name for LiteLLM proxy server

Ishaan Jaff 2024-08-03 12:49:35 -07:00
parent 6faeac4cf6
commit 5bc8b59b11
13 changed files with 99 additions and 99 deletions

View file

@@ -11,7 +11,7 @@
 <p align="center">Call all LLM APIs using the OpenAI format [Bedrock, Huggingface, VertexAI, TogetherAI, Azure, OpenAI, Groq etc.]
 <br>
 </p>
-<h4 align="center"><a href="https://docs.litellm.ai/docs/simple_proxy" target="_blank">OpenAI Proxy Server</a> | <a href="https://docs.litellm.ai/docs/hosted" target="_blank"> Hosted Proxy (Preview)</a> | <a href="https://docs.litellm.ai/docs/enterprise"target="_blank">Enterprise Tier</a></h4>
+<h4 align="center"><a href="https://docs.litellm.ai/docs/simple_proxy" target="_blank">LiteLLM Proxy Server</a> | <a href="https://docs.litellm.ai/docs/hosted" target="_blank"> Hosted Proxy (Preview)</a> | <a href="https://docs.litellm.ai/docs/enterprise"target="_blank">Enterprise Tier</a></h4>
 <h4 align="center">
 <a href="https://pypi.org/project/litellm/" target="_blank">
 <img src="https://img.shields.io/pypi/v/litellm.svg" alt="PyPI Version">
@@ -35,7 +35,7 @@ LiteLLM manages:
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Set Budgets & Rate limits per project, api key, model [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Set Budgets & Rate limits per project, api key, model [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 [**Jump to OpenAI Proxy Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs) <br>
 [**Jump to Supported LLM Providers**](https://github.com/BerriAI/litellm?tab=readme-ov-file#supported-providers-docs)

View file

@@ -1,10 +1,10 @@
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -21,13 +21,13 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -49,7 +49,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -61,7 +61,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -70,7 +70,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -79,7 +79,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -109,7 +109,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -128,7 +128,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -148,7 +148,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -162,7 +162,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -174,7 +174,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -184,7 +184,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -193,19 +193,19 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -214,7 +214,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -234,7 +234,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -244,7 +244,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -253,7 +253,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -267,31 +267,31 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -305,7 +305,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -330,7 +330,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -339,7 +339,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -360,7 +360,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -369,7 +369,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -378,7 +378,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -388,7 +388,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -409,7 +409,7 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -422,13 +422,13 @@ Exception: Expecting value: line 1 column 1 (char 0)
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -438,7 +438,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: Expecting value: line 1 column 1 (char 0)
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -462,7 +462,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -482,7 +482,7 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -492,7 +492,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -516,7 +516,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -529,7 +529,7 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -546,13 +546,13 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -580,13 +580,13 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -624,7 +624,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -638,13 +638,13 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -660,7 +660,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -681,7 +681,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -691,31 +691,31 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -771,7 +771,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -780,7 +780,7 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -800,7 +800,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -820,7 +820,7 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -830,7 +830,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -840,7 +840,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -850,7 +850,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -862,13 +862,13 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -877,7 +877,7 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -898,7 +898,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -919,7 +919,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -936,19 +936,19 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -961,25 +961,25 @@ Exception: 'Response' object has no attribute 'get'
 Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -993,7 +993,7 @@ Question: Given this context, what is litellm? LiteLLM about: About
 Call all LLM APIs using the OpenAI format.
 Exception: 'Response' object has no attribute 'get'
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10

View file

@@ -20,7 +20,7 @@ Call all LLM APIs using the OpenAI format.
 Response ID: 52dbbd49-eedb-4c11-8382-3ca7deb1af35 Url: /queue/response/52dbbd49-eedb-4c11-8382-3ca7deb1af35
 Time: 3.50 seconds
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10
@@ -35,7 +35,7 @@ Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. C
 Response ID: ae1e2b71-d711-456d-8df0-13ce0709eb04 Url: /queue/response/ae1e2b71-d711-456d-8df0-13ce0709eb04
 Time: 5.60 seconds
-Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+Question: What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 10

View file

@@ -1,4 +1,4 @@
-What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
+What endpoints does the litellm proxy have 💥 LiteLLM Proxy Server
 LiteLLM Server manages:
 Calling 100+ LLMs Huggingface/Bedrock/TogetherAI/etc. in the OpenAI ChatCompletions & Completions format

View file

@@ -7,14 +7,14 @@ Don't want to get crazy bills because either while you're calling LLM APIs **or*
 :::info
-If you want a server to manage user keys, budgets, etc. use our [OpenAI Proxy Server](./proxy/virtual_keys.md)
+If you want a server to manage user keys, budgets, etc. use our [LiteLLM Proxy Server](./proxy/virtual_keys.md)
 :::
 LiteLLM exposes:
 * `litellm.max_budget`: a global variable you can use to set the max budget (in USD) across all your litellm calls. If this budget is exceeded, it will raise a BudgetExceededError
 * `BudgetManager`: A class to help set budgets per user. BudgetManager creates a dictionary to manage the user budgets, where the key is user and the object is their current cost + model-specific costs.
-* `OpenAI Proxy Server`: A server to call 100+ LLMs with an openai-compatible endpoint. Manages user budgets, spend tracking, load balancing etc.
+* `LiteLLM Proxy Server`: A server to call 100+ LLMs with an openai-compatible endpoint. Manages user budgets, spend tracking, load balancing etc.
 ## quick start
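For context, a minimal sketch of the spend controls described in the doc above, using the LiteLLM Python SDK. This is illustrative only and not part of this commit; the budget amount and API key are placeholders, and the exception name follows the doc's own mention of `BudgetExceededError` (the exact import path may differ by version).

```python
import os

import litellm
from litellm import BudgetExceededError, completion

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder key

# Global cap (in USD) across all litellm calls in this process.
litellm.max_budget = 0.01  # placeholder budget for illustration

try:
    for _ in range(50):
        # Each call's cost counts against litellm.max_budget.
        completion(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "Hey, how's it going?"}],
        )
except BudgetExceededError as err:
    # Raised once cumulative spend crosses the configured budget.
    print(f"Budget exceeded: {err}")
```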

View file

@@ -10,11 +10,11 @@ https://github.com/BerriAI/litellm
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Track spend & set budgets per project [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Track spend & set budgets per project [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 ## How to use LiteLLM
 You can use litellm through either:
-1. [OpenAI proxy Server](#openai-proxy) - Server to call 100+ LLMs, load balance, cost tracking across projects
+1. [LiteLLM Proxy Server](#openai-proxy) - Server to call 100+ LLMs, load balance, cost tracking across projects
 2. [LiteLLM python SDK](#basic-usage) - Python Client to call 100+ LLMs, load balance, cost tracking
 ## LiteLLM Python SDK

View file

@@ -246,7 +246,7 @@ helm install lite-helm ./litellm-helm
 kubectl --namespace default port-forward $POD_NAME 8080:$CONTAINER_PORT
 ```
-Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
+Your LiteLLM Proxy Server is now running on `http://127.0.0.1:4000`.
 </TabItem>
@@ -301,7 +301,7 @@ docker run \
 --config /app/config.yaml --detailed_debug
 ```
-Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.
+Your LiteLLM Proxy Server is now running on `http://0.0.0.0:4000`.
 </TabItem>
 <TabItem value="kubernetes-deploy" label="Kubernetes">
@@ -399,7 +399,7 @@ kubectl apply -f /path/to/service.yaml
 kubectl port-forward service/litellm-service 4000:4000
 ```
-Your OpenAI proxy server is now running on `http://0.0.0.0:4000`.
+Your LiteLLM Proxy Server is now running on `http://0.0.0.0:4000`.
 </TabItem>
@@ -441,7 +441,7 @@ kubectl \
 4000:4000
 ```
-Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
+Your LiteLLM Proxy Server is now running on `http://127.0.0.1:4000`.
 If you need to set your litellm proxy config.yaml, you can find this in [values.yaml](https://github.com/BerriAI/litellm/blob/main/deploy/charts/litellm-helm/values.yaml)
@@ -486,7 +486,7 @@ helm install lite-helm ./litellm-helm
 kubectl --namespace default port-forward $POD_NAME 8080:$CONTAINER_PORT
 ```
-Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
+Your LiteLLM Proxy Server is now running on `http://127.0.0.1:4000`.
 </TabItem>
 </Tabs>
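Once one of the deployments above is running, the proxy speaks the OpenAI API, so the standard OpenAI Python client can be pointed at it. A minimal sketch (illustrative, not part of this commit), assuming the proxy is reachable on `http://0.0.0.0:4000` and its config.yaml defines a model named `gpt-3.5-turbo`:

```python
from openai import OpenAI

# Point the OpenAI client at the LiteLLM Proxy Server instead of api.openai.com.
client = OpenAI(
    base_url="http://0.0.0.0:4000",  # proxy URL from the deploy steps above
    api_key="sk-1234",               # a LiteLLM virtual key, or any string if auth is disabled
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # must match a model_name in the proxy config.yaml
    messages=[{"role": "user", "content": "what llm are you"}],
)
print(response.choices[0].message.content)
```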

View file

@@ -1,7 +1,7 @@
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
-# [OLD PROXY 👉 [NEW proxy here](./simple_proxy)] Local OpenAI Proxy Server
+# [OLD PROXY 👉 [NEW proxy here](./simple_proxy)] Local LiteLLM Proxy Server
 A fast, and lightweight OpenAI-compatible server to call 100+ LLM APIs.

View file

@@ -14,7 +14,7 @@ In production, litellm supports using Redis as a way to track cooldown server an
 :::info
-If you want a server to load balance across different LLM APIs, use our [OpenAI Proxy Server](./proxy/load_balancing.md)
+If you want a server to load balance across different LLM APIs, use our [LiteLLM Proxy Server](./proxy/load_balancing.md)
 :::
@@ -1637,7 +1637,7 @@ response = router.completion(
 ## Deploy Router
-If you want a server to load balance across different LLM APIs, use our [OpenAI Proxy Server](./simple_proxy#load-balancing---multiple-instances-of-1-model)
+If you want a server to load balance across different LLM APIs, use our [LiteLLM Proxy Server](./simple_proxy#load-balancing---multiple-instances-of-1-model)
 ## Init Params for the litellm.Router
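For reference, a rough sketch of the `litellm.Router` referenced in this hunk (illustrative, not part of this commit; deployment names, keys, and endpoints are placeholders):

```python
import os

from litellm import Router

# Two deployments behind one model group; the Router load balances across them
# and applies retry/fallback logic.
model_list = [
    {
        "model_name": "gpt-3.5-turbo",  # group name callers use
        "litellm_params": {
            "model": "azure/chatgpt-v-2",  # placeholder Azure deployment
            "api_key": os.getenv("AZURE_API_KEY"),
            "api_base": os.getenv("AZURE_API_BASE"),
        },
    },
    {
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {
            "model": "gpt-3.5-turbo",  # plain OpenAI deployment
            "api_key": os.getenv("OPENAI_API_KEY"),
        },
    },
]

router = Router(model_list=model_list)

response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)
print(response.choices[0].message.content)
```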

View file

@@ -90,7 +90,7 @@ litellm.secret_manager = client
 litellm.get_secret("your-test-key")
 ```
-### Usage with OpenAI Proxy Server
+### Usage with LiteLLM Proxy Server
 1. Install Proxy dependencies
 ```bash
@@ -129,7 +129,7 @@ litellm --config /path/to/config.yaml
 Use encrypted keys from Google KMS on the proxy
-### Usage with OpenAI Proxy Server
+### Usage with LiteLLM Proxy Server
 ## Step 1. Add keys to env
 ```

View file

@@ -2,7 +2,7 @@ import Image from '@theme/IdealImage';
 import Tabs from '@theme/Tabs';
 import TabItem from '@theme/TabItem';
-# 💥 OpenAI Proxy Server
+# 💥 LiteLLM Proxy Server
 LiteLLM Server manages:

View file

@@ -20,10 +20,10 @@ const sidebars = {
 { type: "doc", id: "index" }, // NEW
 {
 type: "category",
-label: "💥 OpenAI Proxy Server",
+label: "💥 LiteLLM Proxy Server",
 link: {
 type: "generated-index",
-title: "💥 OpenAI Proxy Server",
+title: "💥 LiteLLM Proxy Server",
 description: `Proxy Server to call 100+ LLMs in a unified interface & track spend, set budgets per virtual key/user`,
 slug: "/simple_proxy",
 },

View file

@@ -10,7 +10,7 @@ https://github.com/BerriAI/litellm
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Track spend & set budgets per project [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Track spend & set budgets per project [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 ## Basic usage
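For the `## Basic usage` section referenced above, the SDK call looks roughly like this (illustrative, not part of this commit; the key is a placeholder):

```python
import os

from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-openai-key"  # placeholder

# Same call shape for any supported provider; only the model string changes.
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(response["choices"][0]["message"]["content"])
```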