Mirror of https://github.com/BerriAI/litellm.git, synced 2025-04-24 18:24:20 +00:00
(docs) simple proxy

This commit is contained in:
parent 032cd0121b
commit 3b89cff65e

1 changed file with 1 addition and 1 deletion
@@ -7,8 +7,8 @@ import TabItem from '@theme/TabItem';
 LiteLLM Server manages:
 
 * Calling 100+ LLMs [Huggingface/Bedrock/TogetherAI/etc.](#other-supported-models) in the OpenAI `ChatCompletions` & `Completions` format
+* Load balancing - between [Multiple Models](#multiple-models---quick-start) + [Deployments of the same model](#multiple-instances-of-1-model) **LiteLLM proxy can handle 1k+ requests/second during load tests**
 * Authentication & Spend Tracking [Virtual Keys](#managing-auth---virtual-keys)
-* Load balancing - Routing between [Multiple Models](#multiple-models---quick-start) + [Deployments of the same model](#multiple-instances-of-1-model)
 
 [**See LiteLLM Proxy code**](https://github.com/BerriAI/litellm/tree/main/litellm/proxy)
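The diff above advertises that the server speaks the OpenAI `ChatCompletions` format for every backing model. A minimal sketch of such a request body follows; the proxy URL and model name are placeholders for whatever your deployment configures, not values taken from this commit:

```python
import json

# Hypothetical proxy endpoint -- adjust host/port to your deployment.
PROXY_URL = "http://0.0.0.0:8000/chat/completions"

# A standard OpenAI ChatCompletions-style payload. Any model configured
# on the proxy (Huggingface, Bedrock, TogetherAI, ...) is addressed with
# this same shape; "gpt-3.5-turbo" here is an illustrative placeholder.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# Serialize for the HTTP request; POST `body` to PROXY_URL with a client
# of your choice (e.g. requests or httpx) to receive an OpenAI-format
# completion in response.
body = json.dumps(payload)
```

Because every backend is fronted by the same request shape, swapping models is a one-field change in the payload rather than a client rewrite.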