Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-24 18:24:20 +00:00)
(docs) simple proxy
This commit is contained in:
parent d0f11e7a13
commit 4ece219ec5
1 changed file with 1 addition and 0 deletions
@@ -7,6 +7,7 @@ import TabItem from '@theme/TabItem';

LiteLLM Server manages:

* Calling 100+ LLMs [Huggingface/Bedrock/TogetherAI/etc.](#other-supported-models) in the OpenAI `ChatCompletions` & `Completions` format
* Authentication - [Virtual Keys](#managing-auth---virtual-keys)
* Set custom prompt templates + model-specific configs (`temperature`, `max_tokens`, etc.)
* Routing between [Multiple Models](#multiple-models---quick-start) + [Deployments of the same model](#multiple-instances-of-1-model)
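Since the hunk above describes calling the proxy in the OpenAI `ChatCompletions` format, here is a minimal sketch of what that looks like from a client. The server address (`http://0.0.0.0:8000`), the placeholder API key, and the model alias are assumptions for illustration, not values taken from this diff.

```python
# Minimal sketch: point the standard OpenAI client at a running LiteLLM Server.
# Assumptions: the server listens on http://0.0.0.0:8000 and routes a model
# aliased "gpt-3.5-turbo"; the api_key is a placeholder (or a virtual key if
# auth is enabled on the proxy).
from openai import OpenAI

client = OpenAI(
    base_url="http://0.0.0.0:8000",  # LiteLLM Server instead of api.openai.com
    api_key="sk-my-virtual-key",     # placeholder / LiteLLM virtual key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any model the server is configured to route
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
)
print(response.choices[0].message.content)
```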