(docs) simple proxy

ishaan-jaff 2023-11-22 15:01:26 -08:00
parent d0f11e7a13
commit 4ece219ec5


@@ -7,6 +7,7 @@ import TabItem from '@theme/TabItem';
LiteLLM Server manages:
* Calling 100+ LLMs [Huggingface/Bedrock/TogetherAI/etc.](#other-supported-models) in the OpenAI `ChatCompletions` & `Completions` format
* Authentication - [Virtual Keys](#managing-auth---virtual-keys)
* Setting custom prompt templates + model-specific configs (`temperature`, `max_tokens`, etc.)
* Routing between [Multiple Models](#multiple-models---quick-start) + [Deployments of the same model](#multiple-instances-of-1-model)
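
Since the proxy speaks the OpenAI `ChatCompletions` format, any OpenAI-compatible client can call it. The sketch below is a minimal, hedged example using only the standard library; the base URL (`http://0.0.0.0:8000`), the model name, and the `sk-1234` virtual key are placeholder assumptions — substitute your own deployment's values.

```python
# Hedged sketch: calling a locally running LiteLLM Server in the
# OpenAI ChatCompletions format. Base URL, model, and key are assumptions.
import json
import urllib.request

BASE_URL = "http://0.0.0.0:8000"  # assumed local proxy address


def build_chat_request(model: str, prompt: str):
    """Build an OpenAI-format /chat/completions request for the proxy."""
    url = f"{BASE_URL}/chat/completions"
    payload = {
        "model": model,  # the proxy routes this name to the underlying provider
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # per-request params pass through to the model
    }
    return url, payload


def call_proxy(model: str, prompt: str) -> dict:
    """POST the request to the proxy and return the parsed JSON response."""
    url, payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Virtual key, if you enabled auth on the proxy (assumed value)
            "Authorization": "Bearer sk-1234",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because the request shape is plain OpenAI format, swapping the `model` field is all it takes to route the same call to a different backend (Hugging Face, Bedrock, TogetherAI, etc.).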