diff --git a/docs/my-website/docs/simple_proxy.md b/docs/my-website/docs/simple_proxy.md
index af3b1fe16..4f717a7ab 100644
--- a/docs/my-website/docs/simple_proxy.md
+++ b/docs/my-website/docs/simple_proxy.md
@@ -7,6 +7,7 @@ import TabItem from '@theme/TabItem';
 LiteLLM Server manages:
 * Calling 100+ LLMs [Huggingface/Bedrock/TogetherAI/etc.](#other-supported-models) in the OpenAI `ChatCompletions` & `Completions` format
+* Authentication - [Virtual Keys](#managing-auth---virtual-keys)
 * Set custom prompt templates + model-specific configs (`temperature`, `max_tokens`, etc.)
 * Routing between [Multiple Models](#multiple-models---quick-start) + [Deployments of the same model](#multiple-instances-of-1-model)