From ed17e04bc2ff5f5fd12ba449dfea2014805447a4 Mon Sep 17 00:00:00 2001
From: Ishaan Jaff
Date: Thu, 23 Nov 2023 11:39:39 -0800
Subject: [PATCH] Update README.md

---
 README.md | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index e560b2b27e..5bfe2cba70 100644
--- a/README.md
+++ b/README.md
@@ -116,7 +116,10 @@ print(response)
 ```
 ## OpenAI Proxy - ([Docs](https://docs.litellm.ai/docs/simple_proxy))
-**If you want to use non-openai models in an openai code base**, you can use litellm proxy. Create a server to call 100+ LLMs (Huggingface/Bedrock/TogetherAI/etc) in the OpenAI ChatCompletions & Completions format
+LiteLLM Proxy manages:
+* Calling 100+ LLMs (Huggingface/Bedrock/TogetherAI/etc.) in the OpenAI ChatCompletions & Completions format
+* Authentication & spend tracking via virtual keys
+* Load balancing - routing across multiple models + deployments of the same model

 ### Step 1: Start litellm proxy
 ```shell