From ec8d11eb0008e6fdddfccc745bfb8502d9664400 Mon Sep 17 00:00:00 2001
From: Ishaan Jaff
Date: Fri, 16 Feb 2024 16:36:36 -0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 47a584e69..f26f382da 100644
--- a/README.md
+++ b/README.md
@@ -28,7 +28,7 @@ LiteLLM manages:
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Set Budgets & Rate limits per project [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Set Budgets & Rate limits per project, api key, model [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)

 [**Jump to OpenAI Proxy Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs)
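
For context on the README bullet the hunk touches, a minimal sketch of the consistent-output behavior it references. The model name and the `OPENAI_API_KEY` placeholder are illustrative assumptions; the access pattern `['choices'][0]['message']['content']` is the one stated in the README line itself.

```python
# Minimal sketch: the same response shape regardless of provider,
# per the "Consistent output" bullet in the hunk above.
import os
import litellm

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder credential

response = litellm.completion(
    model="gpt-3.5-turbo",  # example model; any supported provider/model works
    messages=[{"role": "user", "content": "Hello, world"}],
)

# Text responses are always available at ['choices'][0]['message']['content']
print(response["choices"][0]["message"]["content"])
```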