docs - use consistent name for LiteLLM proxy server

This commit is contained in:
Ishaan Jaff 2024-08-03 12:49:35 -07:00
parent a454121a1a
commit a8520fd6e4
13 changed files with 99 additions and 99 deletions


@@ -10,7 +10,7 @@ https://github.com/BerriAI/litellm
 - Translate inputs to provider's `completion`, `embedding`, and `image_generation` endpoints
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - [Router](https://docs.litellm.ai/docs/routing)
-- Track spend & set budgets per project [OpenAI Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
+- Track spend & set budgets per project [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy)
 ## Basic usage
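The "consistent output" bullet in the diff above refers to LiteLLM's documented response shape: whichever provider handles the call, the text lands at `['choices'][0]['message']['content']`. A minimal sketch of that access pattern, using a hand-built dict standing in for a real response object:

```python
# Hypothetical response data mimicking LiteLLM's OpenAI-style shape;
# a real call would come from litellm.completion(...).
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello!"}}
    ]
}

# The documented path to the text, stable across providers.
text = response["choices"][0]["message"]["content"]
print(text)  # Hello!
```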