forked from phoenix/litellm-mirror
Update README.md
This commit is contained in:
parent c27246e6f2
commit e91823218d
1 changed file with 1 addition and 1 deletion
@@ -29,7 +29,7 @@ LiteLLM manages:
 - [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Load-balance multiple deployments (e.g. Azure/OpenAI) - `Router` **1k+ requests/second**
 
-[**Jump to OpenAI Proxy Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs)
+[**Jump to OpenAI Proxy Docs**](https://github.com/BerriAI/litellm?tab=readme-ov-file#openai-proxy---docs) <br>
 [**Jump to Supported Providers**](https://github.com/BerriAI/litellm?tab=readme-ov-file#supported-provider-docs)
 
 # Usage ([**Docs**](https://docs.litellm.ai/docs/))
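The "consistent output" bullet in the diff above claims that text responses are always reachable at `['choices'][0]['message']['content']`, regardless of provider. A minimal sketch of that indexing pattern, using a plain dict shaped like an OpenAI-style chat completion as a stand-in (not a real litellm response object, and no API call is made):

```python
# Stand-in for a normalized completion response; litellm mirrors OpenAI's
# schema so the generated text sits at the same path for every provider.
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello from the model!"}}
    ]
}

# The path the README bullet guarantees:
text = response["choices"][0]["message"]["content"]
print(text)  # → Hello from the model!
```

With a real `litellm.completion(...)` call the returned object supports the same subscript access, which is what makes downstream code portable across providers.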