forked from phoenix/litellm-mirror
Update README.md
parent 87cc96b590
commit d915fb8729
1 changed file with 9 additions and 4 deletions
README.md | 13 +++++++++----
@@ -24,10 +24,15 @@
 </a>
 </h4>
 
-LiteLLM manages
+This Package Provides:
 
-- Translating inputs to the provider's `completion` and `embedding` endpoints
-- Guarantees [consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
-- Load-balance across multiple deployments (e.g. Azure/OpenAI) - `Router` **1k+ requests/second**
+- Python client to call 100+ LLMs in OpenAI Format
+- Translate inputs to provider's `completion` and `embedding` endpoints
+- [Consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
+- Load-balance multiple deployments (e.g. Azure/OpenAI) - `Router` **1k+ requests/second**
+
+- OpenAI Proxy Server:
+  - Track spend across multiple projects/people
+  - Call 100+ LLMs in OpenAI Format
 
 # OpenAI Proxy - ([Docs](https://docs.litellm.ai/docs/simple_proxy))
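The "consistent output" bullet in this diff promises that, whatever provider is called, the response text is always at `['choices'][0]['message']['content']` (OpenAI format). A minimal sketch of what that contract means for caller code is below; `extract_text` is a hypothetical helper and `sample_response` is an illustrative stand-in for a real API response, not actual litellm output:

```python
# Sketch of the "consistent output" contract described in the README diff:
# litellm normalizes every provider's response into OpenAI format, so the
# text is always reachable at response['choices'][0]['message']['content'].

def extract_text(response):
    """Pull the message text from an OpenAI-format response dict.

    Hypothetical helper for illustration; works on any provider's
    response because litellm normalizes them all to this shape.
    """
    return response['choices'][0]['message']['content']

# Illustrative OpenAI-format response shape (assumed, for demonstration):
sample_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello!"}}
    ]
}

print(extract_text(sample_response))  # -> Hello!
```

Because the access path is the same across providers, callers don't need per-provider branching to read the model's reply.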