mirror of https://github.com/BerriAI/litellm.git
synced 2025-04-25 02:34:29 +00:00
Update README.md
parent 32a0ae5eb2
commit 14bf56b436

1 changed file with 1 addition and 0 deletions
@@ -28,6 +28,7 @@ LiteLLM manages
 - Translating inputs to the provider's `completion` and `embedding` endpoints
 - Guarantees [consistent output](https://docs.litellm.ai/docs/completion/output), text responses will always be available at `['choices'][0]['message']['content']`
 - Exception mapping - common exceptions across providers are mapped to the OpenAI exception types.
+- Load-balance across multiple deployments (e.g. Azure/OpenAI) - `Router`
 
 **10/05/2023:** LiteLLM is adopting Semantic Versioning for all commits. [Learn more](https://github.com/BerriAI/litellm/issues/532)
 **10/16/2023:** **Self-hosted OpenAI-proxy server** [Learn more](https://docs.litellm.ai/docs/simple_proxy)
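The "consistent output" guarantee in the diff's context lines means every provider's response is normalized to the OpenAI response shape. A minimal sketch of that access path, using a hand-built mock response dict (illustrative only, not an actual LiteLLM API call):

```python
# Illustrative mock of the OpenAI-format response shape that LiteLLM
# normalizes provider outputs to (hand-built, not a real API response).
mock_response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "content": "Hello from the model!",
            },
            "finish_reason": "stop",
        }
    ],
}

# Regardless of provider, the text lives at the same path:
text = mock_response["choices"][0]["message"]["content"]
print(text)  # Hello from the model!
```

The fixed path `['choices'][0]['message']['content']` is what lets caller code stay identical when swapping providers.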
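The line this commit adds points at LiteLLM's `Router` for load-balancing across multiple deployments. As a hedged sketch of the underlying round-robin idea in plain Python (not litellm's actual `Router` implementation; the deployment names are hypothetical):

```python
import itertools

# Hypothetical deployment identifiers (illustrative, not real config).
deployments = [
    "azure/gpt-deployment-1",
    "azure/gpt-deployment-2",
    "openai/gpt-3.5-turbo",
]

# Round-robin rotation: the simplest way to spread requests
# evenly across several Azure/OpenAI deployments.
rotation = itertools.cycle(deployments)

def pick_deployment():
    """Return the next deployment in round-robin order."""
    return next(rotation)

picks = [pick_deployment() for _ in range(4)]
print(picks)
# ['azure/gpt-deployment-1', 'azure/gpt-deployment-2',
#  'openai/gpt-3.5-turbo', 'azure/gpt-deployment-1']
```

A production router layers retries, cooldowns, and health checks on top of a selection strategy like this; see the litellm docs for what `Router` actually supports.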