@@ -34,8 +40,6 @@ LiteLLM manages
- Guarantees [consistent output](https://docs.litellm.ai/docs/completion/output): text responses are always available at `['choices'][0]['message']['content']` (sketch below)
- Exception mapping - common exceptions across providers are mapped to OpenAI exception types (see the example below)
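A minimal sketch of what the consistent output shape means in practice; the model names and API keys below are placeholders, not requirements:

```python
import os
from litellm import completion

# Placeholder credentials; set only the keys for the providers you actually call.
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."

messages = [{"role": "user", "content": "Hello, how are you?"}]

# Whatever the underlying provider, the reply text is read from the same path.
for model in ["gpt-3.5-turbo", "claude-2"]:
    response = completion(model=model, messages=messages)
    print(response["choices"][0]["message"]["content"])
```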
-**🚨 Seeing errors?** [Chat on WhatsApp](https://wa.link/huol9n) [Chat on Discord](https://discord.gg/wuPM9dRgDw)
-
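A hedged sketch of the exception mapping in use; the exact classes to catch depend on your installed `openai` SDK version, and this assumes the pre-1.0 SDK where they live under `openai.error`:

```python
from litellm import completion
from openai.error import AuthenticationError, OpenAIError  # pre-1.0 openai SDK layout

messages = [{"role": "user", "content": "Hello, how are you?"}]

try:
    # A bad key for any provider surfaces as the corresponding OpenAI exception type.
    completion(model="claude-2", messages=messages)
except AuthenticationError as err:
    print(f"credentials rejected: {err}")
except OpenAIError as err:
    print(f"provider error, mapped to an OpenAI exception: {err}")
```

Because every provider's failure modes map onto the same hierarchy, retry and error-handling logic can be written once and reused across providers.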
**10/05/2023:** LiteLLM is adopting Semantic Versioning for all commits. [Learn more](https://github.com/BerriAI/litellm/issues/532)
**10/16/2023:** **Self-hosted OpenAI-proxy server** [Learn more](https://docs.litellm.ai/docs/proxy_server#deploy-proxy)