deploy link to using litellm

This commit is contained in:
Ishaan Jaff 2024-07-25 20:09:49 -07:00
parent c2e309baf3
commit 9247fc3c64


@@ -254,6 +254,15 @@ Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
**That's it! That's the quick start to deploy litellm**
## Use with Langchain, OpenAI SDK, LlamaIndex, Instructor, Curl
:::info
💡 Go here 👉 [to make your first LLM API Request](user_keys)
LiteLLM is compatible with several SDKs - including the OpenAI SDK, Anthropic SDK, Mistral SDK, LlamaIndex, and Langchain (JS, Python)
:::
## Options to deploy LiteLLM
| Docs | When to Use |