forked from phoenix/litellm-mirror
deploy link to using litellm
This commit is contained in:
parent c2e309baf3
commit 9247fc3c64
1 changed file with 9 additions and 0 deletions
@@ -254,6 +254,15 @@ Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
**That's it! That's the quick start to deploy LiteLLM.**
## Use with Langchain, OpenAI SDK, LlamaIndex, Instructor, Curl
:::info
💡 Go here 👉 [to make your first LLM API Request](user_keys)
LiteLLM is compatible with several SDKs, including the OpenAI SDK, Anthropic SDK, Mistral SDK, LlamaIndex, and Langchain (JS, Python).
:::
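Because the proxy speaks the OpenAI API, any OpenAI-compatible client can talk to it. As a minimal sketch using only the Python standard library, here is what a chat completion request to the proxy could look like; the `/chat/completions` path, the `gpt-3.5-turbo` model name, and the `sk-anything` key are placeholders standing in for whatever your proxy config actually exposes:

```python
import json
import urllib.request

# Hypothetical sketch: an OpenAI-style chat completion request aimed at the
# LiteLLM proxy from the quick start. Model name, endpoint path, and API key
# are assumptions -- substitute the values from your own proxy config.
url = "http://127.0.0.1:4000/chat/completions"
body = json.dumps({
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-anything",  # placeholder key
    },
    method="POST",
)

# With the proxy running, urllib.request.urlopen(req) would send the request;
# here we only inspect the request object we built.
print(req.full_url)
```

The same shape works from curl or from any of the SDKs listed above by pointing their base URL at `http://127.0.0.1:4000`.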
## Options to deploy LiteLLM
| Docs | When to Use |
|------|-------------|