forked from phoenix/litellm-mirror
Merge pull request #4897 from BerriAI/docs_add_example_using_anthropic_sdk
Docs add example using anthropic sdk with litellm proxy
Commit 4d513e0b5f

4 changed files with 69 additions and 0 deletions
@@ -166,6 +166,10 @@ $ litellm --model huggingface/bigcode/starcoder
### Step 2: Make ChatCompletions Request to Proxy
> [!IMPORTANT]
> [Use with Langchain (Python, JS), OpenAI SDK (Python, JS), Anthropic SDK, Mistral SDK, LlamaIndex, Instructor, Curl](https://docs.litellm.ai/docs/migration)
```python
import openai # openai v1.0.0+
client = openai.OpenAI(api_key="anything", base_url="http://0.0.0.0:4000")  # set base_url to the proxy
```
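With the client pointed at the proxy, requests use the standard OpenAI chat-completions call. A minimal sketch, assuming the proxy was started with `litellm --model huggingface/bigcode/starcoder` as above; the `model` name in the request is illustrative, since the proxy routes it to the configured model:

```python
# illustrative request; the proxy maps the model name to whatever
# `litellm --model ...` was started with
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "this is a test request, write a short poem"}],
)
print(response.choices[0].message.content)
```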
@@ -254,6 +254,15 @@ Your OpenAI proxy server is now running on `http://127.0.0.1:4000`.
**That's it! That's the quick start to deploy LiteLLM.**
## Use with Langchain, OpenAI SDK, LlamaIndex, Instructor, Curl
:::info
💡 Go here 👉 [to make your first LLM API Request](user_keys)
LiteLLM is compatible with several SDKs, including the OpenAI SDK, Anthropic SDK, Mistral SDK, LlamaIndex, and Langchain (JS, Python).
:::
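For example, a minimal Langchain sketch, assuming the `langchain-openai` package and a proxy on port 4000; the model name and key below are illustrative, since the proxy holds the real provider credentials:

```python
from langchain_openai import ChatOpenAI

# Langchain's OpenAI-compatible client, pointed at the LiteLLM proxy
llm = ChatOpenAI(
    model="gpt-3.5-turbo",           # illustrative; the proxy maps this to your model
    api_key="anything",              # illustrative; real keys live on the proxy
    base_url="http://0.0.0.0:4000",
)
print(llm.invoke("Hello, world!").content)
```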
## Options to deploy LiteLLM
| Docs | When to Use |
|------|-------------|
@@ -388,6 +388,34 @@ print(response)
```
</TabItem>
<TabItem value="anthropic-py" label="Anthropic Python SDK">
```python
from anthropic import Anthropic

client = Anthropic(
    base_url="http://localhost:4000",      # LiteLLM proxy endpoint
    api_key="sk-s4xN1IiLTCytwtZFJaYQrA",   # LiteLLM proxy virtual key
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello, Claude",
        }
    ],
    model="claude-3-opus-20240229",
)
print(message.content)
```
</TabItem>
</Tabs>
[**More Info**](./configs.md)
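The same virtual-key setup also covers streaming. A hedged sketch using the Anthropic SDK's streaming helper, assuming the proxy passes streamed responses through unchanged:

```python
from anthropic import Anthropic

client = Anthropic(
    base_url="http://localhost:4000",
    api_key="sk-s4xN1IiLTCytwtZFJaYQrA",  # same illustrative virtual key as above
)

# print tokens as they arrive instead of waiting for the full message
with client.messages.stream(
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, Claude"}],
    model="claude-3-opus-20240229",
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```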
@@ -234,6 +234,34 @@ main();
```
</TabItem>
<TabItem value="anthropic-py" label="Anthropic Python SDK">
```python
from anthropic import Anthropic

client = Anthropic(
    base_url="http://localhost:4000",      # LiteLLM proxy endpoint
    api_key="sk-s4xN1IiLTCytwtZFJaYQrA",   # LiteLLM proxy virtual key
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello, Claude",
        }
    ],
    model="claude-3-opus-20240229",
)
print(message.content)
```
</TabItem>
<TabItem value="mistral-py" label="Mistral Python SDK">
```python
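# A hedged sketch, assuming the v0 `mistralai` client; the endpoint,
# key, and model name below are illustrative.
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

client = MistralClient(
    endpoint="http://localhost:4000",      # LiteLLM proxy endpoint
    api_key="sk-s4xN1IiLTCytwtZFJaYQrA",   # illustrative virtual key
)

chat_response = client.chat(
    model="mistral-small-latest",          # illustrative model name
    messages=[ChatMessage(role="user", content="Hello, Mistral")],
)
print(chat_response.choices[0].message.content)
```

</TabItem>

</Tabs>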