forked from phoenix/litellm-mirror
Merge pull request #4838 from BerriAI/doc_anthropic_proxy_usage

doc - using anthropic with litellm proxy server

This commit is contained in:
commit e9d6816011

1 changed file with 9 additions and 9 deletions
|
@@ -56,7 +56,7 @@ for chunk in response:
     print(chunk["choices"][0]["delta"]["content"]) # same as openai format
 ```
 
-## OpenAI Proxy Usage
+## Usage with LiteLLM Proxy
 
 Here's how to call Anthropic with the LiteLLM Proxy Server
 
@@ -69,14 +69,6 @@ export ANTHROPIC_API_KEY="your-api-key"
 ### 2. Start the proxy
 
 <Tabs>
-<TabItem value="cli" label="cli">
-
-```bash
-$ litellm --model claude-3-opus-20240229
-
-# Server running on http://0.0.0.0:4000
-```
-</TabItem>
 <TabItem value="config" label="config.yaml">
 
 ```yaml
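The body of the config.yaml tab is elided between these hunks; the next hunk's header shows it begins with `model_list:`. A minimal sketch of what such a config typically looks like in LiteLLM's documented config format — the model alias and the env-var reference below are illustrative assumptions, not lines from this diff:

```yaml
model_list:
  - model_name: claude-3                      # alias clients will request (assumed name)
    litellm_params:
      model: claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY   # read from the env var exported in step 1
```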
@@ -91,6 +83,14 @@ model_list:
 litellm --config /path/to/config.yaml
 ```
 </TabItem>
+<TabItem value="cli" label="cli">
+
+```bash
+$ litellm --model claude-3-opus-20240229
+
+# Server running on http://0.0.0.0:4000
+```
+</TabItem>
 </Tabs>
 
 ### 3. Test it
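The "### 3. Test it" section is cut off by the page, but requests to the proxy use the OpenAI chat-completions format shown earlier in the doc. A minimal sketch of the JSON body one might POST to the proxy — the `/chat/completions` route and the field names follow the standard OpenAI-compatible shape and are assumptions, not part of this diff:

```python
import json

# OpenAI-format chat request body for the proxy started in step 2.
# "claude-3-opus-20240229" matches the model the cli tab registers;
# the /chat/completions path is the usual OpenAI-compatible route (assumption).
payload = {
    "model": "claude-3-opus-20240229",
    "messages": [{"role": "user", "content": "Hey, how's it going?"}],
}
body = json.dumps(payload)
print(body)  # POST this to http://0.0.0.0:4000/chat/completions
```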