doc - using anthropic with litellm proxy server

Ishaan Jaff 2024-07-23 10:42:17 -07:00
parent a8c88dad64
commit 8845bd4d76


@@ -56,7 +56,7 @@ for chunk in response:
print(chunk["choices"][0]["delta"]["content"]) # same as openai format
```
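For context, the streaming loop above comes from the SDK section of this doc. A minimal sketch of the call that produces that `response`, assuming litellm's `completion` entry point with `stream=True` (the prompt and placeholder key are illustrative):

```python
import os
from litellm import completion

# hypothetical placeholder; set your real Anthropic key in the environment
os.environ["ANTHROPIC_API_KEY"] = "your-api-key"

# stream=True makes `response` an iterator of OpenAI-format chunks,
# consumed by the `for chunk in response:` loop shown above
response = completion(
    model="claude-3-opus-20240229",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    stream=True,
)
```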
-## OpenAI Proxy Usage
+## Usage with LiteLLM Proxy
Here's how to call Anthropic with the LiteLLM Proxy Server:
@@ -69,14 +69,6 @@ export ANTHROPIC_API_KEY="your-api-key"
### 2. Start the proxy
<Tabs>
<TabItem value="cli" label="cli">
```bash
$ litellm --model claude-3-opus-20240229
# Server running on http://0.0.0.0:4000
```
</TabItem>
<TabItem value="config" label="config.yaml">
```yaml
@@ -91,6 +83,14 @@ model_list:
litellm --config /path/to/config.yaml
```
</TabItem>
<TabItem value="cli" label="cli">
```bash
$ litellm --model claude-3-opus-20240229
# Server running on http://0.0.0.0:4000
```
</TabItem>
</Tabs>
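The yaml body of the config tab above is collapsed in this diff; only the `model_list:` context line survives in the hunk header. A minimal sketch of what such a config typically looks like, assuming standard LiteLLM config keys (the `model_name` alias is illustrative):

```yaml
model_list:
  - model_name: claude-3          # alias clients use when calling the proxy
    litellm_params:
      model: claude-3-opus-20240229
      api_key: "os.environ/ANTHROPIC_API_KEY"  # read from the environment
```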
### 3. Test it
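The diff ends at this heading. For completeness, a minimal sketch of the kind of request used to test the proxy, assuming the OpenAI-compatible `/chat/completions` endpoint on the port shown above (the model name is illustrative):

```bash
curl http://0.0.0.0:4000/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "claude-3-opus-20240229",
    "messages": [{"role": "user", "content": "what llm are you"}]
  }'
```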