Merge pull request #4838 from BerriAI/doc_anthropic_proxy_usage

doc - using anthropic with litellm proxy server
Ishaan Jaff 2024-07-23 16:38:09 -07:00 committed by GitHub
commit e9d6816011


@ -56,7 +56,7 @@ for chunk in response:
print(chunk["choices"][0]["delta"]["content"]) # same as openai format
```
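The loop above pulls the delta content out of each streamed chunk. A minimal sketch of that extraction pattern, using hypothetical in-memory chunks in place of a live `litellm.completion(..., stream=True)` response (real chunks follow the same `{"choices": [{"delta": ...}]}` shape shown in the snippet):

```python
# Hypothetical stand-in chunks for what a streaming litellm.completion call
# yields; the shape mirrors the OpenAI streaming format used above.
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},  # the final chunk often carries no content
]

def collect_stream(chunks):
    """Concatenate the delta content of each chunk, skipping empty deltas."""
    parts = []
    for chunk in chunks:
        content = chunk["choices"][0]["delta"].get("content")
        if content is not None:  # guard against chunks with no content key
            parts.append(content)
    return "".join(parts)

print(collect_stream(chunks))  # -> Hello, world
```

The `.get("content")` guard matters in practice: printing `chunk["choices"][0]["delta"]["content"]` directly, as in the snippet above, raises `KeyError` on a final chunk whose delta is empty.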
-## OpenAI Proxy Usage
+## Usage with LiteLLM Proxy
Here's how to call Anthropic models through the LiteLLM Proxy Server.
@ -69,14 +69,6 @@ export ANTHROPIC_API_KEY="your-api-key"
### 2. Start the proxy
<Tabs>
-<TabItem value="cli" label="cli">
-```bash
-$ litellm --model claude-3-opus-20240229
-# Server running on http://0.0.0.0:4000
-```
-</TabItem>
<TabItem value="config" label="config.yaml">
```yaml
@ -91,6 +83,14 @@ model_list:
litellm --config /path/to/config.yaml
```
</TabItem>
+<TabItem value="cli" label="cli">
+```bash
+$ litellm --model claude-3-opus-20240229
+# Server running on http://0.0.0.0:4000
+```
+</TabItem>
</Tabs>
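The `config.yaml` tab is truncated in this hunk (only `model_list:` and the `litellm --config` command survive). A minimal sketch of what such a config might look like, using LiteLLM's `model_name`/`litellm_params` layout; the alias `claude-3` is illustrative:

```yaml
model_list:
  - model_name: claude-3              # illustrative alias clients will request
    litellm_params:
      model: claude-3-opus-20240229   # the actual Anthropic model
      api_key: os.environ/ANTHROPIC_API_KEY
```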
### 3. Test it
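A minimal sketch of the OpenAI-format request body the proxy accepts at its `/chat/completions` endpoint; the base URL `http://0.0.0.0:4000` comes from step 2, and the model name should match what the proxy serves (the prompt text here is illustrative):

```python
import json

# Build the OpenAI-format body for POST http://0.0.0.0:4000/chat/completions
# (the server started in step 2). The proxy translates this to Anthropic's API.
payload = {
    "model": "claude-3-opus-20240229",  # must match a model the proxy serves
    "messages": [{"role": "user", "content": "what llm are you"}],
}
body = json.dumps(payload)
print(body)
```

Sending this body with `Content-Type: application/json` (via `curl`, the `openai` Python client pointed at the proxy's base URL, or any HTTP client) returns a response in the same OpenAI format as the earlier examples.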