forked from phoenix/litellm-mirror
docs(proxy_server.md): cleanup
This commit is contained in:
parent
2f57dc8906
commit
8d56e6dad2
1 changed file with 0 additions and 13 deletions
````diff
@@ -475,19 +475,6 @@ docker run --name ollama litellm/ollama
 More details 👉 https://hub.docker.com/r/litellm/ollama
 
 </TabItem>
-<TabItem value="litellm-hosted" label="LiteLLM-Hosted">
-
-Deploy the proxy to https://api.litellm.ai
-
-```shell
-$ export ANTHROPIC_API_KEY=sk-ant-api03-1..
-$ litellm --model claude-instant-1 --deploy
-
-#INFO: Uvicorn running on https://api.litellm.ai/44508ad4
-```
-
-This will host a ChatCompletions API at: https://api.litellm.ai/44508ad4
-</TabItem>
 </Tabs>
 
 ### Configure Proxy
````
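The removed section documented an OpenAI-compatible ChatCompletions endpoint hosted at a generated URL. As a minimal sketch of what a request to such an endpoint looks like (the base URL below comes from the removed docs and is no longer live; `build_chat_request` is a hypothetical helper, not part of litellm):

```python
import json

# Base URL from the removed docs; no longer live, shown for illustration only.
BASE_URL = "https://api.litellm.ai/44508ad4"

def build_chat_request(model, messages):
    """Build the URL and JSON body for an OpenAI-compatible
    /chat/completions call against the proxy."""
    url = f"{BASE_URL}/chat/completions"
    body = {"model": model, "messages": messages}
    return url, json.dumps(body)

url, payload = build_chat_request(
    "claude-instant-1",
    [{"role": "user", "content": "Hello"}],
)
```

Any OpenAI-compatible client can target such a proxy the same way, which is why the docs describe it as "hosting a ChatCompletions API".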