(docs) proxy server

This commit is contained in:
ishaan-jaff 2023-11-08 13:27:17 -08:00
parent 50a1c55ce1
commit 3498e881af


@@ -10,6 +10,7 @@ LiteLLM Server manages:
* Set custom prompt templates + model-specific configs (`temperature`, `max_tokens`, etc.)
## Quick Start
View all the supported args for the Proxy CLI [here](https://docs.litellm.ai/docs/simple_proxy#proxy-cli-arguments)
```shell
$ litellm --model huggingface/bigcode/starcoder
@@ -450,7 +451,7 @@ model_list:
```
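Once the proxy is started with the quick-start command above, it exposes an OpenAI-compatible API. The sketch below shows one way to query it; the port (`8000`) and the `/chat/completions` path are assumptions for illustration, not taken from this diff.

```shell
# Hedged example: query a locally running LiteLLM proxy.
# BASE_URL is an assumption (the default port is not stated in this diff).
BASE_URL="http://0.0.0.0:8000"
PAYLOAD='{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "def fib(n):"}]}'

# Only send the request if something is actually listening:
if curl -s --max-time 2 "$BASE_URL" > /dev/null 2>&1; then
  curl -s "$BASE_URL/chat/completions" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"
else
  echo "proxy not running; start it with: litellm --model huggingface/bigcode/starcoder"
fi
```

Because the endpoint is OpenAI-compatible, any OpenAI client SDK pointed at `BASE_URL` should work the same way.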
-## Proxy CLI Args
+## Proxy CLI Arguments
#### --host
- **Default:** `'0.0.0.0'`