update proxy server docs

Krrish Dholakia 2023-09-26 14:46:58 -07:00
parent 73487d5910
commit b29bb8454f


@@ -5,17 +5,27 @@ Use this to spin up a proxy api to translate openai api calls to any non-openai
This works for async + streaming as well.
## usage
```shell
pip install litellm
```
```shell
litellm --model <your-model-name>
```
This will host a local proxy api at: **http://localhost:8000**
[**Jump to Code**](https://github.com/BerriAI/litellm/blob/fef4146396d5d87006259e00095a62e3900d6bb4/litellm/proxy.py#L36)
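Since the proxy translates OpenAI-style calls, any OpenAI-format request body can be sent to the local endpoint. A minimal sketch of such a request payload (the model name and `/chat/completions` path are illustrative assumptions, not taken from this doc):

```python
import json

# OpenAI-style chat completion request body; the proxy translates this
# into the underlying (possibly non-openai) model's native API call.
payload = {
    "model": "gpt-3.5-turbo",  # placeholder; the proxy routes to whatever model it was started with
    "messages": [{"role": "user", "content": "Hello, how are you?"}],
}

# To actually send it, POST to the local proxy, e.g.:
#   requests.post("http://localhost:8000/chat/completions", json=payload)
print(json.dumps(payload))
```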
## [Advanced] setting api base
If your model is running locally or on a custom endpoint, pass in the `api_base` as well:
```shell
litellm --model huggingface/meta-llama/llama2 --api_base https://my-endpoint.huggingface.cloud
```
## test it
```shell