From b29bb8454fdf417ef840bb0b2371807fa09fe09b Mon Sep 17 00:00:00 2001
From: Krrish Dholakia
Date: Tue, 26 Sep 2023 14:46:58 -0700
Subject: [PATCH] update proxy server docs

---
 docs/my-website/docs/proxy_server.md | 14 ++++++++++++--
 1 file changed, 12 insertions(+), 2 deletions(-)

diff --git a/docs/my-website/docs/proxy_server.md b/docs/my-website/docs/proxy_server.md
index 2d5c801b4..1fda83938 100644
--- a/docs/my-website/docs/proxy_server.md
+++ b/docs/my-website/docs/proxy_server.md
@@ -5,17 +5,27 @@ Use this to spin up a proxy api to translate openai api calls to any non-openai
 
 This works for async + streaming as well.
 
 ## usage
-```python
+```shell
 pip install litellm
 ```
 
-```python
+```shell
 litellm --model
 ```
 
 This will host a local proxy api at : **http://localhost:8000**
 
 [**Jump to Code**](https://github.com/BerriAI/litellm/blob/fef4146396d5d87006259e00095a62e3900d6bb4/litellm/proxy.py#L36)
+
+## [Advanced] setting api base
+If your model is running locally or on a custom endpoint,
+
+pass in the api_base as well:
+
+```shell
+litellm --model huggingface/meta-llama/llama2 --api_base https://my-endpoint.huggingface.cloud
+```
+
 ## test it
 ```curl
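
The patch above is truncated just after the "## test it" heading, so the actual test snippet is not shown. As a rough illustration only, here is a sketch of what a request to such a proxy might look like, assuming it exposes an OpenAI-compatible chat completions route at `http://localhost:8000` (the URL path, model name, and payload shape here are assumptions, not taken from this patch):

```python
import json

# Hypothetical endpoint: assumes the proxy serves an OpenAI-style
# chat completions route at localhost:8000 (not confirmed by the patch).
PROXY_URL = "http://localhost:8000/chat/completions"

# Build an OpenAI-style chat completion request body.
payload = {
    "model": "gpt-3.5-turbo",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello, proxy!"}],
}

# Serialize the payload as it would be sent in the POST body,
# e.g. via `curl -d "$body" $PROXY_URL`.
body = json.dumps(payload)
print(body)
```

This only constructs and prints the request body; actually sending it requires the proxy from `litellm --model ...` to be running locally.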