diff --git a/docs/my-website/docs/proxy/quick_start.md b/docs/my-website/docs/proxy/quick_start.md
index ce3591bcb..141c64603 100644
--- a/docs/my-website/docs/proxy/quick_start.md
+++ b/docs/my-website/docs/proxy/quick_start.md
@@ -43,7 +43,7 @@ litellm --test
 
 This will now automatically route any requests for gpt-3.5-turbo to bigcode starcoder, hosted on huggingface inference endpoints.
 
-### Using LiteLLM Proxy - Curl Request, OpenAI Package, Langchain, Langchain JS
+### Using LiteLLM Proxy - Curl Request, OpenAI Package, Langchain
@@ -198,6 +198,19 @@
 $ export OPENAI_API_KEY=my-api-key
 $ litellm --model openai/<your model name> --api_base <your api base> # e.g. http://0.0.0.0:3000
 ```
+
+
+
+```shell
+$ export VERTEX_PROJECT="hardy-project"
+$ export VERTEX_LOCATION="us-west"
+```
+
+```shell
+$ litellm --model vertex_ai/gemini-pro
+```
+
+
 ```shell
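
The added Vertex AI snippet only covers starting the proxy. For reference, a request against the running proxy would follow the same shape as the existing curl examples in this doc — a minimal sketch, assuming the proxy is listening on its default address (`http://0.0.0.0:8000` here; adjust the port if your deployment differs) and that the generic `gpt-3.5-turbo` model name is routed to the model the proxy was started with, as the doc describes:

```shell
# Hypothetical request to the proxy started with `litellm --model vertex_ai/gemini-pro`.
# Assumes the proxy is reachable at http://0.0.0.0:8000 -- change the host/port to match your setup.
curl --location 'http://0.0.0.0:8000/chat/completions' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "what llm are you"}]
  }'
```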