From 94c1d71b2c03ba8f5e033fe25ef9aeca9e915769 Mon Sep 17 00:00:00 2001
From: ishaan-jaff
Date: Thu, 23 Nov 2023 11:13:26 -0800
Subject: [PATCH] (docs) proxy server

---
 docs/my-website/docs/simple_proxy.md | 13 +++++++------
 1 file changed, 7 insertions(+), 6 deletions(-)

diff --git a/docs/my-website/docs/simple_proxy.md b/docs/my-website/docs/simple_proxy.md
index 87d91065a5..364b2241ab 100644
--- a/docs/my-website/docs/simple_proxy.md
+++ b/docs/my-website/docs/simple_proxy.md
@@ -74,6 +74,13 @@ print(response)
 
+### Server Endpoints
+- POST `/chat/completions` - chat completions endpoint to call 100+ LLMs
+- POST `/completions` - completions endpoint
+- POST `/embeddings` - embedding endpoint for Azure, OpenAI, Huggingface endpoints
+- GET `/models` - available models on server
+- POST `/key/generate` - generate a key to access the proxy
+
 ### Supported LLMs
 All LiteLLM supported LLMs are supported on the Proxy. See all [supported llms](https://docs.litellm.ai/docs/providers)
@@ -225,12 +232,6 @@ $ litellm --model command-nightly
 
-### Server Endpoints
-- POST `/chat/completions` - chat completions endpoint to call 100+ LLMs
-- POST `/completions` - completions endpoint
-- POST `/embeddings` - embedding endpoint for Azure, OpenAI, Huggingface endpoints
-- GET `/models` - available models on server
-
 ## Using with OpenAI compatible projects
 LiteLLM allows you to set `openai.api_base` to the proxy server and use all LiteLLM supported LLMs in any OpenAI supported project
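
The endpoints listed in the patch follow the OpenAI-style request shape. As a minimal sketch of how a client would target them — assuming the proxy is running locally at `http://0.0.0.0:8000` (a hypothetical base URL for illustration; the actual host/port depends on how `litellm` was started) — a request can be built like this:

```python
import json
import urllib.request

# Hypothetical base URL for a locally running LiteLLM proxy.
PROXY_BASE = "http://0.0.0.0:8000"

def build_chat_request(model, messages):
    """Build a POST request for the proxy's /chat/completions endpoint.

    The JSON body mirrors the OpenAI chat-completions shape:
    a model name plus a list of {"role", "content"} messages.
    """
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        url=f"{PROXY_BASE}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "command-nightly",
    [{"role": "user", "content": "what llm are you?"}],
)
print(req.full_url)      # http://0.0.0.0:8000/chat/completions
print(req.get_method())  # POST
```

The same pattern applies to `/completions` and `/embeddings` (swap the path and body), while `/models` is a plain GET and `/key/generate` is a POST used to mint proxy access keys.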