diff --git a/docs/my-website/docs/tutorials/openweb_ui.md b/docs/my-website/docs/tutorials/openweb_ui.md
index d020155e75..94ab0447bd 100644
--- a/docs/my-website/docs/tutorials/openweb_ui.md
+++ b/docs/my-website/docs/tutorials/openweb_ui.md
@@ -8,27 +8,55 @@ This guide walks you through connecting OpenWeb UI to LiteLLM. Using LiteLLM wit
 - Access 100+ LLMs on OpenWeb UI
 - Track Spend / Usage, Set Budget Limits
 - Send Request/Response Logs to logging destinations like langfuse, s3, gcs buckets, etc.
-- Set access controls eg. Team 1 can only access gpt-4o, Team 2 can only access o1-preview
+- Set access controls, e.g. control which models OpenWebUI can access
 
 ## Quickstart
 
 - Make sure to setup LiteLLM with the [LiteLLM Getting Started Guide](https://docs.litellm.ai/docs/proxy/docker_quick_start)
 
-### 1. Connect OpenWeb UI to LiteLLM
+## 1. Start LiteLLM & OpenWebUI
+
+Run the following docker compose command to deploy both OpenWebUI and LiteLLM.
+
+```bash
+docker compose up -d
+```
 
 - OpenWebUI starts running on [http://localhost:3000](http://localhost:3000)
 - LiteLLM starts running on [http://localhost:4000](http://localhost:4000)
 
-### 2. Create a Virtual Key on LiteLLM
+
+## 2. Create a Virtual Key on LiteLLM
+
+Virtual Keys are API Keys used to authenticate to LiteLLM Proxy. We will create a Virtual Key that allows OpenWebUI to access LiteLLM.
+
+### 2.1 LiteLLM User Management Hierarchy
+
+On LiteLLM, you can create Organizations, Teams, Users and Virtual Keys. For this tutorial, we will create a Team and a Virtual Key.
+
+- `Organization` - An Organization is a group of Teams (e.g. US Engineering, EU Developer Tools)
+- `Team` - A Team is a group of Users (e.g. OpenWeb UI Team, Data Science Team)
+- `User` - A User is an individual user (an employee or developer, e.g. `krrish@litellm.ai`)
+- `Virtual Key` - A Virtual Key is an API Key that allows you to authenticate to LiteLLM Proxy. A Virtual Key is associated with a User or Team.
+
+Once the Team is created, you can invite Users to the Team. You can read more about LiteLLM's User Management [here](https://docs.litellm.ai/docs/proxy/user_management_heirarchy).
+
+### 2.2 Create a Team on LiteLLM
+
+Navigate to [http://localhost:4000/ui](http://localhost:4000/ui) and create a new team.
+
+
+
+### 2.3 Create a Virtual Key on LiteLLM
 
 Navigate to [http://localhost:4000/ui](http://localhost:4000/ui) and create a new virtual Key.
 
 LiteLLM allows you to specify what models are available on OpenWeb UI (by specifying the models the key will have access to).
 
-
+
 
-### 3. Connect OpenWeb UI to LiteLLM
+## 3. Connect OpenWeb UI to LiteLLM
 
 On OpenWeb UI, navigate to Settings -> Connections and create a new connection to LiteLLM
@@ -38,7 +66,7 @@ Enter the following details:
 
-### 4. Test it
+### 3.1 Test Request
 
 On the top left corner, select models you should only see the models you gave the key access to in Step 2.
@@ -46,6 +74,14 @@ Once you selected a model, enter your message content and click on `Submit`
 
+### 3.2 Tracking Spend / Usage
+
+After your request is made, navigate to `Logs` on the LiteLLM UI, where you can see the Team, Key, Model, Usage, and Cost.
+
+
+
+
+
 ## Render `thinking` content on OpenWeb UI
 
 OpenWebUI requires reasoning/thinking content to be rendered with `<think>` tags. In order to render this for specific models, you can use the `merge_reasoning_content_in_choices` litellm parameter.
diff --git a/docs/my-website/img/litellm_make_key_openweb.gif b/docs/my-website/img/create_key_in_team_oweb.gif
similarity index 50%
rename from docs/my-website/img/litellm_make_key_openweb.gif
rename to docs/my-website/img/create_key_in_team_oweb.gif
index 13b1a81453..d24849b259 100644
Binary files a/docs/my-website/img/litellm_make_key_openweb.gif and b/docs/my-website/img/create_key_in_team_oweb.gif differ
diff --git a/docs/my-website/img/litellm_create_team.gif b/docs/my-website/img/litellm_create_team.gif
new file mode 100644
index 0000000000..e2f12613ec
Binary files /dev/null and b/docs/my-website/img/litellm_create_team.gif differ
diff --git a/litellm/proxy/proxy_config.yaml b/litellm/proxy/proxy_config.yaml
index 2ccb000a47..b64bd84aad 100644
--- a/litellm/proxy/proxy_config.yaml
+++ b/litellm/proxy/proxy_config.yaml
@@ -1,13 +1,10 @@
 model_list:
-  - model_name: model-1
+  - model_name: thinking-us.anthropic.claude-3-7-sonnet-20250219-v1:0
     litellm_params:
-      model: openai/model-1
-      api_key: os.environ/OPENAI_API_KEY_MODEL_1
-      api_base: https://exampleopenaiendpoint-production.up.railway.app/
-  - model_name: model-2
-    litellm_params:
-      model: openai/model-2
-      api_key: os.environ/OPENAI_API_KEY_MODEL_2
-      api_base: https://exampleopenaiendpoint-production.up.railway.app/
+      model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
+      thinking: {"type": "enabled", "budget_tokens": 1024}
+      max_tokens: 1080
+      merge_reasoning_content_in_choices: true
+
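The tutorial above tells readers to run `docker compose up -d`, but no compose file is included in this patch. A minimal sketch of what such a file might look like is shown below — the image tags (`ghcr.io/berriai/litellm:main-latest`, `ghcr.io/open-webui/open-webui:main`), the master-key value, and the volume path are assumptions for illustration, not part of this change:

```yaml
# docker-compose.yml (hypothetical sketch, not part of this patch)
version: "3.9"
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest      # assumed LiteLLM proxy image
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    ports:
      - "4000:4000"                                  # LiteLLM on http://localhost:4000
    volumes:
      - ./litellm/proxy/proxy_config.yaml:/app/config.yaml
    environment:
      - LITELLM_MASTER_KEY=sk-1234                   # placeholder master key
  openwebui:
    image: ghcr.io/open-webui/open-webui:main        # assumed OpenWebUI image
    ports:
      - "3000:8080"                                  # OpenWebUI UI on http://localhost:3000
    depends_on:
      - litellm
```

With a file along these lines in place, `docker compose up -d` would bring up both services on the ports the tutorial references.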