diff --git a/docs/my-website/docs/proxy/user_keys.md b/docs/my-website/docs/proxy/user_keys.md
index ae4516bf6..ef718f3ad 100644
--- a/docs/my-website/docs/proxy/user_keys.md
+++ b/docs/my-website/docs/proxy/user_keys.md
@@ -8,11 +8,6 @@ Allows your users to pass in their OpenAI API key (any LiteLLM supported provide
 
 Here's how to do it:
 
-
-
-
-Pass in the litellm_params (E.g. api_key, api_base, etc.) via the `extra_body` parameter in the OpenAI client.
-
 ```python
 import openai
 client = openai.OpenAI(
@@ -31,9 +26,42 @@ response = client.chat.completions.create(model="gpt-3.5-turbo", messages = [
 print(response)
 ```
+
+More examples:
+
+
+
+Pass in the litellm_params (E.g. api_key, api_base, etc.) via the `extra_body` parameter in the OpenAI client.
+
+```python
+import openai
+client = openai.OpenAI(
+    api_key="sk-1234",
+    base_url="http://0.0.0.0:8000"
+)
+
+# request sent to model set on litellm proxy, `litellm --model`
+response = client.chat.completions.create(model="gpt-3.5-turbo", messages = [
+    {
+        "role": "user",
+        "content": "this is a test request, write a short poem"
+    }
+],
+    extra_body={
+      "api_key": "my-azure-key",
+      "api_base": "my-azure-base",
+      "api_version": "my-azure-version"
+    }) # 👈 User Key
+
+print(response)
+```
+
+
+For JS, the OpenAI client accepts passing params in the `create(..)` body as normal.
+
 ```javascript
 const { OpenAI } = require('openai');