forked from phoenix/litellm-mirror

docs(user_keys.md): improve user keys docs

parent 3026e5aa58
commit 7ce7516621

1 changed file with 33 additions and 5 deletions
@@ -8,11 +8,6 @@ Allows your users to pass in their OpenAI API key (any LiteLLM supported provide

Here's how to do it:

<Tabs>
<TabItem value="openai-py" label="OpenAI Python">

Pass in the litellm_params (e.g. api_key, api_base, etc.) via the `extra_body` parameter in the OpenAI client.

```python
import openai
client = openai.OpenAI(
@@ -31,9 +26,42 @@ response = client.chat.completions.create(model="gpt-3.5-turbo", messages = [
print(response)
```

More examples:
<Tabs>
<TabItem value="azure" label="Azure Credentials">

Pass in the litellm_params (e.g. api_key, api_base, etc.) via the `extra_body` parameter in the OpenAI client.

```python
import openai
client = openai.OpenAI(
    api_key="sk-1234",
    base_url="http://0.0.0.0:8000"
)

# request sent to model set on litellm proxy, `litellm --model`
response = client.chat.completions.create(model="gpt-3.5-turbo", messages=[
    {
        "role": "user",
        "content": "this is a test request, write a short poem"
    }
],
extra_body={
    "api_key": "my-azure-key",
    "api_base": "my-azure-base",
    "api_version": "my-azure-version"
})  # 👈 User Key

print(response)
```
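On the wire, `extra_body` simply merges these keys into the top level of the JSON payload the client POSTs to the proxy, so the same user key could be sent from any HTTP client. A minimal sketch of that payload (the `/chat/completions` endpoint path in the comment is an assumption based on the proxy's OpenAI-compatible routes):

```python
import json

# The litellm params (api_key, api_base, api_version) sit at the top
# level of the JSON body, alongside the standard OpenAI fields --
# this is what `extra_body` merges in for you.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "this is a test request, write a short poem"}
    ],
    # user-supplied credentials (👈 User Key)
    "api_key": "my-azure-key",
    "api_base": "my-azure-base",
    "api_version": "my-azure-version",
}

# Send it with any HTTP client, e.g. (endpoint path assumed):
# requests.post("http://0.0.0.0:8000/chat/completions",
#               headers={"Authorization": "Bearer sk-1234"},
#               json=payload)
print(json.dumps(payload, indent=2))
```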

</TabItem>
<TabItem value="openai-js" label="OpenAI JS">

For JS, the OpenAI client accepts passing params in the `create(..)` body as normal.

```javascript
const { OpenAI } = require('openai');
```