forked from phoenix/litellm-mirror

docs updated

commit ab48873a34 (parent 122289c07f) - 1 changed file with 12 additions and 23 deletions

import Image from '@theme/IdealImage';

# LiteLLM Client: 1-Click Deploy LLMs + Debug Logs

LiteLLM offers a UI to:

* 1-Click Deploy LLMs - the client stores your api keys + model configurations
* Debug your Call Logs

<Image img={require('../../img/dashboard.png')} alt="Dashboard" />

## Getting Started

<a target="_blank" href="https://colab.research.google.com/drive/1y2ChqxJOwEByThibxYMEpY5P6_RTNjj4?usp=sharing">
  <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>

* Make a `litellm.completion()` call 👉 get your debugging dashboard

Example Code: Regular `litellm.completion()` call:
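
For reference, here is a minimal sketch of such a call (the model name and prompt are illustrative, and an OpenAI key is assumed to be configured):

```python
from litellm import completion

# key set in admin.litellm.ai/<your_email> or via .env, e.g. OPENAI_API_KEY
messages = [{"content": "Hello, how are you?", "role": "user"}]

# regular completion call - prints a link to your session dashboard
response = completion(model="gpt-3.5-turbo", messages=messages)
print(response)
```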

All `completion()` calls print with a link to your session dashboard

<Image img={require('../../img/dash_output.png')} alt="Dashboard" />

Example Output from litellm completion:

```bash
Here's your LiteLLM Dashboard 👉 https://admin.litellm.ai/88911906-d786-44f2-87c7-9720e6031b45
```

Once created, your dashboard is viewable at `admin.litellm.ai/<your_email>` [👋 Tell us if you need better privacy controls](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version?month=2023-08)

See our live dashboard 👉 [admin.litellm.ai](https://admin.litellm.ai/)

## Usage

### Requirements

1. **Needs litellm>=0.1.438**
2. Set `litellm.email`. You can set your user email in two ways, as shown in the sketch below:
   - By setting it on the module - `litellm.email=<your_email>`
   - By setting it as an environment variable - `os.environ["LITELLM_EMAIL"] = "your_email"`
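
A minimal sketch showing both options (the email address is a placeholder):

```python
import os

import litellm

# Option 1: set the email directly on the module
litellm.email = "your_email@example.com"

# Option 2: set it as an environment variable instead
os.environ["LITELLM_EMAIL"] = "your_email@example.com"
```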

## Code Setup

```python
import litellm

# Set your user email - your dashboard is viewable at admin.litellm.ai/<your_email>
litellm.email = "test_email@test.com"
```

## LiteLLM Dashboard - 1-Click Deploy LLMs

LiteLLM allows you to add a new model using the LiteLLM Dashboard.

Go to `admin.litellm.ai/<your_email>`

Navigate to the 'Add New LLM' Section

<Image img={require('../../img/add_model.png')} alt="Dashboard" />

- Select Provider
- Select your LLM
- Add your LLM Key

## LiteLLM Dashboard - Debug Logs

All your `completion()` and `embedding()` call logs are available on `admin.litellm.ai/<your-token>`

See your Logs below

### Using your new LLM - Completion() with the LiteLLM Dashboard

```python
from litellm import embedding, completion

# keys set in admin.litellm.ai/<your_email> or .env OPENAI_API_KEY
messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}])
```
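
Similarly, `embedding()` calls show up in the same debug logs; a minimal sketch (the model name here is illustrative):

```python
from litellm import embedding

# embedding call - logged alongside completion calls on your dashboard
response = embedding(model="text-embedding-ada-002", input=["Hello, how are you?"])
```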