forked from phoenix/litellm-mirror

docs

commit 122289c07f (parent 256f1aa060)
1 changed file with 5 additions and 3 deletions
@@ -8,20 +8,22 @@ LiteLLM offers a UI to:
<Image img={require('../../img/dashboard.png')} alt="Dashboard" />

## Getting Started
1 Step:

* Make a `litellm.completion()` call 👉 get your debugging dashboard

Example Code: Regular `litellm.completion()` call:
```python
from litellm import completion

# a single user message; completion() returns an OpenAI-style response object
messages = [{"role": "user", "content": "Hi 👋 - i'm openai"}]

response = completion(model="gpt-3.5-turbo", messages=messages)
```
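The `gpt-3.5-turbo` call above is served by OpenAI, so an API key needs to be available before `completion()` runs. A minimal sketch, assuming you provide it through the standard `OPENAI_API_KEY` environment variable (the key value below is a placeholder):

```python
import os

# Assumption: the completion() call above routes to OpenAI, so a key must be
# available; litellm reads it from the OPENAI_API_KEY environment variable.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, replace with your own key
```

With the key set, the snippet above runs as-is and the dashboard link is printed when `completion()` is called.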
## Completion() Output with dashboard

All `completion()` calls print a link to your session dashboard.
<Image img={require('../../img/dash_output.png')} alt="Dashboard" />

Example output from a `litellm` completion call:
```bash
Here's your LiteLLM Dashboard 👉 https://admin.litellm.ai/88911906-d786-44f2-87c7-9720e6031b45
<OpenAIObject chat.completion id=chatcmpl-7r6LtlUXYYu0QayfhS3S0OzroiCel at 0x7fb307375030> JSON: {