[Feat] Observability integration - Opik by Comet (#6062)
* Added Opik logging and evaluation
* Updated doc examples
* Default tags should be [] in case appending
* WIP
* Work in progress
* Opik integration
* Opik integration
* Revert changes on litellm_logging.py
* Updated Opik integration for synchronous API calls
* Updated Opik documentation

---------

Co-authored-by: Douglas Blank <doug@comet.com>
Co-authored-by: Doug Blank <doug.blank@gmail.com>
This commit is contained in:
parent
89506053a4
commit
4064bfc6dd
7 changed files with 686 additions and 0 deletions
95  docs/my-website/docs/observability/opik_integration.md  (Normal file)

@@ -0,0 +1,95 @@
import Image from '@theme/IdealImage';

# Comet Opik - Logging + Evals

Opik is an open-source, end-to-end [LLM Evaluation Platform](https://www.comet.com/site/products/opik/?utm_source=litelllm&utm_medium=docs&utm_content=intro_paragraph) that helps developers track their LLM prompts and responses during both development and production. Users can define and run evaluations to test their LLM apps before deployment, checking for hallucinations, accuracy, context retrieval, and more!
<Image img={require('../../img/opik.png')} />

:::info

We want to learn how we can make the callbacks better! Meet the LiteLLM [founders](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version) or join our [discord](https://discord.gg/wuPM9dRgDw)

:::

## Pre-Requisites

You can learn more about setting up Opik in the [Opik quickstart guide](https://www.comet.com/docs/opik/quickstart/), and about self-hosting it in the [self-hosting guide](https://www.comet.com/docs/opik/self-host/local_deployment).
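If you already have the Opik SDK installed (`pip install opik`), you can also set your credentials interactively instead of exporting environment variables. A minimal sketch using the `opik.configure()` helper referenced in the full example below:

```python
import opik

# Prompts for your Opik API key and workspace and stores them locally,
# so the OpikLogger callback can pick them up later.
opik.configure()
```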
## Quick Start

Use just 4 lines of code to instantly log your responses **across all providers** with Opik.

Get your Opik API Key by signing up [here](https://www.comet.com/signup?utm_source=litelllm&utm_medium=docs&utm_content=api_key_cell)!
```python
from litellm.integrations.opik.opik import OpikLogger
import litellm

opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]
```

Full examples:
```python
from litellm.integrations.opik.opik import OpikLogger
import litellm
import os

# Configure the Opik API key or call opik.configure()
os.environ["OPIK_API_KEY"] = ""
os.environ["OPIK_WORKSPACE"] = ""

# LLM provider API Keys:
os.environ["OPENAI_API_KEY"] = ""

# set "opik" as a callback, litellm will send the data to an Opik server (such as comet.com)
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

# openai call
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Why is tracking and evaluation of LLMs important?"}
    ]
)
```
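The callback is not limited to synchronous calls. A minimal sketch of the same setup with `litellm.acompletion`, assuming the Opik callback logs async calls the same way and the environment variables above are set:

```python
import asyncio
import litellm
from litellm.integrations.opik.opik import OpikLogger

litellm.callbacks = [OpikLogger()]

async def main():
    # acompletion mirrors completion; the callback logs the call on success
    response = await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "What is LLM observability?"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```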
If you are using LiteLLM within a function tracked with Opik's `@track` decorator, you will need to provide the `current_span_data` field in the `metadata` attribute so that the LLM call is assigned to the correct trace:
```python
from opik import track
from opik.opik_context import get_current_span_data
from litellm.integrations.opik.opik import OpikLogger
import litellm

opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

@track()
def streaming_function(input):
    messages = [{"role": "user", "content": input}]
    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=messages,
        stream=True,  # stream the response so the chunks can be consumed below
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
                "tags": ["streaming-test"],
            },
        },
    )
    return response

response = streaming_function("Why is tracking and evaluation of LLMs important?")
chunks = list(response)
```
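The `opik` metadata dictionary also works outside of `@track`-decorated functions. A short sketch reusing the metadata layout from the example above to attach tags to a plain completion call (the tag values are illustrative):

```python
import litellm
from litellm.integrations.opik.opik import OpikLogger

litellm.callbacks = [OpikLogger()]

# Tags are attached to the logged trace in Opik
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize LLM observability in one line."}],
    metadata={"opik": {"tags": ["quickstart", "no-track"]}},
)
```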
## Support & Talk to Founders

- [Schedule Demo 👋](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version)
- [Community Discord 💭](https://discord.gg/wuPM9dRgDw)
- Our numbers 📞 +1 (770) 8783-106 / +1 (412) 618-6238
- Our emails ✉️ ishaan@berri.ai / krrish@berri.ai
BIN  docs/my-website/img/opik.png  (Normal file)
Binary file not shown (new image, size: 130 KiB).

@@ -293,6 +293,7 @@ const sidebars = {

        "observability/greenscale_integration",
        "observability/supabase_integration",
        "observability/telemetry",
        "observability/opik_integration",
      ],
    },
    {