
import Image from '@theme/IdealImage';

# Phoenix OSS

Open source tracing and evaluation platform.

:::tip

This is community maintained. Please make an issue if you run into a bug: https://github.com/BerriAI/litellm

:::

## Pre-Requisites

Make an account on Phoenix OSS, or self-host your own instance of Phoenix.

## Quick Start

Use just 2 lines of code to instantly log your responses **across all providers** with Phoenix.

You can also use the instrumentor option instead of the callback; see the Phoenix documentation for details.

```python
import litellm

litellm.callbacks = ["arize_phoenix"]
```
```python
import litellm
import os

os.environ["PHOENIX_API_KEY"] = ""  # Necessary only when using Phoenix Cloud
os.environ["PHOENIX_COLLECTOR_HTTP_ENDPOINT"] = ""  # The URL of your Phoenix OSS instance
# This defaults to https://app.phoenix.arize.com/v1/traces for Phoenix Cloud

# LLM API keys
os.environ["OPENAI_API_KEY"] = ""

# set arize_phoenix as a callback; litellm will send trace data to Phoenix
litellm.callbacks = ["arize_phoenix"]

# openai call
response = litellm.completion(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "user", "content": "Hi 👋 - i'm openai"}
  ]
)
```
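
If you self-host Phoenix instead of using Phoenix Cloud, skip the API key and point the HTTP collector endpoint at your own instance. A minimal sketch, assuming a local Phoenix instance on its default port 6006 (adjust the host, port, and path to match your deployment):

```python
import os

import litellm

# Hypothetical self-hosted setup: no PHOENIX_API_KEY needed,
# just the OTLP HTTP traces endpoint of your Phoenix instance
os.environ["PHOENIX_COLLECTOR_HTTP_ENDPOINT"] = "http://localhost:6006/v1/traces"

litellm.callbacks = ["arize_phoenix"]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Tracing to self-hosted Phoenix"}],
)
```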

## Using with LiteLLM Proxy

Add the callback and the Phoenix environment variables to your proxy `config.yaml`:

```yaml
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/fake
      api_key: fake-key
      api_base: https://exampleopenaiendpoint-production.up.railway.app/

litellm_settings:
  callbacks: ["arize_phoenix"]

environment_variables:
  PHOENIX_API_KEY: "d0*****"
  PHOENIX_COLLECTOR_ENDPOINT: "https://app.phoenix.arize.com/v1/traces" # OPTIONAL, for setting the GRPC endpoint
  PHOENIX_COLLECTOR_HTTP_ENDPOINT: "https://app.phoenix.arize.com/v1/traces" # OPTIONAL, for setting the HTTP endpoint
```
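
Once the proxy is running with this config, any request routed through it is traced to Phoenix. A minimal client sketch, assuming the proxy is running locally on LiteLLM's default port 4000 and using the OpenAI SDK:

```python
import openai

# Point the OpenAI client at the LiteLLM proxy (assumed at localhost:4000)
client = openai.OpenAI(
    api_key="anything",  # or your LiteLLM virtual key, if keys are enforced
    base_url="http://0.0.0.0:4000",
)

response = client.chat.completions.create(
    model="gpt-4o",  # matches the model_name in the config.yaml above
    messages=[{"role": "user", "content": "Hi 👋 - this request is traced in Phoenix"}],
)
print(response.choices[0].message.content)
```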

## Support & Talk to Founders