import Image from '@theme/IdealImage';
# Logfire - Logging LLM Input/Output

Logfire is open-source observability & analytics for LLM apps: detailed production traces and a granular view of quality, cost, and latency.
<Image img={require('../../img/logfire.png')} />
:::info
We want to learn how we can make the callbacks better! Meet the LiteLLM founders or join our discord
:::
## Pre-Requisites

Ensure you have run `pip install logfire` for this integration

```shell
pip install logfire litellm
```
## Quick Start

Get your Logfire token from [Logfire](https://logfire.pydantic.dev/)
```python
litellm.success_callback = ["logfire"]
litellm.failure_callback = ["logfire"]  # logs errors to logfire
```
```python
# pip install logfire
import litellm
import os

# from https://logfire.pydantic.dev/
os.environ["LOGFIRE_TOKEN"] = ""

# LLM API Keys
os.environ["OPENAI_API_KEY"] = ""

# set logfire as a callback, litellm will send the data to logfire
litellm.success_callback = ["logfire"]

# openai call
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
)
```
## Support & Talk to Founders
- Schedule Demo 👋
- Community Discord 💭
- Our numbers 📞 +1 (770) 8783-106 / +1 (412) 618-6238
- Our emails ✉️ ishaan@berri.ai / krrish@berri.ai