diff --git a/docs/my-website/docs/observability/integrations.md b/docs/my-website/docs/observability/integrations.md
index b5d27ccfc..c6a158f69 100644
--- a/docs/my-website/docs/observability/integrations.md
+++ b/docs/my-website/docs/observability/integrations.md
@@ -4,6 +4,7 @@
 | ----------- | -------------------------------------------------------- | ---------------------------------------- |
 | Promptlayer | `PROMPLAYER_API_KEY` | `litellm.success_callback=["promptlayer"]` |
 | LLMonitor | `LLMONITOR_APP_ID` | `litellm.success_callback=["llmonitor"]` |
+| LangFuse | `LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY` | `litellm.success_callback=["langfuse"]` |
 | Sentry | `SENTRY_API_URL` | `litellm.success_callback=["sentry"]` |
 | Posthog | `POSTHOG_API_KEY`,`POSTHOG_API_URL` | `litellm.success_callback=["posthog"]` |
 | Slack | `SLACK_API_TOKEN`,`SLACK_API_SECRET`,`SLACK_API_CHANNEL` | `litellm.success_callback=["slack"]` |
diff --git a/docs/my-website/docs/observability/langfuse_integration.md b/docs/my-website/docs/observability/langfuse_integration.md
new file mode 100644
index 000000000..edf0f6cc7
--- /dev/null
+++ b/docs/my-website/docs/observability/langfuse_integration.md
@@ -0,0 +1,41 @@
+# LangFuse Tutorial
+
+LangFuse is open-source observability & analytics for LLM apps: detailed production traces and a granular view of quality, cost, and latency.
+
+## Use Langfuse to log requests across all LLM providers (OpenAI, Azure, Anthropic, Cohere, Replicate, PaLM)
+
+liteLLM provides `callbacks`, making it easy for you to log data depending on the status of your responses.
+
+### Using Callbacks
+
+Get your Langfuse API keys from https://cloud.langfuse.com/
+
+Use just 2 lines of code to instantly log your responses **across all providers** with Langfuse:
+
+```python
+litellm.success_callback = ["langfuse"]
+```
+
+Complete code:
+
+```python
+import os
+import litellm
+from litellm import completion
+
+## set env variables
+os.environ["LANGFUSE_PUBLIC_KEY"] = "your key"
+os.environ["LANGFUSE_SECRET_KEY"] = "your key"
+
+os.environ["OPENAI_API_KEY"] = "your openai key"
+os.environ["COHERE_API_KEY"] = "your cohere key"
+
+# set callbacks
+litellm.success_callback = ["langfuse"]
+
+# openai call
+response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}])
+
+# cohere call
+response = completion(model="command-nightly", messages=[{"role": "user", "content": "Hi 👋 - i'm cohere"}])
+```
diff --git a/docs/my-website/img/langfuse.gif b/docs/my-website/img/langfuse.gif
new file mode 100644
index 000000000..b31585250
Binary files /dev/null and b/docs/my-website/img/langfuse.gif differ
diff --git a/docs/my-website/sidebars.js b/docs/my-website/sidebars.js
index ba5551c3b..764b01522 100644
--- a/docs/my-website/sidebars.js
+++ b/docs/my-website/sidebars.js
@@ -65,6 +65,7 @@ const sidebars = {
         "observability/callbacks",
         "observability/integrations",
         "observability/promptlayer_integration",
+        "observability/langfuse_integration",
         "observability/traceloop_integration",
         "observability/llmonitor_integration",
         "observability/helicone_integration",