diff --git a/docs/my-website/docs/observability/integrations.md b/docs/my-website/docs/observability/integrations.md index ef8891119d..acd37aa333 100644 --- a/docs/my-website/docs/observability/integrations.md +++ b/docs/my-website/docs/observability/integrations.md @@ -9,6 +9,6 @@ | Weights & Biases | `WANDB_API_KEY` | `litellm.success_callback=["wandb"]` | | Sentry | `SENTRY_API_URL` | `litellm.success_callback=["sentry"]` | | Posthog | `POSTHOG_API_KEY`,`POSTHOG_API_URL` | `litellm.success_callback=["posthog"]` | -| Slack | `SLACK_API_TOKEN`,`SLACK_API_SECRET`,`SLACK_API_CHANNEL` | `litellm.success_callback=["slack"]` | +| Slack | `SLACK_WEBHOOK_URL` | `litellm.success_callback=["slack"]` | | Traceloop | `TRACELOOP_API_TOKEN` | `litellm.success_callback=["traceloop"]` | | Helicone | `HELICONE_API_TOKEN` | `litellm.success_callback=["helicone"]` | diff --git a/docs/my-website/docs/observability/slack_integration.md b/docs/my-website/docs/observability/slack_integration.md new file mode 100644 index 0000000000..162ceb450f --- /dev/null +++ b/docs/my-website/docs/observability/slack_integration.md @@ -0,0 +1,93 @@ +import Image from '@theme/IdealImage'; + +# Slack - Logging LLM Input/Output, Exceptions + + + +:::info +We want to learn how we can make the callbacks better! 
Meet the LiteLLM [founders](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version) or +join our [discord](https://discord.gg/wuPM9dRgDw) +::: + +## Pre-Requisites + +### Step 1 +```shell +pip install litellm +``` + +### Step 2 +Get a Slack webhook URL from https://api.slack.com/messaging/webhooks + + + +## Quick Start +### Create a custom callback to log to Slack +We create a custom callback to log to a Slack webhook; see [custom callbacks on LiteLLM](https://docs.litellm.ai/docs/observability/custom_callback) +```python +def send_slack_alert( + kwargs, + completion_response, + start_time, + end_time, +): + print( + "in custom slack callback func" + ) + import os + import requests + + # Define the Slack webhook URL + # get it from https://api.slack.com/messaging/webhooks + slack_webhook_url = os.environ['SLACK_WEBHOOK_URL'] # "https://hooks.slack.com/services/<>/<>/<>" + + # Define the text payload, send data available in litellm custom_callbacks + text_payload = f"""LiteLLM Logging: kwargs: {str(kwargs)}\n\nresponse: {str(completion_response)}\n\nstart time: {str(start_time)}, end time: {str(end_time)} + """ + payload = { + "text": text_payload + } + + # Set the headers + headers = { + "Content-type": "application/json" + } + + # Make the POST request + response = requests.post(slack_webhook_url, json=payload, headers=headers) + + # Check the response status + if response.status_code == 200: + print("Message sent successfully to Slack!") + else: + print(f"Failed to send message to Slack. 
Status code: {response.status_code}") + print(response.json()) +``` + +### Pass callback to LiteLLM +```python +litellm.success_callback = [send_slack_alert] +``` + +```python +import litellm +litellm.success_callback = [send_slack_alert] # log success +litellm.failure_callback = [send_slack_alert] # log exceptions + +# this will raise an exception +response = litellm.completion( + model="gpt-2", + messages=[ + { + "role": "user", + "content": "Hi 👋 - i'm openai" + } + ] +) +``` +## Support & Talk to Founders + +- [Schedule Demo 👋](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version) +- [Community Discord 💭](https://discord.gg/wuPM9dRgDw) +- Our numbers 📞 +1 (770) 8783-106 / ‭+1 (412) 618-6238‬ +- Our emails ✉️ ishaan@berri.ai / krrish@berri.ai \ No newline at end of file diff --git a/docs/my-website/img/slack.png b/docs/my-website/img/slack.png new file mode 100644 index 0000000000..1736696ca0 Binary files /dev/null and b/docs/my-website/img/slack.png differ diff --git a/docs/my-website/sidebars.js b/docs/my-website/sidebars.js index 3717f5e832..8339de8f1c 100644 --- a/docs/my-website/sidebars.js +++ b/docs/my-website/sidebars.js @@ -120,6 +120,7 @@ const sidebars = { "observability/wandb_integration", "observability/langfuse_integration", "observability/langsmith_integration", + "observability/slack_integration", "observability/traceloop_integration", "observability/llmonitor_integration", "observability/helicone_integration",