diff --git a/docs/my-website/docs/proxy/logging.md b/docs/my-website/docs/proxy/logging.md
index bdd75d647c..2aa069b7ac 100644
--- a/docs/my-website/docs/proxy/logging.md
+++ b/docs/my-website/docs/proxy/logging.md
@@ -3,13 +3,13 @@ import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
-# 🔎 Logging - Custom Callbacks, Langfuse, ClickHouse, s3 Bucket, Sentry, OpenTelemetry, Athina
+# 🔎 Logging - Custom Callbacks, DataDog, Langfuse, s3 Bucket, Sentry, OpenTelemetry, Athina
-Log Proxy Input, Output, Exceptions using Custom Callbacks, Langfuse, OpenTelemetry, LangFuse, DynamoDB, s3 Bucket
+Log Proxy Input, Output, Exceptions using Custom Callbacks, DataDog, Langfuse, OpenTelemetry, DynamoDB, s3 Bucket
- [Async Custom Callbacks](#custom-callback-class-async)
- [Async Custom Callback APIs](#custom-callback-apis-async)
-- [Logging to ClickHouse](#logging-proxy-inputoutput---clickhouse)
+- [Logging to DataDog](#logging-proxy-inputoutput---datadog)
- [Logging to Langfuse](#logging-proxy-inputoutput---langfuse)
- [Logging to s3 Buckets](#logging-proxy-inputoutput---s3-buckets)
- [Logging to DynamoDB](#logging-proxy-inputoutput---dynamodb)
@@ -539,32 +539,8 @@ print(response)
-## Logging Proxy Input/Output - Clickhouse
-We will use the `--config` to set `litellm.success_callback = ["clickhouse"]` this will log all successfull LLM calls to ClickHouse DB
-
-### [Optional] - Docker Compose - LiteLLM Proxy + Self Hosted Clickhouse DB
-Use this docker compose yaml to start LiteLLM Proxy + Clickhouse DB
-```yaml
-version: "3.9"
-services:
- litellm:
- image: ghcr.io/berriai/litellm:main-latest
- volumes:
- - ./proxy_server_config.yaml:/app/proxy_server_config.yaml # mount your litellm config.yaml
- ports:
- - "4000:4000"
- environment:
- - AZURE_API_KEY=sk-123
- clickhouse:
- image: clickhouse/clickhouse-server
- environment:
- - CLICKHOUSE_DB=litellm-test
- - CLICKHOUSE_USER=admin
- - CLICKHOUSE_DEFAULT_ACCESS_MANAGEMENT=1
- - CLICKHOUSE_PASSWORD=admin
- ports:
- - "8123:8123"
-```
+## Logging Proxy Input/Output - DataDog
+We will use the `--config` to set `litellm.success_callback = ["datadog"]`. This will log all successful LLM calls to DataDog.
**Step 1**: Create a `config.yaml` file and set `litellm_settings`: `success_callback`
```yaml
@@ -573,43 +549,16 @@ model_list:
litellm_params:
model: gpt-3.5-turbo
litellm_settings:
- success_callback: ["clickhouse"]
+ success_callback: ["datadog"]
```
-**Step 2**: Set Required env variables for clickhouse
-
-
-
-
-Env Variables for self hosted click house
-```shell
-CLICKHOUSE_HOST = "localhost"
-CLICKHOUSE_PORT = "8123"
-CLICKHOUSE_USERNAME = "admin"
-CLICKHOUSE_PASSWORD = "admin"
-```
-
-
-
-
-
-
-
-Env Variables for cloud click house
+**Step 2**: Set required env variables for DataDog
```shell
-CLICKHOUSE_HOST = "hjs1z7j37j.us-east1.gcp.clickhouse.cloud"
-CLICKHOUSE_PORT = "8443"
-CLICKHOUSE_USERNAME = "default"
-CLICKHOUSE_PASSWORD = "M~PimRs~c3Z6b"
+DD_API_KEY="5f2d0f310***********" # your datadog API Key
+DD_SITE="us5.datadoghq.com" # your datadog base url
```
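+
+For a quick local test, one option (the values below are placeholders) is to export these variables in the same shell session that starts the proxy:
+
+```shell
+# Placeholder DataDog credentials - replace with your own values
+export DD_API_KEY="5f2d0f310***********"
+export DD_SITE="us5.datadoghq.com"
+```
+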
-
-
-
-
-
-
**Step 3**: Start the proxy, make a test request
Start proxy
@@ -618,9 +567,27 @@ litellm --config config.yaml --debug
```
Test Request
+
+```shell
+curl --location 'http://0.0.0.0:4000/chat/completions' \
+  --header 'Content-Type: application/json' \
+  --data '{
+    "model": "gpt-3.5-turbo",
+    "messages": [
+      {
+        "role": "user",
+        "content": "what llm are you"
+      }
+    ],
+    "metadata": {
+      "your-custom-metadata": "custom-field"
+    }
+}'
```
-litellm --test
-```
+
+Expected output on DataDog
+
+<Image img={require('../../img/dd_small1.png')} />
+
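+
+The same test request can also be sent from Python. This is a minimal sketch, assuming the `openai` Python SDK is installed and pointed at the local proxy; the `api_key` value is a placeholder (use a real proxy key if your proxy requires one):
+
+```python
+import openai
+
+# Point the OpenAI client at the LiteLLM proxy (placeholder api_key)
+client = openai.OpenAI(api_key="anything", base_url="http://0.0.0.0:4000")
+
+response = client.chat.completions.create(
+    model="gpt-3.5-turbo",
+    messages=[{"role": "user", "content": "what llm are you"}],
+    extra_body={
+        "metadata": {
+            "your-custom-metadata": "custom-field"  # forwarded to the DataDog log
+        }
+    },
+)
+print(response)
+```
+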
## Logging Proxy Input/Output - s3 Buckets
diff --git a/docs/my-website/img/dd_small1.png b/docs/my-website/img/dd_small1.png
new file mode 100644
index 0000000000..aea8f675df
Binary files /dev/null and b/docs/my-website/img/dd_small1.png differ