diff --git a/docs/my-website/docs/observability/sentry.md b/docs/my-website/docs/observability/sentry.md
index 255dd55cf..5877db661 100644
--- a/docs/my-website/docs/observability/sentry.md
+++ b/docs/my-website/docs/observability/sentry.md
@@ -40,5 +40,9 @@
 response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content
 print(response)
 ```
 
+## Redacting Messages, Response Content from Sentry Logging
+
+Set `litellm.turn_off_message_logging=True`. This will prevent messages and responses from being logged to Sentry, but request metadata will still be logged.
+
 [Let us know](https://github.com/BerriAI/litellm/issues/new?assignees=&labels=enhancement&projects=&template=feature_request.yml&title=%5BFeature%5D%3A+) if you need any additional options from Sentry.
diff --git a/docs/my-website/docs/proxy/logging.md b/docs/my-website/docs/proxy/logging.md
index 48a5955b1..1a5a7f0f0 100644
--- a/docs/my-website/docs/proxy/logging.md
+++ b/docs/my-website/docs/proxy/logging.md
@@ -569,6 +569,22 @@
 curl -X POST 'http://0.0.0.0:4000/key/generate' \
 
 All requests made with these keys will log data to their team-specific logging.
 
+### Redacting Messages, Response Content from Langfuse Logging
+
+Set `litellm.turn_off_message_logging=True`. This will prevent messages and responses from being logged to Langfuse, but request metadata will still be logged.
+
+```yaml
+model_list:
+  - model_name: gpt-3.5-turbo
+    litellm_params:
+      model: gpt-3.5-turbo
+litellm_settings:
+  success_callback: ["langfuse"]
+  turn_off_message_logging: True
+```
+
+
+
 ## Logging Proxy Input/Output - DataDog
 
 We will use the `--config` to set `litellm.success_callback = ["datadog"]`. This will log all successful LLM calls to DataDog.