sentry redact messages

parent 01478c9148
commit f55838d185

2 changed files with 20 additions and 0 deletions
@@ -40,5 +40,9 @@ response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content
print(response)
```

## Redacting Messages, Response Content from Sentry Logging

Set `litellm.turn_off_message_logging=True`. This will prevent the messages and responses from being logged to Sentry, but request metadata will still be logged.
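
For instance, building on the snippet above, the flag can be set alongside the callback configuration before any `completion()` call. This is a minimal sketch; the `failure_callback = ["sentry"]` setup and the placeholder DSN follow the Sentry example earlier in this doc and are assumptions if your configuration differs.

```python
import os

import litellm
from litellm import completion

# Sentry callback setup (placeholder DSN -- use your project's DSN;
# assumes OPENAI_API_KEY is already set in the environment)
os.environ["SENTRY_DSN"] = "your-sentry-dsn"
litellm.failure_callback = ["sentry"]

# Redact message and response content from anything sent to Sentry;
# request metadata is still logged
litellm.turn_off_message_logging = True

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi 👋"}],
)
print(response)
```
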
[Let us know](https://github.com/BerriAI/litellm/issues/new?assignees=&labels=enhancement&projects=&template=feature_request.yml&title=%5BFeature%5D%3A+) if you need any additional options from Sentry.
@@ -569,6 +569,22 @@ curl -X POST 'http://0.0.0.0:4000/key/generate' \

All requests made with these keys will log data to their team-specific logging.

### Redacting Messages, Response Content from Langfuse Logging

Set `litellm.turn_off_message_logging=True`. This will prevent the messages and responses from being logged to Langfuse, but request metadata will still be logged.

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
litellm_settings:
  success_callback: ["langfuse"]
  turn_off_message_logging: True
```

## Logging Proxy Input/Output - DataDog

We will use the `--config` to set `litellm.success_callback = ["datadog"]`. This will log all successful LLM calls to DataDog.
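
The hunk ends here, but as a rough sketch of what that `--config` file could contain, mirroring the Langfuse example above (the model name and the note about DataDog credentials are assumptions, not part of this diff):

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
litellm_settings:
  # log successful LLM calls to DataDog
  # (DataDog credentials, e.g. DD_API_KEY, are assumed to be available in the environment)
  success_callback: ["datadog"]
```
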