From 54bf89fa27cf7b9b1ed137ac93f6fc6d52a0210c Mon Sep 17 00:00:00 2001
From: Marc Abramowitz
Date: Fri, 21 Jun 2024 22:10:31 -0700
Subject: [PATCH] Document feature

---
 docs/my-website/docs/proxy/logging.md | 18 ++++++++++++++++++
 1 file changed, 18 insertions(+)

diff --git a/docs/my-website/docs/proxy/logging.md b/docs/my-website/docs/proxy/logging.md
index e9be2b837..f9ed5db3d 100644
--- a/docs/my-website/docs/proxy/logging.md
+++ b/docs/my-website/docs/proxy/logging.md
@@ -210,6 +210,24 @@ litellm_settings:
   turn_off_message_logging: True
 ```
 
+If you have this feature turned on, you can override it for specific requests by
+setting a request header `LiteLLM-Disable-Message-Redaction: true`.
+
+```shell
+curl --location 'http://0.0.0.0:4000/chat/completions' \
+    --header 'Content-Type: application/json' \
+    --header 'LiteLLM-Disable-Message-Redaction: true' \
+    --data '{
+    "model": "gpt-3.5-turbo",
+    "messages": [
+        {
+            "role": "user",
+            "content": "what llm are you"
+        }
+    ]
+}'
+```
+
 ### 🔧 Debugging - Viewing RAW CURL sent from LiteLLM to provider
 
 Use this when you want to view the RAW curl request sent from LiteLLM to the LLM API
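
For reference, a minimal sketch of the same per-request override using the OpenAI Python SDK pointed at the LiteLLM proxy. The proxy URL and model are taken from the curl example in the patch; the `api_key` value is a placeholder assumption, not something the patch specifies.

```python
from openai import OpenAI

# Point the OpenAI client at the LiteLLM proxy from the example above.
client = OpenAI(
    base_url="http://0.0.0.0:4000",
    api_key="sk-1234",  # placeholder; use whatever key your proxy expects
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "what llm are you"}],
    # Per-request override of message redaction, as documented in the patch.
    extra_headers={"LiteLLM-Disable-Message-Redaction": "true"},
)
print(response.choices[0].message.content)
```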