From 1e621f716f70802714f79a052e670b649dbcecce Mon Sep 17 00:00:00 2001
From: Krrish Dholakia
Date: Sat, 27 Jul 2024 09:28:53 -0700
Subject: [PATCH] docs(debugging.md): cleanup docs

---
 docs/my-website/docs/proxy/debugging.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)

diff --git a/docs/my-website/docs/proxy/debugging.md b/docs/my-website/docs/proxy/debugging.md
index 38680982a..5cca65417 100644
--- a/docs/my-website/docs/proxy/debugging.md
+++ b/docs/my-website/docs/proxy/debugging.md
@@ -35,6 +35,22 @@ $ litellm --detailed_debug
 os.environ["LITELLM_LOG"] = "DEBUG"
 ```
 
+### Debug Logs
+
+Run the proxy with `--detailed_debug` to view detailed debug logs
+```shell
+litellm --config /path/to/config.yaml --detailed_debug
+```
+
+When making requests you should see the POST request sent by LiteLLM to the LLM on the Terminal output
+```shell
+POST Request Sent from LiteLLM:
+curl -X POST \
+https://api.openai.com/v1/chat/completions \
+-H 'content-type: application/json' -H 'Authorization: Bearer sk-qnWGUIW9****************************************' \
+-d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "this is a test request, write a short poem"}]}'
+```
+
 ## JSON LOGS
 
 Set `JSON_LOGS="True"` in your env: