diff --git a/docs/my-website/docs/simple_proxy.md b/docs/my-website/docs/simple_proxy.md
index 651e49eb7e..365d6d3728 100644
--- a/docs/my-website/docs/simple_proxy.md
+++ b/docs/my-website/docs/simple_proxy.md
@@ -710,6 +710,34 @@ https://api.openai.com/v1/chat/completions \
 ```
 
 ## Logging Proxy Input/Output - Langfuse
+We will use the proxy's `--config` flag to log input/output to Langfuse
+- Setting `litellm.success_callback = ["langfuse"]` logs all successful LLM calls to Langfuse
+
+**Step 1**: Create a `config.yaml` file
+```yaml
+model_list:
+  - model_name: gpt-3.5-turbo
+    litellm_params:
+      model: gpt-3.5-turbo
+  - model_name: gpt-4-team1
+    litellm_params:
+      model: azure/chatgpt-v-2
+      api_base: https://openai-gpt-4-test-v-1.openai.azure.com/
+      api_version: "2023-05-15"
+litellm_settings:
+  success_callback: ["langfuse"]
+```
+
+**Step 2**: Start the proxy with the config, then make a test request
+```shell
+litellm --config config.yaml --debug
+```
+
+Test Request
+```shell
+litellm --test
+```
+
 
 ## Proxy CLI Arguments
 
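
One note beyond the diff: the `langfuse` success callback needs Langfuse credentials available to the proxy process at startup. A minimal sketch, assuming the standard Langfuse environment variable names (`LANGFUSE_PUBLIC_KEY` / `LANGFUSE_SECRET_KEY`) — check the Langfuse docs for your deployment:

```shell
# Assumed Langfuse credential variables (placeholders, not real keys)
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."

# Start the proxy with the config that enables the langfuse callback
litellm --config config.yaml --debug
```

If the keys are missing or wrong, calls will still succeed but nothing will appear in the Langfuse dashboard, so this is worth checking first when debugging.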