diff --git a/docs/my-website/docs/observability/callbacks.md b/docs/my-website/docs/observability/callbacks.md
index 5a01864dd..1cda57584 100644
--- a/docs/my-website/docs/observability/callbacks.md
+++ b/docs/my-website/docs/observability/callbacks.md
@@ -6,6 +6,7 @@ liteLLM provides `success_callbacks` and `failure_callbacks`, making it easy for
 
 liteLLM supports:
 
+- [Custom Callback Functions](https://docs.litellm.ai/docs/observability/custom_callback)
 - [LLMonitor](https://llmonitor.com/docs)
 - [Helicone](https://docs.helicone.ai/introduction)
 - [Traceloop](https://traceloop.com/docs)
diff --git a/docs/my-website/docs/observability/custom_callback.md b/docs/my-website/docs/observability/custom_callback.md
index ea6ff2c05..41ee5834a 100644
--- a/docs/my-website/docs/observability/custom_callback.md
+++ b/docs/my-website/docs/observability/custom_callback.md
@@ -10,6 +10,11 @@ def custom_callback(
     start_time, end_time    # start/end time
 ):
     # Your custom code here
+    print("LITELLM: in custom callback function")
+    print("kwargs", kwargs)
+    print("completion_response", completion_response)
+    print("start_time", start_time)
+    print("end_time", end_time)
 ```
 
 ## Using Your Custom Callback Function
@@ -35,6 +40,39 @@ print(response)
 ```
 
+### Custom Callback to write to Mixpanel
+
+```python
+import mixpanel
+import litellm
+from litellm import completion
+
+def custom_callback(
+    kwargs,                 # kwargs to completion
+    completion_response,    # response from completion
+    start_time, end_time    # start/end time
+):
+    # Your custom code here
+    mixpanel.track("LLM Response", {"llm_response": completion_response})
+
+
+# Assign the custom callback function
+litellm.success_callback = [custom_callback]
+
+response = completion(
+    model="gpt-3.5-turbo",
+    messages=[
+        {
+            "role": "user",
+            "content": "Hi 👋 - i'm openai"
+        }
+    ]
+)
+
+print(response)
+
+```
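
A note on the Mixpanel hunk above: the official `mixpanel` Python client exposes `Mixpanel(project_token).track(distinct_id, event_name, properties)` rather than a module-level `mixpanel.track`, so the snippet as written would likely fail at runtime. Below is a minimal sketch of the same success-callback contract with the tracker stubbed out so it runs without a Mixpanel token; the `StubTracker` class, the `"user_id"` distinct id, and the timestamps are illustrative assumptions, not part of liteLLM or the PR:

```python
from datetime import datetime

class StubTracker:
    """Stand-in for mixpanel.Mixpanel; records events instead of sending them."""
    def __init__(self):
        self.events = []

    def track(self, distinct_id, event_name, properties):
        # Same call shape as Mixpanel.track in the official Python client
        self.events.append((distinct_id, event_name, properties))

mp = StubTracker()

def custom_callback(kwargs, completion_response, start_time, end_time):
    # Forward the response and measured latency to the tracker
    latency_s = (end_time - start_time).total_seconds()
    mp.track("user_id", "LLM Response",
             {"llm_response": str(completion_response), "latency_s": latency_s})

# Simulate liteLLM invoking the success callback after a completion
start = datetime(2023, 1, 1, 12, 0, 0)
end = datetime(2023, 1, 1, 12, 0, 2)
custom_callback({"model": "gpt-3.5-turbo"}, {"choices": []}, start, end)

print(mp.events[0][1])               # → LLM Response
print(mp.events[0][2]["latency_s"])  # → 2.0
```

Swapping `StubTracker` for a real `Mixpanel("YOUR_PROJECT_TOKEN")` instance keeps the callback body unchanged, since both expose the same `track` signature.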