(docs) /metrics endpoint

Ishaan Jaff 2024-03-29 13:36:24 -07:00
parent 15fe00678a
commit c2283235a1
2 changed files with 46 additions and 0 deletions


@ -0,0 +1,45 @@
# Grafana, Prometheus metrics [BETA]
LiteLLM exposes a `/metrics` endpoint for Prometheus to poll.
## Quick Start
Add this to your proxy config.yaml
```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo

litellm_settings:
  success_callback: ["prometheus"]
```
Start the proxy
```shell
litellm --config config.yaml --debug
```
Test Request
```shell
curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
"model": "gpt-3.5-turbo",
"messages": [
{
"role": "user",
"content": "what llm are you"
}
]
}'
```
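After the proxy has served a request, you can inspect the raw metrics directly. A minimal sketch, assuming the proxy is running locally on its default port `4000`:

```shell
# Fetch the Prometheus exposition text from the proxy.
# Assumes the LiteLLM proxy is listening on 0.0.0.0:4000 as started above.
curl 'http://0.0.0.0:4000/metrics'
```

The response is plain-text Prometheus exposition format, which Prometheus can scrape as-is.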
## Metrics Tracked
| Metric Name | Description |
|----------------------|--------------------------------------|
| `litellm_requests_metric` | Number of requests made, per `"user", "key", "model"` |
| `litellm_spend_metric` | Total Spend, per `"user", "key", "model"` |
| `litellm_total_tokens` | input + output tokens per `"user", "key", "model"` |
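To have Prometheus actually poll the endpoint, a scrape job can be added to `prometheus.yml`. This is a sketch: the job name is an arbitrary label, and the target should point at wherever your proxy runs.

```yaml
scrape_configs:
  - job_name: "litellm-proxy"      # arbitrary label for this scrape job
    metrics_path: /metrics          # the endpoint the LiteLLM proxy exposes
    static_configs:
      - targets: ["0.0.0.0:4000"]  # host:port where the proxy listens
```

With this in place, the metrics in the table above become queryable in Prometheus (and graphable in Grafana) after the first scrape interval.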


@ -68,6 +68,7 @@ const sidebars = {
"proxy/streaming_logging",
]
},
"proxy/grafana_metrics",
"proxy/call_hooks",
"proxy/rules",
"proxy/cli",