Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-26 03:04:13 +00:00)
Latest commit:

* track api key and team in prom latency metric
* add test for latency metric
* test prometheus success metrics for latency
* track team and key labels for deployment failures
* add test for litellm_deployment_failure_responses_total
* fix checks for premium user on prometheus
* log_success_fallback_event and log_failure_fallback_event
* log original_exception in log_success_fallback_event
* track key, team and exception status and class on fallback metrics
* use get_standard_logging_metadata
* fix import error
* track litellm_deployment_successful_fallbacks
* add test test_proxy_fallback_metrics
* add log log_success_fallback_event
* fix test prometheus
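The commit items above describe Prometheus metrics that carry API key and team labels, plus exception details on failures. The sketch below shows how such labelled metrics could be declared with `prometheus_client`; `litellm_deployment_failure_responses_total` is named in the commit message, but the latency metric name and the exact label names here are assumptions for illustration, not litellm's actual definitions.

```python
# Minimal sketch (not litellm's actual code) of Prometheus metrics with
# API-key/team labels like those described in the commit message above.
from prometheus_client import Counter, Histogram

# Request latency tracked per API key and team.
# Metric and label names here are assumptions.
request_latency_seconds = Histogram(
    "litellm_request_latency_seconds",
    "End-to-end request latency, labelled by API key and team",
    labelnames=["api_key_alias", "team"],
)

# Deployment failures: the metric name appears in the commit message,
# the label set is an assumption.
deployment_failure_responses_total = Counter(
    "litellm_deployment_failure_responses_total",
    "Failed LLM deployment responses, labelled by key, team and exception",
    labelnames=["api_key_alias", "team", "exception_status", "exception_class"],
)

# Example usage: observe a latency sample and count one failure.
request_latency_seconds.labels(api_key_alias="key-1", team="search").observe(0.42)
deployment_failure_responses_total.labels(
    api_key_alias="key-1",
    team="search",
    exception_status="429",
    exception_class="RateLimitError",
).inc()
```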
Directory contents:

* router_callbacks
* batch_utils.py
* client_initalization_utils.py
* cooldown_cache.py
* cooldown_callbacks.py
* cooldown_handlers.py
* fallback_event_handlers.py
* handle_error.py
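fallback_event_handlers.py relates to the log_success_fallback_event / log_failure_fallback_event hooks and the litellm_deployment_successful_fallbacks counter mentioned in the commit message. The sketch below shows, under an assumed signature and assumed metadata keys (this is not litellm's real implementation), how such a hook could record the key, team, and the exception that triggered the fallback.

```python
# Hypothetical sketch of a fallback event hook; the function name and the
# counter name come from the commit message, everything else is assumed.
from prometheus_client import Counter

litellm_deployment_successful_fallbacks = Counter(
    "litellm_deployment_successful_fallbacks",
    "Successful fallbacks, labelled by key, team and triggering exception",
    labelnames=["api_key_alias", "team", "exception_status", "exception_class"],
)

async def log_success_fallback_event(
    original_model_group: str,
    kwargs: dict,
    original_exception: Exception,
) -> None:
    """Record a successful fallback (signature and metadata keys are assumptions)."""
    metadata = (kwargs.get("litellm_params", {}) or {}).get("metadata", {}) or {}
    litellm_deployment_successful_fallbacks.labels(
        api_key_alias=metadata.get("user_api_key_alias", "unknown"),
        team=metadata.get("user_api_key_team_id", "unknown"),
        exception_status=str(getattr(original_exception, "status_code", "unknown")),
        exception_class=type(original_exception).__name__,
    ).inc()
```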