litellm-mirror/litellm
Ishaan Jaff 70883bc1b8 (feat - proxy) Add status_code to litellm_proxy_total_requests_metric_total (#7293)
* fix _select_model_name_for_cost_calc docstring

* add STATUS_CODE to prometheus

* test prometheus unit tests

* test_prometheus_unit_tests.py

* update Proxy Level Tracking Metrics docs

* fix test_proxy_failure_metrics
2024-12-18 15:55:02 -08:00
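
For context on the metric named in this commit, here is a minimal sketch of how a status_code label can be attached to a Prometheus counter using the prometheus_client library. This is illustrative only, not LiteLLM's actual implementation; the other label names (end_user, api_key_alias, model) and the helper function are assumptions.

```python
# Minimal sketch (illustrative, not LiteLLM's actual code): a Prometheus counter
# that carries a status_code label. prometheus_client appends "_total" to counter
# names, so this is exposed as litellm_proxy_total_requests_metric_total.
from prometheus_client import Counter

proxy_total_requests = Counter(
    "litellm_proxy_total_requests_metric",
    "Total number of requests made to the proxy, by outcome",
    labelnames=["end_user", "api_key_alias", "model", "status_code"],  # label set assumed
)

def record_proxy_request(end_user: str, api_key_alias: str, model: str, status_code: int) -> None:
    # Increment the counter once per request, tagging the HTTP status code
    # so success (2xx) and failure (4xx/5xx) counts can be separated in queries.
    proxy_total_requests.labels(
        end_user=end_user,
        api_key_alias=api_key_alias,
        model=model,
        status_code=str(status_code),
    ).inc()

record_proxy_request("user-123", "prod-key", "gpt-4o", 200)
record_proxy_request("user-123", "prod-key", "gpt-4o", 429)
```

With a label like this in place, a PromQL query such as sum by (status_code) (rate(litellm_proxy_total_requests_metric_total[5m])) can split request traffic by outcome.
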
adapters Litellm remove circular imports (#7232) 2024-12-14 16:28:34 -08:00
assistants rename llms/OpenAI/ -> llms/openai/ (#7154) 2024-12-10 20:14:07 -08:00
batch_completion Litellm vllm refactor (#7158) 2024-12-10 21:48:35 -08:00
batches Code Quality Improvement - use vertex_ai/ as folder name for vertexAI (#7166) 2024-12-11 00:32:41 -08:00
caching Provider Budget Routing - Get Budget, Spend Details (#7063) 2024-12-06 21:14:12 -08:00
deprecated_litellm_server (refactor) caching use LLMCachingHandler for async_get_cache and set_cache (#6208) 2024-10-14 16:34:01 +05:30
files Code Quality Improvement - use vertex_ai/ as folder name for vertexAI (#7166) 2024-12-11 00:32:41 -08:00
fine_tuning Code Quality Improvement - use vertex_ai/ as folder name for vertexAI (#7166) 2024-12-11 00:32:41 -08:00
integrations (feat - proxy) Add status_code to litellm_proxy_total_requests_metric_total (#7293) 2024-12-18 15:55:02 -08:00
litellm_core_utils (fix) unable to pass input_type parameter to Voyage AI embedding mode (#7276) 2024-12-17 19:23:49 -08:00
llms (fix) unable to pass input_type parameter to Voyage AI embedding mode (#7276) 2024-12-17 19:23:49 -08:00
proxy (feat - proxy) Add status_code to litellm_proxy_total_requests_metric_total (#7293) 2024-12-18 15:55:02 -08:00
realtime_api rename llms/OpenAI/ -> llms/openai/ (#7154) 2024-12-10 20:14:07 -08:00
rerank_api LiteLLM Minor Fixes & Improvements (12/05/2024) (#7037) 2024-12-05 00:02:31 -08:00
router_strategy Litellm dev readd prompt caching (#7299) 2024-12-18 15:13:49 -08:00
router_utils Litellm dev readd prompt caching (#7299) 2024-12-18 15:13:49 -08:00
secret_managers (Refactor) Code Quality improvement - remove /prompt_templates/ , base_aws_llm.py from /llms folder (#7164) 2024-12-11 00:02:46 -08:00
types (feat - proxy) Add status_code to litellm_proxy_total_requests_metric_total (#7293) 2024-12-18 15:55:02 -08:00
__init__.py Litellm dev readd prompt caching (#7299) 2024-12-18 15:13:49 -08:00
_logging.py LiteLLM Minor Fixes & Improvements (10/30/2024) (#6519) 2024-11-02 00:44:32 +05:30
_redis.py (redis fix) - fix AbstractConnection.__init__() got an unexpected keyword argument 'ssl' (#6908) 2024-11-25 22:52:44 -08:00
_service_logger.py LiteLLM Minor Fixes & Improvements (12/05/2024) (#7037) 2024-12-05 00:02:31 -08:00
_version.py Litellm ruff linting enforcement (#5992) 2024-10-01 19:44:20 -04:00
budget_manager.py LITELLM: Remove requests library usage (#7235) 2024-12-17 12:50:04 -08:00
constants.py (feat) Add Bedrock knowledge base pass through endpoints (#7267) 2024-12-16 22:19:34 -08:00
cost.json
cost_calculator.py fix _select_model_name_for_cost_calc docstring 2024-12-18 09:39:31 -08:00
exceptions.py Litellm 12 02 2024 (#6994) 2024-12-02 22:00:01 -08:00
main.py (fix) unable to pass input_type parameter to Voyage AI embedding mode (#7276) 2024-12-17 19:23:49 -08:00
model_prices_and_context_window_backup.json build: bump version 2024-12-18 09:21:13 -08:00
py.typed
router.py Litellm dev readd prompt caching (#7299) 2024-12-18 15:13:49 -08:00
scheduler.py (refactor) caching use LLMCachingHandler for async_get_cache and set_cache (#6208) 2024-10-14 16:34:01 +05:30
timeout.py Litellm ruff linting enforcement (#5992) 2024-10-01 19:44:20 -04:00
utils.py (fix) unable to pass input_type parameter to Voyage AI embedding mode (#7276) 2024-12-17 19:23:49 -08:00