litellm-mirror/litellm/proxy/common_utils
Latest commit 8f86959c32 by Krish Dholakia
Litellm dev 02 27 2025 p6 (#8891)
* fix(http_parsing_utils.py): orjson can throw errors on some emojis in text; default to json.loads (a fallback sketch follows below)

* fix(sagemaker/handler.py): support passing model id on async streaming

* fix(litellm_pre_call_utils.py): Fixes https://github.com/BerriAI/litellm/issues/7237
2025-02-28 14:34:17 -08:00
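
A minimal sketch of the fallback pattern the first bullet describes: try the faster orjson parser and drop back to the stdlib json module when it raises. This is an assumption-laden illustration, not litellm's actual code; the `parse_body` helper name is hypothetical (the real change lives in http_parsing_utils.py).

```python
import json
from typing import Any

try:
    import orjson  # optional fast JSON parser
except ImportError:
    orjson = None


def parse_body(raw: bytes) -> Any:
    """Parse a JSON request body, preferring orjson for speed.

    Per the commit above, orjson can raise on some inputs
    (e.g. certain emoji sequences), so fall back to json.loads.
    """
    if orjson is not None:
        try:
            return orjson.loads(raw)
        except Exception:
            pass  # fall through to the stdlib parser
    return json.loads(raw)
```
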
File | Last commit message | Last commit date
admin_ui_utils.py | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
callback_utils.py | build: merge commit 1b15568af7 | 2025-02-17 21:56:00 -08:00
debug_utils.py | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
encrypt_decrypt_utils.py | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
http_parsing_utils.py | Litellm dev 02 27 2025 p6 (#8891) | 2025-02-28 14:34:17 -08:00
load_config_utils.py | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00
openai_endpoint_utils.py | (feat) /batches Add support for using /batches endpoints in OAI format (#7402) | 2024-12-24 16:58:05 -08:00
proxy_state.py | (feat) UI - Disable Usage Tab once SpendLogs is 1M+ Rows (#7208) | 2024-12-12 18:43:17 -08:00
reset_budget_job.py | (Bug Fix + Better Observability) - BudgetResetJob: (#8562) | 2025-02-15 16:13:08 -08:00
swagger_utils.py | show all error types on swagger | 2024-08-29 18:50:41 -07:00