litellm-mirror/litellm, directory listing (latest commit: 2025-04-08 16:00:57 -07:00)
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| anthropic_interface | docs anthropic messages interface | 2025-03-31 16:53:39 -07:00 |
| assistants | refactor(azure.py): refactor to have client init work across all endpoints | 2025-03-11 17:27:24 -07:00 |
| batch_completion | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| batches | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| caching | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| experimental_mcp_client | fix types on tools.py | 2025-03-29 16:48:15 -07:00 |
| files | LiteLLM Minor Fixes & Improvements (04/02/2025) (#9725) | 2025-04-03 11:48:52 -07:00 |
| fine_tuning | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| integrations | Merge pull request #9760 from BerriAI/litellm_prometheus_error_monitoring | 2025-04-04 21:37:28 -07:00 |
| litellm_core_utils | Allow passing thinking param to litellm proxy via client sdk + Code QA Refactor on get_optional_params (get correct values) (#9386) | 2025-04-07 21:04:11 -07:00 |
| llms | Allow passing thinking param to litellm proxy via client sdk + Code QA Refactor on get_optional_params (get correct values) (#9386) | 2025-04-07 21:04:11 -07:00 |
| proxy | fix ui sso debug JWT | 2025-04-08 16:00:57 -07:00 |
| realtime_api | fix(aim.py): fix linting error | 2025-03-13 15:32:42 -07:00 |
| rerank_api | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| responses | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| router_strategy | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| router_utils | LiteLLM Minor Fixes & Improvements (04/02/2025) (#9725) | 2025-04-03 11:48:52 -07:00 |
| secret_managers | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| types | [Feat] LiteLLM Tag/Policy Management (#9813) | 2025-04-07 21:54:24 -07:00 |
| __init__.py | Add inference providers support for Hugging Face (#8258) (#9738) (#9773) | 2025-04-05 10:50:15 -07:00 |
| _logging.py | fix(logging): add json formatting for uncaught exceptions (#9615) (#9619) | 2025-03-28 15:16:15 -07:00 |
| _redis.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| _service_logger.py | clean up service metrics | 2025-04-02 17:50:41 -07:00 |
| _version.py | Virtual key based policies in Aim Guardrails (#9499) | 2025-04-01 21:57:23 -07:00 |
| budget_manager.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| constants.py | flush_all_updates_from_in_memory_queue | 2025-04-04 15:34:56 -07:00 |
| cost.json | | |
| cost_calculator.py | Realtime API Cost tracking (#9795) | 2025-04-07 16:43:12 -07:00 |
| exceptions.py | feat(openai.py): bubble all error information back to client | 2025-03-10 15:27:43 -07:00 |
| main.py | Add inference providers support for Hugging Face (#8258) (#9738) (#9773) | 2025-04-05 10:50:15 -07:00 |
| model_prices_and_context_window_backup.json | build(model_prices_and_context_window.json): consistent params | 2025-04-08 12:45:33 -07:00 |
| py.typed | | |
| router.py | fix(router.py): support reusable credentials via passthrough router (#9758) | 2025-04-04 18:40:14 -07:00 |
| scheduler.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| timeout.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00 |
| utils.py | Allow passing thinking param to litellm proxy via client sdk | 2025-04-07 21:04:11 -07:00 |
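The litellm_core_utils, llms, and utils.py rows above all point at PR #9386, which concerns forwarding a `thinking` parameter from the client SDK through to the provider. As a rough illustration only, here is a minimal sketch of such a call with the litellm Python SDK; the model id and token budget are assumptions for the example, not values taken from this repo:

```python
# Minimal sketch: passing a provider-specific `thinking` param through litellm.
# Assumes a model that supports extended thinking; the model id and budget are examples.
import litellm

response = litellm.completion(
    model="anthropic/claude-3-7-sonnet-20250219",  # assumed model id
    messages=[{"role": "user", "content": "Explain the Monty Hall problem."}],
    thinking={"type": "enabled", "budget_tokens": 1024},  # forwarded to the provider by litellm
)

print(response.choices[0].message.content)
```

The same keyword argument can be sent when the SDK is pointed at a LiteLLM proxy instead of calling the provider directly, which is the "via client sdk" part of the commit message.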