litellm-mirror/litellm (last commit: 2025-04-12 18:19:02 -07:00)
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| anthropic_interface | docs anthropic messages interface | 2025-03-31 16:53:39 -07:00 |
| assistants | refactor(azure.py): refactor to have client init work across all endpoints | 2025-03-11 17:27:24 -07:00 |
| batch_completion | (code quality) run ruff rule to ban unused imports (#7313) | 2024-12-19 12:33:42 -08:00 |
| batches | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| caching | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| experimental_mcp_client | fix types on tools.py | 2025-03-29 16:48:15 -07:00 |
| files | Support CRUD endpoints for Managed Files (#9924) | 2025-04-11 21:48:27 -07:00 |
| fine_tuning | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| integrations | Support CRUD endpoints for Managed Files (#9924) | 2025-04-11 21:48:27 -07:00 |
| litellm_core_utils | fix(factory.py): correct indentation for message index increment in ollama, This fixes bug #9822 (#9943) | 2025-04-12 09:50:40 -07:00 |
| llms | fix(transformation.py): correctly translate 'thinking' param for lite… (#9904) | 2025-04-11 23:25:13 -07:00 |
| proxy | fix auth check for team member permissions | 2025-04-12 18:19:02 -07:00 |
| realtime_api | Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) | 2025-04-10 21:24:45 -07:00 |
| rerank_api | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| responses | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| router_strategy | build(pyproject.toml): add new dev dependencies - for type checking (#9631) | 2025-03-29 11:02:13 -07:00 |
| router_utils | LiteLLM Minor Fixes & Improvements (04/02/2025) (#9725) | 2025-04-03 11:48:52 -07:00 |
| secret_managers | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| types | Litellm add managed files db (#9930) | 2025-04-12 08:24:46 -07:00 |
| __init__.py | [Feat] Add litellm.supports_reasoning() util to track if an llm supports reasoning (#9923) | 2025-04-11 17:56:04 -07:00 |
| _logging.py | fix(logging): add json formatting for uncaught exceptions (#9615) (#9619) | 2025-03-28 15:16:15 -07:00 |
| _redis.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| _service_logger.py | clean up service metrics | 2025-04-02 17:50:41 -07:00 |
| _version.py | Virtual key based policies in Aim Guardrails (#9499) | 2025-04-01 21:57:23 -07:00 |
| budget_manager.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| constants.py | [Feat] Emit Key, Team Budget metrics on a cron job schedule (#9528) | 2025-04-10 16:59:14 -07:00 |
| cost.json | | |
| cost_calculator.py | Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897) | 2025-04-10 21:24:45 -07:00 |
| exceptions.py | feat(openai.py): bubble all error information back to client | 2025-03-10 15:27:43 -07:00 |
| main.py | VertexAI non-jsonl file storage support (#9781) | 2025-04-09 14:01:48 -07:00 |
| model_prices_and_context_window_backup.json | fix: correct the cost for 'gemini/gemini-2.5-pro-preview-03-25' (#9896) | 2025-04-12 08:20:04 -07:00 |
| py.typed | | |
| router.py | Support CRUD endpoints for Managed Files (#9924) | 2025-04-11 21:48:27 -07:00 |
| scheduler.py | Squashed commit of the following: (#9709) | 2025-04-02 21:24:54 -07:00 |
| timeout.py | Litellm ruff linting enforcement (#5992) | 2024-10-01 19:44:20 -04:00 |
| utils.py | [Feat] Add litellm.supports_reasoning() util to track if an llm supports reasoning (#9923) | 2025-04-11 17:56:04 -07:00 |
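
The `__init__.py` and `utils.py` rows above reference the `litellm.supports_reasoning()` utility added in #9923, which reports whether a given model supports reasoning. A minimal usage sketch, assuming it takes a `model=` keyword like litellm's other capability helpers (e.g. `litellm.supports_function_calling()`); the model names are illustrative:

```python
import litellm

# Assumed signature: supports_reasoning(model=...) -> bool, mirroring
# litellm's other supports_* helpers. Model names below are examples only.
for model in ["anthropic/claude-3-7-sonnet-20250219", "gpt-3.5-turbo"]:
    if litellm.supports_reasoning(model=model):
        print(f"{model}: reasoning supported")
    else:
        print(f"{model}: no reasoning support")
```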