adapters
(code quality) run ruff rule to ban unused imports (#7313)
2024-12-19 12:33:42 -08:00
assistants
Revert "fix: add missing parameters order, limit, before, and after in get_as…" (#7542)
2025-01-03 16:32:12 -08:00
batch_completion
(code quality) run ruff rule to ban unused imports (#7313)
2024-12-19 12:33:42 -08:00
batches
(Feat - Batches API) add support for retrieving vertex api batch jobs (#7661)
2025-01-09 18:35:03 -08:00
caching
(Redis Cluster) - Fixes for using redis cluster + pipeline (#8442)
2025-02-12 18:01:32 -08:00
files
(code quality) run ruff rule to ban unused imports (#7313)
2024-12-19 12:33:42 -08:00
fine_tuning
fix linting
2025-02-14 21:42:51 -08:00
integrations
(Bug Fix + Better Observability) - BudgetResetJob: (#8562)
2025-02-15 16:13:08 -08:00
litellm_core_utils
(Bug Fix + Better Observability) - BudgetResetJob: (#8562)
2025-02-15 16:13:08 -08:00
llms
feat(openai/o_series_transformation.py): support native streaming for all openai o-series models (#8552)
2025-02-14 20:04:19 -08:00
proxy
(Bug Fix + Better Observability) - BudgetResetJob: (#8562)
2025-02-15 16:13:08 -08:00
realtime_api
(Refactor) - Re use litellm.completion/litellm.embedding etc for health checks (#7455)
2024-12-28 18:38:54 -08:00
rerank_api
(feat) /batches - track user_api_key_alias, user_api_key_team_alias etc for /batch requests (#7401)
2024-12-24 17:44:28 -08:00
router_strategy
Litellm dev 01 30 2025 p2 (#8134)
2025-01-30 22:18:53 -08:00
router_utils
(Feat) - return x-litellm-attempted-fallbacks in responses from litellm proxy (#8558)
2025-02-15 14:54:23 -08:00
secret_managers
fix: add default credential for azure (#7095) (#7891)
2025-01-21 09:01:49 -08:00
types
(Bug Fix + Better Observability) - BudgetResetJob: (#8562)
2025-02-15 16:13:08 -08:00
__init__.py
Litellm dev 02 13 2025 p2 (#8525)
2025-02-13 20:28:42 -08:00
_logging.py
(sdk perf fix) - only print args passed to litellm when debugging mode is on (#7708)
2025-01-11 22:56:20 -08:00
_redis.py
(Redis Cluster) - Fixes for using redis cluster + pipeline (#8442)
2025-02-12 18:01:32 -08:00
_service_logger.py
fix svc logger (#7727)
2025-01-12 22:00:25 -08:00
_version.py
Litellm ruff linting enforcement (#5992)
2024-10-01 19:44:20 -04:00
budget_manager.py
(code quality) run ruff rule to ban unused imports (#7313)
2024-12-19 12:33:42 -08:00
constants.py
Litellm dev 02 13 2025 p2 (#8525)
2025-02-13 20:28:42 -08:00
cost.json
cost_calculator.py
Fix custom pricing - separate provider info from model info (#7990)
2025-01-25 21:49:28 -08:00
exceptions.py
fix(main.py): fix key leak error when unknown provider given (#8556)
2025-02-15 14:02:55 -08:00
main.py
fix(main.py): fix key leak error when unknown provider given (#8556)
2025-02-15 14:02:55 -08:00
model_prices_and_context_window_backup.json
(Feat) - Add /bedrock/meta.llama3-3-70b-instruct-v1:0 tool calling support + cost tracking + base llm unit test for tool calling (#8545)
2025-02-14 14:15:25 -08:00
py.typed
router.py
(Feat) - return x-litellm-attempted-fallbacks in responses from litellm proxy (#8558)
2025-02-15 14:54:23 -08:00
scheduler.py
(refactor) caching use LLMCachingHandler for async_get_cache and set_cache (#6208)
2024-10-14 16:34:01 +05:30
timeout.py
Litellm ruff linting enforcement (#5992)
2024-10-01 19:44:20 -04:00
utils.py
fix(main.py): fix key leak error when unknown provider given (#8556)
2025-02-15 14:02:55 -08:00