Mirror of https://github.com/BerriAI/litellm.git, synced 2025-04-26 03:04:13 +00:00.
* feat(router.py): initial commit for loadbalancing azure batch api endpoints. Closes https://github.com/BerriAI/litellm/issues/5396
* fix(router.py): working `router.acreate_file()`
* feat(router.py): working `router.acreate_batch` endpoint
* feat(router.py): expose `router.aretrieve_batch` function, making it easy for users to retrieve batch information
* feat(router.py): support `router.alist_batches` endpoint, adding support for getting all batches across all endpoints
* feat(router.py): working loadbalancing on `/v1/files`
* feat(proxy_server.py): working loadbalancing on `/v1/batches`
* feat(proxy_server.py): working loadbalancing on Retrieve + List batch endpoints
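The commit above exposes OpenAI-style Batch API operations through the load-balanced `Router`. Below is a minimal sketch of how these methods might be called. The method names (`acreate_file`, `acreate_batch`, `aretrieve_batch`, `alist_batches`) come from the commit message; the `model_list` entries, credentials, and exact keyword arguments (`purpose`, `input_file_id`, `endpoint`, `completion_window`, `batch_id`) are illustrative assumptions based on the OpenAI Batch API shape and may differ between litellm versions.

```python
import asyncio

from litellm import Router

# Two Azure deployments registered under one logical model name, so the
# router can load-balance file uploads and batch jobs across them.
# Keys and endpoints below are placeholders.
router = Router(
    model_list=[
        {
            "model_name": "azure-gpt-4o",
            "litellm_params": {
                "model": "azure/gpt-4o",
                "api_key": "<azure-key-1>",
                "api_base": "https://my-endpoint-eu.openai.azure.com",
            },
        },
        {
            "model_name": "azure-gpt-4o",
            "litellm_params": {
                "model": "azure/gpt-4o",
                "api_key": "<azure-key-2>",
                "api_base": "https://my-endpoint-us.openai.azure.com",
            },
        },
    ]
)


async def run_batch() -> None:
    # 1. Upload the JSONL input file; the router chooses a deployment.
    with open("batch_input.jsonl", "rb") as f:
        file_obj = await router.acreate_file(
            model="azure-gpt-4o",
            file=f,
            purpose="batch",
        )

    # 2. Create a batch job against the same logical model group.
    batch = await router.acreate_batch(
        model="azure-gpt-4o",
        input_file_id=file_obj.id,
        endpoint="/v1/chat/completions",
        completion_window="24h",
    )

    # 3. Retrieve the batch to check its status.
    retrieved = await router.aretrieve_batch(
        model="azure-gpt-4o",
        batch_id=batch.id,
    )
    print(retrieved.status)

    # 4. List batches across all deployments in the model group.
    batches = await router.alist_batches(model="azure-gpt-4o")
    print(batches)


asyncio.run(run_batch())
```

The commit also notes working load balancing on the proxy's `/v1/batches` and batch Retrieve/List routes, so clients of the proxy would use those OpenAI-compatible endpoints rather than calling the Python `Router` directly.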
Contents of the `litellm/` package directory:

adapters/
assistants/
batches/
deprecated_litellm_server/
files/
fine_tuning/
integrations/
litellm_core_utils/
llms/
proxy/
rerank_api/
router_strategy/
router_utils/
tests/
types/
__init__.py
_logging.py
_redis.py
_service_logger.py
_version.py
budget_manager.py
caching.py
cost.json
cost_calculator.py
exceptions.py
main.py
model_prices_and_context_window_backup.json
py.typed
requirements.txt
router.py
scheduler.py
timeout.py
utils.py