feat(router.py): Support load-balancing batch Azure API endpoints (#5469)

* feat(router.py): initial commit for load-balancing Azure batch API endpoints

Closes https://github.com/BerriAI/litellm/issues/5396

* fix(router.py): working `router.acreate_file()`

* feat(router.py): working `router.acreate_batch` endpoint

* feat(router.py): expose `router.aretrieve_batch` function

Make it easy for the user to retrieve batch information

* feat(router.py): support `router.alist_batches` endpoint

Adds support for listing all batches across all endpoints

* feat(router.py): working load-balancing on `/v1/files`

* feat(proxy_server.py): working load-balancing on `/v1/batches`

* feat(proxy_server.py): working load-balancing on Retrieve + List batch
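The commits above add a batch workflow to the Router: upload a JSONL file, create a batch against it, retrieve its status, and list batches across all deployments. A minimal sketch of that call sequence, using a hypothetical stub in place of `litellm.Router` (the method names `acreate_file`, `acreate_batch`, `aretrieve_batch`, and `alist_batches` come from the commits above; the parameters shown here are assumptions, not the real signatures):

```python
import asyncio

class StubRouter:
    """Hypothetical stand-in for litellm.Router, illustrating only the
    call sequence described in the commits; real signatures may differ."""

    def __init__(self):
        self._files = {}
        self._batches = {}

    async def acreate_file(self, file, purpose, custom_llm_provider):
        file_id = f"file-{len(self._files) + 1}"
        self._files[file_id] = file
        return {"id": file_id, "purpose": purpose}

    async def acreate_batch(self, input_file_id, endpoint,
                            completion_window, custom_llm_provider):
        batch_id = f"batch-{len(self._batches) + 1}"
        self._batches[batch_id] = {
            "id": batch_id,
            "input_file_id": input_file_id,
            "status": "completed",  # a real batch would start as e.g. "validating"
        }
        return self._batches[batch_id]

    async def aretrieve_batch(self, batch_id, custom_llm_provider):
        return self._batches[batch_id]

    async def alist_batches(self, custom_llm_provider):
        # Per the commits above, this aggregates batches across all endpoints.
        return list(self._batches.values())

router = StubRouter()
f = asyncio.run(router.acreate_file(
    file=b'{"custom_id": "req-1"}', purpose="batch",
    custom_llm_provider="azure"))
batch = asyncio.run(router.acreate_batch(
    input_file_id=f["id"], endpoint="/v1/chat/completions",
    completion_window="24h", custom_llm_provider="azure"))
status = asyncio.run(router.aretrieve_batch(
    batch_id=batch["id"], custom_llm_provider="azure"))
all_batches = asyncio.run(router.alist_batches(custom_llm_provider="azure"))
```

The point of routing these calls through the Router rather than a single client is that file upload and batch creation can be load-balanced across multiple Azure deployments, while `alist_batches` fans out to gather results from all of them.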
This commit is contained in:
Krish Dholakia 2024-09-02 21:32:55 -07:00 committed by GitHub
parent 9b22359bed
commit 18da7adce9
10 changed files with 667 additions and 37 deletions


@@ -4645,6 +4645,8 @@ def get_llm_provider(
For router -> Can also give the whole litellm param dict -> this function will extract the relevant details
Raises Error - if unable to map model to a provider
Return model, custom_llm_provider, dynamic_api_key, api_base
"""
try:
## IF LITELLM PARAMS GIVEN ##
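The hunk above touches `get_llm_provider`, whose docstring says it maps a model (optionally with litellm params) to `model, custom_llm_provider, dynamic_api_key, api_base`, raising if no provider can be determined. A hedged sketch of just the prefix-splitting idea (the real implementation also consults litellm params, dynamic API keys, and a provider registry; this helper name is hypothetical):

```python
def get_llm_provider_sketch(model: str):
    # Illustrative only: split a "provider/model" string into its parts,
    # mirroring the docstring's contract of raising when mapping fails.
    if "/" in model:
        provider, _, stripped_model = model.partition("/")
        return stripped_model, provider
    raise ValueError(f"unable to map model to a provider: {model}")
```

For example, `get_llm_provider_sketch("azure/gpt-4o")` yields `("gpt-4o", "azure")`, which is why the batch commits above can route `custom_llm_provider="azure"` calls to the right deployments.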