(feat) /batches Add support for using /batches endpoints in OAI format (#7402)

* run azure testing on ci/cd

* update docs on azure batches endpoints

* add input azure.jsonl

* refactor - use separate file for batches endpoints

* fixes for passing custom llm provider to /batch endpoints

* pass custom llm provider to files endpoints

* update azure batches doc

* add info for azure batches api

* update batches endpoints

* use simple helper for raising proxy exception

* update config.yml

* fix imports

* update tests

* use existing settings

* update env var used

* update configs

* update config.yml

* update ft testing
Ishaan Jaff 2024-12-24 16:58:05 -08:00 committed by GitHub
parent fe43403359
commit 47e12802df
17 changed files with 718 additions and 464 deletions
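
Functionally, the commit threads `custom_llm_provider` through the /files and /batches code paths so that OpenAI-format batch jobs can be routed to Azure. Below is a minimal end-to-end sketch of that flow, assuming the `litellm.create_file` / `litellm.create_batch` SDK entry points and Azure credentials supplied via the usual `AZURE_API_KEY` / `AZURE_API_BASE` environment variables; the `azure.jsonl` name mirrors the test input added in this commit, and the model name is a placeholder:

```python
import json
import litellm

# Batch input in the OpenAI JSONL format: one request object per line.
# (Shape follows the OpenAI /v1/batches spec; adjust the model/deployment
# name to your own Azure setup.)
requests = [
    {
        "custom_id": "request-1",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": "Hello from a batch!"}],
        },
    }
]
with open("azure.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")

# Upload the input file, routing to Azure via custom_llm_provider.
batch_input_file = litellm.create_file(
    file=open("azure.jsonl", "rb"),
    purpose="batch",
    custom_llm_provider="azure",
)

# Create the batch job against the uploaded file.
batch = litellm.create_batch(
    completion_window="24h",
    endpoint="/v1/chat/completions",
    input_file_id=batch_input_file.id,
    custom_llm_provider="azure",
)
print(batch)
```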

@@ -234,7 +234,7 @@ def create_batch(
             )
         else:
             raise litellm.exceptions.BadRequestError(
-                message="LiteLLM doesn't support {} for 'create_batch'. Only 'openai' is supported.".format(
+                message="LiteLLM doesn't support custom_llm_provider={} for 'create_batch'".format(
                     custom_llm_provider
                 ),
                 model="n/a",