(feat) /batches: Add support for using /batches endpoints in OAI format (#7402)
* run azure testing on ci/cd
* update docs on azure batches endpoints
* add input azure.jsonl
* refactor - use separate file for batches endpoints
* fixes for passing custom llm provider to /batch endpoints
* pass custom llm provider to files endpoints
* update azure batches doc
* add info for azure batches api
* update batches endpoints
* use simple helper for raising proxy exception
* update config.yml
* fix imports
* update tests
* use existing settings
* update env var used
* update configs
* update config.yml
* update ft testing
parent fe43403359
commit 47e12802df
17 changed files with 718 additions and 464 deletions
```diff
@@ -234,7 +234,7 @@ def create_batch(
             )
         else:
             raise litellm.exceptions.BadRequestError(
-                message="LiteLLM doesn't support {} for 'create_batch'. Only 'openai' is supported.".format(
+                message="LiteLLM doesn't support custom_llm_provider={} for 'create_batch'".format(
                     custom_llm_provider
                 ),
                 model="n/a",
```
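For context, a minimal sketch of the OAI-format batches flow this PR enables with a non-OpenAI provider. The `custom_llm_provider` kwarg and the `create_file` / `create_batch` / `retrieve_batch` calls follow litellm's OpenAI-compatible batches interface; the file name `azure.jsonl` mirrors the test input added in this PR, and the credentials/env setup is an assumption, not taken from this diff:

```python
# Sketch: OpenAI-format batch calls routed to Azure via custom_llm_provider.
# Assumes AZURE_API_KEY / AZURE_API_BASE are set in the environment and that
# azure.jsonl is a valid OpenAI-format batch input file.
import litellm

# Upload the batch input file to the same provider the batch will run on.
file_obj = litellm.create_file(
    file=open("azure.jsonl", "rb"),
    purpose="batch",
    custom_llm_provider="azure",
)

# Create the batch. Before this change, any provider other than "openai"
# hit the BadRequestError shown in the diff above.
batch = litellm.create_batch(
    completion_window="24h",
    endpoint="/v1/chat/completions",
    input_file_id=file_obj.id,
    custom_llm_provider="azure",
)
print(batch.id, batch.status)

# Poll the batch for completion status.
retrieved = litellm.retrieve_batch(
    batch_id=batch.id,
    custom_llm_provider="azure",
)
print(retrieved.status)
```

Note the reworded error message in the diff: since providers beyond `openai` are now supported, it reports the unsupported `custom_llm_provider` value instead of claiming only `'openai'` works.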