Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-24)
Implementation of `litellm.batch_completion`, `litellm.batch_completion_models`, and `litellm.batch_completion_models_all_responses`.

Doc: https://docs.litellm.ai/docs/completion/batching
The LiteLLM Python SDK allows you to:

- `litellm.batch_completion`: batch the `litellm.completion` function for a given model.
- `litellm.batch_completion_models`: send a request to multiple language models concurrently and return the response as soon as one of the models responds.
- `litellm.batch_completion_models_all_responses`: send a request to multiple language models concurrently and return a list of responses from all models that respond.
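The two multi-model helpers above boil down to a concurrent fan-out: submit the same request to every model, then either take the first result back or gather them all. Here is a minimal sketch of that pattern using `concurrent.futures` and a stub in place of the real `litellm.completion` call (the stub and its response shape are illustrative assumptions, not the SDK's actual internals):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def stub_completion(model, messages):
    # Stand-in for litellm.completion; a real call would hit the provider's API.
    return {"model": model, "choices": [{"message": {"content": f"reply from {model}"}}]}

def fan_out_all_responses(models, messages):
    # batch_completion_models_all_responses-style: wait for every model's reply.
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = [pool.submit(stub_completion, m, messages) for m in models]
        return [f.result() for f in futures]

def fan_out_first_response(models, messages):
    # batch_completion_models-style: return as soon as any model responds.
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        for future in as_completed(
            [pool.submit(stub_completion, m, messages) for m in models]
        ):
            return future.result()

messages = [{"role": "user", "content": "hello"}]
all_responses = fan_out_all_responses(["model-a", "model-b"], messages)
first_response = fan_out_first_response(["model-a", "model-b"], messages)
```

With the real SDK you would call the `litellm` functions directly instead of these stubs; the point of the sketch is only the first-wins vs. collect-all distinction between the two helpers.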