diff --git a/docs/my-website/docs/completion/batching.md b/docs/my-website/docs/completion/batching.md
index 29c520efa..af30f5678 100644
--- a/docs/my-website/docs/completion/batching.md
+++ b/docs/my-website/docs/completion/batching.md
@@ -2,6 +2,10 @@
 In the batch_completion method, you provide a list of `messages` where each sub-list of messages is passed to `litellm.completion()`, allowing you to process multiple prompts efficiently in a single API call.
+
+Open In Colab
+
+
 ## Example Code
 ```python
 import litellm
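
The paragraph added in this diff describes a fan-out: each sub-list in `messages` is one independent conversation, and one completion call is made per sub-list. A minimal pure-Python sketch of that pattern is below; `fake_completion` is a hypothetical stand-in for `litellm.completion()` so the sketch runs without a provider API key, and the model name is illustrative.

```python
# Conceptual sketch of the batch_completion fan-out described above.
# `fake_completion` is a stand-in for a real completion call; it echoes
# the last user message so the per-conversation flow is observable.
def fake_completion(model, messages):
    return {"model": model, "reply": messages[-1]["content"].upper()}

def batch_completion(model, batch_messages, completion=fake_completion):
    # Each sub-list of messages is one independent conversation;
    # one completion call is made per sub-list, results kept in order.
    return [completion(model, msgs) for msgs in batch_messages]

results = batch_completion(
    "example-model",
    [
        [{"role": "user", "content": "hello"}],
        [{"role": "user", "content": "world"}],
    ],
)
print([r["reply"] for r in results])  # → ['HELLO', 'WORLD']
```

The real `litellm.batch_completion` takes the same shape of input (a `model` plus a list of message lists) and routes each sub-list through the provider-backed completion call instead of the stub.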