From 62276fc22102aafbc9ea91d0f978a39b252ce75a Mon Sep 17 00:00:00 2001
From: Ishaan Jaff
Date: Sat, 11 May 2024 13:45:32 -0700
Subject: [PATCH] docs link to litellm batch completions

---
 docs/my-website/docs/completion/batching.md | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/docs/my-website/docs/completion/batching.md b/docs/my-website/docs/completion/batching.md
index 05683b3dd..09f59f743 100644
--- a/docs/my-website/docs/completion/batching.md
+++ b/docs/my-website/docs/completion/batching.md
@@ -4,6 +4,12 @@ LiteLLM allows you to:
 * Send 1 completion call to many models: Return Fastest Response
 * Send 1 completion call to many models: Return All Responses
 
+:::info
+
+Trying to do batch completion on LiteLLM Proxy? Go here: https://docs.litellm.ai/docs/proxy/user_keys#beta-batch-completions---pass-model-as-list
+
+:::
+
 ## Send multiple completion calls to 1 model
 
 In the batch_completion method, you provide a list of `messages` where each sub-list of messages is passed to `litellm.completion()`, allowing you to process multiple prompts efficiently in a single API call.
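
For context on the doc being patched: below is a minimal sketch of the `batch_completion` usage the last context paragraph describes, assuming `litellm` is installed and `OPENAI_API_KEY` is set in the environment. The model name and prompts are illustrative, not taken from the patch.

```python
# Sketch: send several prompts to one model via litellm.batch_completion.
# Each sub-list of messages is handed to litellm.completion() under the hood.
from litellm import batch_completion

responses = batch_completion(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[
        [{"role": "user", "content": "What is the capital of France?"}],
        [{"role": "user", "content": "Summarize Hamlet in one sentence."}],
    ],
)

# One response is returned per message list, in the same order.
for response in responses:
    print(response.choices[0].message.content)
```

The linked proxy docs cover the equivalent flow when calling a LiteLLM Proxy server instead of the Python SDK directly.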