forked from phoenix/litellm-mirror
add batch_completions to docs
parent 2b424c8a3e
commit 8440791e04
2 changed files with 42 additions and 1 deletion
docs/my-website/docs/completion/batching.md (new file, 34 additions)

@@ -0,0 +1,34 @@
# Batching Completion Calls - batch_completion
Batch Completion allows you to pass a batch of completion() requests and process multiple `messages` with a single `batch_completion()` call.
## Example Code

```python
import litellm
import os
from litellm import batch_completion

os.environ['ANTHROPIC_API_KEY'] = ""

# each sub-list of messages is sent as its own completion() request
responses = batch_completion(
    model="claude-2",
    messages=[
        [
            {
                "role": "user",
                "content": "good morning? "
            }
        ],
        [
            {
                "role": "user",
                "content": "what's the time? "
            }
        ]
    ]
)
```
In the `batch_completion` method, you provide a list of `messages` where each sub-list of messages is passed to `litellm.completion()`, letting you process multiple prompts with a single function call.
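A minimal sketch of consuming the returned list, assuming `batch_completion` returns one response per sub-list of messages, in the same order, each with the standard `completion()` response shape:

```python
# assumption: one response per sub-list of messages, in the order they were passed
for i, response in enumerate(responses):
    # standard completion() shape: choices[0].message.content
    print(f"response {i}: {response['choices'][0]['message']['content']}")
```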

@@ -22,7 +22,14 @@ const sidebars = {
   {
     type: "category",
     label: "Completion()",
-    items: ["completion/input", "completion/output", "completion/model_alias", "completion/reliable_completions", "completion/stream"],
+    items: [
+      "completion/input",
+      "completion/output",
+      "completion/model_alias",
+      "completion/reliable_completions",
+      "completion/stream",
+      "completion/batching"
+    ],
   },
   {
     type: "category",