mirror of
https://github.com/BerriAI/litellm.git
synced 2025-04-25 10:44:24 +00:00
fix - update abatch_completion docstring
This commit is contained in:
parent
473ec66b84
commit
9ab96e12ed
1 changed file with 23 additions and 0 deletions
@@ -672,6 +672,29 @@ class Router:
        Async Batch Completion. Used for 2 scenarios:
        1. Batch Process 1 request to N models on litellm.Router. Pass messages as List[Dict[str, str]] to use this
        2. Batch Process N requests to M models on litellm.Router. Pass messages as List[List[Dict[str, str]]] to use this

        Example Request for 1 request to N models:
        ```
        response = await router.abatch_completion(
            models=["gpt-3.5-turbo", "groq-llama"],
            messages=[
                {"role": "user", "content": "is litellm becoming a better product ?"}
            ],
            max_tokens=15,
        )
        ```

        Example Request for N requests to M models:
        ```
        response = await router.abatch_completion(
            models=["gpt-3.5-turbo", "groq-llama"],
            messages=[
                [{"role": "user", "content": "is litellm becoming a better product ?"}],
                [{"role": "user", "content": "who is this"}],
            ],
        )
        ```
        """
        ############## Helpers for async completion ##################
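The two scenarios in the docstring above boil down to fanning coroutines out with `asyncio.gather`: one conversation sent to every model, or every conversation sent to every model. Below is a minimal, self-contained sketch of that dispatch logic. Note that `fake_completion` and this standalone `abatch_completion` are illustrative stand-ins, not litellm's actual implementation — the real method lives on `litellm.Router` and calls out to model providers.

```python
import asyncio
from typing import Any, Dict, List, Union

Message = Dict[str, str]

async def fake_completion(model: str, messages: List[Message]) -> Dict[str, Any]:
    # Stand-in for a real async completion call; echoes the last user message.
    return {"model": model, "content": f"echo: {messages[-1]['content']}"}

async def abatch_completion(
    models: List[str],
    messages: Union[List[Message], List[List[Message]]],
) -> List[Dict[str, Any]]:
    if messages and isinstance(messages[0], dict):
        # Scenario 1: one conversation (List[Dict]) fanned out to N models.
        tasks = [fake_completion(model, messages) for model in models]
    else:
        # Scenario 2: N conversations (List[List[Dict]]) each sent to M models.
        tasks = [
            fake_completion(model, convo)
            for convo in messages
            for model in models
        ]
    # Run all completion calls concurrently and return results in task order.
    return await asyncio.gather(*tasks)

responses = asyncio.run(
    abatch_completion(
        models=["model-a", "model-b"],
        messages=[{"role": "user", "content": "hi"}],
    )
)
```

The `isinstance` check on the first element is what distinguishes the two input shapes, mirroring the `List[Dict]` vs `List[List[Dict]]` distinction the docstring describes.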