Mirror of https://github.com/BerriAI/litellm.git (synced 2025-04-24 10:14:26 +00:00)

Commit 9d97082eed (parent a2681e353f)
docs(routing.md): add queueing to docs
9 changed files with 244 additions and 261 deletions
@@ -13,7 +13,7 @@ import pytest
 from litellm import Router
 import litellm
 litellm.set_verbose=False
-os.environ.pop("AZURE_AD_TOKEN")
+# os.environ.pop("AZURE_AD_TOKEN")

 model_list = [{ # list of model deployments
     "model_name": "gpt-3.5-turbo", # model alias
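The hunk above shows only the start of a `model_list` deployment entry. As a minimal sketch of what a complete entry looks like, based on litellm's Router conventions (the Azure deployment name and env-var names here are illustrative assumptions, not values from this commit):

```python
import os

# A single deployment entry: "model_name" is the alias callers use,
# "litellm_params" holds what gets passed through to litellm.completion().
model_list = [
    {
        "model_name": "gpt-3.5-turbo",  # model alias
        "litellm_params": {
            "model": "azure/chatgpt-v-2",  # assumed Azure deployment name
            "api_key": os.environ.get("AZURE_API_KEY", ""),
            "api_base": os.environ.get("AZURE_API_BASE", ""),
        },
    },
]

# router = Router(model_list=model_list)  # requires `pip install litellm`
print(model_list[0]["model_name"])
```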
|
@@ -142,10 +142,11 @@ successful_calls = 0
 failed_calls = 0

 for future in futures:
-    if future.result() is not None:
-        successful_calls += 1
-    else:
-        failed_calls += 1
+    if future.done():
+        if future.result() is not None:
+            successful_calls += 1
+        else:
+            failed_calls += 1

 print(f"Load test Summary:")
 print(f"Total Requests: {concurrent_calls}")
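The hunk above guards `future.result()` with `future.done()` so the loop never blocks on an unfinished call. A self-contained sketch of that success/failure counting pattern (with a stand-in `make_call` worker rather than the test's actual Router calls) might look like this; note that `as_completed` is used here so every future is counted, whereas a bare `done()` check in a plain loop can silently skip calls that haven't finished yet:

```python
import concurrent.futures
import random

def make_call(i):
    # Stand-in for a router.completion() call; returns None on a
    # simulated failure, a response string otherwise.
    if random.random() < 0.2:
        return None
    return f"response-{i}"

concurrent_calls = 20
successful_calls = 0
failed_calls = 0

with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    futures = [executor.submit(make_call, i) for i in range(concurrent_calls)]
    # as_completed yields each future exactly once, as soon as it finishes,
    # so result() never blocks on an unfinished call and nothing is skipped.
    for future in concurrent.futures.as_completed(futures):
        if future.result() is not None:
            successful_calls += 1
        else:
            failed_calls += 1

print("Load test Summary:")
print(f"Total Requests: {concurrent_calls}")
print(f"Successful: {successful_calls}, Failed: {failed_calls}")
```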
|
|
@@ -0,0 +1,48 @@
Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
Response ID: 71a47cd4-92d9-4091-9429-8d22af6b56bf Url: /queue/response/71a47cd4-92d9-4091-9429-8d22af6b56bf
Time: 0.77 seconds

Question: Given this context, what is litellm? LiteLLM about: About
Call all LLM APIs using the OpenAI format.
Response ID: a0855c20-59ba-4eed-85c1-e0719eebdeab Url: /queue/response/a0855c20-59ba-4eed-85c1-e0719eebdeab
Time: 1.46 seconds

Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
Response ID: b131cdcd-0693-495b-ad41-b0cf2afc4833 Url: /queue/response/b131cdcd-0693-495b-ad41-b0cf2afc4833
Time: 2.13 seconds

Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
Response ID: a58e5185-90e7-4832-9f28-e5a5ac167a40 Url: /queue/response/a58e5185-90e7-4832-9f28-e5a5ac167a40
Time: 2.83 seconds

Question: Given this context, what is litellm? LiteLLM about: About
Call all LLM APIs using the OpenAI format.
Response ID: 52dbbd49-eedb-4c11-8382-3ca7deb1af35 Url: /queue/response/52dbbd49-eedb-4c11-8382-3ca7deb1af35
Time: 3.50 seconds

Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
LiteLLM Server manages:

Calling 10
Response ID: eedda05f-61e1-4081-b49d-27f9449bcf69 Url: /queue/response/eedda05f-61e1-4081-b49d-27f9449bcf69
Time: 4.20 seconds

Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
Response ID: 8a484722-66ec-4193-b19b-2dfc4265cfd2 Url: /queue/response/8a484722-66ec-4193-b19b-2dfc4265cfd2
Time: 4.89 seconds

Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
Response ID: ae1e2b71-d711-456d-8df0-13ce0709eb04 Url: /queue/response/ae1e2b71-d711-456d-8df0-13ce0709eb04
Time: 5.60 seconds

Question: What endpoints does the litellm proxy have 💥 OpenAI Proxy Server
LiteLLM Server manages:

Calling 10
Response ID: cfabd174-838e-4252-b82b-648923573db8 Url: /queue/response/cfabd174-838e-4252-b82b-648923573db8
Time: 6.29 seconds

Question: Does litellm support ooobagooba llms? how can i call oobagooba llms. Call all LLM APIs using the Ope
Response ID: 02d5b7d6-5443-41e9-94e4-90d8b00d49fb Url: /queue/response/02d5b7d6-5443-41e9-94e4-90d8b00d49fb
Time: 7.01 seconds
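Each log entry above pairs a Response ID with a `/queue/response/<id>` URL, which suggests a poll-for-result pattern: submit a request to the queue, then fetch the URL until the answer is ready. A minimal client sketch of that pattern (the base URL, polling interval, and the `status`/`result` JSON keys are assumptions for illustration, not litellm's documented response shape):

```python
import json
import time
import urllib.request

BASE_URL = "http://0.0.0.0:8000"  # assumed proxy host/port

def response_url(response_id: str) -> str:
    """Build the polling URL shown in the log for a queued request."""
    return f"{BASE_URL}/queue/response/{response_id}"

def poll_response(response_id: str, interval: float = 0.5, timeout: float = 30.0):
    """Poll the queue endpoint until a result arrives or we time out.

    The JSON shape ('status' / 'result' keys) is an assumption made for
    this sketch; check the actual proxy response before relying on it.
    """
    deadline = time.monotonic() + timeout
    url = response_url(response_id)
    while time.monotonic() < deadline:
        with urllib.request.urlopen(url) as resp:
            body = json.load(resp)
        if body.get("status") == "finished":
            return body.get("result")
        time.sleep(interval)
    raise TimeoutError(f"no result for {response_id} within {timeout}s")

# Example: the first Response ID from the log above.
print(response_url("71a47cd4-92d9-4091-9429-8d22af6b56bf"))
```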