(docs) how to run a locust load test

ishaan-jaff 2024-03-15 07:37:50 -07:00
parent 82e44e4962
commit 91a47dc17a
4 changed files with 37 additions and 1 deletion


```diff
@@ -6,7 +6,7 @@ import time
 class MyUser(HttpUser):
     wait_time = between(1, 5)
-    @task(3)
+    @task
     def chat_completion(self):
         headers = {
             "Content-Type": "application/json",
```

@@ -2,6 +2,42 @@ import Image from '@theme/IdealImage';
# 🔥 Load Test LiteLLM

## How to run a locust load test on LiteLLM Proxy

1. `pip install locust`

2. Create a file called `locustfile.py` on your local machine. Copy the contents from the litellm load test located [here](https://github.com/BerriAI/litellm/blob/main/.github/workflows/locustfile.py)

3. Start locust

   Run `locust` in the same directory as your `locustfile.py` from step 2

   ```shell
   locust
   ```

   Output on terminal:

   ```
   [2024-03-15 07:19:58,893] Starting web interface at http://0.0.0.0:8089
   [2024-03-15 07:19:58,898] Starting Locust 2.24.0
   ```
4. Run the load test from the Locust UI

   Head to the Locust web UI at http://0.0.0.0:8089

   Set Users=100, Ramp Up Users=10, Host=Base URL of your LiteLLM Proxy (if you prefer the command line, see the headless sketch after this list)

   <Image img={require('../img/locust_load_test.png')} />

5. Expected Results

   Expect to see the following response times for `/health/readiness`:

   - Median → `/health/readiness` is `150ms`
   - Avg → `/health/readiness` is `219ms`

   <Image img={require('../img/litellm_load_test.png')} />
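
The same test can also be started headless from the CLI instead of the web UI. This is a minimal sketch, assuming your LiteLLM Proxy is reachable at `http://0.0.0.0:4000`; substitute your own base URL and adjust the numbers to match step 4.

```shell
# Run the locustfile without the web UI:
#   -u 100 → 100 concurrent users, -r 10 → ramp up 10 users/second,
#   -H ... → base URL of your LiteLLM Proxy, -t 2m → stop after 2 minutes
locust --headless -u 100 -r 10 -H http://0.0.0.0:4000 -t 2m
```
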
## Load Test LiteLLM Proxy - 1500+ req/s
## 1500+ concurrent requests/s

Binary file not shown (new image, 125 KiB)

Binary file not shown (new image, 204 KiB)