Fix issue when generating vLLM distros

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
Yuan Tang 2025-01-13 18:43:23 -05:00
parent 89e3f81520
commit 7c726826b8
3 changed files with 14 additions and 46 deletions


@@ -134,7 +134,7 @@ def get_distribution_template() -> DistributionTemplate:
             "Inference model loaded into the vLLM server",
         ),
         "VLLM_URL": (
-            "http://host.docker.internal:5100}/v1",
+            "http://host.docker.internal:5100/v1",
             "URL of the vLLM server with the main inference model",
         ),
         "MAX_TOKENS": (
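The change removes a stray `}` from the default `VLLM_URL` value. A minimal sketch (not part of the llama-stack codebase; the helper name is hypothetical) of a sanity check that would have caught such a malformed default:

```python
from urllib.parse import urlparse


def is_valid_http_url(url: str) -> bool:
    """Return True if url parses as an http(s) URL with a well-formed host:port."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        parsed.port  # raises ValueError if the port section is malformed, e.g. "5100}"
    except ValueError:
        return False
    # Reject stray template characters left over in the path.
    return not any(c in parsed.path for c in "{}")


# The buggy default fixed by this commit fails the port check:
print(is_valid_http_url("http://host.docker.internal:5100}/v1"))  # False
# The corrected default passes:
print(is_valid_http_url("http://host.docker.internal:5100/v1"))   # True
```

Running a check like this over each template's default environment values at generation time would turn typos of this kind into immediate failures rather than broken generated distros.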