Fix assert message and call to completion_request_to_prompt in remote:vllm (#709)

The current assert message is incorrect, and the `model` argument is not needed in
`completion_request_to_prompt`.

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
Yuan Tang 2025-01-03 15:44:49 -06:00 committed by GitHub
parent 96d8375663
commit 04d5b9814f

@@ -193,10 +193,9 @@ class VLLMInferenceAdapter(Inference, ModelsProtocolPrivate):
         else:
             assert (
                 not media_present
-            ), "Together does not support media for Completion requests"
+            ), "vLLM does not support media for Completion requests"
             input_dict["prompt"] = await completion_request_to_prompt(
                 request,
-                self.register_helper.get_llama_model(request.model),
                 self.formatter,
             )
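After this change the adapter calls the helper with only the request and the formatter. A minimal, self-contained sketch of the corrected call shape, using hypothetical stand-ins (`CompletionRequest`, `ChatFormat`, and this `completion_request_to_prompt` are simplified stubs, not the real llama-stack implementations):

```python
import asyncio
from dataclasses import dataclass


# Hypothetical stand-ins for the real llama-stack types, used only
# to illustrate the corrected call shape from this commit.
@dataclass
class CompletionRequest:
    model: str
    content: str


class ChatFormat:
    """Placeholder for the adapter's self.formatter."""


async def completion_request_to_prompt(
    request: CompletionRequest, formatter: ChatFormat
) -> str:
    # After this fix the helper takes (request, formatter) only;
    # the model argument is no longer passed.
    return request.content


async def build_input_dict(
    request: CompletionRequest, formatter: ChatFormat
) -> dict:
    input_dict: dict = {}
    media_present = False  # stubbed: no media in a plain completion request
    assert not media_present, "vLLM does not support media for Completion requests"
    input_dict["prompt"] = await completion_request_to_prompt(request, formatter)
    return input_dict


result = asyncio.run(build_input_dict(CompletionRequest("m", "Hello"), ChatFormat()))
print(result)  # {'prompt': 'Hello'}
```

The point of the fix is visible in `build_input_dict`: no model lookup is threaded into the prompt conversion, matching the removed `self.register_helper.get_llama_model(request.model)` line in the diff.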