vllm prompt_logprobs can also be 0

This adjusts the vllm openai_completion endpoint so that a prompt_logprobs
value of 0 is also passed through to the backend, instead of only passing
values greater than zero.

The existing test_openai_completion_prompt_logprobs was parameterized
to test this case as well.

Signed-off-by: Ben Browning <bbrownin@redhat.com>
Author: Ben Browning
Date:   2025-04-09 17:32:03 -04:00
parent  8d10556ce3
commit  8f5cd49159
2 changed files with 10 additions and 3 deletions


@@ -446,7 +446,7 @@ class VLLMInferenceAdapter(Inference, ModelsProtocolPrivate):
         model_obj = await self._get_model(model)
         extra_body: Dict[str, Any] = {}
-        if prompt_logprobs:
+        if prompt_logprobs is not None and prompt_logprobs >= 0:
             extra_body["prompt_logprobs"] = prompt_logprobs
         if guided_choice:
             extra_body["guided_choice"] = guided_choice