Ben Browning dd1a366347
fix: logprobs support in remote-vllm provider (#1074)
# What does this PR do?

The remote-vllm provider was not passing logprobs options from
`CompletionRequest` or `ChatCompletionRequest` through to the parameters
we send to the OpenAI client. I verified this manually and also observed
the provider failing `TestInference::test_completion_logprobs`. This was
filed as issue #1073.

This fixes that by passing the `logprobs.top_k` value through to the
parameters we pass into the OpenAI client.
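
A minimal sketch of the idea (the function and dict names here are
illustrative, not the provider's exact code): when building the keyword
arguments for the OpenAI-compatible client, forward the requested number
of top alternatives, since the legacy OpenAI completions API accepts
`logprobs` as an integer count.

```
from typing import Any


def _get_params(request) -> dict[str, Any]:
    # Assemble keyword arguments for the OpenAI-compatible client (sketch).
    params: dict[str, Any] = {
        "model": request.model,
        "prompt": request.content,
        "stream": request.stream,
    }
    # The fix: if the caller asked for logprobs, pass top_k through;
    # previously this option was silently dropped.
    if request.logprobs and request.logprobs.top_k:
        params["logprobs"] = request.logprobs.top_k
    return params
```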

Additionally, this fixes a bug in `test_text_inference.py`, which
mistakenly assumed `chunk.delta` was of type `ContentDelta` for
completion requests. The deltas are of type `ContentDelta` for chat
completion requests, but for plain completion requests they are of type
`str`, as sketched below. Because of this, the test was likely failing
even for providers that did properly support logprobs; I hit it while
fixing the remote-vllm issue above.
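
A sketch of the distinction the test needs to respect (the helper is
illustrative; the `.text` access assumes a text-type `ContentDelta`):

```
def delta_text(delta) -> str:
    # Plain completion streams yield bare strings...
    if isinstance(delta, str):
        return delta
    # ...while chat completion streams yield ContentDelta objects.
    return delta.text
```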

(Closes #1073)

## Test Plan

First, you need a vLLM server running. I ran one locally like this:
```
vllm serve meta-llama/Llama-3.2-3B-Instruct --port 8001 --enable-auto-tool-choice --tool-call-parser llama3_json
```

Next, run `test_text_inference.py` against that server using the
remote-vllm provider:
```
VLLM_URL="http://localhost:8001/v1" python -m pytest -s -v llama_stack/providers/tests/inference/test_text_inference.py --providers "inference=vllm_remote"
```

Before my change, the test failed with this error:
```
llama_stack/providers/tests/inference/test_text_inference.py:155: in test_completion_logprobs
    assert 1 <= len(response.logprobs) <= 5
E   TypeError: object of type 'NoneType' has no len()
```

After my change, the test passes.
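
For reference, the check being exercised looks roughly like this
(mirroring the failing assertion above; the call shape and import path
are from memory and may differ slightly from the test file):

```
from llama_stack.apis.inference import LogProbConfig

# Inside the async test; inference_impl and inference_model come from fixtures.
response = await inference_impl.completion(
    content="The capital of France is ",
    stream=False,
    model_id=inference_model,
    logprobs=LogProbConfig(top_k=3),
)
assert response.logprobs is not None
assert 1 <= len(response.logprobs) <= 5
```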


Signed-off-by: Ben Browning <bbrownin@redhat.com>