Ben Browning f586bdd912 fix: remote-vllm event loop blocking unit test on Mac
The remote-vllm `test_chat_completion_doesnt_block_event_loop` unit
test was often failing for me on a Mac. I traced this back to the
switch to the AsyncOpenAI client in the remote-vllm provider: the
async client needs more accurate HTTP response handling from our
mock server.

This fixes the unit test's mock server to send proper Content-Type and
Content-Length headers, which makes the AsyncOpenAI client happy on Macs.
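
The fix boils down to the mock server emitting a well-framed HTTP/1.1 response. As a minimal sketch (the names and handler shape here are illustrative, not the actual test code), the important part is computing Content-Length from the encoded body so the client can frame the response:

```python
import asyncio
import json

# Hypothetical sketch: build an HTTP/1.1 response with accurate
# Content-Type and Content-Length headers, which is what the
# AsyncOpenAI client's HTTP layer needs to frame the mock response.
def build_response(payload: dict) -> bytes:
    body = json.dumps(payload).encode("utf-8")
    return (
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: application/json\r\n"
        + f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
        + body
    )

async def handle_client(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter) -> None:
    # Consume the request head, then reply with a well-formed response.
    await reader.readuntil(b"\r\n\r\n")
    writer.write(build_response({"choices": []}))
    await writer.drain()
    writer.close()
    await writer.wait_closed()
```

Without the Content-Length header, the client has to wait for the connection to close to know the body has ended, which is where the flaky timing on Macs can creep in.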

All the test_remote_vllm.py unit tests now pass consistently for me on
a Mac, with no flakiness in the event-loop test.

`pytest -s -v tests/unit/providers/inference/test_remote_vllm.py`

Signed-off-by: Ben Browning <bbrownin@redhat.com>
2025-06-02 08:36:35 -04:00