llama-stack/tests/client-sdk/inference
Yuan Tang b981b49bfa
test: Use JSON tool prompt format for remote::vllm provider (#1019)
# What does this PR do?

This PR removes the warning emitted when running tests for the
`remote::vllm` provider:
```
Detected the chat template content format to be 'openai'. You can set `--chat-template-content-format` to override this.
```
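
For reference, the change amounts to passing an explicit `tool_prompt_format` in the tool-calling tests instead of relying on the provider default. Below is a minimal sketch of the idea, assuming the `llama-stack-client` Python SDK; the base URL, model id, helper name, and fallback format are illustrative, not the exact diff from this PR:

```python
from llama_stack_client import LlamaStackClient

# Illustrative sketch, not the exact change from this PR: pick the tool
# prompt format per provider so remote::vllm gets an explicit "json" format.
def tool_prompt_format_for(provider_type: str) -> str:
    # Assumption based on the PR title: requesting "json" avoids vLLM's
    # chat-template content-format detection warning.
    return "json" if provider_type == "remote::vllm" else "python_list"

client = LlamaStackClient(base_url="http://localhost:5001")  # hypothetical endpoint

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.1-8B-Instruct",  # hypothetical model id
    messages=[{"role": "user", "content": "What is the weather in Tokyo?"}],
    tools=[
        {
            "tool_name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "city": {
                    "param_type": "string",
                    "description": "City name",
                    "required": True,
                }
            },
        }
    ],
    tool_prompt_format=tool_prompt_format_for("remote::vllm"),
)
print(response.completion_message)
```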

## Test Plan

All tests passed without the warning shown above.

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
2025-02-08 20:42:57 -08:00
__init__.py [tests] add client-sdk pytests & delete client.py (#638) 2024-12-16 12:04:56 -08:00
dog.png fix vllm base64 image inference (#815) 2025-01-17 17:07:28 -08:00
test_text_inference.py test: Use JSON tool prompt format for remote::vllm provider (#1019) 2025-02-08 20:42:57 -08:00
test_vision_inference.py test: Split inference tests to text and vision (#1008) 2025-02-07 09:35:49 -08:00