test: Use JSON tool prompt format for remote::vllm provider (#1019)

# What does this PR do?

This PR switches the `remote::vllm` provider to the JSON tool prompt format in tests, which removes the following warning when running tests against that provider:
```
Detected the chat template content format to be 'openai'. You can set `--chat-template-content-format` to override this.
```

## Test Plan

All tests passed without the warning message shown above.

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
Author: Yuan Tang, 2025-02-08 23:42:57 -05:00 (committed by GitHub)
Parent: 80ba9deab1
Commit: b981b49bfa

```diff
@@ -11,6 +11,7 @@ PROVIDER_TOOL_PROMPT_FORMAT = {
     "remote::ollama": "json",
     "remote::together": "json",
     "remote::fireworks": "json",
+    "remote::vllm": "json",
 }
 PROVIDER_LOGPROBS_TOP_K = set(
```
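
For context, here is a minimal, self-contained sketch of how such a mapping can be consulted to pick the tool prompt format per provider. Only `PROVIDER_TOOL_PROMPT_FORMAT` mirrors the diff above; the helper function and the `"auto"` fallback are illustrative assumptions, not the actual llama-stack test code.

```python
# Sketch (not the real test fixture): resolve the tool prompt format for a
# provider type before building tool-calling test requests. The helper name
# and the "auto" fallback are assumptions made for illustration.

PROVIDER_TOOL_PROMPT_FORMAT = {
    "remote::ollama": "json",
    "remote::together": "json",
    "remote::fireworks": "json",
    "remote::vllm": "json",  # entry added by this PR
}


def get_tool_prompt_format(provider_type: str, default: str = "auto") -> str:
    """Return the tool prompt format for a provider, falling back to a default."""
    return PROVIDER_TOOL_PROMPT_FORMAT.get(provider_type, default)


if __name__ == "__main__":
    # With this PR, remote::vllm resolves to "json" like the other remote providers.
    print(get_tool_prompt_format("remote::vllm"))             # json
    print(get_tool_prompt_format("inline::meta-reference"))   # auto (fallback)
```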