llama-stack/llama_stack/providers/remote/inference/vllm
Sébastien Han 6ee319ae08
fix: convert boolean string to boolean (#2284)
# What does this PR do?

Handles the case where the vllm config `tls_verify` is set to the
string `"false"` or `"true"` (e.g. via environment-variable
substitution) instead of an actual boolean.

Closes: https://github.com/meta-llama/llama-stack/issues/2283
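A minimal sketch of the kind of coercion this fix implies: a helper that maps the strings `"true"`/`"false"` to real booleans before the config is validated. The function name and accepted spellings are assumptions for illustration, not the actual implementation in `config.py`.

```python
def parse_bool(value):
    """Coerce a boolean-ish config value to a real bool.

    Environment-variable substitution can leave a field like
    `tls_verify` as the string "true" or "false"; convert those to
    booleans. (Hypothetical helper, not the code from this PR.)
    """
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        lowered = value.strip().lower()
        if lowered in ("true", "1", "yes"):
            return True
        if lowered in ("false", "0", "no"):
            return False
    raise ValueError(f"cannot interpret {value!r} as a boolean")
```

In a Pydantic-based config this would typically run as a `mode="before"` validator on the `tls_verify` field, so both `tls_verify: false` and `tls_verify: "false"` validate cleanly.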

Signed-off-by: Sébastien Han <seb@redhat.com>
2025-05-27 13:05:38 -07:00
__init__.py Fix precommit check after moving to ruff (#927) 2025-02-02 06:46:45 -08:00
config.py fix: convert boolean string to boolean (#2284) 2025-05-27 13:05:38 -07:00
vllm.py chore: allow to pass CA cert to remote vllm (#2266) 2025-05-26 20:59:03 +02:00