llama-stack/llama_stack/providers/remote/inference/vllm
Dinesh Yeduguru 8af6951106
remove conflicting default for tool prompt format in chat completion (#742)
# What does this PR do?
We are setting a default value of `json` for the tool prompt format, which
conflicts with Llama 3.2/3.3 models since they use a python-list format. This PR
changes the default to `None`, and in the code we infer the default based on
the model.

Addresses: #695 
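
For illustration, a minimal sketch of the intended behavior: the request-level default becomes `None`, and when the caller gives no format the provider picks one based on the model family. The enum values and the model-ID check below are assumptions for the example, not the exact llama-stack code.

```python
from enum import Enum
from typing import Optional


class ToolPromptFormat(Enum):
    # Illustrative enum; mirrors the formats discussed above.
    json = "json"
    python_list = "python_list"


def infer_tool_prompt_format(model_id: str) -> ToolPromptFormat:
    # Assumption for the sketch: Llama 3.2 / 3.3 models expect a
    # python-list style tool prompt; older families keep json.
    if "3.2" in model_id or "3.3" in model_id:
        return ToolPromptFormat.python_list
    return ToolPromptFormat.json


def resolve_tool_prompt_format(
    requested: Optional[ToolPromptFormat], model_id: str
) -> ToolPromptFormat:
    # With the default now None, only fall back to model-based
    # inference when the caller did not specify a format explicitly.
    return requested if requested is not None else infer_tool_prompt_format(model_id)
```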

Tests:
❯ LLAMA_STACK_BASE_URL=http://localhost:5000 pytest -v tests/client-sdk/inference/test_inference.py -k "test_text_chat_completion"
❯ pytest llama_stack/providers/tests/inference/test_prompt_adapter.py
2025-01-10 10:41:53 -08:00
__init__.py impls -> inline, adapters -> remote (#381) 2024-11-06 14:54:05 -08:00
config.py Auto-generate distro yamls + docs (#468) 2024-11-18 14:57:06 -08:00
vllm.py remove conflicting default for tool prompt format in chat completion (#742) 2025-01-10 10:41:53 -08:00