llama-stack/llama_stack/providers/remote
Dinesh Yeduguru 8af6951106
remove conflicting default for tool prompt format in chat completion (#742)
# What does this PR do?
We were setting a default value of `json` for the tool prompt format, which
conflicts with Llama 3.2/3.3 models, since they use `python_list`. This PR
changes the default to `None`; the code then infers the appropriate default
based on the model.
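
For context, a minimal sketch of the inference idea described above. The helper names (`resolve_tool_prompt_format`, `default_tool_prompt_format`) and the substring-based model-family check are illustrative assumptions, not the actual llama-stack code:

```python
from enum import Enum
from typing import Optional


class ToolPromptFormat(Enum):
    json = "json"
    function_tag = "function_tag"
    python_list = "python_list"


def default_tool_prompt_format(model_id: str) -> ToolPromptFormat:
    # Assumption for illustration: identify Llama 3.2/3.3 by substring match.
    if "3.2" in model_id or "3.3" in model_id:
        return ToolPromptFormat.python_list
    return ToolPromptFormat.json


def resolve_tool_prompt_format(
    model_id: str, requested: Optional[ToolPromptFormat]
) -> ToolPromptFormat:
    # An explicit request always wins; None now means "infer from the model"
    # rather than silently falling back to json.
    if requested is not None:
        return requested
    return default_tool_prompt_format(model_id)
```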

Addresses: #695 

Tests:
❯ LLAMA_STACK_BASE_URL=http://localhost:5000 pytest -v tests/client-sdk/inference/test_inference.py -k "test_text_chat_completion"

❯ pytest llama_stack/providers/tests/inference/test_prompt_adapter.py
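
These tests exercise chat completion with tools. A hedged sketch of that kind of client call follows; the tool definition fields here are assumptions, not verified against the client SDK:

```python
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:5000")

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "What is 2 + 2?"}],
    tools=[
        {
            "tool_name": "calculator",
            "description": "Evaluate a basic arithmetic expression",
            "parameters": {
                "expression": {
                    "param_type": "string",
                    "description": "the expression to evaluate",
                }
            },
        }
    ],
    # tool_prompt_format is intentionally omitted: with this PR the server
    # infers a model-appropriate default (python_list for Llama 3.2/3.3)
    # instead of forcing json.
)
print(response)
```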
2025-01-10 10:41:53 -08:00
| Name | Latest commit | Date |
| --- | --- | --- |
| agents | [remove import *] clean up import *'s (#689) | 2024-12-27 15:45:44 -08:00 |
| datasetio | [remove import *] clean up import *'s (#689) | 2024-12-27 15:45:44 -08:00 |
| inference | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00 |
| memory | [remove import *] clean up import *'s (#689) | 2024-12-27 15:45:44 -08:00 |
| safety | [remove import *] clean up import *'s (#689) | 2024-12-27 15:45:44 -08:00 |
| tool_runtime | Add X-LlamaStack-Client-Version, rename ProviderData -> Provider-Data (#735) | 2025-01-09 11:51:36 -08:00 |
| `__init__.py` | impls -> inline, adapters -> remote (#381) | 2024-11-06 14:54:05 -08:00 |