llama-stack-mirror/docs
Dinesh Yeduguru 8af6951106
remove conflicting default for tool prompt format in chat completion (#742)
# What does this PR do?
We were setting a default value of `json` for the tool prompt format, which conflicts with Llama 3.2/3.3 models since they use the python-list format. This PR changes the default to `None`; the code then infers the default based on the model.
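
Roughly, the new behavior can be sketched as below. This is a minimal illustration, not the actual llama-stack code: the enum values, the `resolve_tool_prompt_format` helper, and the model-family check are assumptions made for the example.

```python
from enum import Enum
from typing import Optional


class ToolPromptFormat(Enum):
    """Illustrative stand-in for the tool prompt format options."""
    json = "json"
    function_tag = "function_tag"
    python_list = "python_list"


def resolve_tool_prompt_format(
    model_id: str, requested: Optional[ToolPromptFormat]
) -> ToolPromptFormat:
    """Pick a tool prompt format when the request leaves the default as None."""
    if requested is not None:
        # An explicit choice from the request always wins.
        return requested
    # Hypothetical model-family check: Llama 3.2 / 3.3 expect python-list
    # style tool calls, while earlier families default to JSON.
    if "3.2" in model_id or "3.3" in model_id:
        return ToolPromptFormat.python_list
    return ToolPromptFormat.json


# Example: a Llama 3.3 model with no explicit format falls back to python_list.
print(resolve_tool_prompt_format("meta-llama/Llama-3.3-70B-Instruct", None))
```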

Addresses: #695 

Tests:

❯ LLAMA_STACK_BASE_URL=http://localhost:5000 pytest -v tests/client-sdk/inference/test_inference.py -k "test_text_chat_completion"

❯ pytest llama_stack/providers/tests/inference/test_prompt_adapter.py
2025-01-10 10:41:53 -08:00
| Name | Last commit | Last commit date |
|------|-------------|------------------|
| _static | Make a new llama stack image | 2024-11-22 23:49:22 -08:00 |
| notebooks | agents to use tools api (#673) | 2025-01-08 19:01:00 -08:00 |
| openapi_generator | Add X-LlamaStack-Client-Version, rename ProviderData -> Provider-Data (#735) | 2025-01-09 11:51:36 -08:00 |
| resources | remove conflicting default for tool prompt format in chat completion (#742) | 2025-01-10 10:41:53 -08:00 |
| source | Fixed typo in default VLLM_URL in remote-vllm.md (#723) | 2025-01-09 22:34:34 -08:00 |
| to_situate | Fix URLs to Llama Stack Read the Docs Webpages (#547) | 2024-11-29 10:11:50 -06:00 |
| zero_to_hero_guide | Made changes to readme and pinning to llamastack v0.0.61 (#624) | 2025-01-02 11:18:07 -08:00 |
| contbuild.sh | Fix broken links with docs | 2024-11-22 20:42:17 -08:00 |
| dog.jpg | Support for Llama3.2 models and Swift SDK (#98) | 2024-09-25 10:29:58 -07:00 |
| getting_started.ipynb | copy getting_started | 2024-12-30 10:42:28 -08:00 |
| license_header.txt | Initial commit | 2024-07-23 08:32:33 -07:00 |
| make.bat | first version of readthedocs (#278) | 2024-10-22 10:15:58 +05:30 |
| Makefile | first version of readthedocs (#278) | 2024-10-22 10:15:58 +05:30 |
| requirements.txt | [docs] add playground ui docs (#592) | 2024-12-12 10:40:38 -08:00 |