llama-stack/llama_stack/providers
Latest commit cfa752fc92 by ehhuang, 2025-02-20 21:38:35 -08:00
fix: pass tool_prompt_format to chat_formatter (#1198)
Summary:

This is needed to format the completion message with tool_calls correctly.
See the added unit test.

Test Plan:

python -m unittest llama_stack.providers.tests.inference.test_prompt_adapter
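The fix threads the requested tool prompt format through to the chat formatter, so that a completion message carrying tool_calls is rendered in the syntax the model expects. The sketch below only illustrates the idea; ToolPromptFormat, ToolCall, and format_tool_call here are hypothetical stand-ins, not the actual llama-stack API or the code changed in this PR.

```python
# Minimal sketch (hypothetical names, not the real llama-stack implementation):
# the formatter must be told which tool prompt format to use, because the
# "json" and "python_list" styles serialize the same tool call differently.
import json
from dataclasses import dataclass
from enum import Enum


class ToolPromptFormat(Enum):
    json = "json"
    python_list = "python_list"


@dataclass
class ToolCall:
    tool_name: str
    arguments: dict


def format_tool_call(call: ToolCall, fmt: ToolPromptFormat) -> str:
    """Render a completion message's tool call in the requested prompt format."""
    if fmt is ToolPromptFormat.json:
        return json.dumps({"name": call.tool_name, "parameters": call.arguments})
    # python_list style: [tool_name(arg1=value1, ...)]
    args = ", ".join(f"{k}={v!r}" for k, v in call.arguments.items())
    return f"[{call.tool_name}({args})]"


if __name__ == "__main__":
    call = ToolCall(tool_name="get_weather", arguments={"city": "Tokyo"})
    # The two outputs differ, which is why the format cannot be left to a
    # silent default inside the formatter.
    print(format_tool_call(call, ToolPromptFormat.json))
    print(format_tool_call(call, ToolPromptFormat.python_list))
```

Rendering the same call under both formats shows why the caller's tool_prompt_format has to reach the formatter rather than being assumed.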
Name | Last commit | Date
inline | Fix sqlite_vec config defaults | 2025-02-20 17:50:33 -08:00
registry | feat: inference passthrough provider (#1166) | 2025-02-19 21:47:00 -08:00
remote | fix: BuiltinTool JSON serialization in remote vLLM provider (#1183) | 2025-02-20 21:18:37 -08:00
tests | fix: pass tool_prompt_format to chat_formatter (#1198) | 2025-02-20 21:38:35 -08:00
utils | fix: pass tool_prompt_format to chat_formatter (#1198) | 2025-02-20 21:38:35 -08:00
__init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00
datatypes.py | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00