# What does this PR do?

A user reports in https://github.com/meta-llama/llama-stack/issues/1769#issuecomment-2755564632 that the Agent calls a tool even on a prompt as simple as 'Hello'. This PR updates the default prompt. It also moves the instruction portion out of `function_description` so that users can override it if desired.

## Test Plan

<img width="1344" alt="image" src="https://github.com/user-attachments/assets/c606d65d-071f-4211-a719-b4742676acda" />

Performance on 100 HotpotQA questions is also similar to that of the current prompt.
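For context, here is a minimal sketch of the pattern described above: keeping the tool-usage instruction as a separate, overridable argument rather than baking it into `function_description`. The names (`build_system_prompt`, `DEFAULT_TOOL_INSTRUCTION`) and the default wording are illustrative assumptions, not the actual llama-stack implementation in `tool_utils.py` / `prompt_templates`.

```python
from typing import Optional

# Hypothetical default instruction; the real default prompt is what this PR updates.
DEFAULT_TOOL_INSTRUCTION = (
    "You have access to the following functions. "
    "Only call a function when it is actually needed to answer the user; "
    "for simple greetings or chit-chat, respond directly."
)


def build_system_prompt(
    function_description: str,
    instruction: Optional[str] = None,
) -> str:
    """Compose the system prompt from the function description plus an
    instruction block. Passing instruction=None uses the default above,
    while callers can pass their own text to override it."""
    if instruction is None:
        instruction = DEFAULT_TOOL_INSTRUCTION
    return f"{instruction}\n\n{function_description}"


# Example: a user replacing the default instruction entirely.
print(build_system_prompt("get_weather(city: str) -> str",
                          instruction="Always call a tool."))
```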