llama-stack/llama_stack/distribution
ehhuang c9ab72fa82
Support sys_prompt behavior in inference (#937)
# What does this PR do?

The current default system prompt for Llama 3.2 tends to over-index on
tool calling and does not work well when the prompt does not require any
tool calls.

This PR adds an option to override the default system prompt, and
organizes tool-related configs into a new config object.
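
As a rough sketch of the intended usage, the snippet below overrides the default system prompt while grouping tool settings into the new config object. The client surface and field names used here (`LlamaStackClient`, `tool_config`, `system_message_behavior`) are assumptions based on this description, not a confirmed API:

```python
# Hypothetical sketch: overriding the default system prompt via the new
# tool config object. Names and shapes are assumptions for illustration.
from llama_stack_client import LlamaStackClient  # assumed client package

client = LlamaStackClient(base_url="http://localhost:5000")  # assumed server URL

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[
        # A custom system prompt intended to stand in for the tool-heavy default.
        {"role": "system", "content": "You are a concise, friendly assistant."},
        {"role": "user", "content": "Summarize the plot of Hamlet in two sentences."},
    ],
    # Tool-related settings grouped into the new config object.
    tool_config={
        "tool_choice": "auto",
        # Use the supplied system message instead of appending the default
        # tool-calling preamble (assumed field name and value).
        "system_message_behavior": "replace",
    },
)
print(response.completion_message.content)
```

The idea is that a "replace" behavior sends only the supplied system message, while the default behavior keeps the built-in tool-calling instructions and appends the custom text.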

- [ ] Addresses issue (#issue)


## Test Plan

python -m unittest llama_stack.providers.tests.inference.test_prompt_adapter
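
For context, a minimal sketch of the kind of assertion such a prompt-adapter test might make; the import paths and names below (`chat_completion_request_to_messages`, `ChatCompletionRequest`, `ToolConfig`) are assumptions for illustration only:

```python
import unittest

# Assumed import locations; adjust to the actual module layout.
from llama_stack.apis.inference import (
    ChatCompletionRequest,
    SystemMessage,
    ToolConfig,
    UserMessage,
)
from llama_stack.providers.utils.inference.prompt_adapter import (
    chat_completion_request_to_messages,
)

MODEL = "Llama3.2-3B-Instruct"  # assumed model identifier


class TestSystemPromptOverride(unittest.TestCase):
    def test_replace_keeps_only_custom_system_prompt(self):
        request = ChatCompletionRequest(
            model=MODEL,
            messages=[
                SystemMessage(content="You are a pirate."),
                UserMessage(content="Hello"),
            ],
            tool_config=ToolConfig(system_message_behavior="replace"),
        )
        messages = chat_completion_request_to_messages(request, MODEL)
        # The custom system prompt should survive, without the default
        # tool-calling preamble being prepended.
        self.assertIn("pirate", messages[0].content)


if __name__ == "__main__":
    unittest.main()
```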


## Sources

Please link relevant resources if necessary.


## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with
[ReviewStack](https://reviewstack.dev/meta-llama/llama-stack/pull/937).
* #938
* __->__ #937
2025-02-03 23:35:16 -08:00
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| `routers` | Support sys_prompt behavior in inference (#937) | 2025-02-03 23:35:16 -08:00 |
| `server` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `store` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `ui` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `utils` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `build.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `build_conda_env.sh` | Fix uv pip install timeout issue for PyTorch (#929) | 2025-02-03 06:39:35 -08:00 |
| `build_container.sh` | Fix uv pip install timeout issue for PyTorch (#929) | 2025-02-03 06:39:35 -08:00 |
| `build_venv.sh` | Fix uv pip install timeout issue for PyTorch (#929) | 2025-02-03 06:39:35 -08:00 |
| `client.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `common.sh` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `configure.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `configure_container.sh` | More generic image type for OCI-compliant container technologies (#802) | 2025-01-17 16:37:42 -08:00 |
| `datatypes.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `distribution.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `inspect.py` | REST API fixes (#789) | 2025-01-16 13:47:08 -08:00 |
| `library_client.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `request_headers.py` | Add X-LlamaStack-Client-Version, rename ProviderData -> Provider-Data (#735) | 2025-01-09 11:51:36 -08:00 |
| `resolver.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `stack.py` | Fix precommit check after moving to ruff (#927) | 2025-02-02 06:46:45 -08:00 |
| `start_conda_env.sh` | Make llama stack build not create a new conda by default (#788) | 2025-01-16 13:44:53 -08:00 |
| `start_container.sh` | Ensure llama stack build --config <> --image-type <> works (#879) | 2025-01-25 11:13:36 -08:00 |