llama-stack-mirror/llama_stack/cli/stack
Ignas Baranauskas c70ca8344f
fix: resolve template name to config path in llama stack run (#2361)
# What does this PR do?
This PR fixes a bug where running a known template by name, e.g.
`llama stack run ollama`
would fail with the following error:
`ValueError: Config file ollama does not exist`
The `run` command now resolves the template name to its config path instead of treating it only as a file path.
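As a rough illustration of the fallback this introduces, here is a minimal sketch: try the argument as a config file first, then as a template name. The resolution order, directory layout, and file naming below are assumptions for illustration, not the exact logic in `run.py`.

```python
from pathlib import Path

# Assumed location of generated distribution configs; illustrative only.
DISTRIBS_BASE_DIR = Path.home() / ".llama" / "distributions"


def resolve_run_config(config_or_template: str) -> Path:
    """Treat the argument as a config file first, then as a template name."""
    config_path = Path(config_or_template)
    if config_path.exists():
        # An explicit path such as ./run.yaml keeps working unchanged.
        return config_path

    # Fall back: interpret the argument as a template name and map it to
    # its run config (path layout assumed for illustration).
    candidate = DISTRIBS_BASE_DIR / config_or_template / f"{config_or_template}-run.yaml"
    if candidate.exists():
        return candidate

    raise ValueError(f"Config file {config_or_template} does not exist")
```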

Closes #2291 

## Test Plan
Run `llama stack run ollama`; it should now resolve the `ollama` template and start instead of failing with `ValueError: Config file ollama does not exist`.
2025-06-03 14:39:12 -07:00
| File | Last commit | Date |
|---|---|---|
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `_build.py` | fix: handle None external_providers_dir in build with run arg (#2269) | 2025-05-27 09:41:12 +02:00 |
| `build.py` | feat: --image-type argument overrides value in --config build.yaml (#2179) | 2025-05-16 14:45:41 -07:00 |
| `list_apis.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `list_providers.py` | chore: more mypy fixes (#2029) | 2025-05-06 09:52:31 -07:00 |
| `list_stacks.py` | feat: add llama stack rm command (#2127) | 2025-05-21 10:25:51 +02:00 |
| `remove.py` | chore: make cprint write to stderr (#2250) | 2025-05-24 23:39:57 -07:00 |
| `run.py` | fix: resolve template name to config path in llama stack run (#2361) | 2025-06-03 14:39:12 -07:00 |
| `stack.py` | feat: add llama stack rm command (#2127) | 2025-05-21 10:25:51 +02:00 |
| `utils.py` | fix: Use CONDA_DEFAULT_ENV presence as a flag to use conda mode (#1555) | 2025-03-27 17:13:22 -04:00 |