llama-stack-mirror/llama_stack/templates/remote-vllm
Yuan Tang 300e6e2702
Fix issue when generating distros (#755)
Addressed comment
https://github.com/meta-llama/llama-stack/pull/723#issuecomment-2581902075.

cc @yanxi0830 

I am not 100% sure whether the diff is correct, but this is the result
of running `python llama_stack/scripts/distro_codegen.py`.

---------

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
2025-01-15 05:34:08 -08:00
| File | Last commit | Date |
|------|-------------|------|
| __init__.py | Auto-generate distro yamls + docs (#468) | 2024-11-18 14:57:06 -08:00 |
| build.yaml | agents to use tools api (#673) | 2025-01-08 19:01:00 -08:00 |
| doc_template.md | Mark some pages as not-in-toctree explicitly | 2024-11-23 15:27:44 -08:00 |
| run-with-safety.yaml | agents to use tools api (#673) | 2025-01-08 19:01:00 -08:00 |
| run.yaml | agents to use tools api (#673) | 2025-01-08 19:01:00 -08:00 |
| vllm.py | Fix issue when generating distros (#755) | 2025-01-15 05:34:08 -08:00 |
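The commit message above regenerates the checked-in files in this directory by running `python llama_stack/scripts/distro_codegen.py`, which reads the Python template (`vllm.py`) and rewrites the YAML and docs (`build.yaml`, `run.yaml`, `run-with-safety.yaml`, `doc_template.md` output). The sketch below illustrates that "Python template in, generated YAML out" flow only in broad strokes; every name in it (`DistroTemplate`, `get_distribution_template`, `regenerate`) is hypothetical and is not taken from the actual llama-stack code.

```python
# Hypothetical sketch of a distro-codegen flow: import a template module
# (e.g. the remote-vllm template), ask it for a distribution description,
# and write the generated build.yaml next to it. Names are illustrative only.

import importlib
from dataclasses import dataclass, field
from pathlib import Path


@dataclass
class DistroTemplate:
    name: str
    providers: dict[str, list[str]] = field(default_factory=dict)

    def build_yaml(self) -> str:
        # Render a tiny build.yaml-style document by hand to avoid extra deps.
        lines = [f"name: {self.name}", "distribution_spec:", "  providers:"]
        for api, impls in self.providers.items():
            lines.append(f"    {api}:")
            lines.extend(f"      - {impl}" for impl in impls)
        return "\n".join(lines) + "\n"


def regenerate(template_module: str, out_dir: Path) -> None:
    # Import the template module and let it describe the distro in Python,
    # then serialize that description to the checked-in YAML file.
    mod = importlib.import_module(template_module)
    template: DistroTemplate = mod.get_distribution_template()
    (out_dir / "build.yaml").write_text(template.build_yaml())


if __name__ == "__main__":
    # Example (assuming the template module exposes get_distribution_template):
    # regenerate("llama_stack.templates.remote_vllm.vllm", Path("."))
    pass
```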