llama-stack-mirror/llama_stack/templates/remote-vllm

Latest commit: 966b482b2e feat: allow the interface on which the server will listen to be configured
Signed-off-by: Gordon Sim <gsim@redhat.com>
2025-05-16 20:04:57 +01:00
__init__.py Auto-generate distro yamls + docs (#468) 2024-11-18 14:57:06 -08:00
build.yaml fix: remove code interpreter implementation (#2087) 2025-05-01 14:35:08 -07:00
doc_template.md docs: Add tips for debugging remote vLLM provider (#1992) 2025-04-18 14:47:47 +02:00
run-with-safety.yaml feat: allow the interface on which the server will listen to be configured 2025-05-16 20:04:57 +01:00
run.yaml feat: allow the interface on which the server will listen to be configured 2025-05-16 20:04:57 +01:00
vllm.py fix: remove code interpreter implementation (#2087) 2025-05-01 14:35:08 -07:00
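The latest commit to run.yaml and run-with-safety.yaml makes the server's listen interface configurable. A minimal sketch of what such a server section might look like, assuming the key is named `host` (the exact key name and defaults are assumptions, not confirmed by this listing):

```yaml
# Hypothetical excerpt from run.yaml: the `host` key is assumed to be
# the setting introduced by commit 966b482b2e; port value is illustrative.
server:
  host: 0.0.0.0   # interface the server binds to (e.g. 127.0.0.1 for local-only)
  port: 8321
```

Binding to 0.0.0.0 would expose the server on all interfaces, while 127.0.0.1 would restrict it to local connections; consult the generated distro docs for the authoritative schema.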