llama-stack-mirror/llama_stack
Yuan Tang b89f94c674 Expose LLAMASTACK_PORT in cli.stack.run (#722)
This was missed in https://github.com/meta-llama/llama-stack/pull/706. I tested `llama_stack.distribution.server.server` but didn't test `llama stack run`. cc @ashwinb

Signed-off-by: Yuan Tang <terrytangyuan@gmail.com>
2025-01-10 09:16:38 -08:00
| Name | Last commit | Date |
|------|-------------|------|
| apis | agents to use tools api (#673) | 2025-01-08 19:01:00 -08:00 |
| cli | Expose LLAMASTACK_PORT in cli.stack.run (#722) | 2025-01-10 09:16:38 -08:00 |
| distribution | rename LLAMASTACK_PORT to LLAMA_STACK_PORT for consistency with other env vars | 2025-01-10 09:00:22 -08:00 |
| providers | Add persistence for localfs datasets (#557) | 2025-01-09 17:34:18 -08:00 |
| scripts | Fix to conda env build script | 2024-12-17 12:19:34 -08:00 |
| templates | rename LLAMASTACK_PORT to LLAMA_STACK_PORT for consistency with other env vars | 2025-01-10 09:00:22 -08:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
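As context for the commits above that expose the port environment variable in `llama stack run` (and rename it from LLAMASTACK_PORT to LLAMA_STACK_PORT), below is a minimal sketch of how a CLI "run" entrypoint can honor such a variable. The default port value (8321), the argument handling, and the `main` structure are illustrative assumptions, not code taken from this repository.

```python
import argparse
import os

# Sketch (assumption): fall back to the LLAMA_STACK_PORT environment
# variable when --port is not passed explicitly; 8321 is an assumed default.
DEFAULT_PORT = int(os.environ.get("LLAMA_STACK_PORT", 8321))


def main() -> None:
    parser = argparse.ArgumentParser(description="run a stack server (illustrative sketch)")
    parser.add_argument(
        "--port",
        type=int,
        default=DEFAULT_PORT,
        help="port to listen on; defaults to $LLAMA_STACK_PORT when set",
    )
    args = parser.parse_args()
    # A real CLI would start the server here; this sketch only reports the choice.
    print(f"Would start the server on port {args.port}")


if __name__ == "__main__":
    main()
```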