llama-stack-mirror/llama_stack/templates/llama_api
Commit b8f7e1504d by grs
feat: allow the interface on which the server will listen to be configured (#2015)
# What does this PR do?

Listening on all interfaces, the current default, may not always be
desirable. For example, when the server listens only on a loopback
interface, it cannot be reached except from within the host it runs on.
This PR makes the listen interface configurable through a CLI option, an
environment variable, or an entry in the config file.
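
For illustration, here is a minimal sketch of what the resulting `run.yaml` server section could look like; the `host` key name is an assumption based on this PR's description, not a confirmed name:

```yaml
# Hypothetical sketch of a run.yaml server section after this change.
# The `host` key is an illustrative assumption, not a confirmed name.
server:
  port: 8321
  # Bind only to the loopback interface so the server is reachable
  # only from within the host it runs on; omitting `host` keeps the
  # previous listen-on-all-interfaces default.
  host: 127.0.0.1
```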

## Test Plan

I ran a server with and without the added CLI argument to verify that
the argument is honored when provided and that the default behavior is
unchanged when it is not.

Signed-off-by: Gordon Sim <gsim@redhat.com>
2025-05-16 12:59:31 -07:00
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | feat: add api.llama provider, llama-guard-4 model (#2058) | 2025-04-29 10:07:41 -07:00 |
| build.yaml | fix: remove code interpeter implementation (#2087) | 2025-05-01 14:35:08 -07:00 |
| llama_api.py | fix: remove code interpeter implementation (#2087) | 2025-05-01 14:35:08 -07:00 |
| run.yaml | feat: allow the interface on which the server will listen to be configured (#2015) | 2025-05-16 12:59:31 -07:00 |