Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-22 22:42:25 +00:00)
When podman is used and the registry is omitted, podman prompts the user to choose one. However, because podman's output is piped to /dev/null, the user never sees the prompt; the script ends abruptly, which is confusing. This commit explicitly uses the docker.io registry for the ollama image and the llama-stack image so that the prompt is avoided.

Signed-off-by: Omer Tuchfeld <omer@tuchfeld.dev>
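The fix described above can be sketched as a small helper that pins short image names to an explicit registry. This is an illustrative sketch, not the actual code from install.sh: the function name `qualify_image` and the sample image names are assumptions for demonstration.

```shell
# Hypothetical sketch: prefix unqualified image names with docker.io so that
# podman resolves them without an interactive registry-selection prompt
# (which would be invisible when output is redirected to /dev/null).
qualify_image() {
  case "$1" in
    *.*/*) echo "$1" ;;           # already qualified with a registry host, e.g. quay.io/foo
    *)     echo "docker.io/$1" ;; # short name: pin to docker.io
  esac
}

qualify_image "ollama/ollama"                 # → docker.io/ollama/ollama
qualify_image "quay.io/llamastack/some-image" # → quay.io/llamastack/some-image
```

With fully qualified references, `podman pull` never needs to ask which registry to search, so piping its output to /dev/null is safe.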
| File |
|---|
| check-init-py.sh |
| check-workflows-use-hashes.sh |
| distro_codegen.py |
| gen-changelog.py |
| generate_prompt_format.py |
| install.sh |
| provider_codegen.py |
| run_client_sdk_tests.py |
| setup_telemetry.sh |
| unit-tests.sh |