llama-stack-mirror/llama_stack
Russell Bryant c39ba23508 Fix podman+selinux compatibility
When I ran `llama stack configure` for my `docker`-based stack on my
system using podman + SELinux (CentOS Stream 9), the `podman run`
command failed because SELinux blocked access to the volume mount.

As a simple fix, disable SELinux label checking.

Signed-off-by: Russell Bryant <rbryant@redhat.com>
2024-09-28 13:49:37 +00:00
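
The commit describes the fix only in prose; in podman terms, "disable SELinux label checking" typically corresponds to passing `--security-opt label=disable` to `podman run` (relabeling the mount with a `:z`/`:Z` volume suffix is a common alternative). The following is a minimal, hypothetical Python sketch of a launcher applying that flag; the function name, image name, and mount path are illustrative placeholders, not taken from the actual patch.

    # Hypothetical sketch: relax SELinux labeling when the container engine is podman.
    import shutil
    import subprocess

    def run_stack_container(host_config_dir: str) -> None:
        """Launch the stack container, disabling SELinux label checks under podman."""
        engine = "podman" if shutil.which("podman") else "docker"
        cmd = [engine, "run", "-it"]
        if engine == "podman":
            # Without this, SELinux denies the container access to the bind mount.
            cmd += ["--security-opt", "label=disable"]
        cmd += [
            "-v", f"{host_config_dir}:/app/config",   # placeholder mount target
            "llamastack/distribution",                # placeholder image name
        ]
        subprocess.run(cmd, check=True)
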
apis          Support for Llama3.2 models and Swift SDK (#98)        2024-09-25 10:29:58 -07:00
cli           minor typo and HuggingFace -> Hugging Face (#113)      2024-09-26 09:48:23 -07:00
distribution  Fix podman+selinux compatibility                       2024-09-28 13:49:37 +00:00
providers     load models using hf model id (#108)                   2024-09-25 18:40:09 -07:00
scripts       Add a test for CLI, but not fully done so disabled     2024-09-19 13:27:07 -07:00
__init__.py   API Updates (#73)                                      2024-09-17 19:51:35 -07:00