Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-06-27 18:50:41 +00:00)
fix: Updating images so that they are able to run without root access (#1208)
# What does this PR do?

Addresses issues where the container is unable to run as a non-root user. Gives write access to the required folders.

Closes #1207

## Test Plan

I built locally and ran `llama stack build --template remote-vllm --image-type container` and validated that I could see my changes in the output:

```
#11 1.186 Installed 11 packages in 61ms
#11 1.186 + llama-models==0.1.3
#11 1.186 + llama-stack==0.1.3
#11 1.186 + llama-stack-client==0.1.3
#11 1.186 + markdown-it-py==3.0.0
#11 1.186 + mdurl==0.1.2
#11 1.186 + prompt-toolkit==3.0.50
#11 1.186 + pyaml==25.1.0
#11 1.186 + pygments==2.19.1
#11 1.186 + rich==13.9.4
#11 1.186 + tiktoken==0.9.0
#11 1.186 + wcwidth==0.2.13
#11 DONE 1.6s

#12 [ 9/10] RUN mkdir -p /.llama /.cache
#12 DONE 0.3s

#13 [10/10] RUN chmod -R g+rw /app /.llama /.cache
#13 DONE 0.3s

#14 exporting to image
#14 exporting layers
#14 exporting layers 3.7s done
#14 writing image sha256:11cc8bd954db6d036037bcaf471b173ddd5261ac4b1e72074cccf85d18aefb96 done
#14 naming to docker.io/library/distribution-remote-vllm:0.1.3 done
#14 DONE 3.7s
+ set +x
Success!
```

This is what the resulting image looks like:

(screenshot of the resulting image omitted)

I also tagged the image as `0.1.3-test` and [pushed it to Quay](https://quay.io/repository/jland/distribution-remote-vllm?tab=tags) (note there are a bunch of critical vulnerabilities we may want to look into).

For good measure, I deployed the resulting image on my OpenShift environment using the default Security Context and validated that it came up without issue.

My validation was all done with the `remote-vllm` distribution, but if I am understanding everything correctly, the other distributions are just different run.yaml configs.

Please let me know if there is anything else I need to do.

Co-authored-by: Jamie Land <hokie10@gmail.com>
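As an additional, hypothetical local check (not part of the original test plan), one could start the image as an arbitrary non-root UID in the root group, which is roughly what OpenShift's restricted Security Context does, and confirm that the newly group-writable directories accept writes. This is only a sketch; it assumes the image tag from the build output above and that the image ships `/bin/sh`:

```sh
# Hypothetical check, not part of the original test plan.
# Run as a random non-root UID in group 0 (similar to OpenShift's restricted SCC)
# and verify that the group-writable directories can actually be written to.
docker run --rm --user 10001:0 --entrypoint /bin/sh \
  distribution-remote-vllm:0.1.3 \
  -c 'touch /.llama/.write-test /.cache/.write-test && echo "non-root write OK"'
```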
This commit is contained in:
parent 6634864b19
commit 840fae2259

1 changed file with 9 additions and 0 deletions
```diff
@@ -177,6 +177,15 @@ ENTRYPOINT ["python", "-m", "llama_stack.distribution.server.server"]
 EOF
 fi
 
+# Add other required commands generic to all containers
+add_to_container << EOF
+
+# Allows running as non-root user
+RUN mkdir -p /.llama /.cache
+
+RUN chmod -R g+rw /app /.llama /.cache
+EOF
+
 printf "Containerfile created successfully in $TEMP_DIR/Containerfile\n\n"
 cat $TEMP_DIR/Containerfile
 printf "\n"
```
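For context, the heredoc above appends two layers to the generated Containerfile, visible as steps 9/10 and 10/10 in the build output earlier. A rough sketch of the resulting fragment is below (base image, dependency layers, and the ENTRYPOINT line omitted); restricted platforms such as OpenShift run containers as a random non-root UID that belongs to group 0, which is why pre-creating the directories and granting group read/write is enough to run without root:

```Dockerfile
# Sketch of the fragment these RUN lines add to the generated Containerfile.
# /.llama and /.cache are created up front so a non-root user can write model
# data and caches; /app (the application directory) is made group-writable too.
RUN mkdir -p /.llama /.cache
RUN chmod -R g+rw /app /.llama /.cache
```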