added support of PYPI_VERSION in stack build (#762)
# What does this PR do?

Adds support for a `PYPI_VERSION` environment variable in `llama stack build`, so a conda env can be built for a specific Llama Stack version. For example, `PYPI_VERSION=0.0.58 llama stack build --template together --image-type conda` will install these in the llamastack-together env:

```
llama_models         0.0.58
llama_stack          0.0.58
llama_stack_client   0.0.58
```

Without `PYPI_VERSION=`, `llama stack build --template together --image-type conda` installs the latest versions of all three packages.

- [ ] Addresses issue (#issue)

## Test Plan

Please describe:
- tests you ran to verify your changes with result summaries.
- provide instructions so it can be reproduced.

## Sources

Please link relevant resources if necessary.

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md), Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
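As a usage sketch (the pinned invocation and the `llamastack-together` env name are taken from the description above; the `conda run` check is just one way to verify the result and is not part of this PR):

```shell
# Pinned build: installs llama-stack, llama-models, and llama-stack-client at 0.0.58
PYPI_VERSION=0.0.58 llama stack build --template together --image-type conda

# Unpinned build: installs the latest published versions
llama stack build --template together --image-type conda

# Verify the versions inside the resulting conda env (llamastack-together)
conda run -n llamastack-together pip list | grep llama
```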
This commit is contained in:
parent e6e4f0858c
commit 91907b714e
2 changed files with 14 additions and 2 deletions
@@ -9,6 +9,7 @@
 LLAMA_MODELS_DIR=${LLAMA_MODELS_DIR:-}
 LLAMA_STACK_DIR=${LLAMA_STACK_DIR:-}
 TEST_PYPI_VERSION=${TEST_PYPI_VERSION:-}
+PYPI_VERSION=${PYPI_VERSION:-}
 BUILD_PLATFORM=${BUILD_PLATFORM:-}

 if [ "$#" -lt 4 ]; then
@@ -113,7 +114,12 @@ RUN pip install --no-cache --extra-index-url https://test.pypi.org/simple/ \
     llama-models==$TEST_PYPI_VERSION llama-stack-client==$TEST_PYPI_VERSION llama-stack==$TEST_PYPI_VERSION
 EOF
 else
-  add_to_docker "RUN pip install --no-cache llama-stack"
+  if [ -n "$PYPI_VERSION" ]; then
+    SPEC_VERSION="llama-stack==${PYPI_VERSION} llama-models==${PYPI_VERSION} llama-stack-client==${PYPI_VERSION}"
+  else
+    SPEC_VERSION="llama-stack"
+  fi
+  add_to_docker "RUN pip install --no-cache $SPEC_VERSION"
 fi
 fi
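For clarity, here is the version-selection pattern from the hunk above as a standalone sketch. It is a minimal illustration, not the full script: `echo` stands in for the existing `add_to_docker` helper, which appends the line to the generated Dockerfile.

```bash
#!/usr/bin/env bash
# Sketch of the PYPI_VERSION handling added above.
PYPI_VERSION=${PYPI_VERSION:-}

if [ -n "$PYPI_VERSION" ]; then
  # Pin all three packages to the same release when PYPI_VERSION is set.
  SPEC_VERSION="llama-stack==${PYPI_VERSION} llama-models==${PYPI_VERSION} llama-stack-client==${PYPI_VERSION}"
else
  # Otherwise fall back to the latest published llama-stack.
  SPEC_VERSION="llama-stack"
fi

# In the real script this line is passed to add_to_docker; echo is used here for illustration.
echo "RUN pip install --no-cache $SPEC_VERSION"
```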