# What does this PR do?
- Add a GitHub workflow for publishing Docker images.
- Manual inputs: we can build from (1) a TestPyPI version or (2) a released PyPI version.
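
If helpful, the workflow can be dispatched from the GitHub CLI. A minimal sketch, assuming the workflow file is named `docker-build.yaml` (the filename and version value here are illustrative):

```bash
# Dispatch the manually-triggered workflow with the required 'version' input
gh workflow run docker-build.yaml -f version=0.0.63.dev20250114
```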
**Notes**
- Keep this workflow manually triggered, since we don't want to publish nightly Docker images.
**Additional Changes**
- Resolve an issue with running `llama stack build` on a non-terminal device:
```
File "/home/runner/.local/lib/python3.12/site-packages/llama_stack/distribution/utils/exec.py", line 25, in run_with_pty
old_settings = termios.tcgetattr(sys.stdin)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
termios.error: (25, 'Inappropriate ioctl for device')
```
- Modified `build_container.sh` to work in a non-terminal environment.
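
The `termios` error above comes from allocating a pseudo-terminal when stdin is not a TTY, which is the case on CI runners. As a rough, illustrative sketch (not the exact diff), the shell-side guard amounts to checking `[ -t 0 ]` and taking a non-interactive path when no terminal is attached; the analogous check on the Python side would be `sys.stdin.isatty()`:

```bash
# Illustrative only: pick interactive vs. non-interactive flags based on whether
# stdin is actually a terminal (CI runners like GitHub Actions have no TTY).
if [ -t 0 ]; then
  DOCKER_FLAGS="-it"   # local terminal: interactive mode is safe
else
  DOCKER_FLAGS="-i"    # no TTY: do not request a pseudo-terminal
fi
echo "using docker flags: $DOCKER_FLAGS"
```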
## Test Plan
- Triggered workflow: 3562217878
<img width="1076" alt="image"
src="https://github.com/user-attachments/assets/f1b5cef6-05ab-49c7-b405-53abc9264734"
/>
- Tested the published Docker image
<img width="702" alt="image"
src="https://github.com/user-attachments/assets/e7135189-65c8-45d8-86f9-9f3be70e380b"
/>
- `/tools` API endpoints are served, confirming that the image is correctly using the TestPyPI package
<img width="296" alt="image"
src="https://github.com/user-attachments/assets/bbcaa7fe-c0a4-4d22-b600-90e3c254bbfd"
/>
- Published tagged images: https://hub.docker.com/repositories/llamastack (a pull example follows this list)
<img width="947" alt="image"
src="https://github.com/user-attachments/assets/2a0a0494-4d45-4643-bc29-72154ecc54a5"
/>
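
The tag scheme comes from the push step in the workflow below: builds from TestPyPI are tagged `test-<version>`, while builds from a released PyPI version use the bare `<version>`. A hedged pull example (the template and version are illustrative):

```bash
# Image built from a TestPyPI release candidate
docker pull llamastack/distribution-ollama:test-0.0.63.dev20250114

# Image built from a released PyPI version would drop the "test-" prefix
docker pull llamastack/distribution-ollama:0.0.63.dev20250114
```

Once a pulled image is running, serving the `/tools` endpoints (as in the test plan above) is what confirms it picked up the TestPyPI package.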
## Sources
Please link relevant resources if necessary.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the
other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor
guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
**Workflow file** (99 lines, YAML):
```yaml
name: Docker Build and Publish

on:
  workflow_dispatch:
    inputs:
      version:
        description: 'TestPyPI or PyPI version to build (e.g., 0.0.63.dev20250114)'
        required: true
        type: string

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}

      - name: Set version
        id: version
        run: |
          if [ "${{ github.event_name }}" = "push" ]; then
            echo "VERSION=0.0.63.dev20250114" >> $GITHUB_OUTPUT
          else
            echo "VERSION=${{ inputs.version }}" >> $GITHUB_OUTPUT
          fi

      - name: Check package version availability
        run: |
          # Function to check if version exists in a repository
          check_version() {
            local repo=$1
            local status_code=$(curl -s -o /dev/null -w "%{http_code}" "https://$repo.org/project/llama-stack/${{ steps.version.outputs.version }}")
            return $([ "$status_code" -eq 200 ])
          }

          # Check TestPyPI first, then PyPI
          if check_version "test.pypi"; then
            echo "Version ${{ steps.version.outputs.version }} found in TestPyPI"
            echo "PYPI_SOURCE=testpypi" >> $GITHUB_ENV
          elif check_version "pypi"; then
            echo "Version ${{ steps.version.outputs.version }} found in PyPI"
            echo "PYPI_SOURCE=pypi" >> $GITHUB_ENV
          else
            echo "Error: Version ${{ steps.version.outputs.version }} not found in either TestPyPI or PyPI"
            exit 1
          fi

      - name: Install llama-stack
        run: |
          if [ "${{ github.event_name }}" = "push" ]; then
            pip install -e .
          else
            if [ "$PYPI_SOURCE" = "testpypi" ]; then
              pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple llama-stack==${{ steps.version.outputs.version }}
            else
              pip install llama-stack==${{ steps.version.outputs.version }}
            fi
          fi

      - name: Build docker image
        run: |
          TEMPLATES=("ollama" "bedrock" "remote-vllm" "fireworks" "together" "tgi" "meta-reference-gpu")
          for template in "${TEMPLATES[@]}"; do
            if [ "$PYPI_SOURCE" = "testpypi" ]; then
              TEST_PYPI_VERSION=${{ steps.version.outputs.version }} llama stack build --template $template --image-type docker
            else
              PYPI_VERSION=${{ steps.version.outputs.version }} llama stack build --template $template --image-type docker
            fi
          done

      - name: List docker images
        run: |
          docker images

      - name: Push to dockerhub
        run: |
          TEMPLATES=("ollama" "bedrock" "remote-vllm" "fireworks" "together" "tgi" "meta-reference-gpu")
          for template in "${TEMPLATES[@]}"; do
            if [ "$PYPI_SOURCE" = "testpypi" ]; then
              docker tag distribution-$template:test-${{ steps.version.outputs.version }} llamastack/distribution-$template:test-${{ steps.version.outputs.version }}
              docker push llamastack/distribution-$template:test-${{ steps.version.outputs.version }}
            else
              docker tag distribution-$template:${{ steps.version.outputs.version }} llamastack/distribution-$template:${{ steps.version.outputs.version }}
              docker push llamastack/distribution-$template:${{ steps.version.outputs.version }}
            fi
          done
```
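
For reference, the install and build steps above can be reproduced locally. This is a sketch using the same commands and the same `TEST_PYPI_VERSION` variable the workflow uses (the version and template are illustrative):

```bash
# Install the release candidate from TestPyPI, same as the workflow's install step
pip install --index-url https://test.pypi.org/simple/ \
  --extra-index-url https://pypi.org/simple \
  llama-stack==0.0.63.dev20250114

# Build one distribution image the same way the workflow does;
# TEST_PYPI_VERSION makes the container build pull llama-stack from TestPyPI.
TEST_PYPI_VERSION=0.0.63.dev20250114 llama stack build --template ollama --image-type docker
```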