mirror of
https://github.com/meta-llama/llama-stack.git
synced 2025-12-03 09:53:45 +00:00
Backport of #4001 to release-0.3.x branch. Fixes CI failures on release branches where `uv sync` can't resolve RC dependencies.

## The Problem

On release branches like `release-0.3.x`, pyproject.toml requires `llama-stack-client>=0.3.1rc1`. RC versions only exist on test.pypi, not PyPI. This causes multiple CI failures:

1. `uv sync` fails because it can't resolve RC versions from PyPI
2. pre-commit hooks (uv-lock, codegen) fail for the same reason
3. the mypy workflow section needs uv installed

## The Solution

Configure uv to use test.pypi when on release branches:

- Set `UV_INDEX_URL=https://test.pypi.org/simple/` (primary)
- Set `UV_EXTRA_INDEX_URL=https://pypi.org/simple/` (fallback)
- Set `UV_INDEX_STRATEGY=unsafe-best-match` to check both indexes

This allows `uv sync` to resolve common packages from PyPI and RC versions from test.pypi.

## Additional Fixes

- Export the UV env vars to `GITHUB_ENV` so pre-commit hooks inherit them
- Install uv in the pre-commit workflow for the mypy section
- Handle the missing `type_checking` dependency group on release-0.3.x
- Regenerate uv.lock with RC versions for the release branch

## Changes

- Created a reusable `install-llama-stack-client` action for the configuration
- Modified `setup-runner` to set the UV environment variables before sync
- Modified the `pre-commit` workflow to configure the client and export the env vars
- Updated uv.lock with RC versions from test.pypi

This is a cherry-pick of commits afa9f0882, c86e6e906, 626639bee, and 081566321 from main, plus additional fixes for release branch compatibility.
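The export-to-`GITHUB_ENV` step described above can be sketched as a small shell snippet. This is an illustrative sketch, not the exact code from `setup-runner`: the variable values come from the description, and the `mktemp` fallback is only an assumption to make the snippet runnable outside GitHub Actions (where `GITHUB_ENV` is provided automatically).

```shell
# Sketch: export uv index settings so later workflow steps and
# pre-commit hooks inherit them.
# GITHUB_ENV is supplied by GitHub Actions; fall back to a temp
# file for local runs (assumption, not from the actual workflow).
GITHUB_ENV="${GITHUB_ENV:-$(mktemp)}"

{
  echo "UV_INDEX_URL=https://test.pypi.org/simple/"
  echo "UV_EXTRA_INDEX_URL=https://pypi.org/simple/"
  echo "UV_INDEX_STRATEGY=unsafe-best-match"
} >> "$GITHUB_ENV"
```

Variables written to the `GITHUB_ENV` file become environment variables for all subsequent steps in the job, which is why pre-commit hooks run in later steps pick them up.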
55 lines
1.7 KiB
YAML
name: 'Setup Test Environment'
description: 'Common setup steps for integration tests including dependencies, providers, and build'

inputs:
  python-version:
    description: 'Python version to use'
    required: true
  client-version:
    description: 'Client version (latest or published)'
    required: true
  setup:
    description: 'Setup to configure (ollama, vllm, gpt, etc.)'
    required: false
    default: 'ollama'
  suite:
    description: 'Test suite to use: base, responses, vision, etc.'
    required: false
    default: ''
  inference-mode:
    description: 'Inference mode (record or replay)'
    required: true

runs:
  using: 'composite'
  steps:
    - name: Install dependencies
      uses: ./.github/actions/setup-runner
      with:
        python-version: ${{ inputs.python-version }}
        client-version: ${{ inputs.client-version }}

    - name: Setup ollama
      if: ${{ (inputs.setup == 'ollama' || inputs.setup == 'ollama-vision') && inputs.inference-mode == 'record' }}
      uses: ./.github/actions/setup-ollama
      with:
        suite: ${{ inputs.suite }}

    - name: Setup vllm
      if: ${{ inputs.setup == 'vllm' && inputs.inference-mode == 'record' }}
      uses: ./.github/actions/setup-vllm

    - name: Build Llama Stack
      shell: bash
      run: |
        # Client is already installed by setup-runner via install-llama-stack-client action
        echo "Building Llama Stack"

        LLAMA_STACK_DIR=. \
        uv run --no-sync llama stack list-deps ci-tests | xargs -L1 uv pip install

    - name: Configure git for commits
      shell: bash
      run: |
        git config --local user.email "github-actions[bot]@users.noreply.github.com"
        git config --local user.name "github-actions[bot]"
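For reference, a calling workflow might invoke this composite action as sketched below. This is an illustrative fragment only: the job name, action path (`./.github/actions/setup-test-environment`), and input values are assumptions, not taken from the repository.

```yaml
# Hypothetical caller of the composite action above; path and values
# are illustrative.
jobs:
  integration-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup test environment
        uses: ./.github/actions/setup-test-environment
        with:
          python-version: '3.12'
          client-version: 'latest'
          setup: 'ollama'
          inference-mode: 'replay'
```

Because `setup` defaults to `'ollama'` and `suite` defaults to `''`, only `python-version`, `client-version`, and `inference-mode` are strictly required.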