Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-10-03 19:57:35 +00:00)
Some checks failed
SqlStore Integration Tests / test-postgres (3.12) (push) Failing after 0s
SqlStore Integration Tests / test-postgres (3.13) (push) Failing after 0s
Integration Auth Tests / test-matrix (oauth2_token) (push) Failing after 1s
Python Package Build Test / build (3.12) (push) Failing after 1s
Test External Providers Installed via Module / test-external-providers-from-module (venv) (push) Has been skipped
Integration Tests (Replay) / Integration Tests (, , , client=, ) (push) Failing after 4s
Python Package Build Test / build (3.13) (push) Failing after 2s
API Conformance Tests / check-schema-compatibility (push) Successful in 6s
Vector IO Integration Tests / test-matrix (push) Failing after 4s
Test External API and Providers / test-external (venv) (push) Failing after 3s
Unit Tests / unit-tests (3.12) (push) Failing after 3s
Unit Tests / unit-tests (3.13) (push) Failing after 4s
UI Tests / ui-tests (22) (push) Successful in 39s
Pre-commit / pre-commit (push) Successful in 1m16s
When we update Stainless (editor changes), the `next` branch gets updated. Eventually, when a release is cut, those changes land in `main`. This is the Stainless workflow. This PR makes sure we follow that workflow by pulling the client from the `next` branch for our integration tests.
43 lines
1.5 KiB
YAML
name: Setup runner
description: Prepare a runner for the tests (install uv, python, project dependencies, etc.)
inputs:
  python-version:
    description: The Python version to use
    required: false
    default: "3.12"
  client-version:
    description: The llama-stack-client-python version to test against (latest or published)
    required: false
    default: "latest"
runs:
  using: "composite"
  steps:
    - name: Install uv
      uses: astral-sh/setup-uv@6b9c6063abd6010835644d4c2e1bef4cf5cd0fca # v6.0.1
      with:
        python-version: ${{ inputs.python-version }}
        version: 0.7.6

    - name: Install dependencies
      shell: bash
      run: |
        echo "Updating project dependencies via uv sync"
        uv sync --all-groups

        echo "Installing ad-hoc dependencies"
        uv pip install faiss-cpu

        # Install llama-stack-client-python based on the client-version input
        if [ "${{ inputs.client-version }}" = "latest" ]; then
          echo "Installing latest llama-stack-client-python from next branch"
          uv pip install git+https://github.com/llamastack/llama-stack-client-python.git@next
        elif [ "${{ inputs.client-version }}" = "published" ]; then
          echo "Installing published llama-stack-client-python from PyPI"
          uv pip install llama-stack-client
        else
          echo "Invalid client-version: ${{ inputs.client-version }}"
          exit 1
        fi

        echo "Installed llama packages"
        uv pip list | grep llama
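For context, a test workflow would consume this composite action roughly as sketched below. The action path, job name, and matrix values here are illustrative assumptions, not taken from this PR.

# Hypothetical caller (not part of this PR): a workflow job that checks out the
# repo and then runs the composite action above; path and matrix are assumptions.
jobs:
  integration-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.12", "3.13"]
        client-version: ["latest", "published"]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup runner
        uses: ./.github/actions/setup-runner
        with:
          python-version: ${{ matrix.python-version }}
          client-version: ${{ matrix.client-version }}

Because `client-version` defaults to `latest`, ordinary CI runs exercise the unreleased client from the Stainless `next` branch, while passing `published` keeps a path for validating against the PyPI release.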