Mirror of https://github.com/meta-llama/llama-stack.git, synced 2025-10-24 08:47:26 +00:00.
Some checks failed:

- SqlStore Integration Tests / test-postgres (3.13) (push): Failing after 1s
- Installer CI / lint (push): Failing after 3s
- Test External Providers Installed via Module / test-external-providers-from-module (venv) (push): Has been skipped
- Integration Tests (Replay) / Integration Tests (, , , client=, ) (push): Failing after 3s
- Python Package Build Test / build (3.13) (push): Failing after 4s
- SqlStore Integration Tests / test-postgres (3.12) (push): Failing after 6s
- Python Package Build Test / build (3.12) (push): Failing after 5s
- Unit Tests / unit-tests (3.12) (push): Failing after 5s
- Installer CI / smoke-test-on-dev (push): Failing after 11s
- Unit Tests / unit-tests (3.13) (push): Failing after 8s
- API Conformance Tests / check-schema-compatibility (push): Successful in 15s
- Vector IO Integration Tests / test-matrix (push): Failing after 18s
- Test External API and Providers / test-external (venv) (push): Failing after 17s
- Integration Auth Tests / test-matrix (oauth2_token) (push): Failing after 44s
- UI Tests / ui-tests (22) (push): Successful in 1m28s
- Pre-commit / pre-commit (push): Successful in 2m27s
# What does this PR do?

## Test Plan

```
.venv ❯ sh ./scripts/install.sh
⚠️ Found existing container(s) for 'ollama-server', removing...
⚠️ Found existing container(s) for 'llama-stack', removing...
⚠️ Found existing container(s) for 'jaeger', removing...
⚠️ Found existing container(s) for 'otel-collector', removing...
⚠️ Found existing container(s) for 'prometheus', removing...
⚠️ Found existing container(s) for 'grafana', removing...
📡 Starting telemetry stack...
🦙 Starting Ollama...
⏳ Waiting for Ollama daemon...
📦 Ensuring model is pulled: llama3.2:3b...
🦙 Starting Llama Stack...
⏳ Waiting for Llama Stack API...
..
🎉 Llama Stack is ready!
👉 API endpoint: http://localhost:8321
📖 Documentation: https://llamastack.github.io/latest/references/api_reference/index.html
💻 To access the llama stack CLI, exec into the container: docker exec -ti llama-stack bash
📡 Telemetry dashboards:
   Jaeger UI:      http://localhost:16686
   Prometheus UI:  http://localhost:9090
   Grafana UI:     http://localhost:3000 (admin/admin)
   OTEL Collector: http://localhost:4318
🐛 Report an issue @ https://github.com/llamastack/llama-stack/issues if you think it's a bug
```
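As a follow-up to the Test Plan above, a small smoke test can confirm the installed stack is actually serving. This is only a sketch under stated assumptions, not part of the installer or this PR: it uses the default port 8321 printed by `install.sh`, assumes `curl` is available on the host, and assumes a `GET /v1/models` listing endpoint as described in the linked API reference.

```sh
#!/usr/bin/env sh
# Sketch of a post-install smoke test.
# Assumptions (not from the PR): default API port 8321, curl installed,
# and a /v1/models listing endpoint per the published API reference.
set -eu

# The API endpoint printed by install.sh should respond with the model list.
curl -sf http://localhost:8321/v1/models -o /tmp/llama-stack-models.json

# The model the installer pulled (llama3.2:3b) should appear in that listing.
if grep -q 'llama3.2:3b' /tmp/llama-stack-models.json; then
  echo "llama3.2:3b is registered"
else
  echo "llama3.2:3b not found in /v1/models output" >&2
  exit 1
fi
```

The telemetry endpoints listed above (Jaeger on 16686, Prometheus on 9090, Grafana on 3000) can be probed the same way with `curl -sf` if the dashboards need to be checked non-interactively.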
| Name |
|---|
| github |
| telemetry |
| check-init-py.sh |
| check-workflows-use-hashes.sh |
| diagnose_recordings.py |
| distro_codegen.py |
| docker.sh |
| gen-changelog.py |
| gen-ci-docs.py |
| generate_prompt_format.py |
| get_setup_env.py |
| install.sh |
| integration-tests.sh |
| normalize_recordings.py |
| provider_codegen.py |
| run-ui-linter.sh |
| unit-tests.sh |